Over the past few weeks, I spent a lot of time exploring new AI tools I could use to build a generative AI application. Of the many tools I tried, Griptape impressed me the most. Wondering why Griptape caught my interest? Read on to find out what makes it special and why I chose it.
TL;DR
In this article, you will learn about Griptape, its core components, and how to kickstart building and deploying your first AI application with Griptape and your favorite LLM.
Curious to see what Griptape offers? Let's dive in!
Introduction to Griptape
Griptape is a powerful platform for building, deploying, and managing AI applications with ease. It provides a framework that simplifies the complexities of building AI applications and a cloud infrastructure for seamless deployment of Griptape applications.
With Griptape, you don't need to worry about managing the underlying infrastructure as the framework takes care of it. The platform is a great choice for developers and teams that want to accelerate development of AI applications. Griptape's modularity ensures it remains lightweight, making it perfect for building high-performance applications without unnecessary overhead.
Griptape Framework
The Griptape framework is an open-source Python framework designed to facilitate AI development. It makes it easy to integrate LLMs into applications with manageable pipelines and workflows. With the framework, developers can create conversational AI agents and other generative AI applications to handle data securely and efficiently.
The Griptape framework allows developers to integrate LLMs with third-party data sources or custom AI models. All you have to do is build Extract, Transform, Load (ETL) pipelines that prepare your data for secure LLM access, and compose agents, pipelines, and workflows that integrate your custom business logic. A minimal pipeline sketch follows below.
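As a rough illustration of what such a pipeline looks like in code, here is a minimal two-step sketch using the framework's Pipeline structure and PromptTask. The prompts are placeholders chosen for this example, and module paths may differ slightly between Griptape versions.

from griptape.structures import Pipeline
from griptape.tasks import PromptTask

pipeline = Pipeline()
pipeline.add_tasks(
    # {{ args[0] }} is the argument passed to run(); {{ parent_output }} is the previous task's result.
    PromptTask("Summarize the following text: {{ args[0] }}"),
    PromptTask("Rewrite that summary as three bullet points: {{ parent_output }}"),
)
pipeline.run("Griptape is an open-source Python framework for building generative AI applications.")

Each task's output feeds the next one, which is how custom business logic gets chained around LLM calls.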
Griptape Cloud
Griptape provides a cloud infrastructure that enables developers to deploy and scale their applications without worrying about infrastructure management. The platform takes care of resource allocation, load balancing, and uptime management, allowing developers to focus on building, deploying, and managing their applications with ease.
Griptape's cloud infrastructure supports Retrieval-Augmented Generation (RAG) pipelines, which are particularly useful for pre-processing information from external data sources before generating responses. This process allows developers to retrieve relevant data, integrate it with the response generation, and plug it seamlessly into client applications, creating more dynamic and data-enriched responses.
In addition, it lets you connect to external data sources, allowing you to extract and refine data. Whether you're deploying lightweight or complex AI agents, the cloud infrastructure automatically adjusts to handle increased demand, ensuring seamless performance.
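To make the retrieve-augment-generate idea concrete (independently of Griptape Cloud's actual APIs), here is a tiny self-contained Python sketch; the keyword-overlap retriever and the prompt assembly are simplified stand-ins invented purely for illustration.

# A toy "knowledge base" standing in for an external data source.
KNOWLEDGE_BASE = [
    "Griptape is an open-source Python framework for building AI applications.",
    "Griptape Cloud deploys and scales Griptape structures without manual infrastructure work.",
]

def retrieve(question, top_k=1):
    # Rank documents by naive keyword overlap; a real RAG pipeline would use
    # embeddings and a vector store instead.
    words = set(question.lower().split())
    ranked = sorted(KNOWLEDGE_BASE, key=lambda doc: len(words & set(doc.lower().split())), reverse=True)
    return ranked[:top_k]

def build_rag_prompt(question):
    # Augment: merge the retrieved context with the user's question before it
    # is sent to the LLM for generation.
    context = "\n".join(retrieve(question))
    return f"Context:\n{context}\n\nQuestion: {question}"

print(build_rag_prompt("What does Griptape Cloud do?"))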
Key Features of Griptape
Griptape provides many features to its users:
- LLM Integration (Framework): Griptape simplifies working with LLMs by reducing the code needed for integration. The framework lets you easily integrate an LLM into your applications, enabling them to generate responses, maintain context, and perform other supported tasks efficiently.
- Infrastructure Management (Cloud): You do not need to worry about this when using Griptape, because Griptape Cloud manages all infrastructure tasks and load, including uptime management and resource allocation. This saves time, as the developer's focus stays on the application itself.
- Scalability (Cloud): Griptape Cloud automatically scales to meet your application's demands without any manual configuration. Resources are adjusted dynamically, so your application remains responsive and efficient even as demand increases. Whether or not you are building a large-scale AI application, there's nothing to worry about: Griptape Cloud manages it for you.
- Modular and Lightweight (Framework): The Griptape framework was designed to be modular and lightweight for high performance and flexibility in development. With its modular approach, you only integrate the components your application actually needs, saving time and reducing overhead. The framework's lightweight structure ensures that complex AI workflows and pipelines can be developed and executed without degrading performance, making it suitable for any type of project.
- Open-source (Framework): Griptape's framework is open-source, which keeps development flexible. Developers can contribute to extend Griptape's capabilities, whether by opening GitHub issues, fixing bugs, or adding features to the repository.
- Pipeline & Workflow Management (Framework): The framework enables developers to build and manage workflows and pipelines securely and efficiently. With the help of Griptape's off-prompt feature (a component designed for security and higher performance), developers control which data goes to the LLM and ensure processes run smoothly. The off-prompt feature also allows for seamless orchestration of tasks, enabling applications to process large amounts of data and handle complex workflows.
- RAG Pipeline Support and Optimization: Griptape's framework supports RAG pipelines, enabling developers to generate relevant responses from external sources, which gives you flexibility beyond what the LLM alone provides. In addition, Griptape Cloud improves the quality of RAG pipelines, making it easy to compile data before generating responses and improving the relevance and accuracy of those responses.
- Seamless Deployment (Cloud): Developers can easily deploy Griptape applications on the cloud infrastructure without manual configuration or added complexity, simplifying the entire deployment process.
- Griptape TaskMemory (Framework): TaskMemory is a configurable feature for storing and retrieving information generated during task execution within workflows, without sending that data back to the LLM. By using TaskMemory, developers can manage state between tasks without manually tracking it across different stages. It is also a cost-effective option for teams using LLMs, since costs are based on the token counts (the amount of data LLMs use for input and output) of the messages sent. A short sketch of off-prompt and TaskMemory follows this list.
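Below is a minimal sketch of how off-prompt and TaskMemory fit together. It assumes the Calculator tool and the off_prompt flag available in recent framework releases; exact tool names and defaults can vary between Griptape versions.

from griptape.structures import Agent
from griptape.tools import Calculator

# With off_prompt=True, the tool's raw output is written to TaskMemory instead of
# being fed straight back into the LLM's prompt, which keeps large or sensitive
# intermediate results out of the prompt (and out of your token count).
agent = Agent(tools=[Calculator(off_prompt=True)])
agent.run("What is 123456789 * 987654321?")

If the LLM itself needs to read those stored results back, you would typically pair the tool with one of Griptape's memory query tools.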
Give Griptape a Star on GitHub
As mentioned earlier, Griptape is open-source, and all contributions are appreciated. Want to be a part of Griptape's growth? You can support the project by giving it a Star on GitHub.
Now that you know about Griptape, its major components, and key features, it's time to get our hands a bit dirty by creating a simple AI application with Griptape's framework.
Developing an AI Application Using Griptape and an LLM
Building an application with Griptape is straightforward: you only need to integrate an LLM into your application. Here is a step-by-step guide to developing a simple calculator using Griptape and an LLM:
- Install and Set Up the Griptape Framework: You'll need to install the Griptape framework using Python's package manager:
pip install "griptape[all]" -U
- Get Your OpenAI API Key and Add It to Your Environment: If you have an OpenAI account, generate a secret API key. Griptape uses OpenAI's Chat Completions API by default to execute LLM tasks. The command below adds your API key to your environment so Griptape can authenticate with OpenAI:
export OPENAI_API_KEY="YOUR_API_KEY"
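If you want to confirm the variable is actually visible to your Python process before writing any Griptape code, a quick optional check looks like this:

import os

# Prints True if the key was exported in the current shell session.
print("OPENAI_API_KEY set:", bool(os.environ.get("OPENAI_API_KEY")))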
- Importing Necessary Griptape Modules: Here comes the first of the major steps: writing some Python code that leverages Griptape's framework and gpt-4 to perform tasks. Since we are building a simple calculator, we will import from two Griptape modules:
from griptape.tools import Calculator
from griptape.structures import Agent
In the code above, we import the Calculator tool from the griptape.tools module. This tool is designed to perform arithmetic operations in Griptape.
The Agent class is also imported from the griptape.structures module; it handles the interaction between the user and the Calculator tool by taking in calculations and generating responses.
- Creating an Agent: After importing the modules, you need to create an agent with the imported Agent class.
agent = Agent(
    tools=[Calculator()]
)
The Agent works directly with the Calculator tool to provide input, while the Calculator tool performs calculations based on the user's command.
- Defining the Calculator's Loop: We will create a function that contains the main logic of the calculator. Inside it, we first print the instructions for the user.
def calculator_loop():
    print("Welcome to Griptape Calculator")
    print("Type 'exit' to quit the calculator.")
- Create User Interaction: We will initiate a while loop that enables continuous interaction with the user even after responses are returned. The loop terminates only when the user inputs 'exit', which ends the session and prints 'bye!'.
    while True:
        user_input = input("Input a calculation like 5 + 3 * 2: ")
        if user_input.lower() == 'exit':
            print("bye!")
            break
- Handling User Input: Now let's add the logic that turns user input into calculations. The user's input is passed to the agent's agent.run(user_input) method, which processes the calculation using the Calculator tool attached to the Agent. Once the agent has processed the calculation, the final result is available on the returned structure's output artifact, which we extract and print in a readable format. The code below also handles errors: if something goes wrong, the user is notified in a friendly way.
        try:
            response = agent.run(user_input)
            # run() returns the structure; read the final answer from its output artifact.
            result = response.output_task.output.value
            print(f"Result: {result}")
        except Exception as e:
            print(f"Error: {e}")
- Run the Calculator Loop: Lastly, we call the function so the calculator starts whenever the script runs:
calculator_loop()
This is the code you should have before running or deploying:
from griptape.tools import Calculator
from griptape.structures import Agent

agent = Agent(
    tools=[Calculator()]
)

def calculator_loop():
    print("Welcome to Griptape Calculator")
    print("Type 'exit' to quit the calculator.")
    while True:
        user_input = input("Input a calculation like 5 + 3 * 2: ")
        if user_input.lower() == 'exit':
            print("bye!")
            break
        try:
            response = agent.run(user_input)
            # run() returns the structure; read the final answer from its output artifact.
            result = response.output_task.output.value
            print(f"Result: {result}")
        except Exception as e:
            print(f"Error: {e}")

calculator_loop()
Now, let's run the script:
python3 my_griptape_script.py
Finally, let's test our first Griptape application:
Welcome to Griptape Calculator
Type 'exit' to quit the calculator.
Input a calculation like 5 + 3 * 2: 83 - 10 * 2 + 4 / 2
[10/04/24 00:47:03] INFO ToolkitTask 038e687684c04ae2b83467221ca9f6c4
Input: 83 - 10 * 2 + 4 / 2
[10/04/24 00:47:08] INFO Subtask 989a11849ec3406f85b5db3de17089bf
Thought: To solve the expression \(83 - 10 \times 2 + \frac{4}{2}\), I need to follow the order of operations: parentheses, exponents,
multiplication and division (from left to right), and addition and subtraction (from left to right).
1. First, perform the multiplication: \(10 \times 2\).
2. Then, perform the division: \(\frac{4}{2}\).
3. Finally, perform the subtraction and addition in sequence.
Actions: [{"name":"Calculator","path":"calculate","input":{"values":{"expression":"10 *
2"}},"tag":"multiplication"},{"name":"Calculator","path":"calculate","input":{"values":{"expression":"4 / 2"}},"tag":"division"}]
[10/04/24 00:47:10] INFO Subtask 989a11849ec3406f85b5db3de17089bf
Response: Output of "Calculator.calculate" was stored in memory with memory_name "TaskMemory" and artifact_namespace
"ec34a8df918c41c18ae2f53f0654b611"
Output of "Calculator.calculate" was stored in memory with memory_name "TaskMemory" and artifact_namespace
"e2ce65a931e34b36bda8c62063621fc1"
[10/04/24 00:47:12] INFO Subtask bf8589416c6f4f6487d22dc7b217b599
Thought: I have the results of the multiplication and division stored in memory. I need to retrieve these results and use them to complete
the calculation for the expression \(83 - 20 + 2\).
Actions: [{"tag": "subtraction_addition", "name": "Calculator", "path": "calculate", "input": {"values": {"expression": "83 - 20 + 2"}}}]
INFO Subtask bf8589416c6f4f6487d22dc7b217b599
Response: Output of "Calculator.calculate" was stored in memory with memory_name "TaskMemory" and artifact_namespace
"d29c94ed49f84111ae76f02ae94e7df2"
[10/04/24 00:47:13] INFO ToolkitTask 038e687684c04ae2b83467221ca9f6c4
Output: The result of the expression \(83 - 10 \times 2 + \frac{4}{2}\) is 65.
Input a calculation like 5 + 3 * 2: exit
bye!
As you can see, it doesn't just give the calculation's output; it also shows the steps the agent takes to evaluate the arithmetic expression. This transparency enhances the user experience by helping users understand how the final result is obtained.
Deploying Your Griptape Application
While we won't be deploying this application here, you can follow the instructions provided in the Griptape Cloud documentation. Griptape Cloud offers a component, Structures, for deploying applications whether or not they use the Griptape framework.
It also enables developers to build RAG pipelines by connecting to Griptape's Data Sources for ingestion. To deploy your application, start by creating an account and integrating with GitHub.
For more detailed steps on how to deploy a Griptape application, you can refer to the deployment guide in Griptape's Cloud documentation.
Summary
Building and deploying an AI application using Griptape shows how easily you can integrate LLM tools into your project. From handling user input to interacting with LLMs, Griptape's framework simplifies the process of developing an AI application.
This article focused on Griptape's overall structure and how Griptape can be used to develop interactive AI applications. If you want to learn more about Griptape's other offerings, I recommend checking out the official website.
Lastly, please give Griptape a Star on GitHub.
Thank you for reading this article. I look forward to hearing what you think about Griptape in the comment section!