
Tutorial: Build an Agentic AI Application with Agents for Amazon Bedrock

Here's a step-by-step process for building an application that uses Agents for Amazon Bedrock to trigger a Lambda function that can execute tasks for you. The code I used can be found here. The cost will be under $1 if you remember to run all the clean-up steps at the end!

This simple application allows parents to book an appointment with their child's high school teacher at the upcoming Parents and Teachers Evening. Data relating to the available time slots, existing appointment bookings, teachers, their subjects, and classrooms is stored in DynamoDB tables.

The architecture looks like this:

Architecture diagram showing the interaction between Amazon Bedrock, Lambda and DynamoDB

Architecture Components

Amazon Bedrock

Used to provide API access to the required foundation model. In this example, the model we are using is: anthropic.claude-3-sonnet-20240229-v1:0
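If you want a quick way to confirm that model access is working before the agent is involved, you could call the model directly with boto3's bedrock-runtime client. This is just a minimal sketch; the prompt text is an arbitrary example.

```python
import json
import boto3

# Data-plane client used to invoke foundation models directly
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": [{"type": "text", "text": "Say hello in one sentence."}]}
    ],
}

response = bedrock_runtime.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps(body),
    contentType="application/json",
    accept="application/json",
)

# The response body is a streaming object; read and parse the JSON payload
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```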

Amazon Bedrock Agent

The Bedrock agent uses the reasoning capabilities of the specified foundation model, together with the data and actions available to it, to work out how to handle the user requests it receives. In this example, the agent needs to handle requests such as checking the availability of appointment slots at the Parents and Teachers Evening, as well as booking appointments.

Amazon Bedrock Action Group

This defines the actions that the agent can take, for instance the ability to search and update the data held in DynamoDB. The actions are really API calls that the agent is able to make in order to fulfil user requests.
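As a rough illustration of what defining an action group can look like with boto3's bedrock-agent client, here is a sketch using a function schema. The agent ID, Lambda ARN, and parameter details are placeholders, not the exact values from the repository.

```python
import boto3

bedrock_agent = boto3.client("bedrock-agent", region_name="us-east-1")

# Placeholder agent ID and Lambda ARN -- the notebook creates the real ones
bedrock_agent.create_agent_action_group(
    agentId="AGENT_ID",
    agentVersion="DRAFT",
    actionGroupName="appointment-actions",
    actionGroupExecutor={"lambda": "arn:aws:lambda:us-east-1:123456789012:function:appointments"},
    functionSchema={
        "functions": [
            {
                "name": "check_teacher_availability",
                "description": "Check whether a teacher has free slots at the parents' evening",
                "parameters": {
                    "teacher_name": {"type": "string", "description": "Teacher to check", "required": True}
                },
            },
            {
                "name": "book_appointment",
                "description": "Book an appointment slot with a teacher",
                "parameters": {
                    "teacher_name": {"type": "string", "description": "Teacher to book", "required": True},
                    "time_slot": {"type": "string", "description": "Requested time, e.g. 18:30", "required": True},
                },
            },
        ]
    },
)
```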

Lambda Function

A Lambda function is used to make the necessary API calls, like searching and updating the data held in DynamoDB.
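The handler for a Bedrock agent action group receives the action group name, the function being called, and its parameters in the event, and has to return a functionResponse in the shape the agent expects. A minimal sketch follows; the table name, attribute names, and key layout are assumptions rather than the ones used in the repository.

```python
import boto3

dynamodb = boto3.resource("dynamodb")
appointments_table = dynamodb.Table("appointments")  # assumed table name

def lambda_handler(event, context):
    # The agent passes the requested function and its parameters in the event
    function = event["function"]
    params = {p["name"]: p["value"] for p in event.get("parameters", [])}

    if function == "check_teacher_availability":
        # Assumed layout: one item per slot with 'teacher', 'time_slot' and 'booked' attributes
        items = appointments_table.scan()["Items"]  # fine for a tiny demo table
        free_slots = [
            i["time_slot"] for i in items
            if i.get("teacher") == params.get("teacher_name") and not i.get("booked")
        ]
        body = f"Available slots: {free_slots}"
    else:
        body = f"Unknown function: {function}"

    # Response shape expected by Bedrock agents for function-schema action groups
    return {
        "messageVersion": event["messageVersion"],
        "response": {
            "actionGroup": event["actionGroup"],
            "function": function,
            "functionResponse": {"responseBody": {"TEXT": {"body": body}}},
        },
    }
```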

DynamoDB

DynamoDB is used to store the data held by the system. Two tables are created and populated with data relating to appointments with teachers at the upcoming parents' evening. This is our custom data that our agent is able to interact with.
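As a sketch of what creating and seeding a table with boto3 might look like, assuming hypothetical table and attribute names rather than the exact ones from the repository:

```python
import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")

# Assumed table layout: a simple teachers table keyed by teacher name
teachers = dynamodb.create_table(
    TableName="teachers",
    KeySchema=[{"AttributeName": "teacher_name", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "teacher_name", "AttributeType": "S"}],
    BillingMode="PAY_PER_REQUEST",
)
teachers.wait_until_exists()

# Illustrative sample item for the agent to reason over
teachers.put_item(Item={"teacher_name": "Miss White", "subject": "History", "classroom": "B12"})
```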

Jupyter Notebook

A Jupyter notebook running in SageMaker is used as the IDE (Integrated Development Environment) to run the commands that build the various components in AWS, as well as to download the GitHub repository and run the Python code.

Prerequisites

1) Do everything in the us-east-1 region.
2) In your AWS account, be sure to request access to the Bedrock models that you would like to use. You'll find this in the Bedrock console, under Model access. (For this exercise, I enabled anthropic.claude-3-sonnet-20240229-v1:0.)

Image showing how to enable access to models in Bedrock

3) Before creating the SageMaker notebook, make sure you have a SageMaker AI Domain in us-east-1. This one-time step creates the home directory space and VPC configuration needed by any notebooks you create in this region. If you don't have one already, select the Create domain option and it will do everything for you.

Image showing the creation screen for a SageMaker AI Domain

Building the Application

1) Use this CloudFormation template - create_SM_notebook.yaml - to create the SageMaker notebook that we'll run the commands from (a sketch of deploying the template programmatically follows the permissions list below). The template configures the SageMaker notebook instance with an associated IAM role that includes permissions for a few required services, including:

Bedrock full access
DynamoDB full access
IAM full access
Lambda full access

This access is needed in the beginning because we'll be running commands on the notebook instance to build everything. After everything has been configured, these permissions can be tightened up.
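If you'd rather launch the stack programmatically than through the CloudFormation console, a hedged sketch with boto3 might look like this; the stack name is arbitrary, and the IAM capabilities are acknowledged because the template creates a role.

```python
import boto3

cloudformation = boto3.client("cloudformation", region_name="us-east-1")

with open("create_SM_notebook.yaml") as f:
    template_body = f.read()

cloudformation.create_stack(
    StackName="bedrock-agent-demo-notebook",  # arbitrary name
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_IAM", "CAPABILITY_NAMED_IAM"],
)

# Block until the notebook instance and IAM role have been created
cloudformation.get_waiter("stack_create_complete").wait(StackName="bedrock-agent-demo-notebook")
```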

2) When the notebook is ready, select the notebook instance and choose Open JupyterLab. The required GitHub repository containing the application code will already be downloaded and saved to a folder named bedrock-agent-demo.

Image showing the Jupyter Lab environment, with downloaded GitHub repository

3) From the cloned repository, open the file named bedrock_agents_demo.ipynb. This is an interactive Python notebook: each block of code is displayed in a cell, and the cells can be run in sequence to observe the outcome of each step.

Image showing the bedrock_agents_demo.ipynb file

4) Run all the cells contained in the bedrock_agents_demo.ipynb file, which, at a high level, will do the following:

Install required libraries like boto3, which is the AWS SDK for Python that interacts with Bedrock.

Create the two DynamoDB tables to store data relating to teachers and appointments, then populate them with some sample data that the agent will be able to interact with.

Image showing the items in the DynamoDB table

Create a Lambda function that has the ability to check if a teacher is available at a specific time, book an appointment with a teacher, and get all available appointment slots.

Create the agent and action group.
The action group defines all the functions the agent is able to call, for instance check_teacher_availability and book_appointment.

Image showing the action group in the AWS console
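Behind the scenes, creating and preparing the agent comes down to a few bedrock-agent calls. The sketch below uses placeholder values for the agent name, role ARN, and instruction; the notebook builds the real ones.

```python
import boto3

bedrock_agent = boto3.client("bedrock-agent", region_name="us-east-1")

# Placeholder agent name, role ARN and instruction text
agent = bedrock_agent.create_agent(
    agentName="parents-evening-agent",
    foundationModel="anthropic.claude-3-sonnet-20240229-v1:0",
    agentResourceRoleArn="arn:aws:iam::123456789012:role/bedrock-agent-demo-role",
    instruction="You help parents check and book appointments with teachers at the parents' evening.",
)
agent_id = agent["agent"]["agentId"]

# ...wait for the agent to finish creating, then add the action group (see the earlier sketch)...

# Prepare the DRAFT version so it can be invoked for testing
bedrock_agent.prepare_agent(agentId=agent_id)
```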

Testing

Run some prompts to test that everything is working. You can update the input text to try different prompts and see what is possible.

Image showing a prompt being provided to the application

Image showing the response from the application
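For reference, a prompt can also be sent to the agent programmatically with the bedrock-agent-runtime client. This sketch assumes the agent ID from the earlier steps and uses the built-in test alias for the draft agent; the response arrives as an event stream of text chunks.

```python
import uuid
import boto3

bedrock_agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = bedrock_agent_runtime.invoke_agent(
    agentId="AGENT_ID",          # placeholder -- use the ID created earlier
    agentAliasId="TSTALIASID",   # built-in test alias for the draft agent
    sessionId=str(uuid.uuid4()),
    inputText="Who teaches Sociology?",
)

# Concatenate the streamed text chunks into the final answer
answer = ""
for event in response["completion"]:
    chunk = event.get("chunk")
    if chunk:
        answer += chunk["bytes"].decode("utf-8")
print(answer)
```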

Example Prompts to Try:

Try running the following prompts, or create your own:

  1. Who teaches Sociology?
  2. Book the first available appointment with the I.T. teacher
  3. When is the history teacher available?
  4. List all Miss White's appointments
  5. Cancel the 18:30 appointment with Mr Stokes

Observe how the agent uses reasoning, the small number of very basic actions, and the limited data that it has available to try to fulfil your requests. Explore what the agent is able to do, and what the limitations are. Even with a small amount of data and some simple actions, it is able to do quite a lot using reasoning or asking for clarification to try to get the job done!

Cleaning Up to Avoid Charges

After exploring, be sure to run the last few cells in the notebook, to clean up the DynamoDB tables, the Bedrock Action Group and Agent, and the Lambda Function to avoid unnecessary charges. Then remember to delete the CloudFormation stack as well if you no longer need the Jupyter notebook instance.
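If you prefer to script the tear-down rather than run the notebook cells, it boils down to a handful of boto3 calls. The names and IDs below are placeholders; substitute the ones created by the notebook, and run the stack deletion from the console or your own machine rather than from the notebook instance the stack created.

```python
import boto3

region = "us-east-1"

# Delete the DynamoDB tables (placeholder table names)
dynamodb = boto3.client("dynamodb", region_name=region)
for table in ["teachers", "appointments"]:
    dynamodb.delete_table(TableName=table)

# Delete the Bedrock agent; skipResourceInUseCheck lets the call proceed even if action groups remain
bedrock_agent = boto3.client("bedrock-agent", region_name=region)
bedrock_agent.delete_agent(agentId="AGENT_ID", skipResourceInUseCheck=True)

# Delete the Lambda function (placeholder name)
boto3.client("lambda", region_name=region).delete_function(FunctionName="appointments")

# Finally, remove the CloudFormation stack if the notebook is no longer needed
boto3.client("cloudformation", region_name=region).delete_stack(StackName="bedrock-agent-demo-notebook")
```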

Top comments (1)

Darryl Ruggles

Agentic AI approaches are everywhere these days. Thanks Faye for showing a good example of doing this all on AWS.