1. Overview
In this post, we'll explore how to leverage direct service integrations in AWS Step Functions to build a workflow that executes DynamoDB transactions. AWS Step Functions is an excellent tool for breaking business workflows down into individual steps, promoting separation of concerns and encapsulating each discrete action in its own step.
2. The Use Case
Let's consider a real-life scenario to demonstrate this approach. We start with an object stored in Amazon S3. When the file is deleted, we must remove an item from two DynamoDB tables. To ensure data consistency, we'll wrap both delete operations inside a transaction, preventing a situation where one delete succeeds while the other fails.
Here's an example of an Amazon EventBridge rule that captures all delete events from a specific Amazon S3 bucket:
{
  "detail": {
    "bucket": {
      "name": ["bucket_name"]
    },
    "deletion-type": ["Permanently Deleted"]
  },
  "detail-type": ["Object Deleted"],
  "source": ["aws.s3"]
}
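When an object is permanently deleted, the rule forwards the full S3 event to its target. The real payload carries more fields, but the piece we rely on throughout this post is the object key under detail.object.key. Here's a minimal, illustrative sketch of that shape (abbreviated, with placeholder values) and how we'll read it:

# Abbreviated, illustrative shape of an S3 "Object Deleted" event delivered by EventBridge.
# Values are placeholders; the real event contains additional metadata.
sample_event = {
    "source": "aws.s3",
    "detail-type": "Object Deleted",
    "detail": {
        "bucket": {"name": "bucket_name"},
        "object": {"key": "path/to/deleted-object.txt"},
        "deletion-type": "Permanently Deleted",
    },
}

# Both the Lambda handler below and the Step Functions JSONPath
# "$.detail.object.key" read the key from this same location.
object_key = sample_event["detail"]["object"]["key"]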
3. The Traditional Lambda Solution
A classic design would involve enabling Amazon S3 event notifications to Amazon EventBridge. Once the event reaches the event bus, an Amazon EventBridge rule would trigger an AWS Lambda function to execute the DynamoDB transaction. Here's what this architecture might look like:
Let's examine a potential AWS Lambda function implementation:
import json

import boto3

dynamodb = boto3.client('dynamodb')


def lambda_handler(event, context):
    # Extract the S3 object key from the EventBridge event
    s3_key = event["detail"]["object"]["key"]

    # Construct your DynamoDB delete operations
    delete_item_in_table_A = {
        'Delete': {
            'TableName': "ddb_table_a",
            'Key': {
                'YourPrimaryKeyAttributeName': {'S': s3_key}
            }
        }
    }

    delete_item_in_table_B = {
        'Delete': {
            'TableName': "ddb_table_b",
            'Key': {
                'YourPrimaryKeyAttributeName': {'S': s3_key}
            }
        }
    }

    # Perform a DynamoDB transaction to ensure both deletes happen together
    response = dynamodb.transact_write_items(
        TransactItems=[delete_item_in_table_A, delete_item_in_table_B]
    )

    return {
        'statusCode': 200,
        'body': json.dumps('Delete transaction succeeded')
    }
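In a real implementation you'd also want to handle a cancelled transaction, for example when a conditional check fails or DynamoDB rejects one of the writes. Here's a minimal, hedged sketch of that error handling, reusing the dynamodb client and delete definitions from above:

try:
    dynamodb.transact_write_items(
        TransactItems=[delete_item_in_table_A, delete_item_in_table_B]
    )
except dynamodb.exceptions.TransactionCanceledException as error:
    # CancellationReasons lists, per action, why the transaction was rejected
    # (e.g. a failed condition check or a capacity/conflict issue).
    print(error.response.get("CancellationReasons"))
    raise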
While this solution is concise and functional, it has some drawbacks. The AWS Lambda function merely receives an event and performs an API call, without any substantial business logic. It acts as a simple connector in the data pipeline, executing a couple of delete operations.
AWS Lambda functions that primarily connect different services or transform events without complex business logic can often be replaced with service integrations.
Let's explore this alternative approach.
4. The AWS Step Functions Solution
Amazon EventBridge supports AWS Step Functions as a target, allowing us to replace the AWS Lambda function with an AWS Step Functions workflow. This approach enables us to build a no-code solution using DynamoDB direct service integrations within the workflow.
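For completeness, here's a hedged sketch of how that wiring might look with boto3. The rule name, state machine ARN, and role ARN are placeholders, and the role must allow EventBridge to call states:StartExecution:

import boto3

events = boto3.client("events")

# Illustrative names/ARNs -- replace with your own rule, state machine, and role.
events.put_targets(
    Rule="s3-object-deleted-rule",
    Targets=[
        {
            "Id": "delete-transaction-workflow",
            "Arn": "arn:aws:states:us-east-1:123456789012:stateMachine:DeleteTransaction",
            # EventBridge assumes this role to start the workflow execution.
            "RoleArn": "arn:aws:iam::123456789012:role/eventbridge-invoke-stepfunctions",
        }
    ],
)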
Here's an overview of this solution:
Now, let's dive into the implementation of the AWS Step Functions workflow.
4.1 AWS Step Functions Service Integrations
Since our use case doesn't involve complex logic, we can build our workflow using service integrations. AWS Step Functions offers two types of integrations with other AWS services:
- AWS SDK Integrations: These cover over 200 services and are similar to the API calls you'd make in an AWS Lambda function.
- Optimized Integrations: Available for about 20 core services, these add convenience by automatically converting output to JSON and handling asynchronous tasks, eliminating the need for custom polling mechanisms.
We have two AWS SDK integration options, DynamoDB:TransactWriteItems and DynamoDB:ExecuteTransaction, to wrap both delete statements in a transaction.
Let's explore both implementations.
4.2 DynamoDB:TransactWriteItems
The TransactWriteItems API allows for synchronous, atomic write operations across multiple items. It supports up to 100 actions (Put, Update, Delete, or ConditionCheck) in different tables within the same AWS account and region. This API doesn't allow read operations within the transaction and ensures all actions either succeed or fail together.
Using this approach, we need just a single step in our workflow:
Here's the workflow's Amazon States Language (ASL) definition for the transaction:
{
  "Comment": "DynamoDB Transaction for Delete Statements",
  "StartAt": "DeleteTransaction",
  "States": {
    "DeleteTransaction": {
      "Type": "Task",
      "Parameters": {
        "TransactItems": [
          {
            "Delete": {
              "TableName": "ddb_table_a",
              "Key": {
                "PK": {
                  "S.$": "$.detail.object.key"
                }
              }
            }
          },
          {
            "Delete": {
              "TableName": "ddb_table_b",
              "Key": {
                "PK": {
                  "S.$": "$.detail.object.key"
                }
              }
            }
          }
        ]
      },
      "Resource": "arn:aws:states:::aws-sdk:dynamodb:transactWriteItems",
      "End": true
    }
  }
}
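If you want to deploy this definition programmatically as an Express workflow, a minimal boto3 sketch might look like the following. The workflow name, file name, and role ARN are placeholders, and the role needs the DynamoDB permissions for the delete operations:

import boto3

sfn = boto3.client("stepfunctions")

# Assumes the ASL definition above is saved locally as definition.json.
with open("definition.json") as f:
    definition = f.read()

sfn.create_state_machine(
    name="DeleteTransaction",
    definition=definition,
    roleArn="arn:aws:iam::123456789012:role/stepfunctions-dynamodb-transaction",
    type="EXPRESS",
)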
4.3 DynamoDB:ExecuteTransaction
The ExecuteTransaction API allows for transactional reads or writes using PartiQL statements. A transaction can contain up to 100 statements, but all operations must be either reads or writes, not a mix. It ensures that all statements in the transaction are executed atomically.
Our workflow would look like this:
Defining the delete statements for this approach can be tricky. Here's an example implementation:
{
  "Comment": "DynamoDB Transaction for Delete Statements",
  "StartAt": "ExecuteTransaction",
  "States": {
    "ExecuteTransaction": {
      "Type": "Task",
      "Parameters": {
        "TransactStatements": [
          {
            "Statement": "DELETE FROM \"ddb_table_a\" WHERE PK = ?",
            "Parameters": [
              {
                "S.$": "$.detail.object.key"
              }
            ]
          },
          {
            "Statement": "DELETE FROM \"ddb_table_b\" WHERE PK = ?",
            "Parameters": [
              {
                "S.$": "$.detail.object.key"
              }
            ]
          }
        ]
      },
      "Resource": "arn:aws:states:::aws-sdk:dynamodb:executeTransaction",
      "End": true
    }
  }
}
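Because these PartiQL statements are easy to get wrong, it can help to test them against the same API from a small script before embedding them in the workflow. Here's a hedged sketch, assuming the table names and PK attribute from above:

import boto3

dynamodb = boto3.client("dynamodb")

object_key = "path/to/deleted-object.txt"  # placeholder for $.detail.object.key

# Run both PartiQL deletes in a single atomic transaction.
dynamodb.execute_transaction(
    TransactStatements=[
        {
            "Statement": 'DELETE FROM "ddb_table_a" WHERE PK = ?',
            "Parameters": [{"S": object_key}],
        },
        {
            "Statement": 'DELETE FROM "ddb_table_b" WHERE PK = ?',
            "Parameters": [{"S": object_key}],
        },
    ]
)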
In both cases, we only need to define a single state with the delete statements. This approach eliminates the need for maintaining code, dealing with cold starts, or managing runtime updates.
5. Cost Considerations
When it comes to costs, choosing the Express workflow type is the most economical option for such a simple and fast workflow.
An often overlooked detail is that AWS Step Functions Express workflows meter memory in 64 MB increments, so a lightweight workflow can be billed at just 64 MB, whereas the smallest AWS Lambda memory configuration is 128 MB.
To illustrate, let's consider a scenario with 3 million invocations, each lasting 100ms:
- A 128 MB AWS Lambda function in us-east-1 would cost $0.51
- A 64 MB Express Step Function in the same region would cost $0.31
This demonstrates the potential for significant cost savings when using AWS Step Functions for simple workflows.
6. Conclusion
By leveraging AWS Step Functions with direct service integrations, we can create efficient, no-code solutions for executing DynamoDB transactions. This approach offers several advantages over traditional AWS Lambda-based implementations:
- Simplified architecture with reduced code maintenance
- Improved separation of concerns
- Potential cost savings, especially for simple, high-volume workflows
- Elimination of cold starts and runtime management
As we've seen, both TransactWriteItems and ExecuteTransaction APIs provide robust options for implementing transactional operations in DynamoDB through AWS Step Functions. The choice between them depends on your specific use case and whether you need to include read operations in your transactions.
By adopting this serverless, no-code approach, you can simplify your data pipeline processes and focus more on building scalable, maintainable applications in AWS.