Hey developers! 👋 For the last post in the series, we'll provision the infrastructure on AWS and deploy our API. Let's dive in! You can check out my GitHub for the complete code.
Configure Terraform
First, we'll define the Terraform version and set up the backend to store the state in an S3 bucket. Since Terraform doesn't create the backend bucket automatically, you'll need to provision it beforehand. Variables cannot be used in the backend configuration, so the values must be hardcoded, but feel free to change them.
#infrastructure/versions.tf
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}
#infrastructure/terraform.tf
terraform {
  backend "s3" {
    bucket = "translate-state-bucket"
    region = "us-east-1"
    key    = "terraform.tfstate"
  }
}
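Since the backend bucket must exist before terraform init, a quick way to create it is with the AWS CLI. This is a minimal sketch assuming the same bucket name and region as in the backend block above:
aws s3api create-bucket --bucket translate-state-bucket --region us-east-1
# Optional but recommended: enable versioning to keep a history of state files
aws s3api put-bucket-versioning --bucket translate-state-bucket \
  --versioning-configuration Status=Enabled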
We'll also create a few local values to tag our resources.
#infrastructure/locals.tf
locals {
  tags = {
    Terraform   = true
    Application = "translate"
  }
}
DynamoDB Table
Now we'll define the configuration to provision the DynamoDB table. A DynamoDB table requires a name, a hash_key (which must also be specified as an attribute), and a billing_mode.
resource "aws_dynamodb_table" "this" {
name = "translate-table"
hash_key = "id"
billing_mode = "PAY_PER_REQUEST"
attribute {
name = "id"
type = "S" #string
}
tags = local.tags
}
S3 Bucket
Next, we'll provision the output S3 bucket, specifying a name using either the bucket or bucket_prefix field. To manage storage efficiently, we'll add a lifecycle configuration that automatically deletes files after one day. Lifecycle configurations require the bucket name and at least one rule to define the retention policy.
resource "aws_s3_bucket" "this" {
bucket_prefix = "translate-output-"
tags = local.tags
}
resource "aws_s3_bucket_lifecycle_configuration" "this" { # A lifecycle hook to delete files after 1 day
bucket = aws_s3_bucket.this.id
rule {
id = "rule-1"
filter {}
expiration {
days = 1
}
status = "Enabled"
}
}
Lambda Functions
We'll start by attaching the AWSLambdaBasicExecutionRole managed policy, which grants essential permissions such as writing CloudWatch logs.
To package the Python scripts for the Translate Text and Translate File endpoints, we'll use the archive_file data source. Since the Translate File API has dependencies, we'll install them with a null_resource before packaging.
We'll also create an IAM role for the Lambda functions and attach the necessary policies. Finally, we'll define the Lambda functions using the packaged files, set the handler to main.handler, and configure the required environment variables.
data "aws_iam_policy" "this" {
name = "AWSLambdaBasicExecutionRole"
}
data "archive_file" "translate" {
type = "zip"
source_dir = "${path.module}/../api/translate"
output_path = "${path.module}/files/translate.zip"
}
data "archive_file" "translate-file" {
type = "zip"
source_dir = "${path.module}/files/translate_file"
output_path = "${path.module}/files/translate-file.zip"
depends_on = [ null_resource.this ]
}
resource "null_resource" "this" {
provisioner "local-exec" {
command = <<EOT
rm -rf ${path.module}/files/translate_file
cp -r ${path.module}/../api/translate_file ${path.module}/files/translate_file
cd ${path.module}/files/translate_file
pip install -r requirements.txt -t ./
EOT
}
triggers = {
always_run = timestamp()
}
}
data "aws_iam_policy_document" "assumeRole" {
statement {
effect = "Allow"
actions = ["sts:AssumeRole"]
principals {
identifiers = ["lambda.amazonaws.com"]
type = "Service"
}
}
}
data "aws_iam_policy_document" "this" {
statement {
effect = "Allow"
actions = ["translate:TranslateText",
"translate:TranslateDocument",
"comprehend:DetectDominantLanguage"
]
resources = ["*"]
}
statement {
effect = "Allow"
actions = ["dynamodb:*"]
resources = [
aws_dynamodb_table.this.arn,
"${aws_dynamodb_table.this.arn}/*"
]
}
statement {
effect = "Allow"
actions = ["s3:*"]
resources = [
aws_s3_bucket.this.arn,
"${aws_s3_bucket.this.arn}/*"
]
}
}
resource "aws_iam_role" "this" {
name_prefix = "translate-app-role"
assume_role_policy = data.aws_iam_policy_document.assumeRole.json
tags = local.tags
}
resource "aws_iam_role_policy" "this" {
role = aws_iam_role.this.name
policy = data.aws_iam_policy_document.this.json
}
resource "aws_iam_role_policy_attachment" "this" {
role = aws_iam_role.this.name
policy_arn = data.aws_iam_policy.this.arn
}
resource "aws_lambda_function" "translate" {
function_name = "translate-app-translate-lambda"
description = "Lambda function for the /translate endpoint"
role = aws_iam_role.this.arn
filename = data.archive_file.translate.output_path
handler = "main.handler"
source_code_hash = data.archive_file.translate.output_base64sha256
runtime = "python3.10"
environment {
variables = {
DYNAMODB_TABLE = aws_dynamodb_table.this.name
}
}
tags = local.tags
}
resource "aws_lambda_function" "translate-file" {
function_name = "translate-app-translate-file-lambda"
description = "Lambda function for the /translate/file endpoint"
role = aws_iam_role.this.arn
filename = data.archive_file.translate-file.output_path
handler = "main.handler"
source_code_hash = data.archive_file.translate-file.output_base64sha256
runtime = "python3.10"
environment {
variables = {
DYNAMODB_TABLE = aws_dynamodb_table.this.name
S3_BUCKET = aws_s3_bucket.this.bucket
}
}
tags = local.tags
}
API Gateway
Finally, we'll define the REST API in API Gateway. To handle file uploads properly, we'll include multipart/form-data in the binary_media_types configuration, ensuring that multipart requests are base64-encoded before being sent to the Lambda function.
Additionally, we'll list the API resources as deployment triggers, so any changes to their properties automatically trigger a new deployment.
data "aws_caller_identity" "this" {}
data "aws_region" "this" {}
resource "aws_api_gateway_rest_api" "this" {
name = "translate-app-api"
description = "The translate app API"
binary_media_types = ["multipart/form-data"]
tags = local.tags
}
resource "aws_api_gateway_resource" "translate" {
rest_api_id = aws_api_gateway_rest_api.this.id
parent_id = aws_api_gateway_rest_api.this.root_resource_id
path_part = "translate"
}
resource "aws_api_gateway_method" "translate" {
rest_api_id = aws_api_gateway_rest_api.this.id
resource_id = aws_api_gateway_resource.translate.id
http_method = "POST"
authorization = "NONE"
}
resource "aws_api_gateway_integration" "translate" {
rest_api_id = aws_api_gateway_rest_api.this.id
resource_id = aws_api_gateway_resource.translate.id
http_method = aws_api_gateway_method.translate.http_method
integration_http_method = "POST"
type = "AWS_PROXY"
uri = aws_lambda_function.translate.invoke_arn
}
resource "aws_api_gateway_resource" "translate-file" {
rest_api_id = aws_api_gateway_rest_api.this.id
parent_id = aws_api_gateway_resource.translate.id
path_part = "file"
}
resource "aws_api_gateway_method" "translate-file" {
rest_api_id = aws_api_gateway_rest_api.this.id
resource_id = aws_api_gateway_resource.translate-file.id
http_method = "POST"
authorization = "NONE"
}
resource "aws_api_gateway_integration" "translate-file" {
rest_api_id = aws_api_gateway_rest_api.this.id
resource_id = aws_api_gateway_resource.translate-file.id
http_method = aws_api_gateway_method.translate-file.http_method
content_handling = "CONVERT_TO_TEXT"
integration_http_method = "POST"
type = "AWS_PROXY"
uri = aws_lambda_function.translate-file.invoke_arn
}
resource "aws_api_gateway_deployment" "this" {
rest_api_id = aws_api_gateway_rest_api.this.id
triggers = {
redeployment = sha1(jsonencode([
aws_api_gateway_rest_api.this,
aws_api_gateway_resource.translate,
aws_api_gateway_method.translate,
aws_api_gateway_integration.translate,
aws_api_gateway_resource.translate-file,
aws_api_gateway_method.translate-file,
aws_api_gateway_integration.translate-file,
]))
}
lifecycle {
create_before_destroy = true
}
}
resource "aws_api_gateway_stage" "this" {
deployment_id = aws_api_gateway_deployment.this.id
rest_api_id = aws_api_gateway_rest_api.this.id
stage_name = "prod"
tags = local.tags
}
resource "aws_lambda_permission" "translate" {
action = "lambda:InvokeFunction"
function_name = aws_lambda_function.translate.function_name
principal = "apigateway.amazonaws.com"
source_arn = "arn:aws:execute-api:${data.aws_region.this.name}:${data.aws_caller_identity.this.account_id}:${aws_api_gateway_rest_api.this.id}/*/${aws_api_gateway_method.translate.http_method}${aws_api_gateway_resource.translate.path}"
}
resource "aws_lambda_permission" "translate-file" {
action = "lambda:InvokeFunction"
function_name = aws_lambda_function.translate-file.function_name
principal = "apigateway.amazonaws.com"
source_arn = "arn:aws:execute-api:${data.aws_region.this.name}:${data.aws_caller_identity.this.account_id}:${aws_api_gateway_rest_api.this.id}/*/${aws_api_gateway_method.translate-file.http_method}${aws_api_gateway_resource.translate-file.path}"
}
We'll also define an output for the API's invoke URL.
#infrastructure/outputs.tf
output "api_url" {
  value = aws_api_gateway_stage.this.invoke_url
}
To provision the infrastructure, we first configure the AWS CLI with our credentials:
aws configure
Next, we initialize Terraform and apply the configuration:
terraform init
terraform apply
After successful provisioning, we can access the API via the api_url output.
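As a quick smoke test, we can call both endpoints with curl. This is only a sketch: the JSON payload, form fields, and sample.txt file below are assumptions, so adjust them to match the request format your Lambda handlers expect from the earlier posts in the series.
API_URL=$(terraform output -raw api_url)

# Translate text (payload fields are assumed; match your handler)
curl -X POST "$API_URL/translate" \
  -H "Content-Type: application/json" \
  -d '{"text": "Hello world", "target_language": "es"}'

# Translate a file (hypothetical sample.txt, sent as multipart/form-data)
curl -X POST "$API_URL/translate/file" \
  -F "file=@sample.txt" \
  -F "target_language=es"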
To wrap up this series, we've walked through writing Python code for Lambda functions and provisioning a complete infrastructure using Terraform and AWS.
If you've followed along, you should now have an API that can translate text and files running on AWS. But this is just the beginning! There's always more to explore, whether it's optimizing performance, integrating monitoring tools, or adding a CI/CD pipeline.
I'd love to hear your thoughts! Drop a comment below if you have questions, insights, or ideas for future topics. 💬 Thanks for reading, and happy coding! 🚀