Introduction
How do we usually debug a Lambda function? We deploy it to AWS using the web console or Terraform, run it, check the output, and repeat until we are satisfied. But hey, every Lambda execution costs money, not to mention the tiresome process of running it manually.
What if I told you that you can deploy an AWS Lambda function in a locally simulated AWS environment and actually run it, with fruitful results? No more extra costs and no more boring process.
Imagine deploying your full microservices architecture locally! And you don't even need an AWS account. I can't wait to show you.
The Game Plan
We need LocalStack, which simulates the AWS cloud on your local machine so you can deploy, test, and debug AWS infrastructure. It exposes an API (default: http://localhost:4566) through which we can deploy and manage AWS resources.
There's another tool named tflocal, a small wrapper that points Terraform at LocalStack.
We just have to define some mock credentials in the provider section and that's it.
We will deploy the infrastructure (a Lambda function here), invoke it, and save the output in a .json file for debugging. Let's go!
Build the infrastructure
We will write a basic Lambda function. This will be the directory structure for the project -
├── README.md
├── Terraform
│   ├── lambda.tf
│   └── provider.tf
└── app
    └── lambda_function.py
1. Write the Lambda function
First, let's write the basic Lambda function in Python: app/lambda_function.py -
import json

def lambda_handler(event, context):
    return {
        'statusCode': 200,
        'body': json.dumps('Hello, I ran using LocalStack!')
    }
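Before touching any AWS tooling, you can sanity-check the handler with a plain Python call from the project root (this one-liner is just a local smoke test, nothing LocalStack-specific) -
# run from the project root; imports the handler and calls it with an empty event
python3 -c "from app.lambda_function import lambda_handler; print(lambda_handler({}, None))"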
2. Write the Terraform code
Now let's write the provider configuration: Terraform/provider.tf -
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  access_key = "test"
  secret_key = "test"
  region     = "us-east-1"

  s3_use_path_style           = true
  skip_credentials_validation = true
  skip_metadata_api_check     = true
  skip_requesting_account_id  = true
}
We can write the Lambda manifest now: Terraform/lambda.tf. This automatically zips the Lambda function, so chill.
# Automatically zip the Lambda code from the ../app directory
resource "archive_file" "lambda_zip" {
  type        = "zip"
  source_dir  = "../app"
  output_path = "../app/lambda_function.zip"
}

resource "aws_lambda_function" "my_lambda" {
  function_name = "my-test-lambda"
  role          = aws_iam_role.lambda_exec.arn
  handler       = "lambda_function.lambda_handler"
  runtime       = "python3.8"

  # Location of the Lambda function code (the zip file);
  # referencing the archive resource ensures the zip is built first
  filename = archive_file.lambda_zip.output_path
}

resource "aws_iam_role" "lambda_exec" {
  name               = "lambda_exec_role"
  assume_role_policy = data.aws_iam_policy_document.lambda_assume_role_policy.json
}

data "aws_iam_policy_document" "lambda_assume_role_policy" {
  statement {
    actions = ["sts:AssumeRole"]

    principals {
      type        = "Service"
      identifiers = ["lambda.amazonaws.com"]
    }
  }
}
You can now run terraform init and terraform plan to check that everything looks fine, but that's optional - we will use tflocal in the next step anyway.
All the code can be downloaded from here.
3. Set up LocalStack
3.1. Install and run Docker on your machine.
3.2. Install LocalStack - see the instructions. I just installed it using Homebrew -
brew install localstack/tap/localstack-cli
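(Not on macOS, or no Homebrew? The LocalStack CLI can also be installed with pip - check the LocalStack docs if this doesn't match your setup -)
python3 -m pip install localstack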
3.3. Install tflocal - a wrapper that lets Terraform work with LocalStack -
pip3 install terraform-local
3.4. Run LocalStack -
localstack start -d
localstack status
# to stop
# localstack stop
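To double-check that LocalStack is actually up and listening on port 4566, you can hit its health endpoint (this path is what recent LocalStack versions expose; adjust if your version differs) -
curl http://localhost:4566/_localstack/health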
3.5. You have the AWS CLI installed, don't you?
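If your AWS CLI has never been configured, that's fine too - LocalStack does not validate credentials, so any dummy values will do. For example, exporting the standard AWS environment variables in your shell is enough -
# dummy credentials; LocalStack accepts anything
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test
export AWS_DEFAULT_REGION=us-east-1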
4. Deploy and Run Lambda
4.1. Deploy the Lambda
Now, in your project's Terraform/ directory, run
tflocal init
Run a plan to see the draft output -
tflocal plan
Now we can apply the resources (LocalStack must be running) -
tflocal apply --auto-approve
Now let's see if any functions were deployed. Let's use an AWS CLI command to list the Lambda functions. Note that we have to pass the endpoint-url - that's the only change.
aws --endpoint-url=http://localhost:4566 --region us-east-1 lambda list-functions
There you have it! We have successfully deployed a Lambda function locally using LocalStack!
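If you only care about the function names and not the full JSON blob, the AWS CLI's --query flag can trim the output (purely optional) -
aws --endpoint-url=http://localhost:4566 --region us-east-1 lambda list-functions --query 'Functions[].FunctionName'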
4.2. Invoke the Lambda
Let's invoke (run) the Lambda to test it, again using the AWS CLI but with the endpoint URL passed -
aws --endpoint-url=http://localhost:4566 --region us-east-1 lambda invoke --function-name my-test-lambda output.json
So this will invoke the Lambda function and save its output in a file called output.json.
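Peek inside the file to confirm the handler really ran; given the code we wrote above, it should contain something like this -
cat output.json
# expected: {"statusCode": 200, "body": "\"Hello, I ran using LocalStack!\""}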
Congratulations! Now you know how to run AWS Lambda functions without putting pressure on your bill, and you can automate it however you like.
What's next?
LocalStack opens up a whole new window to explore and enhance your DevOps capabilities. Without deploying to the actual cloud, you can now play around locally, which gives you plenty of flexibility to explore, with no worries about costs or errors.
Projects we can explore next -
- Connecting Lambda with API Gateway
- Testing with event payloads (a quick taste right after this list)
- Running multiple microservices locally
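As a tiny teaser for the event-payload idea above, you can already pass a custom event to the deployed function with the same invoke command (our handler ignores the event, so the payload below is just an example; the --cli-binary-format flag is needed for raw JSON payloads on AWS CLI v2) -
aws --endpoint-url=http://localhost:4566 --region us-east-1 lambda invoke --function-name my-test-lambda --cli-binary-format raw-in-base64-out --payload '{"name": "LocalStack"}' output.json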
Maybe I'll write another blog on it with API Gateway. Till then, best wishes. Happy learning.