
AWS Serverless hands-on part 2/2

To begin, I would like to emphasize that this is the second part of our journey. Please read the First Part before you move forward!

Short Intro

I hope you took some time to digest the first part, and now we are ready to continue! We have already learnt about several serverless services and the architecture we want to achieve. We also know what our main assumptions are and which technologies we want to use. Our next step is to get familiar with the scope of the application and the details of the backend and frontend.

Application Description

Coming up with an idea for this application was the most challenging part for me. I didn’t want to simply display an index.html file with "Hello World," but I also struggled to think of a more complex use case. Since I wanted to implement CRUD (Create, Read, Update, Delete) functionality, I decided to use DynamoDB as the database and focus on GET/POST endpoints.

Eventually, the idea I settled on may not be the most sophisticated, but it serves its purpose well as a hands-on project.

Our serverless application functions as a kind of guest book. It is hosted on CloudFront with static content stored in S3. The application offers two main options:

  • "Add Data" – Allows users to submit information
  • "Fetch Data From Backend" – Retrieves stored data based on user input

Behind the scenes, we have API Gateway with GET and POST endpoints handling the requests. When a user clicks on "Add Data", they are prompted to answer a few mandatory questions: City, Name, and Year of Birth. The City serves as the partition key in DynamoDB.

To retrieve stored data, users click "Fetch Data From Backend", enter a City, and receive all records associated with that city from the database.
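
To make the data model concrete, here is a rough sketch of what a single stored record could look like. The attribute names below are my own illustration; the actual names used in the repository may differ.

// Illustrative shape of one guest-book record (attribute names are assumptions).
interface VisitorRecord {
  City: string;        // partition key in DynamoDB
  Name: string;
  YearOfBirth: number;
}

const exampleVisitor: VisitorRecord = {
  City: 'Krakow',
  Name: 'Anna',
  YearOfBirth: 1990,
};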

The application in the web browser looks as follows:
[Screenshot of the application]

Data Storage and Directory Structure
Every value entered into the application is recorded in DynamoDB. In the following sections, we will dive into the details of how the various services work and how they are connected in our setup.

For now, let's take a look at the Directory Structure. The CDK (Cloud Development Kit) is already initialized, and the tests, stacks, and GitHub workflow are pre-configured. Below is the directory structure:

[Screenshot of the directory structure]

A brief overview of each folder:
.github/workflows - Defines our pipeline for deploying the stacks
deploy/cdk/bin - The entry point for CDK with defined stacks
deploy/cdk/lib - Contains the main CDK constructs
deploy/cdk/frontend - Holds the frontend files for our page
deploy/cdk/lambda - Contains the Lambda function that is triggered by API Gateway
deploy/cdk/test - Contains test files for CDK
deploy/cdk/{cdk.json,package.json,package-lock.json,jest.config.js} - npm/CDK configuration files

The repository with the code is located here:
aws-serverless-hands-on-template.

The README.md file provides a tl;dr version of the deployment steps. You can simply fork the repo, make a few adjustments, and deploy. In the next section, we will explore each part in more detail.

Workstation Requirements

Before we begin the hands-on portion, ensure that the following prerequisites are met:

  • You have an AWS account that you can use
  • Your local workstation is properly configured and the AWS CLI works with your account (AWS CLI Docs)
  • You have a basic understanding of AWS CDK (Getting started with CDK)
  • Node.js and npm are installed on your workstation (Install Node/NPM)
  • Your AWS environment has been bootstrapped properly (CLI-bootstrap)

Once these requirements are met, you're ready to proceed!

OIDC Stack

Before we begin with the deployment, we need to ensure that GitHub is connected to our AWS account. The first step is to modify the deploy/cdk/lib/components/oidc.ts file. This file contains an oidc class in which we specify which GitHub repository can assume the role:

import { Construct } from 'constructs';
import { ManagedPolicy } from 'aws-cdk-lib/aws-iam';
import { GithubActionsIdentityProvider, GithubActionsRole } from 'aws-cdk-github-oidc';

export class oidc {
  constructor(scope: Construct, rolename: string, repo: string, provider: GithubActionsIdentityProvider) {
    const accessSSMRole = new GithubActionsRole(scope, rolename, {
      provider: provider,
      owner: '<your_github_owner>',
      repo: repo,
      roleName: rolename
    });
    accessSSMRole.addManagedPolicy(ManagedPolicy.fromAwsManagedPolicyName('PowerUserAccess'));
  }
}

Adjusting the role

  • Replace <your_github_owner> with the owner of your GitHub repository.
  • Since this is test code, I have assigned the PowerUserAccess policy to simplify deployment.
  • ⚠️ This is NOT best practice. If you plan to use this in a production environment, consider implementing a custom policy with the necessary permissions (least privilege approach).
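
As an illustration only (not taken from the repository), a narrower alternative to PowerUserAccess could look like the sketch below. With a default CDK v2 bootstrap, the CDK CLI assumes the roles created by cdk bootstrap during deployment, so the GitHub Actions role mainly needs permission to assume those roles:

import { PolicyStatement } from 'aws-cdk-lib/aws-iam';

// Hypothetical least-privilege variant: instead of attaching PowerUserAccess,
// only allow the role to assume the cdk-* roles created by `cdk bootstrap`.
accessSSMRole.addToPolicy(new PolicyStatement({
  actions: ['sts:AssumeRole'],
  resources: ['arn:aws:iam::<your_aws_account>:role/cdk-*'],
}));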

Deploying the OIDC Stack
Once the role is adjusted, deploy the stack using the following commands:

cd deploy/cdk
npm i -D aws-cdk-github-oidc
npm run cdk deploy OidcStack -- --profile <your_aws_profile>

After a few seconds, the deployment should be complete, and runners triggered from our GitHub repository should be able to authenticate with AWS.

Defining the Role in the Main Stack
The role name can be defined in the main stack, located in deploy/cdk/lib/oidc-stack.ts:

new components.oidc(this, "github-actions-role", "aws-serverless-hands-on-template", provider);

Make sure to update the role name and repository name accordingly. Additionally, this change must be reflected in the GitHub Actions Workflow, which we will cover in the GitHub Actions Deployment section.

Backend Stack

The Backend Stack is responsible for creating the Lambda function, API Gateway, and DynamoDB table. The code for this stack is located in deploy/cdk/lib/backend-stack.ts.
The implementation is straightforward, but one important point to highlight is CORS (Cross-Origin Resource Sharing). Without CORS headers, the API Gateway would block frontend requests from CloudFront. To resolve this, we explicitly define the necessary CORS headers in the API Gateway response.
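
Before looking at CORS, here is a minimal sketch of how the table, the function, and the API could be wired together in CDK. Construct IDs, the table's sort key, and the data resource name are my assumptions for illustration; the actual backend-stack.ts in the repository is the source of truth.

import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { AttributeType, Table } from 'aws-cdk-lib/aws-dynamodb';
import { Code, Function, Runtime } from 'aws-cdk-lib/aws-lambda';
import { LambdaIntegration, RestApi } from 'aws-cdk-lib/aws-apigateway';

export class BackendStackSketch extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // DynamoDB table keyed by City, as described earlier
    // (Name as a hypothetical sort key so several visitors can be stored per city)
    const table = new Table(this, 'VisitorsTable', {
      partitionKey: { name: 'City', type: AttributeType.STRING },
      sortKey: { name: 'Name', type: AttributeType.STRING },
    });

    // Lambda function that handles the GET/POST logic
    const handler = new Function(this, 'ApiHandler', {
      runtime: Runtime.NODEJS_18_X,
      handler: 'index.handler',
      code: Code.fromAsset('lambda'),
      environment: { TABLE_NAME: table.tableName },
    });
    table.grantReadWriteData(handler);

    // REST API with a data resource backed by the Lambda function
    const api = new RestApi(this, 'VisitorsApi');
    const resource = api.root.addResource('data');
    const lambdaIntegration = new LambdaIntegration(handler);
    resource.addMethod('GET', lambdaIntegration);
    resource.addMethod('POST', lambdaIntegration);
  }
}

The CORS configuration described next is then layered on top of these addMethod calls and the resource.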

Configuring CORS in API Gateway
To allow the frontend to access the backend, we need to modify the API Gateway responses:

    resource.addMethod('GET', lambdaIntegration, {
        methodResponses: [
            {
                statusCode: '200',
                responseParameters: {
                    'method.response.header.Access-Control-Allow-Origin': true,
                    'method.response.header.Access-Control-Allow-Methods': true,
                    'method.response.header.Access-Control-Allow-Headers': true,
                },
            },
        ],
    });

Additionally, we define CORS options on API resources, ensuring that data can be shared between the backend and frontend:

    resource.addCorsPreflight({
        allowOrigins: ['*'],
        allowMethods: ['GET', 'POST', 'OPTIONS'],
        allowHeaders: ['Content-Type', 'X-Amz-Date', 'Authorization', 'X-Api-Key', 'X-Amz-Security-Token'],
    });

Exposing the API Gateway URL
This stack also outputs the API Gateway URL, which is used in the Frontend Stack to automate the deployment process. This prevents the need for manual updates.

new CfnOutput(this, 'API_URL', { value: api.url })

The API_URL is later updated dynamically in the frontend configuration via GitHub Actions, which we will cover in the CI/CD section.

Lambda Function
The Lambda function is located in the deploy/cdk/lambda/index.js file. The handler logic is straightforward:

  • GET request → Reads from DynamoDB. Returns 400 if no value is provided or 200 on success.
  • POST request → Writes to DynamoDB. Returns errors for a few predefined conditions.

Connecting to DynamoDB & CORS
At the beginning of the Lambda function, we initialize the DynamoDB client and define CORS headers to allow cross-origin access:

const dynamoClient = new DynamoDBClient();
const docClient = DynamoDBDocumentClient.from(dynamoClient);

const headers = {
    "Access-Control-Allow-Origin": "*", 
    "Access-Control-Allow-Methods": "OPTIONS,GET,POST",
    "Access-Control-Allow-Headers": "Content-Type",
};

This ensures that requests from the frontend can communicate with the backend without CORS restrictions.
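
Building on the snippet above (reusing docClient and headers), a simplified sketch of the handler could look like the following. The query parameter name, the TABLE_NAME environment variable, and the item attribute names are assumptions for illustration; the actual index.js in the repository may differ in detail, including in its module syntax.

import { PutCommand, QueryCommand } from "@aws-sdk/lib-dynamodb"; // may be a require() in the actual file

export const handler = async (event) => {
  try {
    if (event.httpMethod === "GET") {
      // Read path: return all records for the requested city
      const city = event.queryStringParameters?.city;
      if (!city) {
        return { statusCode: 400, headers, body: JSON.stringify({ message: "City is required" }) };
      }
      const result = await docClient.send(new QueryCommand({
        TableName: process.env.TABLE_NAME,
        KeyConditionExpression: "City = :city",
        ExpressionAttributeValues: { ":city": city },
      }));
      return { statusCode: 200, headers, body: JSON.stringify(result.Items ?? []) };
    }

    if (event.httpMethod === "POST") {
      // Write path: validate the mandatory fields and store the record
      const { city, name, yearOfBirth } = JSON.parse(event.body ?? "{}");
      if (!city || !name || !yearOfBirth) {
        return { statusCode: 400, headers, body: JSON.stringify({ message: "Missing mandatory fields" }) };
      }
      await docClient.send(new PutCommand({
        TableName: process.env.TABLE_NAME,
        Item: { City: city, Name: name, YearOfBirth: yearOfBirth },
      }));
      return { statusCode: 200, headers, body: JSON.stringify({ message: "Saved" }) };
    }

    return { statusCode: 405, headers, body: JSON.stringify({ message: "Method not allowed" }) };
  } catch (err) {
    console.error(err);
    return { statusCode: 500, headers, body: JSON.stringify({ message: "Internal error" }) };
  }
};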

Frontend Stack

Before diving into the details, I have to admit that I had no experience working on the frontend side. So, I teamed up with ChatGPT, which helped me generate a simple index.html, style.css, and script.js to make things work.

Main Structure (index.html)
The core part of the frontend is the index.html file. The key section is:

<body>
    <div class="container">
        <h1>Welcome to the Serverless App</h1>
        <h2>Please leave the trace and add yourself as visitor. You can then check the data if anyone is from your city!</h2>
        <button id="fetchVisitors">Fetch Data from Backend</button>
        <button id="addVisitor">Add Data</button>
        <pre id="output"></pre>
    </div>
    <script src="script.js"></script>
</body>

Key Elements

  • The two buttons (fetchVisitors and addVisitor) allow users to interact with the backend.
  • The script.js file is referenced at the bottom—it handles the logic when buttons are clicked.

Handling API Calls (script.js)
The first line of script.js defines the API_URL, which is dynamically replaced during the GitHub Actions (GHA) deployment step:

const API_URL = 'PLACEHOLDERdata';

During deployment, GitHub Actions replaces PLACEHOLDER with the actual API Gateway URL from the Backend Stack (leaving the data resource path appended), ensuring that the frontend communicates with the correct backend.

Event Listeners for Button Clicks
Inside script.js, we listen for the DOMContentLoaded event and define what happens when each button is clicked:

  • "Fetch Data from Backend" button → Sends a GET request to API Gateway.
  • "Add Data" button → Sends a POST request with user input to the backend. The logic is straightforward, so you can go through the code and let me know if you have any questions.

GitHub Actions Deployment

This is the most crucial part of our setup, as it automates the deployment of our stacks through a CI/CD pipeline. The workflow file is located in the .github/workflows folder. Let’s go through it step by step.

Workflow Configuration

name: Deploy serverless app to AWS
on:
  workflow_dispatch:
  # push:
  #   branches:
  #     - main

permissions:
  id-token: write
  contents: read
  • Apart from defining the workflow name, you should uncomment the push trigger (the commented lines above) in your forked repository.
  • I commented out those lines in the template repo to prevent automatic pipeline runs.
  • The workflow_dispatch event allows us to manually trigger the workflow (when working on the main branch).
  • The permissions section is crucial for configuring AWS credentials; it must be included for the workflow to function properly.

Job Definition

jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      AWS_ACCOUNT: '<your_AWS_ACCOUNT>'
  • We define one job, named deploy.
  • The job runs on Ubuntu (ubuntu-latest), a default GitHub runner with pre-installed packages.
  • The AWS account number is stored in the AWS_ACCOUNT environment variable.

Deployment Steps

Checkout Code

    steps:
      - name: Checkout Code
        uses: actions/checkout@v4
  • The first step is checking out the repository code.
  • Instead of writing custom scripts, we use a pre-built action from the GitHub Marketplace.

Set Up Node.js

      - name: Set Up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 18
  • This step ensures Node.js v18 is installed.

Install Dependencies & Run Tests

      - name: Install Dependencies & test
        run: |
          pushd /home/runner/work/aws-serverless-hands-on/aws-serverless-hands-on/deploy/cdk
          npm install -g aws-cdk
          npm install
          npm test
  • Installs the AWS CDK and the project dependencies, then runs the unit tests before proceeding.
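
As a side note, the tests under deploy/cdk/test are standard CDK assertion tests run by Jest. A minimal, hypothetical example of that style (not the exact tests from the repository; the stack import path and name are assumptions) looks like this:

import { App } from 'aws-cdk-lib';
import { Template } from 'aws-cdk-lib/assertions';
import { BackendStack } from '../lib/backend-stack'; // assumed export name and path

test('backend stack creates a DynamoDB table', () => {
  // Synthesize the stack and assert on the resulting CloudFormation template
  const app = new App();
  const stack = new BackendStack(app, 'TestBackendStack');
  const template = Template.fromStack(stack);

  template.resourceCountIs('AWS::DynamoDB::Table', 1);
});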

Configure AWS Credentials

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: arn:aws:iam::${{ env.AWS_ACCOUNT }}:role/github-actions-role
          aws-region: eu-west-1
  • This step assumes the IAM role created in the OIDC Stack, allowing GitHub Actions to deploy resources in our AWS account.

Deploy Backend and Frontend
Deploy Backend Stack

      - name: Deploy Backend
        id: backend
        run: |
          pushd /home/runner/work/aws-serverless-hands-on/aws-serverless-hands-on/deploy/cdk
          cdk deploy BackendStack --require-approval never --outputs-file cdk.outputs.json
          API_URL=$(jq -r '.["BackendStack"]["APIURL"]' cdk.outputs.json)
          echo "API_URL=$API_URL" >> $GITHUB_OUTPUT
  • The BackendStack is deployed first.
  • The API Gateway URL is captured in cdk.outputs.json and stored as API_URL.
  • We assign an ID (backend) to this step, making it easier to reference its output later.

Deploy Frontend Stack

      - name: Deploy Frontend
        run: |
          echo ${{ steps.backend.outputs.API_URL }}
          pushd /home/runner/work/aws-serverless-hands-on/aws-serverless-hands-on/deploy/cdk
          sed -i 's|PLACEHOLDER|'${{ steps.backend.outputs.API_URL }}'|g' frontend/script.js
          cdk deploy FrontendStack --require-approval never
  • The frontend deployment retrieves the API_URL from the backend deployment step.
  • It replaces the PLACEHOLDER in frontend/script.js with the actual API Gateway URL.
  • Finally, the FrontendStack is deployed.

Checking Deployment Output
The deployment might take up to 8-10 minutes. Once completed, the pipeline should be green.
To find the CloudFront URL, click on Deploy Frontend, scroll down a little, and look for the output:

Outputs:
FrontendStack.DistributionId = d1gmj3c7gkt5q0.cloudfront.net

It may also be helpful to see how this looks in practice:
[Screenshot of the deployment output]

Costs and Considerations

We did it! If everything went well, your serverless app should now be up and running.
If you encountered any issues or found some sections unclear, feel free to leave a comment!

Why This Exercise Matters
Personally, I believe that hands-on projects like this are an excellent way to gain real-world experience with AWS infrastructure, cloud services, and automation. While going through each stage carefully can be time-consuming, the knowledge and confidence you gain are absolutely worth it.

Costs Breakdown
The good news is that this hands-on project is extremely cost-effective.
On my personal AWS account, I did not pay a single dollar, even though my deployed stacks remained active for nearly a month with multiple tests. This is thanks to AWS's Free Tier, which provides generous limits across various services.
Here’s a quick breakdown of the AWS Free Tier benefits relevant to this project:
  • Lambda - 1 million requests per month are free.
  • API Gateway - 1 million requests per month are free for the first 12 months; even without the free tier, I would expect to spend less than $1 with the activity from these labs.
  • S3 - Upon sign-up, new AWS customers receive 5 GB of Amazon S3 storage in the S3 Standard storage class per month for the first 12 months.
  • DynamoDB - 25 GB of storage, along with 25 provisioned Write and 25 provisioned Read Capacity Units (WCU/RCU), which is enough to handle up to 200M requests per month.
  • CloudFront - 1 TB of data transfer out to the internet and 10,000,000 HTTP or HTTPS requests per month.

Sources - you can find them here.
Note: After the free-tier limits expire, costs will vary depending on usage. However, even moderate activity is unlikely to exceed a few dollars per month.

Final Thoughts
This project also demonstrates how cost-efficient serverless architectures can be. With AWS's pay-as-you-go pricing and the Free Tier, you can experiment and learn without worrying about high costs.

Thanks a lot for reading!

If you have any feedback, questions, or issues just let me know.
