Janith Disanayake

Dumriya Live - AWS Oriented Full Stack Application Infrastructure Overview

Introduction

  • This project is based on a serverless architecture built around AWS Lambda.
  • The backend of this application is developed with Express.js.
  • It follows an MVC architecture within a serverless framework.
  • MongoDB Atlas (a cloud-hosted MongoDB service) is used as the database.

Services and Tools

Cloud Services

  • Lambda – Serverless computing service (AWS)
  • S3 – Object storage service (AWS)
  • API Gateway – Managed service for creating and managing APIs (AWS)
  • CloudFront – Content delivery network (CDN) (AWS)
  • ECR – Elastic Container Registry for Docker images (AWS)

Database

  • MongoDB Atlas – Cloud-based database service (MongoDB)

DevOps & CI/CD

  • GitHub Actions – CI/CD automation and workflows

Development Tools

  • Neovim – Main editor
  • Swagger UI – Tool for API documentation and interaction
  • JWT – Token-based authentication

Domain & DNS Services

  • Name.com – Domain registration and DNS services
  • Cloudflare – DNS Services

API

API Architecture

(Image: API architecture diagram)

API Gateway

API Gateway handles incoming traffic before it reaches Lambda and provides protection for every endpoint.

Lambda Function

  • Handles all API requests sent by users.
  • Uses a container image to create the server.
  • The image can run on Lambda as well as on a normal server (it can be configured to run on an EC2 instance by setting an environment variable).
  • JavaScript is the programming language.
  • Express.js is the framework.
  • Uses the serverless-http package to run the Express app on Lambda (see the sketch below).
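
A minimal sketch of how an Express app can be wrapped with serverless-http to run on Lambda. The route, file name, and the RUN_AS_SERVER variable are illustrative assumptions, not taken from the actual codebase:

```javascript
// handler.js – minimal sketch of wrapping Express with serverless-http
const express = require("express");
const serverless = require("serverless-http");

const app = express();
app.use(express.json());

// Illustrative endpoint; the real routes follow the project's MVC structure
app.get("/health", (req, res) => res.json({ status: "ok" }));

// Lambda entry point: serverless-http translates API Gateway events into
// standard Node.js request/response objects that Express understands
module.exports.handler = serverless(app);

// The same image can also run as a normal server (e.g. on EC2) when an
// environment variable selects that mode (variable name assumed)
if (process.env.RUN_AS_SERVER === "true") {
  app.listen(3000, () => console.log("Listening on port 3000"));
}
```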

Main Endpoints

Schemas

  • Train – Representation of a train that travels along the track.
  • TrainInput – Input model for creating or updating train details.
  • TrainLive – Representation of a live train that is currently running. Updated every 5 minutes by a dedicated Lambda function; not updated by users or by the train directly.
  • TrainLiveLog – Stores the data received from the train's IoT device.
  • Station – Representation of a station.
  • StationInput – Input model for creating or updating station details.
  • Route – Representation of the route, i.e. the train track (the pathway of the train).
  • RouteInput – Input model for creating or updating route details.
  • Schedule – Representation of the schedule for when a train is set to run on a specific track at a specific time.
  • ScheduleInput – Input model for creating or updating schedule details.
  • Device – Sub-model representing the engine's device and the devices that authorized users carry.
  • GeoLocation – Sub-model used to store location data. Shared by the Station, Route, and other models (see the sketch below).
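
To make the shared sub-model concrete, here is a minimal Mongoose sketch of the GeoLocation sub-schema embedded in the Station model. All field names are assumptions for illustration, not the project's actual schema definitions:

```javascript
// models/station.js – illustrative sketch only; field names are assumed
const mongoose = require("mongoose");

// GeoLocation sub-model, shared by Station, Route, and other models
const geoLocationSchema = new mongoose.Schema(
  {
    latitude: { type: Number, required: true },
    longitude: { type: Number, required: true },
  },
  { _id: false }
);

// Station model embedding the shared GeoLocation sub-schema
const stationSchema = new mongoose.Schema({
  name: { type: String, required: true },
  location: { type: geoLocationSchema, required: true },
});

module.exports = mongoose.model("Station", stationSchema);
```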

Frontend

Frontend Architecture

(Image: frontend architecture diagram)

SvelteKit Overview

SvelteKit is a modern framework for building web applications using Svelte. It features file-based routing and server-side rendering, optimizing performance and user experience.

Package Configuration

The package.json file defines the project's dependencies and scripts (a sketch is shown after the list below):

  • Scripts: Key commands for development and testing, such as:
    • dev: Starts the development server.
    • build: Compiles the application for production.
    • test: Runs unit and integration tests.
  • Dependencies:
    • devDependencies: Tools needed during development, including SvelteKit, Vite, ESLint, and Prettier.
    • dependencies: Essential for production, including dotenv for managing environment variables.
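
As a rough illustration of this configuration, a SvelteKit project's package.json typically looks like the following. The package name and version ranges are placeholders, not the project's real values:

```json
{
  "name": "dumriya-live-frontend",
  "scripts": {
    "dev": "vite dev",
    "build": "vite build",
    "test": "vitest"
  },
  "devDependencies": {
    "@sveltejs/kit": "^2.0.0",
    "vite": "^5.0.0",
    "eslint": "^9.0.0",
    "prettier": "^3.0.0"
  },
  "dependencies": {
    "dotenv": "^16.0.0"
  }
}
```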

Integration with Dumriya-Live API

  • The client-side application makes API calls to api.dumriya.live to fetch real-time data, enabling features like live updates, data visualization, or interactions based on the Dumriya-Live service.
  • The application uses environment variables (e.g., API_URL) to securely store and access the Dumriya-Live API endpoint and other sensitive information (see the sketch below).
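
A minimal sketch of how such a call might look in a SvelteKit load function, shown here as a server-side load for simplicity (the actual app may call the API from the client). The route, endpoint path, and response handling are assumptions for illustration; only the API_URL variable name follows the description above:

```javascript
// src/routes/trains/+page.server.js – illustrative sketch only
import { API_URL } from "$env/static/private";

// SvelteKit load function: fetches live train data before the page renders
export async function load({ fetch }) {
  const response = await fetch(`${API_URL}/trains/live`);

  if (!response.ok) {
    return { trains: [], error: "Failed to load live train data" };
  }

  return { trains: await response.json() };
}
```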

CI/CD Pipeline

Pipeline Diagram

(Image: CI/CD pipeline diagram)

CI/CD Workflow: Deploying a Dockerized AWS Lambda Function

Overview

This workflow automates the deployment of a Dockerized AWS Lambda function, triggered on pushes to the prod branch. It utilizes GitHub Actions, AWS services, and Docker to ensure a streamlined, secure, and reliable deployment process.

Key Features

  1. GitHub Actions: Automates the CI/CD pipeline with a structured sequence triggered on prod branch updates.
  2. Secure Credential Management: Leverages GitHub Secrets to securely store AWS credentials.
  3. AWS Elastic Container Registry (ECR): Stores and manages Docker images for the Lambda function.
  4. Docker Integration: Builds, tags, and pushes Docker images, ensuring consistent runtime environments.
  5. AWS CLI: Authenticates and interacts with AWS services, such as logging in to ECR.
  6. Region-Specific Deployment: Operates in the AWS us-east-1 region to ensure environment alignment.

Workflow Breakdown

1. Event Trigger

  • Condition: Activated by a push to the prod branch.

2. Steps in the Workflow

  • Checkout Code: Pulls the latest code from the repository using actions/checkout.
  • Set up Docker Buildx: Configures Docker Buildx for advanced build capabilities.
  • Configure AWS Credentials: Authenticates with AWS using secrets for AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and the region us-east-1.
  • Log in to Amazon ECR: Logs into Amazon Elastic Container Registry (ECR) to allow image pushes.
  • Build Docker Image: Builds the Docker image with environment variables from secrets (ENVIRONMENT, SECRET_KEY, MONGODB_URI).
  • Tag Docker Image: Tags the built image as web-api:latest.
  • Push Docker Image to ECR: Pushes the tagged image to the specified ECR repository.
  • Update Lambda Function: Updates the AWS Lambda function with the new image from ECR (an equivalent SDK call is sketched below).
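
The final step runs through the AWS CLI inside the workflow; purely as an illustration of what it does, here is an equivalent sketch using the AWS SDK for JavaScript. The function name and image URI are placeholders:

```javascript
// update-lambda.js – illustrative equivalent of the workflow's last step
const { LambdaClient, UpdateFunctionCodeCommand } = require("@aws-sdk/client-lambda");

const client = new LambdaClient({ region: "us-east-1" });

async function updateFunction() {
  // Point the Lambda function at the freshly pushed container image in ECR
  const result = await client.send(
    new UpdateFunctionCodeCommand({
      FunctionName: "dumriya-live-web-api", // placeholder name
      ImageUri: "<account-id>.dkr.ecr.us-east-1.amazonaws.com/web-api:latest",
    })
  );
  console.log("Deployment started:", result.LastUpdateStatus);
}

updateFunction().catch((err) => {
  console.error("Lambda update failed:", err);
  process.exit(1);
});
```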

Benefits

  • Automation: Reduces manual intervention by automating key stages like building and deploying Docker images.
  • Security: Protects sensitive credentials using GitHub Secrets and applies AWS best practices.
  • Consistency: Docker ensures predictable application behavior across environments.
  • Scalability: Supports easy scaling and adaptation to different environments or regions.

This workflow integrates essential tools and services to enable secure, fast, and reliable deployment of containerized AWS Lambda functions in production.

CI/CD Workflow: Build and Deploy to S3

Overview

This CI/CD workflow automates the process of building a Node.js project and deploying it to an Amazon S3 bucket. It is triggered by pushes to the prod branch (or a specified branch) and integrates GitHub Actions, AWS S3, and environment variables for a seamless deployment process.

Key Features

  1. GitHub Actions: Automates the build and deployment pipeline with a defined sequence triggered on prod branch updates.
  2. Secure Credential Management: Uses GitHub Secrets to securely store sensitive information, such as API keys and AWS credentials.
  3. Node.js Environment: Sets up and manages the required Node.js runtime and dependencies.
  4. Build Process:
    • Utilizes environment variables such as MAP_API_KEY and API_URL during the build stage for dynamic configuration.
    • Executes the project’s build script using npm run build.
  5. AWS S3 Integration:
    • Uses AWS CLI to sync the built files to the specified S3 bucket.
    • Ensures efficient updates by removing outdated files with the --delete flag.

Workflow Breakdown

1. Event Trigger

  • Activated by a push to the prod branch (customizable for other branches).

2. Steps in the Workflow:

  • Checkout Code: Pulls the latest code from the repository using actions/checkout.
  • Set Up Node.js Environment: Configures the specified Node.js version (default is 18).
  • Install Dependencies: Installs project dependencies using npm install.
  • Build Project:
    • Reads environment variables such as MAP_API_KEY and API_URL from GitHub Secrets.
    • Builds the project using the npm run build command.
  • Deploy to S3:
    • Authenticates with AWS using credentials from GitHub Secrets.
    • Syncs the built project to the configured S3 bucket, ensuring an up-to-date deployment.

Benefits

  • Automation: Eliminates manual tasks for building and deploying the application, improving efficiency.
  • Security: Ensures sensitive credentials and environment variables are managed securely via GitHub Secrets.
  • Scalability: Easily adaptable to different S3 buckets, environments, or Node.js versions.
  • Reliability: Ensures consistency by using a defined build process and removing outdated files from S3.

This workflow provides a secure, automated, and efficient solution for building and deploying Node.js projects to Amazon S3, ensuring seamless updates and reliable performance in production.


References

Links for the applications

Repositories

Documents

Special Thanks
