DEV Community

Pravesh Sudha

Mastering Cost Optimisation with Shell Scripting: Automate Log Storage in S3 for Budget-Friendly Infrastructure

Learn how to leverage Shell scripting to streamline Jenkins log management and reduce cloud storage costs with AWS S3

💡 Introduction

Welcome to the world of DevOps! Today, we are diving into shell scripting by creating a script that will help reduce our infrastructure costs by storing Jenkins logs in AWS S3 rather than VMs.

Imagine a company like Google, which runs thousands of microservices, each generating logs, metrics, and traces that are stored on servers. Popular observability stacks such as ELK or EFK collect different types of logs, including:

  • Application Logs: High-priority logs used to troubleshoot applications.
  • Kubernetes Control-Plane Logs: High-priority logs for troubleshooting cluster issues.
  • Infrastructure Logs: Logs from tools like Jenkins or Terraform.

Infrastructure logs are typically not as critical to store on servers. For example, if a Jenkins build fails, notifications are sent via email or Slack, enabling instant troubleshooting. However, retaining these logs for backup and restoration purposes is important.

To address this, we’ve created a shell script that uploads Jenkins build logs to S3, ensuring cost optimisation while maintaining access to logs when needed.


💡 Prerequisites

Before starting the project, ensure the following requirements are met:

  • An AWS Account.
  • Basic understanding of Shell Scripting.
  • Basic understanding of AWS S3.
  • Jenkins Installed on your system.

💡 Setting Up the Environment

Before we create the script, we need:

  1. A Sample Jenkins Project:
    • Create a pipeline project in Jenkins named hello-world-project using a Hello World template.


  • Run the pipeline 3–4 times to generate 3–4 log files.


  2. An S3 Bucket:
    • Create an S3 bucket. For example: bucket-for-jenkins-logs (ensure the name is unique).
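If you prefer the CLI over the console, the bucket can be created in one command (a sketch, assuming the bucket name from this post and the us-east-1 region; your name must be globally unique):

```shell
# Create the destination bucket for Jenkins logs
# (assumption: region us-east-1; pick your own unique bucket name)
aws s3 mb s3://bucket-for-jenkins-logs --region us-east-1
```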


  3. Configure AWS CLI:
    • Generate an access key from AWS IAM.
    • Use the command aws configure to set up your AWS CLI.
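As a non-interactive alternative to the `aws configure` prompts, you can set each value directly and then verify the credentials before running the script (a sketch; the key values and region are placeholders you must replace):

```shell
# Non-interactive equivalent of `aws configure`
# (assumption: you created this access key in IAM for a user with S3 write access)
aws configure set aws_access_key_id     YOUR_ACCESS_KEY_ID
aws configure set aws_secret_access_key YOUR_SECRET_ACCESS_KEY
aws configure set region                us-east-1

# Sanity check: confirms the CLI can authenticate before you run the script
aws sts get-caller-identity
```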


  4. Locate the Jenkins Home Directory:
    • Go to Jenkins > Manage Jenkins > System. The Jenkins home directory path is displayed at the top.
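If you'd rather find it from a terminal, a small helper can probe the usual default locations (a sketch; `find_jenkins_home` is a hypothetical helper, and the candidate paths are common defaults for macOS, package installs, and Docker, not guaranteed for your setup):

```shell
#!/bin/bash
# find_jenkins_home: print the first candidate directory that contains a
# jobs/ subdirectory, which is how Jenkins lays out its build history.
find_jenkins_home() {
    for candidate in "$@"; do
        if [ -d "$candidate/jobs" ]; then
            echo "$candidate"
            return 0
        fi
    done
    return 1
}

# Common default locations; adjust for your install.
find_jenkins_home "$HOME/.jenkins" /var/lib/jenkins /var/jenkins_home \
    || echo "Jenkins home not found; check Manage Jenkins > System"
```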



💡 Writing the Shell Script

Now, let’s create the script that will automate the upload process:

The Script: s3upload.sh

#!/bin/bash

######################
## Author: Pravesh-Sudha
## Description: Shell script to upload Jenkins logs to S3 bucket
## Version: v1
######################

# Variables
JENKINS_HOME="/Users/praveshsudha/.jenkins"  # Replace with your Jenkins home directory
S3_BUCKET="s3://bucket-for-jenkins-logs"     # Replace with your S3 bucket name
DATE=$(date +%Y-%m-%d)                       # Today's date

# Check if AWS CLI is installed
if ! command -v aws &> /dev/null; then
    echo "AWS CLI is not installed. Please install it to proceed."
    exit 1
fi

# Iterate through all job directories
for job_dir in "$JENKINS_HOME/jobs/"*/; do
    [ -d "$job_dir" ] || continue  # skip the unexpanded glob if no jobs exist
    job_name=$(basename "$job_dir")

    # Iterate through build directories for the job
    for build_dir in "$job_dir/builds/"*/; do
        [ -d "$build_dir" ] || continue
        # Get build number and log file path
        build_number=$(basename "$build_dir")
        log_file="$build_dir/log"

        # Check if the log file exists and was modified today
        # (date -r prints a file's modification time on both GNU and BSD date)
        if [ -f "$log_file" ] && [ "$(date -r "$log_file" +%Y-%m-%d)" = "$DATE" ]; then
            # Upload the log file to S3, named after the job and build number
            if aws s3 cp "$log_file" "$S3_BUCKET/$job_name-$build_number.log" --only-show-errors; then
                echo "Uploaded: $job_name/$build_number to $S3_BUCKET/$job_name-$build_number.log"
            else
                echo "Failed to upload: $job_name/$build_number"
            fi
        fi
    done
done

Running the Script

  1. Grant Execution Permissions (execute permission for the owner is enough; avoid the overly permissive chmod 777):
   chmod +x s3upload.sh
  2. Execute the Script:
   ./s3upload.sh
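Because the script only picks up logs modified today, you would typically run it once per day. One way to do that (an assumption on my part, not part of the original setup) is a cron entry; the script path is a placeholder for wherever you saved s3upload.sh:

```shell
# Example crontab entry (edit with `crontab -e`): run the upload script
# daily at 23:50 and append its output to a log file in your home directory.
# /path/to/s3upload.sh is a placeholder.
50 23 * * * /path/to/s3upload.sh >> "$HOME/s3upload.log" 2>&1
```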

The script will:

  • Iterate through Jenkins job and build directories.
  • Identify logs created on the current date.
  • Upload logs to your specified S3 bucket with appropriate naming.
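The "created today" check is the part most worth understanding. A standalone sketch of that filter, using a temporary file in place of a Jenkins log:

```shell
#!/bin/bash
# Demo of the date filter used in s3upload.sh: compare a file's
# modification date (date -r works on both GNU and BSD date) to today.
DATE=$(date +%Y-%m-%d)
tmp_log=$(mktemp)   # freshly created, so its modification date is today
if [ "$(date -r "$tmp_log" +%Y-%m-%d)" = "$DATE" ]; then
    echo "would upload: $tmp_log"
fi
rm -f "$tmp_log"
```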



💡 Conclusion

Congratulations! 🎉 You’ve successfully created a shell script to automate the process of uploading Jenkins logs to S3, reducing costs and ensuring seamless log management.

But there’s more! AWS S3 supports Lifecycle Management, allowing you to define rules to automatically transition older or less important logs to cheaper storage classes like Glacier or Deep Archive. These options provide even greater cost savings for infrequently accessed logs.
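A lifecycle rule can also be applied from the CLI. A minimal sketch, assuming the bucket name from this post and illustrative retention periods (30 days to Glacier, deletion after a year) that you should adjust to your own policy:

```shell
# Example lifecycle rule: transition all objects to Glacier after 30 days
# and expire (delete) them after 365 days.
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "archive-jenkins-logs",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
EOF

aws s3api put-bucket-lifecycle-configuration \
    --bucket bucket-for-jenkins-logs \
    --lifecycle-configuration file://lifecycle.json
```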

This project demonstrates how a simple shell script, combined with cloud services, can solve real-world challenges in a DevOps workflow.

Keep experimenting, keep learning, and most importantly, keep scripting! 😊

🚀 For more informative blogs, follow me on Hashnode, X (Twitter), and LinkedIn.
