Hey, I’m Malarvizhi! Thank you for your interest in my first blog! In this post, we’ll explore the Software Development Lifecycle (SDLC) and see how CI/CD pipelines streamline traditional processes for faster, more reliable software delivery.
In today’s fast-paced digital world, software development relies on structured methodologies to deliver reliable and high-quality products. One such methodology is the Software Development Lifecycle (SDLC) — a step-by-step framework that guides the development of software from inception to maintenance.
With the rise of CI/CD (Continuous Integration/Continuous Deployment) pipelines, SDLC has become more automated and agile, allowing teams to rapidly deliver updates while maintaining high standards. In this blog, we’ll explore the SDLC, its phases, and how modern CI/CD pipelines enhance its efficiency.
What is SDLC?
SDLC is a structured framework used to design, develop, test, and deploy software applications. It covers the entire process, from the creation of software to its deployment in the production environment.
OK, so what is the need for SDLC? Well, it ensures software (or an application) is well-managed, delivered on time, and meets both business and quality standards, ultimately enhancing customer satisfaction and helping the organization build a stronger brand reputation.
For example, a company is planning to launch a new e-commerce website that lets customers browse products, make purchases, and track orders. The company wants to ensure the site is user-friendly, secure, and performs well under traffic spikes. To get there, the company must follow the SDLC process to create the application and ship it to the production environment. Let’s see how the SDLC plays out in this example.
First, let’s discuss the phases of the SDLC and how we can build an e-commerce application across those phases.
The SDLC consists of several phases:
- Planning — Goal: Define project goals, scope, and resources.
Example — covers the scope of how the website should look, the features it needs (product catalog, shopping cart, and payment gateway), and the target launch date.
- Requirement Analysis — Goal: Understand what the software must do and gather functional and non-functional requirements.
Example — ensures the business requirements from stakeholders are clearly understood, such as enabling easy browsing, checkout systems, and efficient order tracking.
- Design — Goal: Architect the software with workflows, modules, and database schemas.
Example — helps to plan the website’s architecture. The development team would design the website’s UI and define how the application will interact with the database to store user data, product details, and transaction history.
- Development (Implementation) — Goal: Write and integrate code based on design documents.
Example — The development team writes the code to build the website’s functionality, such as the product catalog, shopping cart, and checkout system. The code is tracked using version control and committed regularly to record progress.
Note: why track with version control? Because it allows developers to keep track of code changes over time.
- Testing — Goal: Identify and fix bugs through functional and performance testing.
Example — Thorough testing is conducted to ensure the website works as expected: unit tests verify each feature’s functionality; integration tests verify the entire application works together as expected; performance testing ensures the website can handle a high number of users during peak traffic times.
- Deployment — Goal: Launch the software in a production environment.
Example — Here the company uses deployment tools to push the application into the production environment, where it becomes publicly accessible.
- Maintenance — Goal: Provide updates, fix bugs, and ensure long-term performance.
Example — After the website is live, the development team monitors its performance and addresses any issues that arise, such as bug fixes and regular updates.
Now that we have covered the phases of the SDLC with an example, let’s check how CI/CD pipelines are used to automate these phases in modern development.
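Before moving on, the version-control note from the Development phase can be sketched with a few git commands. This is a minimal illustration; the repository path, file name, and commit messages are made up:

```shell
# Each commit records who changed what and why, so the team can audit
# or revert any update to the e-commerce code.
set -e
repo=/tmp/vc-demo
rm -rf "$repo" && mkdir -p "$repo" && cd "$repo"
git init -q
echo "cart v1" > cart.txt
git add cart.txt
git -c user.name=dev -c user.email=dev@example.com commit -qm "Add shopping cart"
echo "cart v2" > cart.txt
git add cart.txt
git -c user.name=dev -c user.email=dev@example.com commit -qm "Fix cart total"
git rev-list --count HEAD   # prints 2: two recorded changes
```

Every change is now a numbered point in history the team can inspect, compare, or roll back.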
CI/CD process in SDLC:
What processes does CI/CD bring to the SDLC? Build, Test, Package, and Deploy.
How do these processes relate to the SDLC phases? Build (Development phase), Test (Testing phase), Package (Testing phase), Deploy (Deployment phase).
Let’s discuss what developers did to meet business needs and on-time delivery before CI/CD. Before CI/CD pipeline automation, developers worked largely by hand; the process was slower and prone to errors.
Here is an overview: code was written and integrated manually; testing was done manually or with limited automation tools; once testing was complete, the application was manually packaged into deployable formats (e.g., JAR or WAR files); deployment involved copying files to servers manually or running custom scripts. Overall, it was time-consuming for developers.
Transition to CI/CD:
CI/CD replaced these manual processes with automated pipelines, enabling faster builds, automated testing, seamless deployments, and quick feedback loops. Let’s check what each part does :)
Continuous Integration (CI):
- Automates the integration of code changes into a shared repository.
- Example: Developers push code to a GitHub repository, and Jenkins is triggered to automatically build and test the application.
Continuous Deployment (CD):
- Ensures the software is always ready for deployment.
- Example: Artifacts like Docker containers are stored in a repository like JFrog Artifactory, ready for staging or production.
Let’s not get confused; we will break this down now. We have discussed the processes a CI/CD pipeline workflow needs to complete the SDLC. Within that workflow, an organization might follow one of two methods. Let’s break them down.
METHOD 1:
This process is handled entirely within Jenkins, which builds, tests, packages, and deploys the software:
Up to Phase 3, the process is the same; from Phase 4 of the SDLC onward, the following CI/CD workflow takes place. Let’s explore.
Build — When Jenkins receives updates/triggers from repositories (like GitHub, GitLab, Bitbucket), it pulls the latest code, builds and compiles it, and resolves any dependencies.
Test — Jenkins runs automated tests — including unit and integration tests — to verify the code functions as expected.
Package — After successful testing, Jenkins packages the code into a deployable build artifact. The build artifact could be an executable file (e.g., a JAR file, WAR file, ZIP file, or Docker image).
Deployment — Jenkins automatically pushes the software to CD tools like Ansible, Kubernetes, or other deployment tools for staging and production environments.
Key Points: This process compiles and packages code directly into deployable artifacts and sends them for deployment without intermediate steps. It works best for small-scale applications or environments that don’t require storing artifacts in a centralized repository for reuse or auditing. Since there’s no artifact repository (like JFrog Artifactory or Docker Hub) in this workflow, artifacts move directly from Jenkins to the deployment tool.
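As a rough illustration of Method 1, a declarative Jenkinsfile might look like the sketch below. It assumes a Maven-based Java project; the Maven goals, playbook name, and artifact path are hypothetical, not taken from a real project:

```groovy
// Method 1 sketch: build, test, package, and deploy all inside Jenkins,
// with no artifact repository in between.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B compile'   // pull dependencies and compile
            }
        }
        stage('Test') {
            steps {
                sh 'mvn -B test'      // run the automated test suite
            }
        }
        stage('Package') {
            steps {
                sh 'mvn -B package'   // e.g., produces target/shop.war
            }
        }
        stage('Deploy') {
            steps {
                // hand the artifact straight to the deployment tool
                sh 'ansible-playbook deploy.yml -e war=target/shop.war'
            }
        }
    }
}
```

Notice there is no "publish artifact" stage: the WAR file goes directly from the Jenkins workspace to the deployment tool, which is exactly what makes this method simple but harder to audit.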
METHOD 2:
Build — When Jenkins receives triggers for updated code from repositories (like GitHub, GitLab, Bitbucket), it pulls the latest code, builds and compiles it, and resolves dependencies.
Test — Jenkins runs automated tests, including unit and integration tests, to ensure the code works as expected.
Package —
- Jenkins packages the application into a build artifact, such as a JAR file, WAR file, or Docker image.
- The packaged artifacts are pushed to an artifact repository (e.g., JFrog Artifactory, AWS, Azure, GCP) using Jenkins Artifactory plugins.
Note: Here Jenkins does not know where the artifact repository is located; however, Jenkins has plugins that integrate it into CI/CD pipelines.
- For additional containerization: Jenkins retrieves the artifacts from the artifact repository, creates Docker images using a Dockerfile that defines the container environment, and pushes the Docker images back to the artifact repository for storage and later use.
- Once the above process and security checks are done, developers retrieve the build artifacts from the repository for deployment.
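The Dockerfile mentioned in the containerization step could be as small as the sketch below. The base image tag and WAR file name are assumptions for illustration:

```dockerfile
# Sketch: package the e-commerce web app as a container image.

# Servlet container that will run the packaged WAR
FROM tomcat:9.0-jdk17
# Deploy the build artifact as the root web application
COPY target/shop.war /usr/local/tomcat/webapps/ROOT.war
# Port the site is served on
EXPOSE 8080
# Start Tomcat in the foreground so the container stays alive
CMD ["catalina.sh", "run"]
```

Building this file (`docker build -t mycompany/shop:1 .`) produces the Docker image that gets pushed back to the artifact repository.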
Key Points: This process includes the use of an artifact repository like JFrog Artifactory, Docker Hub, AWS, GCP, or Azure for centralized storage and distribution of build artifacts. It is common in large-scale applications or containerized environments where multiple teams need access to the same artifacts, or where artifacts need to be stored for auditing, compliance, or reusability. The use of an artifact repository provides better version control, traceability, and scalability.
Deployment — After the above processes, Jenkins triggers a deployment job that passes the artifact to a CD tool (such as Ansible for server configuration or Kubernetes for container orchestration). In other words, the CI tool (Jenkins) shares the artifact with CD tools (Ansible or Kubernetes) for production deployment.
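Putting Method 2 together, a Jenkinsfile could be sketched along these lines. The registry, credentials ID, image name, and deployment name are hypothetical; `withCredentials` comes from the Jenkins Credentials Binding plugin:

```groovy
// Method 2 sketch: the artifact passes through a registry (Docker Hub here)
// before a CD tool rolls it out.
pipeline {
    agent any
    environment {
        IMAGE = "mycompany/shop:${env.BUILD_NUMBER}"   // versioned image tag
    }
    stages {
        stage('Build & Test') {
            steps { sh 'mvn -B verify' }               // compile + run tests
        }
        stage('Package & Publish') {
            steps {
                sh 'docker build -t $IMAGE .'
                withCredentials([usernamePassword(credentialsId: 'dockerhub',
                        usernameVariable: 'USER', passwordVariable: 'PASS')]) {
                    sh 'echo $PASS | docker login -u $USER --password-stdin'
                    sh 'docker push $IMAGE'            // artifact now in registry
                }
            }
        }
        stage('Deploy') {
            steps {
                // the CD side pulls the published image from the registry
                sh 'kubectl set image deployment/shop shop=$IMAGE'
            }
        }
    }
}
```

The key difference from Method 1 is the `Package & Publish` stage: every build leaves a versioned, reusable artifact in the registry before anything reaches production.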
And lastly, after the deployment phase, the application moves on to continuous monitoring.
Maintenance —
Provide updates, fix bugs, and ensure long-term performance.
Example: Regularly patching vulnerabilities and releasing new features based on user feedback.
Real-World Example: Jenkins and Ansible for an E-Commerce Website
Let’s see how Jenkins and Ansible work together when deploying an e-commerce website with a new payment gateway feature:
Code Integration:
- Developers push code updates to a GitHub repository.
- Jenkins automatically fetches the code and runs unit tests to verify the payment gateway functionality.
Build and Artifact Storage:
- Jenkins creates a Docker image of the website and stores it in a secure repository (e.g., Docker Hub or JFrog Artifactory).
Deployment:
- Kubernetes pulls the Docker image, deploys it to production, and manages automatic scaling during high-traffic periods.
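The deployment step above could be expressed as a Kubernetes manifest like the sketch below. The names, image tag, and replica count are illustrative assumptions:

```yaml
# Sketch of a Kubernetes Deployment that pulls the website image from
# the registry and keeps three replicas running.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: shop
spec:
  replicas: 3                        # scaled out to absorb traffic spikes
  selector:
    matchLabels:
      app: shop
  template:
    metadata:
      labels:
        app: shop
    spec:
      containers:
        - name: shop
          image: mycompany/shop:42   # Docker image published by Jenkins
          ports:
            - containerPort: 8080
```

For the automatic scaling mentioned above, a HorizontalPodAutoscaler could be attached to this Deployment so Kubernetes adjusts the replica count with load.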
This automated pipeline accelerates delivery while maintaining quality standards, minimizing human error, and meeting business objectives.
Note: CI and CD tools vary based on project needs. A few CI and CD tools are listed for follow-up :)
Other CI Tools: Jenkins, GitLab CI, Travis CI, CircleCI, TeamCity, Azure DevOps, etc.
Other CD Tools: Bamboo, Ansible, Argo CD, Spinnaker, Harness, BuildKit
Conclusion:
The SDLC forms the backbone of structured software development, while modern CI/CD pipelines enhance it with speed, automation, and reliability. This powerful combination enables teams to rapidly deliver high-quality applications that meet both user expectations and business needs.
Let me know your thoughts on SDLC and CI/CD! Share your experiences or questions in the comments below!
In the next blog, we’ll explore how to integrate security into the SDLC, transforming it into the Secure SDLC (SSDLC) to ensure robust, secure software development. Stay tuned!