💡 Introduction
Welcome to the world of DevOps! Today, we’re diving into an exciting tool that solves the classic software engineering problem:
"It worked on my machine, but not on yours!"
Yep, we’ve all been there. But thanks to Docker, this headache is now a thing of the past!
In this blog, we’ll explore how Docker has revolutionized the DevOps industry, saving companies millions of dollars in computing costs. We’ll also get hands-on by creating a Dockerfile for a cloud-native monitoring application that tracks CPU and memory usage.
So, grab your coffee, and let’s get started! 🚀
💡 What is Docker? Understanding Containers & Why They Matter
Docker is a platform that allows us to develop, ship, and run applications in containers.
Wait… containers? Shipping? What’s going on here? 🚢
Let’s slow down and break it down step by step.
Docker in a Nutshell
Think of Docker as a tool that helps us create a blueprint of our application. Using this blueprint, we can run the application inside a container.
But what exactly is a container in Docker? 🤔
A container is like a lightweight, isolated package that includes everything needed to run an application—its code, dependencies, libraries, and configuration—while sharing the host machine's OS kernel.
Here’s the basic workflow of using Docker:
We create a Dockerfile (this is the exact name it should have).
Inside the Dockerfile, we define how our application should be built and run.
Using some Docker commands, we generate a Docker Image—this is the actual blueprint of our application.
Finally, we use the Docker Image to run a Docker Container, where our application will actually execute.
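To make those four steps concrete, here's a minimal, hypothetical Dockerfile for a single-file Python script (the file name app.py and the demo tag below are illustrative assumptions, not part of this post's project):

```dockerfile
# Steps 1–2: the Dockerfile defines how the app is built and run
FROM python:3.9-slim        # start from a small Python base image
COPY app.py .               # put our code inside the image
CMD ["python", "app.py"]    # what the container executes on start
```

Step 3 would then be docker build -t demo:v1 . to produce the image, and step 4 would be docker run demo:v1 to start a container from it.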
Why Use Docker? Isn’t This More Complicated?
At first, Docker might seem complex, but here’s why it’s a game changer:
🔹 Efficient Resource Usage:
Before Docker, we typically ran each application on its own Virtual Machine (VM) in the cloud. Every VM carries a full guest operating system, which wastes resources and drives up costs. Docker containers share the host's OS kernel instead, so they are lightweight, start in seconds, and many can run on a single VM—making applications much faster and more efficient.
🔹 Consistency Across Environments:
When we create a Dockerfile, we choose a base image for our project (like Ubuntu, Alpine, or Python). This ensures that all developers and servers use the exact same environment, preventing the "works on my machine but not yours" issue.
🔹 Multiple Applications on One Machine:
With Docker, we can run multiple isolated applications on the same VM. Before containers, running several applications on one VM risked dependency conflicts, so teams often dedicated a VM to each application. Containers remove that constraint, which means better resource utilization and lower cloud costs.
Now that we understand why Docker is awesome, let’s move forward and build something cool with it! 🚀
💡 Installing Docker & Running Your First Container
Enough theory—let’s jump into action! 🏃💨
Step 1: Installing Docker
First, we need to install Docker on our system. Here’s how:
✅ Windows & macOS Users: Download Docker Desktop from Docker’s official website. It provides an easy-to-use GUI along with the Docker CLI.
✅ Linux Users: You can install Docker using terminal commands. If you’re on Ubuntu, run:
sudo apt update -y
sudo apt install docker.io -y
sudo usermod -aG docker $USER && newgrp docker
docker --version # This will display the installed Docker version.
Once installed, Docker is ready to roll! 🎉
Step 2: Understanding DockerHub & Image Management
Earlier, we talked about Dockerfiles 📝 helping us create Docker images 📦. But what if we lose the image we created? Our blueprint would be gone!
To avoid this, we have DockerHub—a cloud-based repository where we can store and share Docker images, just like GitHub for code.
Public Repositories: Open for everyone to use.
Private Repositories: Only accessible to authorized users.
This allows teams to collaborate and deploy applications consistently across different environments.
Step 3: Running Your First Docker Container
Let’s test Docker by running a demo container:
docker run -d -p 8080:80 docker/welcome-to-docker
🔍 Breaking it Down:
🔹 docker run → Runs a new container.
🔹 -d → Runs the container in detached mode (in the background).
🔹 -p 8080:80 → Maps port 8080 on your local machine to port 80 inside the container.
🔹 docker/welcome-to-docker → The image we are running, pre-configured to display a welcome message.
Now, open your web browser and go to http://localhost:8080.
You should see a Docker Congratulations page! 🎉
This page is served by the docker/welcome-to-docker image, which contains a simple web server (NGINX or Apache) displaying the welcome message.
Now that you’ve run your first Docker container, let’s move on to building our own! 🚀
💡 Building Our Own Docker Image: Cloud Monitoring App
Now that we’ve experimented with a demo Docker container, it’s time to build our own Docker image! 🚀
We’ll use a Flask-based cloud monitoring application that displays CPU and memory metrics of the machine it runs on.
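Before we touch Docker, it helps to see how such metrics can be read at all. The real app's internals live in the repo; the snippet below is only a rough, dependency-free sketch of how a monitoring app might gather its numbers (it reads /proc/meminfo, so it's Linux-specific—many apps use the psutil library instead):

```python
import os

def memory_used_percent():
    """Parse /proc/meminfo (Linux) and return used memory as a percentage."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            info[key] = int(rest.strip().split()[0])  # values are in kB
    total = info["MemTotal"]
    available = info.get("MemAvailable", info["MemFree"])
    return round(100 * (total - available) / total, 2)

def cpu_load_1min():
    """Return the 1-minute load average, a rough CPU-pressure signal."""
    return os.getloadavg()[0]
```

A Flask route in an app like this would simply call such functions and render the numbers on each request.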
Step 1: Cloning the Project Repository
First, let’s clone the project from GitHub:
git clone https://github.com/Pravesh-Sudha/cloud-native-monitoring-app.git
Next, navigate inside the project directory:
cd cloud-native-monitoring-app
Inside this directory, you’ll find a Dockerfile. Let’s break it down!
Step 2: Understanding the Dockerfile
Here’s the Dockerfile for our Flask-based cloud monitoring app:
# 1️⃣ Set the base image
FROM python:3.9-slim-buster
# 2️⃣ Set the working directory inside the container
WORKDIR /app
# 3️⃣ Copy the dependency file into the container
COPY requirements.txt .
# 4️⃣ Install dependencies
RUN pip3 install --no-cache-dir -r requirements.txt
# 5️⃣ Copy the entire project into the container
COPY . .
# 6️⃣ Set environment variables
ENV FLASK_RUN_HOST=0.0.0.0
# 7️⃣ Expose the application’s port
EXPOSE 5000
# 8️⃣ Define the command to run the Flask app
CMD ["flask", "run"]
Breaking It Down:
📌 FROM python:3.9-slim-buster
→ Sets our base image. We're using python:3.9-slim-buster, a lightweight Debian-based image with Python 3.9. Slim images are smaller, making the container more efficient.
📌 WORKDIR /app
→ Sets the working directory inside the container. Just like we create a folder for a project, we define /app as the directory where our application code will live inside the container.
📌 COPY requirements.txt .
→ Copies requirements.txt into the container. This file lists all the Python dependencies our Flask app needs.
📌 RUN pip3 install --no-cache-dir -r requirements.txt
→ This installs the required Python packages.
📌 COPY . .
→ This copies all our application files into the container.
📌 ENV FLASK_RUN_HOST=0.0.0.0
→ Sets an environment variable. FLASK_RUN_HOST=0.0.0.0 makes Flask listen on all network interfaces instead of just localhost, so the app is reachable from outside the container.
📌 EXPOSE 5000
→ Documents that the app listens on port 5000, Flask's default port. (EXPOSE is informational only—the port is actually published with -p when you run the container.)
📌 CMD ["flask", "run"]
→ This defines the command that will start our Flask app.
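The FLASK_RUN_HOST=0.0.0.0 setting deserves one extra illustration: binding to 0.0.0.0 means "listen on every network interface," while the usual default of 127.0.0.1 is loopback-only and unreachable through Docker's port mapping. A tiny stdlib sketch of the difference (nothing here is specific to Flask or this repo):

```python
import socket

def bound_address(host):
    """Bind a throwaway TCP socket to `host` and report the address used."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind((host, 0))          # port 0: let the OS pick any free port
    addr = s.getsockname()[0]
    s.close()
    return addr

print(bound_address("127.0.0.1"))  # 127.0.0.1 → loopback only
print(bound_address("0.0.0.0"))    # 0.0.0.0 → all interfaces
```

Inside a container, a server bound only to 127.0.0.1 would accept connections from within the container but never from the host port you map with -p.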
Step 3: Building the Docker Image
Now, let’s create a Docker image from this project. Run the following command:
docker build -t my-app:v1 .
🔹 -t my-app:v1 → Tags the image as my-app with version v1.
🔹 . → Tells Docker to use the current directory as the build context.
Step 4: Running the Docker Container
Once the image is built successfully, let’s run it as a Docker container:
docker run -d -p 5000:5000 my-app:v1
🔹 -d → Runs the container in detached mode (in the background).
🔹 -p 5000:5000 → Maps port 5000 on your local machine to port 5000 inside the container.
Now, open your browser and go to http://localhost:5000.
🎉 You should see the application running!
Each time you refresh the page, the CPU and memory usage metrics will update dynamically.
With that, we've successfully built and run our Dockerized cloud monitoring app!
💡 Conclusion
Congratulations! 🎉 You’ve just taken your first steps into the world of Docker by learning how it solves real-world software deployment problems, running a demo container, and even Dockerizing a Flask-based cloud monitoring application.
Here’s a quick recap of what we covered:
✅ What is Docker? A platform for building, shipping, and running applications in lightweight containers.
✅ Why use Docker? It solves compatibility issues, improves resource utilization, and allows multiple applications to run efficiently on the same machine.
✅ Hands-on Docker: We installed Docker, ran a sample container, created a Dockerfile, built a Docker image, and launched a Docker container for our project.
This is just the beginning! Docker is a powerful tool widely used in DevOps, CI/CD pipelines, and cloud deployments. The next step? Try experimenting with Docker Compose, push your image to Docker Hub, or deploy your containerized app to Kubernetes! 🚀
If you found this blog helpful, feel free to share it with your fellow DevOps enthusiasts! Have questions? Drop them in the comments—I’d love to help. 😊
🔹 Happy Dockering! 🐳
✨ For more informative blogs, follow me on Hashnode, X (Twitter), and LinkedIn.