TL;DR: YouTube
I can't stand it when my computer is disorganized.
It's weird, too - because ADHD causes many of my living spaces to be cluttered... but my computer? NEVER. I guess it makes sense, because it's the place I spend most of my time - so even if the rest of my world is piles of clutter, full of things I'll never need again but can't bring myself to discard, at least my computer is organized!
And then came coding.
Oh, don't get me wrong - my coding projects are all neatly foldered and easily accessible. The problem is the tooling. Many projects have a one-off tool or technology that I'm integrating with - a database or some other component that's needed for one specific project, but not for all of them.
It can get messy.
Then I've got tons of half-working config files lying around from trying to get the tool up and running, data fragments and other leftovers installed, system daemons running that aren't necessary unless I'm working on that project... it's frustrating, y'all! I used to set aside time on a regular basis to completely reimage my computer and reinstall the tools I used frequently (usually after completing a project with some of those "extra" one-off components in it)... but that takes a lot of time and energy, and it's just not fun. There's gotta be a better way!
Docker can help!
Oh sure, Docker is a great tool for running an app on a server... but running things locally could be really helpful too! Think about it - an easy-to-clean-up tooling environment, where you can be sure that everything can be found in just a few files.
The Anatomy of a Container Installation
1. The Container Image
Your Docker image is the basis upon which you build. It's the basic, unconfigured application that you're running - think for example of "mysql" or "mongodb" or "sonarqube" or "jenkins". In a traditional environment, you'd "install the application" and then set about configuring it - in Docker, you'll create the configuration you want to use and feed it to Docker, which will use the base image and add your configuration onto it.
2. The Dockerfile
Your Dockerfile explains how to configure the base image for your own purposes. This is where you might set the administrator password, declare the port the application listens on, and configure the application's settings flags (though secrets and per-run settings are often passed in as environment variables at run time instead).
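As a sketch, a Dockerfile for a project database might look something like this - the image tag, password, database name, and config file path here are all illustrative, not prescriptive:

```dockerfile
# Start from the official MySQL base image
FROM mysql:8.0

# Build-time configuration: admin password and a default database
# (for anything truly secret, pass these at run time instead)
ENV MYSQL_ROOT_PASSWORD=devpassword
ENV MYSQL_DATABASE=myproject

# Document the port the server listens on
EXPOSE 3306

# Copy in project-specific server configuration
COPY my.cnf /etc/mysql/conf.d/my.cnf
```

Build it once with `docker build -t myproject-db .` and the whole "installation" lives in that one file.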
3. The Volume(s)
Docker container images are immutable - the image itself never changes - and anything a running container writes inside itself is ephemeral. While this is fine for some applications, others (like a database) are going to have problems with it: remove the container, and all its data disappears! Volumes give you the ability to create a data store elsewhere on your system and mount it into the container, so your application can persist data to disk across container lifetimes.
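In a compose file, that looks something like this (the service and volume names are just examples):

```yaml
# docker-compose.yml snippet: a named volume keeps the database files outside the container
services:
  db:
    image: mysql:8.0
    volumes:
      - db-data:/var/lib/mysql   # the container path where MySQL writes its data

volumes:
  db-data:   # named volume managed by Docker; survives container removal
```

Delete and recreate the container all you like - the data in `db-data` sticks around until you explicitly remove the volume.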
4. The Container
The other three items combine to create a container instance. This is the configured version of the image that actually runs in your environment.
See how this can keep your system tidy? Delete the image, the container instance, and any data in the volumes, and everything's gone. (Yes, I know I left the Dockerfile out - you keep that in case you need to recreate everything later!) Since all of these items are in known locations, you could even make the process completely automatic.
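If the project is defined in a compose file, that automatic cleanup can be a one-liner (assuming Docker is installed and you're in the project directory):

```shell
# Stop and remove the containers, the named volumes, AND the images
# the compose file used -- leaving only your Dockerfile/compose file behind
docker-compose down --volumes --rmi all
```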
Scenario: comparing tools
Imagine that you're building a new CI/CD system but you aren't sure what tool will work best in your stack. Maybe you've narrowed it down to 3 competitors. In a traditional environment, you'd have to create 3 different server environments, or risk the installations colliding with each other. Using containers, you can run all 3 different products on the same server, accessible on different ports, and minimize the amount of hardware allocated to the experiment.
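As a sketch, a single compose file could stand up all three candidates side by side - the specific tools and port mappings here are illustrative:

```yaml
# docker-compose.yml: three CI/CD candidates on one host, one port each
services:
  jenkins:
    image: jenkins/jenkins:lts
    ports:
      - "8080:8080"   # Jenkins UI
  gitea:
    image: gitea/gitea:latest
    ports:
      - "3000:3000"   # Gitea UI
  drone:
    image: drone/drone:2
    ports:
      - "8081:80"     # Drone UI on a non-colliding host port
```

When the experiment's over, tear the losers down and nothing lingers on the host.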
Scenario: Onboarding a new teammate
Imagine that you're hiring a new developer for your team. Choose what you'd like to do:

1. Spend all day installing all the tools they'll need one by one, ensuring that they all have the proper configurations and don't collide with each other in nasty ways.
2. Clone your company's Standard-Developer-Environment repo and run `docker-compose up`.
Worst-case, you have to customize the Dockerfile just a bit with the newbie's credentials or something... but overall, running your tools as a family of local containers means you instantly move your new teammate from "onboarding" to "producing". Why would we ever do it any other way???
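A hypothetical Standard-Developer-Environment repo might contain little more than a compose file like this (the services, versions, and credentials are examples, not a recommendation):

```yaml
# docker-compose.yml: the whole team's local tool stack in one file
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: localdev   # throwaway local-only credential
    ports:
      - "5432:5432"
    volumes:
      - pg-data:/var/lib/postgresql/data   # database survives container removal
  cache:
    image: redis:7
    ports:
      - "6379:6379"

volumes:
  pg-data:
```

One `docker-compose up` later, the new hire has the same database and cache as everyone else on the team.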
Wrapping up
Maybe you're already working like this - for the sake of productivity, I hope I'm late to the party on this one. But somehow, I suspect that I'm not.
Containerization has long been preached as a way to improve app delivery from development to production... but we can get some major developer experience wins by applying the same techniques to the development tooling we use!
Do you have a great docker setup that you use to install your stack tools? Drop a link to the repo below!
And of course, tune in next week as we start this season's build! We'll begin by running a mongo database in a container... and then later we'll craft our app to use it. It's gonna be lots of fun... don't miss out!