
Márk Sági-Kazár


The Perfect Development Environment 2022

Originally published at https://sagikazarmark.hu.


I'm a huge fan of Developer Experience and I continuously chase The Perfect Development Environment every day. Not just for my sake, but to make the development process as painless as possible for my peers. But after pursuing the dream environment for years, I had to realize it's a moving target: technology evolves every day and development processes have to evolve as well.

Previously on The Perfect Development Environment

I praised Makefiles above all else before. Then modern build systems (e.g. Please) became my new religion. I'm pretty sure I believed (and preached) that PHP and Node.js projects had to use Composer and NPM (and nothing else) as task runners at some point in my career. I even remember rewriting build scripts from Grunt to Gulp, then from Gulp to the next fancy JavaScript task runner over and over again.

I've used various tools and methods in various projects over the years, but one thing was always common: whatever tools I used for development, I always aimed to use the same tools in CI to build and deploy the projects I worked on.

It wasn't always easy, but it seemed to be a good practice: it removed the burden of maintaining two sets of tools (one for dev, one for CI) and it made building CI pipelines easier (by making them thin layers of integration around dev tools).

It never occurred to me to do things the other way around, until Dagger came along.

The root of all evil: portability across platforms

If everyone in the world used Linux on x86, I wouldn't be writing about developer experience, because there really would be only one perfect development environment for everyone to use and the world would be a much happier place.

Unfortunately (or not?), this is not the case. Developers use and target various platforms during development, and that makes development processes and tooling infinitely more complex: different architectures, different package managers, different available software versions and tons of other factors contribute to making portability across environments a pain. Even reproducing identical environments on different versions of the same OS has proved to be a challenge.

Around 2013, Docker came to life and quickly revolutionized development environments. It wasn't true portability, of course, since the technology under Docker required Linux, but it introduced an abstraction layer that made packaging and running applications across various (Linux) platforms much easier.

Nowadays, using Docker for development is trivial, even on Windows and macOS, and it has essentially resolved the portability issue for a large number of use cases.

Dev vs CI

As I mentioned earlier, using the same tools for development and running them in CI pipelines is not always easy, because they don't always serve the same purposes.

For example: you don't necessarily want to run all test suites during development, just the ones you are working on. So in your local development environment, you need tools that can selectively run tests, but in CI you want to run them all. If you want to reuse the same tools in CI, you need something that supports both.

But you also want to be able to run all tests locally, because that's what the CI does and that's what decides whether your change will be accepted or not at the end of the day.

In other words: you want to be able to do and run all things the CI does in your local environment.

New kid on the block: Dagger

Dagger markets itself as a "new way to create CI/CD pipelines" and a "portable devkit for CI/CD pipelines". The first closed beta was released earlier this year and was quickly followed by a public release in March.

The idea behind Dagger is that instead of running CI pipelines on a central server (often written in a platform-specific DSL), you can run them anywhere Docker is available. It doesn't mean you should stop running or using a CI server, but it puts building and running those pipelines into a new perspective:

What if we could build, debug and run those pipelines locally?
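To make the question concrete, here is a minimal sketch of what such a pipeline can look like with the Dagger Go SDK. Treat it as an illustration rather than a recipe: the early Dagger releases were configured in CUE and the SDKs came later, and the image tag and paths below are made up for the example.

```go
package main

import (
	"context"
	"fmt"
	"os"

	"dagger.io/dagger"
)

func main() {
	ctx := context.Background()

	// Connect to the Dagger engine (a working Docker installation is enough).
	client, err := dagger.Connect(ctx, dagger.WithLogOutput(os.Stderr))
	if err != nil {
		panic(err)
	}
	defer client.Close()

	// Load the project source from the host.
	src := client.Host().Directory(".")

	// Run the test suite inside a Go container.
	out, err := client.Container().
		From("golang:1.20"). // illustrative image tag
		WithDirectory("/src", src).
		WithWorkdir("/src").
		WithExec([]string{"go", "test", "./..."}).
		Stdout(ctx)
	if err != nil {
		panic(err)
	}

	fmt.Println(out)
}
```

The same program runs unchanged on a developer machine and on a CI runner, which is exactly what makes the question above more than a thought experiment.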

Dev + CI

So here is the novel idea (at least compared to my earlier beliefs): instead of integrating dev tools into CI pipelines, let's just build a pipeline that you can run both locally and on a CI server.

Whenever you want to confirm that your changes adhere to the current rules enforced by the CI (aka. everything is green), you can just run the pipeline on your machine.

With that taken care of, you can create a set of development tools that you only use for development. You don't have to worry about running them on the CI.

To give you an example of how this looks in practice: I have a Go project where I use Dagger. I can run tests and the static analysis tools locally and on a CI service (e.g. GitHub Actions). At the same time, I have a Makefile (or lately, a Taskfile) for dev tasks, like formatting code (often using the same static analysis tools) or running code generators.
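As a sketch of how the static analysis step can plug into such a pipeline (the function name and image tag are made up for illustration; it reuses the client and source directory from the test sketch above):

```go
// lint runs golangci-lint against the mounted source in a container.
// The image tag is illustrative; pin whatever version the project uses.
func lint(ctx context.Context, client *dagger.Client, src *dagger.Directory) error {
	_, err := client.Container().
		From("golangci/golangci-lint:v1.50.1").
		WithDirectory("/src", src).
		WithWorkdir("/src").
		WithExec([]string{"golangci-lint", "run"}).
		Stdout(ctx) // reading the output forces the step to execute
	return err
}
```

In a setup like this, you run the pipeline locally with something like go run ./ci (a hypothetical path), and the GitHub Actions workflow shrinks to checking out the code and invoking the same command, while the dev-only tasks (formatting, code generation) stay in the Makefile or Taskfile and never touch CI.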

If you want to see more examples, feel free to check out my repositories on GitHub. I use Dagger combined with a task runner in more and more projects.


Conclusion

I can't say this is going to be The Perfect Development Environment, because I don't believe it exists anymore. Even if it does, it's only perfect for a moment or until the next, better thing comes along.

Building and running CI pipelines on their own instead of integrating dev tools into them is an exciting new perspective and I'm looking forward to trying it in large, established projects. Not having to worry that a build that already passed on my machine will break in CI feels extremely comforting so far.

On the other hand, Dagger is extremely young so I wouldn't start rewriting every single CI pipeline just yet.
