Jonas Scholz

5 Tricks to Speed up Docker Image Builds

Docker image builds can significantly impact development workflow speed, especially in larger projects. Here are five proven techniques we use at Sliplane to cut build times!


Compression

Docker images consist of multiple layers that need to be compressed for storage and transfer. By default, Docker uses gzip compression, which has been the standard choice for years. However, the newer zstd compression algorithm offers notably better performance while achieving similar or better compression ratios, particularly for larger images.

According to benchmarks from Depot, zstd consistently outperforms gzip in build times while maintaining excellent compression. To enable zstd compression with BuildKit, use:

docker buildx build --output type=image,compression=zstd -t my-image .

Pull Through Cache

When building Docker images, downloading base images from DockerHub can become a bottleneck, especially with larger images or slower internet connections. A pull-through cache solves this by creating a local proxy that stores copies of your frequently used images.

The cache works transparently: the first pull retrieves the image from DockerHub, but subsequent pulls serve the image from your local cache, significantly reducing build times. To learn how to set it up, check out the official documentation!
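One common way to set this up (a sketch; the hostname, port, and container name are placeholders to adapt to your environment) is to run Docker's open source registry image as a mirror and point the daemon at it:

```shell
# Run the open source registry as a pull-through cache for Docker Hub
docker run -d --name registry-mirror -p 5000:5000 \
  -e REGISTRY_PROXY_REMOTEURL=https://registry-1.docker.io \
  registry:2

# Then tell the Docker daemon to try the mirror first by adding
# the following to daemon.json and restarting Docker:
#   { "registry-mirrors": ["http://localhost:5000"] }
```

After restarting the daemon, pulls go through the mirror transparently; only cache misses hit DockerHub.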

Layer Ordering

Docker builds images using layers, and understanding how layer caching works is crucial for optimization. Each instruction in your Dockerfile creates a layer, and Docker can reuse unchanged layers from previous builds. However, when a layer changes, all subsequent layers must be rebuilt.

Strategic ordering of Dockerfile instructions can dramatically improve build times. The key principle is to place instructions that change frequently (like application code) after instructions that change rarely (like dependency installation). Here's an optimized example for a Node.js application:

FROM node:20
WORKDIR /app
# Copy only the dependency manifests first, so this layer
# (and the npm install below) is cached until they change
COPY package.json package-lock.json ./
RUN npm ci
# Copy the rest of the source last, since it changes most often
COPY . .

This structure ensures that time-consuming operations like npm install only run when dependencies actually change, not when you modify source code. The performance impact can be substantial - rebuilds that previously took minutes can complete in seconds.
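You can take this one step further with BuildKit's cache mounts (a sketch; requires BuildKit): even when package.json changes and the install layer must rerun, npm can reuse its download cache from previous builds instead of fetching everything from scratch.

```dockerfile
# syntax=docker/dockerfile:1
FROM node:20
WORKDIR /app
COPY package.json package-lock.json ./
# Persist npm's download cache across builds; only changed
# dependencies are downloaded, the rest come from the cache mount
RUN --mount=type=cache,target=/root/.npm npm ci
COPY . .
```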

BuildKit

BuildKit represents a significant advancement in Docker's build architecture, offering many performance improvements over the legacy builder. Key benefits include:

  • Parallel processing of independent build stages
  • More efficient layer caching
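The parallelism shows up naturally with multi-stage builds. In a sketch like this (stage names and paths are hypothetical), the two independent stages share no layers, so BuildKit can build them concurrently:

```dockerfile
# "frontend" and "backend" have no dependency on each other,
# so BuildKit builds both stages in parallel
FROM node:20 AS frontend
WORKDIR /app
COPY frontend/ .
RUN npm ci && npm run build

FROM golang:1.22 AS backend
WORKDIR /src
COPY backend/ .
RUN go build -o server .

# The final image copies artifacts from both stages
FROM alpine:3.20
COPY --from=frontend /app/dist /srv/www
COPY --from=backend /src/server /usr/local/bin/server
CMD ["server"]
```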

To enable BuildKit in your environment, set:

DOCKER_BUILDKIT=1 docker build -t my-image .

For permanent enablement, add { "features": { "buildkit": true } } to your Docker daemon configuration.
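On Linux, that configuration file typically lives at /etc/docker/daemon.json; the complete file might look like this (restart the daemon afterwards for it to take effect):

```json
{
  "features": {
    "buildkit": true
  }
}
```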

With newer Docker versions (Engine 23.0 and later), BuildKit is already the default builder.

CI/CD

While local builds are convenient during development, offloading builds to CI/CD systems offers several advantages:

  • Dedicated build resources that don't impact your development machine
  • Consistent build environments
  • Parallel building of multiple images (like matrix builds)
  • Automated testing and deployment integration

Here's a practical GitHub Actions workflow that handles Docker builds:

name: Build Docker Image
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Build
        run: docker build -t my-image .

This configuration automatically triggers builds on GitHub's infrastructure whenever code is pushed, freeing your local resources for development work. This will probably only be faster if you need to run a lot of parallel builds or if your development machine is very slow! CI/CD will of course never be as fast as your top-of-the-line MacBook :)
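One thing that makes CI builds much faster is persisting layers between runs; a sketch using the docker/build-push-action with the GitHub Actions cache backend (the image tag is a placeholder) looks like this:

```yaml
name: Build Docker Image
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/setup-buildx-action@v3
      - name: Build with layer cache
        uses: docker/build-push-action@v6
        with:
          context: .
          tags: my-image
          # Store and restore BuildKit layers via the GitHub Actions cache
          cache-from: type=gha
          cache-to: type=gha,mode=max
```

With cache-from/cache-to configured, unchanged layers are restored from the cache on each run instead of being rebuilt from scratch.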

Conclusion

At sliplane.io we automatically build your Docker images on every git push, so optimizing build performance is crucial for us! Do you have any other ideas for improving build performance? Let me know in the comments!

Cheers,
Jonas
