This post describes automating Cloud Build triggered from a personal GitHub source repository using GitHub Actions, and pushing the resulting container to both Artifact Registry (for running in our GCP project) and Docker Hub (for other folks to access).
I've been putting off container build automation for a while. When we dipped our toes in last time we learned that Cloud Build can't connect to a personal repo via a service account. (You need to connect as the repo owner.) And I didn't want to put my personal GitHub token (and everything it can access) into the project.
Instead we added cloud-build.sh, a dead-simple script to submit the build job.
#!/bin/sh
gcloud builds submit --region=us-central1 --tag us-central1-docker.pkg.dev/deepcell-on-batch/deepcell-benchmarking-us-central1/benchmarking:gce
Then, build "automation" meant me running that script from my laptop after merging code.
Honestly: this was pretty good. As long as I'm the one merging PRs, it's quick to submit builds after merge. And I almost always remember 😅
But as we get closer to production this is becoming a hassle, and we need the container in both GCP Artifact Registry and on public Docker Hub. The client environment can pull public Docker Hub containers, but it can't access our Google project.
So I set out to automate the build and push to GCP and Docker Hub.
Since I couldn't connect Cloud Build, I wondered how hard it would be to do everything on GitHub Actions. I wanted the build in one spot if possible.
This post was just what I wanted.
I created a service account in the Google project and gave it the appropriate permissions:
- Artifact Registry Writer
- Cloud Build Service Account
- Service Account Token Creator
I discovered I needed these later:
- Logs Viewer
- Viewer (aka project viewer)
I also needed to grab the service account's JSON key and put it in the repo secrets.
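For reference, the setup looks roughly like this. It's a sketch, not exactly what I ran: the `github-builder` account name is illustrative, it grants the initial three roles above (the two "discovered later" roles come up again further down), and it assumes the `gh` CLI for uploading the repo secret.

```sh
# Create the service account (the name is hypothetical)
gcloud iam service-accounts create github-builder --project=deepcell-on-batch

# Grant the initial three roles
for role in roles/artifactregistry.writer \
            roles/cloudbuild.builds.builder \
            roles/iam.serviceAccountTokenCreator; do
  gcloud projects add-iam-policy-binding deepcell-on-batch \
    --member="serviceAccount:github-builder@deepcell-on-batch.iam.gserviceaccount.com" \
    --role="$role"
done

# Create a JSON key and store it as a GitHub repo secret
gcloud iam service-accounts keys create key.json \
  --iam-account=github-builder@deepcell-on-batch.iam.gserviceaccount.com
gh secret set GCP_SERVICE_ACCOUNT_KEY < key.json
```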
The Actions config was pretty simple:
name: Build and Push to Artifact Registry

on:
  push:
    branches: ["main"]

env:
  PROJECT_ID: deepcell-on-batch
  REGION: us-central1
  GAR_LOCATION: us-central1-docker.pkg.dev/deepcell-on-batch/deepcell-benchmarking-us-central1/benchmarking

jobs:
  build-push-artifact:
    runs-on: ubuntu-latest
    steps:
      - name: "Checkout"
        uses: "actions/checkout@v4"

      - id: "auth"
        uses: "google-github-actions/auth@v2"
        with:
          credentials_json: "${{ secrets.GCP_SERVICE_ACCOUNT_KEY }}"

      - name: "Set up Cloud SDK"
        uses: "google-github-actions/setup-gcloud@v2"

      - name: "Use gcloud CLI"
        run: "gcloud info"

      - name: "Docker auth"
        run: |-
          gcloud auth configure-docker ${{ env.REGION }}-docker.pkg.dev --quiet

      - name: Build image
        run: docker build . --file Dockerfile --tag ${{ env.GAR_LOCATION }}
        working-directory: container

      - name: Push image
        run: docker push ${{ env.GAR_LOCATION }}
But, alas, no luck. The GitHub build failed with this mysterious message:
#12 26.21 ERROR: THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS FILE. If you have updated the package versions, please update the hashes. Otherwise, examine the package contents carefully; someone may have tampered with them.
#12 26.21 unknown package:
#12 26.21 Expected sha256 2507549cb11e34a79f17ec8bb847d80afc9e71f8d8dffbe4c60baceacbeb787f
#12 26.21 Got bdb08e5ff6de052aed79acf4d6e16c5e09bac12559c6c234511b6048e2d08342
#12 26.21
#12 ERROR: process "/bin/bash -c pip install --user --upgrade -r requirements.txt" did not complete successfully: exit code: 1
An unknown package has an unexpected SHA: cool 🙃
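(Aside: that error comes from pip's hash-checking mode, which kicks in when requirements.txt pins `--hash` values; any mismatch between a pinned digest and the downloaded artifact aborts the install. If the pins were simply stale, regenerating them would look something like this sketch, which assumes the file is compiled from a `requirements.in` with pip-tools:)

```sh
# Hash-pinned entries in requirements.txt look like:
#   somepackage==1.2.3 \
#     --hash=sha256:2507549cb11e34a79f17ec8bb847d80afc9e71f8d8dffbe4c60baceacbeb787f
# Regenerate all pins and hashes:
pip-compile --generate-hashes requirements.in -o requirements.txt
```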
I really didn't want to fiddle around with the build, and I knew Cloud Build works. So I decided to convert the GitHub action to a Cloud Build trigger.
Because I was no longer building & pushing to just Google Artifact Registry, I needed a Cloud Build config file (build-batch-container.yaml). I also needed a Docker Hub service account.
steps:
  - name: 'gcr.io/cloud-builders/docker'
    entrypoint: 'bash'
    args: ['-c', 'echo "$$PASSWORD" | docker login --username=$$USERNAME --password-stdin']
    secretEnv: ['USERNAME', 'PASSWORD']

  - name: 'gcr.io/cloud-builders/docker'
    args: [
      'build',
      '-t', 'us-central1-docker.pkg.dev/deepcell-on-batch/deepcell-benchmarking-us-central1/benchmarking:batch',
      '-t', 'dchaley/deepcell-imaging:batch',
      '.',
    ]

  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'us-central1-docker.pkg.dev/deepcell-on-batch/deepcell-benchmarking-us-central1/benchmarking:batch']

  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'dchaley/deepcell-imaging:batch']

availableSecrets:
  secretManager:
    - versionName: projects/deepcell-on-batch/secrets/dockerhub-password/versions/1
      env: 'PASSWORD'
    - versionName: projects/deepcell-on-batch/secrets/dockerhub-username/versions/2
      env: 'USERNAME'
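The availableSecrets block reads the Docker Hub credentials out of Secret Manager, so those secrets have to exist and be readable by Cloud Build's service account. Creating them looks roughly like this sketch (substitute your own credential values, and note PROJECT_NUMBER is the numeric project number, not the project ID):

```sh
# Store Docker Hub credentials in Secret Manager
echo -n "my-dockerhub-username" | gcloud secrets create dockerhub-username --data-file=-
echo -n "my-dockerhub-password" | gcloud secrets create dockerhub-password --data-file=-

# Let the default Cloud Build service account read them
for secret in dockerhub-username dockerhub-password; do
  gcloud secrets add-iam-policy-binding "$secret" \
    --member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
    --role="roles/secretmanager.secretAccessor"
done
```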
At that point, I just needed to replace the build & push image steps with one that triggers Cloud Build. This is where the Logs Viewer & project Viewer permissions come in: the gcloud CLI needs them to pull in the logs during the build.
- name: Trigger cloud build
  run: "gcloud builds submit --region=us-central1 --config=build-batch-container.yaml ."
  working-directory: container
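In case it helps anyone hitting permission errors on log streaming, granting those two roles looks something like this (same hypothetical github-builder account as the sketch above):

```sh
# Roles gcloud needs to stream Cloud Build logs back to the runner
for role in roles/logging.viewer roles/viewer; do
  gcloud projects add-iam-policy-binding deepcell-on-batch \
    --member="serviceAccount:github-builder@deepcell-on-batch.iam.gserviceaccount.com" \
    --role="$role"
done
```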
Et voilà, the container builds on each commit to main, and gets pushed to both Artifact Registry & public Docker Hub. 🎉
This experience made it clear to me that it's really better to keep repos in an organization: that removes the need for GitHub Actions entirely, since the build infrastructure can connect directly to the source.
I'm still curious what the unexpected SHA in the unexpected package was about though 🤔