Matthew Casperson

Private LLMs for GitHub Actions

GitHub has been an enthusiastic adopter of AI, with its Copilot platform supporting developers as they understand, write, and debug software. However, it is not easy to use Copilot in an automated fashion from your GitHub Actions workflows.

SecondBrain is a new action that supports the use of LLMs inside GitHub Actions workflows. It works by deploying Ollama as a Docker container to host LLMs and then calling Ollama via a custom CLI that automates the process of constructing Retrieval-Augmented Generation (RAG) prompts embedding the details of Git commits.

SecondBrain works like this (a rough sketch of the flow in code follows the list):

  1. You pass in the Git commit SHAs you wish to query
  2. You pass in a GitHub token, which is used to query the GitHub REST API for the details of the commits
  3. You define a prompt for the LLM; the prompt can assume access to a summary of the Git commits referenced by the SHAs
  4. SecondBrain queries GitHub for the details of the commits associated with the SHAs, summarizes the commit diffs, places the summaries into the prompt context, and then passes the context and your prompt to the LLM
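
The prompt construction is handled for you by the SecondBrain CLI, so you never write this code yourself, but conceptually the first half of the flow looks something like the Python sketch below. This is not the action's actual implementation: the function names and prompt layout are illustrative assumptions. The only concrete API used is the GitHub REST API commits endpoint and the JSON it returns.

import requests

GITHUB_API = "https://api.github.com"


def fetch_commit(owner: str, repo: str, sha: str, token: str) -> dict:
    # GET /repos/{owner}/{repo}/commits/{sha} returns the commit message and
    # a "files" array containing the per-file patches.
    response = requests.get(
        f"{GITHUB_API}/repos/{owner}/{repo}/commits/{sha}",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


def summarize_commit(commit: dict) -> str:
    # Flatten the commit message and diffs into plain text for the prompt context.
    lines = [commit["commit"]["message"]]
    for changed_file in commit.get("files", []):
        lines.append(changed_file["filename"])
        # Binary files have no textual patch, so fall back to an empty string.
        lines.append(changed_file.get("patch", ""))
    return "\n".join(lines)


def build_prompt(prompt: str, summaries: list[str]) -> str:
    # Place the commit summaries ahead of the user's prompt, RAG style.
    context = "\n\n".join(summaries)
    return f"Context:\n{context}\n\nQuestion:\n{prompt}"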

The following is a sample workflow YAML file that generates a summary of each commit to the main branch:

name: Summarize the commit

on:
  workflow_dispatch:
  push:
    branches:
      - main

jobs:
  summarize:
    runs-on: ubuntu-latest
    steps:
      - name: SecondBrainAction
        id: secondbrain
        uses: mcasperson/SecondBrain@main
        with:
            prompt: 'Provide a summary of the changes from the git diffs. Use plain language. You will be penalized for offering code suggestions. You will be penalized for sounding excited about the changes.'
            token: ${{ secrets.GITHUB_TOKEN }}
            owner: ${{ github.repository_owner }}
            repo: ${{ github.event.repository.name }}
            sha: ${{ github.sha }}
      - name: Get the diff summary
        env:
            RESPONSE: ${{ steps.secondbrain.outputs.response }}
        run: echo "$RESPONSE"

The output of the Get the diff summary step looks like this:

Here is a summary of the changes:

The README.md file now indicates that the sha input is required and cannot have a default value [2]. The action.yml file has also removed the option to provide a default value for the sha input [1].

- [1]: The action.yml file was also updated to remove the default value for the sha input (52b40e59684d17e5fddc95c4dba3cdc82e4f7b7d)
- [2]: The README.md file was updated to add a note that the sha input is mandatory and has no default value (52b40e59684d17e5fddc95c4dba3cdc82e4f7b7d)

Links:
- [52b40e59684d17e5fddc95c4dba3cdc82e4f7b7d](https://github.com/mcasperson/SecondBrain/commit/52b40e59684d17e5fddc95c4dba3cdc82e4f7b7d)

If you click the link in the report, you will see the commit that generated this summary. The description of the commit is accurate and much easier to read than inspecting the diff directly.

It is important to note that this action never calls any external services except GitHub itself. There is no need to send your code to a third-party AI service or to host your own LLM infrastructure, as the entire process is handled by a local, private LLM exposed by Ollama on the runner.
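
To make that concrete, continuing the earlier sketch, the call to the Ollama container might look something like the snippet below. The endpoint is Ollama's default (it listens on port 11434 of the machine it runs on, here the Actions runner), while the model name is an illustrative assumption rather than the model SecondBrain ships with.

import requests

# Ollama's default local API endpoint. The prompt, including the commit
# summaries, never leaves the runner.
OLLAMA_URL = "http://localhost:11434/api/generate"


def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    response = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=600,
    )
    response.raise_for_status()
    return response.json()["response"]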

Give it a try and let me know if you find this action useful.
