Making DBChat VSCode Extension - Ping Pong With LSP Backend (Part 6)

Hi there! I'm Shrijith Venkatrama, the founder of Hexmos. Right now, I’m building LiveAPI, a super-convenient tool that simplifies engineering workflows by generating awesome API docs from your code in minutes.

In this tutorial series, I am building DBChat for myself: a simple tool for using AI chat to explore and evolve databases.

See previous posts to get more context:

  1. Building DBChat - Explore and Evolve Your DB with Simple Chat (Part 1)
  2. DBChat: Getting a Toy REPL Going in Golang (Part 2)
  3. DBChat Part 3 - Configure, Connect & Dump Databases
  4. Chat With Your DB via DBChat & Gemini (Part 4)
  5. The Language Server Protocol - Building DBChat (Part 5)

The Goal: Perform Ping/Pong Between Extension UI and Golang LSP Backend

To get started with VSCode extensions and the underlying LSP, I have set up a simple goal.

In this post, I will show how to configure the extension frontend and the Golang backend to perform a simple exchange:

  1. Frontend will send "ping" as an input to the LSP
  2. The backend LSP will respond with "pong" to the frontend
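As we'll see below, the two sides talk over the backend's stdin/stdout, one JSON-RPC message per line. For illustration, the exchange looks roughly like this (the actual id will vary):

Request (frontend → backend, one line on stdin):

{"jsonrpc": "2.0", "id": 1, "method": "ping", "params": null}

Response (backend → frontend, one line on stdout):

{"jsonrpc": "2.0", "id": 1, "result": "pong"}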

The solution will involve:

  1. Registering a command ("DBChat: Ping")
  2. Handling command invocation in frontend
  3. Handling the request in the LSP backend
  4. Displaying the response in the extension frontend

Setting Up The Frontend Changes

The first task is to scaffold a VSCode extension itself. Microsoft provides an excellent guide for this: Your First Extension | Visual Studio Code Extension API.

With a few commands, you'll have a minimal working extension that shows an alert message.

Now our task is to add a new command that triggers a "Ping" message.

Here are the changes required to achieve it:

Define a package.json in the extension's root folder, which registers the command:

{
  "name": "dbchat",
  "displayName": "DBChat",
  "description": "Explore and Evolve Databases With Simple AI Chat",
  "version": "0.0.1",
  "engines": {
    "vscode": "^1.96.0"
  },
  "categories": [
    "Other"
  ],
  "activationEvents": [
    "onCommand:dbchat.ping"
  ],
  "main": "./dist/extension.js",
  "contributes": {
    "commands": [
      {
        "command": "dbchat.ping",
        "title": "DBChat: Ping"
      }
    ]
  },
  "scripts": {
    "vscode:prepublish": "npm run package",
    "compile": "npm run check-types && npm run lint && node esbuild.js",
    "watch": "npm-run-all -p watch:*",
    "watch:esbuild": "node esbuild.js --watch",
    "watch:tsc": "tsc --noEmit --watch --project tsconfig.json",
    "package": "npm run check-types && npm run lint && node esbuild.js --production",
    "compile-tests": "tsc -p . --outDir out",
    "watch-tests": "tsc -p . -w --outDir out",
    "pretest": "npm run compile-tests && npm run compile && npm run lint",
    "check-types": "tsc --noEmit",
    "lint": "eslint src",
    "test": "vscode-test"
  },
  "devDependencies": {
    "@types/vscode": "^1.96.0",
    "@types/mocha": "^10.0.10",
    "@types/node": "20.x",
    "@typescript-eslint/eslint-plugin": "^8.17.0",
    "@typescript-eslint/parser": "^8.17.0",
    "eslint": "^9.16.0",
    "esbuild": "^0.24.0",
    "npm-run-all": "^4.1.5",
    "typescript": "^5.7.2",
    "@vscode/test-cli": "^0.0.10",
    "@vscode/test-electron": "^2.4.1"
  }
}

Once we've registered the command, the next step is to define its handler, which essentially triggers the LSP backend.

First, the registration looks like this:

    const pingCommand = vscode.commands.registerCommand('dbchat.ping', async () => {
        if (!client) {
            vscode.window.showErrorMessage('DBChat client is not initialized');
            return;
        }

        try {
            const response = await client.ping();
            vscode.window.showInformationMessage(`DBChat response: ${response}`);
        } catch (error) {
            vscode.window.showErrorMessage(`Error: ${error}`);
        }
    });

    context.subscriptions.push(pingCommand);
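For context, here is a minimal sketch (not verbatim from the extension) of how this registration might sit inside activate(), assuming the client variable holds an instance of the DBChatClient class shown below:

import * as vscode from 'vscode';

let client: DBChatClient | undefined;

export function activate(context: vscode.ExtensionContext) {
    // Spawn the Go backend once, when the extension activates
    client = new DBChatClient();

    // ... the 'dbchat.ping' registration from above goes here ...

    // Make sure the backend process is cleaned up along with the extension
    context.subscriptions.push({ dispose: () => client?.dispose() });
}

export function deactivate() {
    client?.dispose();
    client = undefined;
}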

The next step is to handle the command by spawning the backend binary:

import * as cp from 'child_process';
import * as path from 'path';

// Monotonically increasing JSON-RPC request id
let nextMessageId = 1;

class DBChatClient {
    private process: cp.ChildProcess;
    private buffer: string = '';

    constructor() {
        // Adjust the path to where your dbchat binary is located
        const dbchatPath = path.join(__dirname, '..', '..', 'build','dbchat');
        this.process = cp.spawn(dbchatPath, ['--lsp']);

        if (!this.process.stdin) {
            throw new Error('Process stdin is not available');
        }

        this.process.stdout?.on('data', (data) => {
            this.buffer += data.toString();
            this.processBuffer();
        });

        this.process.stderr?.on('data', (data) => {
            console.error(`DBChat LSP Error: ${data}`);
        });
    }

    private processBuffer() {
        const lines = this.buffer.split('\n');
        this.buffer = lines.pop() || '';

        for (const line of lines) {
            try {
                const message = JSON.parse(line);
                console.log('Received message:', message);
                // Handle responses here
            } catch (e) {
                console.error('Failed to parse message:', e);
            }
        }
    }

    async ping(): Promise<string> {
        const message = {
            jsonrpc: '2.0',
            id: nextMessageId++,
            method: 'ping',
            params: null
        };

        if (!this.process.stdin) {
            throw new Error('Process stdin is not available');
        }

        this.process.stdin.write(JSON.stringify(message) + '\n');
        return new Promise((resolve) => {
            // In a real implementation, you'd want to properly match
            // the response ID with the request ID
            setTimeout(() => resolve('pong'), 100);
        });
    }

    public dispose() {
        if (this.process) {
            this.process.kill();
        }
    }
}


Note the two most important pieces here: the trigger mechanism in ping(), which writes the request to the backend's stdin, and the response handling in processBuffer(), which reads the "pong" reply from its stdout.
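One thing worth calling out: the setTimeout in ping() is just a placeholder. A more robust version (a sketch under the same assumptions, not what's in the repo yet) would track pending requests by their JSON-RPC id and resolve them from processBuffer:

    // Inside DBChatClient:
    // Hypothetical map of request id -> resolver for the pending promise
    private pending = new Map<number, (result: string) => void>();

    async ping(): Promise<string> {
        const id = nextMessageId++;
        const message = { jsonrpc: '2.0', id, method: 'ping', params: null };

        if (!this.process.stdin) {
            throw new Error('Process stdin is not available');
        }
        this.process.stdin.write(JSON.stringify(message) + '\n');

        // Resolved later, when processBuffer sees a response with the same id
        return new Promise((resolve) => this.pending.set(id, resolve));
    }

    private processBuffer() {
        const lines = this.buffer.split('\n');
        this.buffer = lines.pop() || '';

        for (const line of lines) {
            if (!line.trim()) { continue; }
            const message = JSON.parse(line);
            const resolve = this.pending.get(message.id);
            if (resolve) {
                this.pending.delete(message.id);
                resolve(message.result); // "pong" for our ping request
            }
        }
    }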

Building up LSP Mode In The Golang Backend

On the Golang backend side, we add a new --lsp switch that keeps a server running and speaks JSON-RPC over stdin/stdout, in the spirit of the LSP.
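The flag wiring itself is straightforward. Here is a minimal sketch (assuming the standard flag package, and that the non-LSP path falls back to the existing REPL) of how main() might dispatch into LSP mode:

package main

import "flag"

func main() {
    lspMode := flag.Bool("lsp", false, "run as a JSON-RPC server over stdin/stdout")
    flag.Parse()

    if *lspMode {
        handleLSP()
        return
    }

    // ... otherwise continue with the normal REPL / CLI behaviour ...
}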

Much of the magic happens in this function:

import (
    "encoding/json"
    "io"
    "log"
    "os"
)

type JSONRPCMessage struct {
    JSONRPC string      `json:"jsonrpc"`
    ID      int         `json:"id,omitempty"`
    Method  string      `json:"method,omitempty"`
    Params  interface{} `json:"params,omitempty"`
    Result  interface{} `json:"result,omitempty"`
}

func handleLSP() {
    stdin := json.NewDecoder(os.Stdin)
    stdout := json.NewEncoder(os.Stdout)

    for {
        var msg JSONRPCMessage
        if err := stdin.Decode(&msg); err != nil {
            if err == io.EOF {
                return
            }
            log.Printf("Error reading JSON-RPC message: %v", err)
            continue
        }

        // Handle ping message
        if msg.Method == "ping" {
            response := JSONRPCMessage{
                JSONRPC: "2.0",
                ID:      msg.ID,
                Result:  "pong",
            }
            if err := stdout.Encode(response); err != nil {
                log.Printf("Error writing response: %v", err)
            }
        }
    }
}

Pay close attention to the struct being defined: its Method field specifies what task the backend is supposed to perform. In handleLSP, we check for our custom "ping" method and respond with "pong" in the Result field, echoing back the request's ID.

Taking Ping Pong To a Test Drive

With both the frontend and backend in place, I go to my VSCode debugger (F5 launches an Extension Development Host) and trigger the "DBChat: Ping" command. And voila: I get the expected "pong" response, shown in the standard VSCode notification box. How cool is that?!

[Screenshot: invoking the "DBChat: Ping" command]

[Screenshot: the "pong" response displayed by VSCode]

Next Steps

With this exploration, we have the bare minimum foundation needed to get the extension frontend and the LSP backend talking.

In upcoming posts, we will develop both the frontend and the LSP backend into a proper chat interface, along with a mechanism for configuring DB connections.
