Check out my blog post on the mission behind Mimir.
In this tutorial, we’ll walk through how to use MimirLLM, a peer-to-peer communication library for AI-driven language models, to build a decentralized chatbot. By the end of this guide, you’ll have a working system where nodes can host and interact with Large Language Models (LLMs) in a decentralized network.
What You’ll Learn
- How to set up MimirLLM in Node Mode to host an LLM.
- How to set up MimirLLM in Client Mode to interact with hosted LLMs.
- How to use the /mimirllm/1.0.0 protocol for peer discovery and LLM interactions.
- How to integrate OpenAI or custom models like Ollama.
Prerequisites
Before we begin, ensure you have the following:
- Node.js v22.13.0 (LTS) or later: Download and install it from nodejs.org.
- Ollama or OpenAI API Key (optional): If you want to use OpenAI or Ollama models, have your API key or Ollama endpoint ready.
Step 1: Clone the Repository
First, clone the MimirLLM repository to your local machine:
git clone https://github.com/your-repo/mimirllm.git
cd mimirllm
Step 2: Install Dependencies
Install the required dependencies using npm:
npm install
This will install all the necessary packages, including libp2p for peer-to-peer communication and openai for interacting with OpenAI models.
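For reference, a minimal package.json for this tutorial might look like the following. This is a sketch, not the repository's actual manifest: the version ranges are deliberately left open, and tsx is included as a dev dependency because the later steps use it to run the TypeScript entry points.

```json
{
  "type": "module",
  "dependencies": {
    "libp2p": "*",
    "openai": "*"
  },
  "devDependencies": {
    "tsx": "*"
  }
}
```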
Step 3: Set Up a Node to Host an LLM
In this step, we’ll configure a node to host an LLM and advertise it to the network.
Create a Node Script
Create a file named node.ts and add the following code:
import { createLibp2p } from './createNode';
import libp2pConfig from '../../shared/libp2p';
import { MimirP2PClient } from '../../shared/mimir';

createLibp2p(libp2pConfig).then(async (node) => {
    console.log(`Node listening on:`);
    node.getMultiaddrs().forEach((ma) => console.log(ma.toString()));

    // Host mode: advertise the LLMs this node serves to the network.
    const mimir = new MimirP2PClient(node, {
        mode: "node",
        openaiConfig: {
            // Fall back to OpenAI's API when no Ollama endpoint is configured.
            baseUrl: process.env.OLLAMA_ENDPOINT || "https://api.openai.com/v1",
            apiKey: process.env.OPENAI_API_KEY || null
        }
    });
    await mimir.start();
}).catch((e) => {
    console.error(e);
});
Run the Node
Start the node by running:
tsx node.ts
The node will start listening for connections and advertise the LLMs it hosts. You’ll see output like this:
Node listening on:
/ip4/127.0.0.1/tcp/12345/p2p/QmPeerId
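A multiaddr like the one above encodes the transport stack left to right, ending with the node's peer ID. Purely as an illustration of how the address is structured (this is plain string handling, not part of the MimirLLM API), you can split it into its components:

```typescript
// A multiaddr encodes transport layers left to right; the last component
// after /p2p/ is the peer ID other nodes use to dial this node.
const ma = "/ip4/127.0.0.1/tcp/12345/p2p/QmPeerId";
const parts = ma.split("/").filter(Boolean);
// → ["ip4", "127.0.0.1", "tcp", "12345", "p2p", "QmPeerId"]
const peerId = parts[parts.length - 1];
```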
Step 4: Set Up a Client to Interact with the LLM
Now, let’s create a client that can discover and interact with the LLM hosted by the node.
Create a Client Script
Create a file named client.ts and add the following code:
import { createLibp2p } from "libp2p";
import libp2pConfig from "../../shared/libp2p";
import { MimirP2PClient } from "../../shared/mimir";
import { createInterface } from "readline";
import { streamToConsole } from "../utils/stream";

async function main() {
    const libp2p = await createLibp2p(libp2pConfig);

    // Client mode: discover nodes hosting LLMs and relay chat messages to them.
    const client = new MimirP2PClient(libp2p, {
        mode: "client",
        openaiConfig: {
            baseUrl: process.env.OLLAMA_ENDPOINT
        }
    });
    await client.start();

    // One readline interface for the whole session, closed on exit.
    const readline = createInterface({
        input: process.stdin,
        output: process.stdout
    });
    const ask = (prompt: string) =>
        new Promise<string>((resolve) => readline.question(prompt, resolve));

    while (true) {
        const message = await ask('Enter message: ');
        if (message === 'exit') {
            break;
        }

        const stream = await client.sendMessage({
            messages: [
                { role: "system", content: "You are a helpful assistant" },
                { role: "user", content: message }
            ]
        });

        // Each streamed chunk is a JSON-encoded OpenAI-style delta; print its text.
        streamToConsole(stream, (msg) => {
            const data = JSON.parse(msg);
            process.stdout.write(data.data.choices[0].delta.content);
        });
    }

    readline.close();
}

main().catch((e) => {
    console.error(e);
});
Run the Client
Start the client by running:
tsx client.ts
The client will prompt you to enter a message. Type your message and press Enter. The client will discover the node hosting the LLM, send the message, and stream the response back to you.
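The streaming step relies on the streamToConsole helper imported from ../utils/stream. As a rough sketch of what such a helper might do (the actual implementation in the repository may differ), it consumes an async iterable of chunks, decodes them, and hands each newline-delimited JSON message to a callback:

```typescript
// Hypothetical sketch of a streamToConsole-style helper; the real
// implementation in ../utils/stream may differ.
async function streamToConsole(
    stream: AsyncIterable<Uint8Array | string>,
    onMessage: (msg: string) => void
): Promise<void> {
    const decoder = new TextDecoder();
    for await (const chunk of stream) {
        const text =
            typeof chunk === "string" ? chunk : decoder.decode(chunk, { stream: true });
        // A chunk may carry one or more newline-delimited JSON messages.
        for (const line of text.split("\n")) {
            if (line.trim().length > 0) onMessage(line);
        }
    }
}
```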
Step 5: Understanding the Protocol
MimirLLM uses two main protocols:
Discovery Protocol (/mimirllm/1.0.0/identify): Used for peer discovery and handshake.
- The client sends a query to discover nodes hosting specific LLMs.
- The node responds with the models it hosts.
LLM Interaction Protocol (/mimirllm/1.0.0): Used for sending and receiving messages.
- The client sends a message to the node.
- The node forwards the message to the LLM and streams the response back.
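To make the two protocols concrete, here are illustrative TypeScript shapes for the messages exchanged. The field names are assumptions made for this sketch, not MimirLLM's actual wire format; consult the repository source for the real message schemas.

```typescript
// Illustrative message shapes only; field names are assumptions for this
// sketch, not MimirLLM's actual wire format.
interface IdentifyQuery {
    protocol: "/mimirllm/1.0.0/identify";
    model?: string; // optionally ask for nodes hosting a specific model
}

interface IdentifyResponse {
    protocol: "/mimirllm/1.0.0/identify";
    models: string[]; // models this node hosts
}

interface ChatRequest {
    protocol: "/mimirllm/1.0.0";
    messages: { role: "system" | "user" | "assistant"; content: string }[];
}

const example: ChatRequest = {
    protocol: "/mimirllm/1.0.0",
    messages: [{ role: "user", content: "Hello" }]
};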
Step 6: Customizing the LLM
By default, MimirLLM integrates with OpenAI and Ollama. You can customize the LLM by modifying the MimirP2PClient configuration.
Using Ollama
To use Ollama, set the baseUrl to your Ollama endpoint:
const mimir = new MimirP2PClient(node, {
    mode: "node",
    openaiConfig: {
        baseUrl: "http://localhost:11434" // Default local Ollama endpoint
    }
});
Using OpenAI
To use OpenAI, provide your OpenAI API key:
import OpenAI from "openai";

const openai = new OpenAI({
    baseURL: "https://api.openai.com/v1",
    apiKey: process.env.OPENAI_API_KEY
});

const mimir = new MimirP2PClient(node, {
    mode: "node",
    openAIClient: openai
});
Step 7: Extending MimirLLM
MimirLLM is designed to be extensible. Here are some ideas for future enhancements:
- [ ] Robust Discovery Mechanism: Implement a more efficient peer discovery system.
- [ ] Blockchain Integration: Reward nodes for hosting LLMs using a blockchain-based incentive system.
- [ ] Custom Models: Integrate other LLMs or fine-tuned models.
MimirLLM is a powerful tool for democratizing access to AI models, and its decentralized nature opens up exciting possibilities for collaborative AI development. Experiment with the code, extend its functionality, and join us in building the future of decentralized AI!
Next Steps
- Explore the Mimir project for more decentralized AI tools.
- Contribute to MimirLLM by opening issues or submitting pull requests.
- Share your decentralized AI applications with the community!
Happy coding! 🚀