DEV Community

Jait Ramadandi Jeke

Transforming an Old Laptop into a Local AI Chatbot with LLAMA3 and Open WebUI

In this post, I'll show you how I turned an old laptop I hadn't used in a while into a local ChatGPT-style chatbot in just a few simple steps. The setup is straightforward and doesn't require much effort.

Why Use a Local AI Chatbot?

  • Privacy: Your data stays local.
  • Customization: Fine-tune the model to fit your needs.
  • Flexibility: Use the AI as you see fit.

Steps to Set Up a Local AI Chatbot

1. Run Open WebUI via Docker

To make the installation easy, I'll use Docker to set up Open WebUI with bundled Ollama support.

Run the following command on your computer:

docker run -d -p 3000:8080 \
  -v ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:ollama

This command starts Open WebUI with a bundled Ollama instance: it maps the container's port 8080 to port 3000 on your machine and persists downloaded models and app data in the ollama and open-webui Docker volumes, so they survive container restarts. Once it's running, visit http://localhost:3000 in your browser and set up an admin user.
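If the page doesn't load right away, you can check on the container from the terminal first. A quick sketch, assuming the container name open-webui from the command above:

```shell
# Show the container's status; it should report "Up ..." once started
docker ps --filter name=open-webui --format '{{.Names}}: {{.Status}}'

# Tail recent logs if the web page still doesn't respond
docker logs --tail 20 open-webui
```

The first startup can take a minute while the image initializes, so a little patience here is normal.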

2. Install LLAMA3.2:1b

For this example, I’m using the llama3.2:1b model, but you can choose any other large language model (LLM) available on the Ollama site, depending on your needs.

  • In the top-left corner of Open WebUI, type the name of the model you want to install.
  • In my case, it’s llama3.2:1b.
  • Click Pull "llama3.2:1b" from Ollama.com.
  • Here’s an example of how it looks during the installation:

(Screenshots: model download progress in Open WebUI.)
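If you prefer the terminal over the UI, the same download should also work through the Ollama CLI bundled inside the container (assuming the open-webui container name and the :ollama image from step 1):

```shell
# Pull the model using the Ollama CLI inside the Open WebUI container
docker exec open-webui ollama pull llama3.2:1b

# Confirm the model appears in the installed-model list
docker exec open-webui ollama list
```

Either way, the model shows up in Open WebUI's model selector once the pull finishes.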

That's it! You can also access your LLM from another device on the same network by using your laptop's local IP address instead of localhost.

Once the model is installed, you can start using your local AI chatbot. Open WebUI supports multiple LLMs, so you can experiment with different models to find what works best for your needs.
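For access from another device, the URL is just the laptop's LAN address plus the port from step 1. A minimal sketch, using a hypothetical address (find your real one with `hostname -I` on Linux or `ipconfig` on Windows):

```shell
# Hypothetical LAN address for illustration; substitute your laptop's own
LAN_IP="192.168.1.42"

# Build the URL to open from a phone or another computer on the same network
echo "http://${LAN_IP}:3000"
```

Note that port 3000 comes from the -p 3000:8080 mapping in the docker run command, so adjust it if you chose a different host port.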

What's next?

I plan to fine-tune LLAMA on my own dataset next. So, wish me luck ;v
