prashant rana

Run DeepSeek locally with a web UI

DeepSeek is awesome, so here is a way to run DeepSeek models locally with a web UI: Ollama serves the models and Open WebUI provides the chat interface.

Here is the docker-compose.yaml file content:


services:
  # Ollama runs the models and exposes its API on port 11434
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_models:/root/.ollama   # persist downloaded models across restarts
    networks:
      - ollama-net

  # Open WebUI is the chat frontend; it talks to Ollama over the shared network
  open-web-ui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "8080:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434   # the service name resolves over ollama-net
    depends_on:
      - ollama
    networks:
      - ollama-net
    volumes:
      - open-webui:/app/backend/data   # persist chats and settings

volumes:
  ollama_models:
  open-webui:

networks:
  ollama-net:
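
If you want to sanity-check the file before starting anything, docker-compose can validate it and print the resolved configuration:

docker-compose config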

In your project folder, save this as docker-compose.yaml.
Run docker-compose up -d to pull all required images and start both containers.
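
Once it is up, you can confirm both containers are running and that Ollama is reachable; hitting its root endpoint on the mapped port should answer with "Ollama is running":

docker-compose ps
curl http://localhost:11434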

Then, to install a specific model locally, run:

docker-compose exec ollama ollama pull deepseek-coder:6.7b

Here I am installing the deepseek-coder:6.7b model. You can pick any model from https://ollama.com/library
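
To confirm the download finished, list the installed models, or chat with the model directly from the terminal before touching the UI (same model tag as pulled above):

docker-compose exec ollama ollama list
docker-compose exec ollama ollama run deepseek-coder:6.7b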

Wait 10-15 seconds for Open WebUI to pick up the new model, then go to http://localhost:8080 to start using it.
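
The web UI talks to Ollama for you, but the same model is also available over Ollama's REST API if you want to script against it. A minimal sketch, assuming the model pulled above (the prompt text is just an illustration):

curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-coder:6.7b",
  "prompt": "Write a hello world program in Python",
  "stream": false
}'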
