Every PC's main drive is almost full (at least mine is). I wanted to install DeepSeek on my local machine, but my main drive, on which the OS (Ubuntu 24.04) is installed, does not have enough space. So I wanted to install the model in another directory (disk).
So I googled for it and found the solution here. It took me multiple tries to succeed, so I want to write this blog post to save others time.
- Overriding the configuration will result in losing access to the existing models you have downloaded. I have not tried this, but the models should still work if we move all contents from
/usr/share/ollama/.ollama
(the default directory) to our new directory after doing the below steps. Please let me know via the comments if it works.
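For reference, here is a sketch of what that move might look like. As mentioned above, this is untested; it assumes rsync is installed and uses the paths and user from the steps below. Note that on Linux the models themselves live in the models/ subdirectory of /usr/share/ollama/.ollama.

```shell
# Stop the service so nothing writes to the files while we copy
sudo systemctl stop ollama

# Copy the models/ subdirectory of the default location
# into the new models directory
sudo rsync -a /usr/share/ollama/.ollama/models/ /media/nikki/mymodels/

# Make sure the user the service will run as owns the new directory
sudo chown -R nikki:nikki /media/nikki/mymodels

sudo systemctl start ollama
```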
Step 1: Open the override.conf file
- You need to add an environment variable that overrides the default models directory, and we also need to specify the user and group the service runs as.
- Run the below command to open the
/etc/systemd/system/ollama.service.d/override.conf
file.
sudo systemctl edit ollama.service
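Note: systemctl edit picks the editor from $SYSTEMD_EDITOR, $EDITOR, or $VISUAL. If you want a specific editor instead of the default, you can pass one explicitly (shown here with nano as an example):

```shell
# Force a specific editor for this edit session
sudo SYSTEMD_EDITOR=nano systemctl edit ollama.service
```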
- The default one looks like
### Editing /etc/systemd/system/ollama.service.d/override.conf
### Anything between here and the comment below will become the contents of the drop-in file
### Edits below this comment will be discarded
### /etc/systemd/system/ollama.service
# [Unit]
# Description=Ollama Service
# After=network-online.target
#
# [Service]
# ExecStart=/usr/local/bin/ollama serve
# User=ollama
# Group=ollama
# Restart=always
# RestartSec=3
# Environment="PATH=/home/nikki/.nvm/versions/node/v20.17.0/bin:/home/nikki/.nv>
#
# [Install]
# WantedBy=default.target
- It specifies that anything written below the "### Edits below this comment will be discarded" line will be discarded. So we need to write our env vars above that comment.
Step 2: Add env variables
- You need to add the environment variables under the [Service] section.
- After adding the env vars, the file looks like
### Editing /etc/systemd/system/ollama.service.d/override.conf
### Anything between here and the comment below will become the contents of the drop-in file
[Service]
Environment="OLLAMA_MODELS=/media/nikki/mymodels"
User=nikki
Group=nikki
### Edits below this comment will be discarded
### /etc/systemd/system/ollama.service
# [Unit]
# Description=Ollama Service
# After=network-online.target
#
# [Service]
# ExecStart=/usr/local/bin/ollama serve
# User=ollama
# Group=ollama
# Restart=always
# RestartSec=3
# Environment="PATH=/home/nikki/.nvm/versions/node/v20.17.0/bin:/home/nikki/.nv>
#
# [Install]
# WantedBy=default.target
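Once you save the file, you can confirm that systemd sees the drop-in: systemctl cat prints the original unit file together with any overrides.

```shell
# Show /etc/systemd/system/ollama.service plus all drop-in files,
# including our override.conf
systemctl cat ollama.service
```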
Step 3: Save the file
- I think by default the terminal opens the file in the
nano
editor. Follow the below steps to save the file:
- Press Ctrl + x
- Press y
- Press Enter
- If your default editor is vi/vim, then follow
- Press Esc
- Type
:wq
and press Enter to write and quit the editor
- Doing this will update the override.conf file
Step 4: Restart the daemon
- You need to restart the daemon to apply the changes.
sudo systemctl daemon-reload
sudo systemctl restart ollama
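After restarting, it's worth confirming that the override was actually picked up. The check below uses standard systemctl queries; the Environment property should include your OLLAMA_MODELS path:

```shell
# Confirm the service is running
systemctl status ollama --no-pager

# Print the environment systemd passes to the service;
# it should include OLLAMA_MODELS=/media/nikki/mymodels
systemctl show ollama --property=Environment
```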
To learn more about daemons, refer to this.
- systemctl daemon-reload tells systemd to reload its configuration files.
- systemctl restart ollama restarts (stops and then immediately starts) the service named ollama.
Step 5: (Optional) Give permissions to the new directory
- Give access to the new directory, if not already set
sudo chmod 755 /media/nikki/mymodels
- After execution, the permissions will look like
drwxr-xr-x 1 nikki nikki 4096 Jan 30 03:57 /media/nikki/mymodels
- r => read
- w => write
- x => execute
- d: It's a directory.
- rwx: Owner can read, write, and execute.
- r-x: Group can read and execute, but not write.
- r-x: Others can read and execute, but not write.
Step 6: Download and run the model
- ollama is such an amazing tool that it lets even novice users run LLMs locally. Running the below command will pull the model's binary from the ollama registry
ollama run deepseek-r1:1.5b
- Refer to ollama to browse models for your requirements.
- To list all downloaded models, run the below command
ollama list
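To double-check that the model actually landed on the new disk, look inside the directory you set in OLLAMA_MODELS; in my setup Ollama creates blobs/ and manifests/ there:

```shell
# The pulled model data should show up here, not on the main drive
ls /media/nikki/mymodels
du -sh /media/nikki/mymodels
```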
If you like my article, please share and follow me. If you have any doubts, let me know via comments.