Intro
I want to add an AI code assistant to my VSCode and accelerate my work, but I have no money...🫠
Also, I want to use the AI safely and protect privacy.
Well, what should I do?
I googled carefully and found that Continue + VSCode + Ollama might be the answer.
What is Continue?
Continue is a super AI code assistant.
https://www.continue.dev/
The features of Continue are as follows.
1️⃣ First of all, yes, it's free!🤑
2️⃣ You can use it in a local environment.
By using the LLM in a local environment, you can protect the privacy of your data.
This is enabled by Ollama, which can run LLMs (Large Language Models) like DeepSeek-R1 and Llama 3 locally.
3️⃣ You can use it with VSCode.
You don't have to go back and forth between the browser and the code editor.
All you have to do is install Continue in your VSCode.
How to use Continue?
1️⃣ Install Continue in your VSCode.
"Extensions" > input "Continue" > "Install"
2️⃣ If you are concerned about data privacy, change the setting below.
"File" > "Preferences" > "Settings" > Search "continue" > Uncheck "Telemetry Enabled"
https://docs.continue.dev/telemetry
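According to the telemetry docs above, the same opt-out can also live in Continue's own configuration file. A minimal sketch, assuming the JSON config format (the same file opened in step 5️⃣ below):

```json
{
  "allowAnonymousTelemetry": false
}
```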
3️⃣ Download Ollama
https://ollama.com/
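On Linux, Ollama can also be installed from the terminal, and on any OS you can confirm the install afterwards. A quick sketch:

```sh
# Linux install script (on macOS/Windows, use the installer from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Confirm Ollama is installed and on your PATH
ollama --version
```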
4️⃣ Pull LLM
Choose the LLM you like from this page, for example DeepSeek-R1 or Llama 3.
https://ollama.com/search
Pull the LLM (Large Language Model) you want to use.
ollama pull <LLM name>
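For example, assuming you picked the two models mentioned above (copy the exact tag from the Ollama search page):

```sh
# Pull the models you want to use
ollama pull llama3
ollama pull deepseek-r1

# List the models that are now available locally
ollama list
```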
5️⃣ Set Ollama and the LLM in Continue
① "gear" icon in the Continue header > "Open configuration file" > add the LLM name to the "models" section (a sketch of this file is shown below)
② Click the model select box at the bottom left of the chat input box > "+ Add Chat model" > Provider "Ollama" > Model: the LLM name (or select "Autodetect" first and then pick the LLM)
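For reference, here is a minimal sketch of what the "models" section could look like after step ①, assuming you pulled llama3 in step 4️⃣. The exact field names can differ between Continue versions, so treat it as a starting point, not the definitive format:

```json
{
  "models": [
    {
      "title": "Llama 3 (local)",
      "provider": "ollama",
      "model": "llama3"
    }
  ]
}
```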
Now you are ready to use Continue.
Chat
https://docs.continue.dev/chat/how-to-use-it
You can ask questions to the LLM using chat.
The LLM will answer you with text/code.
If you want to add code to the chat, press cmd/ctrl + L.
Autocomplete
https://docs.continue.dev/autocomplete/how-to-use-it
Autocomplete will suggest code as you write.
Press Tab to accept the whole suggestion, and press cmd/ctrl + → to accept part of the suggestion.
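Autocomplete has its own model entry in the Continue configuration file. A hedged sketch, again assuming llama3 and the JSON config format; in practice a smaller model is often chosen here to keep suggestions fast:

```json
{
  "tabAutocompleteModel": {
    "title": "Llama 3 (autocomplete)",
    "provider": "ollama",
    "model": "llama3"
  }
}
```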
Edit
https://docs.continue.dev/edit/how-to-use-it
You can ask the LLM to work directly on your code using edit.
The LLM will answer you with text/code.
If you want to use the LLM in the code, press cmd/ctrl + I.
Use edit when the prompt is short and concise, and use chat when the prompt is long and complex.
Codebase
https://docs.continue.dev/customize/deep-dives/codebase
By adding @codebase to the input box, the LLM will answer after considering the whole workspace.
By adding @folder to the input box, the LLM will answer after considering a specific folder.
Negative point of Continue and workaround
Well, my PC is just a normal office PC (CPU: Core i7, Memory: 16GB) and doesn't have an NVIDIA GPU.
The response is sooooo slow compared to ChatGPT on the web.🐢
You can lower the parameter number of LLM to speed up, but if you do so, the answer quality gets worse.
So, if you are not using a high spec PC, you might need to switch AI.
For example:
When you need privacy: use Continue locally, slowly.🦥
When you don't need privacy: use ChatGPT on the web, fast.
Or, as another way, you can run the LLM part on another PC to offload the work and access it from your editor over the network, as sketched below.
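A rough sketch of that setup, assuming the other PC's address is 192.168.1.20 (an example IP) and Ollama's default port 11434: start Ollama on that machine so it accepts connections from the network, then point Continue at it.

```sh
# On the other PC: make Ollama listen on all interfaces (default port 11434)
OLLAMA_HOST=0.0.0.0 ollama serve
```

Then, on your own PC, add "apiBase": "http://192.168.1.20:11434" to the Ollama model entry in the Continue configuration file (check the Ollama provider page in the Continue docs for the exact field name).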
Outro
By using a coding AI like Continue, our productivity will definitely get higher.💪
There seem to be a lot of other convenient AI coding assistants, such as Cursor, Cline, Codeium, etc.
I might try them another time.
Thank you for reading.
Happy coding!