Step-by-Step Guide to Integrating FREE DeepSeek R1 and V3 inside Cursor and Bolt.DIY🔥
DeepSeek has taken the tech world by storm; everyone is testing out DeepSeek R1 and V3 for their impressive performance. Even OpenAI’s CEO put in a good word about them on X.
In this guide, I’ll show you how to set up Bolt.DIY or Cursor with the FREE DeepSeek API, using an EU-hosted, non-censored model. You’ve probably seen different ways to access DeepSeek distilled models, but I’ll walk you through the cheapest (or completely free) method to access the full-weight DeepSeek R1 and V3 models.
Let’s see how you can use it 🚀
I also recorded a video demo for Bolt.DIY, which you can follow side by side:
And here is the video demo you can follow for Cursor:
Why DeepSeek R1 and V3?
DeepSeek R1 is a new open-source AI model that shocked the entire tech world. Unlike costly proprietary models, its open-source nature means that multiple providers can host and distribute it, giving you flexibility and significant cost savings. The cost of training and developing R1 was also far lower than that of OpenAI’s o1, and it performs on par with or better than o1 across math, code, and reasoning tasks.
Here are the evaluation scores and comparison graphs:
Both models are released under a permissive MIT license, meaning anyone can fine-tune them or host them at their own cost for local use.
Now, let’s see how to use this inside Bolt.DIY and Cursor.
Step 1
Getting FREE API Keys from Nebius AI Studio
There are lots of DeepSeek providers available, and even the official DeepSeek platform offers its own API. But I'll show you how to access one of the most cost-effective solutions out there: Nebius AI Studio. We'll be using Nebius models via OpenRouter so you can integrate the DeepSeek API seamlessly without having to reconfigure your existing Bolt.DIY or Cursor setup.
Sign up / sign in to Nebius AI Studio here
Get your API key from here and save it somewhere safe for later.
Nebius AI Studio offers a wide range of open-source models, giving you the flexibility to explore and use different models for various projects. When you sign up, you’ll receive $1 in free credits to get started.
Plus, they’re currently running a limited-time promo: you can claim an extra $25 in free credits using a special coupon code. Check out their official X (Twitter) page for the latest code: https://x.com/nebiusaistudio
So basically, you’re getting DeepSeek R1 and V3 for FREE to be used in your Bolt.DIY or Cursor setup🎉
You can find Nebius AI as a DeepSeek model provider inside OpenRouter—and it’s one of the most affordable options compared to other major providers. But don’t worry, we won’t be paying anything here since we’re using free credits! ⭐
Step 2
Configuring DeepSeek R1/V3 inside OpenRouter
OpenRouter is an API gateway that simplifies access to a wide range of open-source AI models. It provides a unified interface so you can easily switch between multiple providers without having to reconfigure your setup.
Sign up / sign in to OpenRouter here
Go to the search box and look for DeepSeek R1 or V3
Click on your preferred model to see a page with details on providers, costs, and more. 👇
All you need to do is find the Nebius provider entry, hover over the “key icon”, and click the link to configure your Nebius API key in OpenRouter.👇
You'll be redirected to the integration page, where you'll see all providers marked as "Not Configured." Find Nebius and click the edit icon 👇
You'll see a popup window like this 👇 asking you to Edit Nebius Key. Paste the Nebius API key you created and saved in Step 1 from Nebius AI Studio, then hit Save. Now, you're ready to use Nebius models via OpenRouter!
One last step - get your OpenRouter API key.
Go to the OpenRouter dashboard, hover over the 🏠 icon, and click on Keys. Create a new API key and save it somewhere safe; you’ll need it for Cursor and Bolt.DIY.
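Before moving on, you can sanity-check the key with a single chat completion request against OpenRouter’s OpenAI-compatible endpoint. Here’s a rough TypeScript sketch (the OPENROUTER_API_KEY environment variable and the one-line prompt are just placeholders I chose; the deepseek/deepseek-chat model ID is the one we’ll use in the next step):

// Quick sanity check: one chat completion through OpenRouter's OpenAI-compatible API.
// OPENROUTER_API_KEY is assumed to hold the key you just created.
const response = await fetch('https://openrouter.ai/api/v1/chat/completions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'deepseek/deepseek-chat', // DeepSeek V3; use deepseek/deepseek-r1 for R1
    messages: [{ role: 'user', content: 'Say hello in one sentence.' }],
  }),
});
const data = await response.json();
console.log(data.choices?.[0]?.message?.content);

If this prints a short reply, your OpenRouter key and Nebius integration are working.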
Step 3
Using FREE DeepSeek inside Bolt.DIY
Bolt.DIY is the official open-source version of Bolt.new (formerly known as oTToDev) that lets you choose the LLM for each prompt. Currently, you can work with models from OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI and others, and it can be easily extended to support any model compatible with the Vercel AI SDK or OpenRouter.
You can run Bolt.DIY locally on your device and generate AI-powered web apps effortlessly.
Check out the setup instructions to get started - here
Now, follow these steps to run Bolt.DIY locally. You can run it directly or use Docker.
Once you've set up Bolt.DIY on your local machine and installed the dependencies in your IDE, run:
npm run dev
Now, Bolt.DIY will start running at http://localhost:5173/
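For reference, the whole local setup usually comes down to a handful of commands. Here’s a rough sketch (the repo URL and package manager are assumptions on my part; the setup instructions linked above are the authoritative source and may recommend pnpm instead):

# Clone the Bolt.DIY repository (URL assumed; follow the setup link above)
git clone https://github.com/stackblitz-labs/bolt.diy.git
cd bolt.diy

# Install dependencies (the README may recommend pnpm install instead)
npm install

# Start the dev server at http://localhost:5173/
npm run dev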
Open your browser and you’ll see something like this:
You'll see options to choose an LLM provider, select LLM models, and set up API keys; after that, Bolt.DIY can be used to build AI-powered apps. Here, we'll use our FREE DeepSeek model, which we configured earlier via Nebius and OpenRouter.
Go to the Bolt.DIY project directory inside your IDE, open /app/lib/modules/llm/providers, and click the open-router.ts file, as seen in the image below:
You can find a list of supported LLM models in the open-router.ts file, which Bolt.DIY accesses via the API when building apps. In this file, we need to define or add the Nebius DeepSeek model by modifying just two lines of code.
Let’s modify just one existing model’s name and label instead of adding a new entry to the file; since we’re only using the DeepSeek LLM, there’s no need to make the list longer.
{
  name: 'deepseek/deepseek-chat', // OpenRouter model ID for DeepSeek V3
  label: 'Nebius-DeepSeek-V3', // display name shown in Bolt.DIY's model dropdown
  provider: 'OpenRouter',
  maxTokenAllowed: 8000,
},
In the code above, label refers to the LLM option you see in Bolt.DIY when running on localhost:5173, while name is used to access that specific model via OpenRouter.
I’ve used Nebius-DeepSeek-V3 as the label to identify the model in Bolt.DIY; you can choose a different one. For name, we need to copy the model ID from OpenRouter.
Go to the OpenRouter Dashboard here, find Nebius DeepSeek V3, and copy its model ID (I’m using V3, but you can copy DeepSeek R1 if you prefer).
Remember to copy the model ID from the Provider tab - in this case, Nebius. This ensures you get access to the same DeepSeek model we set up earlier. 🚀
Now, I’ve replaced name in the snippet above with the Nebius model ID deepseek/deepseek-chat that I copied from the dashboard.
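If you’d rather keep both models available in the Bolt.DIY dropdown, you can leave the V3 entry as-is and add a second entry below it. A quick sketch (the label is just an arbitrary display name I picked; the model ID is the same R1 ID we’ll reuse in the Cursor section):

{
  name: 'deepseek/deepseek-r1', // OpenRouter model ID for DeepSeek R1
  label: 'Nebius-DeepSeek-R1', // display name shown in the Bolt.DIY dropdown
  provider: 'OpenRouter',
  maxTokenAllowed: 8000,
},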
Now, it’s time to use it with Bolt.DIY! Open your browser and go to localhost:5173 where it’s running.
Select OpenRouter as the provider and choose "Nebius-DeepSeek-V3", as labeled in the open-router.ts file.
One last step—add the OpenRouter API key you grabbed in Step 2. Paste the key and hit ✅
Hurray🥳 now you’re ready to use FREE DeepSeek V3 or R1 inside Bolt.DIY to create powerful apps
Give Bolt.DIY a prompt and it will start writing code using DeepSeek.
Using FREE DeepSeek inside Cursor
If you've followed Step 1 and Step 2, you already know how to get free DeepSeek API access and set it up using Nebius Studio and OpenRouter.
Cursor is a powerful IDE that many developers use for AI-assisted code improvements, or for its Composer to generate whole apps. It also supports custom API keys for OpenAI, Google, Anthropic, and Azure LLMs, making it a versatile tool for modern development tasks.
Now, open Cursor and go to the Settings > Models tab. You'll find a small note there about using OpenRouter models as well.
Fun fact: this isn't widely documented, and most developers have never noticed it. I bet you didn’t know either! 😃
In the Models tab, click Add Model and enter the Nebius model ID from OpenRouter, similar to what we did for Bolt.DIY. I’m using R1, so the model ID is deepseek/deepseek-r1 👇
Now that you've added a custom model name, enable the OpenAI API Key toggle just below it. Paste the OpenRouter API key you got in Step 2 and override the OpenAI baseURL with the OpenRouter baseURL.
You’ll find the OpenRouter baseURL here (at the time of writing, it’s https://openrouter.ai/api/v1)
Click “Save” and then “Verify”, as shown in the image 👇
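Under the hood, this override simply points Cursor’s OpenAI-compatible client at OpenRouter instead of OpenAI. Conceptually it’s equivalent to the sketch below, written with the openai npm package rather than Cursor’s actual internals (the environment variable name and prompt are just placeholders):

import OpenAI from 'openai';

// Same idea as Cursor's override: an OpenAI-compatible client, but with the
// OpenRouter base URL and your OpenRouter key in place of an OpenAI key.
const client = new OpenAI({
  baseURL: 'https://openrouter.ai/api/v1',
  apiKey: process.env.OPENROUTER_API_KEY,
});

const completion = await client.chat.completions.create({
  model: 'deepseek/deepseek-r1', // the model ID we added in Cursor
  messages: [{ role: 'user', content: 'Review this function for bugs.' }],
});
console.log(completion.choices[0].message.content);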
Hurray🥳 now you’re ready to use FREE DeepSeek V3 or R1 inside Cursor to fix code errors and add improvements
Open Cursor chat, choose the model we just set up, and start coding!
Conclusion
DeepSeek R1 and V3 are making waves as powerful open-source AI models, delivering top-tier performance in math, coding, and reasoning—all without the high costs of proprietary alternatives.
With Nebius AI Studio, you get one of the most affordable ways to access DeepSeek, thanks to its free credits and budget-friendly pricing. By setting it up through OpenRouter, you can easily integrate it into tools like Bolt.DIY and Cursor, unlocking AI-powered development without extra hassle.
If you're looking for a fast, cost-effective way to work with AI models, this setup is perfect for you! 🚀
If You ❤️ My Content! Connect Me on Twitter
Check SaaS Tools I Use 👉🏼Access here!
I am open to collaborating on Blog Articles and Guest Posts🫱🏼🫲🏼 📅Contact Here