How To Use DeepSeek With .NET 9 | Hands-On With DeepSeek R1, Semantic Kernel & C#

In this post, we will integrate DeepSeek R1 into a .NET 9 console application using Semantic Kernel. If you’re looking to get started with DeepSeek models locally, this hands-on guide is for you.

What You Will Learn

  • How to get started with DeepSeek R1
  • How to use Ollama for running local models
  • How to install and run the DeepSeek R1 model
  • How to use Semantic Kernel in C#

1. Prerequisites

  • Visual Studio 2022+ with the .NET 9 SDK installed
  • Ollama (for managing and running local models)
  • DeepSeek R1 1.5B model (deepseek-r1:1.5b)

2. Installing Ollama

Ollama is a tool for running large language models (LLMs) locally. It simplifies downloading, managing, and running open-source models such as Llama, Phi, and DeepSeek R1.

To install Ollama, visit the official download page at https://ollama.com/download and run the installer for your machine.
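
Once the installer finishes, you can confirm Ollama is available by opening a terminal and checking its version:

ollama --version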

3. Installing DeepSeek R1

DeepSeek R1 is DeepSeek's first generation of reasoning models, with performance comparable to OpenAI o1. The release also includes six dense models distilled from DeepSeek-R1, based on Llama and Qwen.

On the Ollama website, click Models, select deepseek-r1, and choose the 1.5b parameter option.

(Screenshot: the deepseek-r1:1.5b model page on Ollama)

Open Command Prompt and run the command below.

ollama run deepseek-r1:1.5b

It will download the model (on the first run) and then start an interactive session automatically.

Once done, verify the model is available:

ollama list

That’s it! We’re ready to integrate DeepSeek locally.

4. Creating .NET Console Application

  1. Launch Visual Studio and make sure the .NET 9 SDK is installed.
  2. Create a new project: File → New → Project… and pick Console App with .NET 9.
  3. Name your project, e.g., DeepSeekDemoApp or any name you prefer.
  4. Check the target framework: right-click your project → Properties and set Target Framework to .NET 9.
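
If you prefer the command line over Visual Studio, the same project can be created with the dotnet CLI. This is just a sketch; DeepSeekDemoApp is only an example name:

dotnet new console -n DeepSeekDemoApp -f net9.0
cd DeepSeekDemoApp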

5. Integrating DeepSeek R1 with Semantic Kernel

While you could call DeepSeek via direct HTTP requests to Ollama, using Semantic Kernel offers a powerful abstraction for prompt engineering, orchestration, and more.
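
For context, calling Ollama directly means posting JSON to its local REST API. A rough C# sketch, assuming the default /api/generate endpoint on port 11434 and the model we pulled above, might look like this:

using System.Net.Http.Json;

// Minimal direct call to Ollama's REST API, bypassing Semantic Kernel.
using var http = new HttpClient();

var payload = new
{
    model = "deepseek-r1:1.5b",
    prompt = "Explain dependency injection in one sentence.",
    stream = false // return one JSON object instead of a token stream
};

var response = await http.PostAsJsonAsync("http://localhost:11434/api/generate", payload);
Console.WriteLine(await response.Content.ReadAsStringAsync()); // JSON with a "response" field

Semantic Kernel hides this plumbing behind a connector, which is what we set up next.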

  1. Add the necessary NuGet packages to your .csproj file:
<ItemGroup>
  <PackageReference Include="Codeblaze.SemanticKernel.Connectors.Ollama" Version="1.3.1" />
  <PackageReference Include="Microsoft.SemanticKernel" Version="1.35.0" />
</ItemGroup>
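Alternatively, you can add the same packages from a terminal in the project folder:

dotnet add package Codeblaze.SemanticKernel.Connectors.Ollama --version 1.3.1
dotnet add package Microsoft.SemanticKernel --version 1.35.0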

6. Complete Code

The Semantic Kernel can use a custom connector to talk to local endpoints. For simplicity, we’ll outline a sample approach:

Program.cs:

using Codeblaze.SemanticKernel.Connectors.Ollama;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.SemanticKernel;

// Point Semantic Kernel at the local Ollama endpoint and the DeepSeek R1 model.
var builder = Kernel.CreateBuilder()
    .AddOllamaChatCompletion("deepseek-r1:1.5b", "http://localhost:11434");

// The Ollama connector resolves an HttpClient from the service collection.
builder.Services.AddScoped<HttpClient>();

var kernel = builder.Build();

// Simple chat loop: read a question, send it as a prompt, print the model's reply.
while (true)
{
    Console.WriteLine("Ask anything to Deepseek");
    string? input = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(input)) continue;

    var response = await kernel.InvokePromptAsync(input);
    Console.WriteLine($"\nDeepseek: {response}\n");
}

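One thing to be aware of: DeepSeek R1 models typically emit their chain-of-thought between <think> and </think> tags before the final answer. If you only want to display the answer, a minimal sketch (assuming that tag format, with the using directive placed at the top of Program.cs) could strip the reasoning block:

using System.Text.RegularExpressions;

// Removes the <think>...</think> reasoning block that DeepSeek R1 prepends to its answers.
static string StripReasoning(string modelOutput) =>
    Regex.Replace(modelOutput, "<think>.*?</think>", string.Empty, RegexOptions.Singleline).Trim();

// Usage inside the loop above:
// Console.WriteLine($"\nDeepseek: {StripReasoning(response.ToString())}\n");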

7. Running & Testing

  1. Ensure Ollama is Running

- Some systems start Ollama automatically after installation; otherwise, start the model yourself:

ollama run deepseek-r1:1.5b

  2. Run Your .NET App

- Hit F5 (or Ctrl+F5) in Visual Studio.

- Watch the console output; you should see DeepSeek's responses to your prompts.

(Screenshot: DeepSeek R1 responding in the .NET console app)

Support me!

If you found this guide helpful, make sure to check out the accompanying YouTube video tutorial where I walk you through the process visually. Don’t forget to subscribe to my channel for more amazing tutorials!

Feel free to leave your questions, comments, or suggestions below. Happy coding!

Top comments (1)

Spyros Ponaris

Thanks for sharing 🙏.