{"id":1682,"date":"2025-01-31T22:17:11","date_gmt":"2025-01-31T22:17:11","guid":{"rendered":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/2025\/01\/31\/build-intelligent-apps-with-net-and-deepseek-r1-today\/"},"modified":"2025-01-31T22:17:11","modified_gmt":"2025-01-31T22:17:11","slug":"build-intelligent-apps-with-net-and-deepseek-r1-today","status":"publish","type":"post","link":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/2025\/01\/31\/build-intelligent-apps-with-net-and-deepseek-r1-today\/","title":{"rendered":"Build Intelligent Apps with .NET and DeepSeek R1 Today!"},"content":{"rendered":"<p>The DeepSeek R1 model has been gaining a ton of attention lately. And one of the questions we\u2019ve been getting asked is: \u201cCan I use DeepSeek in my .NET applications?\u201d The answer is absolutely! I\u2019m going to walk you through how to use the <strong><a href=\"https:\/\/learn.microsoft.com\/dotnet\/ai\/ai-extensions\">Microsoft.Extensions.AI<\/a><\/strong> (MEAI) library with DeepSeek R1 on GitHub Models so you can start experimenting with the R1 model today.<\/p>\n<h2>MEAI makes using AI services easy<\/h2>\n<p>The MEAI library provides a set of unified abstractions and middleware to simplify the integration of AI services into .NET applications.<\/p>\n<p>In other words, if you develop your application with MEAI, your code will use the same APIs no matter which model you decide to use \u201cunder the covers\u201d. This lowers the friction of building a .NET AI application: you\u2019ll only have to remember a single library\u2019s (MEAI\u2019s) way of doing things, regardless of which AI service you use.<\/p>\n<p>And for MEAI, the main interface you\u2019ll use is IChatClient.<\/p>\n<h3>Let\u2019s chat with DeepSeek R1<\/h3>\n<p>GitHub Models allows you to experiment with a ton of different AI models without having to worry about hosting. It\u2019s a great way to get started in your AI development journey for free. 
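To make that concrete, here\u2019s a small hypothetical helper (not part of the sample) showing why the IChatClient abstraction matters: it compiles and behaves the same no matter which AI service is plugged in underneath.

```csharp
using Microsoft.Extensions.AI;

public static class ChatHelper
{
    // Hypothetical helper: because it only depends on the IChatClient
    // abstraction, this method works unchanged whether the underlying
    // model is DeepSeek R1 on GitHub Models, Azure AI Foundry, or a
    // local Ollama instance.
    public static async Task AskAsync(IChatClient client, string question)
    {
        // Stream the response and print each chunk as it arrives.
        await foreach (var update in client.CompleteStreamingAsync(question))
        {
            Console.Write(update);
        }
    }
}
```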
And GitHub Models gets updated with new models all the time, <a href=\"https:\/\/azure.microsoft.com\/blog\/deepseek-r1-is-now-available-on-azure-ai-foundry-and-github\/\">like DeepSeek\u2019s R1<\/a>.<\/p>\n<p>The demo app we\u2019re going to build is a simple console application and it\u2019s available on <a href=\"https:\/\/github.com\/codemillmatt\/deepseek-dotnet\">GitHub at codemillmatt\/deepseek-dotnet<\/a>. You can clone or fork it to follow along, but we\u2019ll talk through the important pieces below too.<\/p>\n<p>First let\u2019s take care of some prerequisites:<\/p>\n<p>Head on over to GitHub and generate a personal access token (PAT). This will be your key for GitHub Models access. <a href=\"https:\/\/docs.github.com\/en\/authentication\/keeping-your-account-and-data-secure\/managing-your-personal-access-tokens#creating-a-personal-access-token-classic\">Follow these instructions to create the PAT<\/a>. You will want a <em>classic<\/em> token.<br \/>\nOpen the <strong>DeepSeek.Console.GHModels<\/strong> project. You can either open the full solution in Visual Studio or just the project folder if using VS Code.<br \/>\nCreate a new user secrets entry for the GitHub PAT. Name it <strong>GH_TOKEN<\/strong> and paste in the PAT you generated as the value.<\/p>\n<p>Now let\u2019s explore the code a bit:<\/p>\n<p>Open the <strong>Program.cs<\/strong> file in the <strong>DeepSeek.Console.GHModels<\/strong> project.<br \/>\nThe first two things to notice are where we initialize the modelEndpoint and modelName variables. These are standard for the GitHub Models service; they will always be the same.<br \/>\nNow for the fun part! We\u2019re going to initialize our chat client. 
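One thing the client-creation code assumes is a Configuration object that exposes the GH_TOKEN secret we just created. As a rough sketch (the endpoint URL and model name values below are assumptions; the sample repository has the authoritative ones), that setup might look like:

```csharp
using Microsoft.Extensions.Configuration;

// Sketch: load the GH_TOKEN user secret created in the prerequisites.
// The capitalized local name mirrors the sample's usage of Configuration.
var Configuration = new ConfigurationBuilder()
    .AddUserSecrets<Program>()
    .Build();

// Standard values for the GitHub Models service (assumed here; check
// the sample repository for the exact ones).
var modelEndpoint = new Uri("https://models.inference.ai.azure.com");
var modelName = "DeepSeek-R1";
```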
This is where we\u2019ll connect to the DeepSeek R1 model.<\/p>\n<pre><code>IChatClient client = new ChatCompletionsClient(modelEndpoint, new AzureKeyCredential(Configuration[&quot;GH_TOKEN&quot;])).AsChatClient(modelName);<\/code><\/pre>\n<p>This uses the <strong>Microsoft.Extensions.AI.AzureAIInference<\/strong> package to connect to the GitHub Models service. But the AsChatClient method returns an IChatClient implementation. And that\u2019s super cool, because regardless of which model we chose from GitHub Models, we\u2019d still write our application against the IChatClient interface!<\/p>\n<p>Next up we pass in our question, or prompt, to the model. And we\u2019ll make sure we get a streaming response back; this way we can display it as it comes in.<\/p>\n<pre><code>var response = client.CompleteStreamingAsync(question);\n\nawait foreach (var item in response)\n{\n    Console.Write(item);\n}<\/code><\/pre>\n<p>That\u2019s it! Go ahead and run the project. It might take a few seconds to get the response back (lots of people are trying the model out!). You\u2019ll notice the response isn\u2019t like you\u2019d see in a \u201cnormal\u201d chat bot. DeepSeek R1 is a reasoning model, so it wants to figure out and reason through problems. The first part of the response will be its <em>reasoning<\/em>, delimited by <strong>&lt;think&gt;<\/strong> and <strong>&lt;\/think&gt;<\/strong> tags, and is quite interesting. The second part of the response will be the answer to the question you asked.<\/p>\n<p>Here\u2019s a partial example of a response:<\/p>\n<p>&lt;think&gt;<br \/>\nOkay, let&#8217;s try to figure this out. The problem says: If I have 3 apples and eat 2, how many bananas do I have? Hmm, at first glance, that seems a bit confusing. Let me break it down step by step.<\/p>\n<p>So, the person starts with 3 apples. Then they eat 2 of them. That part is straightforward. If you eat 2 apples out of 3, you&#8217;d have 1 apple left, right? But then the question shifts to bananas. 
Wait, where did bananas come from? The original problem only mentions apples. There&#8217;s no mention of bananas at all.<\/p>\n<p>&#8230;<\/p>\n<h3>Do I have to use GitHub Models?<\/h3>\n<p>You\u2019re not limited to running DeepSeek R1 on GitHub Models. You can run it on Azure or even locally (or on GitHub Codespaces) through Ollama. I\u2019ve provided two additional console applications in the GitHub repository that show you how to do that.<\/p>\n<p>The biggest differences from the GitHub Models version are where the DeepSeek R1 model is deployed, the credentials you use to connect to it, and the specific model name.<\/p>\n<p>If you deploy on Azure AI Foundry, the code is exactly the same. Here are <a href=\"https:\/\/azure.microsoft.com\/blog\/deepseek-r1-is-now-available-on-azure-ai-foundry-and-github\/\">some instructions on how to deploy<\/a> the DeepSeek R1 model into Azure AI Foundry.<\/p>\n<p>If you want to run locally on Ollama, we\u2019ve provided a devcontainer definition that you can use to run Ollama in Docker. It will automatically pull down a small parameter version of DeepSeek R1 and start it up for you. The only difference is you\u2019ll use the <strong>Microsoft.Extensions.AI.Ollama<\/strong> NuGet package and initialize the IChatClient with an OllamaChatClient. Interacting with DeepSeek R1 is the same.<\/p>\n<p>Note: If you <a href=\"https:\/\/codespaces.new\/codemillmatt\/deepseek-dotnet\">run this in a GitHub Codespace<\/a>, it will take a couple of minutes to start up and you\u2019ll use roughly 8GB of space \u2013 so be aware depending on your Codespace plan.<\/p>\n<p>Of course these are simple Console applications. If you\u2019re using .NET Aspire, it\u2019s easy to use Ollama and DeepSeek R1. 
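For the Ollama console variant described above, the client construction is a sketch along these lines (the endpoint and model tag are assumptions matching Ollama\u2019s defaults; the sample repository has the exact code):

```csharp
using Microsoft.Extensions.AI;

// Sketch: connect to a local Ollama instance instead of GitHub Models.
// Requires the Microsoft.Extensions.AI.Ollama NuGet package.
// The endpoint and model tag below are assumed Ollama defaults.
IChatClient client = new OllamaChatClient(
    new Uri("http://localhost:11434"), "deepseek-r1");

// From here on, prompting and streaming the response are identical to
// the GitHub Models version of the app.
```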
Thanks to the .NET Aspire Community Toolkit\u2019s Ollama integration, all you need to do is add one line and you\u2019re all set!<\/p>\n<pre><code>var chat = ollama.AddModel(&quot;chat&quot;, &quot;deepseek-r1&quot;);<\/code><\/pre>\n<p>Check out this <a href=\"https:\/\/devblogs.microsoft.com\/dotnet\/local-ai-models-with-dotnet-aspire\/\">blog post<\/a> with all the details on how to get going.<\/p>\n<h2>Summary<\/h2>\n<p>DeepSeek R1 is an exciting new reasoning model that\u2019s drawing a lot of attention, and you can build .NET applications that make use of it today using the Microsoft.Extensions.AI library. GitHub Models lowers the friction of getting started and experimenting with it. Go ahead and <a href=\"https:\/\/github.com\/codemillmatt\/deepseek-dotnet\">try out the samples<\/a> and check out our other <a href=\"https:\/\/github.com\/dotnet\/ai-samples\/tree\/main\/src\/microsoft-extensions-ai\">MEAI samples<\/a>!<\/p>\n<p>The post <a href=\"https:\/\/devblogs.microsoft.com\/dotnet\/start-building-an-intelligent-app-with-dotnet-and-deep-seek\/\">Build Intelligent Apps with .NET and DeepSeek R1 Today!<\/a> appeared first on <a href=\"https:\/\/devblogs.microsoft.com\/dotnet\">.NET Blog<\/a>.<\/p>","protected":false},"excerpt":{"rendered":"<p>The DeepSeek R1 model has been gaining a ton of attention lately. 
And one of the questions we\u2019ve been getting [&hellip;]<\/p>\n","protected":false},"author":0,"featured_media":0,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[7],"tags":[],"class_list":["post-1682","post","type-post","status-publish","format-standard","hentry","category-dotnet"],"_links":{"self":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts\/1682","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/comments?post=1682"}],"version-history":[{"count":0,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/posts\/1682\/revisions"}],"wp:attachment":[{"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/media?parent=1682"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/categories?post=1682"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/wp-json\/wp\/v2\/tags?post=1682"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}