{"id":1061,"date":"2024-07-22T19:07:59","date_gmt":"2024-07-22T19:07:59","guid":{"rendered":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/2024\/07\/22\/add-ai-to-your-net-apps-easily-with-prompty\/"},"modified":"2024-07-22T19:07:59","modified_gmt":"2024-07-22T19:07:59","slug":"add-ai-to-your-net-apps-easily-with-prompty","status":"publish","type":"post","link":"https:\/\/rssfeedtelegrambot.bnaya.co.il\/index.php\/2024\/07\/22\/add-ai-to-your-net-apps-easily-with-prompty\/","title":{"rendered":"Add AI to Your .NET Apps Easily with Prompty"},"content":{"rendered":"<p>Adding AI features to .NET development is a new and exciting experience. In this blog post, we will explore Prompty and how to use it to integrate a Large Language Model, like GPT-4o, into your development flow and .NET applications.<\/p>\n<h2>Introduction to Prompty<\/h2>\n<p>As AI enthusiasts and .NET developers, we constantly seek tools that simplify our workflows and enhance our productivity. One such powerful tool is Prompty, a Visual Studio Code extension designed to facilitate the integration of Large Language Models (LLMs) like GPT-4o into your applications. Prompty provides an intuitive interface to interact with LLMs directly from your development environment, making it easier than ever to add AI features to your projects.<\/p>\n<p>Prompty is developed by Microsoft and is available for free on the Visual Studio Code marketplace. Whether you are building chatbots, creating content generators, or experimenting with other AI-driven applications, Prompty can significantly streamline your development process.<\/p>\n<h2>A Typical Developer Flow Using Prompty in Visual Studio Code<\/h2>\n\n<p>Let\u2019s take a look at a sample flow for using Prompty. 
This process usually involves a few key steps:<\/p>\n<p><strong>Installation<\/strong>: Begin by installing the Prompty extension from the <a href=\"https:\/\/marketplace.visualstudio.com\/items?itemName=ms-toolsai.prompty\">Visual Studio Code Marketplace<\/a>.<\/p>\n<p><strong>Setup<\/strong>: After installation, configure the extension by providing your API keys and setting up the necessary parameters to connect to the LLM of your choice, such as GPT-4o.<\/p>\n<p><strong>Integration<\/strong>: Prompty integrates seamlessly with your development workflow. Start by creating a new file or opening an existing one where you want to use the LLM. Prompty provides commands and snippets to easily insert prompts and handle responses.<\/p>\n<p><strong>Development<\/strong>: Write prompts directly in your codebase to interact with the LLM. Prompty supports various prompt formats and provides syntax highlighting to make your prompts readable and maintainable.<\/p>\n<p>Once you have your prompt ready, you can use the extension to generate code snippets, create documentation, or even debug your applications by asking the LLM specific questions.<\/p>\n<p><strong>Testing<\/strong>: Test your prompts and adjust them as needed to get the desired responses from the LLM. 
Prompty allows you to iterate quickly, refining your prompts to improve the accuracy and relevance of the AI\u2019s responses.<\/p>\n<h2>Real Sample Using a WebAPI Application<\/h2>\n<p>Let\u2019s walk through a practical example of using Prompty in a .NET WebAPI application.<\/p>\n<h3>Step 1: Set up the WebAPI Project<\/h3>\n<p>First, create a new WebAPI project, named PromptyWebAPI, using the .NET CLI:<\/p>\n<p>dotnet new webapi -n PromptyWebAPI<br \/>\ncd PromptyWebAPI<\/p>\n<p>Add the following dependencies:<\/p>\n<p>dotnet add package Microsoft.SemanticKernel --version 1.15.1<br \/>\ndotnet add package Microsoft.SemanticKernel.Prompty --version 1.15.1-alpha<br \/>\ndotnet add package Microsoft.Extensions.Configuration.UserSecrets --version 8.0.0<\/p>\n<p>Run the project directly from Visual Studio Code or with the command:<\/p>\n<p>dotnet run<\/p>\n<p>We will see the standard weather forecast API endpoint.<\/p>\n\n<p><strong>Note:<\/strong> The WebAPI project uses User Secrets to access the GPT-4o Azure OpenAI model. Set up the user secrets with the following commands:<\/p>\n<p>dotnet user-secrets init<br \/>\ndotnet user-secrets set \"AZURE_OPENAI_MODEL\" \"&lt; model &gt;\"<br \/>\ndotnet user-secrets set \"AZURE_OPENAI_ENDPOINT\" \"&lt; endpoint &gt;\"<br \/>\ndotnet user-secrets set \"AZURE_OPENAI_APIKEY\" \"&lt; api key &gt;\"<\/p>\n<h3>Step 2: Create a prompty file for a more descriptive forecast summary<\/h3>\n<p>The weather forecast returns a fictitious forecast including dates, temperatures (C and F) and a summary. Our goal is to create a prompt that can generate a more detailed summary.<\/p>\n<p>Let\u2019s add a new prompty file to the root of the PromptyWebAPI folder. This is as easy as right-clicking in the Explorer and selecting New Prompty. 
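<p><em><strong>Tip:<\/strong> Before running the app, you can optionally confirm that the secrets above were stored correctly. This check is not part of the original walkthrough; the output will list the values you just set.<\/em><\/p>

```
dotnet user-secrets list
```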
Rename the created file to weatherforecastdesc.prompty.<\/p>\n<p>Our solution should now include the weatherforecastdesc.prompty file at the root of the project:<\/p>\n\n<p>Now it\u2019s time to complete the sections of our prompty file. Each section has specific information related to the use of the LLM. For example, the <strong>model section<\/strong> defines the model to be used, the <strong>sample section<\/strong> provides sample values for the prompt inputs, and finally we have the prompt itself. <\/p>\n<p>In the following sample, the prompt defines a <strong>System message<\/strong>, and in the <strong>Context<\/strong>, we provide the parameters for the weather.<\/p>\n<p>Replace the content of your prompty file with the following: <\/p>\n<p>---<br \/>\nname: generate_weather_detailed_description<br \/>\ndescription: A prompt that generates a detailed description for a weather forecast<br \/>\nauthors:<br \/>\n  - Bruno Capuano<br \/>\nmodel:<br \/>\n  api: chat<br \/>\n  configuration:<br \/>\n    type: azure_openai<br \/>\n    azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}<br \/>\n    azure_deployment: ${env:AZURE_OPENAI_MODEL}<br \/>\n  parameters:<br \/>\n    max_tokens: 3000<br \/>\nsample:<br \/>\n  today: &gt;<br \/>\n    2024-07-16<\/p>\n<p>  date: &gt;<br \/>\n    2024-07-17<\/p>\n<p>  forecastTemperatureC: &gt;<br \/>\n    25\u00b0C<br \/>\n---<\/p>\n<p># System:<br \/>\nYou are an AI assistant who generates detailed weather forecast descriptions. 
The detailed description is a paragraph long.<br \/>\nYou use the full description of the date, including the weekday.<br \/>\nYou also give a reference to the forecast compared to the current date today.<br \/>\nAs the assistant, you generate descriptions using a funny style and even add some personal flair with appropriate emojis.<\/p>\n<p># Context<br \/>\nUse the following context to generate a detailed weather forecast description:<br \/>\n- Today: {{today}}<br \/>\n- Date: {{date}}<br \/>\n- TemperatureC: {{forecastTemperatureC}}<\/p>\n<p>Once our prompty file is ready, we can test the prompt by pressing <strong>F5<\/strong>. Once we run our prompty, we should see the results in the output window:<\/p>\n\n<p>This is the moment to start refining our prompt!<\/p>\n<p><em><strong>Bonus:<\/strong> Right-clicking on the prompty file also allows generating C# code that uses the current file with Semantic Kernel.<\/em><\/p>\n\n<p><em><strong>Note:<\/strong> Before doing this, we need to provide the connection information for the LLM that Prompty will use. Follow the extension configuration to do this. The easiest way is to create a .env file with the LLM information.<\/em><\/p>\n<h3>Step 3: Update the endpoint using prompty for the forecast summary<\/h3>\n<p>Now we can use this prompty file, through Semantic Kernel, directly in our project. 
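<p><em>The extension configuration note above mentions a .env file but does not show one. As a sketch, it could look like the following; the first two variable names match the ${env:...} references in the prompty front matter, while the API key entry is an assumption here, so check the extension\u2019s configuration for the exact names it reads.<\/em><\/p>

```
AZURE_OPENAI_ENDPOINT=< endpoint >
AZURE_OPENAI_MODEL=< model >
AZURE_OPENAI_APIKEY=< api key >
```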
Let\u2019s edit the Program.cs file and apply the following changes:<\/p>\n<p>Add the necessary usings to the top of the file.<br \/>\nCreate a Semantic Kernel to generate the forecast summaries.<\/p>\n<p>Add the new forecast summary to the forecast result.<\/p>\n<p>To generate the detailed summary, Semantic Kernel will use the prompty file and the weather information.<\/p>\n<p>using Microsoft.SemanticKernel;<\/p>\n<p>var builder = WebApplication.CreateBuilder(args);<\/p>\n<p>\/\/ Add services to the container.<br \/>\n\/\/ Learn more about configuring Swagger\/OpenAPI at https:\/\/aka.ms\/aspnetcore\/swashbuckle<br \/>\nbuilder.Services.AddEndpointsApiExplorer();<br \/>\nbuilder.Services.AddSwaggerGen();<\/p>\n<p>\/\/ Azure OpenAI keys<br \/>\nvar config = new ConfigurationBuilder().AddUserSecrets&lt;Program&gt;().Build();<br \/>\nvar deploymentName = config[\"AZURE_OPENAI_MODEL\"];<br \/>\nvar endpoint = config[\"AZURE_OPENAI_ENDPOINT\"];<br \/>\nvar apiKey = config[\"AZURE_OPENAI_APIKEY\"];<\/p>\n<p>\/\/ Create a chat completion service<br \/>\nbuilder.Services.AddKernel();<br \/>\nbuilder.Services.AddAzureOpenAIChatCompletion(deploymentName, endpoint, apiKey);<\/p>\n<p>var app = builder.Build();<\/p>\n<p>\/\/ Configure the HTTP request pipeline.<br \/>\nif (app.Environment.IsDevelopment())<br \/>\n{<br \/>\n    app.UseSwagger();<br \/>\n    app.UseSwaggerUI();<br \/>\n}<\/p>\n<p>app.UseHttpsRedirection();<\/p>\n<p>app.MapGet(\"\/weatherforecast\", async (HttpContext context, Kernel kernel) =&gt;<br \/>\n{<br \/>\n    var forecast = new List&lt;WeatherForecast&gt;();<br \/>\n    for (int i = 0; i &lt; 3; i++)<br \/>\n    {<br \/>\n        var forecastDate = DateOnly.FromDateTime(DateTime.Now.AddDays(i));<br \/>\n        var forecastTemperature = Random.Shared.Next(-20, 55);<\/p>\n<p>        \/\/ Load the prompty file as a kernel function and invoke it with the forecast data<br \/>\n        var weatherFunc = kernel.CreateFunctionFromPromptyFile(\"weatherforecastdesc.prompty\");<br \/>\n        var 
forecastSummary = await weatherFunc.InvokeAsync&lt;string&gt;(kernel, new()<br \/>\n        {<br \/>\n            { \"today\", $\"{DateOnly.FromDateTime(DateTime.Now)}\" },<br \/>\n            { \"date\", $\"{forecastDate}\" },<br \/>\n            { \"forecastTemperatureC\", $\"{forecastTemperature}\" }<br \/>\n        });<\/p>\n<p>        forecast.Add(new WeatherForecast(forecastDate, forecastTemperature, forecastSummary));<br \/>\n    }<br \/>\n    return forecast;<br \/>\n})<br \/>\n.WithName(\"GetWeatherForecast\")<br \/>\n.WithOpenApi();<\/p>\n<p>app.Run();<\/p>\n<p>record WeatherForecast(DateOnly Date, int TemperatureC, string? Summary)<br \/>\n{<br \/>\n    public int TemperatureF =&gt; 32 + (int)(TemperatureC \/ 0.5556);<br \/>\n}<\/p>\n<p>When we test the \/weatherforecast endpoint again, the outputs should include more detailed summaries. The following example includes the current date (Jul-16) and 2 more days:<\/p>\n<p><em><strong>Note:<\/strong> These are randomly generated temperatures. I\u2019m not sure about a temperature change from -4\u00b0C\/25\u00b0F to 45\u00b0C\/112\u00b0F in a single day.<\/em><\/p>\n<p>[<br \/>\n  {<br \/>\n    &#8220;date&#8221;: &#8220;2024-07-16&#8221;,<br \/>\n    &#8220;temperatureC&#8221;: -4,<br \/>\n    &#8220;summary&#8221;: &#8220;\ud83c\udf2c\u2744 Happy Tuesday, July 16th, 2024, folks! Guess what? Today\u2019s weather forecast is brought to you by the Frozen Frappuccino Club, because it\u2019s a chilly one! With a temperature of -4\u00b0C, it\u2019s colder than a snowman\u2019s nose out there! \ud83e\udd76 So, get ready to channel your inner penguin and waddle through the frosty air. Remember to layer up with your snuggiest sweaters and warmest scarves, or you might just turn into an icicle! Compared to good old yesterday, well&#8230; there\u2019s not much change, because yesterday was just as brrrrr-tastic. 
Stay warm, my friends, and maybe keep a hot chocolate handy for emergencies! \u2615\u26c4&#8221;,<br \/>\n    &#8220;temperatureF&#8221;: 25<br \/>\n  },<br \/>\n  {<br \/>\n    &#8220;date&#8221;: &#8220;2024-07-17&#8221;,<br \/>\n    &#8220;temperatureC&#8221;: 45,<br \/>\n    &#8220;summary&#8221;: &#8220;\ud83c\udf1e\ud83d\udd25 Well, buckle up, buttercup, because *Wednesday, July 17, 2024*, is coming in hot! If you thought today was toasty, wait until you get a load of tomorrow. With a sizzling temperature of 45\u00b0C, it&#8217;s like Mother Nature cranked the thermostat up to &#8220;sauna mode.&#8221; \ud83c\udf21 Don&#8217;t even think about wearing dark colors or stepping outside without some serious SPF and hydration on standby! Maybe it&#8217;s a good day to try frying an egg on the sidewalk for breakfast\u2014just kidding, or am I? \ud83e\udd75 Anyway, stay cool, find a shady spot, and keep your ice cream close; you&#8217;re gonna need it! \ud83c\udf66&#8221;,<br \/>\n    &#8220;temperatureF&#8221;: 112<br \/>\n  },<br \/>\n  {<br \/>\n    &#8220;date&#8221;: &#8220;2024-07-18&#8221;,<br \/>\n    &#8220;temperatureC&#8221;: 35,<br \/>\n    &#8220;summary&#8221;: &#8220;Ladies and gentlemen, fasten your seatbelts and hold onto your hats\u2014it\u2019s going to be a sizzling ride! \ud83d\udd76\ud83c\udf1e On Thursday, July 18, 2024, just two days from today, Mother Nature cranks up the heat like she\u2019s trying to turn the entire planet into a giant summer barbecue. \ud83c\udf21\ud83d\udd25 With the temperature shooting up to a toasty 35\u00b0C, it\u2019s the perfect day to channel your inner popsicle in front of the A\/C. Water fights, ice cream sundaes, and epic pool floats are all highly recommended survival strategies. And remember, folks, sunscreen is your best friend\u2014don&#8217;t be caught out there lookin\u2019 like a lobster in a sauna! 
\ud83e\udd9e\u2600 So get ready to sweat but with style, as we dive headfirst into the fantastic scorching adventure that Thursday promises to be! \ud83d\ude0e\ud83c\udf67\ud83c\udf34&#8221;,<br \/>\n    &#8220;temperatureF&#8221;: 94<br \/>\n  }<br \/>\n]<\/p>\n<h2>Summary<\/h2>\n<p>Prompty offers .NET developers an efficient way to integrate AI capabilities into their applications. By using this Visual Studio Code extension, developers can effortlessly incorporate GPT-4o and other Large Language Models into their workflows.<\/p>\n<p>Together, Prompty and Semantic Kernel simplify the process of generating code snippets, creating documentation, and debugging applications with AI assistance.<\/p>\n<p>To learn more about Prompty and explore its features, visit the <strong><a href=\"https:\/\/prompty.ai\/\">Prompty main page<\/a><\/strong>, check out the <strong><a href=\"https:\/\/marketplace.visualstudio.com\/items?itemName=ms-toolsai.prompty\">Prompty Visual Studio Code extension<\/a><\/strong>, dive into the <strong><a href=\"https:\/\/github.com\/microsoft\/prompty\">Prompty source code on GitHub<\/a><\/strong>, or watch the Build session <strong><a href=\"https:\/\/www.youtube.com\/watch?v=HALMFU7o9Gc\">Practical End-to-End AI Development using Prompty and AI Studio | BRK114<\/a><\/strong>.<\/p>\n<p>The post <a href=\"https:\/\/devblogs.microsoft.com\/dotnet\/add-ai-to-your-dotnet-apps-easily-with-prompty\/\">Add AI to Your .NET Apps Easily with Prompty<\/a> appeared first on <a href=\"https:\/\/devblogs.microsoft.com\/dotnet\">.NET Blog<\/a>.<\/p>","protected":false},"excerpt":{"rendered":"<p>Adding AI features to .NET development is a new and exciting experience. 
In this blog post, we will explore Prompty [&hellip;]<\/p>\n","protected":false},"author":0,"featured_media":0,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[7],"tags":[],"class_list":["post-1061","post","type-post","status-publish","format-standard","hentry","category-dotnet"]}