---
title: How to quickly start with Semantic Kernel
description: Follow along with Semantic Kernel's guides to quickly learn how to use the SDK
zone_pivot_groups: programming-languages
author: matthewbolanos
ms.topic: quickstart
ms.author: mabolan
ms.date: 07/11/2023
ms.service: semantic-kernel
---
In just a few steps, you can build your first AI agent with Semantic Kernel in Python, .NET, or Java. This guide will show you how to...
- Install the necessary packages
- Create a back-and-forth conversation with an AI
- Give an AI agent the ability to run your code
- Watch the AI create plans on the fly
::: zone pivot="programming-language-csharp"
Semantic Kernel has several NuGet packages available. For most scenarios, however, you typically only need `Microsoft.SemanticKernel`.

You can install it using the following command:

```bash
dotnet add package Microsoft.SemanticKernel
```
For the full list of NuGet packages, please refer to the supported languages article.
::: zone-end
::: zone pivot="programming-language-python"
Instructions for accessing the `SemanticKernel` Python package are available here. It's as easy as:

```bash
pip install semantic-kernel
```
::: zone-end
::: zone pivot="programming-language-java"
Instructions for accessing the `SemanticKernel` Java package are available here. It's as easy as:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.microsoft.semantic-kernel</groupId>
            <artifactId>semantickernel-bom</artifactId>
            <version>${sk.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    <dependency>
        <groupId>com.microsoft.semantic-kernel</groupId>
        <artifactId>semantickernel-api</artifactId>
    </dependency>
    <dependency>
        <groupId>com.microsoft.semantic-kernel</groupId>
        <artifactId>semantickernel-aiservices-openai</artifactId>
    </dependency>
</dependencies>
```
::: zone-end
::: zone pivot="programming-language-python,programming-language-csharp"
If you're a Python or C# developer, you can quickly get started with our notebooks. These notebooks provide step-by-step guides on how to use Semantic Kernel to build AI agents.
::: zone-end
::: zone pivot="programming-language-python"
To get started, follow these steps:
- Clone the Semantic Kernel repo
- Open the repo in Visual Studio Code
- Navigate to _/python/samples/getting_started
- Open 00-getting-started.ipynb to get started setting up your environment and creating your first AI agent!
::: zone-end
::: zone pivot="programming-language-csharp"
To get started, follow these steps:
- Clone the Semantic Kernel repo
- Open the repo in Visual Studio Code
- Navigate to _/dotnet/notebooks
- Open 00-getting-started.ipynb to get started setting up your environment and creating your first AI agent!
::: zone-end
::: zone pivot="programming-language-csharp"
- Create a new .NET Console project using this command:

```bash
dotnet new console
```

- Install the following .NET dependencies:

```bash
dotnet add package Microsoft.SemanticKernel
dotnet add package Microsoft.Extensions.Logging
dotnet add package Microsoft.Extensions.Logging.Console
```
- Replace the content of the `Program.cs` file with this code:
```csharp
// Import packages
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Populate values from your OpenAI deployment
var modelId = "";
var endpoint = "";
var apiKey = "";

// Create a kernel with Azure OpenAI chat completion
var builder = Kernel.CreateBuilder().AddAzureOpenAIChatCompletion(modelId, endpoint, apiKey);

// Add enterprise components
builder.Services.AddLogging(services => services.AddConsole().SetMinimumLevel(LogLevel.Trace));

// Build the kernel
Kernel kernel = builder.Build();
var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();

// Add a plugin (the LightsPlugin class is defined below)
kernel.Plugins.AddFromType<LightsPlugin>("Lights");

// Enable planning
OpenAIPromptExecutionSettings openAIPromptExecutionSettings = new()
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

// Create a history to store the conversation
var history = new ChatHistory();

// Initiate a back-and-forth chat
string? userInput;
do
{
    // Collect user input
    Console.Write("User > ");
    userInput = Console.ReadLine();

    // Add user input
    history.AddUserMessage(userInput);

    // Get the response from the AI
    var result = await chatCompletionService.GetChatMessageContentAsync(
        history,
        executionSettings: openAIPromptExecutionSettings,
        kernel: kernel);

    // Print the results
    Console.WriteLine("Assistant > " + result);

    // Add the message from the agent to the chat history
    history.AddMessage(result.Role, result.Content ?? string.Empty);
} while (userInput is not null);
```
::: zone-end
::: zone pivot="programming-language-python"
```python
import asyncio
import logging

from semantic_kernel import Kernel
from semantic_kernel.utils.logging import setup_logging
from semantic_kernel.functions import kernel_function
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.chat_completion_client_base import ChatCompletionClientBase
from semantic_kernel.contents.chat_history import ChatHistory
from semantic_kernel.functions.kernel_arguments import KernelArguments
from semantic_kernel.connectors.ai.open_ai.prompt_execution_settings.azure_chat_prompt_execution_settings import (
    AzureChatPromptExecutionSettings,
)

async def main():
    # Initialize the kernel
    kernel = Kernel()

    # Add Azure OpenAI chat completion
    chat_completion = AzureChatCompletion(
        deployment_name="your_models_deployment_name",
        api_key="your_api_key",
        base_url="your_base_url",
    )
    kernel.add_service(chat_completion)

    # Set the logging level for semantic_kernel.kernel to DEBUG.
    setup_logging()
    logging.getLogger("kernel").setLevel(logging.DEBUG)

    # Add a plugin (the LightsPlugin class is defined below)
    kernel.add_plugin(
        LightsPlugin(),
        plugin_name="Lights",
    )

    # Enable planning
    execution_settings = AzureChatPromptExecutionSettings()
    execution_settings.function_choice_behavior = FunctionChoiceBehavior.Auto()

    # Create a history of the conversation
    history = ChatHistory()

    # Initiate a back-and-forth chat
    userInput = None
    while True:
        # Collect user input
        userInput = input("User > ")

        # Terminate the loop if the user says "exit"
        if userInput == "exit":
            break

        # Add user input to the history
        history.add_user_message(userInput)

        # Get the response from the AI
        result = await chat_completion.get_chat_message_content(
            chat_history=history,
            settings=execution_settings,
            kernel=kernel,
        )

        # Print the results
        print("Assistant > " + str(result))

        # Add the message from the agent to the chat history
        history.add_message(result)

# Run the main function
if __name__ == "__main__":
    asyncio.run(main())
```
::: zone-end
::: zone pivot="programming-language-java"
:::code language="java" source="~/../semantic-kernel-samples-java/learnDocs/LightsApp/src/main/java/LightsApp.java" id="LightAppExample":::
::: zone-end
The following back-and-forth chat should be similar to what you see in the console. The function calls have been added below to demonstrate how the AI leverages the plugin behind the scenes.
Role | Message |
---|---|
🔵 User | Please toggle the light |
🔴 Assistant (function call) | LightsPlugin.GetState() |
🟢 Tool | off |
🔴 Assistant (function call) | LightsPlugin.ChangeState(true) |
🟢 Tool | on |
🔴 Assistant | The light is now on |
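Abstracting away the SDK, the round trip in the table above can be sketched in plain Python. None of these names are Semantic Kernel APIs; they only illustrate the order of calls the model makes against the plugin:

```python
# Plain-Python sketch of the tool-call round trip shown in the table above.
# All names are illustrative, not Semantic Kernel APIs.

class LightsPlugin:
    def __init__(self):
        self.is_on = False

    def get_state(self) -> str:
        return "on" if self.is_on else "off"

    def change_state(self, is_on: bool) -> str:
        self.is_on = is_on
        return self.get_state()

def handle_toggle(plugin: LightsPlugin) -> str:
    # 1. The model first calls GetState() to learn the current state.
    state = plugin.get_state()
    # 2. Based on the tool result ("off"), it calls ChangeState(true).
    new_state = plugin.change_state(state == "off")
    # 3. The tool result ("on") lets the model produce the final answer.
    return f"The light is now {new_state}"

plugin = LightsPlugin()
print(handle_toggle(plugin))  # The light is now on
```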
If you're interested in understanding more about the code above, we'll break it down in the next section.
To make it easier to get started building enterprise apps with Semantic Kernel, we've created a step-by-step guide that walks you through the process of creating a kernel and using it to interact with AI services.
In the following sections, we'll unpack the above sample by walking through steps 1, 2, 3, 4, 6, 9, and 10: everything you need to build a simple agent that is powered by an AI service and can run your code.
::: zone pivot="programming-language-csharp,programming-language-python"
- Add memory (skipped)
- Create kernel arguments (skipped)
- Create prompts (skipped)
::: zone-end
::: zone pivot="programming-language-java"
- Add memory (skipped)
- Create kernel arguments (skipped)
- Create prompts (skipped)
::: zone-end
For this sample, we started by importing the following packages:
::: zone pivot="programming-language-csharp"
```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;
```
::: zone-end
::: zone pivot="programming-language-python"
```python
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.chat_completion_client_base import ChatCompletionClientBase
from semantic_kernel.contents.chat_history import ChatHistory
from semantic_kernel.functions.kernel_arguments import KernelArguments
from semantic_kernel.connectors.ai.open_ai.prompt_execution_settings.azure_chat_prompt_execution_settings import (
    AzureChatPromptExecutionSettings,
)
```
::: zone-end
::: zone pivot="programming-language-java"
```java
import com.microsoft.semantickernel.Kernel;
import com.microsoft.semantickernel.aiservices.openai.chatcompletion.OpenAIChatCompletion;
import com.microsoft.semantickernel.contextvariables.ContextVariableTypeConverter;
import com.microsoft.semantickernel.contextvariables.ContextVariableTypes;
import com.microsoft.semantickernel.orchestration.InvocationContext;
import com.microsoft.semantickernel.orchestration.InvocationReturnMode;
import com.microsoft.semantickernel.orchestration.ToolCallBehavior;
import com.microsoft.semantickernel.plugin.KernelPlugin;
import com.microsoft.semantickernel.plugin.KernelPluginFactory;
import com.microsoft.semantickernel.services.chatcompletion.AuthorRole;
import com.microsoft.semantickernel.services.chatcompletion.ChatCompletionService;
import com.microsoft.semantickernel.services.chatcompletion.ChatHistory;
import com.microsoft.semantickernel.services.chatcompletion.ChatMessageContent;
```
::: zone-end
Afterwards, we add the most important part of a kernel: the AI services that you want to use. In this example, we added an Azure OpenAI chat completion service to the kernel builder.
Note
In this example, we used Azure OpenAI, but you can use any other chat completion service. To see the full list of supported services, refer to the supported languages article. If you need help creating a different service, refer to the AI services article. There, you'll find guidance on how to use OpenAI or Azure OpenAI models as services.
::: zone pivot="programming-language-csharp"
```csharp
// Create kernel
var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(modelId, endpoint, apiKey);
```
::: zone-end
::: zone pivot="programming-language-python"
```python
# Initialize the kernel
kernel = Kernel()

# Add Azure OpenAI chat completion
kernel.add_service(AzureChatCompletion(
    deployment_name="your_models_deployment_name",
    api_key="your_api_key",
    base_url="your_base_url",
))
```
::: zone-end
::: zone pivot="programming-language-java"
:::code language="java" source="~/../semantic-kernel-samples-java/learnDocs/LightsApp/src/main/java/LightsApp.java" id="createservice":::
::: zone-end
::: zone pivot="programming-language-csharp,programming-language-python"
One of the main benefits of using Semantic Kernel is that it supports enterprise-grade services. In this sample, we added the logging service to the kernel to help debug the AI agent.
::: zone-end
::: zone pivot="programming-language-csharp"
```csharp
builder.Services.AddLogging(services => services.AddConsole().SetMinimumLevel(LogLevel.Trace));
```
::: zone-end
::: zone pivot="programming-language-python"
```python
import logging

# Set the logging level for semantic_kernel.kernel to DEBUG.
logging.basicConfig(
    format="[%(asctime)s - %(name)s:%(lineno)d - %(levelname)s] %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S",
)
logging.getLogger("kernel").setLevel(logging.DEBUG)
```
::: zone-end
::: zone pivot="programming-language-csharp"
Once the services have been added, we then build the kernel and retrieve the chat completion service for later use.

```csharp
Kernel kernel = builder.Build();

// Retrieve the chat completion service
var chatCompletionService = kernel.Services.GetRequiredService<IChatCompletionService>();
```
::: zone-end
::: zone pivot="programming-language-python"
Once the kernel has been configured, we then retrieve the chat completion service for later use.

Note

In Python, you don't need to explicitly build the kernel. Instead, you can access the services directly from the kernel object.

```python
chat_completion: AzureChatCompletion = kernel.get_service(type=ChatCompletionClientBase)
```
::: zone-end
::: zone pivot="programming-language-java"
:::code language="java" source="~/../semantic-kernel-samples-java/learnDocs/LightsApp/src/main/java/LightsApp.java" id="buildkernel":::
::: zone-end
With plugins, you can give your AI agent the ability to run your code to retrieve information from external sources or to perform actions. In the above example, we added a plugin that allows the AI agent to interact with a light bulb. Below, we'll show you how to create this plugin.
Below, you can see that creating a native plugin is as simple as creating a new class.
In this example, we've created a plugin that can manipulate a light bulb. While this is a simple example, this plugin quickly demonstrates how you can support both...
- Retrieval Augmented Generation (RAG) by providing the AI agent with the state of the light bulb
- And task automation by allowing the AI agent to turn the light bulb on or off.
In your own code, you can create a plugin that interacts with any external service or API to achieve similar results.
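Under the hood, attributes like `[KernelFunction]` in C# and `@kernel_function` in Python tag the methods a plugin exposes so the kernel can discover them and describe them to the model. The discovery pattern can be sketched in plain Python; all names here (`kernel_function`, `discover_functions`, `WeatherPlugin`) are hypothetical stand-ins, not the SDK's actual implementation:

```python
# Minimal sketch of attribute-based function discovery, loosely modeled on how
# plugin methods are exposed to the model. Hypothetical names throughout.

def kernel_function(name, description):
    """Tag a method with metadata a 'kernel' can later discover."""
    def decorator(func):
        func._kernel_meta = {"name": name, "description": description}
        return func
    return decorator

def discover_functions(plugin) -> dict:
    """Collect every tagged method into a name -> callable registry."""
    registry = {}
    for attr_name in dir(plugin):
        attr = getattr(plugin, attr_name)
        meta = getattr(attr, "_kernel_meta", None)
        if meta is not None:
            registry[meta["name"]] = attr
    return registry

class WeatherPlugin:
    @kernel_function(name="get_temperature", description="Current temperature in Celsius")
    def get_temperature(self, city: str) -> float:
        return 21.5  # a real plugin would call an external API here

registry = discover_functions(WeatherPlugin())
print(sorted(registry))                        # ['get_temperature']
print(registry["get_temperature"]("Seattle"))  # 21.5
```

The registry (plus each function's name and description) is what a real kernel serializes into the tool definitions it sends to the model.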
::: zone pivot="programming-language-csharp"
```csharp
using System.ComponentModel;
using System.Text.Json.Serialization;
using Microsoft.SemanticKernel;

public class LightsPlugin
{
   // Mock data for the lights
   private readonly List<LightModel> lights = new()
   {
      new LightModel { Id = 1, Name = "Table Lamp", IsOn = false },
      new LightModel { Id = 2, Name = "Porch light", IsOn = false },
      new LightModel { Id = 3, Name = "Chandelier", IsOn = true }
   };

   [KernelFunction("get_lights")]
   [Description("Gets a list of lights and their current state")]
   public async Task<List<LightModel>> GetLightsAsync()
   {
      return lights;
   }

   [KernelFunction("change_state")]
   [Description("Changes the state of the light")]
   public async Task<LightModel?> ChangeStateAsync(int id, bool isOn)
   {
      var light = lights.FirstOrDefault(light => light.Id == id);

      if (light == null)
      {
         return null;
      }

      // Update the light with the new state
      light.IsOn = isOn;

      return light;
   }
}

public class LightModel
{
   [JsonPropertyName("id")]
   public int Id { get; set; }

   [JsonPropertyName("name")]
   public string Name { get; set; }

   [JsonPropertyName("is_on")]
   public bool? IsOn { get; set; }
}
```
::: zone-end
::: zone pivot="programming-language-python"
```python
from typing import Annotated
from semantic_kernel.functions import kernel_function

class LightsPlugin:
    lights = [
        {"id": 1, "name": "Table Lamp", "is_on": False},
        {"id": 2, "name": "Porch light", "is_on": False},
        {"id": 3, "name": "Chandelier", "is_on": True},
    ]

    @kernel_function(
        name="get_lights",
        description="Gets a list of lights and their current state",
    )
    def get_state(
        self,
    ) -> str:
        """Gets a list of lights and their current state."""
        return self.lights

    @kernel_function(
        name="change_state",
        description="Changes the state of the light",
    )
    def change_state(
        self,
        id: int,
        is_on: bool,
    ) -> str:
        """Changes the state of the light."""
        for light in self.lights:
            if light["id"] == id:
                light["is_on"] = is_on
                return light
        return None
```
::: zone-end
::: zone pivot="programming-language-java"
:::code language="java" source="~/../semantic-kernel-samples-java/learnDocs/LightsApp/src/main/java/LightsPlugin.java" id="plugin":::
::: zone-end
Once you've created your plugin, you can add it to the kernel so the AI agent can access it. In the sample, we added the `LightsPlugin` class to the kernel.
::: zone pivot="programming-language-csharp"
```csharp
// Add the plugin to the kernel
kernel.Plugins.AddFromType<LightsPlugin>("Lights");
```
::: zone-end
::: zone pivot="programming-language-python"
```python
# Add the plugin to the kernel
kernel.add_plugin(
    LightsPlugin(),
    plugin_name="Lights",
)
```
::: zone-end
::: zone pivot="programming-language-java"
:::code language="java" source="~/../semantic-kernel-samples-java/learnDocs/LightsApp/src/main/java/LightsApp.java" id="importplugin":::
::: zone-end
Semantic Kernel leverages function calling, a native feature of most LLMs, to provide planning. With function calling, LLMs can request (or call) a particular function to satisfy a user's request. Semantic Kernel then marshals the request to the appropriate function in your codebase and returns the results back to the LLM so the AI agent can generate a final response.
To enable automatic function calling, we first need to create the appropriate execution settings so that Semantic Kernel knows to automatically invoke the functions in the kernel when the AI agent requests them.
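The automatic invocation this enables boils down to a loop: send the history to the model, and if the reply is a function call rather than text, run the function, append its result to the history, and ask again. A plain-Python sketch of that loop with a stubbed model (no real LLM or SDK calls; all names are illustrative):

```python
# Sketch of the auto function-calling loop. The "model" here is a stub that
# requests one tool call before producing its final text answer.

def auto_invoke_loop(model, functions, history):
    """Keep calling the model; execute any requested function until we get text."""
    while True:
        reply = model(history)
        if reply["type"] == "text":
            return reply["content"]
        # The model asked for a function call: run it and feed the result back.
        name, args = reply["name"], reply["args"]
        result = functions[name](**args)
        history.append({"role": "tool", "name": name, "content": str(result)})

def stub_model(history):
    # First turn: request the tool; once a tool result is present, answer.
    if any(msg["role"] == "tool" for msg in history):
        return {"type": "text", "content": "The light is now on"}
    return {"type": "call", "name": "change_state", "args": {"is_on": True}}

functions = {"change_state": lambda is_on: "on" if is_on else "off"}
history = [{"role": "user", "content": "Please toggle the light"}]
print(auto_invoke_loop(stub_model, functions, history))  # The light is now on
```

With `FunctionChoiceBehavior.Auto()`, Semantic Kernel runs this loop for you; without it, your code would have to inspect each reply for function calls and invoke them manually.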
::: zone pivot="programming-language-csharp"
```csharp
OpenAIPromptExecutionSettings openAIPromptExecutionSettings = new()
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};
```
::: zone-end
::: zone pivot="programming-language-python"
```python
execution_settings = AzureChatPromptExecutionSettings()
execution_settings.function_choice_behavior = FunctionChoiceBehavior.Auto()
```
::: zone-end
::: zone pivot="programming-language-java"
:::code language="java" source="~/../semantic-kernel-samples-java/learnDocs/LightsApp/src/main/java/LightsApp.java" id="enableplanning":::
::: zone-end
Finally, we invoke the AI agent with the plugin. The sample code demonstrates how to generate a non-streaming response, but you can also generate a streaming response by using the `GetStreamingChatMessageContentAsync` method.
::: zone pivot="programming-language-csharp"
```csharp
// Create chat history
var history = new ChatHistory();

// Get the response from the AI
var result = await chatCompletionService.GetChatMessageContentAsync(
    history,
    executionSettings: openAIPromptExecutionSettings,
    kernel: kernel);
```

Run the program using this command:

```bash
dotnet run
```
::: zone-end
::: zone pivot="programming-language-python"
```python
# Create a history of the conversation
history = ChatHistory()

# Get the response from the AI
result = (
    await chat_completion.get_chat_message_contents(
        chat_history=history,
        settings=execution_settings,
        kernel=kernel,
        arguments=KernelArguments(),
    )
)[0]
```
::: zone-end
::: zone pivot="programming-language-java"
:::code language="java" source="~/../semantic-kernel-samples-java/learnDocs/LightsApp/src/main/java/LightsApp.java" id="prompt":::
::: zone-end
In this guide, you learned how to quickly get started with Semantic Kernel by building a simple AI agent that can interact with an AI service and run your code. To see more examples and learn how to build more complex AI agents, check out our in-depth samples.