Exploring the Semantic Kernel ChatCompletionAgent
Tip
Detailed API documentation related to this discussion is available at:
Chat Completion in Semantic Kernel
Chat completion is fundamentally a protocol for chat-based interaction with an AI model, in which the chat history is maintained and presented to the model with each request. Semantic Kernel AI services offer a unified framework for integrating the chat completion capabilities of various AI models.
A ChatCompletionAgent can leverage any of these AI services to generate responses, whether directed at a user or at another agent.
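To make the protocol concrete, here is a minimal Python sketch (an illustration, assuming an Azure OpenAI deployment configured via environment variables) that maintains a ChatHistory and presents the full history to the model on every request:

from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion, AzureChatPromptExecutionSettings
from semantic_kernel.contents import ChatHistory

service = AzureChatCompletion()  # reads endpoint, key, and deployment from the environment
history = ChatHistory()          # the accumulated conversation

async def ask(text: str) -> str:
    history.add_user_message(text)
    # The entire history is sent with each request, so the model sees all prior turns.
    reply = await service.get_chat_message_content(
        chat_history=history,
        settings=AzureChatPromptExecutionSettings(),
    )
    if reply is not None:
        history.add_message(reply)  # keep the assistant reply for the next turn
    return str(reply)

A ChatCompletionAgent wraps this request/response loop, together with instructions and tools, so you do not have to manage it by hand.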
Preparing Your Development Environment
To proceed with developing a ChatCompletionAgent, configure your development environment with the appropriate packages.
Add the Microsoft.SemanticKernel.Agents.Core package to your project:
dotnet add package Microsoft.SemanticKernel.Agents.Core --prerelease
Install the semantic-kernel package:
pip install semantic-kernel
Important
Depending on the AI service you use with your ChatCompletionAgent, you may need to install additional packages. Check the following page for the required extras.
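For example, extras can be installed together with the core package. A hedged illustration (the extra name here is an assumption; verify the extras published for the service you use):

pip install semantic-kernel[azure]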
<dependency>
    <groupId>com.microsoft.semantic-kernel</groupId>
    <artifactId>semantickernel-agents-core</artifactId>
    <version>[LATEST]</version>
</dependency>
Creating a ChatCompletionAgent
A ChatCompletionAgent is fundamentally based on an AI service. Creating a ChatCompletionAgent therefore starts with creating a Kernel instance that contains one or more chat completion services, and then instantiating the agent with a reference to that Kernel instance.
// Initialize a Kernel with a chat-completion service
IKernelBuilder builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(/*<...configuration parameters>*/);
Kernel kernel = builder.Build();
// Create the agent
ChatCompletionAgent agent =
    new()
    {
        Name = "SummarizationAgent",
        Instructions = "Summarize user input",
        Kernel = kernel
    };
A ChatCompletionAgent can be created in two ways:
1. Provide the chat completion service directly
from semantic_kernel.agents import ChatCompletionAgent
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion

# Create the agent by directly providing the chat completion service
agent = ChatCompletionAgent(
    service=AzureChatCompletion(),  # your chat completion service instance
    name="<agent name>",
    instructions="<agent instructions>",
)
2. Create a kernel first, add the service to it, then provide the kernel
# Define the kernel
kernel = Kernel()
# Add the chat completion service to the kernel
kernel.add_service(AzureChatCompletion())
# Create the agent using the kernel
agent = ChatCompletionAgent(
    kernel=kernel,
    name="<agent name>",
    instructions="<agent instructions>",
)
The first approach is useful when you already have a chat completion service available. The second is useful when you need a kernel that manages multiple services or additional capabilities.
// Initialize a Kernel with a chat-completion service
var chatCompletion = OpenAIChatCompletion.builder()
    .withOpenAIAsyncClient(client) // OpenAIAsyncClient with configuration parameters
    .withModelId(MODEL_ID)
    .build();

var kernel = Kernel.builder()
    .withAIService(ChatCompletionService.class, chatCompletion)
    .build();

// Create the agent
var agent = ChatCompletionAgent.builder()
    .withKernel(kernel)
    .build();
AI Service Selection
No different from using Semantic Kernel AI services directly, a ChatCompletionAgent supports specifying a service selector. The service selector identifies which AI service to target when the Kernel contains more than one.
IKernelBuilder builder = Kernel.CreateBuilder();
// Initialize multiple chat-completion services.
builder.AddAzureOpenAIChatCompletion(/*<...service configuration>*/, serviceId: "service-1");
builder.AddAzureOpenAIChatCompletion(/*<...service configuration>*/, serviceId: "service-2");
Kernel kernel = builder.Build();
ChatCompletionAgent agent =
    new()
    {
        Name = "<agent name>",
        Instructions = "<agent instructions>",
        Kernel = kernel,
        Arguments = // Specify the service-identifier via the KernelArguments
            new KernelArguments(
                new OpenAIPromptExecutionSettings()
                {
                    ServiceId = "service-2" // The target service-identifier.
                })
    };
from semantic_kernel import Kernel
from semantic_kernel.agents import ChatCompletionAgent
from semantic_kernel.connectors.ai.open_ai import (
    AzureChatCompletion,
    AzureChatPromptExecutionSettings,
)
from semantic_kernel.functions import KernelArguments
# Define the Kernel
kernel = Kernel()
# Add the AzureChatCompletion AI Service to the Kernel
kernel.add_service(AzureChatCompletion(service_id="service1"))
kernel.add_service(AzureChatCompletion(service_id="service2"))
settings = AzureChatPromptExecutionSettings(service_id="service2")
# Create the agent
agent = ChatCompletionAgent(
    kernel=kernel,
    name="<agent name>",
    instructions="<agent instructions>",
    arguments=KernelArguments(settings=settings)
)
This feature is currently unavailable in Java.
Conversing with a ChatCompletionAgent
Conversing with a ChatCompletionAgent is based on a ChatHistory instance, no different from interacting with a chat completion AI service.
You can simply invoke the agent with a user message.
// Define agent
ChatCompletionAgent agent = ...;
// Generate the agent response(s)
await foreach (ChatMessageContent response in agent.InvokeAsync(new ChatMessageContent(AuthorRole.User, "<user input>")))
{
    // Process agent response(s)...
}
An AgentThread can also be used to converse with the agent.
Here we use a ChatHistoryAgentThread.
A ChatHistoryAgentThread can also take an optional ChatHistory object as input via its constructor, if resuming a previous conversation. (Not shown.)
// Define agent
ChatCompletionAgent agent = ...;
AgentThread thread = new ChatHistoryAgentThread();
// Generate the agent response(s)
await foreach (ChatMessageContent response in agent.InvokeAsync(new ChatMessageContent(AuthorRole.User, "<user input>"), thread))
{
    // Process agent response(s)...
}
There are multiple ways to converse with a ChatCompletionAgent.
The easiest is to call and await get_response:
# Define agent
agent = ChatCompletionAgent(...)
# Generate the agent response
response = await agent.get_response(messages="user input")
# response is an `AgentResponseItem[ChatMessageContent]` object
If you would like the agent to maintain conversation history between invocations, you can pass it a ChatHistoryAgentThread as follows:
# Define agent
agent = ChatCompletionAgent(...)
# Generate the agent response(s)
response = await agent.get_response(messages="user input")
# Generate another response, continuing the conversation thread from the first response.
response2 = await agent.get_response(messages="user input", thread=response.thread)
# process agent response(s)
Calling the invoke method returns an AsyncIterable of AgentResponseItem[ChatMessageContent].
# Define agent
agent = ChatCompletionAgent(...)
# Define the thread
thread = ChatHistoryAgentThread()
# Generate the agent response(s)
async for response in agent.invoke(messages="user input", thread=thread):
    # process agent response(s)
The ChatCompletionAgent also supports streaming, in which the invoke_stream method returns an AsyncIterable of StreamingChatMessageContent:
# Define agent
agent = ChatCompletionAgent(...)
# Define the thread
thread = ChatHistoryAgentThread()
# Generate the agent response(s)
async for response in agent.invoke_stream(messages="user input", thread=thread):
    # process agent response(s)
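If you are resuming a previous conversation in Python, a ChatHistoryAgentThread can be seeded with an existing ChatHistory before invoking the agent. A minimal sketch, assuming the chat_history constructor parameter and the Azure OpenAI service shown earlier:

from semantic_kernel.agents import ChatCompletionAgent, ChatHistoryAgentThread
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.contents import ChatHistory

# Rebuild the history from an earlier conversation (assumption: the
# ChatHistoryAgentThread constructor accepts a ChatHistory instance).
history = ChatHistory()
history.add_user_message("What is the special soup?")
history.add_assistant_message("The special soup is Clam Chowder.")

thread = ChatHistoryAgentThread(chat_history=history)

agent = ChatCompletionAgent(
    service=AzureChatCompletion(),
    name="Assistant",
    instructions="Answer questions about the menu.",
)

# The agent continues the conversation with the prior turns in context.
response = await agent.get_response(messages="How much does that cost?", thread=thread)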
ChatCompletionAgent agent = ...;
// Generate the agent response(s)
agent.invokeAsync(new ChatMessageContent<>(AuthorRole.USER, "<user input>")).block();
An AgentThread can also be used to converse with the agent.
Here we use a ChatHistoryAgentThread.
The ChatHistoryAgentThread constructor can also accept a ChatHistory object as input when resuming a previous conversation. (Not shown.)
// Define agent
ChatCompletionAgent agent = ...;
AgentThread thread = new ChatHistoryAgentThread();
// Generate the agent response(s)
agent.invokeAsync(new ChatMessageContent<>(AuthorRole.USER, "<user input>"), thread).block();
Handling Intermediate Messages with a ChatCompletionAgent
The Semantic Kernel ChatCompletionAgent is designed to invoke an agent that fulfills user queries or questions. During invocation, the agent may execute tools to derive the final answer. To access the intermediate messages produced during this process, callers can supply a callback function that handles instances of FunctionCallContent or FunctionResultContent.
Coming soon
Documentation for ChatCompletionAgent callbacks.
Configuring the on_intermediate_message callback within agent.invoke(...) or agent.invoke_stream(...) allows the caller to receive intermediate messages generated while the agent formulates its final response.
import asyncio
from typing import Annotated

from semantic_kernel.agents.chat_completion.chat_completion_agent import ChatCompletionAgent, ChatHistoryAgentThread
from semantic_kernel.connectors.ai.open_ai.services.azure_chat_completion import AzureChatCompletion
from semantic_kernel.contents import FunctionCallContent, FunctionResultContent
from semantic_kernel.contents.chat_message_content import ChatMessageContent
from semantic_kernel.functions import kernel_function


# Define a sample plugin for the sample
class MenuPlugin:
    """A sample Menu Plugin used for the concept sample."""

    @kernel_function(description="Provides a list of specials from the menu.")
    def get_specials(self) -> Annotated[str, "Returns the specials from the menu."]:
        return """
        Special Soup: Clam Chowder
        Special Salad: Cobb Salad
        Special Drink: Chai Tea
        """

    @kernel_function(description="Provides the price of the requested menu item.")
    def get_item_price(
        self, menu_item: Annotated[str, "The name of the menu item."]
    ) -> Annotated[str, "Returns the price of the menu item."]:
        return "$9.99"


# This callback function will be called for each intermediate message
# Which will allow one to handle FunctionCallContent and FunctionResultContent
# If the callback is not provided, the agent will return the final response
# with no intermediate tool call steps.
async def handle_intermediate_steps(message: ChatMessageContent) -> None:
    for item in message.items or []:
        if isinstance(item, FunctionCallContent):
            print(f"Function Call:> {item.name} with arguments: {item.arguments}")
        elif isinstance(item, FunctionResultContent):
            print(f"Function Result:> {item.result} for function: {item.name}")
        else:
            print(f"{message.role}: {message.content}")


async def main() -> None:
    agent = ChatCompletionAgent(
        service=AzureChatCompletion(),
        name="Assistant",
        instructions="Answer questions about the menu.",
        plugins=[MenuPlugin()],
    )

    # Create a thread for the agent
    # If no thread is provided, a new thread will be
    # created and returned with the initial response
    thread: ChatHistoryAgentThread = None

    user_inputs = [
        "Hello",
        "What is the special soup?",
        "How much does that cost?",
        "Thank you",
    ]

    for user_input in user_inputs:
        print(f"# User: '{user_input}'")
        async for response in agent.invoke(
            messages=user_input,
            thread=thread,
            on_intermediate_message=handle_intermediate_steps,
        ):
            print(f"# {response.role}: {response}")
            thread = response.thread


if __name__ == "__main__":
    asyncio.run(main())
The following demonstrates sample output from the agent invocation process:
User: 'Hello'
AuthorRole.ASSISTANT: Hi there! How can I assist you today?
User: 'What is the special soup?'
Function Call:> MenuPlugin-get_specials with arguments: {}
Function Result:>
Special Soup: Clam Chowder
Special Salad: Cobb Salad
Special Drink: Chai Tea
for function: MenuPlugin-get_specials
AuthorRole.ASSISTANT: The special soup today is Clam Chowder. Would you like to know anything else from the menu?
User: 'How much does that cost?'
Function Call:> MenuPlugin-get_item_price with arguments: {"menu_item":"Clam Chowder"}
Function Result:> $9.99 for function: MenuPlugin-get_item_price
AuthorRole.ASSISTANT: The Clam Chowder costs $9.99. Would you like to know more about the menu or anything else?
User: 'Thank you'
AuthorRole.ASSISTANT: You're welcome! If you have any more questions, feel free to ask. Enjoy your day!
This feature is currently unavailable in Java.
Declarative Spec
Documentation on using declarative specs is coming soon.
Important
This feature is in the experimental stage. Features at this stage are under development and subject to change before advancing to the preview or release candidate stage.
A ChatCompletionAgent can be instantiated directly from a YAML declarative spec. This approach lets you define the agent's core properties, instructions, and available functions (plugins) in a structured, portable way. Using YAML, you can describe the agent's name, description, instruction prompt, tool set, and model parameters in a single document, making the agent's configuration easy to review and reproduce.
Note
Any tools or functions referenced in the declarative YAML must already exist in the Kernel instance at the time the agent is created. The agent loader does not create new functions from the spec; instead, it looks up the referenced plugins and functions in the kernel by their identifiers. If a required plugin or function is not present in the kernel, an error is raised during agent construction.
Example: Creating a ChatCompletionAgent from a YAML spec
import asyncio
from typing import Annotated

from semantic_kernel import Kernel
from semantic_kernel.agents import AgentRegistry, ChatHistoryAgentThread
from semantic_kernel.agents.chat_completion.chat_completion_agent import ChatCompletionAgent
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.functions import kernel_function


# Define a plugin with kernel functions
class MenuPlugin:
    @kernel_function(description="Provides a list of specials from the menu.")
    def get_specials(self) -> Annotated[str, "Returns the specials from the menu."]:
        return """
        Special Soup: Clam Chowder
        Special Salad: Cobb Salad
        Special Drink: Chai Tea
        """

    @kernel_function(description="Provides the price of the requested menu item.")
    def get_item_price(
        self, menu_item: Annotated[str, "The name of the menu item."]
    ) -> Annotated[str, "Returns the price of the menu item."]:
        return "$9.99"


# YAML spec for the agent
AGENT_YAML = """
type: chat_completion_agent
name: Assistant
description: A helpful assistant.
instructions: Answer the user's questions using the menu functions.
tools:
  - id: MenuPlugin.get_specials
    type: function
  - id: MenuPlugin.get_item_price
    type: function
model:
  options:
    temperature: 0.7
"""

USER_INPUTS = [
    "Hello",
    "What is the special soup?",
    "What does that cost?",
    "Thank you",
]


async def main():
    kernel = Kernel()
    kernel.add_plugin(MenuPlugin(), plugin_name="MenuPlugin")

    agent: ChatCompletionAgent = await AgentRegistry.create_from_yaml(
        AGENT_YAML, kernel=kernel, service=OpenAIChatCompletion()
    )

    thread: ChatHistoryAgentThread | None = None

    for user_input in USER_INPUTS:
        print(f"# User: {user_input}")
        response = await agent.get_response(user_input, thread=thread)
        print(f"# {response.name}: {response}")
        thread = response.thread

    await thread.delete() if thread else None


if __name__ == "__main__":
    asyncio.run(main())
This feature is not available.
How-To
For an end-to-end example of a ChatCompletionAgent, see: