In this tutorial, we'll walk you step by step through creating a Chainlit application that integrates Microsoft's Semantic Kernel. The integration automatically visualizes Semantic Kernel function calls (such as plugins or tools) as Steps in the Chainlit UI.

Prerequisites

Before you begin, make sure you have the following:

  • Chainlit installed and working
  • The semantic-kernel package installed
  • An LLM API key configured for Semantic Kernel (e.g., OpenAI, Azure OpenAI)
  • A basic understanding of Python programming and of Semantic Kernel concepts (Kernel, plugins, functions)
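You can sanity-check the first three prerequisites from Python before moving on. The helper below is an optional illustrative sketch (`check_prereqs` is not part of Chainlit or Semantic Kernel); it only verifies that the packages are importable and that the API key environment variable is set:

```python
import importlib.util
import os

def check_prereqs(packages, env_vars):
    """Return a list of human-readable problems with the local setup."""
    problems = []
    for pkg in packages:
        # find_spec returns None when a top-level package is not installed
        if importlib.util.find_spec(pkg) is None:
            problems.append(f"package not installed: {pkg}")
    for var in env_vars:
        if not os.environ.get(var):
            problems.append(f"environment variable not set: {var}")
    return problems

for problem in check_prereqs(["chainlit", "semantic_kernel"], ["OPENAI_API_KEY"]):
    print(problem)
```

An empty output means the packages import and the key is present; anything printed points at the missing piece.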

Step 1: Create a Python File

Create a new Python file named app.py in your project directory. This file will contain the main logic for building an LLM application with Semantic Kernel.

Step 2: Write the Application Logic

In app.py, import the necessary packages, set up your Semantic Kernel Kernel, add the SemanticKernelFilter for the Chainlit integration, and define the functions that handle chat sessions and incoming messages.

Here is an example demonstrating how to set up the kernel and use the filter:

app.py
import chainlit as cl
import semantic_kernel as sk
from semantic_kernel.connectors.ai import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.open_ai import (
    OpenAIChatCompletion,
    OpenAIChatPromptExecutionSettings,
)
from semantic_kernel.functions import kernel_function
from semantic_kernel.contents import ChatHistory

# Let the model choose tools automatically; plugins listed in "excluded_plugins"
# (here "ChatBot", a plugin name not defined in this example) are hidden from tool selection
request_settings = OpenAIChatPromptExecutionSettings(
    function_choice_behavior=FunctionChoiceBehavior.Auto(filters={"excluded_plugins": ["ChatBot"]})
)

# Example Native Plugin (Tool)
class WeatherPlugin:
    @kernel_function(name="get_weather", description="Gets the weather for a city")
    def get_weather(self, city: str) -> str:
        """Retrieves the weather for a given city."""
        if "paris" in city.lower():
            return f"The weather in {city} is 20°C and sunny."
        elif "london" in city.lower():
            return f"The weather in {city} is 15°C and cloudy."
        else:
            return f"Sorry, I don't have the weather for {city}."

@cl.on_chat_start
async def on_chat_start():
    # Setup Semantic Kernel
    kernel = sk.Kernel()

    # Add your AI service (e.g., OpenAI)
    # Make sure OPENAI_API_KEY and OPENAI_ORG_ID are set in your environment
    ai_service = OpenAIChatCompletion(service_id="default", ai_model_id="gpt-4o")
    kernel.add_service(ai_service)

    # Import the WeatherPlugin
    kernel.add_plugin(WeatherPlugin(), plugin_name="Weather")
    
    # Instantiate and add the Chainlit filter to the kernel
    # This will automatically capture function calls as Steps
    sk_filter = cl.SemanticKernelFilter(kernel=kernel)

    cl.user_session.set("kernel", kernel)
    cl.user_session.set("ai_service", ai_service)
    cl.user_session.set("chat_history", ChatHistory())

@cl.on_message
async def on_message(message: cl.Message):
    kernel = cl.user_session.get("kernel") # type: sk.Kernel
    ai_service = cl.user_session.get("ai_service") # type: OpenAIChatCompletion
    chat_history = cl.user_session.get("chat_history") # type: ChatHistory

    # Add user message to history
    chat_history.add_user_message(message.content)

    # Create a Chainlit message for the response stream
    answer = cl.Message(content="")

    async for msg in ai_service.get_streaming_chat_message_content(
        chat_history=chat_history,
        user_input=message.content,
        settings=request_settings,
        kernel=kernel,
    ):
        if msg.content:
            await answer.stream_token(msg.content)

    # Add the full assistant response to history
    chat_history.add_assistant_message(answer.content)

    # Send the final message
    await answer.send()
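In current semantic-kernel releases, @kernel_function only attaches metadata to the method, so the tool's branching logic can be exercised as plain Python before wiring it into the kernel. A minimal sketch that duplicates the body of WeatherPlugin.get_weather (the WeatherLogic class here is purely illustrative):

```python
class WeatherLogic:
    """Plain copy of WeatherPlugin.get_weather's body, with no kernel dependency."""

    def get_weather(self, city: str) -> str:
        # Same case-insensitive matching as the plugin above
        if "paris" in city.lower():
            return f"The weather in {city} is 20°C and sunny."
        elif "london" in city.lower():
            return f"The weather in {city} is 15°C and cloudy."
        else:
            return f"Sorry, I don't have the weather for {city}."

print(WeatherLogic().get_weather("Paris"))   # The weather in Paris is 20°C and sunny.
print(WeatherLogic().get_weather("Berlin"))  # Sorry, I don't have the weather for Berlin.
```

Checking the tool logic in isolation like this makes it easier to tell whether a bad answer comes from the tool itself or from the model's tool selection.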

Step 3: Run the Application

To start your application, open a terminal and navigate to the directory containing app.py, then run the following command:

chainlit run app.py -w

The -w flag tells Chainlit to enable auto-reloading, so you don't need to restart the server every time you change your application. Your chatbot UI should now be accessible at http://localhost:8000. Interact with the bot; if you ask about the weather (and the LLM uses the tool), you should see a step named "Weather-get_weather" appear in the UI.