Original post: https://alphahinex.github.io/2024/12/01/spring-ai-chat-model/
description: "Using the OpenAI-compatible chat completions API of the Zhipu AI Open Platform as an example, this post shows how to connect to a single chat model, or to several of them, with Spring AI."
date: 2024.12.01 10:34
categories:
- AI
- Spring
tags: [AI, Spring, Spring AI]
keywords: Spring AI, OpenAI, Chat Completions, ChatClient
Environment Setup
JDK
$ java -version
openjdk version "17.0.2" 2022-01-18
OpenJDK Runtime Environment (build 17.0.2+8-86)
OpenJDK 64-Bit Server VM (build 17.0.2+8-86, mixed mode, sharing)
start.spring.io
Download a Maven project with the Spring Web dependency from https://start.spring.io/:
Unzip it and build it with the bundled Maven Wrapper:
$ unzip demo.zip
$ cd demo
$ ./mvnw clean package
...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5.569 s
[INFO] Finished at: 2024-11-30T13:58:27+08:00
[INFO] ------------------------------------------------------------------------
Zhipu AI Open Platform
Log in to the API Keys page of the Zhipu AI Open Platform and obtain the user API Key generated under the latest version. It is used to call GLM-4-Flash, a free model exposed through an OpenAI-compatible chat completions API:
$ curl --location 'https://open.bigmodel.cn/api/paas/v4/chat/completions' \
--header 'Authorization: Bearer <your-api-key>' \
--header 'Content-Type: application/json' \
--data '{
"model": "glm-4-flash",
"messages": [
{
"role": "user",
"content": "你好"
}
]
}'
{"choices":[{"finish_reason":"stop","index":0,"message":{"content":"你好??!很高興見到你棱貌,有什么可以幫助你的嗎玖媚?","role":"assistant"}}],"created":1732946586,"id":"202411301403051925a900b08f4e23","model":"glm-4-flash","request_id":"202411301403051925a900b08f4e23","usage":{"completion_tokens":16,"prompt_tokens":6,"total_tokens":22}}
Adding Dependencies
Add the following Spring AI configuration and dependencies to pom.xml:
repositories:
<repositories>
<repository>
<id>spring-milestones</id>
<name>Spring Milestones</name>
<url>https://repo.spring.io/milestone</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
<repository>
<id>spring-snapshots</id>
<name>Spring Snapshots</name>
<url>https://repo.spring.io/snapshot</url>
<releases>
<enabled>false</enabled>
</releases>
</repository>
</repositories>
dependencyManagement:
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.springframework.ai</groupId>
<artifactId>spring-ai-bom</artifactId>
<version>1.0.0-SNAPSHOT</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
dependency:
<dependency>
<groupId>org.springframework.ai</groupId>
<artifactId>spring-ai-openai-spring-boot-starter</artifactId>
</dependency>
Chatting with an OpenAI-Compatible Model API via ChatClient
Connecting to a single model
When only one LLM needs to be connected, a ChatClient can be registered and used directly through configuration properties.
Add the Spring AI OpenAI settings to application.properties (base-url and chat.completions-path are joined to form the full endpoint used in the curl example above):
spring.ai.openai.base-url=https://open.bigmodel.cn/api/paas
spring.ai.openai.chat.completions-path=/v4/chat/completions
spring.ai.openai.api-key=<your-api-key>
spring.ai.openai.chat.options.model=glm-4-flash
Create a configuration class:
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class Config {
    @Bean
    ChatClient chatClient(ChatClient.Builder builder) {
        // The builder is auto-configured by the Spring AI OpenAI starter from the spring.ai.openai.* properties above
        return builder.build();
    }
}
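If every conversation through this client should share a default system prompt, the same bean could be built as in the sketch below (the prompt text is only an illustration, not part of the original example):
@Bean
ChatClient chatClient(ChatClient.Builder builder) {
    // defaultSystem attaches a system message to every prompt created from this client
    return builder
            .defaultSystem("You are a concise assistant. Answer in the user's language.")
            .build();
}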
Verify the conversation in the DemoApplicationTests unit test generated with the project:
@Autowired
ChatClient chatClient;
@Test
void autoConfig() {
String userMsg = "who r u";
System.out.println(chatClient.prompt().user(userMsg).call().content());
}
The output looks like this:
I am an AI assistant named ChatGLM, which is developed based on the language model jointly trained by Tsinghua University KEG Lab and Zhipu AI Company in 2024. My job is to provide appropriate answers and support to users' questions and requests.
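ChatClient also exposes a streaming variant through stream(). Assuming the GLM endpoint streams server-sent events the same way the OpenAI API does, a minimal sketch reusing the autowired chatClient above could look like this:
@Test
void streamingSketch() {
    // stream().content() returns a Flux<String> of content chunks instead of one blocking reply;
    // blockLast() only keeps the test alive until the stream completes
    chatClient.prompt()
            .user("who r u")
            .stream()
            .content()
            .doOnNext(System.out::print)
            .blockLast();
}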
Connecting to multiple models
When several LLMs need to be connected, define a factory class that creates multiple ChatClient instances:
import org.apache.commons.lang3.StringUtils;
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.ai.openai.api.OpenAiApi;
import org.springframework.ai.retry.RetryUtils;
import org.springframework.web.client.RestClient;
import org.springframework.web.reactive.function.client.WebClient;

public class ChatClientFactory {
    public static ChatClient createOpenAiChatClient(String baseUrl, String apiKey, String model, String completionsPath) {
        // Fall back to the standard OpenAI chat completions path when none is provided (StringUtils is from commons-lang3)
        if (StringUtils.isBlank(completionsPath)) {
            completionsPath = "/v1/chat/completions";
        }
        OpenAiApi openAiApi = new OpenAiApi(baseUrl, apiKey, completionsPath,
                "/v1/embeddings", RestClient.builder(), WebClient.builder(), RetryUtils.DEFAULT_RESPONSE_ERROR_HANDLER);
        OpenAiChatModel openAiChatModel = new OpenAiChatModel(openAiApi, OpenAiChatOptions.builder().withModel(model).build());
        return ChatClient.create(openAiChatModel);
    }
}
@Test
void multiClients() {
ChatClient llm1 = ChatClientFactory.createOpenAiChatClient("https://open.bigmodel.cn/api/paas", "xxxx", "glm-4-flash", "/v4/chat/completions");
ChatClient llm2 = ChatClientFactory.createOpenAiChatClient("https://open.bigmodel.cn/api/paas", "xxxx", "glm-4-flash", "/v4/chat/completions");
    String userMsg = "你是誰？";
System.out.println(llm1.prompt().user(userMsg).call().content());
System.out.println(llm2.prompt().user(userMsg).call().content());
}
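If the clients should be managed by Spring instead of being created inside the test, one option is to expose them as named beans and inject them with @Qualifier. The sketch below only mirrors the factory calls above; the bean names and the <your-api-key> placeholder are assumptions:
@Configuration
public class MultiModelConfig {
    // Both beans reuse the factory above; in practice the base URL, key and model would differ per provider
    @Bean("llm1")
    ChatClient llm1() {
        return ChatClientFactory.createOpenAiChatClient(
                "https://open.bigmodel.cn/api/paas", "<your-api-key>", "glm-4-flash", "/v4/chat/completions");
    }

    @Bean("llm2")
    ChatClient llm2() {
        return ChatClientFactory.createOpenAiChatClient(
                "https://open.bigmodel.cn/api/paas", "<your-api-key>", "glm-4-flash", "/v4/chat/completions");
    }
}
They can then be injected where needed, e.g. @Autowired @Qualifier("llm1") ChatClient llm1;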