For this purpose, I created a dedicated chat client:
Code:
@Bean("docSummaryChatClient")
public ChatClient docSummaryChatClient(
        ChatClient.Builder builder,
        @Value("${doc-summary.model}") String modelName,
        @Value("${doc-summary.temperature}") Double temperature) {
    // Build model options specific to this client
    OpenAiChatOptions docSummaryOptions = OpenAiChatOptions.builder()
            .model(modelName)
            .temperature(temperature)
            .build();
    return builder
            .defaultOptions(docSummaryOptions)
            .build();
}
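The `@Value` placeholders above are resolved from external configuration. As a minimal sketch (the property keys match the bean definition, but the model name and temperature values here are just hypothetical examples), an `application.properties` fragment could look like:

```properties
# Hypothetical example values; only the keys doc-summary.model and
# doc-summary.temperature are taken from the bean definition above
doc-summary.model=gpt-4o-mini
doc-summary.temperature=0.2
```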
Code:
@Component
@Slf4j
public class SummaryLlmServiceImpl implements SummaryLlmService {

    private final ChatClient docSummaryChatClient;
    private final PromptMetadataLoggingAdvisor promptMetadataLoggingAdvisor;

    public SummaryLlmServiceImpl(
            @Qualifier("docSummaryChatClient") ChatClient docSummaryChatClient,
            PromptMetadataLoggingAdvisor promptMetadataLoggingAdvisor) {
        this.docSummaryChatClient = docSummaryChatClient;
        this.promptMetadataLoggingAdvisor = promptMetadataLoggingAdvisor;
    }

    @Override
    public SummaryLlmResponse generate(String prompt, Integer maxTokens) {
        log.debug("Generating response for prompt: {} \n maxTokens: {}", prompt, maxTokens);
        var spec = docSummaryChatClient
                .prompt(prompt)
                .advisors(promptMetadataLoggingAdvisor)
                .options(OpenAiChatOptions.builder()
                        .maxTokens(maxTokens)
                        .build());
        ChatClient.CallResponseSpec response = spec.call();
        ChatResponse chatResponse = response.chatResponse();
        Usage usage = chatResponse != null && chatResponse.getMetadata() != null
                ? chatResponse.getMetadata().getUsage()
                : null;
        var promptTokens = usage != null ? usage.getPromptTokens() : null;
        var completionTokens = usage != null ? usage.getCompletionTokens() : null;
        var totalTokens = usage != null ? usage.getTotalTokens() : null;
        // Read the content from the ChatResponse fetched above instead of
        // calling response.content(), which would trigger a second model call
        String content = chatResponse != null
                ? chatResponse.getResult().getOutput().getText()
                : null;
        return new SummaryLlmResponse(
                content,
                promptTokens,
                completionTokens,
                totalTokens);
    }
}
Code:
java.lang.IllegalStateException: No CallAdvisors available to execute
at org.springframework.ai.chat.client.advisor.DefaultAroundAdvisorChain.nextCall(DefaultAroundAdvisorChain.java:97) ~[spring-ai-client-chat-1.0.0.jar:1.0.0]
at org.springframework.ai.chat.client.DefaultChatClient$DefaultCallResponseSpec.lambda$doGetObservableChatClientResponse$1(DefaultChatClient.java:469) ~[spring-ai-client-chat-1.0.0.jar:1.0.0]
at io.micrometer.observation.Observation.observe(Observation.java:564)
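As far as I understand, `DefaultAroundAdvisorChain` throws "No CallAdvisors available to execute" when its queue of call advisors is empty at the time a blocking `call()` runs, which can happen if the registered advisor does not implement the `CallAdvisor` interface (for example, it only implements the streaming interface, or an advisor API from an older milestone release). Since the source of `PromptMetadataLoggingAdvisor` is not shown in the post, here is only a hedged sketch of what a `CallAdvisor`-based logging advisor might look like against the Spring AI 1.0.0 API; the class name and logged fields are assumptions:

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.ai.chat.client.ChatClientRequest;
import org.springframework.ai.chat.client.ChatClientResponse;
import org.springframework.ai.chat.client.advisor.api.CallAdvisor;
import org.springframework.ai.chat.client.advisor.api.CallAdvisorChain;
import org.springframework.stereotype.Component;

@Component
public class PromptMetadataLoggingAdvisor implements CallAdvisor {

    private static final Logger log =
            LoggerFactory.getLogger(PromptMetadataLoggingAdvisor.class);

    @Override
    public ChatClientResponse adviseCall(ChatClientRequest request, CallAdvisorChain chain) {
        // Log the outgoing prompt before delegating down the chain
        log.debug("Prompt: {}", request.prompt());
        ChatClientResponse response = chain.nextCall(request);
        // Log token-usage metadata after the model call completes
        if (response.chatResponse() != null) {
            log.debug("Usage: {}", response.chatResponse().getMetadata().getUsage());
        }
        return response;
    }

    @Override
    public String getName() {
        return this.getClass().getSimpleName();
    }

    @Override
    public int getOrder() {
        return 0; // position in the advisor chain
    }
}
```

If the advisor only implements a streaming or legacy advisor interface, the chain built for a blocking `call()` would contain no `CallAdvisor` and could fail exactly as in the trace above.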
More details here: https://stackoverflow.com/questions/798 ... -spring-ai