Class SpringAI

java.lang.Object
  com.google.adk.models.BaseLlm
    com.google.adk.models.springai.SpringAI
Spring AI implementation of BaseLlm that wraps a Spring AI ChatModel and/or StreamingChatModel.
This adapter allows Spring AI models to be used within the ADK framework by converting between ADK's LlmRequest/LlmResponse format and Spring AI's Prompt/ChatResponse format.
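Wiring the adapter is a thin wrapper around an existing Spring AI model. A minimal sketch, assuming a ChatModel (and optionally a StreamingChatModel) is supplied elsewhere, e.g. by Spring Boot auto-configuration; only the constructors shown on this page are used:

```java
// Hedged sketch: adapt Spring AI models so ADK components can use them as a BaseLlm.
// The models passed in are assumptions; any ChatModel/StreamingChatModel implementation works.
import com.google.adk.models.BaseLlm;
import com.google.adk.models.springai.SpringAI;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.model.StreamingChatModel;

public class SpringAiWiring {

  // Wrap a blocking chat model only.
  static BaseLlm adapt(ChatModel chatModel) {
    return new SpringAI(chatModel);
  }

  // Wrap both blocking and streaming models under an explicit model name.
  static BaseLlm adapt(ChatModel chatModel, StreamingChatModel streamingChatModel, String modelName) {
    return new SpringAI(chatModel, streamingChatModel, modelName);
  }
}
```

The resulting BaseLlm can then be handed to any ADK component that accepts one.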
Constructor Summary

Constructors:
SpringAI(org.springframework.ai.chat.model.ChatModel chatModel)
SpringAI(org.springframework.ai.chat.model.ChatModel chatModel, String modelName, SpringAIProperties.Observability observabilityConfig)
SpringAI(org.springframework.ai.chat.model.ChatModel chatModel, org.springframework.ai.chat.model.StreamingChatModel streamingChatModel, String modelName)
SpringAI(org.springframework.ai.chat.model.ChatModel chatModel, org.springframework.ai.chat.model.StreamingChatModel streamingChatModel, String modelName, SpringAIProperties.Observability observabilityConfig)
SpringAI(org.springframework.ai.chat.model.StreamingChatModel streamingChatModel)
SpringAI(org.springframework.ai.chat.model.StreamingChatModel streamingChatModel, String modelName)
SpringAI(org.springframework.ai.chat.model.StreamingChatModel streamingChatModel, String modelName, SpringAIProperties.Observability observabilityConfig)
Method Summary

connect(LlmRequest llmRequest)
    Creates a live connection to the LLM.

io.reactivex.rxjava3.core.Flowable<LlmResponse> generateContent(LlmRequest llmRequest, boolean stream)
    Generates one content from the given LLM request and tools.
Constructor Details

SpringAI
public SpringAI(org.springframework.ai.chat.model.ChatModel chatModel)

SpringAI

SpringAI
public SpringAI(org.springframework.ai.chat.model.StreamingChatModel streamingChatModel)

SpringAI
public SpringAI(org.springframework.ai.chat.model.StreamingChatModel streamingChatModel, String modelName)

SpringAI
public SpringAI(org.springframework.ai.chat.model.ChatModel chatModel, org.springframework.ai.chat.model.StreamingChatModel streamingChatModel, String modelName)

SpringAI
public SpringAI(org.springframework.ai.chat.model.ChatModel chatModel, org.springframework.ai.chat.model.StreamingChatModel streamingChatModel, String modelName, SpringAIProperties.Observability observabilityConfig)

SpringAI
public SpringAI(org.springframework.ai.chat.model.ChatModel chatModel, String modelName, SpringAIProperties.Observability observabilityConfig)

SpringAI
public SpringAI(org.springframework.ai.chat.model.StreamingChatModel streamingChatModel, String modelName, SpringAIProperties.Observability observabilityConfig)
Method Details
generateContent
public io.reactivex.rxjava3.core.Flowable<LlmResponse> generateContent(LlmRequest llmRequest, boolean stream)

Description copied from class: BaseLlm
Generates one content from the given LLM request and tools.

Specified by:
generateContent in class BaseLlm

Parameters:
llmRequest - The LLM request containing the input prompt and parameters.
stream - A boolean flag indicating whether to stream the response.

Returns:
A Flowable of LlmResponses. For non-streaming calls, it will only yield one LlmResponse. For streaming calls, it may yield more than one LlmResponse, but all yielded LlmResponses should be treated as one content by merging their parts.
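The streaming contract above can be sketched as follows. This assumes a constructed SpringAI instance and an already-built LlmRequest (LlmRequest construction is not covered on this page); the key point is that every streamed emission belongs to the same logical content:

```java
// Hedged sketch of consuming generateContent. `model` and `request` are assumed
// to be built elsewhere; blockingForEach is standard RxJava 3 Flowable API.
import com.google.adk.models.LlmRequest;
import com.google.adk.models.LlmResponse;
import com.google.adk.models.springai.SpringAI;
import io.reactivex.rxjava3.core.Flowable;

public class GenerateContentExample {

  static void streamResponse(SpringAI model, LlmRequest request) {
    // stream=true: the Flowable may emit several partial LlmResponses.
    Flowable<LlmResponse> responses = model.generateContent(request, /* stream= */ true);

    // Per the Returns contract, all emissions form one content: merge their
    // parts into an accumulator rather than treating each as a separate reply.
    responses.blockingForEach(response -> {
      // append this response's parts to the accumulated content here
    });
  }
}
```

With `stream=false`, the same Flowable yields a single LlmResponse, so the merging step degenerates to reading one element.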

connect
Description copied from class: BaseLlm
Creates a live connection to the LLM.