Class SpringAI

java.lang.Object
com.google.adk.models.BaseLlm
com.google.adk.models.springai.SpringAI

public class SpringAI extends BaseLlm
Spring AI implementation of BaseLlm that wraps a Spring AI ChatModel and/or StreamingChatModel.

This adapter allows Spring AI models to be used within the ADK framework by converting between ADK's LlmRequest/LlmResponse format and Spring AI's Prompt/ChatResponse format.
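A minimal wiring sketch of the adapter in use. The OpenAiChatModel/OpenAiApi builder calls and the LlmAgent builder shown here are assumptions about the Spring AI and ADK versions on your classpath; check their signatures against your dependencies. Only the SpringAI constructor itself comes from this class.

```java
import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.ai.openai.api.OpenAiApi;
import com.google.adk.agents.LlmAgent;
import com.google.adk.models.springai.SpringAI;

public class SpringAiAdapterExample {
  public static void main(String[] args) {
    // Build any Spring AI ChatModel (OpenAI shown as an assumed example).
    OpenAiChatModel chatModel = OpenAiChatModel.builder()
        .openAiApi(OpenAiApi.builder()
            .apiKey(System.getenv("OPENAI_API_KEY"))
            .build())
        .build();

    // Wrap it so ADK components that expect a BaseLlm can call it; the
    // adapter translates LlmRequest/LlmResponse to Prompt/ChatResponse.
    SpringAI model = new SpringAI(chatModel, "gpt-4o-mini");

    LlmAgent agent = LlmAgent.builder()
        .name("assistant")
        .model(model)
        .instruction("You are a helpful assistant.")
        .build();
  }
}
```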

  • Constructor Details

    • SpringAI

      public SpringAI(org.springframework.ai.chat.model.ChatModel chatModel)
    • SpringAI

      public SpringAI(org.springframework.ai.chat.model.ChatModel chatModel, String modelName)
    • SpringAI

      public SpringAI(org.springframework.ai.chat.model.StreamingChatModel streamingChatModel)
    • SpringAI

      public SpringAI(org.springframework.ai.chat.model.StreamingChatModel streamingChatModel, String modelName)
    • SpringAI

      public SpringAI(org.springframework.ai.chat.model.ChatModel chatModel, org.springframework.ai.chat.model.StreamingChatModel streamingChatModel, String modelName)
    • SpringAI

      public SpringAI(org.springframework.ai.chat.model.ChatModel chatModel, org.springframework.ai.chat.model.StreamingChatModel streamingChatModel, String modelName, SpringAIProperties.Observability observabilityConfig)
    • SpringAI

      public SpringAI(org.springframework.ai.chat.model.ChatModel chatModel, String modelName, SpringAIProperties.Observability observabilityConfig)
    • SpringAI

      public SpringAI(org.springframework.ai.chat.model.StreamingChatModel streamingChatModel, String modelName, SpringAIProperties.Observability observabilityConfig)
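Which constructor to choose depends on the capabilities of the wrapped model. A hedged sketch (in recent Spring AI releases ChatModel extends StreamingChatModel, so a single model instance can often fill both roles; verify this against your Spring AI version):

```java
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.model.StreamingChatModel;
import com.google.adk.models.springai.SpringAI;

public class ConstructorChoiceSketch {
  static SpringAI blockingOnly(ChatModel chatModel, String modelName) {
    // Wraps only the blocking API; how the adapter handles stream=true
    // requests in this configuration is implementation-defined.
    return new SpringAI(chatModel, modelName);
  }

  static SpringAI streamingOnly(StreamingChatModel streamingModel, String modelName) {
    // Wraps only the streaming API.
    return new SpringAI(streamingModel, modelName);
  }

  static SpringAI both(ChatModel chatModel, StreamingChatModel streamingModel,
      String modelName) {
    // Both capabilities: the adapter can serve stream=false and
    // stream=true calls to generateContent.
    return new SpringAI(chatModel, streamingModel, modelName);
  }
}
```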
  • Method Details

    • generateContent

      public io.reactivex.rxjava3.core.Flowable<LlmResponse> generateContent(LlmRequest llmRequest, boolean stream)
      Description copied from class: BaseLlm
Generates one logical content from the given LLM request and tools.
      Specified by:
      generateContent in class BaseLlm
      Parameters:
      llmRequest - The LLM request containing the input prompt and parameters.
      stream - A boolean flag indicating whether to stream the response.
      Returns:
      A Flowable of LlmResponses. For non-streaming calls, it will only yield one LlmResponse. For streaming calls, it may yield more than one LlmResponse, but all yielded LlmResponses should be treated as one content by merging their parts.
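The merging contract for streaming calls can be illustrated with a plain-Java sketch. Strings stand in for the partial LlmResponse objects here (the real type carries structured Content/Part data); the point is that all emissions of one streaming call are concatenated into a single logical content.

```java
import java.util.List;

public class StreamMergeSketch {
  // A streaming call may emit several partial chunks; the consumer merges
  // their parts into one content, per the generateContent contract.
  static String mergeChunks(List<String> chunks) {
    StringBuilder merged = new StringBuilder();
    for (String chunk : chunks) {
      merged.append(chunk);
    }
    return merged.toString();
  }

  public static void main(String[] args) {
    List<String> streamed = List.of("Hel", "lo, ", "world!");
    System.out.println(mergeChunks(streamed)); // prints "Hello, world!"
  }
}
```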
    • connect

      public BaseLlmConnection connect(LlmRequest llmRequest)
      Description copied from class: BaseLlm
      Creates a live connection to the LLM.
      Specified by:
      connect in class BaseLlm