LlmMessageStreamer

fun interface LlmMessageStreamer

Framework-agnostic interface for streaming LLM inference.

Streaming counterpart of LlmMessageSender. Implementations handle the actual LLM communication (Spring AI, LangChain4j, etc.) and return a reactive stream of raw content chunks.

Key Differences from Non-Streaming:

  • Returns Flux<String> instead of LlmMessageResponse

  • Tool execution is managed by the underlying framework (e.g., Spring AI), since the streaming API is opaque: a custom ToolLoop cannot be injected

  • Tool execution can only be observed, via ToolCallInspector

See also

LlmMessageSender, for the non-streaming equivalent

ToolCallInspector, for tool execution observation

Functions

abstract fun stream(messages: List<Message>, tools: List<Tool>, toolCallInspectors: List<ToolCallInspector>): Flux<String>

Stream raw content chunks from the LLM.
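As a sketch of how this functional interface might be implemented and consumed (assuming Project Reactor's Flux and the Message, Tool, and ToolCallInspector types referenced above; CannedStreamer is a hypothetical name, not part of the API):

```kotlin
import reactor.core.publisher.Flux

// Hypothetical stub that emits pre-recorded chunks. A real implementation
// would delegate to Spring AI, LangChain4j, etc. and register the inspectors
// with the underlying framework so tool calls can be observed.
class CannedStreamer(private val chunks: List<String>) : LlmMessageStreamer {
    override fun stream(
        messages: List<Message>,
        tools: List<Tool>,
        toolCallInspectors: List<ToolCallInspector>
    ): Flux<String> = Flux.fromIterable(chunks)
}
```

Because LlmMessageStreamer is declared as a fun interface, a lambda can also be converted to an instance (SAM conversion):

```kotlin
val streamer = LlmMessageStreamer { messages, tools, inspectors ->
    Flux.just("Hello", ", ", "world")
}
```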