StreamingLlmOperations
Streaming extension of LlmOperations for real-time LLM response processing.
This SPI interface provides the reactive streaming capabilities behind the API-layer StreamingPromptRunner interfaces, enabling:
Real-time processing of LLM responses as they arrive
Streaming lists of objects from JSONL responses
Mixed content streams with both objects and LLM reasoning (thinking)
Progressive agent progress monitoring
All streaming methods return Project Reactor Flux streams for integration with Spring WebFlux and other reactive frameworks.
Functions
Create a streaming list of objects from a JSONL response in the context of an AgentProcess. Each line in the LLM response should be a valid JSON object matching the output class. Objects are emitted to the Flux as they are parsed from individual lines.
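A minimal, framework-free sketch of the JSONL contract described above. The hypothetical `Person` record and the hand-rolled field extraction are illustrative assumptions: the real method returns a Reactor Flux and binds each line to the output class with a JSON mapper, but the per-line emission works the same way.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class JsonlStreamSketch {
    // Hypothetical output type; the real API binds lines to any output class.
    record Person(String name) {}

    // Emits one object per non-blank JSONL line, as the Flux would.
    static void streamObjects(String jsonl, Consumer<Person> emit) {
        for (String line : jsonl.split("\n")) {
            line = line.trim();
            if (line.isEmpty()) continue;
            // Toy extraction of the "name" field; a real implementation
            // would use a JSON mapper such as Jackson.
            int start = line.indexOf("\"name\":\"") + 8;
            int end = line.indexOf('"', start);
            emit.accept(new Person(line.substring(start, end)));
        }
    }

    public static void main(String[] args) {
        List<Person> seen = new ArrayList<>();
        // Two JSONL lines become two emitted objects, in order.
        streamObjects("{\"name\":\"Ada\"}\n{\"name\":\"Alan\"}\n", seen::add);
        System.out.println(seen); // [Person[name=Ada], Person[name=Alan]]
    }
}
```

Because each object is emitted as soon as its line is complete, downstream consumers can act on early results before the LLM has finished the full response.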
Try to create a streaming list of objects in the context of an AgentProcess. Return a Flux that may error if the LLM does not have enough information to create objects. Streaming equivalent of createObjectIfPossible().
Create a streaming list of objects, with LLM thinking content, from a mixed JSONL response. Supports both JSON object lines and //THINKING: lines in the LLM response. Returns StreamingEvent objects that can contain either typed objects or thinking content.
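A sketch of how the mixed response format above can be split into events. The `Event` record here is a simplified stand-in for StreamingEvent, and its shape is an assumption; only the //THINKING: line convention comes from the description.

```java
import java.util.ArrayList;
import java.util.List;

public class MixedStreamSketch {
    // Simplified stand-in for StreamingEvent: either a thinking line
    // or a JSON object line. Field names here are assumptions.
    record Event(boolean isThinking, String payload) {}

    static final String THINKING_PREFIX = "//THINKING:";

    // Classifies each line of a mixed JSONL response before JSON binding.
    static List<Event> classify(String response) {
        List<Event> events = new ArrayList<>();
        for (String line : response.split("\n")) {
            line = line.trim();
            if (line.isEmpty()) continue;
            if (line.startsWith(THINKING_PREFIX)) {
                events.add(new Event(true,
                        line.substring(THINKING_PREFIX.length()).trim()));
            } else {
                // A JSON object line; the real transform would bind it
                // to the output class here.
                events.add(new Event(false, line));
            }
        }
        return events;
    }

    public static void main(String[] args) {
        List<Event> events = classify(
                "//THINKING: planning the list\n{\"name\":\"Ada\"}\n");
        System.out.println(events.get(0).isThinking()); // true
        System.out.println(events.get(1).payload());    // {"name":"Ada"}
    }
}
```

Keeping thinking content in the same ordered stream as the typed objects lets a caller show the model's reasoning interleaved with results, or simply filter it out.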
Low-level object streaming transform, not necessarily aware of the platform. Streams typed objects as they are parsed from a JSONL response.
Low-level mixed-content streaming transform, not necessarily aware of the platform. Streams both typed objects and thinking content from a mixed JSONL response.
Low-level streaming transform, not necessarily aware of the platform. Streams text chunks as they arrive from the LLM without platform mediation.
Generate streaming text in the context of an AgentProcess. Returns a Flux that emits text chunks as they arrive from the LLM.
Generate streaming text from messages in the context of an AgentProcess. Returns a Flux that emits text chunks as they arrive from the LLM.
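The text-streaming methods above emit chunks rather than a complete response. A subscriber typically renders each chunk as it arrives and accumulates the full text; the plain callback below stands in for a Flux<String> subscription (the names are illustrative, not part of the SPI).

```java
import java.util.List;
import java.util.function.Consumer;

public class TextChunkSketch {
    // Stands in for subscribing to a Flux<String>: each chunk is
    // delivered to the consumer as it "arrives".
    static void subscribe(List<String> chunks, Consumer<String> onNext) {
        chunks.forEach(onNext);
    }

    public static void main(String[] args) {
        StringBuilder full = new StringBuilder();
        // A UI would render each chunk immediately; here we accumulate
        // them into the complete response text.
        subscribe(List.of("Hel", "lo, ", "world"), full::append);
        System.out.println(full); // Hello, world
    }
}
```

With a real Flux the same accumulation can be done with `reduce` or a stateful subscriber, while intermediate chunks still drive progressive display.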