Streaming
Fluent interface for reactive streaming of LLM responses. Provides configuration options for:
Streaming object creation
Streaming with thinking content
Instances are obtained via StreamingPromptRunner.streaming.
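
As an overall sketch of the fluent flow, assuming a Reactor Flux return type and hypothetical withPrompt and createTextStream methods for the configuration and text-streaming operations listed under Functions:

import reactor.core.publisher.Flux

// Hypothetical usage sketch: withPrompt, createTextStream and the Flux<String>
// return type are assumptions; only StreamingPromptRunner.streaming is confirmed.
fun streamAnswer(runner: StreamingPromptRunner) {
    val streaming: Streaming = runner.streaming        // obtain the fluent Streaming instance
    val chunks: Flux<String> = streaming
        .withPrompt("Summarise the quarterly report")  // configure with a single prompt message
        .createTextStream()                            // text chunks emitted as they arrive
    chunks.subscribe { chunk -> print(chunk) }         // render each chunk as soon as it is produced
}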
Inheritors
Functions
Create a reactive stream of objects of the specified type. Objects are emitted as they become available during LLM processing.
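
For example, given a Streaming instance that has already been configured with a prompt, the object stream might be consumed as below; the method name createObjectStream and the Flux<T> element type are assumptions based on the description above:

import reactor.core.publisher.Flux

// Hypothetical sketch: createObjectStream and the Flux<Ticket> return type are assumptions.
data class Ticket(val title: String, val priority: String)

fun streamTickets(streaming: Streaming): Flux<Ticket> =
    streaming.createObjectStream(Ticket::class.java)   // each Ticket is emitted as soon as the LLM completes it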
abstract fun <T> createObjectStreamWithThinking(itemClass: Class<T>)
Create a reactive stream with both objects and thinking content. Provides access to the LLM's reasoning process alongside the results.
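
A sketch of consuming this combined stream, reusing the Ticket class from the previous sketch. The method name and Class<T> parameter come from the signature above; the element type is not documented here, so the thinking and value accessors below are placeholder assumptions:

// Hypothetical consumption sketch: the wrapper element type and its
// thinking/value accessors are assumptions.
fun streamTicketsWithReasoning(streaming: Streaming) {
    streaming
        .createObjectStreamWithThinking(Ticket::class.java)
        .subscribe { item ->
            println("thinking: ${item.thinking}")  // the model's reasoning alongside the result
            println("object:   ${item.value}")     // the extracted Ticket
        }
}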
Generate a reactive stream of text chunks as they arrive from the LLM.
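
As a sketch (again assuming a hypothetical createTextStream method returning Flux<String>), the chunk stream can also be reduced back into the complete response when incremental rendering is not needed:

import reactor.core.publisher.Mono

// Hypothetical: createTextStream and the Flux<String> element type are assumptions.
fun fullText(streaming: Streaming): Mono<String> =
    streaming
        .createTextStream()
        .reduce(StringBuilder()) { acc, chunk -> acc.append(chunk) }  // concatenate chunks in arrival order
        .map(StringBuilder::toString)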
Configure the streaming operation with a list of messages.
Configure the streaming operation with a single prompt message.
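
A sketch contrasting the two configuration entry points; the method names withMessages and withPrompt, as well as the message constructors, are assumptions used for illustration, not confirmed API:

// Hypothetical sketch: withMessages, withPrompt, createTextStream and the
// message types are assumptions; only the behaviour described above is documented.
fun configureStreaming(streaming: Streaming) {
    // Single prompt message: the simplest entry point
    streaming
        .withPrompt("List three follow-up questions")
        .createTextStream()
        .subscribe { print(it) }

    // List of messages: preserves prior conversation context
    streaming
        .withMessages(
            listOf(
                SystemMessage("You are a terse assistant."),
                UserMessage("List three follow-up questions"),
            )
        )
        .createTextStream()
        .subscribe { print(it) }
}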