StreamingCapability

Tag interface that marks streaming capability support.

This interface serves as a marker for objects that provide streaming operations, enabling polymorphic access to streaming functionality without creating circular dependencies between API packages.
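
To make the dependency argument concrete, here is a rough Kotlin sketch of the marker pattern. The package split and the streamText operation are assumptions; PromptRunner, supportsStreaming(), stream() and StreamingPromptRunnerOperations match the usage shown further down this page:

import kotlinx.coroutines.flow.Flow

// Illustrative sketch only; the real declarations may differ.

// Core API package: exposes only the marker, no streaming types.
interface StreamingCapability

interface PromptRunner {
    fun supportsStreaming(): Boolean
    fun stream(): StreamingCapability   // returns only the marker
}

// Streaming package: refines the marker with concrete operations.
interface StreamingPromptRunnerOperations : StreamingCapability {
    fun streamText(prompt: String): Flow<String>   // assumed operation
}

Because stream() returns only the marker type, the core package never needs to reference the streaming package, while streaming-aware code can still refine the result.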

Implementations of this interface provide reactive streaming capabilities that allow real-time processing of LLM responses as they arrive, supporting the following (a consumption sketch appears after the list):

  • Progressive text generation

  • Streaming object creation from JSONL responses

  • Mixed content streams with both objects and LLM reasoning (thinking)
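
The following hypothetical Kotlin sketch illustrates the first two kinds of stream above. The StreamingOps interface, the streamText and streamObjects operation names, the Person type and the prompts are invented for illustration and are not part of this API:

import kotlinx.coroutines.flow.*
import kotlinx.coroutines.runBlocking

// Hypothetical operations; the real streaming interface may differ.
data class Person(val name: String, val age: Int)

interface StreamingOps {
    fun streamText(prompt: String): Flow<String>                   // progressive text
    fun <T> streamObjects(prompt: String, type: Class<T>): Flow<T> // objects from JSONL
}

fun demo(ops: StreamingOps) = runBlocking {
    // Progressive text generation: handle each chunk as it arrives.
    ops.streamText("Summarise the report").collect { chunk -> print(chunk) }

    // Streaming object creation: each JSONL line is decoded into a typed object.
    ops.streamObjects("List three people as JSONL", Person::class.java)
        .collect { person -> println(person) }
}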

Usage:

val runner: PromptRunner = context.ai().autoLlm()
if (runner.supportsStreaming()) {
    val capability: StreamingCapability = runner.stream()
    // Cast directly, or use the asStreaming extension function
    val operations = capability as StreamingPromptRunnerOperations
    // Use streaming operations...
}

This interface follows the explicit failure policy: streaming operations throw exceptions if called on non-streaming implementations rather than providing fallback behavior.
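
As a hedged sketch of what that policy could look like behind the asStreaming extension mentioned in the usage example (the real implementation, exception type and message are not documented here), building on the declarations sketched earlier:

// Sketch only; the actual extension in the library may differ.
fun PromptRunner.asStreaming(): StreamingPromptRunnerOperations {
    // Explicit failure: no silent fallback for non-streaming runners.
    check(supportsStreaming()) { "This PromptRunner does not support streaming" }
    return stream() as? StreamingPromptRunnerOperations
        ?: error("stream() did not return StreamingPromptRunnerOperations")
}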

Inheritors