supportsStreaming

open override fun supportsStreaming(): Boolean

Checks whether this LLM service supports streaming operations.

Each LlmService instance is bound to a specific model, so this checks whether that particular model supports streaming.

Return

true if the underlying model supports streaming, false otherwise
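A minimal sketch of how a caller might branch on this check. The `LlmService` interface shown here is pared down to the one member documented above, and `FixedService` is a hypothetical implementation invented for illustration:

```kotlin
// Hypothetical minimal interface mirroring the documented member.
interface LlmService {
    fun supportsStreaming(): Boolean
}

// Hypothetical implementation: streaming support is fixed at construction,
// standing in for a service bound to a specific model.
class FixedService(private val streaming: Boolean) : LlmService {
    override fun supportsStreaming(): Boolean = streaming
}

fun main() {
    val service: LlmService = FixedService(streaming = true)
    if (service.supportsStreaming()) {
        println("Use the streaming API")   // model supports token-by-token output
    } else {
        println("Fall back to a blocking request")
    }
}
```

Because each service instance is bound to one model, this check only needs to be made once per service, not per request.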