create Message Streamer
Create a message streamer for this LLM configured with the given options.
The message streamer manages streaming LLM API calls. Tool execution is performed internally by the underlying LLM framework during streaming, so callers only consume the streamed output.
Return
A message streamer configured for this LLM
Parameters
options
Configuration options for the LLM call (temperature, max tokens, etc.)
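A minimal sketch of how this factory method might look and be used. All names here (`LLM`, `MessageStreamer`, `LLMCallOptions`, `create_message_streamer`) are illustrative assumptions, not the actual API; the stream body is a stand-in for a real LLM API call.

```python
from dataclasses import dataclass
from typing import Iterator


@dataclass
class LLMCallOptions:
    """Hypothetical configuration options for an LLM call."""
    temperature: float = 0.7
    max_tokens: int = 256


class MessageStreamer:
    """Hypothetical streamer yielding response chunks for a prompt."""

    def __init__(self, options: LLMCallOptions):
        self.options = options

    def stream(self, prompt: str) -> Iterator[str]:
        # A real implementation would call the streaming LLM API and yield
        # tokens as they arrive; per the docs above, tool execution would
        # happen inside the underlying framework, not in caller code.
        for chunk in ["Hello,", " world"][: self.options.max_tokens]:
            yield chunk


class LLM:
    def create_message_streamer(self, options: LLMCallOptions) -> MessageStreamer:
        # Return a streamer configured for this LLM with the given options.
        return MessageStreamer(options)


llm = LLM()
streamer = llm.create_message_streamer(LLMCallOptions(temperature=0.2, max_tokens=128))
chunks = list(streamer.stream("Hi"))
```

The point of the sketch is the shape of the call: the options object is fixed at creation time, and each call to `stream` yields incremental chunks rather than one complete message.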