LLM Models
List of supported LLM providers and models in Stubber
OpenAI
OpenAI models are called using their standard API format, with the following modifications:
O-Type Models
- o3-mini and o4-mini: We delete the temperature field as these models don't support it
- All O-type models: We convert image URLs into data URIs (some O-type models do not support images at all, e.g., o3-mini)
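A minimal sketch of these adjustments, assuming an OpenAI-style request body (the helper names and request shape here are illustrative, not Stubber's actual code):

```python
import base64
import mimetypes
import urllib.request

# Hypothetical helper: download an image and re-encode it as a data URI.
def image_url_to_data_uri(url: str) -> str:
    with urllib.request.urlopen(url) as response:
        data = response.read()
        media_type = (
            response.headers.get("Content-Type")
            or mimetypes.guess_type(url)[0]
            or "image/png"
        )
    return f"data:{media_type};base64,{base64.b64encode(data).decode('ascii')}"

# Hypothetical helper: adjust an OpenAI-style request for O-type models.
def prepare_o_type_request(model: str, request: dict) -> dict:
    if model in ("o3-mini", "o4-mini"):
        # These models reject the temperature field, so drop it.
        request.pop("temperature", None)
    for message in request.get("messages", []):
        content = message.get("content")
        if not isinstance(content, list):
            continue
        for part in content:
            if part.get("type") == "image_url":
                part["image_url"]["url"] = image_url_to_data_uri(part["image_url"]["url"])
    return request
```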
OpenAI Models
Anthropic (Claude)
Anthropic models require specific message format conversions:
Message Processing
- Consecutive roles: Anthropic does not support consecutive messages from the same role, so we merge consecutive same-role messages into a single message whose content is an array of the original messages' content
- System messages: System messages are extracted and concatenated into a single top-level system parameter, as Anthropic does not support system messages in the conversation flow
- Conversation start: Anthropic calls have to start with a user message. If the conversation does not start with one, we inject a simple user message that just says "hi"
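A rough sketch of these three steps, assuming OpenAI-style messages with plain-text content (function and variable names are illustrative, not Stubber's implementation):

```python
# Hypothetical conversion of OpenAI-style messages into Anthropic's format.
def to_anthropic_request(messages: list[dict]) -> dict:
    # System messages are pulled out of the conversation and concatenated
    # into the single top-level system parameter.
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    conversation = [m for m in messages if m["role"] != "system"]

    # Anthropic conversations must start with a user message; if this one
    # does not, inject a simple "hi" user message.
    if not conversation or conversation[0]["role"] != "user":
        conversation.insert(0, {"role": "user", "content": "hi"})

    # Anthropic does not allow consecutive messages from the same role, so
    # merge runs of same-role messages into one message whose content is an
    # array of the original content blocks.
    merged: list[dict] = []
    for message in conversation:
        block = {"type": "text", "text": message["content"]}
        if merged and merged[-1]["role"] == message["role"]:
            merged[-1]["content"].append(block)
        else:
            merged.append({"role": message["role"], "content": [block]})

    return {"system": "\n".join(system_parts), "messages": merged}
```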
Image Understanding
For Anthropic, Stubber accepts the standard OpenAI format for image understanding messages.
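A representative example, with placeholder values, shown as a Python dict (this is the standard OpenAI content-part shape rather than Stubber's exact payload):

```python
# OpenAI-style image understanding message (placeholder values).
openai_message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "What is in this image?"},
        {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
    ],
}
```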
Before these messages are sent to Anthropic, we convert them into Anthropic's image content-block structure.
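The converted message uses Anthropic's base64 image source block, roughly as follows (placeholder values; consult Anthropic's documentation for the authoritative schema):

```python
# Anthropic-style image content block after conversion (placeholder values).
anthropic_message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "What is in this image?"},
        {
            "type": "image",
            "source": {
                "type": "base64",
                "media_type": "image/jpeg",
                "data": "<base64-encoded image bytes>",
            },
        },
    ],
}
```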
Claude Models
Google Gemini
Gemini models have specific requirements and limitations:
Message Processing
- System message limitation: Gemini models cannot be used with only a system message - they require at least one user message
- Image processing: We convert image URLs into data URIs for compatibility
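A minimal sketch of the system-message check, assuming OpenAI-style role/content messages (the function name is illustrative, not Stubber's actual code):

```python
# Hypothetical validation: Gemini cannot be called with only a system message.
def validate_gemini_messages(messages: list[dict]) -> None:
    if not any(m.get("role") == "user" for m in messages):
        raise ValueError("Gemini requires at least one user message in the conversation")
```

Image URL conversion follows the same pattern as the image_url_to_data_uri sketch shown earlier for O-type models.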
Gemini Models
Other Providers
Additional Models
General Operations Applied to All Providers
Function Method Determination
We determine which function method to use based on the model and the provider:
- Models released after May 2025: Default to supplying functions to providers as tools
- Models released before May 2025: Default to supplying functions to providers as functions
This is mostly abstracted away from users and they do not need to worry about it, but you can explicitly set the function method for a model as a parameter (see function_method).
If a chat has already made a function call using a particular function method, future iterations of that chat will keep using that same method. For example, if you use a newer model (which defaults to tools) and then switch to an older model (which defaults to functions) in the same chat, the older model will also use tools.
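As a sketch of this selection logic (the precedence order, release-date cutoff handling, and parameter names are assumptions made for illustration, not Stubber's actual code):

```python
from datetime import date

# Hypothetical selection of the function method for a model within a chat.
def determine_function_method(
    model_release_date: date,
    explicit_function_method: str | None = None,
    chat_function_method: str | None = None,
) -> str:
    # Assumed precedence: an explicit function_method parameter wins, then the
    # method the chat has already used, then the date-based default.
    if explicit_function_method:
        return explicit_function_method
    if chat_function_method:
        return chat_function_method
    # Models released after May 2025 default to tools; earlier models default
    # to functions.
    if model_release_date > date(2025, 5, 31):
        return "tools"
    return "functions"
```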
Tool Result Injection
When using tools, tool-call messages from the LLM must be followed by a tool result. If users disable dynamic tasks (which Stubber uses to inject tool results after function calls), Stubber injects a simple tool result itself.
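A sketch of the mechanism, with the injected text shown as a hypothetical placeholder (the actual content Stubber injects is not reproduced here):

```python
# Hypothetical injection of a placeholder tool result after a tool call,
# used when dynamic tasks are disabled and no real result is available.
def inject_placeholder_tool_result(messages: list[dict]) -> list[dict]:
    if not messages:
        return messages
    last = messages[-1]
    if last.get("role") == "assistant" and last.get("tool_calls"):
        for call in last["tool_calls"]:
            messages.append({
                "role": "tool",
                "tool_call_id": call["id"],
                # Placeholder text; the result Stubber actually injects may differ.
                "content": "No tool result available.",
            })
    return messages
```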
All Supported Models
The following table shows all supported models and their capabilities: