Large Language Model Support#
Squirro connects to any Large Language Model (LLM) that supports the OpenAI API specification and tool calling. Squirro acts as an integration layer, allowing your agents to communicate with your chosen LLM provider.
Important
Squirro does not host, provide, or manage LLMs or their underlying infrastructure.
Compatibility Requirements#
To integrate with Squirro’s agent framework, your LLM must meet these technical requirements:
OpenAI API compatibility
The model must expose an API that follows the OpenAI API specification.
Tool calling support
Native tool calling capability or support provided by the inference engine (for example, vLLM).
These requirements apply to both cloud-based and on-premises models.
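To illustrate what these requirements mean in practice, the sketch below sends a single chat completion request with a tool definition to an OpenAI-compatible endpoint and checks whether the model returned a tool call. The base URL, API key, model name, and `get_weather` tool are placeholder assumptions, not Squirro configuration.

```python
# Minimal compatibility check against an OpenAI-compatible endpoint (illustrative only).
# The base URL, API key, model name, and tool definition are placeholder assumptions.
import json
from openai import OpenAI

client = OpenAI(base_url="https://llm.example.com/v1", api_key="YOUR_API_KEY")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="my-model",
    messages=[{"role": "user", "content": "What is the weather in Zurich?"}],
    tools=tools,
)

message = response.choices[0].message
if message.tool_calls:
    # The model chose to call the tool instead of answering directly.
    call = message.tool_calls[0]
    print("Tool call:", call.function.name, json.loads(call.function.arguments))
else:
    print("Answer:", message.content)
```

A model that rejects the `tools` parameter, or never produces `tool_calls`, does not meet the tool calling requirement even if its chat endpoint is otherwise OpenAI-compatible.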
Recommended Models#
The models listed below serve as a guideline and reflect the models Squirro has evaluated most closely. These recommendations are based on compatibility testing and customer deployments.
| Use Case | Recommended Model | Notes |
|---|---|---|
| Production environments | gpt-4o / gpt-4.1 (OpenAI) | Widely used and continuously validated in Squirro implementations. |
| Development environments | gpt-5 (OpenAI) | Refer to OpenAI documentation for current availability. |
| On-premises | OpenAI API compatible model | Must also support the tool calling API. |
Note
Model performance can vary significantly based on your specific use case, data complexity, and agent configuration. Squirro recommends testing multiple models in your environment before deploying to production.
Considerations#
Integration Architecture#
Squirro serves as an integration layer between your agents and your LLM infrastructure. When an agent needs LLM capabilities, the Squirro platform:
calls the external LLM API and handles token streaming.
orchestrates agent execution and tool-calling.
manages the API communication and response formatting.
returns the LLM’s response to your agent.
Your LLM infrastructure remains completely under your control and management.
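As a hedged sketch of the token-streaming part of this flow, an OpenAI-compatible provider returns incremental deltas that an integration layer can forward to the agent as they arrive. The endpoint, API key, and model name below are hypothetical placeholders.

```python
# Illustrative streaming sketch; the endpoint, API key, and model name are hypothetical.
from openai import OpenAI

client = OpenAI(base_url="https://llm.example.com/v1", api_key="YOUR_API_KEY")

stream = client.chat.completions.create(
    model="my-model",
    messages=[{"role": "user", "content": "Summarize the latest quarterly report."}],
    stream=True,  # tokens arrive as incremental deltas rather than one final message
)

for chunk in stream:
    # Forward each content delta as it arrives; some providers send empty chunks.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```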
Implementation Guidelines#
Although our platform is designed with OpenAI API compatibility in mind, we strongly encourage thorough testing when integrating other LLMs. Each model may have unique characteristics and performance metrics that could impact your application.
When changing the LLM provider, ensure you adjust the system prompts as they are optimized by default for OpenAI models. Different models may require different prompting strategies to achieve optimal performance. Refer to the documentation of your LLM provider for guidelines on how to adjust prompts effectively.
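The following is a minimal sketch of keeping provider-specific system prompts side by side; the prompt texts and provider keys are hypothetical examples, not Squirro's shipped prompts.

```python
# Illustrative only: the prompt texts and provider keys are hypothetical examples.
SYSTEM_PROMPTS = {
    "openai": "You are a helpful assistant. Use the provided tools when needed.",
    "other-provider": (
        "You are a helpful assistant. When a tool is relevant, "
        "reply with a tool call only and do not add free-form text."
    ),
}

def build_messages(provider: str, user_input: str) -> list[dict]:
    # Select the system prompt tuned for the active LLM provider.
    return [
        {"role": "system", "content": SYSTEM_PROMPTS[provider]},
        {"role": "user", "content": user_input},
    ]
```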
Getting Assistance#
For detailed configuration steps, refer to the How to Connect Squirro Chat to a Third-Party LLM guide and Configuring Agents and Tools documentation. For assistance with LLM integration, configuration, or troubleshooting, contact Squirro Support and submit a technical support request.