How to Connect SquirroGPT to a Third-Party LLM#

Profile: System Administrator

This page describes how system or project administrators, or anyone with access to the Server or Setup spaces of a SquirroGPT project, can connect their SquirroGPT application to a third-party LLM.

There are two steps involved in connecting to a third-party LLM:

  1. Obtain the required LLM information from your third-party provider.

  2. Configure your Squirro project or server settings to use your preferred LLM.

Reference: For information on connecting specifically to a Microsoft Azure LLM, see How to Connect SquirroGPT to a Microsoft Azure LLM.

1 - Obtain Third-Party LLM Information#

To connect SquirroGPT to a specific third-party LLM, you first need to set up the LLM externally and record certain information for use in the SquirroGPT configuration (Step 2). This third-party LLM information is used to authenticate your SquirroGPT application when it connects.

The information you need depends on the type of LLM you are connecting to. The list below outlines the required information for each supported LLM type.

OpenAI

  • OpenAI API key

Azure

  • Deployment Name

  • Base URL for API

  • Azure API key

  • API version (optional)

OpenAI-API compatible

  • API key

  • Model name

  • Base URL of the provider

Note

How you access this information will depend on your specific third-party LLM provider.
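As a quick sanity check before moving on to Step 2, the requirements above can be expressed as a small validator. This is an illustrative sketch, not part of Squirro; the field names mirror the configuration keys used in the examples later on this page:

```python
# Required information per supported LLM type, mirroring the list above.
# Key names match the configuration keys used in the JSON examples below.
REQUIRED_FIELDS = {
    "openai": {"openai_api_key"},
    "azure": {"deployment_name", "openai_api_base", "openai_api_key"},
    "openai-compatible": {"openai_api_key", "model", "base_url"},
}

def missing_fields(model_type: str, collected: dict) -> set:
    """Return the required fields you have not yet collected."""
    return REQUIRED_FIELDS[model_type] - set(collected)

# Example: an Azure setup where the API key is still missing.
info = {
    "deployment_name": "my-deployment",  # hypothetical deployment name
    "openai_api_base": "https://example.openai.azure.com",
}
print(missing_fields("azure", info))  # {'openai_api_key'}
```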

2 - Configure Your Squirro Server or Project Settings#

There are two ways to configure usage of the third-party LLM:

  • Configure your Squirro instance at the server level, meaning all existing and future SquirroGPT projects will use the third-party LLM you specify.

  • Configure your SquirroGPT application at the project level, meaning only that particular project you are working within will use the third-party LLM you specify.

Tip

Typically, Squirro recommends configuring your Squirro instance at the server level.

Configuring Your Squirro Instance at the Server Level#

To configure your Squirro instance at the server level, follow the steps below:

  1. Log in to your Squirro instance.

  2. Click on Server in the spaces navigation bar.

  3. Click on Configuration.

  4. Click on the plus icon to create a new configuration. (See note below if you have previously created a configuration.)

  5. Enter the following name: genai.sqgpt.settings.

  6. Select Dictionary as the data type.


  7. Paste the appropriate values as per the LLM Configuration Values section below.

Note

If you have already created the genai.sqgpt.settings configuration, simply edit the existing configuration and paste the values from the LLM Configuration Values section.
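Because the Dictionary value is parsed as JSON, it can help to check the value locally before pasting it into the configuration field. A minimal sketch, assuming Python is available (the shape checks are illustrative):

```python
import json

def check_sqgpt_settings(raw: str) -> dict:
    """Parse a genai.sqgpt.settings value and confirm its basic shape."""
    settings = json.loads(raw)  # raises ValueError if the JSON is malformed
    llm = settings["llm"]       # the top-level "llm" key is required
    assert "kind" in llm, 'the "kind" key (e.g. "openai" or "azure") is required'
    return settings

value = '{"llm": {"kind": "openai", "openai_api_key": "<your OpenAI API key>"}}'
print(check_sqgpt_settings(value)["llm"]["kind"])  # openai
```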

Configuring Your SquirroGPT Application at the Project Level#

To configure your SquirroGPT application at the project level, follow the steps below:

  1. Log in to your Squirro instance.

  2. Click on Setup in the spaces navigation bar.

  3. Click on Settings.

  4. Click on Project Configuration.

  5. Click on the plus icon to create a new configuration. (See note below if you have previously created a configuration.)

  6. Enter the following name: genai.sqgpt.settings.

  7. Select Dictionary as the data type.


  8. Paste the appropriate values as per the LLM Configuration Values section below.

Note

If you have already created the genai.sqgpt.settings configuration, simply edit the existing configuration and paste the values from the LLM Configuration Values section.

LLM Configuration Values#

The following values are used in the final step above: Step 7 if you are configuring your Squirro instance at the server level, or Step 8 if you are configuring your SquirroGPT application at the project level.

To properly configure your Squirro instance or project, paste the values appropriate for your third-party LLM into the configuration, filling in the information you collected in Step 1. The `#` comments in the examples below are explanatory only; omit them when pasting, as the Dictionary value must be valid JSON.

Configuration for OpenAI Models#

{
    "llm": {
        "kind": "openai",                          # Indicates OpenAI model (required)
        "openai_api_key": "<your OpenAI API key>", # Your OpenAI API key (required)
        "model": "gpt-3.5-turbo"                   # Model name (optional, default is "gpt-3.5-turbo")
    }
}

Configuration for Azure-hosted OpenAI Models#

{
    "llm": {
        "kind": "azure",                           # Indicates Azure-hosted model (required)
        "deployment_name": "<your deployment>",    # Your deployment name (required)
        "openai_api_base": "<your base URL>",      # Base URL for the API (required)
        "openai_api_key": "<your Azure API key>",  # Your Azure API key (required)
        "openai_api_version": "2023-05-15"         # API version (optional, default is "2023-05-15")
    }
}
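To illustrate how these three values fit together: Azure-hosted OpenAI deployments are reached through a URL built from the base URL, deployment name, and API version. The URL shape below follows Azure OpenAI's REST API; the deployment name is a placeholder:

```python
def azure_chat_url(api_base: str, deployment_name: str,
                   api_version: str = "2023-05-15") -> str:
    """Build the chat-completions endpoint for an Azure-hosted OpenAI deployment."""
    return (
        f"{api_base.rstrip('/')}/openai/deployments/{deployment_name}"
        f"/chat/completions?api-version={api_version}"
    )

url = azure_chat_url("https://example.openai.azure.com", "my-deployment")
print(url)
# https://example.openai.azure.com/openai/deployments/my-deployment/chat/completions?api-version=2023-05-15
```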

Note

For more detailed information on connecting to a Microsoft Azure LLM, see How to Connect SquirroGPT to a Microsoft Azure LLM.

Configuration for OpenAI-API Compatible Models#

{
    "llm": {
        "kind": "openai",                          # Indicates OpenAI-API compatible model (required)
        "openai_api_key": "<your API key>",        # Your API key (required)
        "model": "<your model>",                   # Model name (required)
        "base_url": "<base URL of provider>"       # Base URL of the model provider (required)
    }
}

Example: Configuration for Mixtral deployed on Anyscale:

{
    "llm": {
        "kind": "openai",
        "openai_api_key": "<esecret_…>",
        "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
        "base_url": "https://api.endpoints.anyscale.com/v1"
    }
}

Note

For Anyscale deployments, you can alternatively set kind to anyscale. This automatically sets the base_url to https://api.endpoints.anyscale.com/v1.
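The shortcut described above can be sketched as a simple normalization step: when kind is anyscale, the base_url is filled in automatically. This is an illustrative sketch of the documented behavior, not Squirro's actual implementation:

```python
ANYSCALE_BASE_URL = "https://api.endpoints.anyscale.com/v1"

def resolve_base_url(llm: dict) -> dict:
    """Fill in base_url automatically when kind is "anyscale"."""
    llm = dict(llm)  # copy so the caller's dict is untouched
    if llm.get("kind") == "anyscale":
        llm.setdefault("base_url", ANYSCALE_BASE_URL)
    return llm

cfg = resolve_base_url({"kind": "anyscale", "openai_api_key": "<your API key>"})
print(cfg["base_url"])  # https://api.endpoints.anyscale.com/v1
```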