How to Connect SquirroGPT to a Microsoft Azure LLM#

Profile: System Administrator

This page describes how system or project administrators, or anyone with access to the Server or Setup spaces of a SquirroGPT project, can connect their SquirroGPT application to a Microsoft Azure LLM.

There are two steps involved in connecting to a Microsoft Azure LLM:

  1. Obtain Microsoft Azure LLM information via the Microsoft Developer Portal.

  2. Configure your Squirro project or server settings to use your preferred Microsoft Azure LLM.

1 - Obtain Microsoft Azure LLM Information#

To set up SquirroGPT access to a specific Microsoft Azure LLM, you will need information about your OpenAI resource and model deployment from your Microsoft Developer account.

Note

This guide assumes you have already set up your resource and deployed a model. For information on creating a resource and deploying a model within it, see Microsoft’s official guide to Creating and Deploying an OpenAI Service Resource.

Your deployed model information will be used to authenticate your SquirroGPT application when you attempt to connect it. You will need to take note of the following information from your Microsoft Developer account:

  • Deployment name

  • OpenAI API base

  • OpenAI API key

  • OpenAI API version

To access this information, follow the steps below:

  1. Log in to the Microsoft Developer Portal with your Microsoft ID at https://portal.azure.com/.

  2. Locate and click on your OpenAI resource that contains your deployed model.

  3. Click Model Deployments.

  4. Click Manage Deployments.

  5. Click your deployment name (for example, Swiss_North).

  6. Click Edit Deployment.

  7. Take note of your Deployment name.

  8. Return to https://portal.azure.com/.

  9. Click Overview in the left navigation menu.

  10. Next to Endpoints, click Click here to view endpoints.

  11. Take note of your KEY 1 value. This is your OpenAI API key.

  12. Take note of your endpoint. This is your OpenAI API base.

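Once you have noted all four values, you can sanity-check them before touching any Squirro settings. As a rough guide, Azure OpenAI requests for a deployment go to a URL of the form `{endpoint}/openai/deployments/{deployment}/chat/completions?api-version={version}`. The sketch below assembles that URL from your noted values; the endpoint and deployment name shown are placeholders, not values from your account.

```python
# Assemble the Azure OpenAI chat-completions URL from the values noted above.
# All concrete values below are placeholders; substitute your own.

def azure_chat_url(api_base: str, deployment: str, api_version: str) -> str:
    """Return the REST URL a client would call for this deployment."""
    return (
        f"{api_base.rstrip('/')}/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )

url = azure_chat_url(
    "https://my-resource.openai.azure.com",  # OpenAI API base (placeholder)
    "Swiss_North",                           # Deployment name (placeholder)
    "2023-05-15",                            # OpenAI API version
)
print(url)
```

If a request to that URL with your KEY 1 value in the `api-key` header succeeds, the noted values are correct and ready to use in the next section.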

2 - Configure Your Squirro Server or Project Settings#

There are two options for configuring use of the Microsoft Azure LLM:

  • Configure your Squirro instance at the server level, meaning all existing and future SquirroGPT projects will use the Azure LLM you specify.

  • Configure your SquirroGPT application at the project level, meaning only that particular project you are working within will use the Azure LLM you specify.

The configuration steps are the same for any external LLM. Follow the steps in the section Configuring Your Squirro Server or Project Settings to configure your Squirro instance to use the Microsoft Azure LLM. The llm setting should have the following value:

{
    "llm": {
        "deployment_name": "<value from Step 1>",
        "kind": "azure",
        "openai_api_base": "<value from Step 1>",
        "openai_api_key": "<value from Step 1>",
        "openai_api_version": "2023-05-15",
        "request_timeout_s": [
            5,
            60
        ]
    }
}
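Before pasting the block into your server or project settings, it can help to validate it, since a missing key or a typo in the kind value will only surface later as a failed request. The sketch below is a hypothetical pre-flight check, not part of Squirro itself; the field names match the example above, and the concrete values are placeholders.

```python
import json

# Keys the llm settings block above must contain (per the example in this guide).
REQUIRED = {
    "deployment_name", "kind", "openai_api_base",
    "openai_api_key", "openai_api_version", "request_timeout_s",
}

def validate_llm_settings(raw: str) -> dict:
    """Parse the JSON settings string and check it matches the expected shape."""
    llm = json.loads(raw)["llm"]
    missing = REQUIRED - llm.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    if llm["kind"] != "azure":
        raise ValueError('kind must be "azure" for a Microsoft Azure LLM')
    if not (isinstance(llm["request_timeout_s"], list)
            and len(llm["request_timeout_s"]) == 2):
        raise ValueError("request_timeout_s must be a list of two numbers")
    return llm

# Placeholder settings string mirroring the example above.
settings = """{
    "llm": {
        "deployment_name": "Swiss_North",
        "kind": "azure",
        "openai_api_base": "https://my-resource.openai.azure.com",
        "openai_api_key": "<your key>",
        "openai_api_version": "2023-05-15",
        "request_timeout_s": [5, 60]
    }
}"""
llm = validate_llm_settings(settings)
print(llm["kind"])
```

If validation passes, the same JSON string can be pasted into the llm setting at either the server or the project level, as described above.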