3. Under **AI Connection**, click the dropdown and select **Create new connection**. The **Create AI Connection** view opens:
4. In the **Provider** dropdown, select one of the supported LLM providers. Anthropic, Azure OpenAI, Cerebras, Cohere, Deep Infra, DeepSeek, Google Generative AI, Google Vertex AI, Groq, Mistral, OpenAI, OpenAI-compatible providers, Perplexity, Together.ai, and xAI Grok are currently supported.
5. In the **Model** dropdown, select one of the models your LLM provider supports. (If you're configuring Azure OpenAI, select **Custom** instead of a model and complete the other [Azure OpenAI-specific steps](#azure-openai).)
6. (Optional) If the model you're looking for is not listed, specify a custom model in the **Custom model** field. This overrides whatever you've selected under **Model**.
7. Enter your API key for the selected LLM provider in the **API Key** field.
8. (Optional) Enter a **Base URL** for the selected model. This is useful if you want to use a proxy or if your LLM provider does not use the default base URL. If you selected _OpenAI Compatible_ as the provider, the **Base URL** is required.
9. (Optional) Use the **Provider settings** and **Model settings** fields to specify custom parameters as JSON. The JSON schema varies depending on the chosen provider and model:
* See the [Azure OpenAI](#azure-openai) and [Google Vertex AI](#google-vertex-ai) instructions for custom provider settings required by these providers.
* If you've selected OpenAI, use **Provider settings** for JSON you'd normally pass to the `createOpenAI` function, and **Model settings** for JSON you'd normally pass to the `streamText` function. For more details, see the [OpenAI documentation](https://platform.openai.com/docs/api-reference).
10. Click **Save** to apply your changes in the **Create AI Connection** view.
11. (Optional) Back in the **OpenOps AI** section, if you're working with AWS and you want your AI connection to have access to AWS MCP servers, go to the **MCP** section and select an [AWS connection](/cloud-access/access-levels-permissions/#aws-connections) in the **AWS Cost** dropdown. This enables access to AWS Cost Explorer MCP Server, AWS Pricing MCP Server, and AWS Billing and Cost Management MCP Server.
12. Click **Save** to apply your changes in the **OpenOps AI** section.
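To make the **Provider settings** and **Model settings** step concrete, here is a hypothetical pair of settings for an OpenAI connection. This is a sketch only: the exact keys depend on your provider and model, and the names shown follow Vercel AI SDK conventions for `createOpenAI` and `streamText`, so verify them against your provider's documentation. For example, **Provider settings** might pass an organization ID and custom request headers (`org-example` and `X-Example-Header` are placeholder values):

```json
{
  "organization": "org-example",
  "headers": {
    "X-Example-Header": "value"
  }
}
```

while **Model settings** might tune generation parameters:

```json
{
  "temperature": 0.2,
  "maxTokens": 1024
}
```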