
Commit 726c75e

Documentation Update: 0.6.10 (DOC-145) (#264)
1 parent e1ee9e4 commit 726c75e

19 files changed: +20 -18 lines

ai-assistance/llm-connections.mdx

Lines changed: 15 additions & 15 deletions
@@ -14,22 +14,22 @@ To enable AI and configure an LLM connection:
 1. In the OpenOps left sidebar, click the **Settings** icon at the bottom:
    <NarrowImage src="/images/access-llm-settings-icon.png" alt="Settings icon" widthPercent={30} />
 2. In the **Settings** view, click **OpenOps AI**:
-   ![AI providers](/images/access-llm-ai-providers.png)
-3. Switch on the **Enable OpenOps AI** toggle.
-4. In the **Connection** dropdown, select **Create Connection**. The **Create AI Connection** view opens:
+   ![OpenOps AI settings](/images/access-llm-ai-providers.png)
+3. Under **AI Connection**, click the dropdown and select **Create new connection**. The **Create AI Connection** view opens:
    ![Create AI Connection](/images/access-llm-create-connection.png)
-5. In the **Provider** dropdown, select one of the supported LLM providers. Anthropic, Azure OpenAI, Cerebras, Cohere, Deep Infra, DeepSeek, Google Generative AI, Google Vertex AI, Groq, Mistral, OpenAI, OpenAI-compatible providers, Perplexity, Together.ai, and xAI Grok are currently supported.
-6. In the **Model** dropdown, select one of the models your LLM provider supports. (If you're configuring Azure OpenAI, select **Custom** instead of a model and complete the other [Azure OpenAI-specific steps](#azure-openai).)
-7. (Optional) If the model you're looking for is not listed, specify a custom model in the **Custom model** field. This overrides whatever you've selected under **Model**.
-8. Enter your API key for the selected LLM provider in the **API Key** field.
-9. (Optional) Enter a **Base URL** for the selected model. This is useful if you want to use a proxy or if your LLM provider does not use the default base URL. If you selected _OpenAI Compatible_ as the provider, then you are required to enter the base URL.
-10. (Optional) Use the **Provider settings** and **Model settings** fields to specify custom parameters as JSON. The JSON schema varies depending on the chosen provider and model:
-    * See the [Azure OpenAI](#azure-openai) and [Google Vertex AI](#google-vertex-ai) instructions for custom provider settings required by these providers.
-    * If you've selected OpenAI, use **Provider settings** for JSON you'd normally pass to the `createOpenAI` function, and **Model settings** for JSON you'd normally pass to the `streamText` function. For more details, see the [OpenAI documentation](https://platform.openai.com/docs/api-reference).
-11. Click **Save** to apply your changes in the **Create AI Connection** view.
-12. (Optional) Back in the **AI providers** section, if you're working with AWS and you want your AI connection to have access to AWS MCP servers, click **AWS Cost** in the **MCP** section, and select an [AWS connection](/cloud-access/access-levels-permissions#aws-connections) to use. This enables access to AWS Cost Explorer MCP Server, AWS Pricing MCP Server, and AWS Billing and Cost Management MCP Server.
-   ![AWS Cost MCP connection](/images/access-llm-mcp.png)
-13. Click **Save** to apply your changes in the **AI providers** section.
+4. In the **Provider** dropdown, select one of the supported LLM providers. Anthropic, Azure OpenAI, Cerebras, Cohere, Deep Infra, DeepSeek, Google Generative AI, Google Vertex AI, Groq, Mistral, OpenAI, OpenAI-compatible providers, Perplexity, Together.ai, and xAI Grok are currently supported.
+5. In the **Model** dropdown, select one of the models your LLM provider supports. (If you're configuring Azure OpenAI, select **Custom** instead of a model and complete the other [Azure OpenAI-specific steps](#azure-openai).)
+6. (Optional) If the model you're looking for is not listed, specify a custom model in the **Custom model** field. This overrides whatever you've selected under **Model**.
+7. Enter your API key for the selected LLM provider in the **API Key** field.
+8. (Optional) Enter a **Base URL** for the selected model. This is useful if you want to use a proxy or if your LLM provider does not use the default base URL. If you selected _OpenAI Compatible_ as the provider, then you are required to enter the base URL.
+9. (Optional) Use the **Provider settings** and **Model settings** fields to specify custom parameters as JSON. The JSON schema varies depending on the chosen provider and model:
+    * See the [Azure OpenAI](#azure-openai) and [Google Vertex AI](#google-vertex-ai) instructions for custom provider settings required by these providers.
+    * If you've selected OpenAI, use **Provider settings** for JSON you'd normally pass to the `createOpenAI` function, and **Model settings** for JSON you'd normally pass to the `streamText` function. For more details, see the [OpenAI documentation](https://platform.openai.com/docs/api-reference).
+10. Click **Save** to apply your changes in the **Create AI Connection** view.
+11. (Optional) Back in the **OpenOps AI** section, if you're working with AWS and you want your AI connection to have access to AWS MCP servers, go to the **MCP** section and select an [AWS connection](/cloud-access/access-levels-permissions/#aws-connections) in the **AWS Cost** dropdown:
+    ![AWS Cost MCP connection](/images/access-llm-mcp.png)
+    This enables access to AWS Cost Explorer MCP Server, AWS Pricing MCP Server, and AWS Billing and Cost Management MCP Server.
+
 
 Configuring an LLM connection enables all [AI assistance features](/ai-assistance/overview) in OpenOps.
 
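The **Base URL** field in step 8 of the updated instructions is easiest to see in code. Below is a minimal TypeScript sketch, assuming the connection behaves like the Vercel AI SDK's `createOpenAI` factory (the function step 9 already references); the proxy URL, environment variable, and model ID are hypothetical placeholders, not values from the docs.

```ts
// Minimal sketch of what the Base URL field overrides, assuming the
// connection is built on the Vercel AI SDK's createOpenAI factory.
// The URL, env var, and model ID are hypothetical placeholders.
import { createOpenAI } from "@ai-sdk/openai";

const openai = createOpenAI({
  // With no baseURL, the SDK targets OpenAI's default endpoint,
  // https://api.openai.com/v1. Override it to route through a proxy, or to
  // reach an OpenAI-compatible provider, where step 8 says it is required.
  baseURL: "https://llm-proxy.example.com/v1",
  apiKey: process.env.LLM_API_KEY, // corresponds to the API Key field (step 7)
});

// Any model resolved through this provider now calls the custom base URL:
const model = openai("gpt-4o");
```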
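Similarly, the **Provider settings** and **Model settings** JSON fields in step 9 map onto the two functions the step names. The TypeScript sketch below illustrates that mapping for the OpenAI case; the specific keys and values (`organization`, `headers`, `temperature`, `topP`) are illustrative examples, not required settings.

```ts
// Illustrative mapping of the two JSON settings fields onto the functions
// named in step 9. Keys and values are examples only; the accepted schema
// depends on the provider and model you chose.
import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

// JSON you might enter in the "Provider settings" field:
const providerSettings = {
  organization: "org-example123",  // hypothetical OpenAI organization ID
  headers: { "X-Team": "finops" }, // extra headers sent with each request
};

// JSON you might enter in the "Model settings" field:
const modelSettings = {
  temperature: 0.2, // lower values give more deterministic answers
  topP: 0.9,
};

// Roughly equivalent SDK calls:
const openai = createOpenAI(providerSettings);
const result = streamText({
  model: openai("gpt-4o"), // the Model dropdown selection (placeholder ID)
  prompt: "Summarize last month's EC2 spend drivers.",
  ...modelSettings,
});

// Stream the response to stdout.
for await (const text of result.textStream) {
  process.stdout.write(text);
}
```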
Binary image changes (previews not shown): images/access-llm-ai-providers.png, images/access-llm-mcp.png, images/actions-high-risk.png.
