Merged
30 changes: 15 additions & 15 deletions ai-assistance/llm-connections.mdx
@@ -14,22 +14,22 @@ To enable AI and configure an LLM connection:
1. In the OpenOps left sidebar, click the **Settings** icon at the bottom:
<NarrowImage src="/images/access-llm-settings-icon.png" alt="Settings icon" widthPercent={30} />
2. In the **Settings** view, click **OpenOps AI**:
![AI providers](/images/access-llm-ai-providers.png)
3. Switch on the **Enable OpenOps AI** toggle.
4. In the **Connection** dropdown, select **Create Connection**. The **Create AI Connection** view opens:
![OpenOps AI settings](/images/access-llm-ai-providers.png)
3. Under **AI Connection**, click the dropdown and select **Create new connection**. The **Create AI Connection** view opens:
![Create AI Connection](/images/access-llm-create-connection.png)
5. In the **Provider** dropdown, select one of the supported LLM providers. Anthropic, Azure OpenAI, Cerebras, Cohere, Deep Infra, DeepSeek, Google Generative AI, Google Vertex AI, Groq, Mistral, OpenAI, OpenAI-compatible providers, Perplexity, Together.ai, and xAI Grok are currently supported.
6. In the **Model** dropdown, select one of the models your LLM provider supports. (If you're configuring Azure OpenAI, select **Custom** instead of a model and complete the other [Azure OpenAI-specific steps](#azure-openai).)
7. (Optional) If the model you're looking for is not listed, specify a custom model in the **Custom model** field. This overrides whatever you've selected under **Model**.
8. Enter your API key for the selected LLM provider in the **API Key** field.
9. (Optional) Enter a **Base URL** for the selected model. This is useful if you want to use a proxy or if your LLM provider does not use the default base URL. If you selected _OpenAI Compatible_ as the provider, then you are required to enter the base URL.
10. (Optional) Use the **Provider settings** and **Model settings** fields to specify custom parameters as JSON. The JSON schema varies depending on the chosen provider and model:
* See the [Azure OpenAI](#azure-openai) and [Google Vertex AI](#google-vertex-ai) instructions for custom provider settings required by these providers.
* If you've selected OpenAI, use **Provider settings** for JSON you'd normally pass to the `createOpenAI` function, and **Model settings** for JSON you'd normally pass to the `streamText` function. For more details, see the [OpenAI documentation](https://platform.openai.com/docs/api-reference).
11. Click **Save** to apply your changes in the **Create AI Connection** view.
12. (Optional) Back in the **AI providers** section, if you're working with AWS and you want your AI connection to have access to AWS MCP servers, click **AWS Cost** in the **MCP** section, and select an [AWS connection](/cloud-access/access-levels-permissions#aws-connections) to use. This enables access to AWS Cost Explorer MCP Server, AWS Pricing MCP Server, and AWS Billing and Cost Management MCP Server.
![AWS Cost MCP connection](/images/access-llm-mcp.png)
13. Click **Save** to apply your changes in the **AI providers** section.
4. In the **Provider** dropdown, select one of the supported LLM providers. Anthropic, Azure OpenAI, Cerebras, Cohere, Deep Infra, DeepSeek, Google Generative AI, Google Vertex AI, Groq, Mistral, OpenAI, OpenAI-compatible providers, Perplexity, Together.ai, and xAI Grok are currently supported.
5. In the **Model** dropdown, select one of the models your LLM provider supports. (If you're configuring Azure OpenAI, select **Custom** instead of a model and complete the other [Azure OpenAI-specific steps](#azure-openai).)
6. (Optional) If the model you're looking for is not listed, specify a custom model in the **Custom model** field. This overrides whatever you've selected under **Model**.
7. Enter your API key for the selected LLM provider in the **API Key** field.
8. (Optional) Enter a **Base URL** for the selected provider. This is useful if you're routing requests through a proxy or if your LLM provider doesn't use the default base URL. If you selected _OpenAI Compatible_ as the provider, the base URL is required.
9. (Optional) Use the **Provider settings** and **Model settings** fields to specify custom parameters as JSON. The JSON schema varies depending on the chosen provider and model:
* See the [Azure OpenAI](#azure-openai) and [Google Vertex AI](#google-vertex-ai) instructions for custom provider settings required by these providers.
* If you've selected OpenAI, use **Provider settings** for JSON you'd normally pass to the `createOpenAI` function, and **Model settings** for JSON you'd normally pass to the `streamText` function. For more details, see the [OpenAI documentation](https://platform.openai.com/docs/api-reference).
10. Click **Save** to apply your changes in the **Create AI Connection** view.
11. (Optional) Back in the **OpenOps AI** section, if you're working with AWS and you want your AI connection to have access to AWS MCP servers, go to the **MCP** section and select an [AWS connection](/cloud-access/access-levels-permissions/#aws-connections) in the **AWS Cost** dropdown:
![AWS Cost MCP connection](/images/access-llm-mcp.png)
This enables access to AWS Cost Explorer MCP Server, AWS Pricing MCP Server, and AWS Billing and Cost Management MCP Server.
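
To illustrate the **Provider settings** and **Model settings** fields for an OpenAI connection, here is one hypothetical pair of values. The field names follow options commonly accepted by the `createOpenAI` and `streamText` functions, and the values are placeholders — consult your provider's documentation for the exact schema it expects. **Provider settings** might look like:

```json
{
  "organization": "org-your-org-id",
  "headers": {
    "x-proxy-token": "example-token"
  }
}
```

and **Model settings** might look like:

```json
{
  "temperature": 0.2,
  "topP": 0.9
}
```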


Configuring an LLM connection enables all [AI assistance features](/ai-assistance/overview) in OpenOps.

Binary file modified images/access-connection-from-workflow.png
Binary file modified images/access-llm-ai-providers.png
Binary file modified images/access-llm-ask-ai-action.png
Binary file modified images/access-llm-generate-with-ai.png
Binary file modified images/access-llm-mcp.png
Binary file added images/actions-high-risk.png
Binary file modified images/cookbook/aws-ebs-get-snapshots.png
Binary file modified images/cookbook/aws-get-account-ids-properties.png
Binary file modified images/cookbook/azure-command-inside-loop.png
Binary file modified images/cookbook/azure-list-subscriptions-properties.png
Binary file modified images/cookbook/gcp-command-inside-loop.png
Binary file modified images/cookbook/gcp-list-projects-properties.png
Binary file modified images/parameters-dynamic-value.png
Binary file modified images/qsg-aws-step-properties.png
Binary file modified images/workflow-editor-action-mandatory-fields.png
Binary file modified images/workflow-editor-action-properties.png
4 changes: 3 additions & 1 deletion workflow-management/actions.mdx
@@ -86,7 +86,6 @@ These actions implement [Human-in-the-Loop](/workflow-management/human-in-the-lo
* **Linear**: create and update issues and comments.
* **Zendesk**: create or update tickets, perform other Zendesk API calls.


## Make changes to cloud resources

These are integration actions that provide various ways to make and request changes to your cloud resources via cloud provider APIs, infrastructure-as-code (IaC) tools, or pull requests.
@@ -103,6 +102,9 @@ These are integration actions that provide various ways to make and request chan
* **GitHub**: retrieve file content, create pull requests, or trigger GitHub Actions runs.
* **Archera**: apply commitment plans for cloud providers.

Since many of these actions can make destructive changes to your cloud environment, they are considered high-risk and are marked with a red shield icon in the [workflow editor](/workflow-management/building-workflows/):
<NarrowImage src="/images/actions-high-risk.png" alt="An action with a high-risk warning icon" widthPercent={50} />

## Interact with project management tools

These actions allow you to retrieve and update data in issues and boards in project management tools.
4 changes: 2 additions & 2 deletions workflow-management/building-workflows.mdx
@@ -155,8 +155,8 @@ Under **Step output**, you can click **Test Step** to test or retest the step in
Sometimes, the **Test Step** button may be grayed out, like this:
<NarrowImage src="/images/workflow-editor-action-test-fix-inputs.png" alt="Test button grayed out" />

Your step will also have a yellow warning icon next to it. If this happens, make sure you have filled in every mandatory input field in the step's properties pane. You may need to scroll through the properties pane to see all required fields:
<NarrowImage src="/images/workflow-editor-action-mandatory-fields.png" alt="Scrolling the properties pane to reveal mandatory input fields" />
Your step will also have a yellow warning icon next to it. If this happens, make sure you have filled in every mandatory input field in the step's properties pane. You may need to scroll through the properties pane to see all required fields.
<NarrowImage src="/images/workflow-editor-action-mandatory-fields.png" alt="Mandatory input fields in the properties pane" />

Once you've filled in all mandatory fields, the **Test Step** button becomes active. Click it to run the test and view the output generated by the step:
<NarrowImage src="/images/workflow-editor-action-generate-sample-data.png" alt="Output data generated for an Umbrella integration action" />