diff --git a/ai-assistance/llm-connections.mdx b/ai-assistance/llm-connections.mdx
index 2851152..95d1e69 100644
--- a/ai-assistance/llm-connections.mdx
+++ b/ai-assistance/llm-connections.mdx
@@ -14,22 +14,22 @@ To enable AI and configure an LLM connection:
1. In the OpenOps left sidebar, click the **Settings** icon at the bottom:
2. In the **Settings** view, click **OpenOps AI**:
-
-3. Switch on the **Enable OpenOps AI** toggle.
-4. In the **Connection** dropdown, select **Create Connection**. The **Create AI Connection** view opens:
+
+3. Under **AI Connection**, click the dropdown and select **Create new connection**. The **Create AI Connection** view opens:

-5. In the **Provider** dropdown, select one of the supported LLM providers. Anthropic, Azure OpenAI, Cerebras, Cohere, Deep Infra, DeepSeek, Google Generative AI, Google Vertex AI, Groq, Mistral, OpenAI, OpenAI-compatible providers, Perplexity, Together.ai, and xAI Grok are currently supported.
-6. In the **Model** dropdown, select one of the models your LLM provider supports. (If you're configuring Azure OpenAI, select **Custom** instead of a model and complete the other [Azure OpenAI-specific steps](#azure-openai).)
-7. (Optional) If the model you're looking for is not listed, specify a custom model in the **Custom model** field. This overrides whatever you've selected under **Model**.
-8. Enter your API key for the selected LLM provider in the **API Key** field.
-9. (Optional) Enter a **Base URL** for the selected model. This is useful if you want to use a proxy or if your LLM provider does not use the default base URL. If you selected _OpenAI Compatible_ as the provider, then you are required to enter the base URL.
-10. (Optional) Use the **Provider settings** and **Model settings** fields to specify custom parameters as JSON. The JSON schema varies depending on the chosen provider and model:
- * See the [Azure OpenAI](#azure-openai) and [Google Vertex AI](#google-vertex-ai) instructions for custom provider settings required by these providers.
- * If you've selected OpenAI, use **Provider settings** for JSON you'd normally pass to the `createOpenAI` function, and **Model settings** for JSON you'd normally pass to the `streamText` function. For more details, see the [OpenAI documentation](https://platform.openai.com/docs/api-reference).
-11. Click **Save** to apply your changes in the **Create AI Connection** view.
-12. (Optional) Back in the **AI providers** section, if you're working with AWS and you want your AI connection to have access to AWS MCP servers, click **AWS Cost** in the **MCP** section, and select an [AWS connection](/cloud-access/access-levels-permissions#aws-connections) to use. This enables access to AWS Cost Explorer MCP Server, AWS Pricing MCP Server, and AWS Billing and Cost Management MCP Server.
-
-13. Click **Save** to apply your changes in the **AI providers** section.
+4. In the **Provider** dropdown, select one of the supported LLM providers. Anthropic, Azure OpenAI, Cerebras, Cohere, Deep Infra, DeepSeek, Google Generative AI, Google Vertex AI, Groq, Mistral, OpenAI, OpenAI-compatible providers, Perplexity, Together.ai, and xAI Grok are currently supported.
+5. In the **Model** dropdown, select one of the models your LLM provider supports. (If you're configuring Azure OpenAI, select **Custom** instead of a model and complete the other [Azure OpenAI-specific steps](#azure-openai).)
+6. (Optional) If the model you're looking for is not listed, specify a custom model in the **Custom model** field. This overrides the selection in the **Model** dropdown.
+7. Enter your API key for the selected LLM provider in the **API Key** field.
+8. (Optional) Enter a **Base URL** for the selected model. This is useful if you want to use a proxy or if your LLM provider does not use the default base URL. If you selected _OpenAI Compatible_ as the provider, the base URL is required.
+9. (Optional) Use the **Provider settings** and **Model settings** fields to specify custom parameters as JSON. The JSON schema varies depending on the chosen provider and model:
+ * See the [Azure OpenAI](#azure-openai) and [Google Vertex AI](#google-vertex-ai) instructions for custom provider settings required by these providers.
+ * If you've selected OpenAI, use **Provider settings** for JSON you'd normally pass to the `createOpenAI` function, and **Model settings** for JSON you'd normally pass to the `streamText` function (see the example after this list). For more details, see the [OpenAI documentation](https://platform.openai.com/docs/api-reference).
+10. Click **Save** to apply your changes in the **Create AI Connection** view.
+11. (Optional) Back in the **OpenOps AI** section, if you're working with AWS and you want your AI connection to have access to AWS MCP servers, go to the **MCP** section and select an [AWS connection](/cloud-access/access-levels-permissions/#aws-connections) in the **AWS Cost** dropdown:
+ 
+ This enables access to AWS Cost Explorer MCP Server, AWS Pricing MCP Server, and AWS Billing and Cost Management MCP Server.
+
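+For example, if you've selected OpenAI, the **Provider settings** and **Model settings** fields might look like the following. The field names and values shown here are illustrative; the options actually available depend on your provider and on the SDK version OpenOps uses, so check the relevant documentation before relying on any specific field.
+
+**Provider settings** (JSON normally passed to `createOpenAI`):
+
+```json
+{
+  "organization": "org-your-organization-id",
+  "headers": { "X-Example-Header": "finops" }
+}
+```
+
+**Model settings** (JSON normally passed to `streamText`):
+
+```json
+{
+  "temperature": 0.2,
+  "topP": 0.9
+}
+```
+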
Configuring an LLM connection enables all [AI assistance features](/ai-assistance/overview) in OpenOps.
diff --git a/images/access-connection-from-workflow.png b/images/access-connection-from-workflow.png
index 7734b3d..846a3d9 100644
Binary files a/images/access-connection-from-workflow.png and b/images/access-connection-from-workflow.png differ
diff --git a/images/access-llm-ai-providers.png b/images/access-llm-ai-providers.png
index 30577f2..62ea487 100644
Binary files a/images/access-llm-ai-providers.png and b/images/access-llm-ai-providers.png differ
diff --git a/images/access-llm-ask-ai-action.png b/images/access-llm-ask-ai-action.png
index 05ff2a5..aa61513 100644
Binary files a/images/access-llm-ask-ai-action.png and b/images/access-llm-ask-ai-action.png differ
diff --git a/images/access-llm-generate-with-ai.png b/images/access-llm-generate-with-ai.png
index 4387c66..1c7d1fe 100644
Binary files a/images/access-llm-generate-with-ai.png and b/images/access-llm-generate-with-ai.png differ
diff --git a/images/access-llm-mcp.png b/images/access-llm-mcp.png
index 8656055..ebab2b7 100644
Binary files a/images/access-llm-mcp.png and b/images/access-llm-mcp.png differ
diff --git a/images/actions-high-risk.png b/images/actions-high-risk.png
new file mode 100644
index 0000000..6b293b5
Binary files /dev/null and b/images/actions-high-risk.png differ
diff --git a/images/cookbook/aws-ebs-get-snapshots.png b/images/cookbook/aws-ebs-get-snapshots.png
index 53bce53..72412d6 100644
Binary files a/images/cookbook/aws-ebs-get-snapshots.png and b/images/cookbook/aws-ebs-get-snapshots.png differ
diff --git a/images/cookbook/aws-get-account-ids-properties.png b/images/cookbook/aws-get-account-ids-properties.png
index d9401e6..4a162ff 100644
Binary files a/images/cookbook/aws-get-account-ids-properties.png and b/images/cookbook/aws-get-account-ids-properties.png differ
diff --git a/images/cookbook/azure-command-inside-loop.png b/images/cookbook/azure-command-inside-loop.png
index 5abdbca..aa4c81f 100644
Binary files a/images/cookbook/azure-command-inside-loop.png and b/images/cookbook/azure-command-inside-loop.png differ
diff --git a/images/cookbook/azure-list-subscriptions-properties.png b/images/cookbook/azure-list-subscriptions-properties.png
index cf2da47..cc59cc4 100644
Binary files a/images/cookbook/azure-list-subscriptions-properties.png and b/images/cookbook/azure-list-subscriptions-properties.png differ
diff --git a/images/cookbook/gcp-command-inside-loop.png b/images/cookbook/gcp-command-inside-loop.png
index 1d6fd0b..4f046ea 100644
Binary files a/images/cookbook/gcp-command-inside-loop.png and b/images/cookbook/gcp-command-inside-loop.png differ
diff --git a/images/cookbook/gcp-list-projects-properties.png b/images/cookbook/gcp-list-projects-properties.png
index b2df453..41f93f3 100644
Binary files a/images/cookbook/gcp-list-projects-properties.png and b/images/cookbook/gcp-list-projects-properties.png differ
diff --git a/images/parameters-dynamic-value.png b/images/parameters-dynamic-value.png
index d3f96f9..74bc9be 100644
Binary files a/images/parameters-dynamic-value.png and b/images/parameters-dynamic-value.png differ
diff --git a/images/qsg-aws-step-properties.png b/images/qsg-aws-step-properties.png
index 9da4a05..04472fc 100644
Binary files a/images/qsg-aws-step-properties.png and b/images/qsg-aws-step-properties.png differ
diff --git a/images/workflow-editor-action-mandatory-fields.png b/images/workflow-editor-action-mandatory-fields.png
index fe3631f..5160308 100644
Binary files a/images/workflow-editor-action-mandatory-fields.png and b/images/workflow-editor-action-mandatory-fields.png differ
diff --git a/images/workflow-editor-action-properties.png b/images/workflow-editor-action-properties.png
index 3f82c1c..55c1141 100644
Binary files a/images/workflow-editor-action-properties.png and b/images/workflow-editor-action-properties.png differ
diff --git a/workflow-management/actions.mdx b/workflow-management/actions.mdx
index 6f090a5..3a15b14 100644
--- a/workflow-management/actions.mdx
+++ b/workflow-management/actions.mdx
@@ -86,7 +86,6 @@ These actions implement [Human-in-the-Loop](/workflow-management/human-in-the-lo
* **Linear**: create and update issues and comments.
* **Zendesk**: create or update tickets, perform other Zendesk API calls.
-
## Make changes to cloud resources
These are integration actions that provide various ways to make and request changes to your cloud resources via cloud provider APIs, infrastructure-as-code (IaC) tools, or pull requests.
@@ -103,6 +102,9 @@ These are integration actions that provide various ways to make and request chan
* **GitHub**: retrieve file content, create pull requests, or trigger GitHub Actions runs.
* **Archera**: apply commitment plans for cloud providers.
+Since many of these actions can make destructive changes in your cloud environment, they are considered high-risk and are marked with a red shield icon in the [workflow editor](/workflow-management/building-workflows/):
+
+
## Interact with project management tools
These actions allow you to retrieve and update data in issues and boards in project management tools.
diff --git a/workflow-management/building-workflows.mdx b/workflow-management/building-workflows.mdx
index 3046b59..b5542df 100644
--- a/workflow-management/building-workflows.mdx
+++ b/workflow-management/building-workflows.mdx
@@ -155,8 +155,8 @@ Under **Step output**, you can click **Test Step** to test or retest the step in
Sometimes, the **Test Step** button may be grayed out, like this:
-Your step will also have a yellow warning icon next to it. If this happens, make sure you have filled in every mandatory input field in the step's properties pane. You may need to scroll through the properties pane to see all required fields:
-
+Your step will also have a yellow warning icon next to it. If this happens, make sure you have filled in every mandatory input field in the step's properties pane; you may need to scroll down to see all required fields.
+
Once you've filled in all mandatory fields, the **Test Step** button becomes active. Click it to run the test and view the output generated by the step: