explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md
@@ -11,7 +11,7 @@
- id: cloud-serverless
---

# Connect to Amazon Bedrock

Check notice (GitHub Actions / vale) on line 14 in explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md: Elastic.Capitalization: 'Connect to Amazon Bedrock' should use sentence-style capitalization.

This page provides step-by-step instructions for setting up an Amazon Bedrock connector for the first time. This connector type enables you to leverage large language models (LLMs) within {{kib}}. You’ll first need to configure AWS, then configure the connector in {{kib}}.

@@ -24,9 +24,9 @@
## Configure AWS [_configure_aws]


### Configure an IAM policy [_configure_an_iam_policy]

Check notice (GitHub Actions / vale) on line 27 in explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md: Elastic.Acronyms: 'IAM' has no definition.

First, configure an IAM policy with the necessary permissions:

Check notice (GitHub Actions / vale) on line 29 in explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md: Elastic.Acronyms: 'IAM' has no definition.

1. Log into the AWS console and search for Identity and Access Management (IAM).
2. From the **IAM** menu, select **Policies** → **Create policy**.
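
The policy JSON itself is collapsed in this diff, so as a rough sketch only (not the exact policy from the page), a minimal IAM policy that allows invoking Bedrock models might look like the following. In production, scope `Resource` to specific model ARNs rather than `*`:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```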
@@ -61,9 +61,9 @@



### Configure an IAM user [_configure_an_iam_user]

Check notice (GitHub Actions / vale) on line 64 in explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md: Elastic.Capitalization: 'Configure an IAM User' should use sentence-style capitalization.

- Next, assign the policy you just created to a new user:
+ Next, assign the policy you created to a new user:

1. Return to the **IAM** menu. Select **Users** from the navigation menu, then click **Create User**.
2. Name the user, then click **Next**.
@@ -79,10 +79,10 @@

### Create an access key [_create_an_access_key]

Create the access keys that authenticate your Elastic connector:

Check notice (GitHub Actions / vale) on line 82 in explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md: Elastic.FutureTense: 'will authenticate' might be in future tense. Write in the present tense to describe the state of the product as it is now.

1. Return to the **IAM** menu. Select **Users** from the navigation menu.
- 2. Search for the user you just created, and click its name.
+ 2. Search for the user you created, and click its name.
3. Go to the **Security credentials** tab.
4. Under **Access keys**, click **Create access key**.
5. Select **Third-party service**, check the box under **Confirmation**, click **Next**, then click **Create access key**.
@@ -94,7 +94,7 @@



## Configure the Amazon Bedrock connector [_configure_the_amazon_bedrock_connector]

Check notice (GitHub Actions / vale) on line 97 in explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md: Elastic.Capitalization: 'Configure the Amazon Bedrock connector' should use sentence-style capitalization.

Finally, configure the connector in {{kib}}:

@@ -102,7 +102,7 @@
2. Find the **Connectors** page in the navigation menu or use the [global search field](/explore-analyze/find-and-organize/find-apps-and-objects.md). Then click **Create Connector**, and select **Amazon Bedrock**.
3. Name your connector.
4. (Optional) Configure the Amazon Bedrock connector to use a different AWS region where Anthropic models are supported by editing the **URL** field, for example by changing `us-east-1` to `eu-central-1`.
- 5. (Optional) Add one of the following strings if you want to use a model other than the default. Note that these model IDs should have a prefix of `us.` or `eu.`, depending on your region, for example `us.anthropic.claude-3-5-sonnet-20240620-v1:0` or `eu.anthropic.claude-3-5-sonnet-20240620-v1:0`.
+ 5. (Optional) Add one of the following strings if you want to use a model other than the default. These model IDs should have a prefix of `us.` or `eu.`, depending on your region, for example `us.anthropic.claude-3-5-sonnet-20240620-v1:0` or `eu.anthropic.claude-3-5-sonnet-20240620-v1:0`.

* Sonnet 3.5: `us.anthropic.claude-3-5-sonnet-20240620-v1:0` or `eu.anthropic.claude-3-5-sonnet-20240620-v1:0`
* Sonnet 3.5 v2: `us.anthropic.claude-3-5-sonnet-20241022-v2:0` or `eu.anthropic.claude-3-5-sonnet-20241022-v2:0`
File renamed without changes.
File renamed without changes.
File renamed without changes.
50 changes: 50 additions & 0 deletions explore-analyze/ai-features/llm-guides/llm-connectors.md
@@ -0,0 +1,50 @@
---
mapped_pages:
- https://www.elastic.co/guide/en/security/current/llm-connector-guides.html
- https://www.elastic.co/guide/en/serverless/current/security-llm-connector-guides.html
applies_to:
stack: all
serverless:
security: all
observability: all
elasticsearch: all
products:
- id: observability
- id: elasticsearch
- id: security
- id: cloud-serverless
---

# Enable large language model (LLM) access

Elastic uses large language model (LLM) connectors to power its [AI features](/explore-analyze/ai-features.md#ai-powered-features-in-elastic-sec). These features work with the out-of-the-box Elastic Managed LLM or with a third-party LLM connector that you configure.

## Elastic Managed LLM

:::{include} ../../../solutions/_snippets/elastic-managed-llm.md
:::

## Connect to a third-party LLM

Follow these guides to connect to one or more third-party LLM providers:

* [Azure OpenAI](/explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md)
* [Amazon Bedrock](/explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md)
* [OpenAI](/explore-analyze/ai-features/llm-guides/connect-to-openai.md)
* [Google Vertex](/explore-analyze/ai-features/llm-guides/connect-to-google-vertex.md)

## Preconfigured connectors

```{applies_to}
stack: ga
serverless: unavailable
```

You can also use [preconfigured connectors](kibana://reference/connectors-kibana/pre-configured-connectors.md) to set up a third-party LLM connector.
Contributor: Do we want to maybe give a little more info on what a preconfigured connector is here so users know if it's something they should choose? These are only on-prem, right?


If you use a preconfigured connector for your LLM connector, we recommend adding the `exposeConfig: true` parameter to the `xpack.actions.preconfigured` section of the `kibana.yml` config file. This parameter makes debugging easier by adding configuration information to the debug logs, including which LLM the connector uses.
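
For illustration only — the connector ID, name, URL, model, and secret values below are placeholders, and the exact keys should be checked against the Kibana preconfigured-connector reference — a preconfigured Amazon Bedrock connector with `exposeConfig` enabled might look like this in `kibana.yml`:

```yaml
xpack.actions.preconfigured:
  my-bedrock-llm:                      # placeholder connector ID
    name: Preconfigured Amazon Bedrock
    actionTypeId: .bedrock
    exposeConfig: true                 # adds connector config details to debug logs
    config:
      apiUrl: https://bedrock-runtime.us-east-1.amazonaws.com
      defaultModel: us.anthropic.claude-3-5-sonnet-20240620-v1:0
    secrets:
      accessKey: <AWS access key ID>
      secret: <AWS secret access key>
```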





6 changes: 6 additions & 0 deletions explore-analyze/toc.yml
@@ -170,6 +170,12 @@ toc:
children:
- file: ai-features/ai-assistant.md
- file: ai-features/manage-access-to-ai-assistant.md
- file: ai-features/llm-guides/llm-connectors.md
children:
- file: ai-features/llm-guides/connect-to-azure-openai.md
- file: ai-features/llm-guides/connect-to-amazon-bedrock.md
- file: ai-features/llm-guides/connect-to-openai.md
- file: ai-features/llm-guides/connect-to-google-vertex.md
- file: discover.md
children:
- file: discover/discover-get-started.md
Expand Down
7 changes: 7 additions & 0 deletions redirects.yml
@@ -600,3 +600,10 @@ redirects:

# Related to https://github.com/elastic/docs-content/pull/3808
'solutions/observability/get-started/other-tutorials/add-data-from-splunk.md': 'solutions/observability/get-started.md'


# Related to https://github.com/elastic/docs-content/pull/4224
'solutions/security/ai/connect-to-amazon-bedrock.md': 'explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md'
'solutions/security/ai/connect-to-azure-openai.md': 'explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md'
'solutions/security/ai/connect-to-google-vertex.md': 'explore-analyze/ai-features/llm-guides/connect-to-google-vertex.md'
'solutions/security/ai/connect-to-openai.md': 'explore-analyze/ai-features/llm-guides/connect-to-openai.md'
4 changes: 2 additions & 2 deletions solutions/_snippets/elastic-managed-llm.md
@@ -1,6 +1,6 @@
- [Elastic Managed LLM](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) is the default large language model (LLM) connector available in the AI Assistant for eligible users. It provides immediate access to generative AI features without requiring any setup or external model integration.
+ [Elastic Managed LLM](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) is the default large language model (LLM) connector available in {{kib}} for eligible users. It provides immediate access to generative AI features without requiring any setup or external model integration.

- The Elastic Managed LLM is available out-of-the box; no manual connector setup or API key management is required for initial use. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock if you prefer.
+ Elastic Managed LLM is available out of the box; it does not require manual connector setup or API key management. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock, if you prefer.

Check notice (GitHub Actions / vale) on line 3 in solutions/_snippets/elastic-managed-llm.md: Elastic.Semicolons: Use semicolons judiciously.
Contributor suggested change:
- Elastic Managed LLM is available out-of-the box; it does not require manual connector setup or API key management. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock if you prefer.
+ Elastic Managed LLM is available out-of-the box; you don't need to manually set up connectors or manage API keys. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock if you prefer.

I know this is a snippet, but I'm not sure the second sentence, "However, you can.." fits here. It feels a little confusing when we've already talked about how you can use the Elastic Managed LLM or third-party connectors in the intro paragraph. Just something to think about, but maybe we don't need to use the snippet here, or we can edit it in a way that works a little better.

Author: Yeah, I had a similar thought when I looked at this.


To learn more about security and data privacy, refer to the [connector documentation](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) and [download the model card](https://raw.githubusercontent.com/elastic/kibana/refs/heads/main/docs/reference/resources/Elastic_Managed_LLM_model_card.pdf).
Contributor: Is "connector documentation" the right phrasing? Maybe "the Elastic Managed LLM documentation"? I know it's in the connector section, but I feel like we differentiate the Elastic Managed LLM from connectors.


@@ -1,7 +1,4 @@
---
mapped_pages:
- https://www.elastic.co/guide/en/security/current/llm-connector-guides.html
- https://www.elastic.co/guide/en/serverless/current/security-llm-connector-guides.html
applies_to:
stack: all
serverless:
@@ -13,7 +10,7 @@ products:

# Enable large language model (LLM) access

- {{elastic-sec}} uses large language models (LLMs) for some of its advanced analytics features. To enable these features, you can connect a third-party LLM provider or a custom local LLM.
+ {{elastic-sec}} uses large language model (LLM) connectors to power its [AI features](/explore-analyze/ai-features.md#ai-powered-features-in-elastic-sec). To use these features, you can use the Elastic Managed LLM, configure a third-party LLM connector, or connect a custom local LLM.

:::{important}
Different LLMs have varying performance when used to power different features and use-cases. For more information about how various models perform on different tasks in {{elastic-sec}}, refer to the [Large language model performance matrix](/solutions/security/ai/large-language-model-performance-matrix.md).
@@ -28,10 +25,10 @@

Follow these guides to connect to one or more third-party LLM providers:

- * [Azure OpenAI](/solutions/security/ai/connect-to-azure-openai.md)
- * [Amazon Bedrock](/solutions/security/ai/connect-to-amazon-bedrock.md)
- * [OpenAI](/solutions/security/ai/connect-to-openai.md)
- * [Google Vertex](/solutions/security/ai/connect-to-google-vertex.md)
+ * [Azure OpenAI](/explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md)
+ * [Amazon Bedrock](/explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md)
+ * [OpenAI](/explore-analyze/ai-features/llm-guides/connect-to-openai.md)
+ * [Google Vertex](/explore-analyze/ai-features/llm-guides/connect-to-google-vertex.md)

## Connect to a self-managed LLM

4 changes: 0 additions & 4 deletions solutions/toc.yml
@@ -1,3 +1,3 @@
project: "Solutions and use cases"
toc:
- file: index.md
@@ -568,10 +568,6 @@
- file: security/ai/set-up-connectors-for-large-language-models-llm.md
children:
- file: security/ai/large-language-model-performance-matrix.md
- - file: security/ai/connect-to-azure-openai.md
- - file: security/ai/connect-to-amazon-bedrock.md
- - file: security/ai/connect-to-openai.md
- - file: security/ai/connect-to-google-vertex.md
- file: security/ai/connect-to-own-local-llm.md
- file: security/ai/connect-to-vLLM.md
- file: security/ai/use-cases.md