Moves LLM connector guides to explore-analyze section #4224
base: main
Changes from all commits: f184f2a, 7b356e0, 9e22a95, 0cf914b, ea81bb4, ef3ab2d, d6a494c, 4be99ca
New file:

@@ -0,0 +1,50 @@

````markdown
---
mapped_pages:
  - https://www.elastic.co/guide/en/security/current/llm-connector-guides.html
  - https://www.elastic.co/guide/en/serverless/current/security-llm-connector-guides.html
applies_to:
  stack: all
  serverless:
    security: all
    observability: all
    elasticsearch: all
products:
  - id: observability
  - id: elasticsearch
  - id: security
  - id: cloud-serverless
---

# Enable large language model (LLM) access

Elastic uses large language model (LLM) connectors to power its [AI features](/explore-analyze/ai-features.md#ai-powered-features-in-elastic-sec). These features work with the out-of-the-box Elastic Managed LLM or with a third-party LLM connector that you configure.

## Elastic Managed LLM

:::{include} ../../../solutions/_snippets/elastic-managed-llm.md
:::

## Connect to a third-party LLM

Follow these guides to connect to one or more third-party LLM providers:

* [Azure OpenAI](/explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md)
* [Amazon Bedrock](/explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md)
* [OpenAI](/explore-analyze/ai-features/llm-guides/connect-to-openai.md)
* [Google Vertex](/explore-analyze/ai-features/llm-guides/connect-to-google-vertex.md)

## Preconfigured connectors

```{applies_to}
stack: ga
serverless: unavailable
```

You can also use [preconfigured connectors](kibana://reference/connectors-kibana/pre-configured-connectors.md) to set up a third-party LLM connector.

If you use a preconfigured connector as your LLM connector, we recommend adding the `exposeConfig: true` parameter to the `xpack.actions.preconfigured` section of the `kibana.yml` config file. This parameter makes debugging easier by adding configuration information to the debug logs, including which LLM the connector uses.
````
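As a reference for the `exposeConfig` recommendation above, a minimal `kibana.yml` sketch might look like the following. The connector id, display name, and OpenAI settings are hypothetical placeholders, and the exact fields depend on your provider and Kibana version; verify them against the preconfigured connectors reference before use.

```yaml
# kibana.yml — hypothetical preconfigured OpenAI connector (illustrative only)
xpack.actions.preconfigured:
  my-llm-connector:             # placeholder connector id
    name: Preconfigured OpenAI  # placeholder display name
    actionTypeId: .gen-ai       # OpenAI connector type
    exposeConfig: true          # surfaces connector config in debug logs
    config:
      apiProvider: OpenAI
      apiUrl: https://api.openai.com/v1/chat/completions
    secrets:
      apiKey: <your-api-key>    # placeholder; store securely
```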
Changed file:

```diff
@@ -1,6 +1,6 @@
-[Elastic Managed LLM](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) is the default large language model (LLM) connector available in the AI Assistant for eligible users. It provides immediate access to generative AI features without requiring any setup or external model integration.
+[Elastic Managed LLM](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) is the default large language model (LLM) connector available in {{kib}} for eligible users. It provides immediate access to generative AI features without requiring any setup or external model integration.
 
-The Elastic Managed LLM is available out-of-the box; no manual connector setup or API key management is required for initial use. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock if you prefer.
+Elastic Managed LLM is available out-of-the-box; it does not require manual connector setup or API key management. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock, if you prefer.
 
 To learn more about security and data privacy, refer to the [connector documentation](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) and [download the model card](https://raw.githubusercontent.com/elastic/kibana/refs/heads/main/docs/reference/resources/Elastic_Managed_LLM_model_card.pdf).
```

**Contributor:** I know this is a snippet, but I'm not sure the second sentence, "However, you can...", fits here. It feels a little confusing when we've already talked about how you can use the Elastic Managed LLM or third-party connectors in the intro paragraph. Just something to think about, but maybe we don't need to use the snippet here, or we can edit it in a way that works a little better.

**Contributor (author):** Yeah, I had a similar thought when I looked at this.

**Contributor:** Is "connector documentation" the right phrasing? Maybe "the Elastic Managed LLM documentation"? I know it's in the connector section, but I feel like we differentiate the Elastic Managed LLM from connectors.

**Contributor:** Do we want to maybe give a little more info on what a preconfigured connector is here so users know if it's something they should choose? These are only on-prem, right?