Moves LLM connector guides to explore-analyze section #4224
base: main
Conversation
Vale Linting Results
Summary: 1 warning, 10 suggestions found
⚠️ Warning (1)
| File | Line | Rule | Message |
|---|---|---|---|
| explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md | 42 | Elastic.MeaningfulCTAs | Use meaningful link text. Use 'visit, go to, refer to' instead of 'Click here'. |
💡 Suggestions (10)
| File | Line | Rule | Message |
|---|---|---|---|
| explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md | 14 | Elastic.Capitalization | 'Connect to Amazon Bedrock' should use sentence-style capitalization. |
| explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md | 27 | Elastic.Acronyms | 'IAM' has no definition. |
| explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md | 29 | Elastic.Acronyms | 'IAM' has no definition. |
| explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md | 64 | Elastic.Capitalization | 'Configure an IAM User' should use sentence-style capitalization. |
| explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md | 82 | Elastic.FutureTense | 'will authenticate' might be in future tense. Write in the present tense to describe the state of the product as it is now. |
| explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md | 97 | Elastic.Capitalization | 'Configure the Amazon Bedrock connector' should use sentence-style capitalization. |
| explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md | 74 | Elastic.Capitalization | 'Configure Elastic AI Assistant' should use sentence-style capitalization. |
| explore-analyze/ai-features/llm-guides/connect-to-google-vertex.md | 14 | Elastic.Capitalization | 'Connect to Google Vertex' should use sentence-style capitalization. |
| explore-analyze/ai-features/llm-guides/connect-to-openai.md | 24 | Elastic.Semicolons | Use semicolons judiciously. |
| solutions/_snippets/elastic-managed-llm.md | 3 | Elastic.Semicolons | Use semicolons judiciously. |
mdbirnstiehl
left a comment
Left a few comments/questions/suggestions.
As a general comment, I wonder if we might want to be more careful with giving instructions on how to configure things for software that's not ours. I see that OpenAI doesn't seem to have very good documentation (or I just can't find it), so it may make sense in those instances. Amazon and Microsoft tend to have pretty good documentation, but maybe users need to perform some additional steps to connect to our AI assistant. Either way, it's worth looking into, as it might be easier to rely on the other companies keeping their docs up to date versus us checking in on their UI to make sure things haven't changed.
We will also need to update some links from the obs AI docs. Currently we link to the connector docs, and these would be more helpful. I can take care of that next week.
serverless: unavailable

You can also use [preconfigured connectors](kibana://reference/connectors-kibana/pre-configured-connectors.md) to set up a third-party LLM connector.
Do we want to maybe give a little more info on what a preconfigured connector is here so users know if it's something they should choose? These are only on-prem, right?
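For context, a preconfigured connector is defined in Kibana's configuration (kibana.yml) rather than created through the UI. A minimal sketch of what a preconfigured OpenAI-type connector might look like, assuming the documented xpack.actions.preconfigured setting; the connector ID, name, and API key below are placeholders:

```yaml
# kibana.yml (sketch): a preconfigured OpenAI-type connector.
# The connector ID (my-open-ai), name, and apiKey are placeholders;
# verify the exact schema against the pre-configured connectors reference.
xpack.actions.preconfigured:
  my-open-ai:
    name: OpenAI (preconfigured)
    actionTypeId: .gen-ai
    config:
      apiProvider: OpenAI
      apiUrl: https://api.openai.com/v1/chat/completions
    secrets:
      apiKey: <your-openai-api-key>
```

Something along these lines could help readers decide whether a preconfigured connector fits their deployment.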
[Elastic Managed LLM](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) is the default large language model (LLM) connector available in {{kib}} for eligible users. It provides immediate access to generative AI features without requiring any setup or external model integration.

The Elastic Managed LLM is available out-of-the box; no manual connector setup or API key management is required for initial use. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock if you prefer.
Elastic Managed LLM is available out-of-the box; it does not require manual connector setup or API key management. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock if you prefer.
Suggested change:
Elastic Managed LLM is available out-of-the box; it does not require manual connector setup or API key management. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock if you prefer.
Elastic Managed LLM is available out-of-the box; you don't need to manually set up connectors or manage API keys. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock if you prefer.
I know this is a snippet, but I'm not sure the second sentence, "However, you can.." fits here. It feels a little confusing when we've already talked about how you can use the Elastic Managed LLM or third-party connectors in the intro paragraph. Just something to think about, but maybe we don't need to use the snippet here, or we can edit it in a way that works a little better.
Yeah, I had a similar thought when I looked at this.
To learn more about security and data privacy, refer to the [connector documentation](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) and [download the model card](https://raw.githubusercontent.com/elastic/kibana/refs/heads/main/docs/reference/resources/Elastic_Managed_LLM_model_card.pdf).
Is "connector documentation" the right phrasing? Maybe the "Elastic Managed LLM documentation?" I know it's in the connector section, but I feel like we differentiate the Elastic Managed LLM from connectors.
Co-authored-by: Mike Birnstiehl <114418652+mdbirnstiehl@users.noreply.github.com>
benironside
left a comment
Great review, Mike. Committed your simpler suggestions. Let's talk over the more in-depth ones next week. Thanks!
Summary
Moves the LLM connector guides from the security docs section to the new AI section within explore-analyze. This fixes elastic/docs-content-internal/issues/487 as part of elastic/docs-content-internal/issues/298. The purpose is to update the IA for these docs to be solution-agnostic, since they are helpful to users of any solution, not just security.
This PR:
Moves explore-analyze/ai-features.md to explore-analyze/ai-features/ai-features.md (this is just cleanup from our previous PR).
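Because the pages move, the old paths need redirects to the new locations. A hypothetical sketch of the kind of old-path to new-path mapping involved (the actual redirect file name and schema used by the docs build may differ; only the ai-features.md move above is taken from this PR):

```yaml
# Hypothetical illustration only; the real redirect file and schema in this repo may differ.
redirects:
  # Move described in this PR:
  'explore-analyze/ai-features.md': 'explore-analyze/ai-features/ai-features.md'
  # Each relocated LLM guide would get a similar entry; the old path here is a placeholder:
  '<old-path>/connect-to-openai.md': 'explore-analyze/ai-features/llm-guides/connect-to-openai.md'
```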
Generative AI disclosure
Tool(s) and model(s) used:
Copilot with GPT-4.1 to help with the redirects.