
Conversation


@benironside benironside commented Dec 5, 2025

Summary

Moves the LLM connector guides from the security docs section to the new AI section within explore-analyze. This fixes elastic/docs-content-internal/issues/487 as part of elastic/docs-content-internal/issues/298. The purpose is to update the information architecture (IA) for these docs to be solution-agnostic, since they are helpful to users of any solution, not just security.

This PR:

  • Moves the four connector-specific guides into a new sub-section of the recently created AI section.
  • Creates a landing page for the new subsection.
  • Moves explore-analyze/ai-features.md to explore-analyze/ai-features/ai-features.md (this is just cleanup from our previous PR).
  • Updates links to the LLM connector guides and creates corresponding redirects.
  • Moves the mapped_pages frontmatter from the security solution page that previously served as the landing page for the connector guides to the new landing page within explore-analyze. (The security page should still exist, since some security-specific guides were not moved.) I'd appreciate a check on my reasoning here, but my thinking is that we should generally send users to the new landing page.
  • Updates the security LLM connectors landing page introduction.
  • Updates Elastic Managed LLM snippet.
  • Fixes a few Vale-identified sentence-level issues in the Amazon Bedrock guide.
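For readers unfamiliar with how moves like the ai-features.md one above are handled, a redirect entry might look roughly like the following. This is a hedged sketch only: the paths come from this PR, but the redirect file name and exact schema are assumptions about the docs build tooling, not something this PR shows.

```yaml
# Hypothetical redirect entry (file name and schema assumed):
# maps the old page location to its new home so existing links keep working.
redirects:
  'explore-analyze/ai-features.md': 'explore-analyze/ai-features/ai-features.md'
```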

Generative AI disclosure

  1. Did you use a generative AI (GenAI) tool to assist in creating this contribution?
  • [x] Yes
  • [ ] No

Tool(s) and model(s) used:
Copilot with GPT-4.1, to help with the redirects.

@github-actions
github-actions bot commented Dec 5, 2025

Vale Linting Results

Summary: 1 warning, 10 suggestions found

⚠️ Warnings (1)

| File | Line | Rule | Message |
| --- | --- | --- | --- |
| explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md | 42 | Elastic.MeaningfulCTAs | Use meaningful link text. Use 'visit, go to, refer to' instead of 'Click here'. |

💡 Suggestions (10)

| File | Line | Rule | Message |
| --- | --- | --- | --- |
| explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md | 14 | Elastic.Capitalization | 'Connect to Amazon Bedrock' should use sentence-style capitalization. |
| explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md | 27 | Elastic.Acronyms | 'IAM' has no definition. |
| explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md | 29 | Elastic.Acronyms | 'IAM' has no definition. |
| explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md | 64 | Elastic.Capitalization | 'Configure an IAM User' should use sentence-style capitalization. |
| explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md | 82 | Elastic.FutureTense | 'will authenticate' might be in future tense. Write in the present tense to describe the state of the product as it is now. |
| explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md | 97 | Elastic.Capitalization | 'Configure the Amazon Bedrock connector' should use sentence-style capitalization. |
| explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md | 74 | Elastic.Capitalization | 'Configure Elastic AI Assistant' should use sentence-style capitalization. |
| explore-analyze/ai-features/llm-guides/connect-to-google-vertex.md | 14 | Elastic.Capitalization | 'Connect to Google Vertex' should use sentence-style capitalization. |
| explore-analyze/ai-features/llm-guides/connect-to-openai.md | 24 | Elastic.Semicolons | Use semicolons judiciously. |
| solutions/_snippets/elastic-managed-llm.md | 3 | Elastic.Semicolons | Use semicolons judiciously. |


@mdbirnstiehl mdbirnstiehl left a comment


Left a few comments/questions/suggestions.

As a general comment, I wonder if we should be more careful about giving instructions on how to configure software that's not ours. OpenAI doesn't seem to have very good documentation (or I just can't find it), so it may make sense in those instances. Amazon and Microsoft tend to have pretty good documentation, but maybe users need to perform some additional steps to connect to our AI Assistant. Either way, it's worth looking into: it might be easier to rely on the other companies keeping their docs up to date than for us to keep checking their UIs to make sure things haven't changed.

We will also need to update some links from the obs AI docs. Currently we link to the connector docs, and these would be more helpful. I can take care of that next week.

```
serverless: unavailable
```

You can also use [preconfigured connectors](kibana://reference/connectors-kibana/pre-configured-connectors.md) to set up a third-party LLM connector.
Contributor


Do we want to maybe give a little more info on what a preconfigured connector is here so users know if it's something they should choose? These are only on-prem, right?
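To ground this question: preconfigured connectors are defined in Kibana's configuration file rather than created in the UI, which is why they're mostly relevant to deployments where you control kibana.yml. A minimal sketch using the OpenAI connector type; the connector ID, display name, and values below are hypothetical placeholders, not from this PR:

```yaml
# kibana.yml: hypothetical preconfigured OpenAI connector.
# Defined in config rather than the UI, so it exists on startup
# and needs no per-user setup.
xpack.actions.preconfigured:
  my-openai-connector:              # hypothetical connector ID
    name: OpenAI (preconfigured)    # display name shown in Kibana
    actionTypeId: .gen-ai           # OpenAI connector type
    config:
      apiProvider: OpenAI
      apiUrl: https://api.openai.com/v1/chat/completions
    secrets:
      apiKey: <your-api-key>        # placeholder; keep out of source control
```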

[Elastic Managed LLM](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) is the default large language model (LLM) connector available in {{kib}} for eligible users. It provides immediate access to generative AI features without requiring any setup or external model integration.

The Elastic Managed LLM is available out of the box; no manual connector setup or API key management is required for initial use. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock, if you prefer.
Elastic Managed LLM is available out of the box; it does not require manual connector setup or API key management. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock, if you prefer.
Contributor


Suggested change
Elastic Managed LLM is available out of the box; it does not require manual connector setup or API key management. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock, if you prefer.
Elastic Managed LLM is available out of the box; you don't need to manually set up connectors or manage API keys. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock, if you prefer.

I know this is a snippet, but I'm not sure the second sentence, "However, you can...", fits here. It feels a little confusing when we've already talked about how you can use the Elastic Managed LLM or third-party connectors in the intro paragraph. Just something to think about, but maybe we don't need to use the snippet here, or we can edit it in a way that works a little better.
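For context on why editing the snippet in place is tricky: a snippet like solutions/_snippets/elastic-managed-llm.md is presumably pulled into each page with an include directive, so any wording change propagates to every page that includes it. A hypothetical sketch of such an include (the directive syntax is an assumption about the docs tooling):

```markdown
:::{include} /solutions/_snippets/elastic-managed-llm.md
:::
```

That shared nature is why dropping the snippet on this one page, as suggested above, can be simpler than rewording it for all consumers.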

Contributor Author


Yeah, I had a similar thought when I looked at this.

The Elastic Managed LLM is available out of the box; no manual connector setup or API key management is required for initial use. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock, if you prefer.
Elastic Managed LLM is available out of the box; it does not require manual connector setup or API key management. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock, if you prefer.

To learn more about security and data privacy, refer to the [connector documentation](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) and [download the model card](https://raw.githubusercontent.com/elastic/kibana/refs/heads/main/docs/reference/resources/Elastic_Managed_LLM_model_card.pdf).
Contributor


Is "connector documentation" the right phrasing? Maybe "the Elastic Managed LLM documentation"? I know it's in the connector section, but I feel like we differentiate the Elastic Managed LLM from connectors.

Co-authored-by: Mike Birnstiehl <114418652+mdbirnstiehl@users.noreply.github.com>

@benironside benironside left a comment


Great review, Mike. Committed your simpler suggestions. Let's talk over the more in-depth ones next week. Thanks!

