
Conversation

@silv-io (Member) commented Nov 14, 2024

This PR updates the documentation to cover the new LLM-enabled functionality of Bedrock and the new configuration variables.

github-actions bot commented Nov 14, 2024

🎊 PR Preview has been successfully built and deployed to https://localstack-docs-preview-pr-1554.surge.sh 🎊

@silv-io changed the base branch from main to release/v4 on November 14, 2024 at 15:37
@silv-io force-pushed the update-bedrock-v4-docs branch 2 times, most recently from e59db2b to 5407f1e on November 14, 2024 at 15:45
@silv-io requested a review from dfangl on November 14, 2024 at 15:46
@silv-io force-pushed the update-bedrock-v4-docs branch from 5407f1e to c65d2c6 on November 14, 2024 at 15:48
@silv-io marked this pull request as ready for review on November 14, 2024 at 16:06
@dfangl (Member) left a comment

LGTM, thanks for the fast turnaround!

## Limitations

* LocalStack Bedrock implementation is mock-only and does not run any LLM model locally.
* LocalStack Bedrock currently only officially supports text-based models.
@dfangl (Member) commented

What does "currently only officially" mean exactly?

@silv-io (Member, Author) commented Nov 14, 2024

It means that, at this point in time, we only officially support text-based models, as opposed to image models or other kinds of binary-data models.

Do you have an alternative way to read this sentence? I can clarify based on that :D

@dfangl (Member) replied

My question would be: what kind of models are unofficially supported? 😅 It's fine for me now; this was just the first question that popped into my head. For me they are either supported or not, but this phrasing leads me to believe other models might be unofficially supported :P

@silv-io (Member, Author) replied

Unofficially, we support any model that can run in Ollama, but not every single one is tested. The invoke-model endpoint accepts binary data, so you could theoretically run non-text-based models and probably use them without any issue; we just haven't tested that yet :)
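Purely as an illustration of the text-based case discussed above, a minimal boto3 sketch against a locally running LocalStack might look like the following. The model ID, prompt shape, and test credentials are assumptions for the example, not taken from this PR or the docs change under review:

```python
import json

import boto3

# Assumptions: LocalStack is running on the default edge port 4566 and the
# Bedrock provider is enabled; the model ID and prompt shape below are
# illustrative only.
client = boto3.client(
    "bedrock-runtime",
    endpoint_url="http://localhost:4566",
    region_name="us-east-1",
    aws_access_key_id="test",
    aws_secret_access_key="test",
)

# InvokeModel takes an opaque binary body; here it carries a JSON text
# prompt, which is the officially supported case.
response = client.invoke_model(
    modelId="meta.llama3-8b-instruct-v1:0",
    contentType="application/json",
    body=json.dumps({"prompt": "Say hello from LocalStack Bedrock"}),
)

print(json.loads(response["body"].read()))
```

Because the request body is just bytes, the same call could in principle carry image or other binary payloads, which is what "unofficially supported" refers to here.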

@joe4dev mentioned this pull request on Nov 14, 2024
@silv-io merged commit 14c0779 into release/v4 on Nov 18, 2024
4 checks passed
@silv-io deleted the update-bedrock-v4-docs branch on November 18, 2024 at 08:50