From 4059aa134c34443928bc3ed1454e9ac2736314c7 Mon Sep 17 00:00:00 2001
From: Silvio Vasiljevic
Date: Tue, 22 Oct 2024 18:18:37 +0200
Subject: [PATCH 1/3] Add Bedrock service documentation

---
 content/en/user-guide/aws/bedrock/index.md | 18 ++++++++++++++++++
 1 file changed, 18 insertions(+)
 create mode 100644 content/en/user-guide/aws/bedrock/index.md

diff --git a/content/en/user-guide/aws/bedrock/index.md b/content/en/user-guide/aws/bedrock/index.md
new file mode 100644
index 0000000000..8d011ea684
--- /dev/null
+++ b/content/en/user-guide/aws/bedrock/index.md
@@ -0,0 +1,18 @@
+---
+title: "Bedrock"
+linkTitle: "Bedrock"
+description: Use foundation models running on your device with LocalStack!
+tags: ["Enterprise image"]
+---
+
+## Introduction
+
+## Getting started
+
+## Resource Browser
+
+## Examples
+
+## Limitations
+
+Currently, GPU models are not supported by the LocalStack Bedrock implementation.

From 66ba59d8fddf59b4661cc1f4186018299b0e21cc Mon Sep 17 00:00:00 2001
From: Harsh Mishra
Date: Tue, 29 Oct 2024 17:56:14 +0530
Subject: [PATCH 2/3] add docs fully

---
 content/en/user-guide/aws/bedrock/index.md | 67 +++++++++++++++++++++-
 1 file changed, 64 insertions(+), 3 deletions(-)

diff --git a/content/en/user-guide/aws/bedrock/index.md b/content/en/user-guide/aws/bedrock/index.md
index 8d011ea684..ef642e157c 100644
--- a/content/en/user-guide/aws/bedrock/index.md
+++ b/content/en/user-guide/aws/bedrock/index.md
@@ -7,12 +7,73 @@
 
 ## Introduction
 
+Bedrock is a fully managed service provided by Amazon Web Services (AWS) that makes foundation models from various LLM providers accessible via an API.
+LocalStack allows you to use the Bedrock APIs to test and develop AI-powered applications in your local environment.
+The supported APIs are available on our [API Coverage Page](https://docs.localstack.cloud/references/coverage/coverage_bedrock/), which provides information on the extent of Bedrock's integration with LocalStack.
+
 ## Getting started
 
-## Resource Browser
+This guide is designed for users new to AWS Bedrock and assumes basic knowledge of the AWS CLI and our `awslocal` wrapper script.
+
+Start your LocalStack container using your preferred method, setting the `LOCALSTACK_ENABLE_BEDROCK=1` configuration variable.
+We will demonstrate how to use Bedrock by following these steps:
+
+1. Listing available foundation models
+2. Invoking a model for inference
+3. Using the conversation API
+
+### List available foundation models
+
+You can view all available foundation models using the [`ListFoundationModels`](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_ListFoundationModels.html) API.
+This will show you which models are available for use in your local environment.
+
+Run the following command:
+
+{{< command >}}
+$ awslocal bedrock list-foundation-models
+{{< / command >}}
+
+### Invoke a model
+
+You can use the [`InvokeModel`](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html) API to send requests to a specific model.
+In this example, we'll use the Llama 3 model to process a simple prompt.
+
+Run the following command:
+
+{{< command >}}
+$ awslocal bedrock-runtime invoke-model \
+    --model-id "meta.llama3-8b-instruct-v1:0" \
+    --body '{
+        "prompt": "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\nSay Hello!\n<|eot_id|>\n<|start_header_id|>assistant<|end_header_id|>",
+        "max_gen_len": 2,
+        "temperature": 0.9
+    }' --cli-binary-format raw-in-base64-out outfile.txt
+{{< / command >}}
+
+The output will be available in `outfile.txt`.
+
+### Use the conversation API
+
+Bedrock provides a higher-level conversation API, [`Converse`](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_Converse.html), that makes it easier to maintain context in a chat-like interaction.
+You can specify both system prompts and user messages.
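The system-prompt and user-message payloads described above can be tricky to hand-quote in a shell; a small sketch of building them programmatically, assuming only the Python standard library (the structures mirror the Bedrock Converse message shapes):

```python
import json

# Build the Converse payloads as plain Python structures instead of
# escaping JSON by hand in the shell.
messages = [
    {"role": "user", "content": [{"text": "Say Hello!"}]},
]
system = [
    {"text": "You're a chatbot that can only say 'Hello!'"},
]

# Serialize for use with `--messages` / `--system`, or pass the lists
# directly to an SDK's converse() call.
messages_json = json.dumps(messages)
system_json = json.dumps(system)
print(messages_json)
print(system_json)
```

Generating the JSON this way sidesteps the `'\''` shell-escaping needed for the embedded single quotes.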
+
+Run the following command:
 
-## Examples
+{{< command >}}
+$ awslocal bedrock-runtime converse \
+    --model-id "meta.llama3-8b-instruct-v1:0" \
+    --messages '[{
+        "role": "user",
+        "content": [{
+            "text": "Say Hello!"
+        }]
+    }]' \
+    --system '[{
+        "text": "You'\''re a chatbot that can only say '\''Hello!'\''"
+    }]'
+{{< / command >}}
 
 ## Limitations
 
-Currently, GPU models are not supported by the LocalStack Bedrock implementation.
+* The LocalStack Bedrock implementation is mock-only and does not run any LLM locally.
+* Currently, GPU models are not supported by the LocalStack Bedrock implementation.

From 4e0b79f91a84ef980eee477e3328a505003c97a7 Mon Sep 17 00:00:00 2001
From: Harsh Mishra
Date: Tue, 29 Oct 2024 17:59:03 +0530
Subject: [PATCH 3/3] add config docs

---
 content/en/references/configuration.md | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/content/en/references/configuration.md b/content/en/references/configuration.md
index 3184285877..f420e4590c 100644
--- a/content/en/references/configuration.md
+++ b/content/en/references/configuration.md
@@ -90,6 +90,12 @@ This section covers configuration options that are specific to certain AWS services.
 | - | - | - |
 | `BATCH_DOCKER_FLAGS` | `-e TEST_ENV=1337` | Additional flags provided to the batch container. Same restrictions as `LAMBDA_DOCKER_FLAGS`. |
 
+### Bedrock
+
+| Variable | Example Values | Description |
+| - | - | - |
+| `LOCALSTACK_ENABLE_BEDROCK` | `1` | Enable the Bedrock provider |
+
 ### BigData (EMR, Athena, Glue)
 
 | Variable | Example Values | Description |
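Flag-style variables like `LOCALSTACK_ENABLE_BEDROCK` above are read from the container environment. A minimal sketch of such a feature-flag check, assuming common truthy spellings; this is illustrative only and is not LocalStack's actual parsing logic:

```python
import os

def bedrock_enabled(env=None):
    # Illustrative only -- NOT LocalStack's real implementation.
    # Treats the common truthy spellings ("1", "true") as enabled.
    env = os.environ if env is None else env
    return str(env.get("LOCALSTACK_ENABLE_BEDROCK", "")).strip().lower() in {"1", "true"}

print(bedrock_enabled({"LOCALSTACK_ENABLE_BEDROCK": "1"}))  # True
print(bedrock_enabled({}))                                  # False
```

In practice you would simply pass `-e LOCALSTACK_ENABLE_BEDROCK=1` (or the equivalent for your launch method) when starting the container.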