
Commit 0be4bd8

docs(examples): complete large-messages example implementation
Signed-off-by: Santiago <sasanchezramirez@gmail.com>
1 parent 9b2e860 commit 0be4bd8

File tree: 14 files changed (+116, -2079 lines)
Lines changed: 18 additions & 68 deletions
@@ -1,77 +1,27 @@
-# Powertools for AWS Lambda (Java) - Kafka Example
+# Powertools for AWS Lambda (Java) - Large Messages Example
 
-This project demonstrates how to use Powertools for AWS Lambda (Java) to deserialize Kafka Lambda events directly into strongly typed Kafka ConsumerRecords<K, V> using different serialization formats.
+This project contains an example of a Lambda function using the **Large Messages** module of Powertools for AWS Lambda (Java). For more information on this module, please refer to the [documentation](https://docs.powertools.aws.dev/lambda-java/utilities/large_messages/).
 
-## Overview
+The example demonstrates an SQS listener that processes messages using the `LargeMessages` functional utility. It handles the retrieval of large payloads offloaded to S3 automatically.
 
-The example showcases automatic deserialization of Kafka Lambda events into ConsumerRecords using three formats:
-- JSON - Using standard JSON serialization
-- Avro - Using Apache Avro schema-based serialization
-- Protobuf - Using Google Protocol Buffers serialization
+## Deploy the sample application
 
-Each format has its own Lambda function handler that demonstrates how to use the `@Deserialization` annotation with the appropriate `DeserializationType`, eliminating the need to handle complex deserialization logic manually.
+This sample is based on Serverless Application Model (SAM). To deploy it, check out the instructions for getting
+started with SAM in [the examples directory](../README.md).
 
-## Build and Deploy
+## Test the application
 
-### Prerequisites
-- [AWS SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html)
-- Java 11+
-- Maven
+Since this function is triggered by an SQS Queue, you can test it by sending a message to the queue created by the SAM template.
 
-### Build
+1. **Find your Queue URL:**
+   Run the following command (replacing `LargeMessageExample` with the name of your deployed stack):
+   ```bash
+   aws cloudformation describe-stacks --stack-name LargeMessageExample --query "Stacks[0].Outputs[?OutputKey=='QueueURL'].OutputValue" --output text
 
-```bash
-# Build the application
-sam build
-```
+2. **Send a Test Message:**
+   Note: To test the actual "Large Message" functionality (payload offloading), you would typically use the SQS Extended Client in a producer application. However, you can verify the Lambda trigger with a standard message:
+   ```bash
+   aws sqs send-message --queue-url [YOUR_QUEUE_URL] --message-body '{"message": "Hello from CLI"}'
 
-### Deploy
-
-```bash
-# Deploy the application to AWS
-sam deploy --guided
-```
-
-During the guided deployment, you'll be prompted to provide values for required parameters. After deployment, SAM will output the ARNs of the deployed Lambda functions.
-
-### Build with Different Serialization Formats
-
-The project includes Maven profiles to build with different serialization formats:
-
-```bash
-# Build with JSON only (no Avro or Protobuf)
-mvn clean package -P base
-
-# Build with Avro only
-mvn clean package -P avro-only
-
-# Build with Protobuf only
-mvn clean package -P protobuf-only
-
-# Build with all formats (default)
-mvn clean package -P full
-```
-
-## Testing
-
-The `events` directory contains sample events for each serialization format:
-- `kafka-json-event.json` - Sample event with JSON-serialized products
-- `kafka-avro-event.json` - Sample event with Avro-serialized products
-- `kafka-protobuf-event.json` - Sample event with Protobuf-serialized products
-
-You can use these events to test the Lambda functions:
-
-```bash
-# Test the JSON deserialization function
-sam local invoke JsonDeserializationFunction --event events/kafka-json-event.json
-
-# Test the Avro deserialization function
-sam local invoke AvroDeserializationFunction --event events/kafka-avro-event.json
-
-# Test the Protobuf deserialization function
-sam local invoke ProtobufDeserializationFunction --event events/kafka-protobuf-event.json
-```
-
-## Sample Generator Tool
-
-The project includes a tool to generate sample JSON, Avro, and Protobuf serialized data. See the [tools/README.md](tools/README.md) for more information.
+3. **Verify Logs:**
+   Go to AWS CloudWatch Logs and check the Log Group for your function. You should see the processed message logged by the application.
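
As the new README's second step points out, exercising the actual payload-offloading path requires a producer that uses the SQS Extended Client declared in the POM below. The following is only a rough sketch of such a producer, not part of this commit; it assumes the 2.x extended client built on the AWS SDK for Java v2, and the bucket name and queue URL are placeholders to replace with the values from your stack.

```java
import com.amazon.sqs.javamessaging.AmazonSQSExtendedClient;
import com.amazon.sqs.javamessaging.ExtendedClientConfiguration;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.SendMessageRequest;

public class LargeMessageProducer {

    public static void main(String[] args) {
        S3Client s3 = S3Client.create();
        SqsClient sqs = SqsClient.create();

        // Bodies above the SQS size limit are written to this bucket and replaced by a
        // small pointer payload ("my-large-message-bucket" is a placeholder name).
        ExtendedClientConfiguration config = new ExtendedClientConfiguration()
                .withPayloadSupportEnabled(s3, "my-large-message-bucket");

        try (SqsClient extendedSqs = new AmazonSQSExtendedClient(sqs, config)) {
            String largeBody = "x".repeat(300 * 1024); // ~300 KB, forces the S3 offload

            extendedSqs.sendMessage(SendMessageRequest.builder()
                    .queueUrl("[YOUR_QUEUE_URL]") // placeholder, see step 1 above
                    .messageBody(largeBody)
                    .build());
        }
    }
}
```

The Lambda in this example then retrieves the offloaded body from S3 transparently before the handler code sees it.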
Lines changed: 35 additions & 144 deletions
@@ -1,18 +1,18 @@
 <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
     <modelVersion>4.0.0</modelVersion>
+
     <groupId>software.amazon.lambda.examples</groupId>
-    <version>2.8.0</version>
     <artifactId>powertools-examples-large-messages</artifactId>
-    <packaging>jar</packaging>
+    <version>2.8.0</version>
     <name>Powertools for AWS Lambda (Java) - Examples - Large Messages</name>
 
     <properties>
         <maven.compiler.source>11</maven.compiler.source>
         <maven.compiler.target>11</maven.compiler.target>
-        <aspectj.version>1.9.20.1</aspectj.version>
-        <avro.version>1.12.1</avro.version>
-        <protobuf.version>4.33.0</protobuf.version>
+        <log4j.version>2.21.1</log4j.version>
+        <aspectj.version>1.9.21</aspectj.version>
+        <amazon-sqs-java-extended-client-lib.version>2.1.1</amazon-sqs-java-extended-client-lib.version>
     </properties>
 
     <dependencies>
@@ -21,13 +21,19 @@
             <artifactId>powertools-large-messages</artifactId>
             <version>${project.version}</version>
         </dependency>
+
         <dependency>
             <groupId>com.amazonaws</groupId>
             <artifactId>amazon-sqs-java-extended-client-lib</artifactId>
-            <version>2.1.1</version>
+            <version>${amazon-sqs-java-extended-client-lib.version}</version>
+        </dependency>
+
+        <dependency>
+            <groupId>com.amazonaws</groupId>
+            <artifactId>aws-lambda-java-events</artifactId>
+            <version>3.11.4</version>
         </dependency>
 
-        <!-- Basic logging setup -->
         <dependency>
             <groupId>software.amazon.lambda</groupId>
             <artifactId>powertools-logging-log4j</artifactId>
@@ -42,7 +48,6 @@
 
     <build>
         <plugins>
-            <!-- Don't deploy the example -->
             <plugin>
                 <groupId>org.apache.maven.plugins</groupId>
                 <artifactId>maven-deploy-plugin</artifactId>
@@ -51,37 +56,11 @@
                     <skip>true</skip>
                 </configuration>
             </plugin>
-            <plugin>
-                <groupId>org.apache.maven.plugins</groupId>
-                <artifactId>maven-shade-plugin</artifactId>
-                <version>3.6.1</version>
-                <executions>
-                    <execution>
-                        <phase>package</phase>
-                        <goals>
-                            <goal>shade</goal>
-                        </goals>
-                        <configuration>
-                            <createDependencyReducedPom>false</createDependencyReducedPom>
-                            <transformers>
-                                <transformer
-                                        implementation="org.apache.logging.log4j.maven.plugins.shade.transformer.Log4j2PluginCacheFileTransformer" />
-                            </transformers>
-                        </configuration>
-                    </execution>
-                </executions>
-                <dependencies>
-                    <dependency>
-                        <groupId>org.apache.logging.log4j</groupId>
-                        <artifactId>log4j-transform-maven-shade-plugin-extensions</artifactId>
-                        <version>0.2.0</version>
-                    </dependency>
-                </dependencies>
-            </plugin>
+
             <plugin>
                 <groupId>dev.aspectj</groupId>
                 <artifactId>aspectj-maven-plugin</artifactId>
-                <version>1.14.1</version>
+                <version>1.14</version>
                 <configuration>
                     <source>${maven.compiler.source}</source>
                     <target>${maven.compiler.target}</target>
@@ -100,123 +79,35 @@
                         </goals>
                     </execution>
                 </executions>
-                <dependencies>
-                    <dependency>
-                        <groupId>org.aspectj</groupId>
-                        <artifactId>aspectjtools</artifactId>
-                        <version>${aspectj.version}</version>
-                    </dependency>
-                </dependencies>
             </plugin>
-            <!-- Generate Avro classes from schema -->
-            <plugin>
-                <groupId>org.apache.avro</groupId>
-                <artifactId>avro-maven-plugin</artifactId>
-                <version>${avro.version}</version>
-                <executions>
-                    <execution>
-                        <phase>generate-sources</phase>
-                        <goals>
-                            <goal>schema</goal>
-                        </goals>
-                        <configuration>
-                            <sourceDirectory>${project.basedir}/src/main/avro/</sourceDirectory>
-                            <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
-                            <stringType>String</stringType>
-                        </configuration>
-                    </execution>
-                </executions>
-            </plugin>
-            <!-- Generate Protobuf classes from schema -->
+
             <plugin>
-                <groupId>io.github.ascopes</groupId>
-                <artifactId>protobuf-maven-plugin</artifactId>
-                <version>3.10.3</version>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-shade-plugin</artifactId>
+                <version>3.6.1</version>
                 <executions>
                     <execution>
+                        <phase>package</phase>
                         <goals>
-                            <goal>generate</goal>
+                            <goal>shade</goal>
                         </goals>
-                        <phase>generate-sources</phase>
                         <configuration>
-                            <protocVersion>${protobuf.version}</protocVersion>
-                            <sourceDirectories>
-                                <sourceDirectory>${project.basedir}/src/main/proto</sourceDirectory>
-                            </sourceDirectories>
-                            <outputDirectory>${project.basedir}/src/main/java</outputDirectory>
-                            <clearOutputDirectory>false</clearOutputDirectory>
+                            <createDependencyReducedPom>false</createDependencyReducedPom>
+                            <transformers>
+                                <transformer
+                                        implementation="org.apache.logging.log4j.maven.plugins.shade.transformer.Log4j2PluginCacheFileTransformer" />
+                            </transformers>
                         </configuration>
                     </execution>
                 </executions>
+                <dependencies>
+                    <dependency>
+                        <groupId>org.apache.logging.log4j</groupId>
+                        <artifactId>log4j-transform-maven-shade-plugin-extensions</artifactId>
+                        <version>0.1.0</version>
+                    </dependency>
+                </dependencies>
             </plugin>
         </plugins>
     </build>
-
-    <profiles>
-        <!-- Base profile without Avro or Protobuf (compatible with JSON only) -->
-        <profile>
-            <id>base</id>
-            <properties>
-                <active.profile>base</active.profile>
-            </properties>
-            <dependencies>
-                <!-- Exclude both Avro and Protobuf -->
-                <dependency>
-                    <groupId>org.apache.avro</groupId>
-                    <artifactId>avro</artifactId>
-                    <version>${avro.version}</version>
-                    <scope>provided</scope>
-                </dependency>
-                <dependency>
-                    <groupId>com.google.protobuf</groupId>
-                    <artifactId>protobuf-java</artifactId>
-                    <version>${protobuf.version}</version>
-                    <scope>provided</scope>
-                </dependency>
-            </dependencies>
-        </profile>
-
-        <!-- Profile with only Avro -->
-        <profile>
-            <id>avro-only</id>
-            <properties>
-                <active.profile>avro-only</active.profile>
-            </properties>
-            <dependencies>
-                <dependency>
-                    <groupId>com.google.protobuf</groupId>
-                    <artifactId>protobuf-java</artifactId>
-                    <version>${protobuf.version}</version>
-                    <scope>provided</scope>
-                </dependency>
-            </dependencies>
-        </profile>
-
-        <!-- Profile with only Protobuf -->
-        <profile>
-            <id>protobuf-only</id>
-            <properties>
-                <active.profile>protobuf-only</active.profile>
-            </properties>
-            <dependencies>
-                <dependency>
-                    <groupId>org.apache.avro</groupId>
-                    <artifactId>avro</artifactId>
-                    <version>${avro.version}</version>
-                    <scope>provided</scope>
-                </dependency>
-            </dependencies>
-        </profile>
-
-        <!-- Profile with both Avro and Protobuf (default) -->
-        <profile>
-            <id>full</id>
-            <activation>
-                <activeByDefault>true</activeByDefault>
-            </activation>
-            <properties>
-                <active.profile>full</active.profile>
-            </properties>
-        </profile>
-    </profiles>
-</project>
+</project>
Lines changed: 47 additions & 0 deletions
@@ -0,0 +1,47 @@
+/*
+ * Copyright 2023 Amazon.com, Inc. or its affiliates.
+ * Licensed under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package helloworld;
+
+import com.amazonaws.services.lambda.runtime.Context;
+import com.amazonaws.services.lambda.runtime.RequestHandler;
+import com.amazonaws.services.lambda.runtime.events.SQSEvent;
+import com.amazonaws.services.lambda.runtime.events.SQSEvent.SQSMessage;
+import org.apache.logging.log4j.LogManager;
+import org.apache.logging.log4j.Logger;
+import software.amazon.lambda.powertools.largemessages.LargeMessages;
+
+/**
+ * Example handler showing how to use LargeMessageProcessor functionally.
+ * This approach gives you more control than the @LargeMessage annotation.
+ */
+public final class App implements RequestHandler<SQSEvent, String> {
+
+    private static final Logger LOG = LogManager.getLogger(App.class);
+
+    @Override
+    public String handleRequest(final SQSEvent event, final Context context) {
+        LOG.info("Received event with {} records", event.getRecords().size());
+
+        for (SQSMessage message : event.getRecords()) {
+            LargeMessages.processLargeMessage(message, (processedMessage) -> {
+                LOG.info("Processing message ID: {}", processedMessage.getMessageId());
+                LOG.info("Processing body content: {}", processedMessage.getBody());
+                return "Processed";
+            });
+        }
+
+        return "SUCCESS";
+    }
+}
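
For comparison with the functional style above, the module also offers the `@LargeMessage` annotation mentioned in the Javadoc. A minimal annotation-based handler, sketched from the module's documentation rather than from this commit and relying on the AspectJ weaving configured in the POM, would look roughly like this (class and method names are illustrative):

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;
import com.amazonaws.services.lambda.runtime.events.SQSEvent.SQSMessage;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import software.amazon.lambda.powertools.largemessages.LargeMessage;

public class AnnotatedApp implements RequestHandler<SQSEvent, String> {

    private static final Logger LOG = LogManager.getLogger(AnnotatedApp.class);

    @Override
    public String handleRequest(final SQSEvent event, final Context context) {
        event.getRecords().forEach(this::processRawMessage);
        return "SUCCESS";
    }

    @LargeMessage
    private void processRawMessage(final SQSMessage sqsMessage) {
        // If the body was offloaded, the aspect has already replaced it with the
        // content of the corresponding S3 object by the time this method runs.
        LOG.info("Processing message ID: {}", sqsMessage.getMessageId());
        LOG.info("Processing body content: {}", sqsMessage.getBody());
    }
}
```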
