@@ -7,10 +7,10 @@ The large message utility handles SQS and SNS messages which have had their payload
offloaded to S3 if they are larger than the maximum allowed size (256 KB).

!!! Notice
- The large message utility (available in the ` powertools-batch ` module with v1.16.1 or lower) is now deprecated
- and replaced by the ` powertools-large-messages ` described in this page.
- You can still get the documentation [ here] ( sqs_large_message_handling.md )
- and the migration guide [ here] ( #migration-from-the-sqs-large-message-utility ) .
+ The large message utility (available in the `powertools-batch` module with v1.16.1 or lower) is now deprecated
+ and replaced by the `powertools-large-messages` module described in this page.
+ You can still get the documentation [here](sqs_large_message_handling.md)
+ and the migration guide [here](#migration-from-the-sqs-large-message-utility).

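For a quick feel of the receiving side, here is a minimal sketch (the handler class and method names are illustrative); the `@LargeMessage` annotation and its options are described in detail further down this page:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;
import software.amazon.lambda.powertools.largemessages.LargeMessage;

public class SqsMessageHandler implements RequestHandler<SQSEvent, String> {

    @Override
    public String handleRequest(SQSEvent event, Context context) {
        // Each record is handed to the annotated method below.
        event.getRecords().forEach(this::processRawMessage);
        return "ok";
    }

    @LargeMessage
    private void processRawMessage(SQSEvent.SQSMessage sqsMessage) {
        // By the time this runs, sqsMessage.getBody() holds the full payload,
        // fetched from S3 if it was offloaded by the sender.
    }
}
```
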
## Features
@@ -20,6 +20,64 @@ and the migration guide [here](#migration-from-the-sqs-large-message-utility).
## Background

+ ```mermaid
+ stateDiagram-v2
+     direction LR
+     Function : Lambda Function
+
+     state Application {
+         direction TB
+         sendMsg: sendMessage(QueueUrl, MessageBody)
+         extendLib: extended-client-lib
+         [*] --> sendMsg
+         sendMsg --> extendLib
+         state extendLib {
+             state if_big <<choice>>
+             bigMsg: MessageBody > 256KB ?
+             putObject: putObject(S3Bucket, S3Key, Body)
+             updateMsg: Update MessageBody<br>with a pointer to S3<br>and add a message attribute
+             bigMsg --> if_big
+             if_big --> [*]: size(body) <= 256kb
+             if_big --> putObject: size(body) > 256kb
+             putObject --> updateMsg
+             updateMsg --> [*]
+         }
+     }
+
+     state Function {
+         direction TB
+         iterateMsgs: Iterate over messages
+         ptLargeMsg: powertools-large-messages
+         [*] --> Handler
+         Handler --> iterateMsgs
+         iterateMsgs --> ptLargeMsg
+         state ptLargeMsg {
+             state if_pointer <<choice>>
+             pointer: Message attribute <br>for large message ?
+             normalMsg: Small message,<br>body left unchanged
+             getObject: getObject(S3Pointer)
+             deleteObject: deleteObject(S3Pointer)
+             updateBody: Update message body<br>with content from S3 object<br>and remove message attribute
+             updateMD5: Update MD5 of the body<br>and attributes (SQS only)
+             yourcode: <b>YOUR CODE HERE!</b>
+             pointer --> if_pointer
+             if_pointer --> normalMsg : False
+             normalMsg --> [*]
+             if_pointer --> getObject : True
+             getObject --> updateBody
+             updateBody --> updateMD5
+             updateMD5 --> yourcode
+             yourcode --> deleteObject
+             deleteObject --> [*]
+         }
+     }
+
+     [*] --> Application
+     Application --> Function : Lambda Invocation
+     Function --> [*]
+
+ ```
+
SQS and SNS message payloads are limited to 256 KB. If you wish to send a larger payload, you can leverage the
[amazon-sqs-java-extended-client-lib](https://github.com/awslabs/amazon-sqs-java-extended-client-lib)
or [amazon-sns-java-extended-client-lib](https://github.com/awslabs/amazon-sns-java-extended-client-lib) which
@@ -242,15 +300,15 @@ After your code is invoked and returns without error, the object is deleted from
using the `deleteObject(bucket, key)` API. You can disable the deletion of S3 objects with the following configuration:

=== "Don't delete S3 Objects"
- ``` java
- @LargeMessage (deleteS3Object = false )
- private void processRawMessage(SQSEvent . SQSMessage sqsMessage) {
- // do something with the message
- }
- ```
+ ```java
+ @LargeMessage (deleteS3Object = false)
+ private void processRawMessage(SQSEvent.SQSMessage sqsMessage) {
+     // do something with the message
+ }
+ ```

!!! tip
- This utility works perfectly together with the batch module (` powertools-batch ` ), especially for SQS:
+ This utility works perfectly together with the batch module (`powertools-batch`), especially for SQS:

```java hl_lines="2 5-7 12 15 16" title="Combining batch and large message modules"
public class SqsBatchHandler implements RequestHandler<SQSEvent, SQSBatchResponse> {
@@ -279,18 +337,18 @@ This utility works perfectly together with the batch module (`powertools-batch`)
To interact with S3, the utility creates a default S3 Client:

=== "Default S3 Client"
- ``` java
- S3Client client = S3Client . builder()
- .httpClient(UrlConnectionHttpClient . builder(). build())
- .region(Region . of(System . getenv(AWS_REGION_ENV )))
- .build();
- ```
+ ```java
+ S3Client client = S3Client.builder()
+         .httpClient(UrlConnectionHttpClient.builder().build())
+         .region(Region.of(System.getenv(AWS_REGION_ENV)))
+         .build();
+ ```

If you need to customize this `S3Client`, you can leverage the `LargeMessageConfig` singleton:

=== "Custom S3 Client"
- ``` java hl_lines="6"
- import software.amazon.lambda.powertools.largemessages.LargeMessage ;
+ ```java hl_lines="6"
+ import software.amazon.lambda.powertools.largemessages.LargeMessage;

public class SnsRecordHandler implements RequestHandler<SNSEvent, String> {