feat(parser): add support for SQS-wrapped S3 event notifications #2108

Merged
Changes from 12 commits
6 changes: 6 additions & 0 deletions aws_lambda_powertools/utilities/parser/models/__init__.py
@@ -50,6 +50,10 @@
S3Model,
S3RecordModel,
)
from .s3_event_notification import (
SqsS3EventNotificationModel,
SqsS3EventNotificationRecordModel,
)
from .s3_object_event import (
S3ObjectConfiguration,
S3ObjectContext,
@@ -130,6 +134,8 @@
"SqsRecordModel",
"SqsMsgAttributeModel",
"SqsAttributesModel",
"SqsS3EventNotificationModel",
"SqsS3EventNotificationRecordModel",
"APIGatewayProxyEventModel",
"APIGatewayEventRequestContext",
"APIGatewayEventAuthorizer",
14 changes: 14 additions & 0 deletions aws_lambda_powertools/utilities/parser/models/s3_event_notification.py
@@ -0,0 +1,14 @@
from typing import List

from pydantic import Json

from aws_lambda_powertools.utilities.parser.models.s3 import S3Model
from aws_lambda_powertools.utilities.parser.models.sqs import SqsModel, SqsRecordModel


class SqsS3EventNotificationRecordModel(SqsRecordModel):
body: Json[S3Model]


class SqsS3EventNotificationModel(SqsModel):
Records: List[SqsS3EventNotificationRecordModel]
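
For context (not part of the diff): annotating `body` as `Json[S3Model]` makes pydantic JSON-decode each SQS record's string body into a typed `S3Model` during validation. A minimal handler sketch; the attribute access assumes the field names of the existing `S3Model` (`s3.bucket.name`, `s3.object.key`):

```python
from aws_lambda_powertools.utilities.parser import event_parser
from aws_lambda_powertools.utilities.parser.models import SqsS3EventNotificationModel
from aws_lambda_powertools.utilities.typing import LambdaContext


@event_parser(model=SqsS3EventNotificationModel)
def lambda_handler(event: SqsS3EventNotificationModel, context: LambdaContext):
    for sqs_record in event.Records:
        # `body` is already a parsed S3Model at this point, not a raw JSON string
        for s3_record in sqs_record.body.Records:
            print(s3_record.s3.bucket.name, s3_record.s3.object.key)
```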
4 changes: 2 additions & 2 deletions aws_lambda_powertools/utilities/parser/models/sqs.py
@@ -1,5 +1,5 @@
from datetime import datetime
from typing import Dict, List, Optional, Type, Union
from typing import Dict, List, Optional, Sequence, Type, Union

from pydantic import BaseModel

@@ -63,4 +63,4 @@ class SqsRecordModel(BaseModel):


class SqsModel(BaseModel):
Records: List[SqsRecordModel]
Records: Sequence[SqsRecordModel]
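
The `List` to `Sequence` change on `SqsModel.Records` is presumably what allows the new subclass to narrow the record type: `typing.List` is invariant, so overriding it with `List[SqsS3EventNotificationRecordModel]` would be rejected by static type checkers, while `Sequence` is covariant and accepts the narrower override. A small, generic illustration of that typing rule (names are illustrative, not from the PR):

```python
from typing import List, Sequence

from pydantic import BaseModel


class Record(BaseModel):
    body: str


class SpecialRecord(Record):
    parsed: bool = False


class Event(BaseModel):
    # Sequence is covariant, so subclasses may narrow the element type
    Records: Sequence[Record]


class SpecialEvent(Event):
    # Accepted; had the base used List[Record], mypy would flag this
    # override as incompatible because List is invariant.
    Records: List[SpecialRecord]
```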
1 change: 1 addition & 0 deletions docs/utilities/parser.md
@@ -175,6 +175,7 @@ Parser comes with the following built-in models:
| **LambdaFunctionUrlModel** | Lambda Event Source payload for Lambda Function URL payload |
| **KafkaSelfManagedEventModel** | Lambda Event Source payload for self managed Kafka payload |
| **KafkaMskEventModel** | Lambda Event Source payload for AWS MSK payload |
| **SqsS3EventNotificationModel** | Lambda Event Source payload for SQS-wrapped S3 event notifications |

#### Extending built-in models

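
The new model is itself an instance of the "Extending built-in models" pattern this docs page describes: any JSON payload delivered through SQS can be unwrapped the same way by subclassing `SqsRecordModel` and annotating `body` with `Json[...]`. A sketch with a made-up `OrderModel` (hypothetical, not part of the PR):

```python
from typing import List

from pydantic import BaseModel, Json

from aws_lambda_powertools.utilities.parser.models import SqsModel, SqsRecordModel


class OrderModel(BaseModel):
    # Hypothetical business payload carried in the SQS message body
    order_id: str
    amount: float


class SqsOrderNotificationRecordModel(SqsRecordModel):
    body: Json[OrderModel]  # JSON-decode the SQS body into OrderModel


class SqsOrderNotificationModel(SqsModel):
    Records: List[SqsOrderNotificationRecordModel]
```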
8 changes: 6 additions & 2 deletions tests/functional/parser/test_s3.py
@@ -6,8 +6,7 @@
from tests.functional.utils import load_event


@event_parser(model=S3Model)
def handle_s3(event: S3Model, _: LambdaContext):
def assert_s3(event: S3Model):
records = list(event.Records)
assert len(records) == 1
record: S3RecordModel = records[0]
@@ -41,6 +40,11 @@ def handle_s3(event: S3Model, _: LambdaContext):
assert record.glacierEventData is None


@event_parser(model=S3Model)
def handle_s3(event: S3Model, _: LambdaContext):
assert_s3(event)


@event_parser(model=S3Model)
def handle_s3_glacier(event: S3Model, _: LambdaContext):
records = list(event.Records)
43 changes: 43 additions & 0 deletions tests/functional/parser/test_sqs_s3_event_notification.py
@@ -0,0 +1,43 @@
import pytest

from aws_lambda_powertools.utilities.parser import ValidationError, event_parser
from aws_lambda_powertools.utilities.parser.models import SqsS3EventNotificationModel
from aws_lambda_powertools.utilities.typing import LambdaContext
from tests.functional.parser.test_s3 import assert_s3
from tests.functional.utils import json_serialize, load_event


@event_parser(model=SqsS3EventNotificationModel)
def handle_sqs_json_body_containing_s3_notifications(event: SqsS3EventNotificationModel, _: LambdaContext):
return event


def test_handle_sqs_json_body_containing_s3_notifications():
sqs_event_dict = load_event("sqsEvent.json")
s3_event_notification_dict = load_event("s3Event.json")
for record in sqs_event_dict["Records"]:
record["body"] = json_serialize(s3_event_notification_dict)

parsed_event: SqsS3EventNotificationModel = handle_sqs_json_body_containing_s3_notifications(
sqs_event_dict, LambdaContext()
)

assert len(parsed_event.Records) == 2
for parsed_sqs_record in parsed_event.Records:
assert_s3(parsed_sqs_record.body)
Contributor: I would like an opinion here @rubenfonseca @heitorlessa. I think this test is ok; do you see another way to optimize it?

Contributor: Looks ok to me.

Contributor: This is a good opportunity to move to the new unit tests, since all we should care about here is whether the new Model can be instantiated correctly (along with its ramifications, JSON deserialization etc.).

Following this, all you'd need to do here is:

  • Create a new S3->SQS JSON event file under tests/events
  • Create a new test at tests/unit/parser/test_s3.py
  • Test whether the model can be initialized without any problem (we already test both SQS and S3 Models)
  • Test whether an invalid JSON in the S3 data will lead to a model validation failure

Previously, we were shoving everything under tests/functional, where we have a lot of boilerplate like @event_parser and these handler functions, when in fact we're merely testing the Model; we already have separate tests for @event_parser, which in this case is the proper functional test.

Hope that makes sense ;)

Contributor: It makes perfect sense, @heitorlessa! I changed the files as per this feedback.



def test_handle_sqs_body_invalid_json():
sqs_event_dict = load_event("sqsEvent.json")

with pytest.raises(ValidationError):
handle_sqs_json_body_containing_s3_notifications(sqs_event_dict, LambdaContext())


def test_handle_sqs_json_body_containing_arbitrary_json():
sqs_event_dict = load_event("sqsEvent.json")
for record in sqs_event_dict["Records"]:
record["body"] = json_serialize({"foo": "bar"})

with pytest.raises(ValidationError):
handle_sqs_json_body_containing_s3_notifications(sqs_event_dict, LambdaContext())
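
Following the reviewer suggestion above to move this coverage into unit tests, a minimal sketch of what such a test might look like; the fixture name `sqsS3Event.json`, the file location, and the helper import are illustrative assumptions, not taken from the PR:

```python
import pytest

from aws_lambda_powertools.utilities.parser import ValidationError
from aws_lambda_powertools.utilities.parser.models import SqsS3EventNotificationModel
from tests.functional.utils import load_event  # helper location may differ in the unit-test layout


def test_sqs_s3_event_notification_model():
    # Hypothetical fixture: an SQS event whose record bodies are JSON-encoded S3 notifications
    raw_event = load_event("sqsS3Event.json")

    model = SqsS3EventNotificationModel(**raw_event)

    for record in model.Records:
        assert len(record.body.Records) >= 1


def test_sqs_s3_event_notification_model_invalid_body():
    raw_event = load_event("sqsS3Event.json")
    for record in raw_event["Records"]:
        record["body"] = "not valid json"

    with pytest.raises(ValidationError):
        SqsS3EventNotificationModel(**raw_event)
```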