content/microservices/kafka.md
10 additions & 10 deletions
@@ -433,9 +433,9 @@ throw new KafkaRetriableException('...');
> info **Hint** The `KafkaRetriableException` class is exported from the `@nestjs/microservices` package.
-### Kafka Exception Handling
+### Custom exception handling
-In addition to the default error handling mechanisms, you can implement a custom Exception Filter for Kafka events to handle retry logic. For example, the following sample shows how to skip a problematic event after a configurable number of retries:
+Along with the default error handling mechanisms, you can create a custom Exception Filter for Kafka events to manage retry logic. For instance, the example below demonstrates how to skip a problematic event after a configurable number of retries:
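Before the filter itself, it may help to see the core idea in isolation: the filter must count failed deliveries per message and decide when to give up. The sketch below shows that bookkeeping in plain TypeScript, independent of NestJS; all names (`RetryTracker`, `maxRetries`, `shouldSkip`) are illustrative, not part of the NestJS API.

```typescript
// Illustrative sketch of per-message retry bookkeeping.
// A message is identified by its topic, partition, and offset.
class RetryTracker {
  private counts = new Map<string, number>();

  constructor(private readonly maxRetries: number) {}

  // Returns true once the message has failed more than maxRetries times,
  // signalling that the event should be skipped and its offset committed.
  shouldSkip(topic: string, partition: number, offset: string): boolean {
    const key = `${topic}-${partition}-${offset}`;
    const attempts = (this.counts.get(key) ?? 0) + 1;
    this.counts.set(key, attempts);
    if (attempts > this.maxRetries) {
      this.counts.delete(key); // clean up so the map does not grow unbounded
      return true;
    }
    return false;
  }
}
```

The filter in the docs wraps exactly this kind of counter around the exception-handling path: each time the same event fails, the count grows, and once it exceeds the limit the event is skipped instead of retried.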
@@ -507,7 +504,9 @@ export class KafkaMaxRetryExceptionFilter extends BaseExceptionFilter {
const offset = message.offset;
if (!topic || partition === undefined || offset === undefined) {
-  throw new Error('Incomplete Kafka message context for committing offset.');
+  throw new Error(
+    'Incomplete Kafka message context for committing offset.',
+  );
}
await consumer.commitOffsets([
@@ -522,7 +521,8 @@ export class KafkaMaxRetryExceptionFilter extends BaseExceptionFilter {
}
```
-This filter provides a mechanism to retry processing a Kafka event up to a configurable number of times. Once the maximum retries are reached, it executes a custom skipHandler (if provided) and commits the offset, effectively skipping the problematic event. This ensures that subsequent events can be processed.
+This filter offers a way to retry processing a Kafka event up to a configurable number of times. Once the maximum retries are reached, it triggers a custom `skipHandler` (if provided) and commits the offset, effectively skipping the problematic event. This allows subsequent events to be processed without interruption.
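A detail worth noting about the commit step: kafkajs expects the committed offset to be the offset of the *next* message to read, so skipping a failed event means committing its offset plus one. A minimal sketch of building that `commitOffsets` payload (the helper name `buildSkipCommit` is illustrative):

```typescript
// Shape of one entry in the kafkajs consumer.commitOffsets() payload.
interface TopicPartitionOffset {
  topic: string;
  partition: number;
  offset: string;
}

// Builds the payload that skips a single failed message: kafkajs stores
// offsets as strings, and the committed value must point past the message.
function buildSkipCommit(
  topic: string,
  partition: number,
  offset: string,
): TopicPartitionOffset[] {
  return [{ topic, partition, offset: (Number(offset) + 1).toString() }];
}
```

Committing the raw offset instead of offset + 1 would cause the consumer to re-deliver the same failed message after a rebalance or restart, defeating the skip.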
+
You can integrate this filter by adding it to your event handlers: