
Commit

docs: update readme & small var renaming
Drodevbar committed Jan 28, 2025
1 parent a122e01 commit 31f5b28
Showing 2 changed files with 11 additions and 6 deletions.
13 changes: 9 additions & 4 deletions README.md
@@ -400,8 +400,9 @@ See [@message-queue-toolkit/metrics](packages/metrics/README.md) for concrete im
Publisher-level store-based message deduplication is a mechanism that prevents the same message from being sent to the queue multiple times.
It is useful when you want to ensure that a message is published only once, regardless of how many times it is sent.

The mechanism relies on a deduplication store, which is used to store deduplication keys for a certain period of time.
Before a message is published, a deduplication key is generated based on the message content and checked against the store.
The mechanism relies on:
1. a deduplication store, which is used to store deduplication keys for a certain period of time
2. a deduplication key, which must be generated and passed as a message property (the `messageDeduplicationIdField` property of the publisher configuration lets you specify which field in the message contains the deduplication id; see the sketch below). A good deduplication key should be unique for each message and should not change between retries. It should also be relatively short to avoid unnecessary storage costs.
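
As a rough sketch only (apart from `messageDeduplicationIdField` and the deduplication-id field itself, the option and field names below are illustrative placeholders rather than the exact options of any particular transport package), a publisher configured for store-based deduplication might look like this:

```typescript
// Illustrative sketch - apart from `messageDeduplicationIdField`, the option and
// field names below are placeholders, not the toolkit's exact API.
const publisherConfig = {
  // ...transport-specific options (queue/topic name, connection, schemas, etc.)
  messageDeduplicationIdField: 'deduplicationId', // which message field carries the deduplication id
}

// The message carries a stable deduplication id in the configured field:
const message = {
  id: 'e7b1c9d0-0000-0000-0000-000000000000',
  type: 'order.created',
  deduplicationId: 'order-123', // unique per logical message, stable across retries, reasonably short
  payload: { orderId: 123 },
}
```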

Note that in the case of some queuing systems, such as standard SQS, publisher-level deduplication is not sufficient to guarantee that a message is **processed** only once.
This is because standard SQS has an at-least-once delivery guarantee, which means that a message can be delivered more than once.
@@ -459,12 +460,16 @@ Instead, you should either enable content-based deduplication on the queue or pa
Consumer-level store-based message deduplication is a mechanism that prevents the same message from being processed multiple times.
It is useful when you want to be sure that a message is processed only once, regardless of how many times it is received.

The mechanism relies on a deduplication store, which is used to store deduplication keys for a certain period of time.
Upon processing a message, a deduplication key is generated based on the message content and checked against the store.
The mechanism relies on:
1. a deduplication store, which is used to store deduplication keys for a certain period of time.
2. a deduplication key, which should be stored as part of the message before publishing it (the `messageDeduplicationIdField` property of `newPublisherOptions` lets you specify which field in the message contains the deduplication id)

If the key doesn't exist in the store, it is stored with the status `PROCESSING` for a certain period of time (i.e. `maximumProcessingTimeSeconds`) and the message is processed.
Upon successful processing, the deduplication key's TTL is updated to `deduplicationWindowSeconds` to prevent processing the same message again, and its value is updated to `PROCESSED`.
In case of errors handled gracefully by the consumer, the deduplication key is removed from the store to allow instant reprocessing of the message.
In case of unexpected errors, the message will be retried after the deduplication key's TTL expires.
If another consumer tries to process the same message while its status is `PROCESSING`, the message will be retried according to the `queueMessageForRetry` method, which can be overridden by the consumer.
If another consumer tries to process the same message while its status is `PROCESSED`, the message will be considered a duplicate and ignored.
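
The flow above can be summarised with a small sketch (illustrative only: the store interface and helper below are assumptions that mirror the behaviour described here and the `setIfNotExists` call visible in the code diff further down, not the toolkit's actual implementation):

```typescript
// Illustrative sketch of the consumer-side deduplication flow described above.
// The store interface and helper are assumptions, not the toolkit's actual API.
type DeduplicationStore = {
  setIfNotExists(key: string, value: string, ttlSeconds: number): Promise<boolean>
  set(key: string, value: string, ttlSeconds: number): Promise<void>
  delete(key: string): Promise<void>
}

async function processWithDeduplication(
  store: DeduplicationStore,
  deduplicationId: string,
  config: { maximumProcessingTimeSeconds: number; deduplicationWindowSeconds: number },
  handler: () => Promise<'success' | 'gracefulError'>,
): Promise<void> {
  // Try to acquire the lock: succeeds only if the key is not in the store yet.
  const wasLockAcquired = await store.setIfNotExists(
    deduplicationId,
    'PROCESSING',
    config.maximumProcessingTimeSeconds,
  )
  // Another consumer is processing the message, or it was already processed - skip it.
  if (!wasLockAcquired) return

  try {
    const result = await handler()
    if (result === 'success') {
      // Mark as processed and keep the key for the whole deduplication window.
      await store.set(deduplicationId, 'PROCESSED', config.deduplicationWindowSeconds)
    } else {
      // Gracefully handled error: drop the key so the message can be reprocessed immediately.
      await store.delete(deduplicationId)
    }
  } catch {
    // Unexpected error: leave the key in place; it expires after maximumProcessingTimeSeconds,
    // after which the message can be retried.
  }
}
```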

If you would like to use the SQS FIFO deduplication feature, this mechanism won't handle it for you.
Instead, you should either enable content-based deduplication on the queue or pass `MessageDeduplicationId` within message options when publishing a message.
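
For illustration (the exact publish signature depends on the SQS/SNS publisher you use, so treat the call shape below as an assumption rather than the toolkit's API), passing the FIFO deduplication id per message could look like this:

```typescript
// Illustrative only - the publish signature and option names here are assumptions.
// For SQS FIFO queues, deduplication is handled by SQS itself via MessageDeduplicationId.
await publisher.publish(message, {
  MessageDeduplicationId: message.deduplicationId, // or omit it and enable content-based deduplication on the queue
  MessageGroupId: 'orders', // FIFO queues also require a message group id
})
```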
4 changes: 2 additions & 2 deletions packages/core/lib/queues/AbstractQueueService.ts
@@ -635,14 +635,14 @@ export abstract class AbstractQueueService<
.consumerMessageDeduplicationConfig as ConsumerMessageDeduplicationConfig

try {
const result = await consumerDeduplicationConfig.deduplicationStore.setIfNotExists(
const wasLockAcquired = await consumerDeduplicationConfig.deduplicationStore.setIfNotExists(
deduplicationId,
ConsumerMessageDeduplicationKeyStatus.PROCESSING,
messageDeduplicationConfig.maximumProcessingTimeSeconds,
)

// Deduplication key was just created, meaning the lock was acquired and the message can be processed
if (result) {
if (wasLockAcquired) {
return true
}

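
From the call above, the deduplication store is expected to expose an atomic set-if-not-exists operation roughly shaped like the interface below (a sketch inferred from the usage, not the package's actual type definitions):

```typescript
// Inferred sketch of the store contract used above - not the actual exported interface.
interface DeduplicationStoreLike {
  // Atomically stores `value` under `key` with a TTL in seconds and resolves to true
  // only if the key did not exist before, i.e. only if the lock was acquired.
  setIfNotExists(key: string, value: string, ttlSeconds: number): Promise<boolean>
}
```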
