
Why pubsub handler defaults to Success on error? #720

@germansokolov13

Description


I was trying to develop a resilient consumer in Node.js that reads from a durable, replicated Kafka topic via the Dapr JS SDK.

The issue is that the topic offset is committed (i.e. moved forward, the message is ack-ed/acknowledged) even if an error occurs, so Kafka-side resilient retry doesn't work. In other words, any bug in my code or any transient error results in message loss.

When pinpointing the problem I found this code:
https://github.com/dapr/js-sdk/blob/main/src/implementation/Server/HTTPServer/HTTPServerImpl.ts#L171

In particular this:

try {
    status = await cb(data, headers);
} catch (e) {
    // We catch and log an error, but we don't do anything with it as the statuses should define that
    this.logger.error(`[route-${routeObj.path}] Message processing failed, ${e}`);
}

statuses.push(status ?? DaprPubSubStatusEnum.SUCCESS);

So it seems that if my callback throws an unexpected error, the message is lost. I can't use the Dapr SDK and still guarantee that every message is processed and none are lost.
Why does the catch block do nothing, and why is Success always the default? How is retry supposed to work?

Does retry even work at all with Kafka? I really need messages to be durable, durably replicated, and acknowledged only after the handler has fully finished. There is money inside the message payload. What am I doing wrong?
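For now I'm considering a defensive workaround: wrapping every handler so that a thrown error is converted into an explicit RETRY status instead of falling through to the SDK's SUCCESS default. A minimal sketch below; the enum values mirror what I understand `DaprPubSubStatusEnum` to expose (SUCCESS, RETRY, DROP) and are defined locally here so the snippet is self-contained, so treat the exact names as an assumption:

```typescript
// Local mirror of the SDK's DaprPubSubStatusEnum (assumed values).
enum DaprPubSubStatusEnum {
  SUCCESS = "SUCCESS",
  RETRY = "RETRY",
  DROP = "DROP",
}

type PubSubHandler = (data: unknown) => Promise<DaprPubSubStatusEnum | void>;

// Wraps a business handler so any thrown error maps to RETRY,
// instead of being swallowed and defaulted to SUCCESS.
function withRetryOnError(handler: PubSubHandler): PubSubHandler {
  return async (data) => {
    try {
      // Preserve an explicit status if the handler returns one.
      return (await handler(data)) ?? DaprPubSubStatusEnum.SUCCESS;
    } catch (e) {
      console.error(`Handler failed, requesting redelivery: ${e}`);
      return DaprPubSubStatusEnum.RETRY;
    }
  };
}
```

This keeps the offset from being committed on unexpected failures, but it only works if the broker actually honors the RETRY status, which is the part I'm unsure about for Kafka.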
