Description
Hi,
Newbie user here; I've used Celery in the past to manage jobs in my applications.
It seems to me that procrastinate doesn’t offer a built-in “results storage” mechanism like some other task queues.
It simply executes the job and stores metadata in the Postgres tables, without providing handy access to the job’s return value.
It's clearly possible to open a new DB connection inside the worker to store the results, but that would double the connection count on the database. I'm also not sure this is the intended usage: is it an acceptable pattern, or would you expect it to cause issues?
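To make the pattern concrete, here is a minimal sketch of what I have in mind: the task stores its return value in a results table of my own. This uses sqlite3 purely as a runnable stand-in for Postgres, and the `job_results` table and `my_task` function are hypothetical names, not anything from procrastinate's API.

```python
import sqlite3


def store_result(conn, job_id, result):
    """Persist a job's return value in a results table.

    `job_results` is a hypothetical table; in production this would
    be a Postgres table written through a worker-side connection.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS job_results "
        "(job_id TEXT PRIMARY KEY, result TEXT)"
    )
    conn.execute(
        "INSERT INTO job_results (job_id, result) VALUES (?, ?)",
        (job_id, str(result)),
    )
    conn.commit()


def my_task(conn, job_id, x, y):
    # Compute the result, then store it alongside the job metadata
    # that the queue keeps for itself.
    result = x + y
    store_result(conn, job_id, result)
    return result


conn = sqlite3.connect(":memory:")
my_task(conn, "job-1", 2, 3)
row = conn.execute(
    "SELECT result FROM job_results WHERE job_id = ?", ("job-1",)
).fetchone()
print(row[0])  # → 5
```

My question is essentially whether doing this from inside a worker (against the same Postgres instance) is a supported pattern, or whether it fights the library's connection management.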
I'd appreciate it if you could point me to any documentation I may have missed about storing job results, or discuss the issue further if this is an interesting use case.