Commit 845d36e

Handle concurrent inserts in PostgreSQL when enqueuing recurring tasks
Very similar to inserting recurring tasks on boot: PostgreSQL fails and aborts the current transaction when it hits a duplicate key conflict during two concurrent INSERTs for the same value of a unique index. We need to use `insert` instead of `create` here and pass `unique_by` so that duplicate rows on that index are ignored when inserting.
1 parent d1676d5 commit 845d36e
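The race described in the commit message can be sketched without a database. Below is a minimal, database-free Ruby sketch (not Solid Queue code) of the intended semantics: several threads race to record the same (task_key, run_at) pair, exactly one wins, and the rest see `AlreadyRecorded` instead of aborting a transaction. `RecurringLedger` is a hypothetical in-memory stand-in for the recurring executions table and its unique index.

```ruby
# Hypothetical in-memory stand-in for the recurring executions table; the
# mutex plays the role PostgreSQL's unique index plays in the real code.
class AlreadyRecorded < StandardError; end

class RecurringLedger
  def initialize
    @rows  = {}
    @mutex = Mutex.new
  end

  def create_or_insert!(task_key:, run_at:, job_id:)
    @mutex.synchronize do
      key = [task_key, run_at]
      # the unique index on (task_key, run_at) maps to this key check
      raise AlreadyRecorded if @rows.key?(key)
      @rows[key] = job_id
    end
  end
end

ledger  = RecurringLedger.new
winners = Queue.new

threads = 5.times.map do |i|
  Thread.new do
    begin
      ledger.create_or_insert!(task_key: "cleanup", run_at: "2024-01-01", job_id: i)
      winners << i
    rescue AlreadyRecorded
      # the losing enqueue attempts are skipped, not re-raised
    end
  end
end
threads.each(&:join)

puts winners.size  # exactly one thread recorded the execution: prints 1
```

The key point mirrored from the commit: duplicate attempts must be swallowed as a signal (`AlreadyRecorded`), never allowed to poison the surrounding transaction.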

File tree

1 file changed: +16 −3 lines

app/models/solid_queue/recurring_execution.rb

Lines changed: 16 additions & 3 deletions

```diff
@@ -7,16 +7,29 @@ class AlreadyRecorded < StandardError; end
     scope :clearable, -> { where.missing(:job) }
 
     class << self
+      def create_or_insert!(**attributes)
+        if connection.supports_insert_conflict_target?
+          # PostgreSQL fails and aborts the current transaction when it hits a duplicate key conflict
+          # during two concurrent INSERTs for the same value of a unique index. We need to explicitly
+          # indicate unique_by to ignore duplicate rows by this value when inserting
+          unless insert(attributes, unique_by: [ :task_key, :run_at ]).any?
+            raise AlreadyRecorded
+          end
+        else
+          create!(**attributes)
+        end
+      rescue ActiveRecord::RecordNotUnique
+        raise AlreadyRecorded
+      end
+
       def record(task_key, run_at, &block)
         transaction do
           block.call.tap do |active_job|
             if active_job
-              create!(job_id: active_job.provider_job_id, task_key: task_key, run_at: run_at)
+              create_or_insert!(job_id: active_job.provider_job_id, task_key: task_key, run_at: run_at)
             end
           end
         end
-      rescue ActiveRecord::RecordNotUnique => e
-        raise AlreadyRecorded
       end
 
       def clear_in_batches(batch_size: 500)
```
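The `unless insert(...).any?` check in the diff works because with a conflict target, PostgreSQL's `ON CONFLICT DO NOTHING` silently skips the conflicting row, so the insert result comes back empty. Here is a sketch of that mechanism, assuming that behavior; `FakeResult` and `FakeTable` are hypothetical stand-ins for `ActiveRecord::Result` and the model class, not Solid Queue or Rails code.

```ruby
# Hypothetical stand-ins illustrating why `.any?` on the insert result
# detects duplicates under ON CONFLICT DO NOTHING semantics.
class AlreadyRecorded < StandardError; end

FakeResult = Struct.new(:rows) do
  def any?
    rows.any?
  end
end

class FakeTable
  def initialize
    @keys = {}
  end

  # Like Model.insert(attrs, unique_by: [:task_key, :run_at]): a conflicting
  # row is silently skipped, so the returned result carries no rows.
  def insert(attributes, unique_by:)
    key = attributes.values_at(*unique_by)
    if @keys.key?(key)
      FakeResult.new([])             # conflict: ON CONFLICT DO NOTHING
    else
      @keys[key] = attributes
      FakeResult.new([attributes])   # one row inserted
    end
  end

  def create_or_insert!(**attributes)
    unless insert(attributes, unique_by: [:task_key, :run_at]).any?
      raise AlreadyRecorded
    end
  end
end

table = FakeTable.new
table.create_or_insert!(task_key: "report", run_at: "09:00", job_id: 7)
begin
  table.create_or_insert!(task_key: "report", run_at: "09:00", job_id: 8)
rescue AlreadyRecorded
  puts "duplicate enqueue skipped"   # prints "duplicate enqueue skipped"
end
```

On adapters without conflict-target support (the `else` branch in the diff), the duplicate surfaces as `ActiveRecord::RecordNotUnique` from `create!` instead, which the `rescue` converts to the same `AlreadyRecorded` signal.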
