This is a SaaS solution with many customers.
We receive a lot of reservations via web push for many different properties whenever a reservation is created or changed on the other side.
Sometimes a new entry is created and we receive an update for it 30 ms later.
That is where our current approach sometimes fails. At the moment we just use a simple lock to prevent concurrent changes, but the find_or_create_by! call before the lock is probably what creates the duplicates.
reservation = reservations.reload.find_or_create_by!(hotel_id: hotel.id, reservation_id: api_reservation.id, booking_id: api_reservation.booking_id)
reservation.with_lock { update_reservation(reservation, api_reservation) }
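One common alternative (assuming Rails 6+) is create_or_find_by!, which inverts the order: it attempts the INSERT first and falls back to a find when the database's unique index rejects the row with ActiveRecord::RecordNotUnique. The following is a minimal plain-Ruby sketch of that insert-first idea, with a Mutex-guarded Hash standing in for the table and its unique index; all names here are hypothetical:

```ruby
# Toy in-memory "table" with a unique key, standing in for a
# PostgreSQL table plus unique index.
class ReservationStore
  UniqueViolation = Class.new(StandardError)

  def initialize
    @rows = {}
    @mutex = Mutex.new
  end

  def find(key)
    @mutex.synchronize { @rows[key] }
  end

  def insert(key, row)
    @mutex.synchronize do
      raise UniqueViolation if @rows.key?(key)
      @rows[key] = row
    end
  end

  # create_or_find_by-style helper: try to insert first, and on a
  # uniqueness violation fall back to the row the concurrent writer
  # just created.
  def create_or_find(key, row)
    insert(key, row)
  rescue UniqueViolation
    find(key)
  end
end

store = ReservationStore.new
results = 2.times.map do |i|
  Thread.new { store.create_or_find("hotel-1/res-42", { created_by: i }) }
end.map(&:value)

# Whichever thread loses the race rescues the violation and reads the
# winner's row, so both threads end up with the same single record.
```

The key design point is that the database (here, the Mutex-guarded Hash) is the only arbiter of uniqueness; the application never relies on a check-then-insert window.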
To prevent concurrent changes to one reservation, we first build a reservation with minimal data, which is then updated inside the lock.
The update adds some information on top of the data from the creation, so everything from the initial creation is also included in the update.
To prevent duplicates, we added a validation to the model.
But there are some cases where duplicates are okay, for example when there is no connection to a hotel system and the guest fills in the information himself. In that case there are hundreds of reservations with the reservation_id “0000” or “1234”, …, because the guest often doesn’t know the correct number.
class Reservation < ApplicationRecord
  validates :reservation_id, uniqueness: { scope: [:hotel_id, :booking_id] }, if: :to_be_uniq?
end
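The reason such an application-level uniqueness validation can let duplicates through is that the check and the insert are two separate statements, so two concurrent writers can both pass the check before either inserts. A deterministic toy interleaving in plain Ruby:

```ruby
rows = []

# Writer A runs the validation's SELECT:
a_sees_duplicate = rows.include?("res-42")   # => false
# Before A inserts, writer B runs the same SELECT:
b_sees_duplicate = rows.include?("res-42")   # => false

# Both checks passed, so both writers INSERT:
rows << "res-42" unless a_sees_duplicate
rows << "res-42" unless b_sees_duplicate

# The "table" now holds a duplicate that the validation never saw.
```

Only a constraint enforced by the database itself closes this window, because the database serializes the inserts.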
So there is currently no possibility to enforce this with a plain PostgreSQL unique index.
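That said, if the “manually created” case can be recognized from a column, the condition can in fact be pushed into PostgreSQL via a partial unique index. A migration sketch, assuming a hypothetical boolean column `manually_created` that marks guest-entered reservations:

```ruby
class AddPartialUniqueIndexToReservations < ActiveRecord::Migration[7.0]
  def change
    # Partial unique index: uniqueness is only enforced for rows that
    # did NOT come from a manual guest entry (hypothetical column).
    add_index :reservations,
              [:hotel_id, :booking_id, :reservation_id],
              unique: true,
              where: "manually_created = false",
              name: "index_reservations_uniq_api_rows"
  end
end
```

With such an index in place, a concurrent duplicate is rejected by the database with ActiveRecord::RecordNotUnique instead of relying on the racy Ruby validation.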
With that Ruby validation we had a strange problem: we created the reservation and the subsequent update failed because the reservation supposedly already existed:
RuntimeError - ActiveRecord::RecordInvalid: Validation failed: Reservation has already been taken
When I checked the database, the error turned out to be a false positive; there was no duplicate row.
We removed the validation because it caused more problems than it solved.
My last idea is to put random data into the reservations that are created manually; then it would be possible to enforce uniqueness with a PostgreSQL unique index.
But I still can’t understand why an update with the same data raises the validation error.
Does someone have any idea?