This question was asked to me in an interview: design a distributed application that can process more than 20k concurrent transactions against a single bank account. [The load could also arrive in bursts: 500 transactions at once, another 200 after 5 seconds, another 500 after 2 minutes, and so on.]
These transactions could be credits, debits, or a mix of both.
We need to maintain consistency at all times, which simply means we cannot debit if the balance does not permit it. [The balance can never go negative.]
I could not think of anything other than executing those transactions sequentially, since they all target the same account.
Can anyone suggest another approach, or point out something I might be missing?
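For reference, the sequential approach I described could be sketched as a single worker that serializes all transactions for one account through a queue, so the balance check and the update never race. This is only a minimal in-memory sketch of the idea; the class and method names are mine, not from the interview.

```python
import queue
import threading

class AccountWorker:
    """Serializes all transactions for one account through a single thread."""

    def __init__(self, opening_balance=0):
        self.balance = opening_balance
        self.inbox = queue.Queue()
        self.results = []          # (status, amount) in processing order
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def submit(self, amount):
        """amount > 0 is a credit, amount < 0 is a debit."""
        self.inbox.put(amount)

    def _run(self):
        while True:
            amount = self.inbox.get()
            if amount is None:     # poison pill: stop the worker
                break
            if self.balance + amount >= 0:
                self.balance += amount
                self.results.append(("ok", amount))
            else:                  # debit would make the balance negative
                self.results.append(("rejected", amount))

    def close(self):
        self.inbox.put(None)
        self._thread.join()

worker = AccountWorker(opening_balance=100)
for txn in [50, -120, -40, 30]:   # the -40 debit arrives when balance is 30
    worker.submit(txn)
worker.close()
print(worker.balance)             # 60: the -40 debit was rejected
```

The obvious drawback, and presumably the interviewer's objection, is that throughput for a single hot account is capped by one thread; the distribution only helps across different accounts.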
Other options I suggested, which the interviewer did not really like, were:
identify whether each transaction is a credit or a debit, batch them accordingly, and execute the debits only after the credits.
do all the debit and credit calculations in memory, then apply the result as a single transaction against the database.
[I did not use this next answer during the interview, but I think it might work:]
use the multi-version concurrency control (MVCC) of PostgreSQL. [I would still need an explanation of whether this actually works.]
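The credits-before-debits batching (the first option) could be sketched like this, assuming transactions arrive in discrete batches; the function name and shapes are hypothetical:

```python
def process_batch(balance, transactions):
    """Apply all credits in the batch first, then debits, rejecting any
    debit that would push the balance negative. (Sketch only.)"""
    credits = [t for t in transactions if t > 0]
    debits = [t for t in transactions if t < 0]
    rejected = []
    balance += sum(credits)        # credits can never violate the invariant
    for d in debits:
        if balance + d >= 0:
            balance += d
        else:
            rejected.append(d)
    return balance, rejected

print(process_batch(100, [-150, 80, -20]))  # (10, []): the 80 credit lands first
```

One caveat worth raising: this reorders transactions relative to arrival order, so a debit that would have bounced under strict ordering can now succeed, which may or may not be acceptable business semantics.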
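The in-memory option (the second one) amounts to folding a burst of transactions into one net delta and persisting a single write. A rough sketch, with the rejection check still done per transaction in arrival order (names hypothetical):

```python
def settle(opening_balance, transactions):
    """Fold a burst of transactions into one net delta in memory;
    the caller then persists a single UPDATE. (Sketch only.)"""
    balance = opening_balance
    accepted = []
    for t in transactions:
        if balance + t >= 0:       # reject debits that would go negative
            balance += t
            accepted.append(t)
    return balance - opening_balance, accepted

delta, accepted = settle(100, [50, -120, -40, 30])
# then one DB write for the whole burst, something like:
#   UPDATE accounts SET balance = balance + :delta WHERE id = :account_id
```

The hard part this sketch glosses over is durability: if the process crashes after accepting transactions but before the single write commits, those acknowledgements are lost.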
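On the MVCC idea: the optimistic pattern it enables could be sketched as read-version, compute, then commit only if the version is unchanged, retrying on conflict. The in-memory "store" below stands in for the database and all names are made up:

```python
import threading

class VersionedAccount:
    """Stand-in for a DB row with a version column (sketch only)."""

    def __init__(self, balance):
        self._lock = threading.Lock()   # stands in for the DB commit path
        self.balance = balance
        self.version = 0

    def read(self):
        with self._lock:
            return self.balance, self.version

    def compare_and_set(self, expected_version, new_balance):
        with self._lock:
            if self.version != expected_version:
                return False            # another writer committed first
            self.balance = new_balance
            self.version += 1
            return True

def apply_txn(account, amount, max_retries=100):
    for _ in range(max_retries):
        balance, version = account.read()
        if balance + amount < 0:
            return "rejected"           # invariant: balance never negative
        if account.compare_and_set(version, balance + amount):
            return "ok"
    return "gave up"

acct = VersionedAccount(100)
print(apply_txn(acct, -30))    # ok
print(apply_txn(acct, -100))   # rejected, balance is 70
```

In PostgreSQL terms this roughly corresponds to running at SERIALIZABLE (or REPEATABLE READ) isolation and retrying on serialization failure, or to pessimistic locking with SELECT ... FOR UPDATE. Note, though, that MVCC helps readers, not contending writers: every write to the same hot row still serializes at that row, so for a single account this retries a lot under load rather than adding parallelism.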