PostgreSQL – Concurrent usage of the RDBMS: writing 4K rows without locking out other users

concurrency, django, performance, postgresql, postgresql-performance

I am working on a Django project that uses PostgreSQL. The use case is as follows:

A user needs to upload an Excel file (containing 2,500-4,000 rows) 2-3 times daily. There will be hundreds of users (somewhere between 1,100 and 1,200).

Now, the issue is the app is taking 1.2-1.8 seconds to write the data in the db. During this time, anyone else viewing the webapp is seeing a "server busy" error.

How can I get around this issue, so that:
1. Write speed can improve
2. Other users can keep using the web app, and ideally concurrent Excel uploads are possible as well?

Best Answer

I'm not sure you understand.

  1. PostgreSQL has a max_connections limit. If that number is hit, new clients won't be able to connect to the database, and while your server is waiting for a connection you may get a "server busy" message or the like. As @a_horse_with_no_name suggests, connection pooling may solve this problem (see the settings sketch after this list).
  2. You can speed up writes by batching them. When you convert from your "excel format" to SQL are you doing one insert, or one insert per row? Consider inserting into an unlogged table if you're inserting one row at a time, and then running a single INSERT INTO SELECT to UPSERT into the master.
  3. You should also check out PostgreSQL's non-durability settings (e.g. synchronous_commit = off) for faster writes; a per-transaction example follows below.
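
For point 1, a minimal sketch of what this could look like on the Django side. Database name, user, and credentials are placeholders, not taken from your project. CONN_MAX_AGE only gives you persistent connections; for real pooling in front of PostgreSQL you would typically also run PgBouncer and point Django at it instead of the database directly.

```python
# settings.py -- hypothetical values; adjust to your environment.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "myapp",
        "USER": "myapp_user",
        "PASSWORD": "secret",
        "HOST": "127.0.0.1",   # or the PgBouncer host/port if you add a pooler
        "PORT": "5432",
        "CONN_MAX_AGE": 60,    # keep connections open for 60s instead of per request
    }
}
```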
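For point 2, a sketch of the staging-table approach, assuming Django's default psycopg2 backend. Table and column names (staging_upload, measurements, item_code, ...) are made up for illustration, and the upsert assumes a unique constraint on measurements.item_code.

```python
from django.db import connection, transaction
from psycopg2.extras import execute_values

def save_upload(rows):
    """rows: list of (item_code, quantity, price) tuples parsed from the Excel file."""
    with transaction.atomic():
        with connection.cursor() as cur:
            # UNLOGGED skips WAL for the staging table, making the bulk load cheaper.
            cur.execute("""
                CREATE UNLOGGED TABLE IF NOT EXISTS staging_upload (
                    item_code text,
                    quantity  integer,
                    price     numeric
                )
            """)
            cur.execute("TRUNCATE staging_upload")

            # One batched statement for all 2,500-4,000 rows instead of one INSERT per row.
            execute_values(
                cur,
                "INSERT INTO staging_upload (item_code, quantity, price) VALUES %s",
                rows,
            )

            # Single set-based upsert into the master table.
            cur.execute("""
                INSERT INTO measurements (item_code, quantity, price)
                SELECT item_code, quantity, price FROM staging_upload
                ON CONFLICT (item_code)
                DO UPDATE SET quantity = EXCLUDED.quantity,
                              price    = EXCLUDED.price
            """)
```

Even without the staging table, simply replacing per-row inserts with one multi-row statement (or Django's bulk_create with a batch_size) usually brings a 1.2-1.8 second load well under that figure.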
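For point 3, a sketch of relaxing durability only for the upload transaction. With synchronous_commit = off, a server crash can lose the most recent commits (the upload would simply have to be re-run), but it cannot corrupt data; cluster-wide knobs such as fsync = off are far riskier and belong in postgresql.conf only on throwaway servers. The save_upload call refers to the hypothetical helper above.

```python
from django.db import connection, transaction

def save_upload_fast(rows):
    with transaction.atomic():
        with connection.cursor() as cur:
            # SET LOCAL scopes the setting to this transaction only.
            cur.execute("SET LOCAL synchronous_commit TO OFF")
        save_upload(rows)  # the batched-insert sketch from point 2
```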