# Postgresql – Postgres long running query not blocking table operations

Tags: locking, postgresql, temporary-tables

We have a 3rd-party service that imports data into a local Postgres instance, which we then query/copy into our own view-data system. These queries can yield millions of rows, so importing them into our view-data can take a long time (sometimes days). That isn't actually a problem for us, but the 3rd-party script occasionally performs table operations (such as DROP) when it rebuilds itself, and those operations get blocked while our queries are running. Because of the size of the data, we're currently using a Postgres cursor (with the Ruby Sequel gem to assist) so we can hold a fixed chunk of data in memory at a time. It seems, though, that a cursor keeps a sort of transaction open for as long as we iterate over it.
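To illustrate what I mean, here is a minimal sketch of what I understand a server-side cursor to be doing (table and cursor names are placeholders, not our real schema):

```sql
-- A server-side cursor only lives inside a transaction, so the
-- ACCESS SHARE lock it takes on the table is held until COMMIT/ROLLBACK.
BEGIN;
DECLARE import_cur NO SCROLL CURSOR FOR
    SELECT * FROM big_table;          -- placeholder table name
FETCH FORWARD 1000 FROM import_cur;   -- repeated until exhausted, possibly for days
-- ...meanwhile a concurrent DROP TABLE big_table needs an
-- ACCESS EXCLUSIVE lock, so it queues behind this transaction.
COMMIT;                               -- locks are only released here
```

If that picture is right, it would explain why the 3rd party's DROPs block for the whole duration of our import.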

I recognize that the best way to do this is to take the results of the query, put them somewhere else as quickly as possible, and then scan that copy to import the data, and I know I could do that with temporary tables or other means. However, I'm wondering whether there's a way for a query to buffer its results (say, to a file on disk) without my having to manage temporary tables, so that it no longer blocks schema operations.
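For reference, these are the two "snapshot it first" variants I'm aware of; both release the lock on the source table as soon as the copy commits (names are placeholders, and the `\copy` form assumes a psql client):

```sql
-- Variant 1: snapshot into a temporary table, then iterate over the copy.
CREATE TEMPORARY TABLE import_snapshot AS
    SELECT * FROM big_table;   -- placeholder names

-- Variant 2 (from psql): stream the result set straight to a local
-- file, with no table to manage at all:
-- \copy (SELECT * FROM big_table) TO 'snapshot.csv' WITH (FORMAT csv)
```

My question is whether anything gives me the lock behavior of these approaches without the explicit intermediate copy.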

For instance, I found there's a "single-row mode" that streams query results, but I'm not entirely sure how Postgres implements it under the hood, or whether it would ultimately have the same effect on concurrently running schema operations.

Can someone enlighten me as to how "streaming" works and whether it would solve my problem? Or do I have no choice but to create a temporary table?