PostgreSQL – Altering a PostgreSQL table with huge data


I have a table in my database with a few simple columns and one binary column. The primary key of that table is an integer type.

The problem I am facing is that the auto-incremented primary key column has hit its maximum value of +2147483647, which means the table has around 2 billion records in it. Because of the binary column, the table takes up around 255 GB on disk.

Now all inserts into that table fail because of that data type limit. The logical solution is to convert the primary key column to bigint, but due to the huge size of the table the ALTER TABLE command always fails, and sometimes it even leaves the database in an inconsistent state where we can't access that table at all.

Statement used:


Does anyone have suggestions for solving this issue?

Best Answer

One idea, without knowing the exact error you get, is that the operation times out. This sometimes happens when the locks to be taken conflict with other operations, or when there are too many indexes and constraints to update and check. My answer below addresses that scenario. Another possibility is that you ran out of disk space: add more disk capacity to fix that, and consider the solution below to change the table structure.

Now that you cannot insert into your table anymore, you might as well create a similar table with a bigint PK column instead of the integer one, and copy over the existing data. When that is done (and the necessary indexes are built), you can simply rename both tables in the same transaction, so that the new one ends up with the name of the present one.
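A rough sketch of the procedure. All object and column names here are hypothetical (the question does not give the table definition); adjust them to your schema, and note that `events` and its columns `id`, `created`, `payload` are assumptions:

```sql
-- Hypothetical schema: the original table is assumed to be "events"
-- with an integer PK "id" and a bytea column "payload".

-- 1. New table with the same structure, but a bigint primary key.
CREATE TABLE events_new (
    id      bigint PRIMARY KEY,
    created timestamptz NOT NULL,
    payload bytea
);

-- 2. Copy the data in bulk. (Creating the table without the PK and
--    adding it after the copy would be faster still.)
INSERT INTO events_new (id, created, payload)
SELECT id, created, payload FROM events;

-- 3. Continue numbering where the old sequence left off.
CREATE SEQUENCE events_new_id_seq OWNED BY events_new.id;
SELECT setval('events_new_id_seq', (SELECT max(id) FROM events_new));
ALTER TABLE events_new
    ALTER COLUMN id SET DEFAULT nextval('events_new_id_seq');

-- 4. Swap the tables atomically in one transaction.
BEGIN;
ALTER TABLE events RENAME TO events_old;
ALTER TABLE events_new RENAME TO events;
COMMIT;
```

Keeping the old table around as `events_old` until the new one is verified costs nothing but disk space, and gives you an easy way back.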

Under normal operation this would be a bit more complicated (you would have to duplicate inserts so they are written to the new table as well, and so on), but in your case it should work as described above.
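For completeness, duplicating the inserts under normal operation could be done with a trigger while the bulk copy runs. This is only a sketch, using the same hypothetical `events`/`events_new` names as above:

```sql
-- Hypothetical sketch: mirror new rows from "events" into "events_new"
-- while the copy is in progress, so no rows are lost before the swap.
CREATE OR REPLACE FUNCTION mirror_events() RETURNS trigger AS $$
BEGIN
    INSERT INTO events_new (id, created, payload)
    VALUES (NEW.id, NEW.created, NEW.payload);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER events_mirror
AFTER INSERT ON events
FOR EACH ROW EXECUTE PROCEDURE mirror_events();
```

Since your inserts have already stopped at the integer limit, you can skip this step entirely.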

Copying a table's data without the constraints and indexes will likely be much faster than altering it in place (in most cases, at least).

Note: 9.1 will be out of support soon; please consider upgrading to a more recent version.