# PostgreSQL – Altering a PostgreSQL table with huge data

Tags: postgresql, postgresql-9.1

I have a table in my database with a few simple columns and one binary column. The primary key of that table is of type integer.

The problem I am facing is that the auto-incremented primary key column has hit its maximum limit, +2147483647. That means the table has around 2 billion records in it. Due to the binary column, the table takes up around 255 GB on disk.

Now all insertion into that table has stopped because of that data type limit. The logical solution is to convert that primary key column to bigint. But due to the huge table size, the ALTER TABLE command always fails, and sometimes it even puts the database into an inconsistent state where we can't access that table at all.

Statement used:

```sql
ALTER TABLE table_name ALTER COLUMN id SET DATA TYPE bigint;
```


Does anyone have any suggestions to solve this issue?

Now that you cannot insert into your table anymore, you might as well create a similar table with a bigint PK column instead of the integer one. Copy over the existing data. When that is done (and the necessary indexes are built), rename both tables in the same transaction, so that the new one ends up with the name of the present one.
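
The create-copy-rename approach can be sketched in SQL roughly as follows. All table and column names here are assumptions for illustration (`payload` standing in for the binary column, `note` for the other simple columns); adapt them to the real schema, and note that copying 255 GB in a single `INSERT ... SELECT` is itself a long-running operation, so in practice you may want to copy in batches by id range before the final rename:

```sql
BEGIN;

-- New table with the same layout, but a bigint primary key
CREATE TABLE table_name_new (
    id      bigint PRIMARY KEY,
    payload bytea,    -- the large binary column
    note    text      -- the other simple columns
);

-- Copy the existing rows (for ~2 billion rows, consider
-- batching this by id range outside the transaction)
INSERT INTO table_name_new (id, payload, note)
SELECT id, payload, note
FROM   table_name;

-- Let future inserts continue past the old integer maximum
CREATE SEQUENCE table_name_new_id_seq OWNED BY table_name_new.id;
SELECT setval('table_name_new_id_seq', (SELECT max(id) FROM table_name_new));
ALTER TABLE table_name_new
    ALTER COLUMN id SET DEFAULT nextval('table_name_new_id_seq');

-- Swap the names atomically; readers only ever see one
-- consistent table under the original name
ALTER TABLE table_name     RENAME TO table_name_old;
ALTER TABLE table_name_new RENAME TO table_name;

COMMIT;
```

Once you have verified the new table, `table_name_old` can be dropped to reclaim the disk space.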