I'm looking at migrating an existing PostgreSQL data warehouse to a cloud host with SSD storage, with RAM as one of the main sizing variables. The bulkiest data we're dealing with will live on monthly partitioned tables. Each monthly partition is about 70 GB with indexes (roughly 40 GB without). Data will mostly be bulk-loaded periodically and then queried by a small team of five researchers.
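For context, the per-partition figures above come from queries along these lines (the partition name is a placeholder for one of our monthly tables):

```sql
-- Hypothetical partition name; this is how the per-month sizes were measured.
SELECT
    pg_size_pretty(pg_total_relation_size('facts_2024_01')) AS with_indexes,  -- table + indexes + TOAST
    pg_size_pretty(pg_relation_size('facts_2024_01'))       AS heap_only;     -- main table data alone
```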
I've searched this site for recommendations on spec'ing RAM, and all I've found is:
- Fit the entire DB in RAM (>1 TB here, so unrealistic)
- More is better
Should there be enough RAM to hold at least one entire index (~16 GB)? Are there any other details I should provide?