I am working on a scenario where I need to store strings in hex format, with a fixed length of 64 characters, in the database. The obvious choices are a character type such as VARCHAR(64) or BYTEA.
My initial thought was that storing the value as BYTEA would be a good idea, since it also enforces a valid hex string, but the impact I wanted to evaluate was the use of ENCODE on SELECT queries.
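For reference, the two layouts look roughly like this (a minimal sketch; the table and column names match the benchmark below, and the CHECK constraints are only illustrative):

```sql
-- Hex string stored as text: 64 characters per value.
CREATE TABLE table_varchar (
    "key" VARCHAR(64) NOT NULL CHECK ("key" ~ '^[0-9a-f]{64}$')
);

-- Same value stored as raw bytes: 32 bytes per value.
CREATE TABLE table_binary (
    "key" BYTEA NOT NULL CHECK (octet_length("key") = 32)
);

-- Writing requires decoding the hex input...
INSERT INTO table_binary VALUES (DECODE(repeat('ab', 32), 'hex'));
-- ...and reading requires re-encoding it.
SELECT ENCODE("key", 'hex') FROM table_binary;
```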
I ran some performance benchmarks; both tables contain a few million rows:
query_with_varchar.sql:

```sql
SELECT "key" FROM table_varchar;
```

query_with_binary.sql:

```sql
SELECT ENCODE("key", 'hex') FROM table_binary;
```

The pgbench run and its output:

```
pgbench -c 30 -T 120 -n -f ./query_with_binary.sql -f ./query_with_varchar.sql -P 5 -S my_db

SQL script 1: ./query_with_binary.sql
 - weight: 1 (targets 33.3% of total)
 - 236 transactions (34.6% of total, tps = 1.876072)
 - latency average = 8896.888 ms
 - latency stddev = 2548.701 ms
SQL script 2: ./query_with_varchar.sql
 - weight: 1 (targets 33.3% of total)
 - 225 transactions (33.0% of total, tps = 1.788628)
 - latency average = 7164.604 ms
 - latency stddev = 2209.866 ms
```
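In case the on-disk footprint matters for interpreting these numbers (32 raw bytes per value versus 64 text characters), the two tables can be compared with something like:

```sql
-- Compare total relation sizes (table plus indexes and TOAST).
SELECT pg_size_pretty(pg_total_relation_size('table_varchar')) AS varchar_size,
       pg_size_pretty(pg_total_relation_size('table_binary'))  AS binary_size;
```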
I am unable to understand why the ENCODE query comes out faster (higher tps) than the plain string query. How is PostgreSQL able to encode millions of rows faster than it can simply fetch the string column?
Can someone explain what might be wrong in the above test?