Reading about video quality, I found that it depends on resolution, frames per second, and bitrate, which together determine the size of the video.
My question is: how is the bitrate calculated, and how can it vary?
Let's say a video has a 360×240 resolution. That is 360 × 240 = 86,400 pixels per frame.
At a frame rate of 30 Hz, the video has 86,400 × 30 = 2,592,000 pixels per second.
Now say 1 pixel takes 3 bytes (24 bits) of data: that gives 2,592,000 × 24 = 62,208,000 bits per second, i.e. 62,208 kbit/s (this does not sound right, so maybe there is a problem in my calculation).
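To check my own arithmetic, here is the calculation above as a short Python snippet (the variable names are just my own labels for the quantities in the question):

```python
# Raw (uncompressed) bitrate for a 360x240, 30 fps, 24-bit video.
width, height = 360, 240
fps = 30
bytes_per_pixel = 3  # 24 bits per pixel

pixels_per_frame = width * height              # 360 * 240 = 86,400
pixels_per_second = pixels_per_frame * fps     # 86,400 * 30 = 2,592,000
bits_per_second = pixels_per_second * bytes_per_pixel * 8

print(pixels_per_frame)   # 86400
print(bits_per_second)    # 62208000, i.e. about 62.2 Mbit/s uncompressed
```

So the raw number really is about 62 Mbit/s, which is far larger than the bitrates I see quoted for actual video files.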
But how can the bitrate differ between videos, and how does that difference affect quality?