Rakly3
You replied here yesterday that bitrate means nothing; later you deleted the post.
That is simply not true.
Higher resolution means more information, and the only way to transmit more information is with a higher bitrate.
6K: 5800×2900 = 16,820,000 pixels
8K: 8000×4000 = 32,000,000 pixels
So you are telling us that doubling the number of pixels doesn't also require increasing the bitrate?
If that were true, Netflix would use the same bitrate for all resolutions, but they don't: they use about 6 Mbps for 1080p and around 15 Mbps for 2160p.
Netflix also isn't known for its picture quality; almost every other streaming service looks better.
And I don't need some tool to tell me the difference between encodes when I can see it with the naked eye. At 6K the difference between 30 Mbps and a high-bitrate file is almost negligible (except in some rare cases like outdoor shots with moving foliage or low-light scenes), but at 8K that difference is much bigger and I can clearly see it.
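You can put rough numbers on this with a bits-per-pixel calculation. This is just a sketch using the resolutions and bitrates mentioned above, assuming 24 fps and ignoring codec overhead; the point is that the same 30 Mbps stretched over 8K leaves roughly half the bits per pixel that 6K gets, which is exactly where the visible difference comes from.

```python
# Rough average compressed bits available per pixel per frame.
# Figures (resolutions, bitrates, 24 fps) are assumptions taken
# from the discussion above, not measured encoder output.
FPS = 24

def bits_per_pixel(bitrate_bps, width, height, fps=FPS):
    """Bitrate spread over every pixel of every frame."""
    return bitrate_bps / (width * height * fps)

cases = {
    "1080p @ 6 Mbps":  (6_000_000, 1920, 1080),
    "2160p @ 15 Mbps": (15_000_000, 3840, 2160),
    "6K @ 30 Mbps":    (30_000_000, 5800, 2900),
    "8K @ 30 Mbps":    (30_000_000, 8000, 4000),
}

for label, (bps, w, h) in cases.items():
    print(f"{label}: {bits_per_pixel(bps, w, h):.3f} bits/pixel")
```

Run it and 8K at 30 Mbps comes out at roughly half the bits per pixel of 6K at the same bitrate, so at a fixed bitrate the encoder has to throw away far more detail at 8K.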