mirdumar: You replied here yesterday that bitrate means nothing; later you deleted the post.
I can't make posts that are hidden; I have to post them first, then hide them. It was meant for certain people only.
What I said, though, is that bitrate is not a measure of quality.
Increasing the bitrate of our files will not automatically translate into better image quality.
That is not the same as saying bitrate doesn't matter. There is a point of diminishing returns.
Quality can also be improved by changing encoding settings that have a far larger impact than simply raising the quality or bitrate settings. Changing the GOP structure from
- I-B-P-B-I
to
- I-P-P-P-I
can increase quality with a smaller increase to bitrate than simply doubling or tripling the bitrate.
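As a rough illustration of the point above, here is how the two GOP structures could be requested with ffmpeg and libx265. Filenames and the 30M bitrate are hypothetical; this is a sketch, not SLR's actual pipeline.

```shell
# I-B-P-B... style structure: x265 with B-frames enabled (its default behaviour)
ffmpeg -i input.mp4 -c:v libx265 -b:v 30M -x265-params bframes=4 out_bframes.mp4

# I-P-P-P... style structure: disable B-frames entirely
ffmpeg -i input.mp4 -c:v libx265 -b:v 30M -x265-params bframes=0 out_no_bframes.mp4
```

Comparing the two outputs at the same target bitrate (e.g. with VMAF or SSIM) is the fair way to judge which structure wins for a given kind of content.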
2-pass encoding increases quality without increasing the bitrate, but at roughly double the electricity cost, time cost, etc.
Is the 2-pass quality difference enough to warrant the extra cost? At the scale of SLR, definitely not.
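For reference, a 2-pass HEVC encode with ffmpeg and libx265 looks roughly like this (filenames and bitrate are hypothetical):

```shell
# Pass 1: analysis only; the video output is discarded
ffmpeg -y -i input.mp4 -c:v libx265 -b:v 30M -x265-params pass=1 -an -f null /dev/null

# Pass 2: the real encode, using the stats file written by pass 1
ffmpeg -i input.mp4 -c:v libx265 -b:v 30M -x265-params pass=2 -c:a copy output.mp4
```

The doubled cost is visible right in the workflow: the full video is read and analyzed twice.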
The encoder library used has a huge impact on quality. Try encoding the same video with the same settings and bitrate in HEVC with
- x265
- NVENC
- Quick Sync
- VCE
The difference between the first two and the last two is quite large.
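All four encoders are exposed through ffmpeg, so the comparison above can be run with the same input and target bitrate (hardware encoders require the matching GPU and drivers; filenames and bitrate are hypothetical):

```shell
ffmpeg -i input.mp4 -c:v libx265    -b:v 30M out_x265.mp4   # software (x265)
ffmpeg -i input.mp4 -c:v hevc_nvenc -b:v 30M out_nvenc.mp4  # NVIDIA NVENC
ffmpeg -i input.mp4 -c:v hevc_qsv   -b:v 30M out_qsv.mp4    # Intel Quick Sync
ffmpeg -i input.mp4 -c:v hevc_amf   -b:v 30M out_vce.mp4    # AMD VCE (AMF)
```

Since the bitrate is pinned, any quality difference between the outputs comes purely from the encoder implementation.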
I've restored the hidden posts, to shed some more light on my comments, because otherwise they will be turned into something I didn't say.
There are other aspects that we have to take into account as well. Such as reaching a broad userbase.
If we increase our bitrate, then certain users will have to stream the 6K version instead of the 8K version, because their internet is too slow for the higher-bitrate 8K.
Let's say our new 6K version matches, quality-wise, our old 8K version. People are still going to unsubscribe and demand refunds because they can't use the new 8K stream.
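The bandwidth argument is simple arithmetic. A back-of-the-envelope sketch, with every number hypothetical, of how a bitrate increase pushes a user down a resolution tier:

```python
def can_stream(connection_mbps: float, stream_mbps: float, headroom: float = 1.2) -> bool:
    """A connection must cover the stream's bitrate plus ~20% headroom for overhead."""
    return connection_mbps >= stream_mbps * headroom

connection = 60.0                            # user's downlink in Mbps (hypothetical)
old_8k, new_8k, new_6k = 45.0, 70.0, 45.0    # stream bitrates in Mbps (hypothetical)

print(can_stream(connection, old_8k))  # True:  the old 8K stream fit
print(can_stream(connection, new_8k))  # False: the new 8K stream no longer fits
print(can_stream(connection, new_6k))  # True:  the user is forced down to 6K
```

Same user, same connection: before the change they watched 8K, after it they watch 6K, even if the new 6K looks as good as the old 8K did.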
Marketing-wise, the 'K' generally has a much bigger impact than the bitrate does.