@someoneX After all, the scene content and codecs are the same or similar across all studios, and almost all of them have come to the conclusion that they need higher average bitrates to preserve as much detail as possible while keeping file sizes reasonable and saving server capacity and bandwidth.
It's actually common sense in the video industry that ultra-high-resolution content requires a proper bitrate. 30 Mbps is definitely not enough and results in a poor image with a lot of noise. Just look at my post above: for 4K and above at 60 FPS, the bitrate should be at least 50 Mbps, ideally 60 Mbps or more.
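To put rough numbers on that, here's a minimal sketch comparing the per-pixel bit budget at different bitrates. The 7680×3840 @ 60 FPS figure is just an assumption for typical 8K-class VR footage, not any studio's confirmed spec, and bits per pixel is only a coarse quality proxy:

```python
# Rough bits-per-pixel comparison at different bitrates.
# Resolution and FPS are assumptions for 8K-class VR footage.
WIDTH, HEIGHT, FPS = 7680, 3840, 60

def bits_per_pixel(bitrate_mbps: float) -> float:
    """Average bits available per pixel per frame at a given bitrate."""
    return bitrate_mbps * 1_000_000 / (WIDTH * HEIGHT * FPS)

for mbps in (30, 50, 60):
    print(f"{mbps} Mbps -> {bits_per_pixel(mbps):.3f} bpp")

# Output:
# 30 Mbps -> 0.017 bpp
# 50 Mbps -> 0.028 bpp
# 60 Mbps -> 0.034 bpp
```

For comparison, even a modest 6 Mbps 1080p60 stream gets about 0.048 bpp, so a 30 Mbps 8K-class stream has a far smaller budget per pixel, which is exactly where the noise and compression artifacts come from.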
SLR should at least be honest and say: "We do it to save costs, because higher bitrates mean bigger files and more data traffic that we have to pay for. We can't reduce the resolution for marketing reasons, but we can reduce the bitrate to make a few extra bucks."
It's definitely NOT a technical issue, since all the other studios obviously have no problem releasing scenes at the same resolution but with a much better bitrate - and all of those files can be streamed as well.
I pay a reasonable monthly subscription fee precisely to get proper ultra-high-resolution videos with a bitrate that fits the resolution. It's a damn shame that great content is artificially degraded by an insufficient video bitrate.