Bitrate comparison test
Hey, I will follow up today. I just wanted to make sure I'm not bullshitting you, so we ran a lot of tests.
I'll try to get clearance to post all the samples we did.
Sandi_SLR You'd be better off explaining why you are shooting 8K at only 200 Mbps. That's insanely low quality for 8K footage. It's barely acceptable to shoot 4K at 200 Mbps, and even then that's on the low end.
doublevr Streams are actually the best of the best quality.
Demonstrably false. SLR has the lowest streaming bitrates of any major site. And no, your "super advanced 30 Mbps max variable bitrate encodings" (in reality constant bitrate) are not higher quality than content from other sites.
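Anyone who wants to check how "variable" those encodings really are can do it themselves: grab a downloaded scene (or a captured stream segment) and bucket the video packet sizes per second with ffprobe. A rough Python sketch, assuming ffprobe is on your PATH; the filename is just a placeholder:

```python
import subprocess
from collections import defaultdict

# Placeholder filename: any downloaded scene or captured stream segment works.
INPUT = "slr_stream_sample.mp4"

# Ask ffprobe for the timestamp and size of every video packet.
raw = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "packet=pts_time,size",
     "-of", "csv=p=0", INPUT],
    capture_output=True, text=True, check=True,
).stdout

# Sum packet sizes into one bucket per second of video.
bytes_per_second = defaultdict(int)
for line in raw.splitlines():
    pts_time, size = line.split(",")[:2]
    if pts_time == "N/A":
        continue
    bytes_per_second[int(float(pts_time))] += int(size)

mbps = [8 * b / 1_000_000 for b in bytes_per_second.values()]
print(f"seconds analysed: {len(mbps)}")
print(f"mean bitrate    : {sum(mbps) / len(mbps):.1f} Mbps")
print(f"min / max       : {min(mbps):.1f} / {max(mbps):.1f} Mbps")
```

If the per-second numbers barely move between static and high-motion passages, it's constant bitrate in all but name.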
We could do smaller downloads using the same advanced encodings, but it would take too many resources. Thus we only do it for streaming, which is our primary focus for a million reasons.
Encoding one extra HQ file for new SLRO scenes is barely any additional resource cost.
bobbytables Hey, we are not shooting at that bitrate; our post-production exports at that bitrate before it goes to the transcoder to produce all the streaming versions.
doublevr These streaming encodings are magic. The quality is on par with the 200 Mbps original file for most scenes.
The individual perception of video quality is perhaps also magical. 'On par' probably depends on what you are paying attention to. For my part, the best streaming quality as it is now is perfectly fine for clothing, furniture, wallpapers or floors. But for close-up skin texture, I don't buy that taking away 60-80% of the data (from a file that is not the original camera file, but has already undergone some data reduction in post-production) makes no difference at all. You will clearly see the difference if you have a thing for skin texture looking as real and crisp as possible. The animated picture from @mirdumar is a good representation of this difference.
I can accept the business decision of SLR not to invest resources into encoding additional intermediate-bitrate files. But justifying this by saying that, thanks to some kind of magic, 30 Mbps can be encoded from 200 without the slightest loss in perceived quality, so you wouldn't notice the difference anyway? I am not convinced.
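If anyone wants to move past eyeballing GIFs: with both the high-bitrate download and the streamed version of the same scene (same resolution, frame-aligned), you can score one against the other with ffmpeg's built-in ssim filter (or libvmaf, if your ffmpeg build includes it). A rough sketch; the filenames are placeholders:

```python
import subprocess

# Placeholder filenames: the same scene at two bitrates, same resolution, frame-aligned.
REFERENCE = "scene_90mbps.mp4"          # the best file you can get
DISTORTED = "scene_30mbps_stream.mp4"   # the streaming version

# ffmpeg's ssim filter compares the first input against the second and prints
# per-frame and average SSIM to stderr; "-f null -" discards the decoded video.
result = subprocess.run(
    ["ffmpeg", "-i", DISTORTED, "-i", REFERENCE,
     "-lavfi", "ssim", "-f", "null", "-"],
    capture_output=True, text=True, check=True,
)

# The summary line looks like: "[Parsed_ssim_0 @ ...] SSIM Y:... U:... V:... All:... (dB)"
for line in result.stderr.splitlines():
    if "All:" in line:
        print(line)
```

Bear in mind that a whole-frame average dilutes exactly the small high-frequency areas (skin pores, hair) people are complaining about, so a high score is not proof that the difference is invisible.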
Came here from the AV1 thread. It's the same story here: people verifiably proving that high bitrate matters while SLR staff tell them they are wrong.
Yeah, that moving GIF comparing the 90 and 30 Mbps versions says everything. Not sure why we're still even debating. I'm sure 30 Mbps is enough for SOME people. Everybody is different; some won't even see the difference or won't care. But others obviously do.
I'm sure there's some bitrate beyond which it becomes hard to tell the difference between the original and the compressed version. For MP3 that's usually around 256 kbps; beyond that, it's very difficult for people to notice a difference between the raw and the compressed audio in a blind test. For 8K VR there will be a similar bitrate, but obviously it's WAY above 30 Mbps.
If anything, it kinda feels like 30 Mbps for 8K video is the 128 kbps of MP3 audio: good enough for some, but anybody with good hardware and ears will hate 128 kbps.
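You can make that analogy a bit more concrete with simple arithmetic: divide the bitrate by the number of pixels per second to see how many bits each pixel actually gets. Rough numbers only; I'm assuming the 5760x2880 @ 60 fps frame mentioned elsewhere in the thread, the 4K Blu-ray row is my own ballpark for comparison, and codec efficiency obviously differs:

```python
def bits_per_pixel(mbps: float, width: int, height: int, fps: float) -> float:
    """Average coded bits available per pixel per frame."""
    return mbps * 1_000_000 / (width * height * fps)

# (bitrate Mbps, width, height, fps); the Blu-ray row is a rough ballpark,
# not a measured figure.
cases = {
    "30 Mbps stream, 5760x2880 @ 60":       (30, 5760, 2880, 60),
    "90 Mbps download, 5760x2880 @ 60":     (90, 5760, 2880, 60),
    "200 Mbps post-production master":      (200, 5760, 2880, 60),
    "~80 Mbps 4K Blu-ray, 3840x2160 @ 24":  (80, 3840, 2160, 24),
}

for name, args in cases.items():
    print(f"{name:38s} {bits_per_pixel(*args):.3f} bits/pixel")
```

Even the 200 Mbps master works out to about 0.2 bits per pixel and the 30 Mbps stream to about 0.03, while a typical 4K Blu-ray sits around 0.4. Codecs differ, but that gap is an order of magnitude, not a rounding error.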
petex67 After many tests and comparisons, I found that we need at least 60 Mbps for 5760x2880 @ 60 FPS @ 8-bit color depth to reach acceptable visual quality. ANY increase in resolution, frame rate, or color depth and you need to double or triple the bitrate, REGARDLESS of the codec (HEVC vs AV1), to keep the same level of acceptable visual quality, ESPECIALLY in outdoor scenes. DON'T listen to or believe ANYONE who tells you otherwise. End of story.
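If anyone wants to repeat that kind of test on their own content rather than take my word for it, the quickest way is to cut a short, detail-heavy clip from the highest-bitrate file you have and encode it at a few candidate bitrates, then A/B them in your own headset. A crude single-pass sketch (placeholder filename; this is not how SLR's transcoder works, just a way to generate comparison clips):

```python
import subprocess

# Placeholder source: a short, detail-heavy cut from a high-bitrate master.
SRC = "master_clip_200mbps.mp4"

# Encode the same clip at a few candidate bitrates so you can A/B them in a
# headset. Single-pass constrained ABR with libx265 is crude but good enough
# for a perceptual ladder test.
for mbps in (30, 60, 90, 120):
    subprocess.run(
        ["ffmpeg", "-y", "-i", SRC,
         "-c:v", "libx265",
         "-b:v", f"{mbps}M", "-maxrate", f"{mbps}M", "-bufsize", f"{2 * mbps}M",
         "-preset", "slow",
         "-c:a", "copy",
         f"ladder_{mbps}mbps.mp4"],
        check=True,
    )
```

Outdoor scenes are where the lower rungs fall apart first, as noted above, so test with one of those.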
If I'm not mistaken, skin also tends to be an easy target for compression, as the encoder aims to reserve those precious bits for details in the background etc. I have always wondered (or wished, actually) if there was a way to give more priority to the skin portions of the image over the background details...
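For what it's worth, there is a knob for that at the encoding stage (not during filming): ffmpeg's addroi filter attaches region-of-interest metadata, and ROI-aware encoders (libx264 at least; others depend on the build) will spend more bits inside the marked region at the expense of the rest of the frame. A rough static sketch in Python; the filenames and the fixed rectangle are placeholders, since a real pipeline would need per-frame skin or face tracking:

```python
import subprocess

# Placeholder filenames; the static centre rectangle below stands in for where
# a real pipeline would track the performer frame by frame.
SRC = "master_200mbps.mp4"
OUT = "stream_roi_test.mp4"

# addroi attaches region-of-interest side data; a negative qoffset tells an
# ROI-aware encoder to spend more bits inside the rectangle, at the cost of
# the rest of the frame. libx264 is used here because its ffmpeg wrapper
# honours ROI side data.
roi = "addroi=x=iw/4:y=ih/4:w=iw/2:h=ih/2:qoffset=-0.4"

subprocess.run(
    ["ffmpeg", "-y", "-i", SRC,
     "-vf", roi,
     "-c:v", "libx264", "-b:v", "30M", "-maxrate", "35M", "-bufsize", "70M",
     "-c:a", "copy", OUT],
    check=True,
)
```

Whether that would actually look better in a headset is another question; the bits have to come from somewhere, so the background gets visibly softer.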
petex67 You'd think that indeed, but I was surprised that I could see the difference about equally well on my Quest 3 and the AVP, especially when using the HereSphere player on the Quest 3.
petex67 You don't even need to wait for next-generation headsets to see the difference. You can already see it with current VR headsets, like @harigeharry said.
I can see the differences very, very obviously with the Pimax Crystal Light that I currently use. But yes, the newer the headset generation, the more obvious the difference.
This is why we should NOT accept such garbage video quality, now or in the future.
Oh it's 100% noticeable on a Quest 3 + Heresphere.
I meant it will be even more apparent on the next gen headsets.