Hairsational ahh yeah, raw first makes sense. So then this is about comparing the original to the raw I suppose.
I don't see Master AV1 Videos. Are we comparing Master HEVC to Low Bit Rate AV1?
I tested it directly from DeoVR because I don't think we are able to download the "standard" quality AV1 files. Overall, it's better than before, but I find the H265 slightly better. I think if you double the bitrate, it'll be fine. Compared to the master, this standard quality is still far behind, especially in close-ups where you can see the skin texture better, both up close and from a distance. Note, I'm not anti-AV1. I don't care as long as the quality is there, but you should double the bitrates of your standard videos.
This has nothing to do with compression quality, but the videos are too bright. The skin is too white. We would probably see more detail if the skin looked more natural, like in the thumbnails.
Edit: I use this one to compare and switch between AV1 & H265 from the DeoVR menu: https://www.sexlikereal.com/scenes/pink-me-hevc-56016
As for the master, I downloaded it directly from the page and stored the video file on my Quest 3.
Hello,
If you make both versions from now on, would it be possible to publish them as a single scene card?
For those of us who download (like me) and don't notice the first version that came out a week earlier, it's a wasted download slot.
Back when downloads were unlimited, no problem, but now it can be a bit tight to lose a slot over something so silly.
fenderwq Pretty sure he wants us to compare the HEVC master vs AV1 streaming, since SLR's whole argument is that AV1 is such an efficient codec that you don't need higher bitrates and it's all placebo anyway. Any other comparison is pointless and doesn't address people's arguments: AV1 master vs HEVC master will both look good apart from color grading issues (and may not even be possible, since people recently complained that AV1 masters aren't live yet even when buying scenes), and AV1 streaming vs HEVC streaming doesn't help either, because even if AV1 were more efficient and looked better at 30 Mbit/s, it wouldn't address the complaint that the bitrate is too low regardless of codec.
When the bitrates are similar the quality is similar. AV1 vs x265 does not matter much quality wise.
When it comes to VR video encoding, it does not matter which codec you choose out of the two (HEVC or AV1) in terms of video quality. At the same bitrate, both provide IDENTICAL visual quality. Period.
The ONLY advantage of AV1 is a SLIGHTLY smaller file size when encoding 8K+ resolutions, which is what should be used for VR videos in this day and age.
So even that advantage is not worth it, IMHO, given all the extra time AV1 encoding takes vs HEVC.
And no, better color grading for AV1 videos doesn't mean they are better in visual quality. You can apply the same color grade to the HEVC video, and they'll look identical.
When I compare, I don't look for the better color, I look for the better pixel. And at the same bitrate, both are identical, as I said.
Sorry, this whole HEVC vs AV1 debate is meaningless. And if you guys want to be sure about this, just encode your own VR videos at 8K resolutions in both HEVC and AV1 at the same bitrate and the same color grade, and you'll see NO difference between the two, whether you look at them and zoom in on a 4K PC monitor or on a high-resolution VR headset.
That's what I did multiple times, and these were the end results every time.
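For anyone who wants to run that encode-it-yourself test, here is a minimal sketch of the encoding step. It assumes an ffmpeg build with libx265 and libsvtav1 on the PATH; the source file name and the 30 Mbit/s target are placeholders, not SLR's actual settings.

```python
# Sketch of the "encode it yourself and compare" test described above.
# Assumes ffmpeg is on PATH and was built with libx265 and libsvtav1;
# the input file name and the 30 Mbit/s target are placeholders.
import subprocess

SOURCE = "vr_master_8k.mp4"   # hypothetical 8K VR master
BITRATE = "30M"               # same target bitrate for both codecs

encoders = {
    "hevc_30m.mp4": ["-c:v", "libx265"],
    "av1_30m.mp4":  ["-c:v", "libsvtav1"],
}

for outfile, codec_args in encoders.items():
    cmd = [
        "ffmpeg", "-y", "-i", SOURCE,
        *codec_args,
        "-b:v", BITRATE,            # identical bitrate so only the codec differs
        "-pix_fmt", "yuv420p10le",  # 10-bit, as is typical for VR masters
        "-an",                      # drop audio so it doesn't skew file sizes
        outfile,
    ]
    subprocess.run(cmd, check=True)
```

Keeping everything identical except the encoder is what makes the side-by-side comparison meaningful.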
phiber Even HEVC master vs AV1 streaming is a pointless comparison. AV1 is around 20% more efficient than HEVC at best. The HEVC master here has something like 9x the bitrate of the AV1 streaming version, so obviously it's going to be much higher quality.
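As a rough sanity check of those numbers (the ~20% efficiency figure and the roughly 9x bitrate gap are the ones quoted in this thread, not measurements):

```python
# Back-of-the-envelope version of the argument above; all inputs are the
# figures quoted in this thread, not measured values.
av1_streaming_mbps = 30
hevc_master_mbps = 9 * av1_streaming_mbps   # "something like 9x" -> 270 Mbit/s
av1_efficiency_gain = 0.20                  # "around 20% more efficient at best"

# HEVC bitrate that would look roughly like the 30 Mbit/s AV1 stream:
hevc_equivalent_mbps = av1_streaming_mbps / (1 - av1_efficiency_gain)
print(f"30 Mbit/s AV1 ~ HEVC at ~{hevc_equivalent_mbps:.0f} Mbit/s")            # ~38
print(f"Master still has ~{hevc_master_mbps / hevc_equivalent_mbps:.1f}x that")  # ~7x
```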
VRXVR When it comes to VR video encoding, it does not matter which codec you choose out of the two (HEVC or AV1) in terms of video quality. At the same bitrate, both provide IDENTICAL visual quality. Period.
That's straight up wrong. AV1, when encoded with appropriate parameters, has better compression efficiency than HEVC and can therefore achieve higher visual quality at the same bitrate. Whether that additional quality is perceptible or not depends on a multitude of other factors.
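One way to check whether that extra quality is actually perceptible is to score both encodes against the master with VMAF. A minimal sketch, assuming ffmpeg was built with libvmaf and reusing the placeholder file names from the encoding sketch earlier in the thread:

```python
# Score both encodes against the master with VMAF to put a number on
# "perceptible or not". Assumes ffmpeg was built with libvmaf; file names
# follow the earlier encoding sketch and are placeholders.
import subprocess

MASTER = "vr_master_8k.mp4"

for encode in ("hevc_30m.mp4", "av1_30m.mp4"):
    cmd = [
        "ffmpeg", "-i", encode, "-i", MASTER,
        # first input is the distorted video, second is the reference
        "-lavfi", "libvmaf=log_path=" + encode + ".vmaf.json:log_fmt=json",
        "-f", "null", "-",
    ]
    subprocess.run(cmd, check=True)
# Compare the pooled VMAF scores in the two JSON logs; a gap of a point or
# two is unlikely to be visible, which is the "other factors" caveat above.
```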
Clearly it's not a question of HEVC or AV1... it's the bitrate you provided in these last SLR videos. I don't care if it's AV1 or HEVC or H265: a 30 Mbit/s bitrate makes the video less smooth. If your testing people don't see that, then they don't watch videos when they test. Or they don't want to see it, because you want to profit from selling high-bitrate videos. If you don't choose an acceptable bitrate (at least 60 Mbit/s) to run the video smoothly, then your videos will lack the image quality and smoothness that are so important in VR. If your company keeps only 30 Mbit/s for its videos, I will GTFO (term and suggestion kindly borrowed from another post) when my subscription is over, as I see no point in being here if you don't provide the best quality as you always did in the past.
Playing AV1 files maxes out the thermals on my (mobile) 3080 and CPU, while the HEVC files play like a cold breeze.
Hairsational Yes, that's right, and we all know that "theoretically". This is why I started my statement with "When it comes to VR video encoding" and then specifically expanded on 8K+ resolution videos.
What I meant is that in real-life use (encoding and observing), you MOSTLY can't see the difference between the two at the same 8K+ resolution and the same bitrate, even with perfect vision and a high-resolution headset.
At least, that was my case. And this is why I encouraged everyone to do the same tests using 8K+ resolutions at the same bitrate and the same color grade and see for themselves.
People still treat the AV1 spec sheet as the deciding factor in their 8K+ VR video arguments, which is not always accurate.
I know it's confusing, as many of us tend to rely on spec numbers and so on when comparing. But VR video is a totally different beast, especially when observed with a VR headset (which is the way to go, of course), and that can throw the specs and numbers out of the window.
In other words, as you said, what's the point of higher quality if you can't see it, even with perfect vision and a high-resolution headset? Yes, it's there, but it's beyond what our human eyes can distinguish.
There is a "threshold" for our human eyes when it comes to visual quality, even with perfect vision.
If we didn't have this vision limitation, all of us would be raging and demanding the VR camera's RAW files themselves, which would be hilarious.
And that's my whole point; I hope I've made it clear this time.
Best Regards.
The AVP can't even play those HBR AV1 files since it doesn't have an AV1 decoder onboard.
nimendie It's a bit more efficient at encoding, resulting in 10-20% smaller files, or slightly better quality at the same file size. It's way more processing intensive, though. And I was surprised to find that I actually thought the HEVC examples posted at low bitrate were better than the AV1 samples. At least I could distinguish a bit more detail.
nimendie
Any Nvidia decoder ("NVDEC") from the RTX 30 series and up can play AV1 very smoothly, including the RTX 4080 I mentioned.
There's no problem here, although they have to work much harder to decode it compared to HEVC, especially at high resolutions like 8192x4096. More GPU/CPU work = more heat generated = faster and thus louder fans spinning to cool them down.
As for AV1's efficiency, it's mainly about its higher compression ratio compared to HEVC. In other words, AV1 can provide the same apparent visual quality as HEVC at a smaller file size, especially at lower-tier resolutions (2K to 5K).
Main AV1 Pros compared to HEVC:
Best Regards.
If people are wondering why AV1 files are causing their GPUs to thermal throttle, it's because the files are being live-transcoded (ironically, into HEVC).
The reason is that AV1 software support is still not that mainstream, and your playback software (whatever it might be) does not support native AV1 playback even if the hardware is capable of it.
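If you want to check what codec a downloaded file really is before blaming the decoder, a quick sketch using ffprobe (assuming ffprobe is on the PATH; the file name is a placeholder):

```python
# Quick check of what codec a downloaded file actually uses, e.g. when
# playback is unexpectedly heavy. Assumes ffprobe is on PATH; the file
# name is a placeholder.
import json
import subprocess

def video_codec(path: str) -> str:
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,width,height,pix_fmt",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    stream = json.loads(out)["streams"][0]
    return f'{stream["codec_name"]} {stream["width"]}x{stream["height"]} {stream["pix_fmt"]}'

print(video_codec("downloaded_scene.mp4"))  # e.g. "av1 8192x4096 yuv420p10le"
```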
Hairsational The point is that AV1 playback on local machines that don't have a recent GPU sucks. HEVC 8K plays perfectly fine on most GPUs, but AV1 requires a 3090 or above, I believe. Some people don't feel that upgrading a GPU is worth it just to watch this new 8K codec until modern playback software can handle it correctly. Heresphere, for example, chokes up with AV1 on the 2080 Ti in my media machine, and overpriced GPUs are not worth it just to watch porn.
petex67 I can't confirm this, to be honest. But all I can say is that if neither the CPU nor the GPU has an AV1 decoder, then software decoding is the only way to go.
And it actually plays very smoothly up to 5K (5400x2700), according to some tests I did on my old test PC using DeoVR, which currently supports AV1 natively. Of course, my CPU has to work very hard to play them smoothly, with a little delay when you jump to different parts of the video.
At any resolution higher than 5K, say 6K (5760x2880) and up, it struggles to play smoothly (low frame rates and a lot of hiccups), and the higher the resolution, the worse the playback.
Of course, I'm talking about 8-bit videos; the situation is way worse (almost unplayable) with 10-bit videos at the same resolutions I mentioned.
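A rough way to reproduce that kind of software-decode test is to decode to a null sink and measure how fast the frames come out; a sketch assuming ffmpeg is on the PATH, with placeholder file names:

```python
# Rough version of the software-decode test described above: decode to a
# null sink and see whether decode speed keeps up with the video's frame
# rate. Assumes ffmpeg on PATH; the file names are placeholders.
import subprocess
import time

def decode_fps(path: str, frames: int = 600) -> float:
    start = time.monotonic()
    subprocess.run(
        ["ffmpeg", "-v", "error",
         "-hwaccel", "none",        # force software decoding for this test
         "-i", path,
         "-frames:v", str(frames),  # only decode a short sample
         "-f", "null", "-"],
        check=True,
    )
    return frames / (time.monotonic() - start)

for clip in ("test_5k_8bit.mp4", "test_6k_10bit.mp4"):
    print(clip, f"{decode_fps(clip):.1f} fps decoded")
# If the decoded fps is below the clip's frame rate (typically 60 for VR),
# expect the hiccups and dropped frames described above.
```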