h265/HEVC v AV1 masters fight

metichemsi
Absolutely right and well said. Nvidia RTX 20 series are still very capable GPUs for many things including gaming.

The 2080 Ti can play current games at 1440p with mostly max settings and provides at least 60 FPS. Heck, it can do 4K with low to medium settings at 30 FPS easily.

So, yes, upgrading the GPU just to watch AV1 videos is NOT worth it at all, EVEN if modern GPUs weren't so expensive.

I'd rather wait until a very large performance leap happens in future GPUs before upgrading my RTX 4080, and for more general-purpose use. This is why I don't care about the recent RTX 50 series GPUs; the large performance leap just isn't there yet.

    VRXVR Hey, I want to upgrade my mobile 1650 to an RTX 5060 to watch 8K high-bitrate HEVC, is that enough?

      VRXVR 100% agree! It's bad enough you can't even get a good modern GPU without paying scalpers an arm and a leg, and there's no retail stock anywhere. It makes no sense, and it further proves to me that streaming is all they seem to care about. Either way, I'm just going to keep building my own library of HEVC content to enjoy for the foreseeable future, or until GPUs go back to normal or software playback gets optimized even on 20 series cards, if ever.

      nimendie Simply for HEVC, that is overkill. I still have a 2080 Ti and I can watch 8K HEVC perfectly fine with plenty of headroom left: I can increase my SteamVR render resolution to 200%+ if I want and still not max out my card. If you want the latest and greatest, it works well with your system, and you've got the cash, then sure.

        nimendie Oh, I don't know about that. I don't know that even a 5090 today could handle 16K without some foveated eye-tracking mumbo jumbo black magic. I think if you really want to future-proof your system, and again, you have the cash and the system for it, then maybe just get the 5090 and be set for another 5-10 years. I personally won't bother upgrading my media PC beyond the 2080 Ti until the secondhand market for more modern GPUs normalizes, if it ever does, haha.

        Personally, at this rate, I think future headsets might even be able to render 16K without a PC. Just look at what the Apple Vision Pro can do right now, and you know Meta is eventually going to make a Quest 4 and 5. By then, your real concern might actually revolve around having good enough routers at home, and fast and big enough storage media, to locally stream 16K content to your headset lol

          nimendie If 16K is your future target, then yes, you need to upgrade your GPU to the RTX 50 series or newer, since AV1 and MV-HEVC are currently the only hardware-accelerated codecs that support up to 16K resolution, especially for 180°+ VR video formats.

          As for your GTX 1650 laptop GPU, it's more than enough for handling 8K HEVC @ 60 FPS, since it has an HEVC decoder. When it comes to GPU video decoding, it doesn't matter what GPU tier you own (entry level, mid range or high end), as long as the GPU's decoder supports the codec you're trying to decode.
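
          A quick way to verify that on any GPU is to ask FFmpeg to hardware-decode a sample clip and throw the frames away. Here's a minimal Python sketch of that check (just an illustration, not a definitive method: it assumes an FFmpeg build with NVDEC/CUDA support, and the file name is a placeholder):

          ```python
          # Minimal check: can this GPU hardware-decode the clip?
          # Assumes FFmpeg built with NVDEC/CUDA support; the input path is a placeholder.
          import subprocess

          result = subprocess.run(
              ["ffmpeg",
               "-hwaccel", "cuda",          # request GPU (NVDEC) decoding
               "-i", "sample_8k_hevc.mp4",  # hypothetical 8K HEVC test clip
               "-f", "null", "-"],          # decode only, write nothing
              capture_output=True, text=True)

          print("exit code:", result.returncode)
          print(result.stderr[-1000:])  # FFmpeg logs (and any decoder errors) go to stderr
          ```

          If the decoder doesn't support the codec or resolution, FFmpeg will either error out or quietly fall back to software decoding, which you'll notice as high CPU usage while it runs.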

          Hairsational I completely agree with you that the comparison is pointless, because the HEVC master will obviously look better. But as I said in the last post, the opposite claim (i.e. that the quality difference is marginal because AV1 is more efficient) is SLR/doublevr's whole argument (see the bitrate comparison and AV1 announcement forum threads). So in that sense it's the only comparison that at least addresses people's complaints, even if the outcome is already obvious from the outset.

          It's decided guys - People saying AV1 Low Bit Rate is better than or equal to HEVC HBR are like Flat Earthers. Boom!

            petex67 LOL! What?! Who said that?! Like seriously?

            HEVC at a high bitrate will beat AV1 at a low bitrate at the same resolution/FPS in terms of visual quality. Yes, AV1 is SUPERIOR to HEVC in almost everything. But that doesn't mean we should treat it like a magical codec or something.

            Like @Hairsational said, it still needs to be encoded with appropriate parameters, like a reasonable bitrate.

            And that depends on many factors: resolution, frame rate, color depth, the nature of the environment or project, the amount of objects, movement, elements, particles, changing shadows or lighting... etc.

            The more of those there are, and the more intense they get, the higher the bitrate should be, regardless of the codec, and that's still true in the case of AV1. I encoded an 8192x4096, 5-minute video at 8-bit @ 60 FPS, with the same color grade, with both HEVC and AV1 at 200 Mbps, using CPU encoding (because it provides higher quality than GPU encoding, despite the longer time it takes).
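
            For anyone who wants to run the same experiment, here's a rough Python sketch of how such a test can be scripted with FFmpeg (not my exact commands; it assumes builds with libx265 and libsvtav1, and the input path is a placeholder):

            ```python
            # Rough sketch of the 200 Mbps CPU-encode comparison described above.
            # Assumes FFmpeg with libx265 (HEVC) and libsvtav1 (AV1); paths are placeholders.
            import subprocess

            SRC = "master_8192x4096.mp4"  # hypothetical 5-minute 8192x4096 @ 60 FPS 8-bit master

            for lib, out in [("libx265", "hevc_200M.mp4"), ("libsvtav1", "av1_200M.mp4")]:
                subprocess.run(
                    ["ffmpeg", "-y", "-i", SRC,
                     "-c:v", lib, "-b:v", "200M",  # target 200 Mbps for both codecs
                     "-an", out],                  # drop audio to keep the comparison video-only
                    check=True)
            ```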

            And both of them were IDENTICAL in video quality. I just couldn't see any difference no matter what I tried, zooming in or looking anywhere. I just couldn't locate any difference in visual quality!

            The only advantage of AV1 in my situation was that the file size was slightly smaller, that's all. Which is not bad of course, but is it really worth the encoding time and the hardware compatibility sacrifice when it comes to 8K @ 60 FPS?
            It's up to you guys to answer this.

            I believe AV1 will really shine at a future 16K resolution. At that point it's game over for HEVC, since it cannot support that resolution, and I highly doubt even MV-HEVC or future high-resolution codecs can compete with AV1, especially because it's royalty-free and will keep advancing and developing.

            Best Regards.

              VRXVR
              I was just memeing doublevr, who said something along the lines of

              "People who prefer HEVC HBR are like Flatearthers"

                VRXVR I encoded [..] with both HEVC and AV1 with 200 Mbps

                VRXVR The only advantage of AV1 in my situation, it just was slightly smaller in video file size, that's all.

                You got the same/similar file size precisely because you used a constant bitrate. In your test the AV1 file is very likely higher quality, but not perceptibly so to the human eye, because you used such a high bitrate. If you lower the bitrate, the higher quality of the AV1 file starts to become more noticeable, as HEVC reaches a point where it can't compress the video enough to match AV1's quality.
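
                If you want to put numbers on that crossover instead of eyeballing it, one way is to encode a small bitrate ladder with each codec and score every encode against the source with VMAF. A Python sketch, assuming an FFmpeg build that also includes libvmaf (paths and bitrates are placeholders):

                ```python
                # Sketch: encode the same source at falling bitrates with both codecs and
                # score each encode against the source with VMAF. As the bitrate drops,
                # the AV1 scores should start pulling ahead of the HEVC ones.
                # Assumes FFmpeg with libx265, libsvtav1 and libvmaf; paths are placeholders.
                import subprocess

                SRC = "master_8192x4096.mp4"

                for lib, tag in [("libx265", "hevc"), ("libsvtav1", "av1")]:
                    for mbps in (200, 100, 50, 25):
                        out = f"{tag}_{mbps}M.mp4"
                        subprocess.run(
                            ["ffmpeg", "-y", "-i", SRC, "-c:v", lib,
                             "-b:v", f"{mbps}M", "-an", out],
                            check=True)
                        # First input = distorted encode, second input = reference source;
                        # the VMAF score is printed at the end of FFmpeg's log output.
                        subprocess.run(
                            ["ffmpeg", "-i", out, "-i", SRC,
                             "-lavfi", "libvmaf", "-f", "null", "-"],
                            check=True)
                ```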

                  Hairsational Yes, and you're absolutely right.

                  My tests were an attempt to simulate the "SLR Magic Encoding", since, you know, that's what we have to deal with here.

                  They either use a very low VBR for streaming or a very high CBR for the originals, without any middle ground, "wisely enough".

                  And as you know, I'm biased towards visual quality, so I didn't bother simulating the low VBR and ran my tests with the high CBR instead, with the results I mentioned previously.

                  But otherwise, you're very right, and AV1 will beat the heck out of HEVC if it's encoded at a reasonable bitrate.

                  PS: When I talk about visual quality in most of my statements, I'm always referring to the original/studio files.

                  But I mostly got misunderstood, maybe because I don't use the phrase "high bitrate" to refer to them.

                  I really hate describing them as "high bitrate", since that's simply the reasonable bitrate for such high resolutions where constant movement is part of their core content, you know what I mean 😉

                  But I think I'm forced to describe them as "high bitrate" from now on, to eliminate any future misunderstanding, I guess.

                  Respectfully.

                  VRXVR This is a bit off topic, but I'm not sure about the future of AV1 for 16K VR. I don't know if Wikipedia is outdated, but while the highest level (6.3) does support 16000x8000 as a resolution, "MaxDisplayRate" and "MaxDecodeRate" (not sure about the difference, tbh) are only 4,278,190,080 and 4,706,009,088 samples per second respectively. 12000x6000 @ 60 FPS is already 4,320,000,000 samples per second, which is at least outside the MaxDisplayRate spec. So even if you can encode 16K 60 FPS VR videos with AV1, it seems to be outside the specifications, which basically means the chance of ever getting a hardware decoder for this content should be really slim? And good luck using a software decoder for that! That's also why the two VRBangers 12K releases use AVC/H.264 at the High 4.1 profile instead of HEVC, because 12K @ 60 FPS is outside HEVC's official specs.

                  Like I said, maybe I'm missing something here and people are planning some kind of specification extension, but I feel like VR content (2:1 aspect ratio, high resolution, high framerate) is not really on the codec committees' minds when they make these specs.
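
                  The math is easy to sanity-check, by the way. A small Python sketch using the level 6.3 numbers quoted above (treating the Wikipedia values as assumptions, not spec gospel):

                  ```python
                  # Sanity check of the sample-rate math above, using the AV1 level 6.3
                  # limits quoted from Wikipedia (taken on faith, not from the spec itself).
                  MAX_DISPLAY_RATE = 4_278_190_080  # luma samples per second
                  MAX_DECODE_RATE = 4_706_009_088   # luma samples per second

                  for w, h, fps in [(8192, 4096, 60), (12000, 6000, 60), (16000, 8000, 60)]:
                      rate = w * h * fps  # luma samples per second for this format
                      verdict = "within" if rate <= MAX_DISPLAY_RATE else "EXCEEDS"
                      print(f"{w}x{h} @ {fps} FPS = {rate:,} samples/s ({verdict} MaxDisplayRate)")
                  ```

                  8K @ 60 FPS comes out comfortably within the limit, while both 12000x6000 and 16000x8000 @ 60 FPS exceed it.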

                    phiber Not off topic at all! Yes, you're very right, the AV1 specs describe "mostly" flat videos, at least currently. But the more popular VR becomes, the more codecs will start to prioritize it in their specs.

                    And 16K @ 60 FPS is not actually outside of the AV1 specifications, because AV1's maximum tile width is 4096 pixels. So 16K @ 60 FPS should be possible with 4x4 tiles for 180°+ FOV VR videos, for example (see the encoding sketch further down).

                    As for 16K @ 60 FPS AV1 hardware decoding, the Nvidia 50 series, more specifically the RTX 5080 and RTX 5090, have multiple encoders (9th-gen NVENC) and decoders (NVDEC).

                    Still not sure about the encoding capabilities at the moment, but each of their NVDEC decoders can easily decode AV1 8K @ 60 FPS, which could mean 16K @ 60 FPS hardware decoding with both decoders working in parallel.

                    https://en.wikipedia.org/wiki/GeForce_RTX_50_series#Media_Engine_and_I/O

                    Also, Intel 11th Gen CPUs and newer can support AV1 16K @ 60 FPS natively, along with Intel Iris Xe MAX mobile GPUs and Intel Arc A-series desktop GPUs, according to their media capabilities:

                    https://www.intel.com/content/www/us/en/docs/onevpl/developer-reference-media-intel-hardware/1-1/details.html#DECODE-11-12

                    As for 16K @ 60 FPS AV1 hardware encoding, we're still in the early years for that to happen. So software encoding is the way to go, and yes, as you said, we need a freaking high-end PC with at least 64 GB of RAM to do it, for example with "SVT-AV1":

                    https://github.com/psy-ex/svt-av1-psy/blob/master/README.md
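
                    As a rough illustration of what such a software encode could look like, here's a Python sketch using FFmpeg's SVT-AV1 wrapper. Note that SVT-AV1 takes tile counts as log2 values, so tile-columns=2 and tile-rows=2 request the 4x4 tile grid I mentioned earlier (the input path and bitrate are placeholders, not recommendations):

                    ```python
                    # Sketch of a 16K software encode with SVT-AV1 via FFmpeg.
                    # tile-columns/tile-rows are log2 counts, so 2 and 2 give a 4x4 grid,
                    # keeping each tile within the 4096-pixel maximum tile width.
                    # Assumes FFmpeg with libsvtav1; the source file and bitrate are placeholders.
                    import subprocess

                    subprocess.run(
                        ["ffmpeg", "-i", "master_16384x8192.mp4",
                         "-c:v", "libsvtav1",
                         "-svtav1-params", "tile-columns=2:tile-rows=2",
                         "-b:v", "400M",       # illustrative bitrate only
                         "-an", "av1_16k.mp4"],
                        check=True)
                    ```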

                    Damn, I just searched for "VRBangers 12K Videos". They are 12288x6144 resolution @ 60 FPS @ 90 Mbps bitrate, with the AVC/H.264 codec.

                    LOL!!! WTH were they thinking?! This is almost unplayable even on powerful PCs! Thanks for the heads up, man! I really laughed when I read about this.

                    Anyway, I think they could "remaster" them now with AV1 16K @ 60 FPS for RTX 5080/5090 GPU owners, all three of them! lol!

                    Best Regards!