  • H.265/HEVC vs AV1 masters fight

nimendie
Any Nvidia decoder (NVDEC) from the RTX 30 series and up can play AV1 very smoothly, including the RTX 4080 I mentioned.

There's no problem here, even though they need to work much harder to decode it compared to HEVC, especially at high resolutions like 8192x4096. More GPU/CPU work = more heat generated = faster, and therefore louder, fans spinning to cool them.

As for AV1's efficiency, it's mainly about its higher compression ratio compared to HEVC. In other words, AV1 can provide the same apparent visual quality as HEVC but at a smaller file size, especially at lower resolutions (2K to 5K).
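
To put a rough number on that (just a ballpark illustration, not a measurement): AV1 is commonly cited as being roughly 30% more efficient than HEVC at the same visual quality, although the exact saving varies a lot with the content and the encoder settings. So a title that needs about 100 Mbps in HEVC, roughly 45 GB per hour, could land around 70 Mbps and about 31 GB per hour in AV1 while looking about the same.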

Main AV1 Pros compared to HEVC:

  • Free licensing. Anyone can use it without paying any fees.
  • Better at lower bitrates, which means higher visual quality than HEVC at the same bitrate.
  • Better for streaming (saves more bandwidth costs).
  • Supports resolutions higher than 8K, which HEVC does not (MV-HEVC is the exception, as it can also handle resolutions above 8K).

Best Regards.

    If people are wondering why AV1 files are causing their GPUs to thermal throttle, it's because the files are being live transcoded (ironically, into HEVC).

    The reason is that AV1 software support is still not that mainstream, and your playback software (whatever it might be) may not support native AV1 playback even if the hardware is capable of it.
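
    If you are not sure whether the player is really being handed an AV1 file (rather than an HEVC one), a quick check with ffprobe settles it. This is just an illustrative sketch; it assumes ffprobe is installed and on your PATH, and the filename is a placeholder:

    import subprocess

    # Ask ffprobe for the codec name of the first video stream.
    def video_codec(path: str) -> str:
        result = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries", "stream=codec_name",
             "-of", "default=noprint_wrappers=1:nokey=1", path],
            capture_output=True, text=True, check=True)
        return result.stdout.strip()  # e.g. "av1" or "hevc"

    print(video_codec("some_8k_video.mp4"))  # placeholder filename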

      Hairsational The point is that AV1 playback on local machines that don't have a recent GPU sucks. HEVC 8K plays perfectly fine on most GPUs, but AV1 requires a 3090 or above, I believe. Some people don't feel that upgrading a GPU is worth it just to watch their new 8K codec, at least until modern playback software can handle that codec correctly. HereSphere, for example, chokes up with AV1 and a 2080 Ti on my media machine, and overpriced GPUs are not worth it just to watch porn.

        petex67 I can't confirm this, to be honest. But all I can tell you is that if neither the CPU nor the GPU has an AV1 decoder, then software decoding is the way to go.

        And it actually plays very smoothly up to 5K (5400x2700), according to some tests I did on my old test PC using DeoVR, which currently supports AV1 natively. Of course, the CPU has to work very hard to play them smoothly, with a little delay when you jump to different parts of the video.

        At any resolution higher than 5K, say 6K (5760x2880) and up, it struggles to play them smoothly (low frame rates and a lot of hiccups), and the higher the resolution, the worse the playback.

        Of course, I'm talking about 8-bit videos; the situation is way worse (almost unplayable) with 10-bit videos at the same resolutions I mentioned.
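
        If anyone wants to reproduce a pure software-decode test like this outside a player, a rough benchmark is to let ffmpeg decode the file as fast as it can and look at the speed figure it reports (1.0x or higher means real-time playback is possible). A minimal sketch, assuming an ffmpeg build that includes the dav1d decoder; the filename is a placeholder:

        import subprocess

        # Decode-only run: no output file, just how fast the CPU can chew through the video.
        subprocess.run([
            "ffmpeg", "-hide_banner",
            "-c:v", "libdav1d",            # force the dav1d software AV1 decoder
            "-i", "some_av1_5k_8bit.mp4",  # placeholder filename
            "-f", "null", "-",             # throw the decoded frames away
        ])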

        metichemsi
        Absolutely right and well said. Nvidia's RTX 20 series cards are still very capable GPUs for many things, including gaming.

        The 2080 Ti can play current games at 1440p with mostly max settings and still provide at least 60 FPS. Heck, it can easily do 4K at 30 FPS with low to medium settings.

        So, yes, upgrading the GPU just to watch AV1 videos is NOT worth it at all, EVEN if modern GPUs were not expensive.

        I'd rather wait for a very large leap in performance in future GPUs before upgrading my RTX 4080, and when I do, it will be for more general-purpose use. This is why I don't care about the recent RTX 50 series GPUs; the large performance leap just isn't there yet.

          VRXVR Hey, I want to upgrade my mobile GTX 1650 to an RTX 5060 to watch 8K high-bitrate HEVC. Is that enough?

            VRXVR 100% agree! It's bad enough that you can't even get a good modern GPU without paying scalpers an arm and a leg; there is no retail stock anywhere. It makes no sense and further proves to me that streaming is all they seem to care about. Either way, I'm just going to continue building my own library of HEVC content to enjoy for the foreseeable future, or until GPUs go back to normal or software playback is optimized even on 20 series cards, if ever.

            nimendie Simply for HEVC, that is overkill. I still have a 2080 Ti and I can watch 8K HEVC perfectly fine, with plenty of headroom left; I can increase my SteamVR render resolution to 200%+ if I want and still not max out my card. If you want the latest and greatest, and it works well for your system and you've got the cash, then sure.

              nimendie Oh, I don't know about that; I don't know that even a 5090 today could handle 16K without some foveated eye-tracking mumbo-jumbo black magic. I think if you really want to future-proof your system, and again, you have the cash and the system for it, then maybe just get the 5090 and be set for another 5-10 years. I personally won't bother upgrading my media PC beyond the 2080 Ti until the secondhand market for more modern GPUs normalizes, if it ever does, haha. At this rate, I think future headsets might even be able to render 16K without a PC; just look at what the Apple Vision Pro can do right now, and you know Meta is eventually going to make a Quest 4 and 5. By then your real concern might actually revolve around having good enough routers at home and fast and big enough storage media to locally stream 16K content to your headset, lol.

                nimendie If 16K is your future target, then yes, you need to upgrade your GPU to the RTX 50 series or later, since AV1 and MV-HEVC are currently the only hardware-accelerated codecs that support resolutions up to 16K, especially for 180°+ VR video formats.

                As for your GTX 1650 laptop GPU, it's more than enough to handle 8K HEVC @ 60 FPS, since it has an HEVC decoder. When it comes to GPU video decoding, it does not matter what GPU tier you own (entry level, mid range, or high end) as long as the GPU's decoder supports the codec you are trying to decode.
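
                If you want to verify that the hardware decoder actually kicks in instead of silently falling back to the CPU, a decode-only run with hardware acceleration requested is a simple check. A minimal sketch, assuming an ffmpeg build with CUDA/NVDEC support; the filename is a placeholder:

                import subprocess

                # Decode-only run with NVDEC requested; if the GPU's decoder can't handle
                # the codec, ffmpeg logs a warning and falls back to software decoding,
                # so watch the log output and the CPU usage while it runs.
                subprocess.run([
                    "ffmpeg", "-hide_banner",
                    "-hwaccel", "cuda",              # ask for NVDEC via CUDA
                    "-i", "some_hevc_8k_60fps.mp4",  # placeholder filename
                    "-f", "null", "-",               # discard the decoded frames
                ])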

                Hairsational I completely agree with you that the comparison is pointless because the HEVC master will obviously look better, but as I said in the last post, the opposite (i.e. that the quality difference is marginal because AV1 is more efficient) is SLR/doublevr's whole argument (see the bitrate comparison and AV1 announcement forum threads). So in that sense it's the only comparison that at least addresses people's complaints, even if the outcome is already obvious from the outset.

                It's decided guys - People saying AV1 Low Bit Rate is better than or equal to HEVC HBR are like Flat Earthers. Boom!

                  petex67 LOL! What?! Who said that?! Like seriously?

                  HEVC at a high bitrate will beat AV1 at a low bitrate at the same resolution and FPS in terms of visual quality. Yes, AV1 is SUPERIOR to HEVC in almost everything, but that doesn't mean we should treat it like a magical codec or something.

                  Like @Hairsational said, it still needs to be encoded with appropriate parameters, like a reasonable bitrate.

                  And that depends on many factors: resolution, frame rate, color depth, the nature of the environment or project, the number of objects, movements, elements, particles, changing shadows or lighting, etc.

                  The more of those there are, and the more demanding they get, the higher the bitrate should be, regardless of the codec, and that's still true in the case of AV1. I encoded a 5-minute 8192x4096 video at 8-bit @ 60 FPS, with the same color grade, with both HEVC and AV1 at 200 Mbps using CPU encoding (because it provides higher quality than GPU encoding, despite the longer time it takes).

                  And both of them were IDENTICAL in video quality. I just couldn't see any difference no matter what I tried, zooming in or looking anywhere. I just couldn't locate any difference in visual quality!

                  The only advantage of AV1 in my situation was that the file size was slightly smaller, that's all. Which is not bad, of course, but is it really worth the encoding time and the hardware compatibility sacrifice when it comes to 8K @ 60 FPS?
                  It's up to you guys to answer this.
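
                  For anyone who wants to repeat a test like this, the two encodes can be done along these lines. This is only a sketch of the kind of commands I mean, not my exact settings; it assumes an ffmpeg build with libx265 and SVT-AV1, and the filenames are placeholders:

                  import subprocess

                  SOURCE = "master_8192x4096_8bit_60fps.mov"  # placeholder for the source master

                  # HEVC, CPU encode (libx265) at a fixed 200 Mbps
                  subprocess.run(["ffmpeg", "-i", SOURCE, "-c:v", "libx265",
                                  "-b:v", "200M", "-preset", "slow", "hevc_200mbps.mp4"])

                  # AV1, CPU encode (SVT-AV1) at the same fixed 200 Mbps
                  subprocess.run(["ffmpeg", "-i", SOURCE, "-c:v", "libsvtav1",
                                  "-b:v", "200M", "-preset", "6", "av1_200mbps.mp4"])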

                  I believe AV1 will really shine at future 16K resolutions. At that point it's game over for HEVC, since it cannot support that resolution, and I highly doubt even MV-HEVC or future high-resolution codecs will be able to compete with it, especially because it's royalty free and will keep advancing and developing.

                  Best Regards.

                    VRXVR
                    I was just memeing doublevr who said something along the lines of

                    "People who prefer HEVC HBR are like Flatearthers"

                      VRXVR I encoded [..] with both HEVC and AV1 at 200 Mbps

                      VRXVR The only advantage of AV1 in my situation was that the file size was slightly smaller, that's all.

                      You got the same/similar file size precisely because you used a constant bitrate. In your test the AV1 file is very likely higher quality, but not perceptibly higher to the human eye, because you used such a high bitrate. If you lower the bitrate, the higher quality of the AV1 file will start to become more noticeable, as HEVC reaches a point where it can't compress the video well enough to match AV1's quality.
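
                      If you want to see this without trusting your eyes, you can score both encodes against the original master with VMAF, and then redo the comparison at a much lower bitrate where the gap should open up. A minimal sketch, assuming an ffmpeg build with libvmaf; the filenames are placeholders:

                      import subprocess

                      # Score an encode against the original master with VMAF
                      # (first input = the encode being judged, second input = the reference).
                      def vmaf_score(encoded: str, reference: str) -> None:
                          subprocess.run([
                              "ffmpeg", "-i", encoded, "-i", reference,
                              "-lavfi", "libvmaf", "-f", "null", "-",
                          ])

                      vmaf_score("hevc_encode.mp4", "original_master.mov")  # placeholder filenames
                      vmaf_score("av1_encode.mp4", "original_master.mov")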