• SLR
  • Bitrate comparison test

Sandi_SLR Both, hopefully. I'd like to see downloaded files at their maximum visual fidelity, and streaming greatly improved.

g2kbuffetboy We're going to get a lot of complaints and cancellations, I guess.

There are users who can't even get a connection fast enough for 30 Mbps.

Sandi_SLR Netflix has a tool for quantifying image quality.

https://github.com/Netflix/vmaf

That should close the debate about bitrate, which on its own is no measure of quality. Judging quality by bitrate is ridiculous:
you don't decide how good a cook is by how tall they are.

There are a thousand variables that influence this, most of all the complexity of the image and the complexity level of the codec.
Both of these determine the macroblocks and motion vectors.

An L6 encode will yield a smaller bitrate than the same video encoded at L4, at equal quality.
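
For anyone who wants to try it: ffmpeg builds compiled with libvmaf can score one encode against another directly. A minimal sketch, assuming an ffmpeg with --enable-libvmaf on PATH; the file names are placeholders:

```python
# Minimal sketch: score a re-encode against its source with ffmpeg's libvmaf filter.
# Assumes an ffmpeg build with --enable-libvmaf; file names are placeholders.
import subprocess

distorted = "encode_30mbps.mp4"    # the re-encode under test (hypothetical name)
reference = "original_90mbps.mp4"  # the source it is compared against (hypothetical name)

# libvmaf takes the distorted stream as the first input and the reference as the
# second; both inputs must have the same resolution and frame rate.
subprocess.run([
    "ffmpeg", "-i", distorted, "-i", reference,
    "-lavfi", "libvmaf=log_fmt=json:log_path=vmaf.json",
    "-f", "null", "-",
], check=True)
# Per-frame and pooled VMAF scores land in vmaf.json; higher is better (0-100).
```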

    Sandi_SLR Can I ask who are we actually addressing here, the people watching downloaded files, or people watching porn by streaming?

    I use both, but better quality would be more important for downloads IMO.

    Sandi_SLR Can I ask who are we actually addressing here, the people watching downloaded files, or people watching porn by streaming?

    Me, personally, 98% downloads right now. But I certainly wouldn't mind better quality for streaming since quality is pretty much all that counts for me. Best quality in streaming might convince me to use more streaming in the future tbh.

    Setting up VMAF is not the easiest thing in the world, and to use it correctly you need to make fully decompressed YUV versions of the videos.
    But you can use it on this web page with your mp4 files:

    https://vmaf.dev/
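
    For the record, the fully-decompressed-YUV route with the standalone vmaf tool from the Netflix repo looks roughly like this (a sketch; file names and dimensions are placeholders, and the .yuv files get enormous):

    ```python
    # Sketch of the decompressed-YUV workflow with the standalone vmaf CLI.
    # ffmpeg and the vmaf binary must be on PATH; names/dimensions are placeholders.
    import subprocess

    def to_yuv(src: str, dst: str) -> None:
        """Decode a video to raw 4:2:0 YUV (these files get huge fast)."""
        subprocess.run(["ffmpeg", "-y", "-i", src, "-pix_fmt", "yuv420p", dst], check=True)

    to_yuv("original.mp4", "ref.yuv")
    to_yuv("encode.mp4", "dist.yuv")

    # Raw YUV carries no header, so width, height and pixel format must be given explicitly.
    subprocess.run([
        "vmaf", "--reference", "ref.yuv", "--distorted", "dist.yuv",
        "--width", "8000", "--height", "4000",
        "--pixel_format", "420", "--bitdepth", "8",
        "--output", "vmaf.json",
    ], check=True)
    ```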


    Comparing full scenes will take forever either way. I suggest using Shutter Encoder (free) and its Cut Without Re-encoding function to take some excerpts from the scenes and compare those.
    VMAF is a well-known tool made by Netflix to objectively measure the quality difference between encodes of the same video.

    • Cut Without Re-encoding is the very first function in the list of the Shutter Encoder app.

    IMPORTANT!
    In the editor, I suggest setting the start and end times by typing them in.

    • Lower left, In point
    • Lower right, Out point

    The app will automatically snap to the correct keyframes so that no re-encoding is needed!

    EXTRA IMPORTANT!
    Also in the editor, in the upper left, click Image adjustment. It is enabled by default. Even though all of its settings are at their defaults, I always make sure to disable it, just in case!

    Then in the upper right of the editor, click Apply, followed by Start Function.
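
    If you'd rather script the excerpt step, plain ffmpeg stream copy does the same job as Cut Without Re-encoding (a sketch; the timestamps and file names are placeholders):

    ```python
    # Sketch: cut an excerpt without re-encoding, the ffmpeg equivalent of
    # Shutter Encoder's Cut Without Re-encoding. Stream copy snaps to keyframes,
    # so the actual cut points may shift slightly. Timestamps/names are placeholders.
    import subprocess

    subprocess.run([
        "ffmpeg", "-ss", "00:05:00",  # in point (seek before the input snaps to a keyframe)
        "-i", "scene.mp4",
        "-t", "30",                   # take a 30-second excerpt
        "-c", "copy",                 # copy the compressed bits as-is, no re-encode
        "excerpt.mp4",
    ], check=True)
    ```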

      Rakly3
      You replied here yesterday that bitrate means nothing; later you deleted the post.

      That is just not true.

      Higher resolution = more information; how else are you going to transmit that information other than by increasing the bitrate?
      6K: 5800 × 2900 = 16,820,000 pixels
      8K: 8000 × 4000 = 32,000,000 pixels

      So you are telling us that doubling the number of pixels doesn't also require increasing the bitrate?

      If that were true, Netflix would use the same bitrate for all resolutions, but they don't: they use about 6 Mbps for 1080p and around 15 Mbps for 2160p.
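
      To put rough numbers on this, what a codec actually has to spend is bits per pixel. A quick sketch of the arithmetic (the frame rates are my assumptions, for illustration only):

      ```python
      # Quick bits-per-pixel arithmetic behind the resolution argument.
      # Frame rates here are assumptions for illustration.
      def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
          return bitrate_bps / (width * height * fps)

      print(bits_per_pixel(6e6, 1920, 1080, 24))   # ~0.121 bpp (Netflix-style 1080p)
      print(bits_per_pixel(15e6, 3840, 2160, 24))  # ~0.075 bpp (Netflix-style 2160p)
      print(bits_per_pixel(30e6, 5800, 2900, 60))  # ~0.030 bpp (6K VR at 30 Mbps)
      print(bits_per_pixel(30e6, 8000, 4000, 60))  # ~0.016 bpp (8K VR at 30 Mbps)
      ```

      At the same 30 Mbps, the 8K file gets roughly half the bits per pixel of the 6K file, which is exactly the squeeze being described here.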

      Also, Netflix is not known for picture quality; almost all other streaming services have better picture quality.

      Also, I don't need some tool to tell me the difference between encodes when I can see it with the naked eye. For 6K the difference between 30 Mbps and a high-bitrate file is almost negligible (except in some rare cases like outdoor shots with moving foliage or low-light scenes), but for 8K that difference is much bigger and I can clearly see it.

        Here is a comparison between 8K 30 Mbps and the original 8K 90 Mbps. The scene is Aubree Valentine's "Consumated Passion" from VRHush.

        I selected two random frames and cropped everything except Aubree's face.

        Here they are as a GIF so they're easier to compare.

        And here they are individually:
        30 Mbps

        90 Mbps

        30 Mbps

        90 Mbps

          mirdumar You replied here yesterday that bitrate means nothing; later you deleted the post.

          I can't create posts that start out hidden; I have to post them first, then hide them. It was meant for certain people only.

          What I said, though, is that bitrate is no measurement of quality.
          Increasing the bitrate of our files will not automatically translate into better image quality.

          That is not the same as saying bitrate doesn't matter. There is a point of diminishing returns.
          Quality can also be increased by changing encoding settings, with a much larger impact than simply raising the quality or bitrate settings. Changing from

          • I-B-P-B-I
            to
          • I-P-P-P-I
            can increase quality with a smaller increase in bitrate than simply doubling or tripling the bitrate (see the sketch below).
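
          For concreteness, here is roughly what that experiment looks like with ffmpeg's libx265 wrapper (a sketch; file names and the target bitrate are placeholders, and whether dropping B-frames helps is content-dependent):

          ```python
          # Sketch: same source, same target bitrate, with and without B-frames,
          # via ffmpeg's libx265 wrapper. File names and bitrate are placeholders.
          import subprocess

          def encode(src: str, dst: str, bframes: int) -> None:
              subprocess.run([
                  "ffmpeg", "-y", "-i", src,
                  "-c:v", "libx265", "-b:v", "30M",
                  "-x265-params", f"bframes={bframes}",  # bframes=0 gives an I-P-P-P... GOP
                  dst,
              ], check=True)

          encode("scene.mp4", "with_bframes.mp4", bframes=4)  # x265's default B-frame depth
          encode("scene.mp4", "no_bframes.mp4", bframes=0)
          # Score both against the source with VMAF to see which GOP structure wins here.
          ```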

          2-pass encoding increases quality without increasing the bitrate, but at roughly double the electricity cost, time cost, etc.
          Is the 2-pass quality difference enough to warrant the extra cost? At the scale of SLR, definitely not.
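
          And this is roughly what 2-pass looks like with x265 through ffmpeg (a sketch; names and the target bitrate are placeholders, and it does take about twice as long):

          ```python
          # Sketch of x265 2-pass via ffmpeg: pass 1 gathers stats, pass 2 does the
          # real encode. Roughly doubles encode time. Names/bitrate are placeholders.
          import subprocess

          src, bitrate = "scene.mp4", "30M"

          # Pass 1: analysis only; the encoded output is thrown away (use NUL on Windows).
          subprocess.run([
              "ffmpeg", "-y", "-i", src, "-c:v", "libx265", "-b:v", bitrate,
              "-x265-params", "pass=1", "-an", "-f", "null", "/dev/null",
          ], check=True)

          # Pass 2: allocates bits using the stats file written by pass 1.
          subprocess.run([
              "ffmpeg", "-y", "-i", src, "-c:v", "libx265", "-b:v", bitrate,
              "-x265-params", "pass=2", "out_2pass.mp4",
          ], check=True)
          ```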

          The library used to encode has a huge impact on quality. Try encoding the same video with the same settings and bitrate in HEVC with:

          • x265
          • NVENC
          • Quick Sync
          • VCE

          The difference between the first two and the last two is quite large.
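
          In ffmpeg terms that test is just swapping the encoder name, something like this (a sketch; each hardware encoder only works if the matching GPU and drivers are present):

          ```python
          # Sketch: encode the same clip at the same bitrate with four HEVC encoders.
          # Each hardware path needs the matching GPU/drivers; names are placeholders.
          import subprocess

          encoders = {
              "x265": "libx265",      # software
              "nvenc": "hevc_nvenc",  # NVIDIA
              "qsv": "hevc_qsv",      # Intel Quick Sync
              "vce": "hevc_amf",      # AMD VCE, exposed through the AMF wrapper
          }

          for name, codec in encoders.items():
              subprocess.run([
                  "ffmpeg", "-y", "-i", "scene.mp4",
                  "-c:v", codec, "-b:v", "30M",
                  f"out_{name}.mp4",
              ], check=True)
          # Then score each output against the source with VMAF for an
          # apples-to-apples comparison of how each encoder spends the same bits.
          ```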


          I've restored the hidden posts.

          To shed some more light on my comments, before they get turned into something I didn't say: there are other aspects that we have to take into account as well, such as reaching a broad userbase.

          If we increase our bitrate, then certain users will have to stream the 6K version instead of the 8K version, because their internet is too slow for the higher-bitrate 8K.

          Let's say our new 6K version matches, quality-wise, our old 8K version. People are going to be unsubscribing and demanding refunds because they can't use the new 8K stream.
          Marketing-wise, the 'K' generally has a much bigger impact than the bitrate.

            Rakly3
            Yeah, and I agree: for SLR 6K, 30 Mbps is enough in 99% of cases; I was mostly talking about 8K.
            As you can see in the images I posted above, there is a noticeable difference between 8K 30 Mbps and 8K 90 Mbps.

            Rakly3 I love this post. You have very valid concerns that I agree with. As someone who mostly streams, I would not want any buffering of my videos.

            Could we possibly offer two different 8k versions, one optimised for smooth streaming and the other for downloads/those with extremely fast internet?

            Rakly3
            I'm honestly interested: Who are these options for?

            1. That's the actual "regular file" with max. 30k kbit/s bitrate. I guess the HUUUUUUUGE majority is either streaming this one or downloading it.
            2. Who is actually streaming or downloading 6k+ scenes in 4k anymore? According to your own research like 90%+ of your customer base has either a Quest 2 or a better headset that has zero issues rendering the 6k+ file. Why do the few people on Quest 1 or Go get an extra download option but all the people on the major and technically much more advanced headsets have to deal with 30k kbit/s max?
            3. Who the fuck downloads h.264 files today? Two people on the planet? Why is this even an option today?
            4. It's cool that there is a downloadable raw file but I doubt that it's downloaded a lot. Should imho stay though for the complete enthusiasts on PIMAX etc. or those who want to edit/cut their own video material from the original file.

            IMHO you should at least kick 3, rename "max 6k+" to "optimized 6k" (= max. 30k kbit/s for optimizing streaming and smaller download files) and then add a new, actual max. 6k+ option with an ideal bitrate for maximum quality (less than the untouched "original" file with 120k kbit/s or whatever but more than the capped 30k kbit/s file we have right now).

            Rakly3

            That should close the debate about bitrate, which on its own is no measure of quality. Judging quality by bitrate is ridiculous.

            Seriously, that's apples and oranges. Like 100% apples and oranges.

            It's like saying "Bitrate doesn't matter because resolution matters much more".

            Well, an 8K picture will very likely look crisper than a 720p picture, and no bitrate increase will ever make the 720p picture crisper than the 8K one. So yeah, you're kind of right, but...

            ...your statement also misses the whole point though...

            The question at hand is not whether something else could be more beneficial for picture quality, but whether a higher bitrate without changing anything else would result in better quality. And well, yes, it does, especially if your starting bitrate is pretty small.

            So no, bitrate is no measure of quality. But a higher bitrate can naturally improve the quality of a given video if you don't change anything else about it.

            Rakly3 Let's say our new 6K version matches, quality-wise, our old 8K version. People are going to be unsubscribing and demanding refunds because they can't use the new 8K stream.

            While I understand that argument, I don't think it's a very strong one. You could still make the "optimized for streaming" 8K version the standard version for streaming, the option to go for the masses. Or you could offer presets so that everybody could choose which version should be the standard for them.

            I think you're using the same flawed logic as some game companies that try to fool people into thinking their newest PC games can't offer the best visual quality because people with older rigs could be alienated by not being able to play on ultra presets. That has been proven bollocks so many times in gaming history that it's basically a running joke by now. PC games offer different quality presets and you choose the one that matches your hardware. If you want better tech, upgrade your tech. The same applies here: if you want the best experience, buy the best headset. Invest in the best internet. Or just use an optimized, reduced version and be happy with it. There are options for everyone, depending on personal investment and personal circumstances.

            So I really don't get the fear of lost sales here. It reads much more like a scapegoat imho.

            mirdumar As far as I know, all the 8K from VRHush is around 50 Mbps, and the pictures do not look like 8K.

              Lenovo
              On their site the bitrates are either 45 or 90 Mbps; on SLR they are 30 or 90.

              It doesn't look like 8K because I cropped everything except the face so it's easier to compare, but it is 8K; you can download the scene in question and compare.

                mirdumar Thanks, I am not questioning your opinion that a higher bitrate is better.

                To me, the gigantic original files are hit and miss as to whether I can see the difference on my Reverb G2. That being said, the very best original-file videos I've ever seen have been a few SLR Originals in 8K. One in particular still blows me away: it has popping colors and great background and foreground resolution. Not sure what it is specifically about that scene, but I pasted it below.

                https://www.sexlikereal.com/scenes/laundry-day-27037

                Note: I have a G2 connected to an RTX 4080 system. If I remember correctly, this original file stuttered badly on a Q2 wireless, so processing power and cable connection are likely limiting factors.

                  Ventriloquist_Tacos That's because that's one of the SLR scenes shot with the Canon R5 C. This is the same camera VrAllure and FuckPassVR use (among others).
                  Even though the camera seems to produce varying results (it's apparently very hard to shoot with), when done right it has the absolute best image clarity/quality. It's also a true 8K camera (8192x4096, or 8000x4000 for the SLR scenes).
                  So, yeah, I agree, I noticed that too.

                  To make it easier for you guys: the Zcam K1 Pro and K2 Pro have exactly the same resolution. The main difference is an amateur vs. professional sensor, which gives you a much better image at almost the same specs (the bitrate is still higher there, but that's secondary).

                  Our great @Sandi_SLR is working on a post outlining the whole bitrate business. I can assure you, you are going to be surprised.

                    mirdumar Using GIFs to highlight the differences isn't good, because GIFs have a very limited color palette.

                    Rakly3 Increasing the bitrate of our files will not automatically translate into better image quality.

                    Yes, it will (unless you intentionally use some stupid settings to increase the bitrate without a corresponding increase in quality).

                    Rakly3 Changing from
                    I-B-P-B-I
                    to
                    I-P-P-P-I
                    can increase quality with a smaller increase in bitrate than simply doubling or tripling the bitrate.

                    B-frames have better coding efficiency, especially for VR content. Replacing B-frames with P-frames will increase the bitrate required to achieve the same quality, so chances are your example would decrease quality compared to just keeping B-frames and increasing the bitrate.

                    Rakly3 If we increase our bitrate, then certain users will have to stream the 6K version instead of the 8K version, because their internet is too slow for the higher-bitrate 8K.

                    So the majority of customers should suffer with low-quality 8K because a few people with dial-up-speed connections want to stream 8K? Makes total sense... BTW, the median internet connection speed in the US (probably SLR's biggest market) is way higher than the appallingly low 22 Mbps SLR is encoding 8K videos at.