doublevr The Max Q file is 4x larger but bitrate and fps are the same? Why is that?
Bitrate comparison test
Yeah, despite being 4x larger the bitrates are the same. MediaInfo shows "Stream size: 259 MiB (30%)" for the MAX_Q file and "Stream size: 259 MiB (99%)" for the 30M file. Since they have the same stream size, I would assume the files are the same, only the MAX_Q one has some junk data in it or something.
MediaInfo also shows that they have the same encoding settings, and for MAX_Q it shows "IsTruncated: Yes", which only happens when a file is not the expected size, is corrupted, etc.
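A quick way to check this yourself is to compare the video stream size against the total file size, which is exactly what MediaInfo's percentage means. A minimal sketch using the pymediainfo wrapper; the file names are placeholders and the available fields depend on your MediaInfo build:

```python
# Compare video stream size vs. total file size to spot appended junk data.
# Requires: pip install pymediainfo (needs the MediaInfo library installed).
import os
from pymediainfo import MediaInfo

def stream_vs_file_size(path):
    file_size = os.path.getsize(path)
    for track in MediaInfo.parse(path).tracks:
        if track.track_type == "Video" and track.stream_size:
            stream_size = int(track.stream_size)  # bytes
            pct = 100 * stream_size / file_size
            print(f"{path}: video stream {stream_size / 2**20:.0f} MiB "
                  f"of {file_size / 2**20:.0f} MiB file ({pct:.0f}%)")

stream_vs_file_size("MAX_Q.mp4")  # hypothetical file names
stream_vs_file_size("30M.mp4")
```

If both files report the same stream size but very different percentages, the larger file is mostly non-video data.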
SchnuppiLilac He's trying to play us for fools again. The videos are totally identical, but the MAX_Q one has exactly 600 MB of random data appended to the end of it to make it bigger.
Hey guys, we went into full research mode today, trying to figure out the difference between the original file and the max streaming one that we provide for download. The difference is there, but minuscule.
The example we used, granted, did not have a full frame of moving things, so let's say it was average content.
We will continue to test on "harder" examples like some harem scene, where the girls are all around the camera.
Will keep you updated with our findings.
Also, we have a memory leak when playing original files. We will fix this, so you can watch them too.
Can I ask who we are actually addressing here: the people watching downloaded files, or the people watching porn by streaming?
Sandi_SLR Both, hopefully. I'd like to see downloaded files at their maximum visual fidelity, and streaming greatly improved.
g2kbuffetboy We're going to have a lot of complaints and cancellations, I guess.
There are users who don't even have the option of a connection fast enough for 30 Mbps.
Sandi_SLR Netflix has a tool for quantifying image quality.
https://github.com/Netflix/vmaf
It should close the debate about bitrate, because bitrate is no measure of quality. Claiming it is, is ridiculous.
You don't decide how good a cook is by how tall they are.
There are 1000 variables that influence this.
Most of all, the complexity of the image and the complexity level of the codec.
Both of these determine the macroblocks and motion vectors.
L6 encoding will yield a smaller bitrate than the same video encoded at L4, at equal quality.
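VMAF scoring can be run directly through ffmpeg if your build includes libvmaf. A minimal sketch, assuming such a build and placeholder file names (the distorted encode goes first, the reference second):

```python
# Score a 30 Mbps encode against the original using ffmpeg's libvmaf filter.
# Assumes ffmpeg compiled with --enable-libvmaf; file names are placeholders.
import subprocess

def vmaf_score(distorted, reference, log_path="vmaf.json"):
    subprocess.run([
        "ffmpeg",
        "-i", distorted,    # first input: the encode under test
        "-i", reference,    # second input: the reference/original
        "-lavfi", f"libvmaf=log_fmt=json:log_path={log_path}",
        "-f", "null", "-",  # decode and score only, write no video output
    ], check=True)

vmaf_score("scene_30mbps.mp4", "scene_original.mp4")
# The aggregate VMAF score (0-100) ends up in vmaf.json.
```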
Sandi_SLR Can I ask who we are actually addressing here: the people watching downloaded files, or the people watching porn by streaming?
Me, personally, 98% downloads right now. But I certainly wouldn't mind better quality for streaming since quality is pretty much all that counts for me. Best quality in streaming might convince me to use more streaming in the future tbh.
Setting up VMAF is not the easiest thing in the world, and you need to make complete decompressed YUV versions of the videos to use it correctly.
But you can use it on this web page with your mp4 files!
Comparing full scenes will take forever this way, though. I suggest using Shutter Encoder (free) and its Cut Without Re-encoding feature to take some excerpts from the scenes and compare those.
VMAF is a well-known tool made by Netflix to objectively measure the quality difference between encodes of the same video.
- Cut Without Re-encoding is the very first function in the list of the Shutter Encoder app.
IMPORTANT!
Then, in the editor, I suggest setting the start and end times by typing them in.
- Lower left, In point
- Lower right, Out point
The app will automatically pick the correct keyframe so it can avoid re-encoding!
EXTRA IMPORTANT!
Also in the editor, in the upper left, click Image adjustment. It is always enabled by default. Even though all its settings are at their defaults, I always make sure to disable it, just in case!
Then in the upper right of the editor, click Apply, followed by Start Function.
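For anyone who prefers the command line, the same lossless excerpt can be made with ffmpeg's stream copy. A minimal sketch with placeholder file names and timestamps; like Shutter Encoder, ffmpeg aligns the cut to a keyframe when copying streams:

```python
# Cut an excerpt without re-encoding (stream copy), the ffmpeg equivalent
# of Shutter Encoder's "Cut Without Re-encoding". Placeholders throughout.
import subprocess

def cut_excerpt(src, dst, start="00:05:00", duration="60"):
    subprocess.run([
        "ffmpeg",
        "-ss", start,    # in point; seeking before -i lands on a keyframe
        "-i", src,
        "-t", duration,  # excerpt length in seconds
        "-c", "copy",    # copy audio/video bitstreams, no re-encode
        dst,
    ], check=True)

# Cut the same window from both versions so the comparison is fair.
cut_excerpt("scene_original.mp4", "excerpt_original.mp4")
cut_excerpt("scene_30mbps.mp4", "excerpt_30mbps.mp4")
```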
Rakly3
You replied here yesterday that bitrate means nothing; later you deleted the post.
That is just not true.
Higher resolution = more information. How else are you going to transmit that information other than by increasing the bitrate?
6K: 5800x2900 = 16,820,000 pixels
8K: 8000x4000 = 32,000,000 pixels
So you are telling us that doubling the number of pixels doesn't also require increasing the bitrate?
If that were true, Netflix would use the same bitrate for all resolutions, but they don't: they use about 6 Mbps for 1080p and around 15 Mbps for 2160p.
Also, Netflix is not known for picture quality; almost all other streaming services have better picture quality.
And I don't need some tool to tell me the difference between encodes when I can see it with the naked eye. For 6K the difference between 30 Mbps and the high-bitrate file is almost negligible (except in some rare cases like outdoor shots with moving foliage or low-light scenes), but for 8K that difference is much bigger and I can clearly see it.
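To put numbers on the scaling argument: at a fixed bitrate, every added pixel gets fewer bits. A small sketch of the bits-per-pixel math using the resolutions above (the 60 fps frame rate is an assumption for illustration):

```python
# Bits available per pixel per frame at a fixed bitrate.
# Resolutions are from the post above; 60 fps is an assumed frame rate.
def bits_per_pixel(width, height, bitrate_bps, fps=60):
    return bitrate_bps / (width * height * fps)

for label, w, h in [("6K", 5800, 2900), ("8K", 8000, 4000)]:
    bpp = bits_per_pixel(w, h, 30_000_000)  # 30 Mbps cap
    print(f"{label}: {w * h:,} pixels -> {bpp:.4f} bits/pixel/frame")

# Output:
# 6K: 16,820,000 pixels -> 0.0297 bits/pixel/frame
# 8K: 32,000,000 pixels -> 0.0156 bits/pixel/frame
```

At the same 30 Mbps cap, the 8K encode gets roughly half the bits per pixel, which is consistent with compression artifacts being more visible there.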
Here is a comparison between 8K at 30 Mbps and the original 8K at 90 Mbps. The scene is Aubree Valentine's "Consumated Passion" from VRHush.
I selected two random frames and cropped out everything besides Aubree's face.
Here they are as a GIF so they are easier to compare, and individually below.
[Images: the two frame crops, each at 30 Mbps and 90 Mbps]
mirdumar You replied here yesterday that bitrate means nothing; later you deleted the post.
I can't make posts that are hidden; I have to post them first, then hide them. It was meant for certain people only.
What I said, though, is that bitrate is no measure of quality.
Increasing the bitrate of our files will not automatically translate into better image quality.
That is not the same as saying bitrate doesn't matter. There is a point of diminishing returns.
Quality can also be increased by changing encoding settings, with a much larger impact than simply raising the quality or bitrate settings. Changing the frame structure from
- I-B-P-B-I
to
- I-P-P-P-I
can increase quality with a smaller increase in bitrate than simply doubling or tripling the bitrate.
2-pass encoding increases quality without increasing the bitrate, but at double the electricity cost, time cost, etc.
Is the 2-pass quality difference enough to warrant the extra cost? At the scale of SLR, definitely not.
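For reference, here is a minimal sketch of how those two knobs look as x265 options driven through ffmpeg. The bitrate, file names, and the choice of bframes=0 to force an I-P-P-P structure are illustrative, not SLR's actual pipeline settings:

```python
# Illustrative x265 settings via ffmpeg: no B-frames (I-P-P-P...) plus a
# 2-pass encode. Bitrate and file names are placeholders.
import subprocess

SRC = "scene_original.mp4"

def encode(x265_params, output_args):
    subprocess.run([
        "ffmpeg", "-y", "-i", SRC,
        "-c:v", "libx265", "-b:v", "30M",
        "-x265-params", x265_params,
    ] + output_args, check=True)

# Pass 1: analysis only; stats go to x265's 2-pass log, video is discarded.
encode("bframes=0:pass=1", ["-an", "-f", "null", "-"])
# Pass 2: re-encodes using the pass-1 stats to distribute bits better.
encode("bframes=0:pass=2", ["scene_2pass.mp4"])
```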
The library used to encode has a huge impact on quality. Try encoding the same video with the same settings and bitrate in HEVC with
- x265
- NVENC
- Quick Sync
- VCE
The difference between the first two and the last two is quite large.
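A sketch of that experiment, assuming an ffmpeg build that includes all four HEVC encoders (each hardware encoder also needs the matching GPU/CPU); file names and bitrate are placeholders:

```python
# Encode the same clip with each HEVC encoder for a side-by-side comparison.
# Requires an ffmpeg build with these encoders and the matching hardware:
# hevc_nvenc (NVIDIA), hevc_qsv (Intel Quick Sync), hevc_amf (AMD VCE/AMF).
import subprocess

ENCODERS = {
    "x265": "libx265",      # software
    "nvenc": "hevc_nvenc",  # NVIDIA
    "qsv": "hevc_qsv",      # Intel Quick Sync
    "vce": "hevc_amf",      # AMD
}

for name, codec in ENCODERS.items():
    subprocess.run([
        "ffmpeg", "-y", "-i", "scene_original.mp4",
        "-c:v", codec, "-b:v", "30M",
        f"scene_{name}.mp4",
    ], check=True)
# Score each output against the original with VMAF to quantify the gap.
```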
I've restored the hidden posts.
To shed some more light on my comments, before they get turned into something I didn't say:
There are other aspects that we have to take into account as well. Such as reaching a broad userbase.
If we increase our bitrate, then certain users will have to stream the 6K version instead of the 8K version, because their internet is too slow for the higher-bitrate 8K.
Let's say our new 6K version equals, quality-wise, our old 8K version. People are going to unsubscribe and demand refunds because they can't use the new 8K stream.
The 'K', marketing-wise, has a much bigger impact generally speaking than the bitrate.
Rakly3 I love this post. You have very valid concerns that I agree with. As someone who mostly streams, I would not want any buffering of my videos.
Could we possibly offer two different 8k versions, one optimised for smooth streaming and the other for downloads/those with extremely fast internet?
Rakly3
I'm honestly interested: Who are these options for?
1. That's the actual "regular file" with a max of 30,000 kbit/s bitrate. I guess the HUUUUUUUGE majority is either streaming this one or downloading it.
2. Who is actually streaming or downloading 6K+ scenes in 4K anymore? According to your own research, 90%+ of your customer base has a Quest 2 or a better headset that has zero issues rendering the 6K+ file. Why do the few people on a Quest 1 or Go get an extra download option, but all the people on the major and technically much more advanced headsets have to deal with 30,000 kbit/s max?
3. Who the fuck downloads h.264 files today? Two people on the planet? Why is this even an option today?
4. It's cool that there is a downloadable raw file, but I doubt it's downloaded a lot. It should imho stay, though, for the complete enthusiasts on PIMAX etc. or those who want to edit/cut their own video material from the original file.
IMHO you should at least kick option 3, rename "max 6k+" to "optimized 6k" (= max 30,000 kbit/s, for optimized streaming and smaller download files), and then add a new, actual max 6k+ option with an ideal bitrate for maximum quality (less than the untouched "original" file at 120,000 kbit/s or whatever, but more than the capped 30,000 kbit/s file we have right now).
It should close the debate about bitrate, because bitrate is no measure of quality. Claiming it is, is ridiculous.
Seriously, that's apples and oranges. Like 100% apples and oranges.
It's like saying "Bitrate doesn't matter because resolution matters much more".
Well, an 8K picture will very likely look crisper than a 720p picture. No bitrate increase will ever make the 720p picture crisper than the 8K one. So yeah, you're kind of right, but...
...your statement also misses the whole point though...
The question at hand is not whether something else could be more beneficial for picture quality, but whether a higher bitrate, without changing anything else, would result in better quality. And well, yes, it does, especially if your starting bitrate is pretty small.
So no, bitrate is no measure of quality. But a higher bitrate can naturally improve the quality of a given video if you don't change anything else about it.
Rakly3 Let's say our new 6K version equals, quality-wise, our old 8K version. People are going to unsubscribe and demand refunds because they can't use the new 8K stream.
While I understand that argument, I don't think it's a very strong one. You could still make the "optimized for streaming" 8K version the standard version for streaming, the option to go to for the masses. Or you could offer presets so that everybody could choose which version should be their default.
I think you're using the same flawed logic as some game companies who try to fool people into thinking their newest PC games can't offer the best visual quality because people with older rigs would be alienated by not being able to play on ultra settings. That has been proven bollocks so many times in gaming history that it's basically a running joke by now. PC games offer different quality presets and you choose the one that matches your hardware. If you want better tech, upgrade your tech. The same applies here: if you want the best experience, buy the best headset, invest in the best internet, or just use an optimized, reduced version and be happy with it. There are options for everyone, depending on personal investment and personal circumstances.
So I really don't get the fear of lost sales here. It's much more of a scapegoat, imho.