4c1dr3f13x It wasn't about semantics, really. 😉 I genuinely wasn't sure, because decoding can be taxing on the CPU too. The page you linked to also makes a clear distinction between encoding and rendering. It's of course also correct that previews are rendered, else you wouldn't see much 🙂 They just aren't encoded or decoded, but rather stored in system memory. -- I'm talking about video editing as in Adobe, not previews in encoding apps like HandBrake. Those will also output an encoded file, because the file, metadata, bitrate and size are all part of the preview in that case. -- The rendering in an editing app is mostly about applying filters/effects, which have to be rendered by the GPU, not encoded by the codec.
I swear it's not semantics! 🙂
To answer the question: it depends on the codec and on whether you are encoding on 'specialized' hardware or not. I'm technically talking about GPUs, though also not quite: there is hardware that only does encoding and no rendering, and that would not be a GPU. The GPU applies (calculates) the filters/effects. -- The reverse is also true. The chip in the Quest 2, for example, is most likely less powerful than your CPU, yet it can decode 8K in real time. Your CPU maybe can't.
(If you have a Threadripper / EPYC or something, then that's different.)
So if you are applying any filters while encoding on your CPU, then yes, this is going to take a long time. These can be 'simple' things like denoise, deblock or sharpen. Some filters will ofc be more taxing than others.
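To make that concrete, here's a minimal sketch assuming ffmpeg is the encoder. The filter names (hqdn3d, deblock, unsharp) are real ffmpeg filters, but the specific settings and filenames are just placeholders, not recommendations:

```python
# Sketch: an ffmpeg command where CPU-side filters run on every frame
# before the encoder ever sees it. Settings/filenames are illustrative.
filters = ",".join([
    "hqdn3d=4:3:6:4",   # denoise (one of the more CPU-heavy filters)
    "deblock",          # deblock
    "unsharp=5:5:1.0",  # sharpen
])
cmd = [
    "ffmpeg", "-i", "input.mp4",
    "-vf", filters,      # every decoded frame passes through this chain first
    "-c:v", "libx264",
    "output.mp4",
]
print(" ".join(cmd))
```

Each filter in the `-vf` chain touches every pixel of every frame, so stacking several of them multiplies the per-frame CPU work on top of the encode itself.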
I'm not sure exactly what is meant by HDR10 being faster than 8-bit RGB on a CPU. Do you have something to link me to for that? Thanks!
The reason I say it's not semantics is that it's important we understand each other and don't use different definitions. That avoids a lot of conflict and unintentional misinformation.
Whose definition we use is not the point. That would be semantics 🙂 Just that we use the same one.
Going back to the CPU: fewer cores doesn't only mean it can do less work at the same time, it also means you can buffer fewer frames at once. Your system will have to "shuffle" things around more to collect all the data from multiple frames. Lowering your GOP, not using B-frames, or using only I-frames should speed things up at the cost of filesize. -- Studio files SHOULD be I-frames only. Though these can still be compressed (lossy) vs uncompressed RAW.
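For reference, those three speed-ups can be sketched as ffmpeg/libx264 invocations. This assumes ffmpeg (`-g` sets the GOP/keyframe interval, `-bf` the number of B-frames); the exact values and filenames are illustrative:

```python
# Sketch: encoder settings that trade filesize for lighter frame buffering.
# Assumes ffmpeg with libx264; values/filenames are illustrative.
base = ["ffmpeg", "-i", "input.mp4", "-c:v", "libx264"]

shorter_gop = base + ["-g", "30", "output.mp4"]   # keyframe at least every 30 frames
no_bframes  = base + ["-bf", "0", "output.mp4"]   # disable B-frames entirely
all_intra   = base + ["-g", "1", "output.mp4"]    # I-frames only (largest files)

print(" ".join(all_intra))
```

B-frames reference frames both before and after themselves, which is exactly the "collecting data from multiple frames" cost; an all-I-frame stream needs no such lookahead, which is why studio/intermediate formats favour it.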
'Compressed' also doesn't by definition mean loss of quality. Lossless compression only "loses" redundancy, not information, so when decompressed the data is still an exact copy. See the ZIP of a bitmap image from earlier.
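You can see this with any lossless codec. A quick sketch using Python's zlib (the same DEFLATE family as ZIP), on made-up sample data:

```python
import zlib

# Lossless compression: smaller on disk, yet it decompresses to an
# exact, bit-for-bit copy (analogous to zipping a bitmap image).
original = bytes(range(256)) * 64          # 16 KiB of repetitive sample "pixel" data
compressed = zlib.compress(original, level=9)
restored = zlib.decompress(compressed)

assert restored == original                # exact copy, zero quality loss
print(len(original), len(compressed))      # compressed size is much smaller
```

Lossy codecs (JPEG, most H.264 presets) instead throw information away permanently; that's the distinction that matters, not whether the file was compressed at all.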