renstimpy11 yes and no. I'm not sure if it is real-time AI image processing, but I think v81 is letting AI make depth-aware passthrough refinements based on the volume of historical data it has likely collected about my environment from the front-facing cameras. I also think it is doing this with color balance, and possibly scene anchors. I have established usage patterns with the Quest 3 around my house, and in those familiar locations the passthrough quality on v81 has been far better than when I "stress test" it by moving to a brand new location.
I think that with this historical data, plus the new data I provide with every use, AI is likely making local on-device adjustments via the chipset, and Meta is also using the data more globally to shape new OS updates.
Interestingly, like you, I barely noticed any difference immediately after upgrading to v81. It was very subtle at first, but the improvements picked up rapidly with use. I've also had 4 new v81 builds pushed to my headset over the span of about 3 weeks. That feels unusually fast to me, even on the PTC, and the quality has improved significantly with each new build.
I have only used HereSphere since updating to v81. I only use Deo when traveling (to stream). I would imagine it would work with any video player, and might even work best with whichever player you use most. But, again, I really don't know; these are all predictions based on my own experience.
I really believe this is just the beginning. We will probably see improvements in passthrough latency, smarter auto-calibration of 3D video and passthrough video to match the user's environment (Meta kind of alluded to this at Connect 25), better dynamic relighting in passthrough, and maybe context-aware blending. My prediction is that we will see all of these before any new hardware is released, and that with AI there is probably another 15-25% of headroom left to gain from OS updates alone.