I've said this a couple of times here: SLR should really focus on developing a feature that matches the video's overall lighting to the user's room lighting. I'm not talking about matching shadow directions; that would be too computationally expensive for a Quest 3 to do in real time (even for a PC). I just think it should match the overall color of my room, which changes with the time of day and with the warmth and intensity of my lights.
Up until recently this would've been impossible, since devs didn't have access to the Quest camera API, but now they do.
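Roughly what I imagine under the hood, as a minimal sketch in Kotlin: assume the app can already grab an RGB frame from the passthrough cameras (how that frame is fetched, and how the video renderer applies the result, are outside this sketch), then do a simple gray-world estimate of the room's tint and turn it into per-channel gains.

```kotlin
/**
 * Gray-world estimate of the room's ambient tint from a single passthrough
 * frame. `pixels` is assumed to be packed ARGB ints (e.g. from an Android
 * Bitmap); grabbing the frame from the headset cameras is out of scope here.
 */
data class Tint(val r: Float, val g: Float, val b: Float)

fun estimateRoomTint(pixels: IntArray): Tint {
    var sumR = 0L
    var sumG = 0L
    var sumB = 0L
    for (p in pixels) {
        sumR += (p shr 16) and 0xFF
        sumG += (p shr 8) and 0xFF
        sumB += p and 0xFF
    }
    val n = pixels.size.toFloat()
    // Normalize against green so the tint encodes warmth/coolness,
    // not overall brightness.
    val meanG = (sumG / n).coerceAtLeast(1f)
    return Tint((sumR / n) / meanG, 1f, (sumB / n) / meanG)
}

/**
 * Per-channel multipliers the video renderer could apply. `strength`
 * (0..1) is a made-up knob that blends toward neutral so the effect
 * stays subtle instead of fully re-grading the video.
 */
fun videoColorGain(tint: Tint, strength: Float = 0.5f) = Tint(
    r = 1f + (tint.r - 1f) * strength,
    g = 1f,
    b = 1f + (tint.b - 1f) * strength
)
```

The gains would then get multiplied into the video in the shader, ideally smoothed over a few seconds so the scene doesn't visibly pop when a lamp turns on or off in the room.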
Another feature that would make the scenes feel more immersive is an optional filter that matches the Quest 3 cameras' "graininess". Right now the videos look too clean compared to the passthrough feed, so a matching grain filter would help the video and the room blend together.
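The grain part doesn't seem complicated either. A toy sketch of the idea, again in Kotlin: in practice it would be a fragment-shader pass, and the `amount` value is just a placeholder that would have to be tuned (or measured) against the noise the passthrough cameras actually produce.

```kotlin
import kotlin.random.Random

/**
 * Toy CPU-side grain pass: adds zero-mean luminance noise to an ARGB frame
 * so the video's "cleanliness" gets closer to the passthrough feed.
 */
fun addGrain(pixels: IntArray, amount: Int = 8, rng: Random = Random.Default) {
    for (i in pixels.indices) {
        // Same offset on all three channels = luminance noise, which is
        // closer to how camera sensor grain reads than per-channel noise.
        val noise = rng.nextInt(-amount, amount + 1)
        val p = pixels[i]
        val r = (((p shr 16) and 0xFF) + noise).coerceIn(0, 255)
        val g = (((p shr 8) and 0xFF) + noise).coerceIn(0, 255)
        val b = ((p and 0xFF) + noise).coerceIn(0, 255)
        pixels[i] = (p and 0xFF000000.toInt()) or (r shl 16) or (g shl 8) or b
    }
}
```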