I have a few suggestions for how this could be solved with some active hardware/software:
Kissing distance issue:
Have a depth/distance sensor as a camera attachment (I'd guess VR cameras already capture this data, though I'm not sure) and add a simple logic check: if her face covers a certain area of the frame at a certain distance, activate, let's say, KISS MODE! During KISS MODE, if the detected face distance drifts outside the sweet spot, it should buzz, light up, or give some UI indication to the director, the cameraman, or directly to the actress. With the buzzer option she could correct it on the fly without even opening her eyes. A rough sketch of that logic is below.
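A minimal sketch of the logic check, assuming the face-area fraction and distance come from whatever detector/depth source is on the rig; the threshold numbers (30% of the frame, 25–45 cm sweet spot) are just placeholders I made up:

```python
# Rough sketch of the KISS MODE logic. All thresholds are made-up placeholders;
# face_area_frac and distance_m would come from the camera's face detector /
# depth sensor (see the distance-estimation sketch further down).

KISS_MODE_AREA_FRAC = 0.30   # face must cover ~30% of the frame to arm KISS MODE (guess)
SWEET_SPOT_M = (0.25, 0.45)  # assumed "perfect distance" band in metres

def kiss_mode_feedback(face_area_frac: float, distance_m: float) -> str:
    """Return a feedback signal: 'off', 'ok', 'too_close' or 'too_far'."""
    if face_area_frac < KISS_MODE_AREA_FRAC:
        return "off"              # face too small in frame -> KISS MODE not active
    near, far = SWEET_SPOT_M
    if distance_m < near:
        return "too_close"        # buzz / light: back off a little
    if distance_m > far:
        return "too_far"          # buzz / light: move in a little
    return "ok"

# Example: face fills 40% of the frame at 0.5 m -> "too_far", so the buzzer fires.
print(kiss_mode_feedback(0.40, 0.50))
```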
For the eye contact problem:
Train a lightweight AI model to flag, in the same way as above, when she looks away or stares at a single eyepiece during KISS MODE (rough sketch of the flag logic below).
In scenarios where she is further away (outside KISS MODE range), I don't think eye contact should be an issue.
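A sketch of how that flag could sit on top of whatever lightweight gaze/head-pose model gets trained; the estimate_gaze_yaw_deg helper and the 10° tolerance are pure assumptions on my part:

```python
# Sketch of the eye-contact flag during KISS MODE. The gaze model itself is the
# part that would need training; here it's stubbed out as estimate_gaze_yaw_deg(),
# a hypothetical helper returning horizontal gaze angle in degrees
# (0 = looking straight at the lens, positive = looking to her right).

GAZE_TOLERANCE_DEG = 10.0  # assumed tolerance before we call it "looking away"

def estimate_gaze_yaw_deg(frame) -> float:
    """Placeholder for the trained lightweight gaze/head-pose model."""
    raise NotImplementedError("plug the trained model in here")

def eye_contact_flag(frame, kiss_mode_active: bool) -> bool:
    """True if eye contact is broken while KISS MODE is active."""
    if not kiss_mode_active:
        return False                      # outside KISS MODE range we don't care
    yaw = estimate_gaze_yaw_deg(frame)
    return abs(yaw) > GAZE_TOLERANCE_DEG  # drifting toward one eyepiece or away
```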
As justsomedude101 mentioned, a live 2D video feed is available. Depth/distance and face-area data can be derived from that feed (sketch below).
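If there's no real depth sensor, a crude fallback is estimating distance from the face bounding-box width with the pinhole model. Here's a sketch using OpenCV's stock Haar face detector; the focal length and average face width below are ballpark assumptions, not calibrated values:

```python
import cv2

# Crude distance estimate from the 2D feed alone (pinhole model):
#   distance ≈ real_face_width * focal_length_px / face_width_px
# Both constants are ballpark assumptions for illustration only.
FOCAL_LENGTH_PX = 900.0    # assumed focal length of the camera in pixels
REAL_FACE_WIDTH_M = 0.15   # assumed average face width in metres

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def face_area_and_distance(frame):
    """Return (face_area_frac, distance_m) for the largest detected face, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # keep the largest face
    frame_h, frame_w = frame.shape[:2]
    area_frac = (w * h) / float(frame_w * frame_h)
    distance_m = REAL_FACE_WIDTH_M * FOCAL_LENGTH_PX / w
    return area_frac, distance_m
```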
Mind you, this has to run in real-time. Maybe someone can create a camera attachment for this.
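To give a feel for the real-time requirement, here's how the sketches above might be wired into a per-frame loop; the camera index and the ~30 fps budget are assumptions:

```python
import time
import cv2

# Hypothetical real-time loop tying the sketches above together
# (kiss_mode_feedback and face_area_and_distance are defined in the earlier blocks).
FRAME_BUDGET_S = 1.0 / 30   # assumed ~33 ms per frame

cap = cv2.VideoCapture(0)   # assumed camera index
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    start = time.monotonic()

    result = face_area_and_distance(frame)
    if result is not None:
        area_frac, distance_m = result
        signal = kiss_mode_feedback(area_frac, distance_m)
        if signal in ("too_close", "too_far"):
            print(signal)   # stand-in for the buzzer / light / UI cue

    # If processing blows the frame budget, the feedback lags and becomes useless.
    if time.monotonic() - start > FRAME_BUDGET_S:
        print("warning: dropped below real-time")
cap.release()
```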
Over time, once she gets used to the feedback, this shouldn't be a problem at all.
That's my two cents.