I have a friend who recently underwent major brain surgery. As a result, his eyes no longer track together, and he is suffering from debilitating double vision. He tells me that his right-side vision is tilted 45 degrees from the left (which has mostly normal vision). This constant clash of two images, offset by 45 degrees, is bothering him more than any other symptom.
It made me wonder if I could adjust the virtual camera position for his right eye so that his eyes would be seeing the same image. He has some prism glasses that do this to a certain extent.
Are there any exposed knobs for adjusting the tilt and direction of one of the "virtual cameras" in the Rift headset? At the very least, I'd like to remove the 45-degree tilt for him and see whether it gives him some respite. Even if he had to keep his gaze on a fixed point, I think he would welcome the ability to integrate his vision for short trips into VR. I also have another, more ambitious project in mind: using the Touch controllers to drive that adjustment of tilt and direction. He could focus on a fixed object, then rotate and tilt his other eye's field of view with the Touch controller until the view in his right eye matches that of his left.
I'm not looking for mechanical adjustment, but low-level adjustment of rendering position. Basically, there are two virtual cameras in the VR game that feed into your headset, each rendering its own perspective of the scene. Normally, the direction, position, and rotation of these virtual cameras are dictated by the position of the headset. I'd like to add some offset to the direction and rotation of one of the cameras.
Let's say you had a condition where your left eye was tilted a constant 10 degrees higher than your right eye. You could compensate for this by tilting the left virtual camera down by 10 degrees, which would center the scene for both eyes. This is a simple example, but the idea is sound: correct for an eye that points in the wrong direction by offsetting the POV camera for that eye.
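The core of the idea can be sketched in a few lines. This is only an illustration of the math, not any actual Rift API: a per-eye view rotation is built by composing a fixed corrective roll (a tilt of the image about the view axis) with the head rotation reported by the tracker. The function names and the identity "head pose" here are made up for the example.

```python
import math

def roll_matrix(degrees):
    """3x3 rotation about the view (z) axis -- a 'tilt' of the image."""
    r = math.radians(degrees)
    c, s = math.cos(r), math.sin(r)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def mat_mul(a, b):
    """Plain 3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def eye_view_rotation(head_rotation, corrective_roll_deg):
    """Per-eye view rotation: corrective offset composed with head pose.

    head_rotation: 3x3 rotation from the headset tracker (assumed input).
    corrective_roll_deg: fixed tilt to cancel, e.g. -45.0 to undo a
    view that is tilted +45 degrees.
    """
    return mat_mul(roll_matrix(corrective_roll_deg), head_rotation)

# Example: the right eye's image is tilted 45 degrees, so render its
# camera with a -45 degree roll offset; the left eye gets no offset.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
right_view = eye_view_rotation(identity, -45.0)
left_view = eye_view_rotation(identity, 0.0)
```

In a real engine the same composition would happen per frame, with `head_rotation` coming from the tracking system and the offset applied to only one eye's camera before rendering.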
In an ideal solution, one could push live eye-tracking information into a function that would slave the affected eye to the healthy one. Basically, figure out what the healthy eye is pointing at, and then center that point for the other eye. One could even compensate for an extremely jittery eye with the same sort of algorithms we use for image stabilization in cameras. But the real question here is whether the controls for the virtual cameras are exposed outside proprietary driver code.
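As a rough sketch of that slaving idea, assuming a hypothetical eye tracker that reports each eye's direction as yaw/pitch angles in degrees: each frame, take the difference between the healthy and affected eye directions as the corrective offset, and run it through an exponential low-pass filter as a crude stand-in for image-stabilization-style smoothing. All the inputs here are simulated; nothing below is a real SDK call.

```python
def smooth(prev_deg, measured_deg, alpha=0.5):
    """Exponential low-pass filter: damps jitter in the measured
    correction, a crude analogue of camera image stabilization."""
    return prev_deg + alpha * (measured_deg - prev_deg)

def corrective_offset(healthy_yaw_pitch, affected_yaw_pitch):
    """Per-frame offset (yaw, pitch in degrees) that would re-aim the
    affected eye's virtual camera at whatever the healthy eye fixates.
    Inputs are hypothetical eye-tracker readings."""
    hy, hp = healthy_yaw_pitch
    ay, ap = affected_yaw_pitch
    return (hy - ay, hp - ap)

# Simulated frames: healthy eye steady at 0 degrees, affected eye
# jittering around a constant 45-degree error.
offset_yaw = 0.0
for measured in (44.0, 46.5, 45.2, 44.8, 45.1):
    raw_yaw, _ = corrective_offset((0.0, 0.0), (measured, 0.0))
    offset_yaw = smooth(offset_yaw, raw_yaw)
# offset_yaw settles toward -45 degrees; that smoothed value is what
# you would feed into the affected eye's camera each frame.
```

The filter constant `alpha` trades responsiveness against stability; a jittery eye would want a smaller value, at the cost of the correction lagging behind real gaze shifts.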
It's possible if you're a developer, but it can't be done with existing apps on the store. As a developer, you could write your own demo and manually adjust the two virtual cameras in any way you like, but this would only work in your own custom apps.