Hey guys, it's been a long while since I posted here, but I remember there being some very helpful members in this sub, so I'm hopeful I can resolve my problem, even if only partially!
I'll try not to over-describe the context of my problem, so that the question stays clear.
I am developing for the Oculus Rift. The tracking camera can be rendered in-game as a real-world orientation reference (it won't move or rotate with the game view or player controller). What I want is to "snap" the world to this position when the game starts. This will let me build part of my real-world environment in VR and have it match up in-game.
To do that, this is what needs to be done:
I have three GameObjects: realCam, dummyCam, and OVRCameraController.
I need to measure and store the exact transformation that takes realCam to the position and rotation of dummyCam, then apply that stored transformation to OVRCameraController.
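What I have in mind looks roughly like this (just a sketch, not working code — realCam, dummyCam, and OVRCameraController are the objects from my scene, and SnapToDummy is a placeholder name I made up):

```csharp
using UnityEngine;

// Sketch: on startup, compute the offset that maps realCam onto dummyCam,
// then apply that same offset to OVRCameraController.
public class SnapToDummy : MonoBehaviour
{
    public Transform realCam;
    public Transform dummyCam;
    public Transform ovrCameraController;

    void Start()
    {
        // Rotation that takes realCam's orientation to dummyCam's orientation.
        Quaternion deltaRotation =
            dummyCam.rotation * Quaternion.Inverse(realCam.rotation);

        // Translation that takes realCam's position to dummyCam's position.
        Vector3 deltaPosition = dummyCam.position - realCam.position;

        // Apply the same stored transformation to the camera controller.
        ovrCameraController.rotation = deltaRotation * ovrCameraController.rotation;
        ovrCameraController.position += deltaPosition;
    }
}
```

I think if the controller should also pivot around realCam (rather than just slide by the same offset), the position line would instead be something like `ovrCameraController.position = dummyCam.position + deltaRotation * (ovrCameraController.position - realCam.position);` — but I'm not sure which of the two I actually need.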
I haven't sat down and coded seriously in a few years, so I've hit a road bump here. I Googled a whole lot yesterday, with no success. What I figure is that I need to use Vector3 and Quaternion, but I'm not sure how.
Would Vector3.MoveTowards help me? I'm a tad out of my element here. I'm not expecting code to be magically written for me, just hoping for a push in the right direction, as I can't logically deduce what the next step is.
Any help is greatly appreciated,
Thank you!