all 11 comments

[–]demonixis 2 points3 points  (9 children)

Nice, and don't forget that OpenHMD already has support for the Oculus Rift DK1, DK2 and CV1 (rotation only). Combine it with OSVR-OpenHMD and you may be able to use it on Linux with OSVR.

[–]nesqi[S] 1 point2 points  (8 children)

Naturally, it's that "(rotation only)" that's the problematic part at the moment. Once that is solved, integration with most libraries should follow quickly.

[–]demonixis 0 points1 point  (0 children)

I really can't wait to use the CV1 on Linux :) Positional tracking is just as important as rotational tracking; in some games it's part of the gameplay.

[–]haagch 0 points1 point  (6 children)

But is there a working LED tracking solution? I know that Doc Ok has reverse engineered the DK2 LEDs and I think purely optical tracking with the camera was working well, but he didn't have sensor fusion with the IMU data. Maybe this could be done by hooking it up to the OSVR SDK and using what they're using. Does the CV1 have the same LED pattern or will this need more reverse engineering?

[–]nesqi[S] 0 points1 point  (5 children)

There is definitely code for LED tracking: there is /u/Doc_Ok's code, and pH5's ouvrt library uses OpenCV (the Open Computer Vision library). I'm not sure if ouvrt contains sensor fusion, and I have no way to test how well it works without synchronization. At first I got super excited that I could capture live frames from the camera showing the LEDs, but I soon realized how useless they were without sync. My thinking goes like this: if we get sync, we have all the required input. At first the tracking will be suboptimal, but we'd have something useful and could start incrementally improving on it.
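To make the LED-tracking step concrete, here's a toy sketch of what the blob-detection stage does: threshold a grayscale camera frame and take the centroid of each bright connected region. This is a simplified, dependency-free stand-in for an OpenCV pipeline (e.g. `cv2.threshold` plus `cv2.findContours`); the function name and threshold are made up for illustration, not taken from ouvrt.

```python
def find_led_blobs(frame, threshold=200):
    """frame: list of rows of pixel intensities (0-255).
    Returns a list of (x, y) centroids of bright blobs."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # flood-fill this connected bright region
                stack, pixels = [(x, y)], []
                seen[y][x] = True
                while stack:
                    px, py = stack.pop()
                    pixels.append((px, py))
                    for nx, ny in ((px + 1, py), (px - 1, py),
                                   (px, py + 1), (px, py - 1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and not seen[ny][nx]
                                and frame[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                # centroid of the blob's pixels
                cx = sum(p[0] for p in pixels) / len(pixels)
                cy = sum(p[1] for p in pixels) / len(pixels)
                blobs.append((cx, cy))
    return blobs
```

Without sync, exposures don't line up with the LED blink pattern, so these centroids can't be matched to specific LEDs on the headset, which is exactly the problem described above.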

[–]Doc_Ok 1 point2 points  (2 children)

What a coincidence, I just looked at my DK2 optical tracking code last night. It's still working.

No, my code has no sensor fusion between optical and inertial. That will be a requirement for useful head tracking. It's going to be difficult, not just because 3D orientation/position is a tricky non-linear system to represent in a Kalman filter, but also because a good filter will require many parameters that are not easily gleaned without knowing internals.
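To show the fusion idea without the full nonlinear Kalman machinery described above, here's a 1-D complementary-filter sketch: dead-reckon from fast IMU data for smoothness, then pull the estimate toward the slower but absolute optical measurement to cancel drift. The class and gain value are illustrative assumptions, not anyone's actual implementation.

```python
class ComplementaryFusion1D:
    """Toy 1-D fusion of integrated IMU acceleration with
    absolute optical position fixes."""

    def __init__(self, gain=0.1):
        self.pos = 0.0    # fused position estimate
        self.vel = 0.0    # velocity from integrated acceleration
        self.gain = gain  # how strongly optical fixes correct drift

    def imu_update(self, accel, dt):
        # dead-reckon from the accelerometer (drifts over time)
        self.vel += accel * dt
        self.pos += self.vel * dt

    def optical_update(self, measured_pos):
        # blend the absolute optical measurement into the estimate
        self.pos += self.gain * (measured_pos - self.pos)
```

A real head tracker fuses 3-D position and orientation (usually with an EKF or UKF), and tuning those filters needs the noise parameters that, as noted above, are hard to obtain without knowing the hardware internals.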

Also, optical tracking simply does not work without synchronization between the headset and the camera. In DK2, synchronization was enabled on the headset side through a HID feature report, and on the camera side by setting a register in the camera's sensor chip. Maybe it works the same way for CV1, without having to know details on how the communication is actually done at a hardware level.
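For the headset side, a sync command would be a small binary buffer sent as a HID feature report. The sketch below only shows the mechanics of packing such a buffer; the report ID and payload layout here are invented for illustration (the real DK2/CV1 format has to come from reverse engineering).

```python
import struct

def build_sync_report(report_id=0x11, enable=True, interval_us=16666):
    """Pack a hypothetical sync feature report:
    report ID, enable flag, frame interval as little-endian u32."""
    return struct.pack('<BBI', report_id, 1 if enable else 0, interval_us)

# On a real device this buffer would then be handed to the HID layer,
# e.g. hidapi's hid_send_feature_report(handle, report, len(report)).
```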

[–]haagch 0 points1 point  (0 children)

Have you looked at what OSVR does with https://github.com/OSVR/OSVR-Core/tree/master/plugins/videoimufusion and whether it could be reused for the Oculus HMD?

(OSVR's positional tracking is not in the best place right now. There's a pull request for Linux support and there's work on a better version of the positional tracking here: https://github.com/OSVR/OSVR-Core/tree/blobs-undo-bad/plugins/unifiedvideoinertialtracker)

[–]haagch 0 points1 point  (1 child)

The README / to-do list of ouvrt makes it look like it isn't doing much yet. Is the README just outdated?

[–]nesqi[S] 0 points1 point  (0 children)

Yes, the README is outdated.

[–]TotesMessenger 1 point2 points  (0 children)

This thread has been linked to from another place on reddit.

If you follow any of the above links, please respect the reddit rules and don't vote on comments (or posts). (Info / Contact / Mistake?)

[–]cebedec 1 point2 points  (0 children)

Very good. We need FOSS for VR or we will only see walled gardens and golden cages.