Linux 'putting your money where your mouth is' thread...

jherico Posts: 1,419
Nexus 6
edited October 2014 in PC Development
So, the Linux 'nagging thread' has kind of dragged on for a while. Oculus has pledged Linux support, but given no ETA on when we'll see positional tracking supported for DK2. It could be this week. It could be months.

However, I think that sitting around and waiting for Oculus to solve the issue kind of goes against the spirit of Linux anyway. If you want something done, you should be willing to do it yourself. To that end, I'm working on porting the current 0.4.2 SDK to Linux. I've managed to reverse engineer some of the code required to interact with the LEDs so that they can be turned on. The camera is natively supported. Right now the primary missing component to getting this done is the software for calculating a head pose based on an image from the camera.
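
For a sense of what the LED interaction looks like in practice, here is a minimal sketch of the enable step via hidapi. To be clear: the DK2 product ID, the feature report ID, and the payload bytes below are placeholders for illustration, not the actual reverse-engineered values.

#include <cstdio>
#include <hidapi/hidapi.h>

int main() {
    if (hid_init() != 0)
        return 1;

    // 0x2833 is the Oculus VR vendor ID; the DK2 product ID is assumed here.
    hid_device* dev = hid_open(0x2833, 0x0021, nullptr);
    if (!dev) {
        fprintf(stderr, "DK2 HID device not found\n");
        return 1;
    }

    // Hypothetical "tracking control" feature report: byte 0 is the report
    // ID, the rest would carry enable flags, exposure, etc. These bytes are
    // placeholders, not the real protocol.
    unsigned char report[7] = { 0x0c, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00 };
    if (hid_send_feature_report(dev, report, sizeof(report)) < 0)
        fprintf(stderr, "failed to send feature report\n");

    hid_close(dev);
    hid_exit();
    return 0;
}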

Unfortunately this kind of math is outside my field of expertise. So I'm putting a call out to the community to see if interested parties might be able to assist with this.

If you want a Linux SDK, don't want to wait for Oculus to get around to it, and you've got the skills, here's a video of the LEDs captured from the tracking camera on Linux:



You can download the original file from here: https://s3.amazonaws.com/Oculus/oculus_rift_leds.webm

If you can write C or C++ code that will take that video, or an image from that video, and turn it into a head pose, let me know and I'll work with it to produce a viable Linux SDK with positional tracking.
Brad Davis - Developer for High Fidelity
Co-author of Oculus Rift in Action


Comments

  • phr00t Posts: 31
    I wish you the best of luck! I am in dire need of a Linux SDK, because my main development machine runs Linux. Unfortunately, many people don't understand that not having a Linux SDK can also limit game development for Windows & Mac OS X, as in my case.

    Anyway, do you have a limited Linux SDK you could put into JOVR in the meantime? Something that would at least get rotational tracking & SDK-side distortion working? I'd like to continue Rift development with my game, 4089, but when I try to use it in "Rift mode", it simply crashes because no Linux libraries can be found in JOVR 4.2.0.
  • cybereality Posts: 21,760 Oculus Staff
    I don't want to stop you guys, but I'll be willing to bet the Linux SDK will come out long before you figure this out on your own. But please, if you want to hack away, go ahead.
    AMD Ryzen 7 1800X | MSI X370 Titanium | G.Skill 16GB DDR4 3200 | EVGA SuperNOVA 1000 | Corsair Hydro H110i
    PowerColor RX 480 x2 | Samsung 960 Evo M.2 500GB | Seagate FireCuda SSHD 2TB | Phanteks ENTHOO EVOLV
  • phr00t Posts: 31
    Also, I think I found FreeTrack's source code... not sure if it will be helpful at all. Written in Pascal:

    https://github.com/PeterN/freetrack/tree/master/Freetrack
  • phr00t Posts: 31
    cybereality wrote:
    I don't want to stop you guys, but I'll be willing to bet the Linux SDK will come out long before you figure this out on your own. But please, if you want to hack away, go ahead.

    Thank you for stopping by! I'm eagerly awaiting the Linux SDK release. Are you able to share if it will be included in the next SDK release?
  • cybereality Posts: 21,760 Oculus Staff
    I can't promise it will be in the next SDK release, but good progress is being made and it should be coming soon.
    AMD Ryzen 7 1800X | MSI X370 Titanium | G.Skill 16GB DDR4 3200 | EVGA SuperNOVA 1000 | Corsair Hydro H110i
    PowerColor RX 480 x2 | Samsung 960 Evo M.2 500GB | Seagate FireCuda SSHD 2TB | Phanteks ENTHOO EVOLV
  • cybereality wrote:
    I don't want to stop you guys, but I'll be willing to bet the Linux SDK will come out long before you figure this out on your own. But please, if you want to hack away, go ahead.
    how encouraging :lol:

    Are the LEDs supposed to flash in a specific pattern or are they supposed to be on continuously?

    In regards to the math: http://en.wikipedia.org/wiki/Delaunay_tessellation_field_estimator

    What I am thinking: calculate the DTFE for several known states of the HMD (this needs a Cartesian robot), then either use a curve-fitting tool to get a function representation for the coordinates, or store those in a LUT and interpolate.
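
    To illustrate just the "LUT and interpolate" half of that: a dense 3D grid of measured values queried with trilinear interpolation. What the axes and stored values would represent is left open here, and the grid is assumed to have at least two samples per axis.

    #include <algorithm>
    #include <vector>

    struct Lut3D {
        int nx, ny, nz;            // grid resolution per axis
        std::vector<double> data;  // nx*ny*nz samples, x varies fastest

        double at(int x, int y, int z) const {
            return data[(z * ny + y) * nx + x];
        }

        // fx, fy, fz are normalized coordinates in [0, 1].
        double sample(double fx, double fy, double fz) const {
            double gx = fx * (nx - 1), gy = fy * (ny - 1), gz = fz * (nz - 1);
            int x0 = std::min((int)gx, nx - 2);
            int y0 = std::min((int)gy, ny - 2);
            int z0 = std::min((int)gz, nz - 2);
            double tx = gx - x0, ty = gy - y0, tz = gz - z0;
            // Interpolate along x, then y, then z.
            double c00 = at(x0, y0,   z0)   * (1 - tx) + at(x0+1, y0,   z0)   * tx;
            double c10 = at(x0, y0+1, z0)   * (1 - tx) + at(x0+1, y0+1, z0)   * tx;
            double c01 = at(x0, y0,   z0+1) * (1 - tx) + at(x0+1, y0,   z0+1) * tx;
            double c11 = at(x0, y0+1, z0+1) * (1 - tx) + at(x0+1, y0+1, z0+1) * tx;
            double c0 = c00 * (1 - ty) + c10 * ty;
            double c1 = c01 * (1 - ty) + c11 * ty;
            return c0 * (1 - tz) + c1 * tz;
        }
    };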
  • brantlew Posts: 537 Oculus Staff
    Yeah, I don't want to burst bubbles here, but the tracking algorithm is quite involved, and ideally you would want to tap into the modulated LED signal to first identify the LEDs before attempting a pose reconstruction. Not to mention there are a whole slew of tie-ins to IMU data for prediction that really optimize the quality of the tracking. The video below gives a flavor of what is involved. I suspect the official Linux release will be out much sooner than a reverse-engineering of the tracking algorithm could be completed.

  • You know, you could share the precise geometric locations of the LEDs and thereby give us the possibility of a less ad-hoc approach. ;)
  • pixelminer Posts: 171
    Hiro Protagonist
    Well, even if we wait for the official Linux SDK, we still have the problem that the current Oculus software license is incompatible with the GPL, which hinders support in Blender, FlightGear, etc.

    An alternative, community-developed SDK may be what these projects need in order to use the Rift hardware.
  • jherico Posts: 1,419
    Nexus 6
    Guys, I appreciate the sentiments being expressed here, but I'd like to avoid turning this into a clone of the 'nagging thread'. Try to keep this thread focused on community-built tools and overcoming the problems we're faced with, not on another back-and-forth about Linux issues.
    Brad Davis - Developer for High Fidelity
    Co-author of Oculus Rift in Action

  • matus Posts: 66
    An open SDK for the Rift is a great idea. I think it's necessary if you wish to experiment with stand-up experiences, eye-tracking, or other stuff that Oculus refuses to support in their SDK.

    As already mentioned, it would be helpful to see the output of the camera while the LEDs are controlled by the Oculus SDK, and then reproduce the modulation. A 3D model of the relative LED locations is also necessary. Finally, you need an external sensor to test and validate the algorithm.

    Personally, I plan to ditch the cheap camera solution (that's what Oculus calls it in the talk linked above) altogether and use the PrioVR head sensor instead. Unfortunately, the delivery of PrioVR is also late - looks like no one in the VR industry is able to get things done on time :roll:
  • swsmith Posts: 15
    I also think an open VR SDK would be a great idea, given Oculus's move towards closed source. The first things we would need to implement camera-based positional tracking are:

    1) An LED numbering scheme and careful measurements of the XYZ LED positions on the headset, relative to some fixed point on the headset. If Oculus was feeling nice, they could provide us with this data.

    2) Some video with a mask for each frame indicating which LEDs are on in that frame. If we could get a capture of what Oculus is doing, we could use a similar scheme for modulating the LEDs' state to identify them.
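
    For illustration, here is the kind of data structure those two items would populate. All counts, names, and layouts here are hypothetical placeholders until someone measures the real headset.

    #include <array>
    #include <cstdint>
    #include <vector>

    struct LedModel {
        struct Led {
            uint8_t id;                 // index in the numbering scheme (item 1)
            std::array<float, 3> pos;   // XYZ in meters, headset frame (item 1)
        };
        std::vector<Led> leds;
    };

    // Per-frame annotation for the masked video (item 2): which LEDs are lit.
    struct FrameMask {
        uint32_t frame;                 // frame number in the capture
        std::vector<uint8_t> litLedIds; // ids of the LEDs lit in this frame
    };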
  • nuclear Posts: 68
    Lawnmower Man (or Woman)
    While indeed a LibOVR-quality full pose estimation might be a big and involved project (but not without merit due to the possibility of obtaining a full free software SDK out of it), I think that, as a more immediate goal, it would be quite easy to calculate a rough 3 DoF head position, by treating the LED image as a blob and using its 2D center position and projected screen size (to derive Z distance).
    We already have rotational data from the other sensors, so apart from correcting yaw drift, rotational information from the pose estimation is not strictly necessary.
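
    A minimal sketch of that blob approach using OpenCV; the threshold and the calibration constants are made-up placeholders that would have to be measured:

    #include <cmath>
    #include <opencv2/opencv.hpp>

    // Treat all lit LEDs as one bright blob: the centroid gives an X/Y ray,
    // the apparent blob size gives Z. All constants are placeholders.
    const double kFocalPx   = 700.0;  // focal length in pixels (placeholder)
    const double kRefSizePx = 120.0;  // blob diameter seen at kRefDistM
    const double kRefDistM  = 1.0;    // reference distance in meters

    bool estimatePosition(const cv::Mat& gray, cv::Point3d& pos) {
        cv::Mat bin;
        cv::threshold(gray, bin, 200, 255, cv::THRESH_BINARY); // LEDs are bright
        cv::Moments m = cv::moments(bin, true);
        if (m.m00 < 1.0)
            return false;                                  // no blob found
        double cx = m.m10 / m.m00, cy = m.m01 / m.m00;     // blob centroid
        double diameter = 2.0 * std::sqrt(m.m00 / CV_PI);  // equivalent circle
        double z = kRefDistM * kRefSizePx / diameter;      // size scales as 1/Z
        // Back-project the centroid through a simple pinhole model.
        pos = cv::Point3d((cx - gray.cols / 2.0) * z / kFocalPx,
                          (cy - gray.rows / 2.0) * z / kFocalPx,
                          z);
        return true;
    }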
    John Tsiombikas
    webpage - blog - youtube channel
  • How would one go about estimating an actual scope of work to achieve a robust, open-source positional-tracking solution?

    Because maybe if we could scope it out with more precision than faith-based beliefs... there could be a way to partition the problem more incrementally (and have it solved sooner).

    Also I question whether positional tracking is in fact just a matter of maths... the same way I question whether virtual reality is just a matter of binary numbers and photons. :)
  • Dear Oculus developers, please read this, and chime in with hints http://doc-ok.org/?p=1095. Thanks.
  • pH5 Posts: 3
    blackguest wrote:
    Dear Oculus developers, please read this, and chime in with hints http://doc-ok.org/?p=1095. Thanks.
    Thank you, that is a great write-up!

    I have done some USB tracing on a Windows machine and have figured out how to read the camera's EEPROM and how to enable the exposure synchronization. There is a block of data at 0x2000 in the EEPROM that I hope encodes the lens distortion parameters:
    00002000  00 89 97 e2 7a 00 01 00  09 00 fe 57 c1 af 00 00
    00002010  0c 00 63 1a dc 83 e7 ca  85 40 01 00 0c 00 f4 3c
    00002020  f4 46 b8 cb 85 40 02 00  0c 00 42 d2 37 0a 31 bf
    00002030  77 40 03 00 0c 00 4c 32  ee 7a fe ac 6c 40 04 00
    00002040  0c 00 f8 1a 08 80 b9 3d  e0 bf 05 00 0c 00 f4 1b
    00002050  eb e1 e0 f7 d4 3f 06 00  0c 00 fe a7 2f 24 67 4b
    00002060  3b 3f 07 00 0c 00 c1 6f  08 a3 8f 14 53 3f 08 00
    00002070  0c 00 8b e6 c0 de 56 2d  c0 bf ff ff ff ff ff ff
    
    I have uploaded my code here: https://github.com/pH5/ouvrt
  • This all looks fantastic, and completely fascinating. I'll try hard to find time to participate in the hacking! Thanks so much for your work so far.

    An open driver stack will be a huge benefit for the community, and actually for Oculus themselves too.
  • While others work on the maths, I've been looking across communities for ways to strengthen our own.

    One of the greatest community shortcuts I've rediscovered is how a Blender developer offers a ready-made VirtualBox VM image http://wiki.blender.org/index.php/User:Ideasman42/ArchLinuxVirtualBox -- literally, you download it, launch it with VirtualBox, and you're sitting at a productive Blender dev prompt, complete with all the necessary tools, including an IDE.

    Perhaps such an instant-on "VR dev image" is something we could work on as a community, in parallel to (and in support of) ongoing lower-level efforts?

    In theory such a tool would immediately foster an influx of new hands to help in general, with even those on Windows and OS X able to boot within a VM and test Linux-side features like distortion.

    I'm thinking a remastered Ubuntu LiveCD .iso would be sufficient if it had a pre-installed IDE and relevant GitHub SDK pointers. The README then might only need three bullets:
    Step 1: Download a bootable .iso
    Step 2: Boot it into any x86_64 machine (or VM container)
    Step 3: ~90 seconds later, hit F5 to debug some example C++ code (etc)

    Do we think this is a good idea? If so, does anyone have experience remastering LiveCDs yet? And what open source IDE might offer newcomers to VR (and potentially Linux) the most pleasant first-time experience?
  • blackguest Posts: 12
    edited October 2014
    vrcoder3d wrote:
    One of greatest community shortcuts I've rediscovered is how a Blender developer offers a VirtualBox .vha image...

    I really like this idea.
  • vrcoder3d wrote:
    Do we think this is a good idea? If so, does anyone have experience remastering LiveCDs yet? And what open source IDE might offer newcomers to VR (and potentially Linux) the most pleasant first-time experience?

    I've made a (very) experimental live respin of Fedora 20, including Nvidia's vendor-supplied binary GPU driver and our entire VR software stack. It boots directly from USB stick, but it obviously can't run inside a VM due to lack of GPU access. The idea was to make it easier to test-drive our software, but it could serve as a starting point.
  • lhl Posts: 30
    Lawnmower Man (or Woman)
    I've been traveling sans DK2/Linux box, but I started putting together some docs on jherico's project, just to start to gather up some of the far-flung stuff out there: https://github.com/jherico/OculusRiftHacking/wiki
  • pH5 Posts: 3
    pH5 wrote:
    00002000  00 89 97 e2 7a 00 01 00  09 00 fe 57 c1 af 00 00
    00002010  0c 00 63 1a dc 83 e7 ca  85 40 01 00 0c 00 f4 3c
    00002020  f4 46 b8 cb 85 40 02 00  0c 00 42 d2 37 0a 31 bf
    00002030  77 40 03 00 0c 00 4c 32  ee 7a fe ac 6c 40 04 00
    00002040  0c 00 f8 1a 08 80 b9 3d  e0 bf 05 00 0c 00 f4 1b
    00002050  eb e1 e0 f7 d4 3f 06 00  0c 00 fe a7 2f 24 67 4b
    00002060  3b 3f 07 00 0c 00 c1 6f  08 a3 8f 14 53 3f 08 00
    00002070  0c 00 8b e6 c0 de 56 2d  c0 bf ff ff ff ff ff ff
    
    Let's reorder that a bit:
    2000: 00 89 97 e2 7a 00 01 00 09 00
    200a: fe 57
    200c: c1 af
    200e: 00 00 0c 00 63 1a dc 83 e7 ca 85 40
    201a: 01 00 0c 00 f4 3c f4 46 b8 cb 85 40
    2026: 02 00 0c 00 42 d2 37 0a 31 bf 77 40
    2032: 03 00 0c 00 4c 32 ee 7a fe ac 6c 40
    203e: 04 00 0c 00 f8 1a 08 80 b9 3d e0 bf
    204a: 05 00 0c 00 f4 1b eb e1 e0 f7 d4 3f
    2056: 06 00 0c 00 fe a7 2f 24 67 4b 3b 3f
    2062: 07 00 0c 00 c1 6f 08 a3 8f 14 53 3f
    206e: 08 00 0c 00 8b e6 c0 de 56 2d c0 bf
    
    Interpreting the last 8 bytes of each of the nine 12-byte blocks as little-endian doubles yields:

    697.363044, 697.464979, 379.949473, 229.406064, -0.507535, 0.327629, 0.000416, 0.001165, -0.126384

    Those look suspiciously like the focal lengths and distortion center point that blackguest measured.
    I guess the remaining values have to be the radial and tangential distortion coefficients in some order.

    Maybe I'm now grasping at straws, but the 2-byte blocks at 200a and 200c interpreted as 16-bit unsigned integers are:

    22526, 44993

    Could those be the intrinsic parameters' principal point in units of centipixels?
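
    For anyone who wants to play with this, here's a quick sketch of a parser for that assumed record layout ({ uint16 index, uint16 length = 0x000c, IEEE-754 double }, all little-endian); the layout itself is still a guess from the dump above, and the code assumes a little-endian host:

    #include <cstdint>
    #include <cstdio>
    #include <cstring>

    struct Record { uint16_t index; double value; };

    size_t parseRecords(const uint8_t* buf, size_t len, Record* out, size_t max) {
        size_t n = 0;
        for (size_t off = 0; off + 12 <= len && n < max; off += 12) {
            uint16_t index, rlen;
            memcpy(&index, buf + off, 2);
            memcpy(&rlen, buf + off + 2, 2);
            if (rlen != 0x000c)
                break;                                // not a known record
            memcpy(&out[n].value, buf + off + 4, 8);  // IEEE-754 double
            out[n].index = index;
            ++n;
        }
        return n;
    }

    int main() {
        // First record from the dump above (EEPROM offset 0x200e).
        const uint8_t raw[12] = { 0x00, 0x00, 0x0c, 0x00,
                                  0x63, 0x1a, 0xdc, 0x83, 0xe7, 0xca, 0x85, 0x40 };
        Record rec;
        if (parseRecords(raw, sizeof(raw), &rec, 1) == 1)
            printf("record %u = %f\n", rec.index, rec.value);  // ~697.363044
        return 0;
    }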
  • lhl Posts: 30
    Lawnmower Man (or Woman)
    Just as an update for those that might have missed it/not been following along, okreylos has posted the work he's done to decode the LED constellation (identified by 10-bit blinking pattern) here: http://doc-ok.org/?p=1124

    At this point it is possible to:
    * Pull the LED/IMU positions (3D model of the HMD)
    * Talk to the HMD, initialize LEDs, sync with the camera
    * Identify and relatively stably track LEDs w/ blob and blink tracking

    I've dropped a line to okreylos to see if he has a repo/code dump for his new work, but it seems like w/ the basics tackled, pose estimation is next (and then sensor fusion).
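
    To make "pose estimation is next" concrete: once the LEDs are identified and matched against the 3D model, this is the classic PnP problem, and OpenCV ships a solver for it. A minimal sketch, with placeholder intrinsics rather than real calibration values:

    #include <opencv2/opencv.hpp>
    #include <vector>

    // Known 3D points (the LED model, headset frame) plus their matched 2D
    // blob centroids give the headset pose relative to the camera.
    bool estimatePose(const std::vector<cv::Point3f>& modelPts,
                      const std::vector<cv::Point2f>& imagePts,
                      cv::Mat& rvec, cv::Mat& tvec) {
        if (modelPts.size() < 4 || modelPts.size() != imagePts.size())
            return false;
        cv::Matx33d K(700,   0, 376,   // fx,  0, cx  (placeholder intrinsics)
                        0, 700, 240,   //  0, fy, cy
                        0,   0,   1);
        cv::Mat dist;                  // assume points are already undistorted
        // rvec is an axis-angle rotation, tvec a translation, camera frame.
        return cv::solvePnP(modelPts, imagePts, K, dist, rvec, tvec);
    }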

    Since jherico's OculusRiftHacking repo hasn't had updates, I forked it to keep track of the existing code/tools: https://github.com/lhl/OculusRiftHacking

    I'm happy to take pull requests if there's any extant code that I've missed (I just pulled the projects into the root, but I may do some shuffling if it becomes unwieldy, say rift vs. pose or other reference project code), or just let me know of something that's out there and I'll add it.

    I'm documenting information I find in the wiki: https://github.com/lhl/OculusRiftHacking/wiki
    Anyone with a github account can add/edit this documentation. Unlike OculusVR, I won't be deleting the wiki without notice (grumble grumble)

    Also, to kick off some pose estimation discussion, I found an interesting paper by Fernando Herranz et al., "Camera Pose Estimation using Particle Filters", which describes a particle filtering algorithm for pose estimation. Positional/angular error seems large, but it was done w/ a 640x480 @ 30Hz camera and there's no sensor fusion. I assume that the optical tracking can be relatively coarse / used primarily for drift correction against the much more accurate IMU data.

    I'm not a hardcore CV guy at all, so I'm looking forward to hopefully hearing some informed feedback here.
  • lhl Posts: 30
    Lawnmower Man (or Woman)
    BTW, the PDF link (or maybe # of links) was being flagged as spam, so here's the link to the paper on "Camera Pose Estimation using Particle Filters": http://www.es.ewi.tudelft.nl/papers/2011-Herranz-pose-estimation.pdf
  • I'm a bit disappointed with the lack of Linux support so far, but as the hardware is still in flux, I'm not surprised.

    As long as a community version doesn't prompt Oculus to stop supporting their Linux drivers, I really like the idea, as I can't hack around with the low-level stuff atm.

    I'm a computer science researcher working in medical imaging and machine vision. The positional tracking of the LEDs is a far simpler project than several others I've done recently, so skills won't be a problem... but time is.

    I'd be far more inclined to dedicate some of my time to the project if there was a clearer picture of what the goals are and what the progress is.

    Is someone heading the project who might be able to set up a quick site hosting some information, a repository of the latest bits of code, a list of what's being worked on by whom, and what is needed?
  • jherico Posts: 1,419
    Nexus 6
    There is a GitHub repository for hosting tools and tidbits of code related to the project here: https://github.com/jherico/OculusRiftHacking

    Oliver Kreylos has written up an excellent summary of his and others' findings so far here:

    Part 1:
    http://doc-ok.org/?p=1095

    Part 2:
    http://doc-ok.org/?p=1124
    Brad Davis - Developer for High Fidelity
    Co-author of Oculus Rift in Action

  • kaetemi Posts: 191
    I'd back a Kickstarter for this.
  • matus Posts: 66
    just send some money to oliver, he needs a new cpu.
  • lhl Posts: 30
    Lawnmower Man (or Woman)
    jimisdead wrote:
    Is someone heading the project who might be able to setup a quick site hosting some information, a repository of the latest bits of code, list what's being worked on by who, and what is needed?

    As jherico mentioned, he's hosting a project right now as a meta-repository w/ available code. It looks like okreylos/doc_ok/Oliver is chugging along - he just tweeted a pic w/ pose estimation a couple hours ago.

    I've started a doc on the wiki w/ a roadmap: https://github.com/jherico/OculusRiftHacking/wiki/Roadmap

    Looks like sensor fusion is the only missing piece once Oliver gets around to packaging his work. Oculus has published a fair amount on how they do things, and there's lots of existing work (see the Sensor Fusion page on the wiki for the results of a quick fishing expedition).
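
    To sketch the shape of that fusion step (and this is only a toy, nothing like the filter Oculus actually uses): dead-reckon position from the IMU at a high rate and pull it toward the optical fix whenever a camera frame arrives. Gains and structure are illustrative only:

    // predict() is called at IMU rate with world-frame linear acceleration
    // (gravity already removed); correct() at camera rate with the optical fix.
    struct FusedPosition {
        double pos[3] = {0, 0, 0};
        double vel[3] = {0, 0, 0};

        // High-rate dead reckoning from the IMU.
        void predict(const double accel[3], double dt) {
            for (int i = 0; i < 3; ++i) {
                vel[i] += accel[i] * dt;
                pos[i] += vel[i] * dt;
            }
        }

        // Low-rate correction from the camera (e.g. 60 Hz).
        void correct(const double optical[3], double gain = 0.2) {
            for (int i = 0; i < 3; ++i) {
                double err = optical[i] - pos[i];
                pos[i] += gain * err;  // pull position toward the fix
                vel[i] += gain * err;  // crudely bleed error into velocity
            }
        }
    };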

    After the components are done, there's the matter of wrapping up all the work in a proper statically compiled daemon and packaging it up. I'm assuming that since chunks of the code are GPLv2, that's what the daemon will end up being, but that shouldn't be a problem for people using it, since the way I see it, the daemon will just run as a system service and emit or broadcast tracking data...
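
    As a toy version of that daemon idea, something like the following would broadcast a pose packet over UDP so any client can consume tracking data without linking against the daemon; the port number and packet layout are made up:

    #include <arpa/inet.h>
    #include <cstdint>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>

    struct PosePacket {
        double position[3];
        double orientation[4];   // quaternion
        uint64_t timestampUs;
    };

    int main() {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        int yes = 1;
        setsockopt(sock, SOL_SOCKET, SO_BROADCAST, &yes, sizeof(yes));

        sockaddr_in addr = {};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(9255);                     // arbitrary port
        addr.sin_addr.s_addr = htonl(INADDR_BROADCAST);

        for (;;) {
            PosePacket pkt = {};  // a real daemon would fill this from the tracker
            sendto(sock, &pkt, sizeof(pkt), 0,
                   reinterpret_cast<sockaddr*>(&addr), sizeof(addr));
            usleep(1000000 / 60); // 60 Hz, matching the camera frame rate
        }
        close(sock);              // not reached in this sketch
        return 0;
    }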
  • Doc-OK just posted his first successful head tracking in his latest blog post, Hacking the Oculus Rift DK2, part III: http://doc-ok.org/?p=1138 !!!

    We are getting close to beating facebooculus to getting DK2 working on Linux!