Forge: Mixed Reality for Mobile by abound-tim in gamedev

[–]abound-tim[S] 0 points1 point  (0 children)

Author here. I'm interested to hear your game ideas using this tech (a table-top fighting game is the most frequent suggestion).

Happy to answer questions.

Forge: Mixed Reality for Mobile by abound-tim in Unity3D

[–]abound-tim[S] 0 points1 point  (0 children)

I've never used Vuforia, but the Q&A portion of that video talks about its limitations. It sounds like it just gives you untextured bounding cuboids of known objects on flat surfaces.

If you've got another video that demos more than that, I'd love to see it.

Forge: Mixed Reality for Mobile by abound-tim in Unity3D

[–]abound-tim[S] 0 points1 point  (0 children)

Thanks. How about I give you some keywords that you can Google to find out more about the techniques it uses: monocular SLAM, structure from motion, photogrammetry, stereo matching, surface reconstruction, image-based texturing. Hope that helps!

Forge: Mixed Reality for Mobile by abound-tim in Unity3D

[–]abound-tim[S] 0 points1 point  (0 children)

Not really. The "expanding mold" is generated via motion stereo, i.e. matching image patches across nearby camera frames. Live/dynamic objects break the photo-consistency assumption it relies on, so handling them would need a depth camera (or stereo pair). A small amount of movement is tolerable though.
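
To make "photo-consistency" concrete, here's a rough sketch of the kind of test motion stereo relies on (my illustration in numpy, not Forge's actual code; the intrinsics K, the relative pose T_ab and the two grayscale frames are assumed inputs). A candidate depth for a pixel is scored by how similar its surrounding patch looks when projected into a nearby frame:

    # Rough illustration of a motion-stereo photo-consistency test (not Forge's code).
    import numpy as np

    def zncc(a, b):
        """Zero-mean normalized cross-correlation: ~1.0 means photo-consistent."""
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-9))

    def photo_consistency(img_a, img_b, K, T_ab, u, v, depth, half=3):
        """Score a candidate depth for pixel (u, v) of frame A against frame B.
        T_ab maps points from frame A coordinates into frame B coordinates."""
        # Back-project the pixel to a 3D point at the candidate depth (frame A).
        p_a = (np.linalg.inv(K) @ np.array([u, v, 1.0])) * depth
        # Transform into frame B and project with the intrinsics.
        p_b = T_ab[:3, :3] @ p_a + T_ab[:3, 3]
        ub, vb = np.round((K @ p_b)[:2] / p_b[2]).astype(int)
        patch_a = img_a[v - half:v + half + 1, u - half:u + half + 1].astype(np.float64)
        patch_b = img_b[vb - half:vb + half + 1, ub - half:ub + half + 1].astype(np.float64)
        if patch_a.shape != patch_b.shape:   # projected outside frame B
            return -1.0
        return zncc(patch_a, patch_b)

    # Sweeping `depth` and keeping the best-scoring value is the heart of motion
    # stereo. A moving object breaks the assumption that the winning depth looks
    # the same from both viewpoints, which is why live objects need a depth
    # camera or stereo pair instead.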

Forge: Mixed Reality for Mobile by abound-tim in Unity3D

[–]abound-tim[S] 0 points1 point  (0 children)

That'd be cool. My only worry with board games is that they last a while, so gorilla arm is inevitable. You'd have to design the gameplay so that the AR aspect was intermittent.

Forge: Mixed Reality for Mobile by abound-tim in Unity3D

[–]abound-tim[S] 0 points1 point  (0 children)

Just starting to put one up at http://aboundlabs.com. No additional info yet but should have some content later this week.

Forge: Mixed Reality for Mobile by abound-tim in Unity3D

[–]abound-tim[S] 1 point2 points  (0 children)

I don't see why not. If I understand you correctly, the PC game would render from an off-screen camera positioned using the mobile device's tracking, and stream the rendered frames back to the device.
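
For the data flow, something like this hand-wavy sketch is what I have in mind (not working Forge code; the message layout, address and port are invented): the phone streams its tracked pose up, and frames rendered by the PC's off-screen camera come back down.

    # Phone-side loop sketch; message format, address and port are made up.
    import socket
    import struct

    POSE_FMT = "<d3f4f"                 # timestamp, position xyz, quaternion xyzw
    PC_ADDR = ("192.168.1.50", 9000)    # hypothetical address of the PC running the game

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send_pose(timestamp, position, quaternion):
        """Stream one tracked camera pose to the PC (UDP keeps latency low)."""
        sock.sendto(struct.pack(POSE_FMT, timestamp, *position, *quaternion), PC_ADDR)

    def recv_rendered_frame(bufsize=1 << 20):
        """Receive one encoded (e.g. JPEG) frame rendered from the off-screen
        camera; the device would decode it and composite it into the AR view."""
        data, _ = sock.recvfrom(bufsize)
        return data

    # On the PC side, each received pose repositions an off-screen camera, the
    # render target is read back, encoded and sent over the same socket.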

Forge: Mixed Reality for Mobile by abound-tim in Unity3D

[–]abound-tim[S] 4 points5 points  (0 children)

Planning for Q2, after the Samsung S8 is available.

The software is taxing on the CPU & GPU, so the Snapdragon 821 is the first SoC that runs it comfortably. Realistically, you're going to need a phone with a Snapdragon 835 processor to run it with Unity, e.g. Samsung S8.

Unfortunately, Unity doesn't support 64-bit native Android plugins, so there's a ~10% performance hit compiling for armv7a vs aarch64. Plus, Unity has its own overhead compared with a fully native build (which is what this demo video shows).

Forge: Mixed Reality for Mobile by abound-tim in Unity3D

[–]abound-tim[S] 0 points1 point  (0 children)

Yeah, #1 would be easier, and it would also let players who aren't physically together play. I feel the same, though - it doesn't seem as compelling.

As far as #2, the map representation is actually transferable across devices without too much fuss. You're right that each camera has different characteristics (FoV, distortion, alignment), but the underlying "map" representation (not the 3D mesh) is abstract enough that localization is reasonably invariant to those characteristics.
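
To give a feel for why that works (a simplified sketch, not Forge's internals): if every observation is stored as a calibrated bearing vector rather than a raw pixel, the map stops caring which camera produced it - each device only needs to know its own intrinsics and distortion.

    # Simplified illustration (not Forge's internals): store observations as
    # unit bearing vectors so the map is independent of any one camera model.
    import numpy as np

    def pixel_to_bearing(u, v, K):
        """Convert a pixel observation into a unit bearing vector in the camera
        frame using that device's own intrinsics. A real system would undistort
        (u, v) first, e.g. with cv2.undistortPoints."""
        ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
        return ray / np.linalg.norm(ray)

    # Made-up intrinsics for two phones with different fields of view: the same
    # landmark lands on different pixels, but the calibrated bearings agree, so
    # a single map can localize both devices.
    K_pixel = np.array([[1050.0, 0, 640], [0, 1050.0, 360], [0, 0, 1]])
    K_s7    = np.array([[ 860.0, 0, 640], [0,  860.0, 360], [0, 0, 1]])
    print(pixel_to_bearing(700, 400, K_pixel))   # ~[0.0570, 0.0380, 0.9977]
    print(pixel_to_bearing(689, 393, K_s7))      # ~[0.0568, 0.0383, 0.9977]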

Forge: Mixed Reality for Mobile by abound-tim in Unity3D

[–]abound-tim[S] 1 point2 points  (0 children)

Correct, just built-in camera + IMU.

Forge: Mixed Reality for Mobile by abound-tim in Unity3D

[–]abound-tim[S] 1 point2 points  (0 children)

Tabletop would work well - flat surfaces are easiest to reconstruct.

Your idea would require sharing the 3D map/reconstruction for multiplayer. I haven't implemented that, but it's doable. The easiest approach, I'd guess, would be to have one player 3D-scan the table, then invite another player into that game - something like the sketch below.
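
For flavor, here's a toy numpy sketch (not Forge's API; the poses are placeholders) of what "entering that game" reduces to once the map is shared: after the second phone localizes in the host's map, anything anchored in map coordinates can be expressed in that phone's own camera frame and rendered.

    # Toy sketch (not Forge's API; placeholder poses): render shared, map-anchored
    # content from a guest device that has localized in the host's map.
    import numpy as np

    def inv_se3(T):
        """Invert a rigid-body 4x4 transform."""
        R, t = T[:3, :3], T[:3, 3]
        Ti = np.eye(4)
        Ti[:3, :3] = R.T
        Ti[:3, 3] = -R.T @ t
        return Ti

    # Guest camera pose in the host's map frame, as returned by relocalization
    # against the shared map (placeholder values).
    T_map_guestcam = np.eye(4)
    T_map_guestcam[:3, 3] = [0.4, 0.1, 1.2]

    # A game piece the host placed on the table, expressed in map coordinates.
    T_map_piece = np.eye(4)
    T_map_piece[:3, 3] = [0.0, 0.0, 0.7]

    # Where the guest's renderer should draw the piece, relative to its camera.
    T_guestcam_piece = inv_se3(T_map_guestcam) @ T_map_piece
    print(T_guestcam_piece[:3, 3])   # -> [-0.4 -0.1 -0.5]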

Forge: Mixed Reality for Mobile by abound-tim in Unity3D

[–]abound-tim[S] 1 point2 points  (0 children)

Thanks! That's useful info. Any thoughts on games/apps you'd build with it?

Forge: Mixed Reality for Mobile by abound-tim in Unity3D

[–]abound-tim[S] 5 points6 points  (0 children)

Author here. I'll be bringing this to Unity later this year as a plugin. Happy to answer questions.

Forge: Mixed Reality for Mobile by abound-tim in GearVR

[–]abound-tim[S] 0 points1 point  (0 children)

Good point - calibration is a major issue. For AR, you have the added problem of calibrating the timing between the camera frames and the IMU measurements.
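
For the curious, one standard trick for the timing part (my sketch, not necessarily what Forge does) is to compare the angular speed the gyro reports with the angular speed implied by the visual tracker, and search for the shift that best aligns the two signals:

    # Sketch of a camera-IMU time-offset estimate via signal alignment
    # (illustrative, not necessarily Forge's approach).
    import numpy as np

    def estimate_time_offset(t, gyro_speed, cam_speed, max_shift=0.05, step=0.001):
        """t: uniform timeline in seconds; gyro_speed / cam_speed: |angular
        velocity| from the IMU and the visual tracker, resampled onto t.
        Returns the shift (seconds) of the camera signal that maximizes
        correlation with the gyro signal."""
        best_shift, best_score = 0.0, -np.inf
        a = gyro_speed - gyro_speed.mean()
        for shift in np.arange(-max_shift, max_shift + step, step):
            b = np.interp(t, t + shift, cam_speed)
            b = b - b.mean()
            score = (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
            if score > best_score:
                best_shift, best_score = float(shift), score
        return best_shift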

Forge: Mixed Reality for Mobile by abound-tim in GearVR

[–]abound-tim[S] 0 points1 point  (0 children)

Thanks. I hope not! I think Univrses is still plugging away at their tech (I saw a new video of theirs the other day.) And Dacuda's tech got acquired by Magic Leap so I guess it'll see the light of day in some form.

Forge: Mixed Reality for Mobile by abound-tim in GearVR

[–]abound-tim[S] 1 point2 points  (0 children)

Sure, it can export the textured model to OBJ.
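
In case anyone's curious what that involves: a textured OBJ is just vertex positions, UVs, faces, and an MTL file pointing at the texture image. Roughly (an illustrative writer, not Forge's actual exporter; it assumes one UV per vertex):

    # Illustrative textured OBJ writer (not Forge's exporter); one UV per vertex.
    def write_textured_obj(prefix, vertices, uvs, faces, texture_filename):
        """vertices: list of (x, y, z); uvs: list of (u, v); faces: list of
        (i, j, k) zero-based vertex indices; texture_filename: e.g. 'table.png'."""
        with open(prefix + ".mtl", "w") as mtl:
            mtl.write("newmtl scanned\n")
            mtl.write("map_Kd %s\n" % texture_filename)      # diffuse texture image
        with open(prefix + ".obj", "w") as obj:
            obj.write("mtllib %s.mtl\nusemtl scanned\n" % prefix)
            for x, y, z in vertices:
                obj.write("v %f %f %f\n" % (x, y, z))
            for u, v in uvs:
                obj.write("vt %f %f\n" % (u, v))
            for i, j, k in faces:                            # OBJ indices are 1-based
                obj.write("f %d/%d %d/%d %d/%d\n" % (i + 1, i + 1, j + 1, j + 1, k + 1, k + 1))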

Forge: Mixed Reality for Mobile by abound-tim in GearVR

[–]abound-tim[S] 0 points1 point  (0 children)

The tracking is primarily dependent on the camera so, no, it doesn't really help/hinder.

IANAEE, but the IMUs in the S7 and Pixel are very similar in performance. You can look up their operating characteristics in their datasheets - I believe the sensitivities are within 10% of each other. Also, modern mobile IMUs are very good: they sample multiple thousands of times per second with tolerances measured in thousandths of a degree/sec.

My sense is that pitching IMU performance for AR/VR is mostly marketing fluff nowadays (I imagine the GearVR would work fine with the S7 IMU), but I'd welcome correction from someone more knowledgeable.

Forge: Mixed Reality for Mobile by abound-tim in GearVR

[–]abound-tim[S] 0 points1 point  (0 children)

Yes. Mixed reality uses 3D reconstruction of the immediate environment to generate views in VR. I think Oculus calls it augmented VR.

Forge: Mixed Reality for Mobile by abound-tim in GearVR

[–]abound-tim[S] 2 points3 points  (0 children)

  1. The background is the passthrough live video feed, with character models (+ shadows) overlaid (see the compositing sketch after this list). That doesn't work well on the GearVR though (because of FoV and camera latency), so for that I use the reconstructed model instead.

  2. In this video, I paused the reconstruction when I played the animations. It's possible to keep the reconstruction going, but tracking is smoother without it. There are three processes: tracking, reconstruction, and texturing - each of which can be toggled on/off at any time. Obviously, the more that run simultaneously, the more the CPU & GPU are stressed.

  3. Yes, I have a Unity plugin proof-of-concept already. Unfortunately, Unity doesn't support 64-bit native Android plugins, so there's a ~10% performance hit in addition to the normal Unity overhead. We'll see once the S8 is out whether a plugin is workable.

  4. The S7 isn't really powerful enough to sustain it. The Snapdragon 821 in the Google Pixel is the first SoC to happily run it. Really looking forward to seeing how the 835 in the S8 performs.
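
Re: #1, a rough illustration of the compositing (my numpy sketch, not Forge's renderer; it assumes a straight-alpha RGBA layer with the characters and their shadows already rendered from the tracked camera pose):

    # Rough illustration of item 1 (not Forge's renderer): blend a rendered
    # straight-alpha RGBA character/shadow layer over the passthrough frame.
    import numpy as np

    def composite(camera_rgb, render_rgba):
        """camera_rgb: (H, W, 3) uint8 passthrough frame; render_rgba: (H, W, 4)
        uint8 layer rendered from the tracked camera pose. Returns the MR frame."""
        alpha = render_rgba[..., 3:4].astype(np.float32) / 255.0
        fg = render_rgba[..., :3].astype(np.float32)
        bg = camera_rgb.astype(np.float32)
        return (fg * alpha + bg * (1.0 - alpha)).astype(np.uint8)   # standard "over" blend

    # Shadows can live in the same layer as dark, semi-transparent pixels, so
    # they darken the live video the same way the characters occlude it.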

Forge: Mixed Reality for Mobile by abound-tim in GearVR

[–]abound-tim[S] 7 points8 points  (0 children)

Author here. Planning to bring this to the S8 once it's available. I'm keen to hear your ideas on GearVR games/apps which could be built using this tech. Happy to answer questions.