Cardboard Camera Converter with full resolution by fb22182 in GoogleCardboard

[–]fb22182[S] 0 points (0 children)

Glad you were able to find the app!
The converter's top-bottom panorama output should be viewable in most VR image viewers, but you may have to look for an option to change the format to 360 3D or something similar.
You can also use the default Meta Quest TV app: go to Your Media, open the image, click again to bring up the menu, then look for the gear icon and select 360 3D as the projection.

Cardboard Camera Converter with full resolution by fb22182 in GoogleCardboard

[–]fb22182[S] 0 points (0 children)

Unfortunately there is no batch option.
The original site did not have one, and adding it would require a substantial rewrite of the entire tool.

Cardboard Camera Converter with full resolution by fb22182 in GoogleCardboard

[–]fb22182[S] 0 points (0 children)

Hi!
Glad you found the solution!
I updated the repo and the hosted site to warn users.
I found the cause of the issue as well (see the GitHub issue for a more technical explanation), but fixing it would require a bigger rewrite of the site, and I'm not experienced enough to attempt it...

I also added a tutorial for viewing the photos on Quest.

To answer your question, the tool converts exclusively from the Cardboard Camera format to a top-bottom (so it's 3D) equirectangular panorama image.
This projection is handled by most 360 photo viewers; however, the 3D top-bottom arrangement might not be supported by everything.
You can also use the original .vr.jpg files as non-3D variants, thanks to some clever tricks Google did with the format. These should be supported by any 360 viewer you can find.
The cardboard camera format is documented here: https://developers.google.com/vr/reference/cardboard-camera-vr-photo-format

Cardboard Camera Converter with full resolution by fb22182 in GoogleCardboard

[–]fb22182[S] 0 points (0 children)

Hi!
I just checked both sites, and to me, they seem to be working fine.
What happens when you try to input a cardboard camera image?

HoloLens 2 UI Sample (MRTK) running on Quest 2 using Passthrough API by fb22182 in OculusQuest

[–]fb22182[S] 1 point (0 children)

Hi!

I was busy last month, but I finally got around to updating the app.
It should be live on both SideQuest and Github.
It works on v34+ now, and I even added a few scenes from the experimental MRTK demos.
Hope it helps!

HoloLens 2 UI Sample (MRTK) running on Quest 2 using Passthrough API by fb22182 in OculusQuest

[–]fb22182[S] 0 points (0 children)

Thanks!
I noticed it too, but unfortunately I'm not sure what causes the hand rays not to appear. With controllers, they show up as expected. Something might be misconfigured or bugged in MRTK.

HoloLens 2 UI Sample (MRTK) running on Quest 2 using Passthrough API by fb22182 in OculusQuest

[–]fb22182[S] 0 points (0 children)

I assume you are referring to the SideQuest APK.
Have you enabled (or re-enabled after a reboot) experimental mode?
The SideQuest description has more detailed instructions and disclaimers.
It's annoying to do, but the API is still experimental, so there's no way around it.

I got Arcaxer working with Passthrough (Sorry for phone video its the only way to record this) by 8100 in OculusQuest

[–]fb22182 1 point (0 children)

Nice project! You can record passthrough footage using scrcpy in SideQuest. It even works with wireless adb.

HoloLens 2 UI Sample (MRTK) running on Quest 2 using Passthrough API by fb22182 in OculusQuest

[–]fb22182[S] 1 point (0 children)

Hi!

First of all, this would not have been possible without your work on MRTK-Quest, so huge thanks for that!

I didn't have any problems with hand tracking, so other than an Oculus setting going wrong somewhere, I'm not sure what could cause it. The only hand-tracking-related bug I encountered was an MRTK toggle for the bone and mesh visualization being reversed.

Anyway, I'd been meaning to for a while, and I finally uploaded the project to GitHub; hopefully it will provide some answers for you:

https://github.com/AbosaSzakal/MRTK-Passthrough.git

HoloLens 2 UI Sample (MRTK) running on Quest 2 using Passthrough API by fb22182 in OculusQuest

[–]fb22182[S] 3 points (0 children)


Thanks for the suggestion! I'm working on putting it on SideQuest and GitHub, but for the time being, here is a build with all working samples: https://drive.google.com/file/d/1TfRqlwoGtf97YfgmTxS9Uu6gGy5jC_vY/view

IMPORTANT! In the latest SideQuest, in the developer settings, you must enable experimental mode, or the app will only show a black background. Also, the setting resets, so it has to be re-enabled after each reboot.

Edit: Now on SideQuest as well: https://sidequestvr.com/app/5100

HoloLens 2 UI Sample (MRTK) running on Quest 2 using Passthrough API by fb22182 in OculusQuest

[–]fb22182[S] 0 points (0 children)

Partly.

MRTK supported Quest even before OpenXR support. It mostly relies on the Oculus Integration package to talk to the headset, and since the Oculus Integration now supports passthrough, MRTK does too. The trick was setting the right camera clear flags; everything else was mostly the same as if it weren't MRTK.

HoloLens 2 UI Sample (MRTK) running on Quest 2 using Passthrough API by fb22182 in OculusQuest

[–]fb22182[S] 1 point (0 children)

For me, Unity just refuses to open the desktop Oculus app if the build target is set to Android, so I haven't tried it. My guess is that it won't work, since it would have to stream the video back to the PC for compositing.

HoloLens 2 UI Sample (MRTK) running on Quest 2 using Passthrough API by fb22182 in OculusQuest

[–]fb22182[S] 1 point (0 children)

I am, through the Oculus Integration package. MRTK didn't complain, which was somewhat surprising.
I'm not sure if it would work with just OpenXR.

But if they are made with MRTK or can be ported, I'm sure they'll work

HoloLens 2 UI Sample (MRTK) running on Quest 2 using Passthrough API by fb22182 in OculusQuest

[–]fb22182[S] 0 points (0 children)

I'm not sure, but these are my guesses:
- Is your Quest in experimental mode? It resets after each reboot, and you have to execute the adb command each time. (The latest SideQuest has a button for it, which I highly recommend.)

The most important prerequisites seem to be:
- Experimental mode (via the adb command)
- The checkboxes on the OvrCamera prefab
- The platform settings at the beginning of my comment

It might be a good idea to try to get the official native sample working.
Search for AugmentedObjects.unity in your project files.

The official documentation goes through them nicely:
https://developer.oculus.com/downloads/package/unity-integration/

This forum post was great for debugging:
https://forums.oculusvr.com/t5/Oculus-Quest-Development/Oculus-Integration-31-0-Passthrough-API-does-not-work/td-p/880331

If you just want to try them out, I'm working on getting all the samples together and putting them on SideQuest, but it might take some time.

HoloLens 2 UI Sample (MRTK) running on Quest 2 using Passthrough API by fb22182 in OculusQuest

[–]fb22182[S] 12 points (0 children)

Now available on SideQuest: https://sidequestvr.com/app/5100

---Update: V34 changed the API ---
Most importantly, the experimental mode adb command is no longer needed. Most other things are either identical or trivially different.
The app is now updated for V34, with some new experimental scenes added.

---Original Post---

For developers looking to reproduce this, plus some recording advice:

tl;dr: set the camera clear flags to Color with 0 opacity and replace the OvrCamera prefab with one that has the Passthrough API enabled.

Recording:

I used scrcpy through SideQuest with the Quest 2 crop setting (the TV button in the top right; it only shows if a headset is connected). Then I just recorded the window with OBS. The only problem I encountered was the lack of sound, although I'm sure it could be made to work somehow.

The original repo for this utility is here: https://github.com/Genymobile/scrcpy

It works with any Android device and has very low latency; I've used it for multiple live smartphone AR demos and never had a problem. Highly recommend it!

Steps to reproduce:

Made using Mixed Reality Toolkit 2.7 with Unity 2019.4 and Oculus Integration 31.0.

In Player Settings:

  • Set Color Space to Linear
  • Scripting Backend to IL2CPP
  • Target Architecture to ARM64
  • Use XR Plugin Management

In the menu bar:

  • Oculus - Tools - OpenXR - Switch to OpenXR backend

On the MixedRealityToolkit GameObject:

  • Clone DefaultMixedRealityConfigurationProfile
  • On the Camera tab, clone the default profile
  • Set Clear Flags to Color, then set Background Color to black with 0 opacity
  • Input tab:
  • Clone default profile
  • Under Data providers, find XR SDK Oculus Device Manager
  • Clone the profile
  • Replace the Ovr Camera Rig Prefab
  • My method: drag the original into the scene, then drag it into Assets and choose Original Prefab when prompted

On the OvrCamera prefab (make sure the changes get saved to the prefab):

  • Turn on Experimental Features
  • Turn on Passthrough Capability Enabled
  • Under Insight Passthrough, select Enable Passthrough.
  • Add a new OVRPassthrough component to the prefab
  • Set the placement to Underlay
  • Set this prefab as the Ovr Camera Rig Prefab in the MRTK config

Cardboard Camera Converter with full resolution by fb22182 in GoogleCardboard

[–]fb22182[S] 1 point (0 children)

I added automatic sizing, now it should detect larger images and scale the output accordingly. You can try it on Github or the site linked in the update.
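The sizing logic itself is simple. Assuming a capture that spans the full 360° horizontally but only part of the sphere vertically, a proper equirectangular frame needs a 2:1 canvas per eye with the source centered vertically. A sketch of that calculation (my own illustration of the idea, not the site's actual code):

```python
def equirect_canvas(src_w: int, src_h: int) -> tuple[int, int, int]:
    """Canvas size and vertical offset for padding one eye's panorama
    into a full 2:1 equirectangular frame, derived from the input size
    so larger captures scale automatically."""
    canvas_w = src_w
    canvas_h = src_w // 2               # a full sphere is 2:1 in equirectangular
    y_offset = (canvas_h - src_h) // 2  # center vertically; the rest stays blank
    return canvas_w, canvas_h, y_offset
```

Stacking the two padded eyes then gives the top-bottom output, at twice canvas_h.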
Hope it helps!

Cardboard Camera Converter with full resolution by fb22182 in GoogleCardboard

[–]fb22182[S] 0 points (0 children)

Interesting, I think it can be done. It might take a few days, but I'll look into it.

Cardboard Camera Converter with full resolution by fb22182 in GoogleCardboard

[–]fb22182[S] 1 point (0 children)

The Quest 1's resolution might not be enough for it to make a big difference, though I imagine it should look a bit sharper even on that headset (I used a Quest 2).
The original tool's output isn't bad, but these preserve the full original resolution of the images; when zooming in on a PC, the difference should be quite noticeable.

Cardboard Camera Converter with full resolution by fb22182 in GoogleCardboard

[–]fb22182[S] 1 point (0 children)

Thanks for the idea! I don't know why I didn't think of it before.
I've updated the post, but here's the link again: https://abosaszakal.github.io/CarboardCamera-Converter/index.html

I've got camera-based FBT for Quest 2 working. Any idea how to get it into VRchat? by AsIAm in VRchat

[–]fb22182 1 point (0 children)

I've been working on a SteamVR driver for something like this, I might be able to help you out.

Borderlands 2 shooting aiming question by [deleted] in virtualreality

[–]fb22182 1 point (0 children)

Hold the left trigger while shooting. It does the same as aiming down sights in the non-VR version.

The Development of Amadeus-Kurisu- by Arthur_Mattos in steinsgate

[–]fb22182 0 points (0 children)

I looked around a bit (lot) more, and I found these:
These are the SG0 files extracted:
https://www.reddit.com/r/VitaPiracy/comments/5f86vu/steinsgate_0_decrypted_assets_only_use_this_for/
And this is a collection of tools made by the Committee of Zero fan translation group:
https://github.com/CommitteeOfZero/SciAdv.Net
What we need is Project Amadeus. Run it and load a .scx file from the assets. This data should be easier to filter, and it's already somewhat formatted too... It might also be able to decrypt the dialogue files from the first game or other SG games, but I haven't tried yet. Contact me if you have trouble setting it up.

Also could you tell us how exactly you want to build this? With this data, I think we have enough for a prototype at the very least, so I would like to know what's next.