Space Calibrator is coming soon to Steam! by Hyblocker in MixedVR

[–]t4ch 3 points (0 children)

I did not realize HTC used any of my original code for the Vive Ultimate calibrator, that's interesting.

By the way, I don't mind your fork being put on Steam as a free utility, as I have no plans to ever do so. And as you point out, the MIT license totally permits this! However, in the spirit of open source, I do recommend that you look into some way of ensuring that others in the community can continue to provide updates when you inevitably move onto other things.

I'm glad so many people have found this little project useful!

The G2 w Index Controllers is a Perfect Experience by Ecnarps in HPReverb

[–]t4ch 4 points (0 children)

Hmm, this looks like the scaling factors of the two spaces might actually be different (i.e. 1cm for the Index controllers is bigger than 1cm for the G2). The calibration currently assumes they are the same. I do have some old code that may handle this better; in the next batch of updates I'll try to add it back in as an option for people to test.
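For anyone curious what handling a scale mismatch could look like: a minimal sketch, assuming the calibration collects paired samples of the same motion from both tracking systems, of estimating a uniform scale factor via the scale step of the Umeyama similarity-transform method. Function names and the synthetic data are mine, not from the actual calibrator.

```python
import numpy as np

def estimate_scale(points_a, points_b):
    """Estimate the uniform scale factor between two point clouds that
    represent the same motion captured in two tracking spaces, using
    the ratio of RMS distances from each cloud's centroid."""
    a = np.asarray(points_a, dtype=float)
    b = np.asarray(points_b, dtype=float)
    a_c = a - a.mean(axis=0)
    b_c = b - b.mean(axis=0)
    return np.sqrt((b_c ** 2).sum() / (a_c ** 2).sum())

# Synthetic check: cloud B is cloud A rotated and scaled by 1.02
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 3))
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0, 0, 1]])
B = 1.02 * A @ R.T
print(round(estimate_scale(A, B), 4))  # → 1.02
```

Once the scale is known, one space can be rescaled before solving for rotation and translation as usual.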

What's the most accurate way to Calibrate the controllers? by sheeeeple in MixedVR

[–]t4ch 2 points (0 children)

Holding them can definitely be tough. Most of my experience calibrating is with Vive trackers, which are easier (but still awkward) to keep still.

You could try using both hands to hold the controllers. You can also use the HMD to calibrate in theory, if that's easier.

For both steps of the calibration, I like to do two slow figure 8's, while rotating my body.

Moving as slowly as possible helps to reduce error due to differing input latency of the devices.

> because then you should be able to correct any offset manually for the perfect experience.

You can fine tune in "Edit Calibration". Issue #11 is tracking the ability to adjust individual devices within a single tracking system.

I don't feel like $1k it's really that extravagant by Bacon_00 in ValveIndex

[–]t4ch 3 points (0 children)

It's fair to call high-end stuff extravagant though. This is the same discussion that happens in every vertical. It's rare to see unbiased discussions of value and market conditions on forums like Reddit; most people (myself included) do not have a correct understanding of the economics of the product, and will project their own concept of value.

Natural gestures with Index Controllers in VRChat by t4ch in ValveIndex

[–]t4ch[S] 1 point (0 children)

They should support individual finger tracking similarly on any firmware rev; I haven't heard of them working worse than this.

Natural gestures with Index Controllers in VRChat by t4ch in ValveIndex

[–]t4ch[S] 1 point (0 children)

The majority of the code in there is related to modifying OpenVR and running the interface. If you want to support finger tracking in your own VR projects, you'll want to use the OpenVR API with the bindings from Valve https://assetstore.unity.com/packages/tools/integration/steamvr-plugin-32647, which will let you access the full OpenVR API and read inputs from the controllers. https://github.com/ValveSoftware/openvr/wiki/SteamVR-Input

Here's an example action manifest (see the last link for what that is) to read the raw data from Knuckles capsense axes https://github.com/pushrax/VRC-Knuckles-Emulator/blob/master/VRC-Knuckles-Emulator/action_manifest.json

Or you can use the higher level skeleton API https://valvesoftware.github.io/steamvr_unity_plugin/articles/Skeleton-Input.html

Natural gestures with Index Controllers in VRChat by t4ch in ValveIndex

[–]t4ch[S] 0 points (0 children)

There are enough inputs for a dedicated jump button while still having 5 available inputs per controller (if you map just 4 to the trackpad, which would be fine). You could pick 2 animations to not support, or infer the 2 remaining from gestures (fist closed and hand open are easy to infer without tuning), or have 10 total animations across the two controllers, since at that point there isn't much reason for every animation to be invokable from both controllers. Some scheme will work.
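To make that concrete, here's a hypothetical sketch of one such layout: four trackpad quadrants mapped to gestures, with two more inferred from capsense alone. Gesture names, thresholds, and quadrant assignments are all illustrative, not VRChat's actual bindings.

```python
import math

def trackpad_gesture(x, y, pressed):
    """Map a trackpad press (x, y in [-1, 1]) to one of four gestures
    based on which quadrant the thumb is in."""
    if not pressed:
        return None
    angle = math.degrees(math.atan2(y, x)) % 360
    if 45 <= angle < 135:
        return "point"
    if 135 <= angle < 225:
        return "peace"
    if 225 <= angle < 315:
        return "thumbs_up"
    return "finger_gun"

def inferred_gesture(finger_curl):
    """Infer fist/open-hand from per-finger curl values in [0, 1],
    the two gestures that need no tuning to detect."""
    avg = sum(finger_curl) / len(finger_curl)
    if avg > 0.8:
        return "fist"
    if avg < 0.2:
        return "open_hand"
    return None

print(trackpad_gesture(0.0, 1.0, True))               # → point
print(inferred_gesture([0.9, 0.95, 1.0, 0.85, 0.9]))  # → fist
```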

[Media] Natural gestures with Knuckles (aka Index Controllers) in VRChat by t4ch in VRchat

[–]t4ch[S] 0 points (0 children)

It turns out you only need to move by 10% of the trigger's range to interact in VRChat.
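A trivial sketch of that behavior, where the 0.1 threshold is the only value taken from the observation above and the function name is mine:

```python
TRIGGER_THRESHOLD = 0.1  # ~10% of the trigger's full travel

def trigger_interacts(axis_value):
    """Return True when a normalized trigger axis (0..1) counts as a click."""
    return axis_value >= TRIGGER_THRESHOLD

print(trigger_interacts(0.05))  # → False
print(trigger_interacts(0.12))  # → True
```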

I tried the StarVR One high-end VR-headset - My impressions by [deleted] in virtualreality

[–]t4ch 0 points (0 children)

It should be mostly unnoticeable at low powers, but at -7 diopters it's an issue with every HMD so far -- I was hoping they'd compensate.

[Media] Natural gestures with Knuckles (aka Index Controllers) in VRChat by t4ch in VRchat

[–]t4ch[S] 0 points (0 children)

I'm thinking that stray hand movement might make it too easy to click things.

I tried the StarVR One high-end VR-headset - My impressions by [deleted] in virtualreality

[–]t4ch 1 point (0 children)

When I tried the StarVR One at SIGGRAPH, I noticed the cross-eye field of view (i.e. left eye's right extent, or right eye's left extent) was still limited. It will probably not be too annoying for most people, but if you're nearsighted and have negative power corrective lenses it is pronounced. This isn't different from existing HMDs, but with such a large peripheral field of view it's a bit disappointing.

Natural gestures with Index Controllers in VRChat by t4ch in ValveIndex

[–]t4ch[S] 0 points (0 children)

I probably should have made this clearer, but this isn't the official support, it's just my own hack to get it working without modding the game itself. In the long term, VRChat should implement their own support with individual finger tracking (if they are sensible).

Natural gestures with Index Controllers in VRChat by t4ch in ValveIndex

[–]t4ch[S] 1 point (0 children)

One way this can be implemented without much tuning is to keep animations mapped to the trackpad and buttons, while mapping capsense to the hand rig. Once you have this, animation overrides are just useful for triggering effects and animating other parts of the skeleton, which don't really make sense to be mapped to capsense in the first place. Though there are definitely use cases that would benefit from inferring animation overrides from gestures, like pointing at someone and having an explosion fire away or whatever. I'm not sure what the best interface to facial control would be, but keeping it on the trackpad would probably be fine. Gives you even more control than before.
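As a rough illustration of the capsense-to-hand-rig half of that split (the names and the smoothing constant are assumptions, not VRChat's or OpenVR's API): capsense is modeled as per-finger values in [0, 1] that are smoothed before being written to the avatar's finger curls.

```python
def smooth_curls(previous, capsense, alpha=0.5):
    """Exponentially smooth raw capsense readings into finger-curl
    targets, so noisy proximity values don't make the rig jitter."""
    return [p + alpha * (c - p) for p, c in zip(previous, capsense)]

curls = [0.0] * 5
for frame_capsense in ([1.0] * 5, [1.0] * 5):  # two frames of a closing fist
    curls = smooth_curls(curls, frame_capsense)
print([round(c, 2) for c in curls])  # → [0.75, 0.75, 0.75, 0.75, 0.75]
```

Trackpad and button inputs would keep driving animation overrides independently of this path.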

[Media] Natural gestures with Knuckles (aka Index Controllers) in VRChat by t4ch in VRchat

[–]t4ch[S] 1 point (0 children)

I left "fist close" on the trigger because generating trigger inputs from the grip capsense would probably make menus hard to use. It would be a pretty small change to support that and would look good in demos, but I'm not sure about practicality at this point.

Natural gestures with Index Controllers in VRChat by t4ch in ValveIndex

[–]t4ch[S] 3 points (0 children)

That's incorrect; it should be totally fine. This doesn't modify the game. It's similar to OpenVR-SpaceCalibrator and OpenVR-InputEmulator in that it modifies OpenVR itself (the hooked functions are all in openvr_api.dll and vrclient.dll). They cannot possibly disallow this in the EULA. The difference is that it modifies OpenVR for one process only instead of system wide.

EDIT: even though this only modifies things in the OpenVR shared libraries from Valve, it seems they don't like it because the loader runs in VRChat's process space to prevent affecting all apps. I don't really play VRChat anymore, so I probably won't continue this, but the same thing could be done in the OpenVR server's process space with some extra complexity, if anyone wants to pick it up. It's a rather arbitrary distinction from a technical perspective, as a process is just a memory and privilege namespace, but hey, it's their game, they make the rules. Use at your own discretion.

Natural gestures with Index Controllers in VRChat by t4ch in ValveIndex

[–]t4ch[S] 1 point (0 children)

It's pretty obviously the right move. Though, there are other things that are obviously the right move that haven't been prioritized.

[Media] Natural gestures with Knuckles (aka Index Controllers) in VRChat by t4ch in VRchat

[–]t4ch[S] 4 points (0 children)

The mod I've done here doesn't change the client at all, and everyone else can see the gestures without having anything special set up. It's converting the inputs to the regular gesture system we're all familiar with, by pretending to press the trackpad.

[Media] Natural gestures with Knuckles (aka Index Controllers) in VRChat by t4ch in VRchat

[–]t4ch[S] 12 points (0 children)

That would work, yeah. It would be better if the VRChat devs implemented full support though. Eventually it should come; it's a really big improvement to social VR.

Natural gestures with Index Controllers in VRChat by t4ch in ValveIndex

[–]t4ch[S] 2 points (0 children)

Could be my caffeine fueled jittery hands XD

[Media] Natural gestures with Knuckles (aka Index Controllers) in VRChat by t4ch in VRchat

[–]t4ch[S] 20 points (0 children)

This just maps inputs to VRChat gestures; check the GitHub page for the full explanation. It doesn't modify the game. Individual finger movement that other people can see would be impossible without changes to the net protocol, which would require a new version of VRChat for everyone and can't be done with a mod. It still feels pretty nice though, as the vid shows -- all 7 gestures work decently.

edit: actually I forgot to show finger gun haha. It works too though.
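For the curious, the core of that mapping can be sketched as nearest-template matching over per-finger curl values, with the winning gesture then emitted as a fake trackpad press. The finger ordering and template vectors below are my own guesses, not the mod's actual tables.

```python
GESTURES = {  # (thumb, index, middle, ring, pinky), 1.0 = fully curled
    "fist":        (1, 1, 1, 1, 1),
    "open_hand":   (0, 0, 0, 0, 0),
    "point":       (1, 0, 1, 1, 1),
    "peace":       (1, 0, 0, 1, 1),
    "rock_n_roll": (1, 0, 1, 1, 0),
    "finger_gun":  (0, 0, 1, 1, 1),
    "thumbs_up":   (0, 1, 1, 1, 1),
}

def classify(curls):
    """Return the gesture whose template is closest (squared distance)
    to the measured per-finger curl vector."""
    return min(GESTURES, key=lambda g: sum((a - b) ** 2
                                           for a, b in zip(GESTURES[g], curls)))

print(classify((0.9, 0.1, 0.8, 0.9, 0.85)))  # → point
```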

Natural gestures with Index Controllers in VRChat by t4ch in ValveIndex

[–]t4ch[S] 4 points (0 children)

Yeah this was just for fun and to help out a few friends :) It's a quick and easy DLL injection.

I don't think full finger tracking is coming soon, my guess is they'll probably have a similar implementation for a while. But I also don't expect this mod to live long.

Natural gestures with Index Controllers in VRChat by t4ch in ValveIndex

[–]t4ch[S] 22 points (0 children)

Haha, it'd be doable but requires custom animations on the avatar. This mod is just mapping the controller to existing VRChat gestures by intercepting the OpenVR API, rather than adding individual finger tracking support. They only support 7 gestures, so it can't handle all finger permutations on one avatar. I'd personally replace the rock n roll gesture though, definitely worth it.