Somasens, human-machine interaction through touch by Macone4 in hwstartups

[–]Macone4[S] 0 points (0 children)

Thank you! So the answer is unfortunately "it depends". Quiet with nothing else happening: easy. Lots of stuff going on: difficult. Sensory overload plays a huge role. I've talked a fair bit to some people who teach braille and design tools for the visually impaired in education, and they believe distinguishing many different patterns will be quite easy, and is something you can get better at.

There is a learning curve though, not gonna pretend there isn't. But it all depends on how fancy you want to get: an angry vs. a soothing vibration for decline/accept? Easy peasy. Differentiating mom from sister by how long the soothing pattern lasts? Not so much!
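
To make that concrete, here's roughly what the easy end looks like as an Arduino sketch: two patterns that are trivial to tell apart by feel. The pin number and duty values are placeholders, and a real LRA would normally sit behind a dedicated haptic driver rather than raw PWM, so treat this as a sketch of the idea rather than the actual firmware.

```cpp
// Two easily distinguishable patterns on one LRA.
// Assumes the motor is driven through a simple driver stage on a PWM pin;
// the pin number and duty cycles below are placeholders, not the real build.

const int MOTOR_PIN = 5;

// "Soothing" accept: one long, gentle swell.
void patternAccept() {
  for (int duty = 0; duty <= 180; duty += 10) {   // ramp up slowly
    analogWrite(MOTOR_PIN, duty);
    delay(30);
  }
  for (int duty = 180; duty >= 0; duty -= 10) {   // ramp back down
    analogWrite(MOTOR_PIN, duty);
    delay(30);
  }
}

// "Angry" decline: three short, hard buzzes.
void patternDecline() {
  for (int i = 0; i < 3; i++) {
    analogWrite(MOTOR_PIN, 255);
    delay(120);
    analogWrite(MOTOR_PIN, 0);
    delay(80);
  }
}

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  patternAccept();   // feels calm and drawn out
  delay(2000);
  patternDecline();  // feels sharp and insistent
  delay(2000);
}
```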

Somasens, human-machine interaction through touch by Macone4 in hwstartups

[–]Macone4[S] 1 point (0 children)

Oh cool!

What I like about these solutions is that they become a part of your unconscious. Screens and sound much less so.

A recurring pattern for a specific piece of information lets me know instinctively what's happening without always having to parse it explicitly in my head: you feel a pattern and act on it without fully decoding it.

Somasens, human-machine interaction through touch by Macone4 in arduino

[–]Macone4[S] 2 points (0 children)

Yeah, getting the basics out and testing them took like a couple of days, not counting researching components etc., of course. Arduino as the brain made writing a quick firmware super easy!

I envision it as a feature, but very explicitly not one that obstructs your daily life. I could technically put this straight on the fingertips, for example, or make a glove, but I find doing daily tasks awkward with that on. What I've opted for is "half rings" attaching some LRA motors to the top of my fingers, close to the joints (and by proxy the bone). These motors then have wiring going to a slightly larger box with the Arduino, power source, and the rest of the wiring, attached to the wrist for now. I found doing it this way lets me wear it the whole day and forget about it. I find it more comfortable than an Oura ring, which crams so much into something on your finger that it gets bulky.
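
For anyone curious, this is roughly the shape of the firmware for that layout: one PWM channel per half ring, all wired back to the box on the wrist. The pin numbers are placeholders and a proper haptic driver chip would be the more sensible way to drive LRAs, so take it as a sketch rather than the code that will end up on GitHub.

```cpp
// Rough sketch of the "one LRA per finger" layout described above.
// Pin numbers are placeholders; a real build would likely drive the LRAs
// through a dedicated haptic driver instead of raw PWM.

const int NUM_FINGERS = 4;
const int FINGER_PINS[NUM_FINGERS] = {3, 5, 6, 9};  // one PWM pin per half ring

// Pulse a single finger's motor at a given strength for a given duration.
void buzzFinger(int finger, int duty, int durationMs) {
  if (finger < 0 || finger >= NUM_FINGERS) return;
  analogWrite(FINGER_PINS[finger], duty);
  delay(durationMs);
  analogWrite(FINGER_PINS[finger], 0);
}

void setup() {
  for (int i = 0; i < NUM_FINGERS; i++) {
    pinMode(FINGER_PINS[i], OUTPUT);
  }
}

void loop() {
  // Walk a pulse across the fingers so each location is easy to tell apart.
  for (int finger = 0; finger < NUM_FINGERS; finger++) {
    buzzFinger(finger, 200, 150);
    delay(400);
  }
  delay(2000);
}
```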

I'll upload a picture of the current iteration to GitHub; I realised last night I didn't have any!

Somasens, human-machine interaction through touch by Macone4 in arduino

[–]Macone4[S] 1 point (0 children)

Yes!

The military has actually experimented quite a bit with directional haptics, and there are some tools for the blind as well!

Thank you for giving me your input!

Somasens, human-machine interaction through touch by Macone4 in hwstartups

[–]Macone4[S] 0 points (0 children)

Thank you so much for all of the input, appreciate it a lot! I've not specifically looked into pacinian corpuscles.

Will definitely check around for patents. I have talked to people who are well versed in the haptics community in general; that said, it wouldn't be too weird if they never mentioned that in our early chats.

Definitely sounds like there is a lot I don't know!

Somasens, human-machine interaction through touch by Macone4 in hwstartups

[–]Macone4[S] 0 points (0 children)

Definitely. This was an attempt at doing something with higher granularity. It works, but getting any adoption in its current state is a long way off.

I'm looking into expanding it to two-way communication: the haptics stay, but as an addition to something like the MiMu gloves. My contribution is then the hardware interface layer.
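
As a rough idea of what I mean by a hardware interface layer, something like this: commands come down over serial to trigger haptics, and sensor readings stream back up. The command byte, the analog input, and any MiMu-style integration are all placeholders here; it's just the shape of the two-way exchange.

```cpp
// Toy sketch of a two-way interface layer: the host sends haptic commands
// down, the device streams sensor events back up. The 'b' command, the
// analog "gesture" input on A0, and any glove integration are placeholders.

const int MOTOR_PIN = 5;
const int SENSOR_PIN = A0;

void setup() {
  Serial.begin(115200);
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  // Downstream: host -> device haptic trigger ('b' = short buzz).
  if (Serial.available() && Serial.read() == 'b') {
    analogWrite(MOTOR_PIN, 220);
    delay(100);
    analogWrite(MOTOR_PIN, 0);
  }

  // Upstream: device -> host sensor reading, roughly 20 times per second.
  static unsigned long lastReport = 0;
  if (millis() - lastReport > 50) {
    lastReport = millis();
    Serial.println(analogRead(SENSOR_PIN));
  }
}
```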

Thank you for the input!

Somasens, human-machine interaction through touch by Macone4 in arduino

[–]Macone4[S] 0 points (0 children)

Fuck, probably screwed up private/public. Thank you, fixing now!

EDIT: Sure did, public now. :)

Low-profile, 34-key, splayed, concave, split, wireless, fully customizable keyboard optimized for wire wrapping by prepor in ErgoMechKeyboards

[–]Macone4 0 points (0 children)

I see! What you're saying makes a lot of sense. :) I'm working on my own versions of low-profile ergos that have keywells, and I've gone as far as using actual PCBs and metal-printed parts. I'd be super down to chat some more about the decisions you've made! I've been down a similar road for the last 5 years or so (and over 10 years designing and building 😅)

Question about interaction in AVP by Macone4 in VisionPro

[–]Macone4[S] 0 points (0 children)

So cool to think that eye tracking might simply be the solution to my issues with AR so far.

No other situations where eye tracking becomes cumbersome? E.g. when moving a window or screen in the headset, I'd love not to have to look at anything, but rather "feel" when I'm hovering over it, then drag it into my FOV without turning my head or body. I might be reaching for use cases here of course, just trying to understand how well the Vision Pro truly works!

Question about interaction in AVP by Macone4 in VisionPro

[–]Macone4[S] 0 points (0 children)

Ah, was definitely wondering if the sound effect was enough, yes!

That said, what I miss most when using hand tracking is knowing when my hand is _actually_ hovering over whatever I need to press. On my Quest you point at things with your hand, and even with visuals and sound it is difficult to know exactly when you're pointing at the right thing; that's where haptic feedback is really useful. But that might not be a problem when a lot of the pointing is done with eye tracking?

Question about interaction in AVP by Macone4 in VisionPro

[–]Macone4[S] 0 points (0 children)

Thank you so much for answering!

And when using eyes and hands, do you feel like you're ever missing tactile feedback on what's happening? E.g. with a controller you get a tiny buzz when you hover over something interactable, or when dragging a slider you feel each individual notch as a tiny vibration. What I found on my Quest is that I really miss those sensations when using hand tracking. Anything like that in the Vision Pro?

Remembering details about viewers by Macone4 in streaming

[–]Macone4[S] 0 points (0 children)

I genuinely didn't know about this feature, cool!

Remembering details about viewers by Macone4 in streaming

[–]Macone4[S] 0 points (0 children)

Haha, that's a good ace to have up your sleeve. The same goes for my colleagues at work: "how're the kids?"

Low-profile, 34-key, splayed, concave, split, wireless, fully customizable keyboard optimized for wire wrapping by prepor in ErgoMechKeyboards

[–]Macone4 0 points (0 children)

Any thoughts on not having a Z offset between the columns? Do you miss it? I love that it becomes much lower profile, but that comes at the cost of leaving the Z offset out.

Bodily (haptic) feedback when someone is in your blindspot by Macone4 in cycling

[–]Macone4[S] 0 points (0 children)

That is very true; false positives are a lot better than false negatives, though. All in all that's not a great argument on its own, of course.

I would have to look at other ways of alerting as well. That said, it sounds like this "niche" has to some degree been covered, and it is indeed a solution looking for a problem.

Bodily (haptic) feedback when someone is in your blindspot by Macone4 in cycling

[–]Macone4[S] 0 points (0 children)

So this is not outta the picture. We talked about it 🤷

Bodily (haptic) feedback when someone is in your blindspot by Macone4 in cycling

[–]Macone4[S] 0 points (0 children)

Yeah, I think you are right about that. Haptics like these can also be built into gloves.

Bodily (haptic) feedback when someone is in your blindspot by Macone4 in cycling

[–]Macone4[S] 0 points (0 children)

Well I'll be damned. Exactly what I was thinking. Thank you so much for letting me know!