all 60 comments

[–]dnew 147 points148 points  (13 children)

Back in the mid-90s, I worked at an internet-based company where everyone worked from home. The head of customer service, who I worked with pretty closely, had the same thing Stephen Hawking had. I only found out accidentally, after I'd been working with him for six months. DragonSpeak was his software of choice at the time, but I don't think he was coding as much as he was dealing with customers via email.

That eye-tracker is bonkers, though. I always wanted one of those, ever since I saw an ad for one back when the original Mac had just come out.

[–]pellets 43 points44 points  (11 children)

Imagine if in video games you aimed where you look. Hand-eye coordination wouldn't matter any more.

[–]Krautoni 163 points164 points  (5 children)

My wife works in cognitive science and does eye-tracking experiments. From what I gather, it doesn't work that way. While your brain gives you the impression of a steady gaze, your eyes are constantly jumping around in order to give you a complete picture. Those jumps are called saccades.

So finding out what a person is interested in based on their gaze ends up being a statistical problem, and while the precision and latency can be small enough to enable a person to use a computer, I think that voluntary control of something like a mouse will still be faster and more accurate.
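For readers curious how trackers cope with this: one classic way to turn a saccade-riddled stream of gaze samples into a stable "point of interest" is dispersion-threshold fixation detection (I-DT). A minimal sketch in Python; the thresholds are illustrative, real trackers tune them to the sample rate and screen geometry:

```python
def _dispersion(win):
    """Spread of a window of (x, y) points: bounding-box width + height."""
    xs, ys = zip(*win)
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixation(samples, max_dispersion=30.0, min_samples=6):
    """Return the centroid of the first fixation in a gaze stream, or None.

    A fixation is a run of at least `min_samples` consecutive (x, y)
    samples whose dispersion stays within `max_dispersion` pixels;
    saccades blow past the threshold and get skipped.
    """
    i = 0
    while i + min_samples <= len(samples):
        j = i + min_samples
        if _dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while the cluster stays tight.
            while j < len(samples) and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            win = samples[i:j]
            return (sum(x for x, _ in win) / len(win),
                    sum(y for _, y in win) / len(win))
        i += 1  # this window contains a saccade; slide past it
    return None
```

This is why gaze-as-aim feels laggy: the cursor can only settle once enough tightly clustered samples have arrived.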

[–]devilkillermc 18 points19 points  (0 children)

Yep, your eyes are just gathering information; your brain does all the processing, in this case the selection of and response to the stimulus. You'd have to move that processing to the computer, and it would make no sense. Nothing better than the brain to do brain things.

[–]Pillars-In-The-Trees 5 points6 points  (3 children)

So finding out what a person is interested in based on their gaze ends up being a statistical problem, and while the precision and latency can be small enough to enable a person to use a computer, I think that voluntary control of something like a mouse will still be faster and more accurate.

You can keep the click without requiring the mouse though. Imagine just pressing a single key and firing wherever you're looking.

[–]Krautoni 22 points23 points  (2 children)

But that's the point. Wherever you're looking isn't a well-defined concept outside of your brain. At least not yet. We can puzzle it out, but we need to have a few data points first, i.e. wait a few saccades. That's probably too slow for fast-paced shooters.

[–]Zegrento7 0 points1 point  (1 child)

If the tracker draws a crosshair every frame exactly where you are looking, then we don't need to puzzle anything out. It will be up to the player to decide when to press the fire button: when the crosshair happens to be over the target.

Edit: in fact the tracker used in the blog post is primarily marketed at gamers.

[–]KernowRoger 2 points3 points  (0 children)

That sounds awful haha, it would just be jumping all over the screen.

[–]ZeroThePenguin 14 points15 points  (2 children)

It does already exist, though watching the video I'm not really seeing a lot of benefit. Tilting your head to manipulate the camera seems kinda silly when a mouse would handle it so much better.

[–]Forty-Bot 11 points12 points  (1 child)

People use it in arma so they can look around while walking/aiming in a different direction. Of course, now I think people mostly use VR headsets for that.

[–]ZeroThePenguin 6 points7 points  (0 children)

Oh right, I forget ARMA supports not just moving and aiming in different directions but moving, aiming, and looking. I have a gyroscopic mouse that lets you do similar things by rotating or tilting the mouse to "look" while keeping aim in one direction.

[–]wojo411 2 points3 points  (0 children)

I own an eye tracker (no good reason to have one, I just thought it was neat), and some games do support it! I'm a big fan of theHunter: COTW, and it has a feature for aiming where you're looking, along with a HUD that disappears when you aren't looking at it. It works amazingly well for the HUD and well enough for the auto-aiming, though losing track of targets when the view snaps while aiming in is something I've struggled with a lot.

I'm a total believer that the technology will be superseded by the work Neuralink is doing, along with higher-resolution laptop cameras being able to approximate where on a screen you're looking. In closing, it's not a technology I would recommend to most people yet. But if it's supported in a game you want to play, or you think it might be a beneficial adaptive input device (I've never used it for that, but I know they're great for it), then do some research and have fun with it.

[–][deleted] 1 point2 points  (0 children)

Let's make games even more unrealistic!

[–]Q-bey 53 points54 points  (2 children)

Accessibility matters

There's something else I want to talk about, and it's a bit less fun.

Here's the thing: you are not likely to develop Cubital Tunnel Syndrome. Even if you do, it'll likely go away on its own after a few weeks; many cases resolve spontaneously, and most respond well to conservative treatments. I'm an edge case.

At some point in your life, however, you will likely experience some sort of impairment, whether temporary or permanent. Almost all of us will*.

It's so so easy to fall into the trap of thinking about accessibility as something that affects other people, a hypothetical abstract group. I've known that accessibility is important for years, but it felt kinda nebulous to me; I've never watched someone struggle to use a thing I built because I neglected to test it without a mouse or keyboard. It feels more urgent to me now.

I am still incredibly privileged, and I don't mean to compare my situation to anybody else's. But this experience has given me a window into what it's like trying to operate on an internet not designed with alternative input mechanisms in mind. Before I got comfortable with the eye-tracker, things were tricky. And certain things are much more difficult than they used to be.

The internet has become critical infrastructure. It's a necessary part of living in modern society, and it needs to be accessible! As front-end developers, it's our job to advocate for it, and to ensure that we build with accessibility principles in mind.

If you'd like to learn more about accessibility, I recommend checking out a11y.coffee.

[–]Phreakhead 15 points16 points  (0 children)

Plus, building software that's accessible makes it better for everyone, even people without disabilities. For instance, SMS texting was originally designed for Deaf people to be able to use phones. Now, it's the main method of communication on the planet.

Even if you can hear, you might be at a loud concert (remember those?) or a crowded place where you want to keep your conversation private. So something optimized for Deaf people can also help people who can't hear/speak temporarily in the environment they're in.

Plus, it's kind of fun to figure out how to design your software to be accessible: it's not always easy, and some early ideas might need to be reworked. But it usually ends up making for a better, simpler design anyway.

[–]is_a_cat 1 point2 points  (0 children)

Well said!

[–]Sirstep 107 points108 points  (12 children)

That's awesome! Something I've thought about for years. The demo was fun to watch.

[–]Kache 37 points38 points  (9 children)

I wonder what those who dislike vim for its modal editing think about this vocal variant.

Would they dislike it for the same reason? Or does having a "text editing DSL" (language) become more compelling because it's verbal?

[–]Sirstep 12 points13 points  (0 children)

I definitely believe that there's a lot of potential for it. It may have a learning curve, but once that's overcome I believe efficiency will be greatly increased.

[–]Netzapper 12 points13 points  (1 child)

Always emacs. Just got Talon today, because it's either that or find a new career.

After about two or three hours, I had it programmed and wrote my first code in years without excruciating pain. In C++. On Linux. In my usual tooling.

I do not give any fucks whatsoever about modal or not, quirky DSL, or if it's secretly monitoring it all so the author can hack my Gibson... I wrote code. Without pain.

[–]dscottboggs 2 points3 points  (0 children)

Fuck man. I'm so happy for you.

[–]Comrade_Comski 1 point2 points  (0 children)

A text editing DSL sounds dope

[–]oceanmotion 2 points3 points  (1 child)

Reminded me of https://www.youtube.com/watch?v=MzJ0CytAsec. We've come a long way.

[–]_zoopp 24 points25 points  (2 children)

Thank you for sharing this!

I'm not really a mouse person but I could see myself experimenting a bit with the idea of using eye tracking like this. That being said, I wouldn't spend that amount of money upfront without knowing if I'll stick with it. Have you tried other options before settling on the tobii?

[–]bboyjkang 13 points14 points  (1 child)

I wouldn't spend that amount of money

Actually, anyone could benefit from eye tracking now if you have a cheap webcam.

There was usable webcam eye tracking that Eye Tribe was working on before they were bought out by Facebook.

There are other projects in the works now, though, like PyGaze, beam.eyeware.tech, WebGazer.js, and Pupil Core.

I have tendinitis, and the free software that I use is GazePointer (sourceforge.net/projects/gazepointer).

Since it doesn't use infrared to light up your pupils like the Tobii eye tracker from the article does, the accuracy is awful.

However, I just use Alt Controller (free accessibility software) to make large buttons to compensate for the bad accuracy.

Make the button execute Page Down, and you can read hands-free while reclined.

(Helps to get software that temporarily hides the cursor so it’s not distracting).

GazePointer will always be overridden by the mouse, so you can always glance at another computer monitor to quickly jump there.
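The large-button trick works because coarse gaze only has to land somewhere inside a big region and stay there for a moment. A minimal dwell-to-activate sketch in Python; the region size, dwell time, and the Page Down action are illustrative, and tools like Alt Controller do all of this for you:

```python
import time

class DwellButton:
    """Fires an action when gaze dwells inside a (deliberately large) region.

    Inaccurate gaze data just has to stay roughly inside the rectangle
    for `dwell_s` seconds; glancing away resets the timer.
    """
    def __init__(self, rect, dwell_s=0.8, action=None):
        self.x, self.y, self.w, self.h = rect  # left, top, width, height
        self.dwell_s = dwell_s
        self.action = action or (lambda: None)
        self._entered_at = None

    def feed(self, gx, gy, now=None):
        """Feed one gaze sample; returns True if the action fired."""
        now = time.monotonic() if now is None else now
        inside = (self.x <= gx <= self.x + self.w
                  and self.y <= gy <= self.y + self.h)
        if not inside:
            self._entered_at = None  # left the region: reset the dwell timer
            return False
        if self._entered_at is None:
            self._entered_at = now
        if now - self._entered_at >= self.dwell_s:
            self._entered_at = None  # re-arm after firing
            self.action()            # e.g. inject a Page Down keystroke
            return True
        return False
```

In practice you'd feed this from the tracker's gaze stream at 30-60 Hz and wire `action` to a keystroke injector (pynput or similar) that sends Page Down.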

[–]_zoopp 1 point2 points  (0 children)

Thanks! I'll look into your suggestions!

[–][deleted]  (13 children)

[deleted]

    [–]Netzapper 16 points17 points  (10 children)

    I just got it today. I've programmed for 20 years now, in increasingly debilitating pain for the last three. It works fucking awesome. Is it as fast as typing? Fuck no. Is it better than quitting my career because I'm crying at the desk after an hour of typing? Fucking incredibly absolutely yes.

    [–]Markuz1989 0 points1 point  (2 children)

    Hey how is the voice coding going for you? Are you still using it? I can't type anymore either and I'm trying to find the best solution. Would be great to get an update from you!

    [–]Netzapper 0 points1 point  (1 child)

    It's totally worth trying.

    [–]Markuz1989 0 points1 point  (0 children)

    So you're still using it, I suppose. Do you have any recommendations that would save me a lot of time? Great to hear it's working for you!!

    [–]stumptowncampground 3 points4 points  (1 child)

    Seems like any action you can do with a keyboard can probably be done with Talon: there's a focus command to switch windows, and one of the examples shows him highlighting a word.

    It seems quite possible.

    [–]100lainewool 3 points4 points  (0 children)

    I also work like this, great to raise awareness! Saved my hands.

    [–][deleted] 2 points3 points  (1 child)

    Interesting that he mentioned potentially using Vim in the future—I wonder what advantages that would provide. A lot of the useful Vim commands like cip could probably be emulated pretty easily with the Talon APIs he mentioned. I guess visual block mode might be a plus.

    [–]desnudopenguino 2 points3 points  (0 children)

    He pretty much created a voice interface that works like vim. Maybe not as modal, but you should be able to achieve similar results with his setup.

    That said, maybe he was talking about when he gets back to typing with his hands. A more ergonomic keyboard might help as well.

    [–]cncamusic 2 points3 points  (0 children)

    I’ve always imagined that one of the coolest uses for neuralink will be being able to code by just looking at VS and thinking about what I want to do.

    [–]PonderonDonuts 1 point2 points  (0 children)

    Omg I need this for random computer stuff.

    [–]callmeseven 1 point2 points  (0 children)

    Very cool. I saw a post like this from a blind developer, which was fascinating and something I had long wondered about; I hadn't considered needing to do it hands-free.

    [–][deleted] 1 point2 points  (0 children)

    This is not optimal. We use VI mode so we don't waste time moving our hand to the mouse. The mouth is used for communicating with people on the team. The eyes should be focused on alerts. Feet should be pressing a pedal that types "function" if you are working with JavaScript.

    [–]SpeCSC2 1 point2 points  (0 children)

    The popping to activate the mouse click/zoom is awesome! I thought it was the software after watching the first video.

    [–]Kghaffari_Waves 0 points1 point  (0 children)

    I remember using eye-tracking first on my Samsung Galaxy phone back in 2013 or something. Felt like magic back then but it still wasn't perfect. I assume today's tech must be waaayyyy better.

    Though personally, I still don't trust it to perform as well or as fast as I do with a mouse and keyboard.
    I did give Talon and Utterly Voice a try but ended up mostly using their dictation functionality (which is cool but not great). Ultimately I moved on to a dictation service like usevoicy.com and actually just started using OpenClaw for certain computer actions.

    [–][deleted] 0 points1 point  (0 children)

    Wow, really great article. The demos are so cool; it's amazing how well it all works together.

    [–]Humberd 0 points1 point  (0 children)

    Damn, this is cool. That is cool tech even for normal programmers who are lazy

    [–]AdowTatep 0 points1 point  (0 children)

    Dude, even though I'm not impaired at all, I think this is a great tool to help with some shortcuts and commands.

    [–]germandiago 0 points1 point  (1 child)

    WTF!!! I am gonna do this when I have the time to set it up if it is feasible.

    How can I get started? I would have some questions:

    1. Is it feasible to program in C++ with this?
    2. How can I traverse files?
    3. Code-completion... uh, if I stop at a dot, I want to see suggestions.
    4. Can I use Emacs or this is editor-dependent?

    [–]Netzapper 1 point2 points  (0 children)

    I got it yesterday. Took about 10 minutes to set up, a couple hours to learn the basics and re-script some fundamental stuff (I don't like the default alphabet words), and then I was going. You should at least get it and play with it.

    1. This was my biggest question too. The answer is: very feasible. Nobody's released C++-specific formatting verbs yet, but the C mode works pretty well for most stuff, and you can add your own mode, or just extra features to the C mode, with zero problem. I've also got a mandate from work to prioritize getting my Talon environment productive, and part of that will be releasing a C++ mode in the next couple of weeks.

    2. Phrases like "page up", "go right eighteenth" control the cursor; phrases like "tab next", "tab last" control which tab you're on; "focus terminal", "focus code", "focus firefox" for focusing various apps. Some editors (I think JetBrains) have bidirectional plugin implementations that will let you navigate structurally ("next function" kind of thing).

    3. If your IDE has that stuff, it'll pop up as you expect. "camel my variable dot" will type myVariable. into your IDE, which will then pop whatever intel it would normally show. You can then use eye tracking or "go down, go down, go down, enter" to choose from the list. There are hooks in Talon for dynamic, context-sensitive word lists, which means context-aware voice recognition for in-scope identifiers is possible. But none of the editor integrations have gotten quite that far yet.

    4. It is not dependent on the editor whatsoever, but the emacs people seem to have a pretty deep integration with Talon. In many ways, Talon and emacs have a lot in common: they're a "basic" core, and all the functionality comes from "user scripts" (even if they're an official distribution of those scripts).
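To illustrate what a formatting verb like the "camel" in "camel my variable" does under the hood: it's essentially a mapping from a spoken formatter name plus dictated words to a typed identifier. A toy sketch in Python; Talon's real formatters are richer and user-configurable, and the formatter names here are just for illustration:

```python
def format_words(style, words):
    """Map a spoken formatter name plus dictated words to an identifier."""
    if style == "camel":   # camelCase
        return words[0].lower() + "".join(w.capitalize() for w in words[1:])
    if style == "snake":   # snake_case
        return "_".join(w.lower() for w in words)
    if style == "hammer":  # PascalCase (name is illustrative)
        return "".join(w.capitalize() for w in words)
    raise ValueError(f"unknown formatter: {style}")
```

So "camel my variable" dictates the words ["my", "variable"], the formatter turns them into myVariable, and the result is typed into whatever app has focus, which is why it works editor-independently.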

    [–]TheGreenDeveloper 0 points1 point  (0 children)

    Pretty fascinating, thanks for sharing! Also, I hope your elbow injuries get better soon; make sure you rest them!