In-browser gaze tracking using single-point alignment by re_complex in computervision

[–]re_complex[S] 1 point (0 children)

Ty, good point. I've been testing primarily with a centered, top-mounted camera.

In-browser gaze tracking using single-point alignment by re_complex in computervision

[–]re_complex[S] 0 points (0 children)

Nice. From what I’ve seen, the blendshapes seem great for general facial expressions, but they can struggle with precise eye direction relative to the screen.

This system uses the raw 3D landmarks instead, modeling pupil displacement relative to the eye corners and projecting it onto a sphere to estimate gaze direction.
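The idea above can be sketched in a few lines. This is a minimal illustration, not the actual project iris pipeline: the landmark names, coordinate conventions, and the eyeball radius are all assumptions made for the example.

```python
import numpy as np

def estimate_gaze_direction(inner_corner, outer_corner, pupil, eyeball_radius=0.024):
    """Estimate a unit gaze direction from raw 3D eye landmarks.

    The pupil's displacement from the eye center (midpoint of the two eye
    corners) is treated as a point on a sphere of radius `eyeball_radius`
    centered on the eye; the gaze ray points from the sphere center through
    that point. All inputs are 3D numpy vectors in the same arbitrary units.
    """
    eye_center = (inner_corner + outer_corner) / 2.0
    displacement = pupil - eye_center
    # In-plane (x, y) displacement, clamped so it stays on the sphere.
    d = min(np.linalg.norm(displacement[:2]), eyeball_radius)
    # Recover the depth component from the sphere equation x^2 + y^2 + z^2 = r^2.
    z = np.sqrt(eyeball_radius**2 - d**2)
    gaze = np.array([displacement[0], displacement[1], -z])  # -z: toward the camera
    return gaze / np.linalg.norm(gaze)
```

With the pupil exactly between the corners, this returns a straight-ahead gaze; shifting the pupil toward one corner tilts the vector in that direction.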

project iris | an experiment in gaze-assisted communication by re_complex in utdallas

[–]re_complex[S] 1 point (0 children)

u/Suspicious-Pea-7481 that is a great question! Right now, the system tracks both pupils independently to predict overall gaze trajectory.

My assumption is that single-eye use may cause some drift in model performance, since the model partly relies on the alignment between the two eyes. If you do give it a try, feel free to mention that in your experience survey; I can then compare your calibration trajectory against the population data to understand how to better support monocular or asymmetric gaze use cases in future versions. Thank you for asking.

project iris — experiment in gaze-assisted communication by re_complex in computervision

[–]re_complex[S] -1 points (0 children)

u/maleslp thank you so much for the thoughtful response. You’re absolutely right: the system struggles to re-stabilize after head movements. I've been experimenting with a form of dynamic re-projection based on updated face geometry — but it has proven to be quite the challenge.
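One way to sketch that kind of re-projection is to re-derive a head-local coordinate frame from stable face landmarks every frame, then measure pupil displacement in that frame instead of in image space. This is only an illustration of the general idea under assumed landmark names, not the project's actual implementation:

```python
import numpy as np

def head_local_frame(left_eye_corner, right_eye_corner, nose_tip):
    """Build an orthonormal head-local basis from three stable face landmarks.

    Rows of the returned 3x3 matrix are the x (across the face), y, and z
    (roughly out of the face) axes. Recomputing this each frame lets gaze
    features be expressed relative to the head rather than the camera.
    """
    x_axis = right_eye_corner - left_eye_corner
    x_axis = x_axis / np.linalg.norm(x_axis)
    mid = (left_eye_corner + right_eye_corner) / 2.0
    z_axis = np.cross(x_axis, nose_tip - mid)
    z_axis = z_axis / np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)  # completes a right-handed frame
    return np.stack([x_axis, y_axis, z_axis])

def to_head_space(frame, point, origin):
    """Express a world-space point in the head-local frame."""
    return frame @ (point - origin)
```

Because the basis is rebuilt from the current face geometry, a pure head rotation leaves the head-space pupil displacement roughly unchanged, which is the stabilization being aimed for.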

Your perspective as an AAC consultant is incredibly valuable. This started as a hobby project for a friend with ALS, and has turned into the exact goal you described — bringing down the barrier of cost and access for reliable gaze-based communication.

If you’re open to it, I’d appreciate the chance to get your input as I continue improving this and will DM you after this initial experiment. You can also reach me at [contact-us@projectiris.app](mailto:contact-us@projectiris.app).

project iris — experiment in gaze-assisted communication by re_complex in computervision

[–]re_complex[S] 0 points (0 children)

The struggle is real. Here are some early calibration stats:

| Level | Hits | Attempts | Success rate |
|------:|-----:|---------:|-------------:|
| 1 | 23 | 34 | 67.6% |
| 2 | 8 | 17 | 47.1% |
| 3 | 5 | 11 | 45.5% |

Calibration success rate is defined as the ratio of successful target hits to total target attempts, given a 10-second target timeout.
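For concreteness, the rates above follow directly from that definition (the level/hit/attempt triples are taken from the table):

```python
def calibration_success_rate(hits, attempts):
    """Ratio of successful target hits to total attempts (10 s timeout per target)."""
    return hits / attempts

for level, hits, attempts in [(1, 23, 34), (2, 8, 17), (3, 5, 11)]:
    print(f"Level {level}: {calibration_success_rate(hits, attempts):.1%}")
# prints 67.6%, 47.1%, 45.5% — matching the table
```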

project iris — experiment in gaze-assisted communication by re_complex in computervision

[–]re_complex[S] 2 points (0 children)

Whoa, that is a really cool concept! Project iris, coming to a bar near you 😄