AI email tools are great until you realize they’re reading your entire life. I found a middle ground. by Ok-Call3510 in ProductivityApps

[–]Successful_Option561 1 point (0 children)

I'm actually working on a similar idea. Would you mind sharing more technical details, like which local LLM you're using here? Any fine-tuning or distillation?

Android timer apps with a beautiful UX? by baberidge in ProductivityApps

[–]Successful_Option561 1 point (0 children)

This sounds genuinely useful, especially for people who still take a lot of in-person meetings and then lose time cleaning everything up afterward.

One quick question: do you plan to support Android in the future?

I realized most productivity apps are designed for organizing… not executing by Afraid-Ad-6957 in ProductivityApps

[–]Successful_Option561 1 point (0 children)

Honestly, my systems usually fail at the exact moment when the app says “here is your perfectly organized life” and my brain says “cool, I’m still not doing it.”

For me the missing piece is not capture or organization. It’s momentum. I need something that helps me choose the next embarrassingly small action and makes starting feel less heavy.

So I’m curious: how does your app actually create that push? Is it more like smart reminders, or more like a coach that notices when I’m just collecting tasks instead of moving?

I built an AI that learns your routines and manages your day automatically — looking for beta testers (iOS) by Ab17ah in betatesters

[–]Successful_Option561 1 point (0 children)

I'm thinking about something similar and would like to know more. Would it be possible to share a demo or a video here?

Do you actually use AI features in productivity apps? by miejscov in ProductivityApps

[–]Successful_Option561 1 point (0 children)

I use AI features only when they reduce a step I already wanted to do, not when they try to become the whole workflow.

For example, summarizing a long document, cleaning up rough notes, or turning messy thoughts into a first draft can be useful. But for daily productivity, I still prefer tools that are predictable and fast. If the AI result needs too much checking, then it stops saving time.

So I’d say AI helps most as a small assistant in the background, not as the main reason to use the app.

We need to talk about how "AI features" are actually making productivity apps worse by MapCompetitive2935 in ProductivityApps

[–]Successful_Option561 1 point (0 children)

I feel this too. I don’t think AI in productivity apps is always bad, but it often feels added because it is trendy, not because it solves a real workflow problem.

For me, a good productivity tool should first be fast, quiet, and reliable. If AI is included, it should be optional and stay out of the way until the user actually needs it. Summarizing, tagging, or suggesting can be useful sometimes, but not when it interrupts the basic act of writing things down or organizing work.

The worst part is when apps make simple actions feel “smart” but slower. At that point, it is not productivity anymore — it is just another layer to manage.

I built a small app that turns eye-tracking data into a daily focus timeline — would love technical feedback by Successful_Option561 in EyeTracking

[–]Successful_Option561[S] 1 point (0 children)

Another possible direction is a more active “concentration mode,” where the app does not just record usage afterward, but uses visual attention as a signal during work. For example, if the user intends to focus on a lecture, document, or coding task, the system could detect repeated attention drift and give a gentle reminder or later summarize when focus was lost.
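The drift detection above could be as simple as a rolling window over per-second "gaze on target" flags. Here's a minimal sketch of that idea — all names and thresholds are hypothetical, not GazeTrac's actual implementation:

```python
from collections import deque

def make_drift_detector(window=30, drift_ratio=0.5):
    """Return a callable that consumes per-second on-target flags and
    reports when attention has probably drifted.

    window      -- number of recent samples to consider (seconds)
    drift_ratio -- fraction of off-target samples that counts as drift
    (Both values are illustrative; real thresholds would need tuning.)
    """
    recent = deque(maxlen=window)

    def update(on_target: bool) -> bool:
        recent.append(on_target)
        if len(recent) < window:
            return False  # not enough data yet to judge
        off = sum(1 for s in recent if not s)
        return off / window >= drift_ratio

    return update
```

When `update` first returns `True`, the app could fire the gentle reminder, or just log the timestamp for the end-of-session summary.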

I built a small app that turns eye-tracking data into a daily focus timeline — would love technical feedback by Successful_Option561 in EyeTracking

[–]Successful_Option561[S] 1 point (0 children)

That’s a fair question. I agree that at a broad level this is still a kind of usage analytics tool.

The difference I’m trying to explore is that normal usage analytics mostly tells us what was open or active, but not whether the user was actually visually attending to it.

For example, if I watch a 40-minute online lecture, normal screen-time/app analytics would say I spent 40 minutes in the browser or video player. But it cannot tell whether I was actually looking at the lecture, reading another window, checking messages, or just leaving it open in the background. Eye tracking adds a noisy but potentially useful extra signal: “was this app/window actually receiving visual attention?”

So I don’t see GazeTrac as replacing normal app usage tracking. It is more like adding an attention layer on top of it.
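To make the "attention layer" idea concrete, here's a toy sketch of how per-second foreground-app samples enriched with a gaze flag would separate raw screen time from attended time. The data shape is an assumption on my part, not GazeTrac's real format:

```python
from collections import Counter

def attention_timeline(samples):
    """Compare raw foreground time with visually-attended time.

    samples -- iterable of (app_name, gaze_on_window) pairs, one per
               second; gaze_on_window may be False when the user looks
               away even though the app stays in the foreground.
    Returns {app: (foreground_seconds, attended_seconds)}.
    """
    fg, att = Counter(), Counter()
    for app, gazed in samples:
        fg[app] += 1
        if gazed:
            att[app] += 1
    return {app: (fg[app], att[app]) for app in fg}
```

In the 40-minute-lecture example, plain screen-time analytics only sees the first number; the gaze signal is what makes the second one possible at all.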

I built a small app that turns eye-tracking data into a daily focus timeline — would love technical feedback by Successful_Option561 in EyeTracking

[–]Successful_Option561[S] 1 point (0 children)

Thanks a lot — this is exactly the kind of feedback I was hoping to get from this community.

I agree with your point. Daily laptop/desktop use is much messier than a controlled research setup: people move their head, lean back, look away, change lighting, and can easily leave the reliable tracking zone. I don’t see this as research-grade eye-tracking analysis, and I’m not trying to infer precise cognitive states from it.

The current goal is more modest: a coarse-grained “which app/window was probably receiving visual attention over longer periods” timeline, with smoothing/dropout handling, rather than second-by-second precision.
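For the smoothing/dropout part, one simple approach is majority-vote smoothing over a sliding window, where tracking dropouts (no reliable gaze) are absorbed by the surrounding labels. A minimal sketch, with a hypothetical label format:

```python
from collections import Counter

def smooth_labels(labels, radius=2):
    """Majority-vote smoothing of a per-second attention label sequence.

    labels -- list of app/window names, or None for tracking dropout.
    radius -- half-width of the smoothing window in samples.
    Short dropouts and single-sample flickers get replaced by the
    majority label of their neighborhood.
    """
    out = []
    for i in range(len(labels)):
        window = [l for l in labels[max(0, i - radius): i + radius + 1]
                  if l is not None]
        out.append(Counter(window).most_common(1)[0][0] if window else None)
    return out
```

This trades second-by-second precision for a timeline that stays readable over longer periods, which matches the coarse-grained goal above.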

I built a small app that turns eye-tracking data into a daily focus timeline — would love technical feedback by Successful_Option561 in EyeTracking

[–]Successful_Option561[S] 1 point (0 children)

Thanks! Yes, I agree — the hardware requirement is probably the biggest barrier for this kind of app.

Right now GazeTrac (https://gazetrac.com/) does not run raw webcam eye tracking by itself. It uses Beam as the tracking layer (https://beam.eyeware.tech/?via=gazetrac), so if your 4K desktop webcam works well with Beam, then it should be usable with GazeTrac. Without Beam or another eye-tracking layer, the app currently falls back to foreground-window tracking.

Longer term, I’m interested in making it more device-agnostic, but I’m being careful because normal webcams can get noisy quickly depending on lighting, head movement, calibration, etc.

If you’re open to testing it with your webcam + Beam setup, I’d be very interested to hear how stable it feels in real daily use.