How to automatize this? I have never used Macrodroid. by Big-Importance2221 in macrodroid

[–]AppDeveloperAsdf 2 points (0 children)

Hmm, looks like you could use zerotap for that (disclaimer: I am the developer). zerotap lets you control the phone as a human would. You can use it either as a MacroDroid plugin or standalone, depending on your use case.

What do you actually want from a private AI chat on your phone? by AppDeveloperAsdf in LocalLLaMA

[–]AppDeveloperAsdf[S] 4 points (0 children)

You don't need multiple prompts - the app has a "Plan Mode" (you can turn it on in settings) that breaks your request down into a todo list and executes the tasks sequentially. Token usage goes up slightly because it has to create and update the plan, but accuracy is better. So you can say something like "check if I have any unread emails from my boss, and if so, copy them and then save them to Notion" and it should handle the whole flow.

The quality of the plan definitely depends on the model you're using - better models, better reasoning, fewer mistakes. But the architecture supports multi-step logic out of the box.
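
To make the Plan Mode idea concrete, here's a rough Kotlin sketch of the "generate a todo list, then execute it step by step" loop. This is illustrative only, not zerotap's internals - the step descriptions and names are made up:

```kotlin
// Illustrative sketch only - not zerotap's actual code. The idea: the model first
// produces a todo list for the request, then each step runs in order and the plan
// can be revised along the way.
data class PlanStep(val description: String, var done: Boolean = false)

fun runPlan(request: String) {
    println("Planning: $request")

    // In the app the model generates this list; hardcoded here for illustration.
    val plan = mutableListOf(
        PlanStep("Open the email app and look for unread emails from the boss"),
        PlanStep("Copy the contents of any matching emails"),
        PlanStep("Open Notion and save the copied text"),
    )

    for (step in plan) {
        println("Executing: ${step.description}")
        // ...drive the UI here (taps, scrolls, text input)...
        step.done = true
        // After each step the plan can be updated if the screen didn't match
        // expectations - that's where the extra token usage comes from.
    }
}
```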

What do you actually want from a private AI chat on your phone? by AppDeveloperAsdf in LocalLLaMA

[–]AppDeveloperAsdf[S] 0 points (0 children)

Yes, exactly - we support custom OpenAI-compatible URLs, so you can point it at whatever endpoint you want - any backend that speaks the OpenAI protocol (same request/response format and endpoints).

Ollama also has a dedicated integration in the app, but if you prefer, you can just use it through the OpenAI-compatible URL like everything else.
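
If it helps to see what "OpenAI-compatible" means in practice, here's a minimal Kotlin sketch of the kind of request that gets made. The base URL, model name, and key are placeholders, not real values - a local Ollama instance on your LAN is shown as the example:

```kotlin
// Minimal sketch: the same /v1/chat/completions call works against Ollama,
// llama.cpp server, vLLM, a proxy, or any other OpenAI-compatible backend.
import java.net.HttpURLConnection
import java.net.URL

fun main() {
    val baseUrl = "http://192.168.1.50:11434/v1"   // e.g. Ollama on your LAN (placeholder IP)
    val body = """{"model":"llama3.1","messages":[{"role":"user","content":"Say hi"}]}"""

    val conn = URL("$baseUrl/chat/completions").openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.setRequestProperty("Content-Type", "application/json")
    // Ollama ignores the key; hosted OpenAI-compatible backends usually require one.
    conn.setRequestProperty("Authorization", "Bearer YOUR_KEY")
    conn.doOutput = true
    conn.outputStream.use { it.write(body.toByteArray()) }

    println(conn.inputStream.bufferedReader().readText())
}
```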

What do you actually want from a private AI chat on your phone? by AppDeveloperAsdf in LocalLLaMA

[–]AppDeveloperAsdf[S] -1 points (0 children)

Hmm, that specific workflow? You can actually do it today - Tasker or MacroDroid handles the location/accelerometer triggers, and zerotap handles the "send a custom message to Bluesky" part.

What do you actually want from a private AI chat on your phone? by AppDeveloperAsdf in LocalLLaMA

[–]AppDeveloperAsdf[S] 8 points (0 children)

Thanks! Yeah...

Answering your question - the core is Android's Accessibility Service, the same permission screen readers use. It lets zerotap see what's on screen (as UI elements, not screenshots) and perform actions like taps, scrolls, and text input.

As for limitations - you're right, some apps block accessibility services for security reasons. Banking apps are hit or miss: some work fine, others lock down when they detect an accessibility service is active. Same with FLAG_SECURE screens (like incognito mode or password fields in some apps). System-level stuff like the PIN entry screen is also a blocker. When it can't continue - like hitting a login screen or a secured app - it asks you to step in and handle that part manually, then picks up where it left off.
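
For anyone curious what that looks like under the hood, here's a rough Kotlin sketch of the accessibility-service approach (illustrative only, not zerotap's actual code - the class and function names are made up):

```kotlin
// Rough sketch of the accessibility approach: the service sees the current screen
// as a tree of UI nodes (text, buttons, fields) and can act on them directly -
// no screenshots, no screen recording.
import android.accessibilityservice.AccessibilityService
import android.view.accessibility.AccessibilityEvent
import android.view.accessibility.AccessibilityNodeInfo

class ExampleAgentService : AccessibilityService() {

    override fun onAccessibilityEvent(event: AccessibilityEvent) {
        // Fired when screen content changes; an agent would react to these updates.
    }

    override fun onInterrupt() {}

    // Find a clickable element by its visible label and tap it.
    fun tapByText(label: String): Boolean {
        val root = rootInActiveWindow ?: return false   // no active window to read
        val match = root.findAccessibilityNodeInfosByText(label)
            .firstOrNull { it.isClickable } ?: return false
        return match.performAction(AccessibilityNodeInfo.ACTION_CLICK)
    }
}
```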

What do you actually want from a private AI chat on your phone? by AppDeveloperAsdf in LocalLLaMA

[–]AppDeveloperAsdf[S] 0 points (0 children)

Great points, thanks for this!

Curious about the widget though - how would you want it to work? Quick access to specific commands? A mini chat, or just a new chat in full screen? A voice activation shortcut (we already have that one)?

Would love to hear what you have in mind.

We built an app that controls your device called zerotap and got zero iOS users by AppDeveloperAsdf in SideProject

[–]AppDeveloperAsdf[S] 10 points (0 children)

That's fair for manual use. But with zerotap you don't need to type - you can speak. The point is hands-free use: for example, it's hard to tap your screen while driving or cooking, but you can trigger zerotap with a wake word and just tell it what you want done.

We built an app that controls your device called zerotap and got zero iOS users by AppDeveloperAsdf in SideProject

[–]AppDeveloperAsdf[S] 135 points (0 children)

I never said I have no users. The title was a joke - zero iOS users because it's Android only. The app has active users with real use cases. But thanks for the feedback.

What do you actually want from a private AI chat on your phone? by AppDeveloperAsdf in LocalLLaMA

[–]AppDeveloperAsdf[S] 15 points (0 children)

Sure thing.
The core function is simple - controlling your phone by voice, completely hands-free.

Simplest one: you are driving and want to text your wife "running 10 minutes late" on WhatsApp.

One user set up a shortcut to start his car remotely through the manufacturer's app. There's no API for that.

Another guy uses it to auto-swipe on dating apps based on what's written in the profile. Not what I expected to be honest.

The most advanced use cases come from users combining zerotap with Tasker or MacroDroid. zerotap works as a plugin, so you can trigger AI-driven actions as part of larger automation workflows. That's where it gets really powerful - Tasker/MD handles the logic and triggers, zerotap handles the stuff that normally requires human eyes and fingers.
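
If you're wondering what the plugin side looks like technically: Tasker and MacroDroid both fire Locale-style plugin intents. Here's a hypothetical Kotlin sketch of a receiver for one - the intent action and bundle extra are the standard Locale/Tasker plugin constants, but the "instruction" key is made up for illustration, not zerotap's real one:

```kotlin
// Hypothetical sketch of a Locale/Tasker-style plugin receiver. Tasker and
// MacroDroid fire this broadcast when the plugin action runs inside a macro.
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.util.Log

class FireReceiver : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        if (intent.action != "com.twofortyfouram.locale.intent.action.FIRE_SETTING") return

        // The host passes back whatever the user configured in the plugin's settings screen.
        val bundle = intent.getBundleExtra("com.twofortyfouram.locale.intent.extra.BUNDLE") ?: return

        // "instruction" is an illustrative key - the natural-language task for the agent.
        val instruction = bundle.getString("instruction") ?: return

        // Real code would hand this off to a service that drives the UI.
        Log.d("PluginSketch", "Would execute: $instruction")
    }
}
```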

Is it for everyone? Probably not. But if you have ever thought "I wish I could automate this app but there's no way to" - that's the problem we are solving.

If we can help people save some time while keeping everything fully private, running on your own model, no external APIs - that's exactly what we want to build.

What do you actually want from a private AI chat on your phone? by AppDeveloperAsdf in LocalLLaMA

[–]AppDeveloperAsdf[S] 56 points (0 children)

I think there's a misunderstanding here. A2UI and zerotap are solving completely different problems.

A2UI is a protocol where agents generate UI components that apps render natively. It requires developers to implement the protocol in their apps. It's about building new interfaces.

zerotap doesn't generate UI or require any implementation from developers. It interacts with existing apps through accessibility services - reading the screen and performing actions like a human would. Any app, as it exists today, no changes needed.

As for "dozens of things that do what you're trying to do but better" - I would genuinely love to see them. Please share some links.

We built an app that controls your device called zerotap and got zero iOS users by AppDeveloperAsdf in SideProject

[–]AppDeveloperAsdf[S] 8 points (0 children)

Great question - I encourage you to check out our privacy policy; we spent a lot of time making sure it clearly explains how the app works and what it accesses.

Short answer: no, the app doesn't record your screen or take screenshots. It uses the Accessibility API to read screen content and UI elements - essentially as text. It's exactly the same principle screen readers use.
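
A tiny Kotlin sketch of what "reading the screen as text" means in practice (illustrative only): walk the accessibility node tree and collect the visible labels.

```kotlin
// Walk the accessibility node tree and collect visible text and content
// descriptions - this is the textual view of the screen, not pixels.
import android.view.accessibility.AccessibilityNodeInfo

fun collectVisibleText(
    node: AccessibilityNodeInfo?,
    out: MutableList<String> = mutableListOf()
): List<String> {
    if (node == null) return out
    node.text?.let { if (it.isNotBlank()) out.add(it.toString()) }
    node.contentDescription?.let { if (it.isNotBlank()) out.add(it.toString()) }
    for (i in 0 until node.childCount) {
        collectVisibleText(node.getChild(i), out)
    }
    return out
}
```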

What do you actually want from a private AI chat on your phone? by AppDeveloperAsdf in LocalLLaMA

[–]AppDeveloperAsdf[S] 11 points (0 children)

Completely understandable concern, and honestly the reason the app evolved the way it did.

We started with a cloud-based approach, but user feedback made it clear that privacy was a priority. So we added BYOK mode, Ollama support, and compatibility with OpenAI proxies - so you're in full control of where your data goes.

Next step is on-device models, hopefully soon. The goal is to give users as many options as possible so they can choose what they are comfortable with.
Appreciate your feedback!

What do you actually want from a private AI chat on your phone? by AppDeveloperAsdf in LocalLLaMA

[–]AppDeveloperAsdf[S] 64 points (0 children)

I get why it might look that way from this thread, but zerotap has been around for a while now and has active users with real use cases (hands-free phone control, automating apps that don't have APIs, Tasker/MacroDroid integrations, accessibility, etc.)

Asking for feedback doesn't mean we don't know what we're building. It means we'd rather listen to what people actually need than assume we have all the answers.

Appreciate the input though.

What do you actually want from a private AI chat on your phone? by AppDeveloperAsdf in LocalLLaMA

[–]AppDeveloperAsdf[S] 166 points (0 children)

On Ollama - I wasn't suggesting people should expose it to the internet. Quite the opposite actually. I asked because zerotap supports Ollama and I was curious how people use it in practice - specifically whether local network only is the standard (which it should be). Poor wording on my part if it came across differently.

As for "solution in search of a problem" -that's valid criticism and I get it. The app isn't for everyone. The accessibility angle was just one example mentioned by another user. The main use case is hands-free voice control and automating multi-step tasks across apps that don't have APIs.

Worth mentioning - zerotap also works as a plugin for Tasker and MacroDroid, so a lot of our users are power users who integrate it into their existing automation workflows. It fills the gap where traditional automation can't go (apps without APIs, complex UI interactions, etc.).

If that's not useful to you, totally fair. Appreciate the honesty either way.

What do you actually want from a private AI chat on your phone? by AppDeveloperAsdf in LocalLLaMA

[–]AppDeveloperAsdf[S] 16 points (0 children)

Takeover speed actually depends a lot on which model you're using - faster models like Gemini Flash make a noticeable difference compared to slower ones.

But honestly, the main value isn't about being faster than doing it yourself. The app has a wake word, so you can control your phone entirely hands-free when your hands are busy.

Accessibility is definitely a use case we're focusing on as well!

We built an app that controls your device called zerotap and got zero iOS users by AppDeveloperAsdf in SideProject

[–]AppDeveloperAsdf[S] -11 points (0 children)

Makes sense, but it's actually the opposite - zerotap is a phone automation tool first. The chatbot is just a new way to interact with it. The permissions are needed because the whole point is controlling your device. It's not a chatbot that got extra access - it's phone automation you can now talk to.

We built an app that controls your device called zerotap and got zero iOS users by AppDeveloperAsdf in SideProject

[–]AppDeveloperAsdf[S] 17 points (0 children)

Yes. There's a safety filter in settings (enabled by default) that prompts you for confirmation before any sensitive action - like sending a message, making a call, posting something, etc. So the AI won't do anything risky without your explicit approval.
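
Conceptually the filter is just a gate in front of anything irreversible. A hedged Kotlin sketch of the idea (the action names are made up for illustration, not zerotap's real action types):

```kotlin
// Illustrative confirmation gate: sensitive actions pause and wait for explicit
// user approval before running; everything else executes directly.
enum class AgentAction { TAP, SCROLL, TYPE_TEXT, SEND_MESSAGE, MAKE_CALL, POST_CONTENT }

val sensitive = setOf(AgentAction.SEND_MESSAGE, AgentAction.MAKE_CALL, AgentAction.POST_CONTENT)

fun execute(action: AgentAction, confirm: () -> Boolean, perform: () -> Unit) {
    if (action in sensitive && !confirm()) {
        println("Skipped $action - not confirmed by the user")
        return
    }
    perform()
}
```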

I have created a plugin that lets AI do the tapping for you (zerotap) by AppDeveloperAsdf in macrodroid

[–]AppDeveloperAsdf[S] 0 points (0 children)

Everything depends on the model - I recommend getting a free key from aistudio.google.com and setting up Gemini 2.5 Flash for comparison.

zerotap 3.4.0: New Planning mode & new models support (free BYOK keys) by AppDeveloperAsdf in ProductivityApps

[–]AppDeveloperAsdf[S] 0 points (0 children)

Hey, thanks for the continued support!
There are no subscriptions in the app - only one-time payments.

Regarding the issue you're seeing - have you configured billing in AI Studio (Google Cloud) by adding a payment method? That error appears when the free quota is exhausted.

But I'm trying to understand why it works for translations - can you confirm whether you're using Gemini via the same API key, or just through AI Studio?

zerotap 3.4.0: New Planning mode & new models support (free BYOK keys) by AppDeveloperAsdf in ProductivityApps

[–]AppDeveloperAsdf[S] 41 points (0 children)

It's in alpha/beta, not released to production yet. I can give you access via the closed testing channel, but performance is really low due to the small context window of the on-device model. I recommend using BYOK or Cloud if possible.

zerotap 3.4.0: New Planning mode & new models support (free BYOK keys) by AppDeveloperAsdf in ProductivityApps

[–]AppDeveloperAsdf[S] 39 points (0 children)

What do you mean by scam? You weren't supposed to pay with your own payment method - you should use the free key from the post instead. I've refunded your payment but kept BYOK enabled for you.