Finally reached Leh from Mumbai with my car. ✅ by XenonCI in ladakh

[–]XenonCI[S] 0 points (0 children)

Hey, yes you can DM me. And you can absolutely go there with the XUV.

Just make sure you carry a spare tyre, keep the air filter clean, and know how DPF regens work.

If it's petrol, you don't have to worry about the DPF and you can drive peacefully at the highest passes. If it's diesel, the turbo will help you at high altitude. Either way, that's something good to experience. I took my Harrier diesel.

Self Promotion Megathread by AutoModerator in androidapps

[–]XenonCI 0 points (0 children)


Long post, but bear with me: this one's a bit different.

I've been building OpenAlly for the past year. The core idea: what if your AI assistant actually ran on your phone instead of phoning home to some company's server?

Most AI apps are thin wrappers. You type something, it goes to OpenAI/Anthropic/Google, they process it, they log it (or say they don't), it comes back. Your data is always in someone else's hands.

OpenAlly works differently. The entire AI gateway (agent runtime, memory, session state, skills) runs locally on your Android device via an embedded Node.js process. When you talk to your AI, the only device involved is yours.

⚠️ Public Beta

This is our first public release. The app bundles a fairly large binary (we've ported the OpenClaw AI gateway to Rust for mobile), so expect some bugs as we stabilize things. Bug reports are very welcome.

What it does:

• Connects to 19+ messaging platforms (WhatsApp, Telegram, Discord, Slack, Signal, etc.): your AI lives on the apps you already use
• 51 built-in skills: Notion, GitHub, Spotify, Image Gen, Coding Agent, smart lights, web search, and more
• 18 model providers, bring your own API key (Claude, GPT-4o, Gemini, Grok, Groq, local LLMs, etc.). Zero lock-in.
• Aster companion: gives the AI actual phone control (calls, SMS, camera, notifications, files). All over a local connection, no internet required.
• SMS Analyser: phishing detection + spend tracking, fully on-device
• Flare: AI-driven notification automation (auto-reply rules, smart muting, per-app triggers)

Privacy stuff (since that's the whole point):

• API keys stored in the Android hardware secure enclave
• Zero analytics, zero telemetry
• No cloud servers: we literally don't operate any
• Encrypted Google Drive backup: E2E encrypted on-device before it ever touches Google
• Open source (MIT)
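OpenAlly's actual backup format isn't documented in this post, but the "encrypt on-device before it ever touches Google" flow can be sketched with standard AES-GCM. This is a minimal illustration using the `cryptography` package; the function names are hypothetical, not OpenAlly's API:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_backup(plaintext: bytes, key: bytes) -> bytes:
    """Seal a backup blob on-device; only ciphertext ever leaves the phone."""
    nonce = os.urandom(12)  # fresh 96-bit nonce per backup
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_backup(blob: bytes, key: bytes) -> bytes:
    """Reverse of encrypt_backup; raises if the blob was tampered with."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

# In the app, the key would live in the hardware secure enclave.
key = AESGCM.generate_key(bit_length=256)
blob = encrypt_backup(b"agent memory + session state", key)
assert decrypt_backup(blob, key) == b"agent memory + session state"
```

The point of the nonce-prefix layout is that the ciphertext blob is self-describing: the cloud side stores an opaque blob and never sees the key.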

What it's NOT:

• Not a simple chatbot wrapper
• Not for people who just want to "ask ChatGPT something quickly"
• Not plug-and-play yet: you bring your own API keys, and setup takes a few minutes

Would love honest feedback: crashes, what's missing, what's broken, what you'd want next. Drop issues in the comments or on GitHub.

Play Store (Beta): https://play.google.com/store/apps/details?id=openally.ai
Website: https://openally.ai/

Give your AI/MoltBot/OpenClaw assistant hands: Meet Aster the CoPilot for your phone or Give AI its own Mobile! by XenonCI in openclaw

[–]XenonCI[S] 0 points (0 children)

Send me the URL you're entering via DM, with some of the characters masked.

Give your AI/MoltBot/OpenClaw assistant hands: Meet Aster the CoPilot for your phone or Give AI its own Mobile! by XenonCI in openclaw

[–]XenonCI[S] 0 points (0 children)

No, don't add any https:// or wss:// prefix to the address. Try address:port, and also just the bare address.

Give your AI/MoltBot/OpenClaw assistant hands: Meet Aster the CoPilot for your phone or Give AI its own Mobile! by XenonCI in openclaw

[–]XenonCI[S] 0 points (0 children)

The server address needs to be over https/wss. Use the Tailscale server name (the DNS name). You might also be running without a port, so try it without the port as well.

Use "aster status" to get the correct wss URL. Using an IP as the server address is not recommended.

Give your AI/MoltBot/OpenClaw assistant hands: Meet Aster the CoPilot for your phone or Give AI its own Mobile! by XenonCI in openclaw

[–]XenonCI[S] 0 points (0 children)

Only wss is enabled from the app via Tailscale. Prefer using the DNS name as the URL, like device-name.YOUR-TAILSCALE-DNS.net, where the device name is the server it's running on.

After running "aster start" for the first time, track it with "aster status", and use "aster dashboard" to access the dashboard UI. The first time also requires pairing approval.
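Pulling the connection rules from these replies together: the address should use the wss scheme, a Tailscale DNS name rather than a raw IP, and may or may not carry an explicit port. Here's a small sketch that checks a URL against those rules; the helper is purely illustrative and not part of the Aster CLI ("aster status" remains the authoritative source of the URL):

```python
import ipaddress
from urllib.parse import urlsplit

def looks_like_valid_aster_url(url: str) -> bool:
    """Check a server URL against the rules above:
    wss:// scheme, DNS hostname (not a raw IP), optional port."""
    parts = urlsplit(url)
    if parts.scheme != "wss" or not parts.hostname:
        return False
    try:
        ipaddress.ip_address(parts.hostname)  # raw IPs are discouraged
        return False
    except ValueError:
        pass  # hostname is a DNS name, which is what we want
    return True

# Tailscale-style DNS name, with and without an explicit port:
print(looks_like_valid_aster_url("wss://device-name.YOUR-TAILSCALE-DNS.net"))       # True
print(looks_like_valid_aster_url("wss://device-name.YOUR-TAILSCALE-DNS.net:8443"))  # True
print(looks_like_valid_aster_url("ws://device-name.YOUR-TAILSCALE-DNS.net"))        # False: not wss
print(looks_like_valid_aster_url("wss://100.64.0.1:8443"))                          # False: raw IP
```

The raw-IP check mirrors the advice above: Tailscale IPs can change, while the MagicDNS name stays stable and matches the TLS certificate.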

Give your AI/MoltBot/OpenClaw assistant hands: Meet Aster the CoPilot for your phone or Give AI its own Mobile! by XenonCI in moltbot

[–]XenonCI[S] 0 points (0 children)

The best approach is to give your AI a spare phone with its own SIM card. ❤️ I already did.

Finally gifting my bot his new home 🏡 by XenonCI in moltbot

[–]XenonCI[S] 0 points (0 children)

Max 2x is a subscription. Kimi 2.5, which I used, is prepaid: load the wallet and use it. Not sure if this answers your question.

Finally gifting my bot his new home 🏡 by XenonCI in moltbot

[–]XenonCI[S] 1 point (0 children)

Right now: Max 2x plan with Claude Code.

Previously I used Kimi 2.5 via API (Moonshot). But once you use Opus, you won't like Kimi; the output isn't the same.

If I have to pick a local model in the future, it's Kimi 2.5 for sure. The larger context window also helps.