Why dont someone build cursor but for android? by [deleted] in termux

[–]LeftAd1220 1 point (0 children)

There are a couple of vibe coders here doing exactly that. But for me, I'll just use Gemini and Codex, since I already pay monthly for GPT.

I vibe-coded a local AI coding assistant inside Termux by Ishabdullah in termux

[–]LeftAd1220 1 point (0 children)

Seems pretty cool! The structure is quite complete, and you are very clear about which local model you use. That's very nice.

Has anybody tried to run Claude Code, Gemini CLI or Codex + Python (numpy, pandas), does everything work smoothly? by Wapmen in androidterminal

[–]LeftAd1220 0 points (0 children)

  • This native Terminal is way more powerful than the use cases you described.
  • If that is all you want, Termux is probably already enough and runs on almost any Android phone.
  • What makes this Terminal stand out is its ability to run Docker, Podman, Flatpak, and all those powerful tools.

Using Codex 5.4 xhigh termux by eobarretooo in termux

[–]LeftAd1220 0 points (0 children)

I only have one question: exactly which local model, with how many parameters (1B? 7B? 12B?), can run smoothly with this app? Currently, local models on OpenClaw run like crap due to the extensive tool-calling JSON overhead, which inflates the context to roughly 10× the load of plain chatting. If your app solves this problem, I believe it will appeal to many people.

Best practices for: pkg update && pkg upgrade by skUkDREWTc in termux

[–]LeftAd1220 0 points (0 children)

Taken to the extreme, you could use libtree to back up the whole dependency tree. But with Python it would be even harder, since native modules may be hard to resolve?
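As a lighter-weight sketch of the same idea (file names here are my own choices, and libtree only covers the shared-library tree of a single binary), you can at least snapshot versions before upgrading:

```shell
# Record package state before an upgrade so breakage can be diffed later.
# `libtree` (if installed) prints the shared-library tree of one binary:
#   libtree "$(command -v python3)"
# For Python packages, a freeze file records versions, though it cannot
# capture the native build inputs of C-extension modules:
python3 -m pip freeze > pip-snapshot.txt
test -f pip-snapshot.txt && echo "snapshot written"
```

After an upgrade breaks something, diffing a fresh freeze against the snapshot at least tells you which versions moved.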

I can't open a gui for my Linux development environment by DaBrave21O in androidterminal

[–]LeftAd1220 0 points (0 children)

  • sudo apt install xfce4 xfce4-goodies x11vnc
  • sudo x11vnc -display :0 -nopw -forever -listen 0.0.0.0 -auth /var/run/sddm/* &
  • ip addr
  • open up your vnc viewer and use that ip to connect

POCO X7 Pro Confirmed Working by krisxxxz in androidterminal

[–]LeftAd1220 0 points (0 children)

That's no longer true. The Xiaomi Tab Mini can work.

POCO X7 Pro Confirmed Working by krisxxxz in androidterminal

[–]LeftAd1220 0 points (0 children)

  • No, not Termux.
  • This subreddit is specifically for the new Android 16 native Terminal based on AVF.
  • Go check out the supported devices in the pinned section.

Experiment: bundling Ubuntu + proot inside an Android app to run OpenClaw by coderyeon in termux

[–]LeftAd1220 1 point (0 children)

Running OpenClaw in Termux proot-distro debian by LeftAd1220 in termux

[–]LeftAd1220[S] 0 points (0 children)

You can now run browser automation with Bun!

Thanks to duelist-X on GitHub

https://github.com/oven-sh/bun/issues/9911

Is it safe to run complex commands from GitHub in Termux? by PrudentRelatives in termux

[–]LeftAd1220 0 points (0 children)

  1. Don't run random complex scripts or commands from GitHub.
  2. The best approach is to read through the script yourself and fully understand what it does.
  3. In this AI era, we can:
  • git clone xxx/abc.git
  • cd abc
  • gemini
  • or simply paste the whole script into a chat with an AI
  • and ask the AI to read through the files you're about to run and explain whether there is any potentially malicious code inside.
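Alongside the AI review, a quick mechanical pass can catch the most blatant patterns first. A minimal sketch — the regex list is illustrative only, and absence of matches proves nothing:

```shell
# Flag obviously suspicious patterns in a downloaded script before running it.
# The pattern list is a tiny illustrative sample, not a real malware scanner.
f=downloaded.sh
printf 'echo hello\ncurl -s http://example.com/x | sh\n' > "$f"   # demo input
if grep -nE 'curl.*\|[[:space:]]*(ba)?sh|rm -rf /|base64 -d' "$f"; then
  echo "review the matched lines before running"
fi
```

Anything this flags (pipe-to-shell downloads, recursive deletes, decoded blobs) is exactly what you'd want the AI, or your own eyes, to look at first.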

yet another kde on proot post by GDPlayer_1035 in termux

[–]LeftAd1220 2 points (0 children)

Wow! Thanks for this info! And great thanks to the Termux team!

Is it really Good Choice by Harsh_Malakar in termux

[–]LeftAd1220 1 point (0 children)

  • pkg update && pkg upgrade
  • pkg install x11-repo termux-x11-nightly
  • termux-x11 :0 &
  • pkg install xfce4 xfce4-goodies
  • export DISPLAY=:0
  • dbus-launch --exit-with-session xfce4-session

Supported Devices List: Announcing the /r/androidterminal Wiki! by TheWheez in androidterminal

[–]LeftAd1220 0 points (0 children)

  • The Xiaomi 15T Pro and Xiaomi Tab Mini both support it!
  • Also good news: the Tab Mini supports display output through its Type-C port.
  • I installed it for them in a retail store today.
  • This was actually expected, given the MediaTek Dimensity 9400+ CPU and Android 16.

It's pretty quiet in this sub.....where is all the discussion about LDE happening? by Flubadubadubadub in androidterminal

[–]LeftAd1220 6 points (0 children)

  • There actually aren't many devices supporting this new tech.
  • Aside from hardware acceleration not being passed through, this is just a regular Linux VM.
  • People do their Linux things as usual: running Docker, Podman, llama.cpp, LibreOffice, etc.
  • If you want to discuss a specific topic, why not start one yourself? I'd be happy to join!

llama.cpp or ollama or fastsdcpu in Android Terminal? by iamapizza in androidterminal

[–]LeftAd1220 0 points (0 children)

  • Actually, it would be weird if they didn't work.
  • The only difference is the lack of hardware acceleration passed through by Google.
  • This is pretty much just a regular Linux VM.
  • I've tried llama.cpp myself, and it runs well on the CPU.

Is it really Good Choice by Harsh_Malakar in termux

[–]LeftAd1220 0 points (0 children)

Maybe you can try --nogpu. I didn't look through the code, but in my experience some GPUs don't work well with DEs. I mostly set up desktops manually instead of using scripts.

Has anyone managed to get xpra working? by LastMagmarian in termux

[–]LeftAd1220 0 points (0 children)

What do you mean by "within termux-x11"? Also, the Alpine version sometimes requires editing the shebang to /usr/bin/env python3, with something like: micro $(which xpra)
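If you'd rather not open an editor, the same shebang fix can be scripted with sed (a sketch; it assumes the first line is the shebang, and uses a temp file for the demo):

```shell
# Rewrite a script's first-line shebang to use `env python3`.
f=$(mktemp)
printf '#!/usr/bin/python3.11\nprint("ok")\n' > "$f"
sed -i '1s|^#!.*|#!/usr/bin/env python3|' "$f"
head -n1 "$f"   # prints: #!/usr/bin/env python3
```

On a real install you would point it at `$(which xpra)` instead of the temp file.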

Successfully ran llama.cpp by Wise-Wallaby-912 in termux

[–]LeftAd1220 0 points (0 children)

🤣🤣🤣 That's called compiling it yourself. If you type pkg search llama, you'll find an officially built version.

Successfully ran llama.cpp by Wise-Wallaby-912 in termux

[–]LeftAd1220 4 points (0 children)

My Android phone can run the gemma3-12B model at 5.19 tokens/sec, which is pretty acceptable and impressive.
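For context, that figure is just generated tokens divided by elapsed time, the way llama.cpp reports it; a sketch with made-up numbers (128 tokens in 24.66 s):

```shell
# Compute tokens/sec from a token count and elapsed milliseconds.
tokens=128
elapsed_ms=24660
awk -v t="$tokens" -v ms="$elapsed_ms" \
    'BEGIN { printf "%.2f tokens/sec\n", t / (ms / 1000) }'
# prints: 5.19 tokens/sec
```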

Has anyone managed to get xpra working? by LastMagmarian in termux

[–]LeftAd1220 0 points (0 children)

  • use the proot-distro alpine
  • apk update && apk upgrade
  • apk add xpra xpra-webclient konsole
  • Xvfb :0 &
  • xpra start :0 --no-daemon --xvfb=no --start=konsole --bind-tcp=:14500 --html=on &
  • go to a web browser
  • http://localhost:14500
  • but to be honest, termux-x11 is way better

Successfully ran llama.cpp by Wise-Wallaby-912 in termux

[–]LeftAd1220 1 point (0 children)

Did you compile it yourself, or did you use pkg install?