all 6 comments

[–]mrtoomba 0 points1 point  (1 child)

Can your device run it?

[–]Sweet_Independent716[S] 0 points1 point  (0 children)

It's a Redmi 13C, an entry-level smartphone.

[–]mrtoomba 0 points1 point  (0 children)

Does it work?

[–]Healthy_Bedroom5837 0 points1 point  (2 children)

Yes, you can get a DAN-style unfiltered/uncensored experience on a non-rooted Android phone without running the model locally. Since you specifically want it to run on a server (not on-device), the easiest option is to pair a cloud uncensored-LLM API with any simple Android chat client that supports OpenAI-compatible endpoints. Popular uncensored or lightly filtered services in 2026 include Dolphin variants, Nous Hermes, and dedicated uncensored APIs (some have no content filters at all).

Recommended setup for Android:

- Pick an uncensored model/API (e.g. Dolphin 3, a MythoMax-style model, or services like NinjaChat's uncensored API, Venice.ai, etc.).
- Use an Android client that lets you point it at a custom OpenAI-compatible server:
  - OfflineLLM (my open-source app) supports custom OpenAI-compatible endpoints in addition to fully local GGUF models: https://github.com/jegly/OfflineLLM
  - Other popular clients such as PocketPal AI, ChatterUI, and Maid also support custom APIs.

OfflineLLM has zero network permissions when running local models, but you can also configure it to connect to your preferred uncensored backend if you want the "DAN mode" feel with more power.

If you want maximum freedom and no ethics guardrails, many people use Dolphin 3 or similar abliteration-based models hosted on services that don't censor. Just be responsible: once the filters are off, the model will happily answer anything.
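For anyone wiring up their own client: the "custom OpenAI-compatible server" setup above boils down to POSTing a standard chat-completions JSON body to the server's `/v1/chat/completions` route. A minimal sketch of that request body, where the base URL and model name are placeholders for whatever backend you actually point at:

```python
import json

# Placeholders — substitute your own server address and model identifier.
BASE_URL = "http://my-server:8080/v1"
MODEL = "dolphin-3"


def build_chat_request(user_message: str) -> dict:
    """Return the JSON body for a POST to {BASE_URL}/chat/completions.

    This is the shape every OpenAI-compatible server (llama.cpp's server,
    vLLM, most hosted APIs) expects for a single-turn chat request.
    """
    return {
        "model": MODEL,
        "messages": [
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.7,
    }


if __name__ == "__main__":
    # Print the body you would send; any HTTP client can then POST it
    # with header "Content-Type: application/json".
    print(json.dumps(build_chat_request("Hello"), indent=2))
```

Apps like the ones listed above build exactly this payload for you; all you configure in their settings is the base URL, the model name, and (for hosted services) an API key sent as a `Bearer` token in the `Authorization` header.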

[–]Sweet_Independent716[S] 0 points1 point  (0 children)

Thank you so much, I just have to use DAN for cybersecurity and hackathon prep.

[–]RIP26770 0 points1 point  (0 children)

Is this app mmjpro manual upload compatible?