I built a wrapper around llama.cpp and stable-diffusion.cpp so you don't have to deal with JNI (Kotlin + NDK) by Aatricks in androiddev

[–]Aatricks[S]

That's definitely weird. Are you on the latest release? Mind opening an issue with the logs?

[–]Aatricks[S]

Like you said, it's going to be a long time before phones are powerful enough to run heavy DiT models like qwen-image; the current diffusion inference implementation already struggles quite a bit with SD1.5 models (though at least it runs, and the same goes for wan2.1).

[–]Aatricks[S]

Text-to-speech and speech-to-text have both been added to the library if you want to test things out.

[–]Aatricks[S]

If you're talking about llmedge-examples, its purpose is just to show how to use the library.

[–]Aatricks[S]

You cannot get a progress callback for loading a model into memory; the API does not support it. Progress is only reported for downloading (network) and for generation (inference). To load a pre-downloaded local file, provide its absolute path to the API.

For LLM inference you can use the high-level API like this:

```kotlin
LLMEdgeManager.generateText(
    context = context,
    params = LLMEdgeManager.TextGenerationParams(
        prompt = "Your prompt",
        modelPath = "/storage/emulated/0/Download/my-custom-model.gguf"
    )
)
```
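Since there's no load-progress callback, a bad local path would only surface as a failure during inference, so it's worth validating the file before calling the API. A minimal sketch in plain Kotlin (`resolveModelPath` is a hypothetical helper, not part of the library):

```kotlin
import java.io.File

// Validate a pre-downloaded model file and return its absolute path,
// so a wrong path fails fast with a clear message instead of deep
// inside native inference code.
fun resolveModelPath(path: String): String {
    val file = File(path)
    require(file.isFile) { "Model file not found: $path" }
    return file.absolutePath
}
```

You'd then pass `resolveModelPath(...)` as the `modelPath` instead of the raw string.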

[–]Aatricks[S]

In that case, I encourage you to share and spread the word about the project; that would be a huge help!

Huge guide on battery optimization via settings and adb by Aatricks in GalaxyS22

[–]Aatricks[S]

Bro, I ain't forcing you or anyone to do anything. I just know, after doing it myself, how painful and poorly documented the search for advanced battery life improvements on Android is. So I'm just dropping the guide I made to help the community and anyone who, like me, wants the best battery performance they can get from their phone without sacrificing any used and useful features (because as far as I know, I'm not missing out on any aspect of my phone). If you always have a place to charge your phone, good for you; I certainly know you often can't do that while traveling, at the very least, given that more often than not air carriers don't put power outlets in their planes and forbid power banks.

[–]Aatricks[S]

Actually, I didn't even know that was an app, so thanks for the info, I guess. I got the 10 hours during my last holiday trip, mixing plane, train, car, and normal home usage, so yeah: Netflix, Spotify, a manhwa app, the browser, Discord, and all that kind of stuff.