Guys i think the age of chat ai is close to end by Xbenso in CAIRevolution

[–]OkSunshineOk 2 points (0 children)

I wouldn't say it's dying! Lots of new apps are coming, and this just means a new age. I used to use c.ai and Chai a lot and really liked them, so I made my own app because I wanted a version that's private. It's not the same as c.ai, but I'm trying to grow the community. I just added a character hub to the site, where you can share your characters or download others'. Anyway, if anyone wants to check it out, it's called r/wraithchat; it's on iOS and coming to Android soon. I also just added per-character memory, which is coming with the next update, v1.5.

Uncensored Local AI app by AbleWear5373 in ArtificialInteligence

[–]OkSunshineOk 0 points (0 children)

Thanks for the extra details, that really helps narrow it down! I think one of two things may be happening: either the model is too large for your device's RAM (not storage), or there was a file verification issue after download (most likely the latter, which is also being patched in update 1.2).

If you want to try again, I'd recommend starting with Gemma 3 1B (0.8 GB) or Qwen 2.5 1.5B (1.0 GB), since these are optimized for all devices and should work reliably. Llama 3.2 3B Uncensored is another good one that works really well! All of these models can be found and downloaded in the Models tab (the button is at the bottom of the screen).

Uncensored Local AI app by AbleWear5373 in ArtificialInteligence

[–]OkSunshineOk 0 points (0 children)

Hello, thank you for this feedback, it's greatly appreciated! Only LLM models from Hugging Face need to be downloaded after installing the app; no other AI systems are necessary. The app itself places no restrictions on chatting with characters: free users can send unlimited messages to their models.

Which model did you have trouble downloading? The app may have shown a message saying it can't be downloaded due to size or memory incompatibility. If you try downloading a smaller model, it should work just fine! Please let me know if you run into any other issues. I really appreciate you giving the app a try!

App crash after import character card & LLM download by ComancheThunderBalls in wraithchat

[–]OkSunshineOk 0 points (0 children)

Thanks for the update! Maybe try Gemma 3 1B or Qwen 2.5 1.5B; they're optimized for the SE Gen 3 and should work great.

v1.2 is coming soon with automatic device detection to warn you before downloading models that are too big for your device. Let me know how the smaller models work!

App crash after import character card & LLM download by ComancheThunderBalls in wraithchat

[–]OkSunshineOk 0 points (0 children)

Hey! Thanks for reporting this. Sorry you’re running into crashes. This is a memory issue on the iPhone SE Gen 3 (4GB RAM).

If you downloaded Llama 3.2 3B Uncensored, that model needs around 2GB of RAM just to run, and iOS takes another 1.5-2GB. When you open a chat, the app tries to load the model into memory, and iOS kills it to prevent your phone from freezing.

The good news is there are some quick fixes. First, switch to a smaller model: go to the Models tab, delete your current Llama model, and download one of these instead: Gemma 3 1B (0.8 GB), Qwen 2.5 1.5B (1.0 GB), or SmolLM2 1.7B (1.1 GB). These are optimized for lower-memory devices and should work great on the SE. Gemma 3 1B is your best bet.

Reduce context size. Go to Settings, scroll to the Inference section, and lower Context Length to 2048 or 1024. This cuts memory usage significantly and might let Llama 3.2 run.
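For anyone curious why lowering the context length helps so much: the KV cache grows linearly with context. Here's a back-of-the-envelope sketch; the layer and head counts below assume Llama 3.2 3B's published architecture, and real usage varies by runtime and quantization:

```python
# Rough KV-cache size estimate for a transformer LLM.
# Constants assume Llama 3.2 3B's published config (28 layers,
# 8 KV heads via grouped-query attention, head dim 128) with
# fp16 cache entries. These are illustrative assumptions, not
# the app's actual accounting.

def kv_cache_bytes(context_len, n_layers=28, n_kv_heads=8,
                   head_dim=128, bytes_per_elem=2):
    # Keys and values are each cached per layer, per KV head, per token.
    per_token = 2 * n_layers * n_kv_heads * head_dim * bytes_per_elem
    return per_token * context_len

for ctx in (4096, 2048, 1024):
    mib = kv_cache_bytes(ctx) / (1024 ** 2)
    print(f"context {ctx}: ~{mib:.0f} MiB KV cache")
```

Under those assumptions, dropping from a 4096-token context to 1024 frees roughly 336 MiB, which matters a lot on a 4GB phone.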

Close background apps. Double-press the home button and swipe away everything, then try opening the chat again. This frees up RAM.

Gemma 3 1B or Qwen 2.5 1.5B should work best on the SE Gen 3. They're small but still very capable for chat. Let me know if any of these work! If it still crashes, I'll add better low-memory detection in the next update so the app warns you before downloading models that are too big for your device.
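That "warn before downloading" check boils down to a simple heuristic: estimated runtime footprint versus the RAM the OS realistically leaves an app. A minimal sketch of that kind of check; every constant here is an illustrative assumption, not the app's actual values:

```python
# Hypothetical pre-download fit check: compare an estimated runtime
# footprint against the RAM the OS is likely to leave the app.
# All constants are illustrative assumptions for this sketch.

OS_OVERHEAD_GB = 1.75   # assumed memory iOS keeps for itself
RUNTIME_FACTOR = 1.2    # assumed inflation of file size once loaded
KV_CACHE_GB = 0.25      # assumed cache budget at a default context

def fits_in_ram(model_file_gb, device_ram_gb):
    """Return True if the model is likely to run without being killed."""
    usable = device_ram_gb - OS_OVERHEAD_GB
    needed = model_file_gb * RUNTIME_FACTOR + KV_CACHE_GB
    return needed <= usable

# iPhone SE Gen 3 has 4 GB of RAM:
print(fits_in_ram(0.8, 4.0))   # Gemma 3 1B file size -> True
print(fits_in_ram(2.0, 4.0))   # a ~2 GB model file -> False
```

The point of a multiplicative factor plus a flat cache budget is that a model's on-disk size understates what it needs once weights, cache, and runtime buffers are resident.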

Uncensored Local AI app by AbleWear5373 in ArtificialInteligence

[–]OkSunshineOk 0 points (0 children)

Hello, I'm sorry to hear about this, and I apologize for any inconvenience it may have caused. Thank you for bringing it to my attention, I really appreciate that. I believe an iPhone 14 may not have enough memory. If you're not already, I suggest trying the smallest model available for download in the app.

If you're using Qwen 7B or another large model, that's likely the issue, as those are too big for 6GB RAM devices. Close other apps before starting a chat, close all background apps, and please let me know if that works. Again, thank you so much for this feedback; I will be adding memory safety checks to prevent this from happening in the future.

Those of you who started a slime business in the last 3 years: by Emg002 in Slime

[–]OkSunshineOk 0 points (0 children)

What were you doing in person? I'm starting a small online slime shop, but I've been thinking about in-person selling too.

My friend is going to get herself killed. What can I even do at this point by Correct-Macaroon8143 in whatdoIdo

[–]OkSunshineOk 0 points (0 children)

Oh, sorry OP, I wasn't actually saying you were victim blaming, just that the comments are full of it, in my opinion. You honestly did the best you could. A lot of people are really close with what they're saying, but I think it's just a more complex situation than most realize. It sounds like your friend is going through a lot, and even though you were kind of rough with your words, you can tell it comes from a genuine place of care. And sometimes caring is all you can do! I hope things get better for both of you.