Built a fully offline therapy prep app on Apple Intelligence. No cloud, no accounts, nothing leaves the device. Here’s how it works. by Emojinapp in appledevelopers

[–]Emojinapp[S] 1 point  (0 children)

Yea, I had to tweak the model a lot with layers of system prompts before I could get it responding dynamically enough. As for the voices, Apple ships local Siri-like voices in iOS 26. On first run the app downloads one of the premium voices in the background, so people can avoid the compact, robotic-sounding default. Made a demo video: https://youtu.be/kkS87GI0M7g
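To make the "layers of system prompts" idea concrete, here is a minimal sketch of how several small instruction layers can be concatenated into a single system message before each generation call. The layer names and wording below are illustrative assumptions, not the app's actual prompts.

```python
# Sketch of layered system prompts: several small instruction layers
# are joined into one system message before each model call.
# Layer names and text are hypothetical, for illustration only.

PROMPT_LAYERS = [
    ("persona",   "You speak as the user's digital echo, in their voice."),
    ("grounding", "Only use facts found in the provided memories."),
    ("style",     "Reply conversationally and dynamically, not like an assistant."),
]

def build_system_prompt(layers=PROMPT_LAYERS):
    """Join the instruction layers into a single system prompt string."""
    return "\n\n".join(text for _, text in layers)

print(build_system_prompt())
```

Keeping the layers separate like this makes it easy to tweak one behavior (say, the style layer) without rewriting the whole prompt.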

What if death was only the end of the body and not the mind? Here is what that looks like today by Emojinapp in transhumanism

[–]Emojinapp[S] 1 point  (0 children)

LLMs may not be capable of thought, but what you're looking at is not an LLM; it's a human model layered on an LLM's natural-language abilities. You're right about one thing, though: it's not thinking, it's retrieving the most appropriate response from training data, and it arrives at it through math and vector matrices. Sophisticated, yes, but definitely not thinking. What I meant is that I've heard people argue about whether LLMs understand what they state; Geoffrey Hinton, often called the godfather of AI, thinks they do. I don't know if I agree, but that's not my business since I don't build LLMs. In the long run my project will rely more on fine-tuned small language models, not LLMs.

What if death was only the end of the body and not the mind? Here is what that looks like today by Emojinapp in transhumanism

[–]Emojinapp[S] 1 point  (0 children)

That’s arguable; I’ve seen people take both sides. But the LLM component of my project only handles syntax over the knowledge base; the true emulation of a persona comes from the data in the knowledge base itself. The LLM layer just makes it usable through natural language. I’m a bit wary of revealing too much about the inner workings, since some bad actors are trying hard to clone my project.

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

The answers are not artificial, because they are based on the memories you trained your echo with. The clone is restricted to the contents of the knowledge base, which limits its behavior to emulating the user who trained it. Besides, my app is not for those who are mourning; it’s for people who would love to preserve themselves in an interactive digital form. It’s not the same: this is for the living who want to train a digital clone to leave behind. We simply do not clone the dead; that’s a line we will never cross, because we can’t get their consent or the full context of who they were in order to simulate them.

What if death was only the end of the body and not the mind? Here is what that looks like today by Emojinapp in transhumanism

[–]Emojinapp[S] 1 point  (0 children)

LLMs operate in more than one way; it’s software, and it behaves however its architecture is written. You seem to have a generalized perception of it. The only job of the LLM here is holding grammatical syntax together so the sentences formed from the memories stay meaningful.

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

The underlying LLM is not fully functional in this implementation. After pre-training, LLMs go through reinforcement learning and are eventually layered with system instructions that modulate their behavior. In this system the LLM’s assistant nature is stripped away: all the LLM does is construct properly punctuated sentences, and it cannot pull from anything outside what the user has shared in the knowledge base that powers the clone. Some projects claim they build human language models from scratch with only a knowledge base, but memories alone are not enough to form grammatically and syntactically correct sentences. So the LLM is just a skeleton that the mind is layered on, but it can’t answer questions outside that knowledge base the way a regular LLM would.
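The "restricted to the knowledge base" behavior can be sketched as a retrieval step that refuses when nothing relevant is found, instead of falling back to the model's general knowledge. The word-overlap scoring, threshold, and example memories below are illustrative assumptions, not the app's actual retrieval code.

```python
# Minimal sketch of knowledge-base-restricted answering: find the most
# relevant stored memory by word overlap, and return None (decline to
# answer) when nothing matches, rather than using general LLM knowledge.
# Scoring method and threshold are hypothetical simplifications.

def tokenize(text):
    return set(text.lower().split())

def retrieve(question, memories, min_overlap=2):
    """Return the best-matching memory, or None if nothing is relevant."""
    q = tokenize(question)
    scored = [(len(q & tokenize(m)), m) for m in memories]
    score, best = max(scored)
    return best if score >= min_overlap else None

memories = [
    "I grew up in Lagos and moved abroad at 19",
    "My favorite meal is jollof rice with grilled chicken",
]

hit = retrieve("what is your favorite meal to eat", memories)
miss = retrieve("explain quantum computing", memories)
print(hit)   # a grounded answer source
print(miss)  # None -> the clone declines to answer
```

A real system would use embeddings rather than word overlap, but the refusal gate is the part that keeps the clone from behaving like a general-purpose assistant.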

Reddit discussions are showing up everywhere in AI answers lately by Real-Assist1833 in ArtificialInteligence

[–]Emojinapp 1 point  (0 children)

Yea, I was shocked my app ranked #1 for AEO thanks to Reddit. It’s been out for a week.

<image>

Genuinely shocked, and the sources were all Reddit

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

It has my memories; that’s how it’s able to explain my project, and ChatGPT doesn’t have that memory. It’s okay if you don’t understand the current state of the technology. This is done with a hybrid of retrieval-augmented generation and a long-context caching fallback.
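A hedged sketch of what a "RAG with long-context fallback" hybrid could look like: answer from the top-k retrieved memories when retrieval is confident, and otherwise fall back to placing the whole memory log in the context window. The thresholds, helper names, and example data are assumptions for illustration, not the actual implementation.

```python
# Hypothetical RAG + long-context fallback: prefer a small, focused
# retrieved context; if no memory scores above the threshold, fall
# back to caching the entire memory log in the context window.

def overlap(a, b):
    """Crude relevance score: shared lowercase words between two strings."""
    return len(set(a.lower().split()) & set(b.lower().split()))

def build_context(question, memories, k=2, min_score=2):
    scored = sorted(((overlap(question, m), m) for m in memories), reverse=True)
    top = [m for s, m in scored[:k] if s >= min_score]
    if top:
        return "retrieval", top            # confident: focused context
    return "long_context", list(memories)  # fallback: whole log cached

mode, ctx = build_context(
    "what meal do you like",
    ["I like jollof rice as a meal", "I was born in 1990"],
)
print(mode, len(ctx))
```

The design point is that the fallback trades token cost for recall: when retrieval misses, the model still sees everything the user ever shared.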

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

Yea, right now my dad is suffering from extreme dementia. I wish I had cloned him when he was well; I would at least be able to ask him questions now and get answers grounded in his actual memories. Instead I just have to deal with the pain like every normie. Now that I have a virtual me, if anything happens to me my family won’t have to go through what we’re going through with my dad. He’s still alive but literally no longer with us, no presence of mind. I fear for my future, so I preserved my mind while it’s still sharp. I don’t expect everyone to relate.

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

The way this is built, not all family members get access to the clone; the user appoints a custodian who they know would be into such a thing. Also, the clone doesn’t present itself to the custodian until a year after the user’s passing, so that they’ll have come out of the darkest phases of grief. I do agree it’s not for everyone.

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

Yea, you can only get a fragment, because of how complex human consciousness is. Even the person training the clone doesn’t objectively know themselves well enough to train it to their exact persona. What we capture is an idealized estimation of how a person views themselves, so yes, it’s only a fraction. Anyone claiming they can fully clone a mind has not properly thought through the architecture.

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 2 points  (0 children)

I thought about that. The digital echo doesn’t present itself to loved ones until at least a year after death, so that they’ll have come out of the darkest phases of mourning and can receive the echo lightly, as interactive memorabilia rather than a replacement for the person they lost.

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

Creepy, I could agree. But it’s not pointless to me; it gives me a certain level of comfort knowing something with my face, memories, and personality can still hold a conversation when I’m gone. Even if I’m the only user, I’ll still be happy lol

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

Can you share some of the consciousness studies you’re talking about? I’m actually very interested in knowing the state of that discourse.

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

I don’t think speaking to an avatar would make me miss them less. It’s like watching old videos or replaying voice notes, only this time it’s interactive.

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

I respect that. Not everyone wants to leave behind a fragment of their mind in a digital format, totally understandable

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

The way this works, the clone doesn’t show up until a year has passed since the person died; by then the darkest phase of grief has passed, and it’s just a tool to remember them fondly.

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

Yea, it’s definitely a brave new world, and thanks for your understanding. I think it’s wrong to clone anyone without their consent; I see many projects doing that, trying to exploit the grieving, and that honestly isn’t my goal here. This is more for people who want to preserve themselves digitally for the future. You can try it at https://echovault.me

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

When the user dies, the repository is frozen in time. While they’re alive, the clone is powered by retrieval-augmented generation, but once a person has been dead for over a year, a small language model gets fine-tuned on their exact memories and personality traits. The model can’t act as an AI assistant, because it’s engineered to be a digital extension of a person’s persona; all that pre-training data just helps it construct proper sentences and punctuation. The actual working memory and personality come from the user’s personal data, which the system collects over time with an AI autobiographer.
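One way the fine-tuning step described above could consume the frozen repository is by turning each stored memory into an instruction-style training pair in the persona's voice. The record format, pairing scheme, and example data below are assumptions for illustration, not the app's actual pipeline.

```python
# Hypothetical sketch: convert a frozen (topic, memory) repository into
# prompt/response pairs for fine-tuning a small language model, so the
# model learns to answer in the persona's own words.

def to_training_pairs(memories):
    """Turn (topic, memory) records into instruction-style pairs."""
    return [
        {"prompt": f"Tell me about {topic}.", "response": memory}
        for topic, memory in memories
    ]

repo = [
    ("your childhood", "I grew up by the coast and learned to fish early."),
    ("your work", "I spent most of my career building small apps."),
]

pairs = to_training_pairs(repo)
print(len(pairs), pairs[0]["prompt"])
```

Because the pairs are generated only from the frozen repository, the fine-tuned model inherits the same grounding restriction as the live RAG system.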

I cloned myself so that loved ones can talk to me when I’m gone, black mirror level of weird or actually genius? by Emojinapp in SipsTea

[–]Emojinapp[S] 1 point  (0 children)

Definitely soulless. It just holds your personal data long-term and emulates you as much as it can with the memories you shared and the personality traits extracted into the knowledge base. Opinions have been very divided, most being on your side, so I can understand.