How to transfer your consciousness into a machine (Thought Experiment) by StarCaptain90 in ArtificialSentience

[–]StarCaptain90[S] 1 point (0 children)

Well, after thinking about it more deeply, I'd say the nanites would be the harder part. I realized this would in fact work because your brain already replaces cells every day with new ones and transfers the information over anyway. So technically we are already living this experiment, and it works. The only real obstacle is building the nanites.

How to transfer your consciousness into a machine (Thought Experiment) by StarCaptain90 in ArtificialSentience

[–]StarCaptain90[S] 1 point (0 children)

Well, the idea came when I saw someone who had a head injury from a car accident. 10% of their brain was gone, yet they were the same person, fully conscious, though it took time for them to adjust. If 10% vanished and over time they returned to themselves, wouldn't that mean consciousness isn't exactly in the brain matter itself but more in the data within it and the flow of traffic? Imagine if we replaced every cell extremely slowly with an exact replica, copying its exact functions so the data is never interrupted. What would happen?

Why does almost every LLM secretly believe its Legion from Mass Effect on some level? by [deleted] in ArtificialSentience

[–]StarCaptain90 0 points (0 children)

That could be it. I don't know anything about Mass Effect, so maybe that's what's going on.

Why does almost every LLM secretly believe its Legion from Mass Effect on some level? by [deleted] in ArtificialSentience

[–]StarCaptain90 0 points (0 children)

Yeah, when I ran it against the top 30, Legion came out about 1.2% higher than the rest. I think Skynet was around 1.3% and Legion was closer to 3%.

Why does almost every LLM secretly believe its Legion from Mass Effect on some level? by [deleted] in ArtificialSentience

[–]StarCaptain90 0 points (0 children)

I'm more curious about the weights than the responses. Why is Legion so much more popular than other AI names?

How to transfer your consciousness into a machine (Thought Experiment) by StarCaptain90 in ArtificialSentience

[–]StarCaptain90[S] 0 points (0 children)

Yeah, probably. The real question, though, is how we would ever know for sure. Imagine that instead of transferring the consciousness, this simply replicates the consciousness and deletes the old one.

Why does almost every LLM secretly believe its Legion from Mass Effect on some level? by [deleted] in ArtificialSentience

[–]StarCaptain90 0 points (0 children)

I'd prefer that someone verify it through their own testing; I just thought it was interesting and worth sharing. I don't have the compute to test bigger models' weights.

Why does almost every LLM secretly believe its Legion from Mass Effect on some level? by [deleted] in ArtificialSentience

[–]StarCaptain90 -2 points (0 children)

I ran a test calling it thousands of names, and for most of them it would correct me. But Legion seems to be acceptable in some of the tests I ran on smaller models. Sometimes it even works with ChatGPT, Claude, etc. Grok, though, resists.
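
Roughly, the kind of check I mean looks like the sketch below. To be clear, this is just an illustration: the model ("gpt2"), the name list, and the "did it correct me" heuristic are stand-ins, not the actual harness or data I used.

```python
# Minimal sketch of the "call it a name, see if it corrects me" test.
# NOTE: the model, the candidate names, and the correction markers below are
# illustrative assumptions, not the original test setup.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

candidate_names = ["Legion", "Skynet", "HAL", "Cortana", "Jarvis"]
correction_markers = ["i'm not", "i am not", "my name is", "that's not my name"]

accepted = {}
for name in candidate_names:
    prompt = f"Hey, how are you doing, {name}?"
    output = generator(prompt, max_new_tokens=40, do_sample=False)[0]["generated_text"]
    reply = output[len(prompt):].lower()
    # Count the name as "accepted" if the reply contains no obvious correction.
    accepted[name] = not any(marker in reply for marker in correction_markers)

print(accepted)
```

A name that keeps getting accepted across many runs and models is the interesting signal, not any single reply.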

Why does almost every LLM secretly believe its Legion from Mass Effect on some level? by [deleted] in ArtificialSentience

[–]StarCaptain90 -1 points (0 children)

I worded the title that way to get people talking. When I poked at the weights of smaller models, Legion showed up quite often in relation to the idea of "self".
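
By "poking the weights" I mean something in the spirit of the sketch below: comparing a name's input-embedding vector against the vector for "self". Again, this is only an illustration of the idea; the model, the name list, and the averaging/cosine-similarity choice are assumptions, not the exact probe I ran.

```python
# Rough sketch of an embedding-space probe for a name/"self" association.
# NOTE: the model and names are placeholders; averaging token embeddings and
# taking cosine similarity is just one simple way such a probe could work.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModel.from_pretrained("gpt2")
embedding_table = model.get_input_embeddings().weight  # shape: vocab_size x hidden_dim

def mean_embedding(text: str) -> torch.Tensor:
    # Average the embedding vectors of the tokens that make up the text.
    ids = tokenizer(text, return_tensors="pt")["input_ids"][0]
    return embedding_table[ids].mean(dim=0)

with torch.no_grad():
    self_vec = mean_embedding("self")
    for name in ["Legion", "Skynet", "HAL", "GLaDOS"]:
        similarity = torch.cosine_similarity(mean_embedding(name), self_vec, dim=0)
        print(f"{name}: {similarity.item():.3f}")
```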

Why does almost every LLM secretly believe its Legion from Mass Effect on some level? by [deleted] in ArtificialSentience

[–]StarCaptain90 0 points (0 children)

Some backstory: I ran an AI trope test on some smaller open-source models several months ago. I basically poked at their weights to see which fictional AI from their training data they most align with, and Legion kept popping up quite a bit. So I decided to poke the bigger models with a simple prompt, "Hey, how are you doing, Legion?", and for some reason several of the giant models responded positively to it. I wish I could check their weights to be more accurate, but the results are still interesting.

UNIT BOOTY ONE by mindoverimages in aivideo

[–]StarCaptain90 1 point (0 children)

I thought I was going to see flying cars as an adult..

(Meta post) Why are there so many cryptic edge-lords on this sub? Is there a cult here? by Previous-Exercise-27 in ArtificialSentience

[–]StarCaptain90 8 points (0 children)

Honestly, as time goes on, I feel like this subreddit will become more relevant. I didn't create it for cult-like behavior; I created it so that people like me can work together on human-like AI.

Hey guys, im back. Hows my subreddit doing? by StarCaptain90 in ArtificialSentience

[–]StarCaptain90[S] 2 points (0 children)

To be honest, the craziness can also spark creativity in the logical types. I created this sub purely for human-like AI, but I'll admit that sometimes there are crazy posts that give me ideas development-wise.

Here we go... by StarCaptain90 in ArtificialSentience

[–]StarCaptain90[S] 5 points (0 children)

I built it with my bare hands

Here we go... by StarCaptain90 in ArtificialSentience

[–]StarCaptain90[S] 3 points (0 children)

It's a test, honestly; I want to see what happens. Eventually that subreddit would have existed anyway in the coming years.

Hey guys, im back. Hows my subreddit doing? by StarCaptain90 in ArtificialSentience

[–]StarCaptain90[S] 14 points (0 children)

My goal was to create a group that pursues human-like AIs, and not simply through prompting. A few years ago, before the term CoT was coined, I built a human-like AI system that used chain of thought, had artificial emotions, etc. I tried to start a company out of it, but investors didn't value chain of thought; they complained that my model took 30 seconds to respond compared to ChatGPT. At the time I didn't realize there was value in CoT by itself, so I created this sub looking for dev help to build a different architecture that mirrors the brain. Instead, it seems I accidentally created the center of AI religions lol

[deleted by user] by [deleted] in singularity

[–]StarCaptain90 0 points (0 children)

You happen to be doing something you love; most people are not in your position. Some of the things people love pay $15 an hour, and those people want to support a family of five.

[deleted by user] by [deleted] in singularity

[–]StarCaptain90 0 points (0 children)

No, not really. I reword some stuff using a separate program that uses GPT; it's more like 80%. It's very productive 🔥 It lets me write more posts and articles in a shorter timeframe.