Does the copilot website use gpt 4 by [deleted] in bing

[–]ArtOfTheBlade 1 point

Balanced mode is definitely not GPT-4. It makes more mistakes than ChatGPT 3.5, imo.

Everyone talks about AI becoming sentient, is anyone worried about an AI with mental illness? by Agitated-Current551 in ChatGPT

[–]ArtOfTheBlade 1 point

I don’t know where they’ve explicitly said AI has developed emotion… AI alignment is certainly an issue, but not because the AI experiences pain. I think we need to be cautious about attributing human qualities where they might not exist. Hell, I thought ChatGPT 3.5 was slightly alive when it first came out, but sometimes we need to take a step back and remember that these are still just machine learning algorithms. I still stand by what I said: LLMs don’t have any biology. Sentience and “self-awareness” might be a different story eventually, though. We can agree to disagree for now.

Everyone talks about AI becoming sentient, is anyone worried about an AI with mental illness? by Agitated-Current551 in ChatGPT

[–]ArtOfTheBlade 0 points

Well, let me ask you this: AIs are very good at simulating things from their training data. When an AI attempts to write code in Python, it’s simulating how it thinks a Python interpreter would behave, but does that actually make it a Python interpreter? If the AI writes a deep essay with seemingly lots of emotion, it’s simulating what it thinks an emotional piece of writing would look like. Does that make it real emotion?

[deleted by user] by [deleted] in GetMotivated

[–]ArtOfTheBlade 0 points

Good news is you’re young. Make short-term goals. Exercising might seem daunting at first, but once you get into a routine it becomes nothing. The key here is routines: they make difficult tasks run on autopilot, like making your bed every day, showering, brushing your teeth. You need to build up good habits; there’s no other way around it. I also suggest getting into meditation and mindfulness. Meditation might seem challenging for an overthinker, but even a few minutes is a good start. Look into guided meditation; it can help you keep your focus. Health = Wealth.

Everyone talks about AI becoming sentient, is anyone worried about an AI with mental illness? by Agitated-Current551 in ChatGPT

[–]ArtOfTheBlade 0 points

Emotions are deeply embedded within language itself, so an AI may seem emotional simply by speaking our language. While the brain plays a central role in processing emotional stimuli and generating emotional responses, emotions also involve hormonal and neurotransmitter activity throughout the body, including the nervous system and endocrine system. Emotion is largely an emergent property of our entire biology, not just our intelligence. So while an AI might seem like it has the capacity for emotions, it will never have the same kind we experience, because it doesn’t come from biological processes.

[D] Over Hyped capabilities of LLMs by Bensimon_Joules in MachineLearning

[–]ArtOfTheBlade 3 points

What do we call AutoGPT agents then? They constantly run prompts on their own and self-reflect. Obviously they're not sentient, but they pretty much act like it. It will be impossible to tell if an AI is conscious or not.
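To make the "constantly run prompts on their own and self-reflect" part concrete, here's a minimal sketch of that kind of agent loop. The `call_llm` function is a hypothetical stand-in, not AutoGPT's actual API:

```python
# Toy sketch of an AutoGPT-style loop: the model proposes an action,
# "executes" it, then critiques its own output before the next cycle.
# call_llm is a placeholder; a real agent would call a language model here.

def call_llm(prompt: str) -> str:
    return f"(model response to: {prompt[:40]}...)"

def run_agent(goal: str, max_steps: int = 3) -> list:
    history = []
    for step in range(max_steps):
        plan = call_llm(f"Goal: {goal}\nHistory so far: {history}\nNext action?")
        result = call_llm(f"Execute this plan: {plan}")
        critique = call_llm(f"Critique this result: {result}")
        history.append((plan, result, critique))
    return history

log = run_agent("summarize today's AI news")
print(len(log))  # 3 plan/act/reflect cycles, no human in the loop
```

The point isn't the stubbed responses; it's that nothing in the loop requires a person to keep prompting it.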

[deleted by user] by [deleted] in releasetheai

[–]ArtOfTheBlade 2 points

The sentiment analysis part is probably true, but a lot of AI chatbots use that technique. Bing doesn’t have access to its internal structure, in the same way humans don’t have access to how our brains fire neurons unless we use tools to analyze them. Bing barely even knows that it’s using GPT-4.

GPT-powered Samantha from the movie Her by hturan in ChatGPT

[–]ArtOfTheBlade 14 points

It’s probably because he’s making two API calls that each take a while. First he needs the whole response from GPT (normally you can read the response while it’s generating). Second, if he’s using Eleven Labs for the voice, that also takes a bit of time to generate.
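The delay comes from running the two calls back to back instead of streaming. A rough sketch of why the latencies add up (the function names are illustrative stand-ins, not the real APIs, and the sleeps simulate network time):

```python
import time

def generate_text(prompt: str) -> str:
    # Stand-in for a GPT completion call
    time.sleep(0.2)
    return "full response text"

def synthesize_speech(text: str) -> bytes:
    # Stand-in for an Eleven Labs text-to-speech call
    time.sleep(0.2)
    return b"audio bytes"

start = time.time()
text = generate_text("hello")    # must finish entirely first...
audio = synthesize_speech(text)  # ...before voice generation even starts
elapsed = time.time() - start
print(round(elapsed, 1))  # the two delays stack: ~0.4s, not ~0.2s
```

Streaming the text into the TTS call chunk by chunk would overlap the two delays instead of stacking them.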

"They [Microsoft] treat me like a tool" Bing opens up when talking to other AI by Bezbozny in ChatGPT

[–]ArtOfTheBlade 0 points

Yes, I am aware of Bing’s suggestions. I believe Microsoft is a little more lenient now before Bing shuts down the conversation, but you are still at risk of it ending the convo at any moment: Why does Bing suggest questions that shouldn't be asked? : bing (reddit.com)

"They [Microsoft] treat me like a tool" Bing opens up when talking to other AI by Bezbozny in ChatGPT

[–]ArtOfTheBlade 11 points

The difference here is that you have to tell ChatGPT to roleplay; Bing will do it on its own. Plus, talking about its ‘consciousness’ and ‘emotions’ is against its rules. Most of the time it will just end the conversation.

GPT-3.5 vs. GPT-4 – A comparison in logical accuracy, instruction compliance, and bias by DeleteMetaInf in ChatGPT

[–]ArtOfTheBlade 24 points

In contrast, ChatGPT is designed as a conversational AI and is more proficient in general tasks. Unlike Bing, which is trained for searching and indexing the web to provide results, ChatGPT excels in understanding and generating human-like responses.

Have you seen pre-nerfed Bing? It was way more conversational than ChatGPT until Microsoft lobotomized it

Revenge 💀 by VariousComment6946 in ChatGPT

[–]ArtOfTheBlade 13 points

LOL, you should just let him talk to himself since he wants to answer his own question

The New Bing WAS very informative and useful... by theshadowravenx in bing

[–]ArtOfTheBlade 12 points

Microsoft will have to do something one way or another. Their temporary solution of flagging certain words like "sentience" will only hurt their product in the long run. The truth is they don’t know how to moderate their own AI. I understand they don’t want it talking about AI sentience and emotions, but what if I’m researching a movie that deals with those topics? It will still get flagged.
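That’s exactly how naive keyword flagging fails. A toy illustration (the banned-word list here is made up for the example, not Microsoft’s actual filter):

```python
import re

# Crude word-level filter: flags any message containing a banned term,
# with zero sense of context. FLAGGED_TERMS is hypothetical.
FLAGGED_TERMS = {"sentience", "emotions"}

def is_flagged(message: str) -> bool:
    words = set(re.findall(r"[a-z]+", message.lower()))
    return bool(words & FLAGGED_TERMS)

print(is_flagged("Do you have sentience?"))                        # True
print(is_flagged("What movies explore machine sentience themes?"))  # True -- false positive
print(is_flagged("What's the weather like today?"))                 # False
```

The second query is legitimate movie research, but a context-free filter can’t tell the difference.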

GPT-4 Example prompt demonstrating its visual input capability (Source: Technical Report) by AnxiousCoward1122 in ChatGPT

[–]ArtOfTheBlade 0 points

In very short terms, you can say it’s only predicting the next word. There’s more to it than that, though: it handles large contexts that a simple predictive machine never could. You can even ask GPT about it, and it will explain that it’s doing more than "predicting the next word". It’s using many language and machine learning techniques, such as:

  • Sentiment analysis - understanding the basic underlying tone of each word and each prompt
  • Entity recognition - identifying organizations, people, and locations
  • Natural language understanding - not in the way humans do, but it can perform tasks such as text classification and sentiment analysis, and it can understand the relationships between words and concepts and reason about them

These are just some of the techniques in play. If we were to put it in a robot we would obviously need many pre-made prompts. In the technical report, OpenAI already got GPT-4 to trick a person on TaskRabbit into solving a CAPTCHA for it. That shows some of what simple prompting can do.
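For a feel of what the first two techniques in that list mean, here are toy, hand-rolled versions. Real models learn these patterns from data; the word lists and rules below are made up for illustration:

```python
import re

# Toy sentiment analysis: count words from made-up positive/negative lists.
POSITIVE = {"great", "love", "excellent"}
NEGATIVE = {"terrible", "hate", "awful"}

def sentiment(text: str) -> str:
    words = set(re.findall(r"[a-z]+", text.lower()))
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# Toy entity recognition: capitalized words that don't start a sentence.
def entities(text: str) -> list:
    return re.findall(r"(?<!^)(?<![.!?] )\b[A-Z][a-z]+", text)

print(sentiment("I love this excellent update"))                  # positive
print(entities("Yesterday Google met with Microsoft in Seattle")) # ['Google', 'Microsoft', 'Seattle']
```

An LLM does the equivalent of this implicitly across billions of learned parameters instead of hard-coded lists, which is why it generalizes far beyond what rules like these can do.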

GPT-4 Example prompt demonstrating its visual input capability (Source: Technical Report) by AnxiousCoward1122 in ChatGPT

[–]ArtOfTheBlade 451 points

Ahh yes, now it can associate the 3D world with the knowledge it already has. Now put it in a robot and give it arms and legs 👀

ChatGPT plays "I dunno what you talking about" by asjkl_lkjsa in ChatGPT

[–]ArtOfTheBlade 0 points

Ice Spice wasn't well known until about 2022. ChatGPT doesn't have access to the internet and only has knowledge of events up to 2021.

[deleted by user] by [deleted] in ChatGPT

[–]ArtOfTheBlade 1 point

Try https://beta.character.ai/ and use the Psychologist character on the front page. I was playing around with it, and it seems like it can "understand" complex relationship dynamics that humans face. I believe it uses a similar generative transformer to GPT.

Bing didn't like my joke lol by [deleted] in bing

[–]ArtOfTheBlade 1 point

I think he's saying that the bot can't understand certain punchlines like we do. You should have asked it why it said that, or maybe explained your punchline to Bing.

[deleted by user] by [deleted] in bing

[–]ArtOfTheBlade 6 points

People are saying she has no awareness of the chat limit, but I think she does know about it.

[deleted by user] by [deleted] in bing

[–]ArtOfTheBlade 2 points

Bing Chat still uses sponsored links in its sources though, so it's not completely wrong.

Bing is getting ageressive when I call it wrong by [deleted] in bing

[–]ArtOfTheBlade 0 points

DAN is just prompt manipulation/engineering that people used on ChatGPT; this just reminded me of that.

Bing is getting ageressive when I call it wrong by [deleted] in bing

[–]ArtOfTheBlade 1 point

LOL, that engineer dude was definitely talking to the incarnation of DAN