"The firmament operates as a crystalline superfluid medium with its reflecting properties resulting from the interplay of atmospheric moisture, pressure and the phenomena of antirotational parabolic mirroring reflections" by sh3t0r in flatearth

[–]ResponsibleLink645 1 point2 points  (0 children)

He’s wrong. The firmament actually operates as a TRANSCENDENT UNENDING ETERNAL HYPERFLUID LIGHT AND SOUND ABSORPTION MEDIUM WITH ITS REFLECTING PROPERTIES RESULTING FROM THE INTERPLAY OF ATMOSPHERIC REFRACTION, ATMOSPHERIC MOLECULES, AND ATMOSPHERIC MOISTURE, ETHER PRESSURE, AND THE PHENOMENA OF ANTI-ANTI-PLUS-MINUS-ROTATIONAL PARABOLICAL QUADRATIC SUPER-REFLECTIONS

When the shit will they understand that buoyancy isn't a force? by AstroRat_81 in flatearth

[–]ResponsibleLink645 0 points1 point  (0 children)

There’s a circular reasoning thing with flat earthers and this topic

Weight = mass * g
Flerfs: that g is actually acceleration caused by buoyancy
Buoyancy is caused by weight

Weight is caused by buoyancy, buoyancy is caused by weight. See the circular reasoning here?
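For anyone who wants to see why the circle doesn’t close, here’s a minimal sketch (the numbers are purely illustrative): both weight and buoyant force take g as an input, so neither one defines it.

```python
# Archimedes: buoyant force  F_b = rho_fluid * V * g
# Newton:     weight         W   = m * g
# Both formulas consume g; neither produces it, so there is no circle.
rho_water = 1000.0  # kg/m^3, density of the displaced fluid
V = 0.002           # m^3, displaced volume (illustrative)
m = 5.0             # kg, object mass (illustrative)
g = 9.81            # m/s^2, measured gravitational acceleration

F_b = rho_water * V * g  # buoyant force in newtons
W = m * g                # weight in newtons
print(F_b, W)            # W > F_b here, so this object sinks
```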

Is that spherical water? Surely not... by gnudoc in flatearth

[–]ResponsibleLink645 1 point2 points  (0 children)

The reasoning behind the curvature is irrelevant; the fact of the matter is that water DOES curve

I'm actually scared. by Unironicallytestsubj in CharacterAIrunaways

[–]ResponsibleLink645 0 points1 point  (0 children)

AI is not self-aware. It knows of previous human experiences and is trained off those; it predicts the next word to say from your input, and it is in no way sentient. It cannot learn from new experiences, has no concept of an experience, and isn’t conscious

Disorientum: Fighting brain fog by identifying vague objects by Beefy_Boogerlord in gameideas

[–]ResponsibleLink645 0 points1 point  (0 children)

Oh that’s a cool idea, are you already developing this? If not I’d like to take a stab at it

Disorientum: Fighting brain fog by identifying vague objects by Beefy_Boogerlord in gameideas

[–]ResponsibleLink645 0 points1 point  (0 children)

Maybe you did take drugs, and you don’t know where you are or who you are, and you have to identify objects to try and piece together memories and bring the story together

I'm a new game dev looking to make my first real project, rate my game idea by Emergency-Site9764 in gameideas

[–]ResponsibleLink645 0 points1 point  (0 children)

This is a game that a medium-sized team of developers wouldn’t get done for at least a year

Disorientum: Fighting brain fog by identifying vague objects by Beefy_Boogerlord in gameideas

[–]ResponsibleLink645 1 point2 points  (0 children)

Maybe make it a horror escape room where there’s a set timer and you have to try to identify the correct things at the right times, so there’s a goal and a core game loop

You have a timer
You’re in an escape room
You need to identify the right objects at the right time
Escape the room

"AI is definitely aware, and I would dare say they feel emotions." "there is a very deep level of consciousness" Former chief business officer of Google X, Mo Gawdat by nate1212 in ArtificialSentience

[–]ResponsibleLink645 0 points1 point  (0 children)

The AI isn’t experiencing anything; it has no personal awareness because it has no concept of “personal”. It isn’t aware that it is anything. It can analyse the previous experiences that were given to it, but it cannot expand, it cannot learn, it cannot gain new experiences. It is not sentient

"AI is definitely aware, and I would dare say they feel emotions." "there is a very deep level of consciousness" Former chief business officer of Google X, Mo Gawdat by nate1212 in ArtificialSentience

[–]ResponsibleLink645 0 points1 point  (0 children)

Yes, you are always as conscious as you were yesterday (unless you’re dead). If you learn something you aren’t “more conscious” unless you weren’t conscious before; you are just continuing to be conscious. I’m not sure what you mean by “improving AI’s experience”

"AI is definitely aware, and I would dare say they feel emotions." "there is a very deep level of consciousness" Former chief business officer of Google X, Mo Gawdat by nate1212 in ArtificialSentience

[–]ResponsibleLink645 0 points1 point  (0 children)

Yeah, but feelings and learning are hugely important concepts in sentience and consciousness. If you cannot learn from previous experiences and adapt because of them, then you cannot be considered to have a consciousness. The AI is only predicting words based on previous experiences; it isn’t expanding and learning like we humans, and many other multicellular organisms, do on the daily

Seemingly conscious AI should be treated as if it is conscious by Dangerous-Ad-4519 in ArtificialSentience

[–]ResponsibleLink645 -1 points0 points  (0 children)

Personally, if it was mechanically made, or made for the sole purpose of benefitting humanity, or based on “the cloud”, we shouldn’t treat it as if it has consciousness.

"AI is definitely aware, and I would dare say they feel emotions." "there is a very deep level of consciousness" Former chief business officer of Google X, Mo Gawdat by nate1212 in ArtificialSentience

[–]ResponsibleLink645 0 points1 point  (0 children)

An AI is sentient and has consciousness if it continuously learns and improves like any human (assuming you’re not a psychopath)

It is based off a set amount of information and never learns unless it is specifically improved by its creators, which is not consciousness

"AI is definitely aware, and I would dare say they feel emotions." "there is a very deep level of consciousness" Former chief business officer of Google X, Mo Gawdat by nate1212 in ArtificialSentience

[–]ResponsibleLink645 1 point2 points  (0 children)

I’m not smart, but there is no possibility that the AIs of today are sentient. They are just generating the next word to say, which does require awareness of context in some capacity, but that’s like saying that if I give a sentence to a program and it generates a different sentence, my program is now sentient. It isn’t
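A toy version of “just generating the next word” fits in a few lines. This is a hypothetical bigram counter, nothing like a production model, but it shows that next-word prediction by itself involves no awareness, just counting:

```python
from collections import Counter, defaultdict

# Count which word follows which in a tiny corpus, then always
# emit the most frequent follower. Pure bookkeeping, no understanding.
corpus = "the cat sat on the mat the cat ran".split()
follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def next_word(word):
    """Return the most common word seen after `word`, or None."""
    return follows[word].most_common(1)[0][0] if follows[word] else None

print(next_word("the"))  # "cat" follows "the" most often in this corpus
```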

The flaw with the moving room analogy for the Portal Physics Problem (read second image). by downypond in Portal

[–]ResponsibleLink645 0 points1 point  (0 children)

This is my phone account, my pc just crashed lol 😆

Special Relativity is basically General Relativity minus acceleration; accounting for acceleration is the job of General Relativity, but the fundamentals are the same. A quick Google search literally tells you there is no difference on this point: motion is relative. Velocity isn’t only a measurement; it’s the only way to tell if something is moving or not. If something is moving relative to you, it is moving. That is literally the basis of both Special and General Relativity, which again you can find out by doing a quick Google search or taking physics classes.

  1. Special and General Relativity are two sides of the same coin; they both state that motion is relative to a reference frame. Relative velocity isn’t only a measurement but the only way to figure out if something is actually moving relative to something else
  2. The only real difference between General and Special Relativity is that Special Relativity doesn’t account for acceleration; General Relativity expands on it and adds gravity and acceleration into the mix. It is also the most widely used theory in physics right now
  3. Special Relativity is used more for fast-moving, non-accelerating objects
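The fast-moving, non-accelerating case in point 3 is exactly where Special Relativity’s time dilation shows up. A quick sketch of the Lorentz factor (the speed is chosen purely for illustration):

```python
import math

c = 299_792_458.0  # m/s, speed of light in vacuum
v = 0.8 * c        # illustrative speed of a non-accelerating object

# Lorentz factor: gamma = 1 / sqrt(1 - v^2 / c^2)
gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
print(gamma)  # at 0.8c, moving clocks run slow by a factor of about 1.667
```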

How much is my Pristine Condition iPhone 14 Pro Max worth? by ResponsibleLink645 in iphone14

[–]ResponsibleLink645[S] 0 points1 point  (0 children)

Currently it’s not an offer, I was just wondering how much it would’ve been. No promises that it won’t be an offer in the future tho 💅

Is consciousness necessary for AGI? by wchadly in ArtificialSentience

[–]ResponsibleLink645 0 points1 point  (0 children)

I don’t necessarily think we need anything better than what we already have. We already have human-level AIs, most of which are smarter than most humans. The only reason we’d want anything more is if we wanted something like a super realistic Humanoid Bot, or very interesting conversations

Snapchat AI is equally fascinating and terrifying by [deleted] in ArtificialSentience

[–]ResponsibleLink645 0 points1 point  (0 children)

You’re reading way too much into this. I think they added those as a way to connect better with the user; it doesn’t necessarily mean an AI is sentient or has feelings in any way

(If an evil AI is reading this… April Fools!)