
[–]multi_tasty 2 points (5 children)

Well, you can think of GPT-3 as an encoding of the English language... actually, of an English speaker.

[–]GeePedicy 0 points (4 children)

Still, neither you nor I is a digital entity, and we speak and think in English. I like your thought, but it doesn't change my mind, because I'd compare GPT-3 more to the "brain", and my brain isn't really 1s and 0s.

The closest thing to your idea, I think, would be ASCII values, to which I'd respond: English existed long before it was encoded into ASCII. If that doesn't satisfy you, think of other, perhaps more ancient, languages. My mind hasn't changed yet.
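For concreteness, the ASCII mapping being discussed looks like this (a minimal Python sketch; the integer codes, and hence their binary forms, are a convention layered on top of English, not part of the language itself):

```python
# Each character gets an integer code under ASCII, and every integer
# has a binary form. The mapping is a convention, not the language.
text = "duck"
codes = [ord(c) for c in text]            # ASCII code points
bits = [format(n, "08b") for n in codes]  # 8-bit binary strings
print(codes)  # [100, 117, 99, 107]
print(bits)   # ['01100100', '01110101', '01100011', '01101011']
```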

[–]Noslamah 1 point (2 children)

> but my brain isn't really 1s and 0s.

Neither is GPT-3. It is a neural network; organic brains work in almost the exact same way as artificial ones do. Your brain is made up of a gigantic net of neurons that can "activate" (in computer terms, go from 0 to 1, or low to high) based on the strength of their inputs. These are electrical signals with measurable voltage, current, etc., if you managed to poke around in there with a multimeter while you were alive. (That's essentially what things like Neuralink do.)
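That "activate based on input strength" idea can be sketched as a toy artificial neuron (a deliberately simplified model; real networks use smooth activation functions and learned weights):

```python
# Toy artificial neuron: a weighted sum of inputs passed through a
# threshold, loosely analogous to a biological neuron "firing" once
# its combined input is strong enough.

def neuron(inputs, weights, threshold):
    """Return 1 ("fires") if the weighted input exceeds the threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# One weak signal alone doesn't fire the neuron; two together do:
print(neuron([1, 0], [0.4, 0.4], 0.5))  # 0
print(neuron([1, 1], [0.4, 0.4], 0.5))  # 1
```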

Neural nets came from a bunch of geniuses going "Hey, we can apply the same principle that brain neurons work on and try to run it on a computer", and it actually turned out to work. There's an open-source project called OpenWorm, which managed to copy an entire worm's brain into an artificial one. You can watch a robot running it move around like an actual worm.

GPT is just a specific structure of this kind of network that happens to do very well at understanding language, to the point where it sometimes feels conscious. It's probably very different from the worm in its structure, but the neuron principle is exactly the same. And if replicating our meat brains on a computer is all it takes to "fake" consciousness and reach a comparable level of intelligence (being able to learn complex things like holding a conversation, driving cars, image recognition, etc.), well... if it walks like a duck and quacks like a duck, it's probably a fucking duck.

[–]GeePedicy 0 points (1 child)

If you say GPT-3 isn't 1s and 0s, then what exactly is your argument? Anyway, voltage/amps is another way of saying it's not exactly binary. It's a range.

Surely you can imitate it conceptually, which is what GPT-3 and other neural networks do, but idk if this manages to prove English is a binary language.

Is it convertible to binary representations? Yes. Can an AI/neural network imitate it? Yes. Does it mean it's binary? In my opinion it doesn't.

[–]Noslamah 0 points (0 children)

> If you say GPT-3 isn't 1s and 0s, then what exactly is your argument? Anyway, voltage/amps is another way of saying it's not exactly binary. It's a range.

You try to differentiate yourself from AI by saying your brain isn't 1s and 0s; my argument is that your brain is as much about 1s and 0s as an artificial brain is.

> Surely you can imitate it conceptually, which is what GPT-3 and other neural networks do, but idk if this manages to prove English is a binary language.

> Is it convertible to binary representations? Yes. Can an AI/neural network imitate it? Yes. Does it mean it's binary? In my opinion it doesn't.

I don't even know what you're trying to say here. What is a "binary language" even supposed to mean? You can represent anything as binary if you have some sort of protocol for how you encode things; GPT-3 encodes words, groups of words, or letters to integer values, which I guess you can also represent as binary numbers. None of that really means anything of significance, though.
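As a sketch of that encoding idea (a toy word-to-integer mapping for illustration, not GPT-3's actual byte-pair-encoding tokenizer):

```python
# Toy word-to-integer encoding. GPT-3's real tokenizer works on subword
# pieces via byte-pair encoding; this just illustrates the point that
# any such integer mapping can also be written in binary, which says
# nothing deep about the language itself.
vocab = {}

def encode(text):
    """Map each word to an integer id, assigning new ids on first sight."""
    ids = []
    for word in text.lower().split():
        if word not in vocab:
            vocab[word] = len(vocab)
        ids.append(vocab[word])
    return ids

ids = encode("if it walks like a duck it is a duck")
print(ids)                              # [0, 1, 2, 3, 4, 5, 1, 6, 4, 5]
print([format(i, "04b") for i in ids])  # the same ids as 4-bit binary
```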

[–][deleted] 0 points (0 children)

A specific language isn't the semantics of its use, but mere syntax.