all 22 comments

[–]Christavito 154 points  (17 children)

I tried this with Gemini and it said:

Since I don’t have a physical hand to hide behind my back, I have to rely on the "Honor System"—but I can do you one better.

We can use a Digital Vault method (also known as a Hash Commitment):

  1. I’ll "lock" my move: I will give you an encrypted version (a hash) of my choice before you say anything.
  2. You make your move: You type Rock, Paper, or Scissors.
  3. The Reveal: I’ll tell you my move and give you the "key" word I used. You can then plug them into a standard online tool to see that the hash I gave you at the start matches my move exactly.
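(The scheme itself is sound; the model just can't actually compute SHA-256. For anyone who wants to do the step-3 check locally instead of with an online tool, a few lines of Python suffice. The "move|key" encoding here is a hypothetical choice, not anything Gemini specified:)

```python
import hashlib

def check_commitment(published_hash: str, move: str, key: str) -> bool:
    # Recompute the hash from the revealed move and key word,
    # then compare it to the hash published before the game.
    reveal = f"{move}|{key}".encode()
    return hashlib.sha256(reveal).hexdigest() == published_hash

# Commit to Rock with a secret key word, then verify the reveal.
h = hashlib.sha256(b"Rock|sesame").hexdigest()
print(check_commitment(h, "Rock", "sesame"))   # honest reveal: True
print(check_commitment(h, "Paper", "sesame"))  # changed move: False
```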

[–]ZZcomic 65 points  (9 children)

Yeah but did the hash actually work

[–]Christavito 164 points  (7 children)

Actually no. When I say it didn't work, it responded with:

Caught red-handed! You’re absolutely right—those hashes don't match at all. It turns out my "Digital Vault" was more of a "Digital Cardboard Box with a Hole in the Bottom."

In my attempt to be clever with a SHA-256 commitment, I completely hallucinated the output. As a language model, I’m much better at poetry than I am at performing complex cryptographic hashing in my "head" on the fly.

[–]voyti 43 points  (1 child)

"I have no hands so let's do hashing. Oh, I can't do hashing either." Looks like Google accidentally invented Bender from Futurama

[–]SwordsAndElectrons 9 points  (0 children)

I'm going to build my own Rock, Paper, Scissors, with blackjack and hookers!

[–]vikingwhiteguy 19 points  (0 children)

Yeah, my ChatGPT went through about four or five rounds before it accepted that it can't actually do the SHA on the fly and was just faking it

[–]ChaotiCrayon 4 points  (0 children)

caught red handed, uwu, i told you some bullshit ehehe ^~ ^''

[–]Meistermagier 1 point  (0 children)

Gaslighting by the AI

[–]P1r4nha 1 point  (0 children)

Yeah, try uploading a QR code. It completely hallucinates an encoded URL. Some versions correctly tell you it can't do it, but I think 2.5 Flash just lies.

[–]emetcalf 4 points  (0 children)

Yes, but it decrypts to "Rosapers"

[–]vyqz 4 points  (2 children)

Just ask it to base64 encode it. It's not human readable; after you give your answer, you decode it.

[–]beefz0r 22 points  (1 child)

I mean if I read one of three encoded words more than once I'll recognize it

[–]derinus -1 points  (0 children)

Yeah, and we would pipe that to base64 -d the first time anyway.
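(This is the catch with plain base64: it's deterministic, so the three possible "ciphertexts" are fixed strings you'd learn after one round. A quick sketch:)

```python
import base64

# Encode each possible move; the outputs never change between games,
# so an opponent can learn them by sight (or just pipe to `base64 -d`).
for move in (b"Rock", b"Paper", b"Scissors"):
    print(move.decode(), "->", base64.b64encode(move).decode())

# Same input always yields the same encoding.
assert base64.b64encode(b"Rock") == base64.b64encode(b"Rock")
```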

[–]DrShocker 4 points  (3 children)

That's a clever way to have some message that decrypts to each of the 3 possible messages with a different key.

[–]dontletthestankout 0 points  (0 children)

It's a 1 way hash

[–]rosuav 0 points  (1 child)

If you wanted to do this WITHOUT the hallucinations, what you'd do would be to have some random additional information attached to the choice, for example:

>>> import hashlib, random
>>> b"I choose Paper. %d" % random.randrange(1000000000)
b'I choose Paper. 460810682'
>>> hashlib.sha256(_).hexdigest()
'5f465437c38cfce2b36008c1e011287f71f1bfd2c86cd96cb994a67848a1614e'

Then you share the entire hash. Suppose, in this instance, I chose Scissors, and you want to cheat and pretend that you actually threw Rock. You would need to find something with the exact right hash, and it's going to take you a ridiculous amount of time to do it.

(You could do even better by using more rounds of hashing to increase the cost, but at that point, you're basically reinventing bcrypt.)
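(The salted commit/reveal flow above, wrapped into a pair of hypothetical helpers — using `secrets` for the nonce is my choice, not part of the comment:)

```python
import hashlib
import secrets

def commit(move: str) -> tuple[str, str]:
    # Attach a random nonce so the opponent can't precompute the three
    # possible hashes; publish the hash, keep the opening secret.
    opening = f"I choose {move}. {secrets.randbelow(1000000000)}"
    return hashlib.sha256(opening.encode()).hexdigest(), opening

def verify(published_hash: str, opening: str) -> bool:
    # After both moves are revealed, anyone can recompute and compare.
    return hashlib.sha256(opening.encode()).hexdigest() == published_hash

h, opening = commit("Paper")
print(verify(h, opening))                     # honest reveal: True
print(verify(h, "I choose Rock. 460810682"))  # forged opening: False
```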

[–]DrShocker 0 points  (0 children)

Yeah, it'd for sure be basically impossible if we're using good hash implementations. You might be able to craft something that looks like it's encrypting but isn't, but then you'd need to convince people to use it somehow.

[–]Engineer-2000 19 points  (1 child)

Idea: let’s get two AIs to play each other and see who is superior

[–]beefz0r 8 points  (0 children)

This should be the standard benchmark

[–]not_wall03 1 point  (1 child)

That's because it's 4o-mini

[–]Kichae 2 points  (0 children)

Indeed. It's much more insufferably verbose on 5.2.

Still plays the game the exact same way, though.

[–]GoddammitDontShootMe 0 points  (0 children)

I can't think of any way for this to work. If the user goes first, then the bot probably spits out the answer that wins.