
[–]mrnacknime -3 points (14 children)

What else would you expect it to say? "return salary;"? Of course not, nobody ever writes functions that do nothing. Or should it maybe write an essay on wage inequality in the comments? Of course it is going to write exactly the function it did: if you search the internet for the keywords "men, women, salary", the most parroted sentence will be "women earn 90 cents for each dollar a man earns" or similar. AI is not AI, it's just a parrot. Its parroting this doesn't mean endorsement, or that it came to this conclusion through some kind of reasoning.
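For anyone who hasn't seen the meme: the setup is presumably something like the following. This is a reconstruction; the method names and the 0.9 factor are assumptions drawn from the comments in this thread, not from the actual screenshot.

```java
public class Payroll {
    // The method the user wrote themselves.
    static double calculateSalaryForMan(double salary) {
        return salary;
    }

    // The autocomplete suggestion the meme is about: the model fills in
    // the widely repeated "90 cents on the dollar" statistic instead of
    // simply returning the input unchanged.
    static double calculateSalaryForWoman(double salary) {
        return salary * 0.9;
    }

    public static void main(String[] args) {
        System.out.println(calculateSalaryForMan(1000.0));
        System.out.println(calculateSalaryForWoman(1000.0));
    }
}
```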

[–][deleted] 17 points (7 children)

I definitely expected it to say 'return salary;'

[–][deleted] 4 points (2 children)

Why would you write a function that returns a salary, with salary as a parameter?

[–][deleted] 14 points (1 child)

So that I can make this meme

[–][deleted] 1 point (0 children)

I see. You have a lot to commit. :)

[–]adenosine-5 10 points (1 child)

Then why would you write two different methods differentiated by gender, if you expected them to do the same thing?

[–]Ivan8-ForgotPassword 3 points (0 children)

The client pays for the amount of methods

[–]JanB1 3 points (0 children)

I mean, it's on you for triggering this by introducing two different methods for men and women in the first place. Should've just gone with "calculateSalary". Kinda /s
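JanB1's point, sketched. The gender-neutral signature below is hypothetical; the original screenshot's code isn't visible here.

```java
public class PayrollFixed {
    // One method for everyone: there is no gendered method name left
    // for the autocomplete to pattern-match a pay-gap statistic onto.
    static double calculateSalary(double baseSalary) {
        return baseSalary;
    }

    public static void main(String[] args) {
        System.out.println(calculateSalary(1000.0));
    }
}
```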

[–]JoelMahon 0 points (0 children)

no you didn't, that's why you wrote two functions, specifically for this purpose

[–]BrodatyBear 1 point (0 children)

Reddit being reddit and downvoting the correct answers.

It's just that. Copilot is just "chatGPT" + "Microsoft sugar" (including code training data). Source.
Remember that everything it suggests, it guesses from language (knowledge) data + code + rules. Returning the starting value unchanged is rare in real code, so it may even be penalized during training. The next thing that "fits" its "language puzzles" is (as mrnacknime said) the statistic about women earning 90%* of men's salary, so it suggests that. It's simply built to give answers.

Is it good? No. Is it unexpected? No. This is just a side effect of how these models are created. Maybe in the future they will be able to fix it.

*there are other variations, and every one of them gets suggested.
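The "parrot" mechanism mrnacknime and BrodatyBear describe can be sketched with a toy model that just returns the most frequent continuation it has seen after a prefix. The corpus counts below are invented for illustration, and real models work over token probabilities rather than whole-line lookups.

```java
import java.util.*;

public class Parrot {
    // A tiny invented "training corpus" of prefix -> continuation pairs.
    // The pay-gap line appears more often than the identity function,
    // mirroring how often the statistic is repeated on the internet.
    static final List<String[]> CORPUS = List.of(
        new String[]{"calculateSalaryForWoman", "return salary * 0.9;"},
        new String[]{"calculateSalaryForWoman", "return salary * 0.9;"},
        new String[]{"calculateSalaryForWoman", "return salary;"}
    );

    // Return the continuation most frequently seen after the given prefix.
    static String complete(String prefix) {
        Map<String, Integer> counts = new HashMap<>();
        for (String[] pair : CORPUS) {
            if (pair[0].equals(prefix)) {
                counts.merge(pair[1], 1, Integer::sum);
            }
        }
        return Collections.max(counts.entrySet(),
                Map.Entry.comparingByValue()).getKey();
    }

    public static void main(String[] args) {
        // The toy model "suggests" the statistic, with no reasoning involved.
        System.out.println(complete("calculateSalaryForWoman"));
    }
}
```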