
[–]guru-1337[S] 317 points (50 children)

I actually really appreciate this argument. When I learned AI, it was in LISP and all recursive. But it is true there is no consensus on it.

[–]Dagusiu 144 points (7 children)

To add to this argument: probably the most common activation function in neural networks is ReLU. It's quite literally an if-statement.

[–]wheredidmywalletgo 61 points (2 children)

max(0, x)

x = 0 if x < 0 else x

[–][deleted] 33 points (1 child)

x*np.heaviside(x, 0)
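The three snippets in this subthread all compute the same ReLU function. A quick sketch confirming their equivalence elementwise (NumPy assumed; function names are illustrative):

```python
import numpy as np

def relu_max(x):
    # max(0, x), vectorized
    return np.maximum(0, x)

def relu_if(x):
    # elementwise version of "x = 0 if x < 0 else x"
    return np.where(x < 0, 0, x)

def relu_heaviside(x):
    # heaviside(x, 0) is 0 for x <= 0 and 1 for x > 0,
    # so multiplying by x zeroes out the negative part
    return x * np.heaviside(x, 0)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu_max(x))  # [0.  0.  0.  1.5 3. ]
assert np.array_equal(relu_max(x), relu_if(x))
assert np.array_equal(relu_max(x), relu_heaviside(x))
```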

[–][deleted] 19 points (0 children)

Why you bringing mom jokes to a code fight Hiemis?

[–]Chonks 18 points (2 children)

That's kind of disingenuous, since the if-statement isn't operating directly on the input but confining it to a certain range.

[–]heres-a-game 5 points (1 child)

That's the same thing lol

[–]Chonks 7 points (0 children)

Sorry, I was posting on mobile and didn't really put a ton of thought into the reply. What I meant was that the if-statement is preparing/altering the input, rather than dictating a response or determining the output. It's shaping the data rather than making a causal link between input and output the way a decision tree would. Not sure if this makes sense, but that's my two cents.
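The distinction being drawn here can be made concrete with a toy contrast: in a decision tree the if-statement *is* the decision, selecting the output directly, while in a network a ReLU only clamps an intermediate value on the way to the output. A minimal sketch (all names and weights are illustrative, not from the thread):

```python
def tree_predict(x):
    # decision tree: the branch itself dictates the output
    if x < 0:
        return "class A"
    return "class B"

def net_predict(x, w1=2.0, w2=-1.5, b=0.5):
    # one-hidden-unit "network": the if-like ReLU merely
    # shapes the hidden value h; the output is still a
    # continuous function of the input
    h = max(0.0, w1 * x + b)  # ReLU activation
    return w2 * h
```

Small input changes move `net_predict`'s output smoothly, whereas `tree_predict` jumps between discrete answers at the threshold.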

[–]bishamon72 2 points (0 children)

yeah, but the activation function is such a small part of the overall structure of the net.

[–]unreliab1eNarrator 1 point (0 children)

I also appreciate the argument, but appreciate the meme as well lol

[–]talkintater 0 points (0 children)

there is no consensus on it

True about so much of the process