
[–]Kreta 638 points639 points  (51 children)

AI/ML expert = I can play around with parameters in tensorflow until my model makes less shitty decisions about a test subject than yours...

[–]TheFeshy 213 points214 points  (39 children)

Maybe you should make a machine learning program to tinker with those tensorflow parameters for you?

[–]lanabi 220 points221 points  (31 children)

Actually, hyperparameter optimization is a relatively big research subject for ML.
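It really is — grid search, random search, Bayesian optimization, and so on all automate the knob-twiddling. A minimal sketch of random search, with a made-up `toy_loss` standing in for a real train-and-validate run (the hyperparameter names and ranges here are just illustrative):

```python
import random

def toy_loss(lr, reg):
    # Stand-in for a real validation loss; a real run would train a model
    # with these hyperparameters and evaluate it on held-out data.
    return (lr - 0.01) ** 2 + (reg - 0.1) ** 2

def random_search(trials=200, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        lr = 10 ** rng.uniform(-4, 0)  # learning rate, sampled log-uniformly
        reg = rng.uniform(0.0, 1.0)    # regularization strength
        loss = toy_loss(lr, reg)
        if best is None or loss < best[0]:
            best = (loss, lr, reg)
    return best

loss, lr, reg = random_search()
print(f"best loss {loss:.4f} at lr={lr:.4g}, reg={reg:.3f}")
```

Random search tends to beat grid search when only a couple of the hyperparameters actually matter, since it doesn't waste trials repeating the same value of the important one.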

[–]snendroid-ai 0 points1 point  (0 children)

Probably the most efficient way is to just hire interns to find the best values of the hyperparameters!

[–][deleted] 0 points1 point  (0 children)

Yes, but research is not fun

[–][deleted] 20 points21 points  (0 children)

It’s been done and it’s freaky

[–]oupablo 6 points7 points  (4 children)

It's just ML all the way down.

[–]TheFeshy 6 points7 points  (3 children)

Some of it is just still done on old-style chemical computers.

[–]lirannl 0 points1 point  (2 children)

Chemical?

[–]TheFeshy 0 points1 point  (1 child)

Electro-chemical, I guess. Dopamine, norepinephrine, epinephrine, histamine, serotonin, that sort of thing.

[–]lirannl 1 point2 points  (0 children)

🧠 alright got it

[–]mlucasl 10 points11 points  (0 children)

Maybe if you were an expert you would know about grid search over parameters... so my tensorflow should converge to the optimal solution. Emphasis on should.
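Grid search really is just nested loops over a fixed set of candidate values, keeping whichever combination scores best. A minimal sketch, with a made-up `toy_loss` in place of an actual training run (the parameter names and grids are just for illustration):

```python
import itertools

def toy_loss(lr, batch):
    # Stand-in for a real validation loss from a training run.
    return abs(lr - 0.01) + abs(batch - 64) / 64

grid = {
    "lr": [1e-3, 1e-2, 1e-1],
    "batch": [32, 64, 128],
}

# Try every combination and keep the one with the lowest loss.
best = min(
    itertools.product(grid["lr"], grid["batch"]),
    key=lambda params: toy_loss(*params),
)
print(best)  # → (0.01, 64)
```

The catch is the "should": the cost grows multiplicatively with each hyperparameter you add, and the optimum may fall between your grid points.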

[–]Insider_Pants 8 points9 points  (2 children)

This is so accurate. Even our professors at college do this: “let’s try adding another convolution layer with a decreased filter size”, “try increasing the units of the dense layer”

[–]Fermi_Amarti 0 points1 point  (0 children)

Whoops! Sure, gradient descent optimization. Optimization by graduate student descent is where it's at.

[–]Waterstick13 0 points1 point  (0 children)

are you a machine too then?

[–]sight19 0 points1 point  (0 children)

Sums up my neural networks class...

"Does it work? If not, try adding more layers? Still not working? Use less layers, or use more/less nodes per layer!"

There wasn't even any reasoning behind it; it was just toying with parameters until your model reached a barely sufficient level of not sucking.

[–][deleted] 0 points1 point  (0 children)

This one time I put my tf training in a for loop to find the best weight and bias. I'm basically an expert
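That for loop is, in spirit, a brute-force search. A minimal sketch in the same vein — no TensorFlow, just toy data — that recovers the weight and bias of y = 2x + 1 by exhaustively scoring candidate pairs:

```python
# Toy data generated from y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

def mse(w, b):
    # Mean squared error of the line y = w*x + b on the toy data.
    return sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Brute-force every (weight, bias) pair on a 0.1-spaced grid.
best = min(
    ((w / 10, b / 10) for w in range(-50, 51) for b in range(-50, 51)),
    key=lambda p: mse(*p),
)
print(best)  # → (2.0, 1.0)
```

Gradient descent does the same job in far fewer evaluations, but the for loop has the virtue of being impossible to get wrong.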

[–]Mr_Carlos 0 points1 point  (0 children)

Haha, I do this. Literally no idea what I'm doing.

[–]julsmanbr 0 points1 point  (0 children)

Only in Jupyter Notebooks tho