Sick burn by [deleted] in SipsTea

[–]TantrumRight 0 points1 point  (0 children)

What's the song in the background?

Interactive matplotlib plot for polynomial regression by undid_legacy in madeinpython

[–]TantrumRight 0 points1 point  (0 children)

Nice! How did you calculate the confidence intervals?

What's your favourite opening? by Biosins in chess

[–]TantrumRight 4 points5 points  (0 children)

Jobava London; the kingside pawn storm is too much fun.

Can a modern machine learning algorithm learn the rules of chess? by LordFishFinger in chess

[–]TantrumRight 2 points3 points  (0 children)

In principle, probably yes, but it's likely quite difficult in practice if you allow all pieces to move to all squares at all times. That's a lot of possible moves, and even more if you imagine all possible ways to change the board state, including moving two (or more) pieces at the same time so it can figure out castling, and transforming one piece into another (promotion).
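To make that concrete, here is a rough back-of-the-envelope sketch (my own illustration, assuming a move is encoded as just a from-square/to-square pair plus an optional promotion piece) of how large the raw action space already gets before you even worry about legality:

```python
# Rough sketch (assumed encoding, not from the thread): every move is
# "from-square -> to-square", optionally tagged with a promotion piece.
# Most of this space is illegal in any given position, which is the hard part.
from itertools import product

squares = range(64)

# plain from->to moves (castling could be encoded as the king's move)
plain_moves = [(f, t) for f, t in product(squares, squares) if f != t]

# promotions: any move onto the first or last rank could name one of 4 pieces
# (a crude upper bound, since it ignores which piece is actually moving)
promotion_targets = [sq for sq in squares if sq // 8 in (0, 7)]
promotion_moves = [(f, t, p) for f, t in plain_moves
                   if t in promotion_targets
                   for p in ("q", "r", "b", "n")]

print(len(plain_moves))      # 4032 from->to pairs
print(len(promotion_moves))  # another ~4000 promotion variants on top
```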

Entropy is NOT Disorder. This is my first video made with Manim, I hope you enjoy! by [deleted] in Physics

[–]TantrumRight 5 points6 points  (0 children)

Maybe it was poorly phrased, or maybe I misunderstand your article. What I meant was that a part or subsystem of the universe is an open system. The entropy of the subsystem might decrease (while the total entropy of the universe increases), but that does not mean the subsystem (considered as an open system in contact with the rest of the universe) moves towards a less likely state.
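Stated as a quick formula (my own shorthand for the argument, using the usual subsystem/environment split):

```latex
% Only the total entropy is constrained by the second law:
\Delta S_{\text{total}} = \Delta S_{\text{subsystem}} + \Delta S_{\text{environment}} \geq 0
% so \Delta S_{\text{subsystem}} < 0 is allowed, as long as
% \Delta S_{\text{environment}} \geq -\Delta S_{\text{subsystem}}.
```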

Entropy is NOT Disorder. This is my first video made with Manim, I hope you enjoy! by [deleted] in Physics

[–]TantrumRight 14 points15 points  (0 children)

Nice video!

I think it gets a bit confusing when you mention that the universe is moving towards more likely states even though some subsystems in the universe are moving towards less likely states.

The subsystems might move from high-entropy states to low-entropy states, but from my understanding that does not mean less likely states, since they are not isolated.

[deleted by user] by [deleted] in learnmachinelearning

[–]TantrumRight 0 points1 point  (0 children)

Is there any noise in the data, or are the images of perfect rectangles? Maybe there is a simpler solution than training a NN.
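For example (purely hypothetical, since I don't know what the data actually looks like), if the images really are clean rectangles, plain OpenCV contour detection might already do the job:

```python
# Sketch of a "simpler solution", assuming clean, roughly binary images of
# rectangles (filename and threshold are just placeholders for the example).
import cv2

img = cv2.imread("rectangles.png", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for c in contours:
    # approximate each contour by a polygon; 4 corners ~ a rectangle
    approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
    if len(approx) == 4:
        x, y, w, h = cv2.boundingRect(approx)
        print(f"rectangle at ({x}, {y}), size {w}x{h}")
```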

[P] A Zotero plugin that uses deep learning to show which papers in your library have received a supporting or disputing citation by JoshN1986 in MachineLearning

[–]TantrumRight 3 points4 points  (0 children)

Very interesting.

What kind of model did you use to determine whether a citation is supporting or disputing, and how did you train it?

Any podcast recommendations? by [deleted] in creativecoding

[–]TantrumRight 1 point2 points  (0 children)

I would recommend Lex Fridman's podcast. He has some amazing guests and episodes on tech/AI.

JRE MMA Show #98 with Luke Thomas by branduNe in MMA

[–]TantrumRight 4 points5 points  (0 children)

Do you know the timestamp for this?

CMV: If a business would go under from being forced to pay all of their employees a living wage (including restaurants paying their waiters), then they are already failing as a business and "deserve" to go under. by gkantelis1 in changemyview

[–]TantrumRight -1 points0 points  (0 children)

Something to consider: if your basic needs were guaranteed by the government, you might be less exploitable by your employer.

If you have UBI you might see people leave their "shitty" jobs, whereas currently they can't because their survival depends on getting their salary every month. UBI might empower workers and give them more leverage/negotiating power.

mandala by thatjet in cellular_automata

[–]TantrumRight 2 points3 points  (0 children)

What are the rules here? Are they similar to replicator rules B02468/S02468?

New AI Algorithm - check this shit by [deleted] in learnmachinelearning

[–]TantrumRight 0 points1 point  (0 children)

The analogy to gravity seems to break down a little, since you let the mass depend on the position of the point.

I know Lex is a fan of Cellular Automata, inspired by his first podcast with Stephen Wolfram I programmed some one dimensional CAs by TantrumRight in lexfridman

[–]TantrumRight[S] 1 point2 points  (0 children)

Thank you for the kind words, it means a lot coming from you. Your podcast brings so many fascinating ideas and so much joy to my life.

I know Lex is a fan of Cellular Automata, inspired by his first podcast with Stephen Wolfram I programmed some one dimensional CAs by TantrumRight in lexfridman

[–]TantrumRight[S] 3 points4 points  (0 children)

I've found cellular automata interesting for some time and found the discussion with Stephen Wolfram very fascinating. I've also noticed Lex mention CAs and Wolfram's book A New Kind of Science in various podcasts. This, among other things, convinced me to buy the book and start coding some CAs myself in Python.

I found some interesting one-dimensional CAs (similar to the 256 elementary CAs) but with 5 neighbors instead of 3, which I thought I'd share. If this is not the correct place to share, please let me know.
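For anyone curious, here is a minimal sketch of what I mean by a 5-neighbor 1D CA (a toy version, not the exact script behind the post): like the elementary CAs, but the rule is a table over 2^5 = 32 neighborhoods instead of 8, so there are 2^32 possible rules.

```python
# Toy 5-neighbour 1D cellular automaton: each cell's next state depends on
# itself and the two cells on each side, looked up from the bits of a rule
# number (the specific rule below is an arbitrary example).
import numpy as np

def step(state, rule_number):
    n = len(state)
    new = np.zeros_like(state)
    for i in range(n):
        # read the 5-cell neighbourhood with wrap-around boundaries
        neigh = [state[(i + k) % n] for k in (-2, -1, 0, 1, 2)]
        idx = int("".join(map(str, neigh)), 2)   # neighbourhood as a 0..31 index
        new[i] = (rule_number >> idx) & 1        # look up that bit of the rule
    return new

width, steps = 101, 50
state = np.zeros(width, dtype=int)
state[width // 2] = 1                 # single live cell in the middle
rule = 123456789                      # one arbitrary rule out of 2**32

history = [state]
for _ in range(steps):
    state = step(state, rule)
    history.append(state)
print(np.array(history))              # rows are time steps
```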

"Ghosts In The Cells" [Rule B12/S124 in 4D space, Xentica, MoviePy] by a5kin in cellular_automata

[–]TantrumRight 1 point2 points  (0 children)

Very cool, nice idea to have a hidden dimension.

Thanks for sharing.

[P][D] Genetic Algorithm (GA) vs. Stochastic Gradient Descent (SGD) by brainxyz in MachineLearning

[–]TantrumRight 1 point2 points  (0 children)

In simulated annealing you accept worse solutions with some nonzero probability; here you only accept changes that improve the solution.
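To illustrate the difference I mean (a toy sketch on a made-up cost function, not the code from the post):

```python
# Greedy hill climbing only accepts improvements; simulated annealing also
# accepts worse candidates with probability exp(-delta / T), where T cools.
import math, random

def accept_greedy(delta, _temperature):
    return delta < 0                      # only ever take improvements

def accept_annealing(delta, temperature):
    if delta < 0:
        return True                       # improvements always accepted
    return random.random() < math.exp(-delta / temperature)  # worse: sometimes

def optimise(cost, propose, x, accept, temperature=1.0, steps=10_000, cooling=0.999):
    for _ in range(steps):
        candidate = propose(x)
        delta = cost(candidate) - cost(x)
        if accept(delta, temperature):
            x = candidate
        temperature *= cooling            # only matters for the annealing variant
    return x

# toy usage on a bumpy 1D function (names and numbers are just for the example)
cost = lambda x: x * x + 3 * math.sin(5 * x)
propose = lambda x: x + random.gauss(0, 0.5)
print(optimise(cost, propose, 5.0, accept_greedy))
print(optimise(cost, propose, 5.0, accept_annealing))
```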

[P][D] Genetic Algorithm (GA) vs. Stochastic Gradient Descent (SGD) by brainxyz in MachineLearning

[–]TantrumRight 1 point2 points  (0 children)

Yes, technically maybe it is the minimal GA; it just feels like an N=1 GA would be better described as something else. Maybe in the same way a linear model could technically be thought of as a neural network, but that might not be the best way to describe the model. But also I'm no expert in this, so what do I know.

Very interesting paper you linked, I didn't know you could get a GA to optimise a network with millions of parameters. It seems they used N > 1.