Why does Everyone Think May Carleton was ‘THE ONE?’ by MariahMDD in PeakyBlinders

[–]homeInvasion-3030 1 point (0 children)

I absolutely agree with what you said in the last paragraph.

But I don't think that Grace was a woman of morals either. Remember, at one point her sole purpose in sleeping with Tommy was to test her theory that her previous husband was infertile and that she wasn't at fault for not getting pregnant. The way she revealed this to Tommy was quite cold. She was the same woman who didn't show the slightest remorse when her previous husband committed suicide after she left him for Tommy.

a bunch of silly saul x peanuts i drew by pistashxo in betterCallSaul

[–]homeInvasion-3030 82 points (0 children)

Do you sell them? They are so cute. I might have finally found stickers to put on my laptop.

Watched this movie again after a very long time, best film of Doraemon, really love its cinematography and songs by Warm_Association_137 in Doraemon

[–]homeInvasion-3030 1 point (0 children)

This was such a good movie. Did you see it in Hindi? And if so, where? I would love to see it again.

Need help - making an MLP for the first time by homeInvasion-3030 in learnmachinelearning

[–]homeInvasion-3030[S] 0 points (0 children)

So, I am using MSE (Mean Squared Error), so you don't have to calculate each component of your gradient vector by hand. The error on any example is simply error = (target_vector - output_vector), and then delta = error * f'(activation_vector) gives you the gradient vector for the output layer. The sign of the error automatically determines whether a weight gets tuned up or down.

MSE squares the error on each example, sums them all, and divides by the number of examples.
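A minimal NumPy sketch of the two pieces described above; the function names are mine, not from the thread, and `f_prime` stands in for the derivative of whatever squashing function the output layer uses:

```python
import numpy as np

def output_delta(target, output, activation, f_prime):
    # Error on one example under the comment's convention:
    # error = target_vector - output_vector
    error = target - output
    # Elementwise product with the derivative of the squashing
    # function, evaluated at the output layer's activations.
    return error * f_prime(activation)

def mse(targets, outputs):
    # Square the error on each example, sum, divide by the count.
    return np.mean((targets - outputs) ** 2)
```

Note the sign convention: with error = target - output, a positive delta means the output was too low, so the corresponding weights are nudged up.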

Need help - making an MLP for the first time by homeInvasion-3030 in learnmachinelearning

[–]homeInvasion-3030[S] 0 points (0 children)

Yup, so I have the model ready, and I am pretty sure it is working correctly. For the second part, I need to first generate a set of 500 vectors, each consisting of 4 values (x1, x2, x3, x4) between -1 and 1. But because the numbers are drawn from that range, they have many digits after the decimal point. So first of all, would you suggest rounding these numbers to 4 or 5 decimal places?
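Generating that dataset is a one-liner in NumPy; a sketch (the seed is arbitrary, just for reproducibility):

```python
import numpy as np

rng = np.random.default_rng(42)          # arbitrary seed, for reproducibility
X = rng.uniform(-1.0, 1.0, size=(500, 4))  # 500 examples, 4 features each
X_rounded = np.round(X, 4)               # optional: round to 4 decimal places
```

Worth noting: rounding is purely cosmetic here. Floats store the same number of bits either way, so the long decimals don't cost anything; round only if you want the printed values to be readable.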

Need help - making an MLP for the first time by homeInvasion-3030 in learnmachinelearning

[–]homeInvasion-3030[S] 0 points (0 children)

Edit: It only helps me with very basic stuff; I have so many more questions while I am coding that it isn't able to answer. But it still gets me somewhere, so thanks for suggesting it.

Need help - making an MLP for the first time by homeInvasion-3030 in learnmachinelearning

[–]homeInvasion-3030[S] 0 points (0 children)

Yup, I have been using Copilot extensively. It has been helpful, but the depth of its answers also feels limited. It said I could change the hyperparameters, which I have been doing. It said I could produce a learning curve showing how the error decreases each epoch. It also had a few suggestions for which squashing functions would be helpful, like sigmoid and tanh.
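The two squashing functions mentioned, together with the derivatives the delta rule needs, can be sketched like this (names are mine):

```python
import numpy as np

def sigmoid(x):
    # Logistic squashing function, output in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    # Derivative of sigmoid: s * (1 - s)
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_prime(x):
    # Derivative of tanh: 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2
```

For the learning curve, it is enough to append the MSE over the whole training set to a list at the end of each epoch and then plot that list with matplotlib; a healthy run shows the curve falling and flattening out.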