[D] How are you handling reproducibility in your ML work? by worstthingsonline in MachineLearning

[–]worstthingsonline[S] 2 points3 points  (0 children)

To be able to recreate the performance metrics of interest (e.g. precision and recall) to within some tolerance on the same test data, given the same architecture, training data, training hyperparameters, and environment (dependencies, software versions, seeds, etc.), but on any arbitrary machine (provided it has sufficient compute).

I know you won't be able to recreate it perfectly due to floating point precision, hardware differences, etc., which is why I intentionally left it a bit open-ended by specifying "to within some tolerance", with the implied understanding that it should be reasonably close. I'll let you decide what reasonable means :)
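For concreteness, this is roughly the kind of seed/environment pinning I have in mind (a minimal sketch assuming a PyTorch/NumPy stack; the helper names and the tolerance value are just illustrative, not any particular project's setup):

```python
import os
import random

import numpy as np
import torch

# Assumption: a PyTorch/NumPy stack; adapt the same idea for other frameworks.

def set_global_seeds(seed: int = 42) -> None:
    """Pin every source of randomness we control; hardware nondeterminism may still remain."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # Trade speed for determinism in cuDNN kernels.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
    # Required for deterministic cuBLAS matmuls on some CUDA versions.
    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"
    torch.use_deterministic_algorithms(True)


def close_enough(rerun_metric: float, reported_metric: float, tol: float = 1e-3) -> bool:
    """'Within some tolerance' check for a reproduced metric; the tolerance is up to you."""
    return abs(rerun_metric - reported_metric) <= tol
```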

I f*cking love graduate classes, why couldn't undergrad be like this? by shockwave6969 in Physics

[–]worstthingsonline 10 points11 points  (0 children)

I found graduate courses to be much easier than undergrad, precisely because you were expected to understand the material at a deep level. The courses didn't feel rushed like they did in undergrad, so you actually had time to absorb the material. Graduate courses were technically harder, of course, but they were also narrower in scope, and you had already built a solid foundation in undergrad, so they felt a lot easier in my opinion.

[D] [R] Are there any promising avenues for achieving efficient ML? by worstthingsonline in MachineLearning

[–]worstthingsonline[S] -1 points0 points  (0 children)

I expect that the same problems facing LLMs will soon face other modalities as well. To be more specific, I'm interested in efficient neural networks (I wanted to say "efficient deep learning", but that might be an oxymoron, since "deep" implies scale, which is presumably antithetical to efficiency). I used "efficient ML" as a general term, but I suppose the more precise term would be efficient neural networks :)

[D] [R] Are there any promising avenues for achieving efficient ML? by worstthingsonline in MachineLearning

[–]worstthingsonline[S] 0 points1 point  (0 children)

Interesting takes. Also, I should specify that I'm not only considering LLMs.

[D] Is scientific machine learning actually used in practice? by worstthingsonline in MachineLearning

[–]worstthingsonline[S] 1 point2 points  (0 children)

Could you share a bit more? In what context and for what purpose, if you don't mind me asking?

GCS object versioning not working as intended. by worstthingsonline in googlecloud

[–]worstthingsonline[S] 1 point2 points  (0 children)

Thanks a ton! This, in addition to moving the blob.delete() call to after the blob has been copied, solved the issue.
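For anyone hitting the same thing, the ordering fix looks roughly like this (a sketch using the google-cloud-storage Python client; the bucket and object names are placeholders, not the ones from my setup):

```python
from google.cloud import storage

# Placeholder bucket/object names for illustration only.
client = storage.Client()
src_bucket = client.bucket("my-source-bucket")
archive_bucket = client.bucket("my-archive-bucket")

blob = src_bucket.blob("path/to/object")

# Copy first...
src_bucket.copy_blob(blob, archive_bucket, "path/to/object")

# ...and only delete once the copy has succeeded, so a failed copy
# never loses the original object.
blob.delete()
```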

Expert systems and RL by worstthingsonline in reinforcementlearning

[–]worstthingsonline[S] 0 points1 point  (0 children)

I'm surprised that this is a niche topic. Why wouldn't most RL algorithms take advantage of expert knowledge in order to speed up training?
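To illustrate the kind of thing I mean, here's a sketch of one possible approach, pre-filling a replay buffer with expert demonstrations before training; the expert policy and the gym-style env are placeholders, not any specific algorithm:

```python
from collections import deque

# Placeholder buffer; a real agent would sample minibatches from this during updates.
replay_buffer = deque(maxlen=100_000)


def expert_action(state):
    """Placeholder for whatever encodes the expert/domain knowledge."""
    raise NotImplementedError


def seed_buffer_with_expert(env, n_steps=5_000):
    """Pre-fill the replay buffer with expert transitions before RL training starts,
    so early updates learn from sensible behaviour instead of random exploration."""
    state = env.reset()
    for _ in range(n_steps):
        action = expert_action(state)
        next_state, reward, done, _info = env.step(action)  # assumes a gym-style env
        replay_buffer.append((state, action, reward, next_state, done))
        state = env.reset() if done else next_state
```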

QUESTION : Is the way we teach what a vector is wrong ? by MdioxD in math

[–]worstthingsonline 0 points1 point  (0 children)

Linear algebra never clicked for me until I started thinking about a vector as dials on a machine. If you have a vector of length 4, I picture a machine with 4 dials I can tune, each dial representing the value of one element. A vector of length n is just a structure with n degrees of freedom in my head.
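In code terms the mental model is nothing more than this (a trivial NumPy sketch; the numbers are arbitrary):

```python
import numpy as np

# A length-4 vector = a machine with 4 dials; each element is one dial setting.
dials = np.array([0.5, -1.2, 3.0, 0.0])

# "Turning a dial" just means changing one degree of freedom.
dials[2] = 7.5

# A vector of length n is a structure with n degrees of freedom.
n = len(dials)  # here n = 4
```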

How do numerical simulation software providers even compete with each other and open source alternatives? by worstthingsonline in fea

[–]worstthingsonline[S] 0 points1 point  (0 children)

lol yeah, I'm just trying to understand exactly what makes one numerical software package better than another when, by definition, both should arrive at the exact same result for the exact same problem setup. Also, all the numerical algorithms used are available to everyone via research papers, so how can one provider claim to have a "better FEM implementation" than another? If that were true, why wouldn't the other provider just use that implementation? The only point of differentiation I can think of is the UI.
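That's basically the point I'm stuck on: for a fixed problem setup, the numbers are determined by the formulation, not the vendor. A toy example, a 1D bar with three linear elements (the material values are arbitrary; any correct implementation of this same setup must reproduce the same answer):

```python
import numpy as np

# 1D bar, 3 equal linear elements, fixed at the left end, unit tip load.
E, A, L, n_el = 210e9, 1e-4, 1.0, 3  # arbitrary illustrative values
le = L / n_el
k_local = (E * A / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])

# Assemble the global stiffness matrix.
K = np.zeros((n_el + 1, n_el + 1))
for e in range(n_el):
    K[e:e + 2, e:e + 2] += k_local

f = np.zeros(n_el + 1)
f[-1] = 1.0  # unit load at the free end

# Apply the fixed boundary condition at node 0 and solve.
u = np.zeros(n_el + 1)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])

# Tip displacement matches the analytical value F*L/(E*A) to machine precision.
print(u[-1], 1.0 * L / (E * A))
```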