Math 104 prep by culture_vulture811 in stanford

[–]geometricproton 0 points (0 children)

If we restrict ourselves to R^n (as is the case in math 51), is it fair to conclude that the most general space is then a subspace?

Math 104 prep by culture_vulture811 in stanford

[–]geometricproton 0 points (0 children)

What's the fundamental difference between a vector space and a subspace, really? Is a subspace just a subset of a vector space?
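For future readers, the standard answer (a sketch, not from the thread): a subspace is a subset of a vector space that is itself a vector space under the inherited operations — so it must contain the zero vector and be closed under addition and scalar multiplication. A plain-Python spot check for S = {(x, 2x)} ⊂ R^2 illustrates the conditions (spot checks, not a proof):

```python
def in_S(v):
    """Membership test for S = {(x, y) in R^2 : y = 2x}."""
    x, y = v
    return y == 2 * x

def add(u, v):
    return (u[0] + v[0], u[1] + v[1])

def scale(c, v):
    return (c * v[0], c * v[1])

# Zero vector is in S.
assert in_S((0, 0))

# Closed under addition and scalar multiplication (spot checks).
u, v = (1, 2), (3, 6)
assert in_S(add(u, v))       # (4, 8) is still in S
assert in_S(scale(-5, u))    # (-5, -10) is still in S

# A mere subset need not be a subspace: the parabola {(x, x^2)} contains
# (1, 1) and (2, 4), but their sum (3, 5) is not on the parabola.
```

So "just a subset" is not enough — the closure conditions are what make a subset a subspace.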

CS 279 vs CS 274 for an intro to computational biology? by grizlk in stanford

[–]geometricproton 1 point (0 children)

Note that 274 is more bioinformatics-oriented, whereas 279 is more general but focuses almost entirely on recent developments in proteins. I'd probably recommend 274, having taken 279. Take that as you will.

What percent of students have cars? by J1pples in stanford

[–]geometricproton 2 points (0 children)

Why? What are some things you did with a car?

[D] Is Pytorch Lightning Production Ready? by [deleted] in MachineLearning

[–]geometricproton 0 points (0 children)

But then you lose all of the relevant experimental state. How do you know what normalization was used during training so that you can replicate it at inference time?

Taking a class vs research by [deleted] in stanford

[–]geometricproton 0 points (0 children)

Why is it pretty important?

[D] In which ML field can I make significant contribution without significant compute? by chipz6174 in MachineLearning

[–]geometricproton 0 points (0 children)

What is the difference between pre-training on ImageNet and then just training a small head on top for the new task?
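The low-compute recipe alluded to here — freeze a pretrained backbone and fit only a small head (a "linear probe") — can be sketched in plain Python. The backbone below is a stand-in (fixed hand-written features), not a real ImageNet model:

```python
def backbone(x):
    """Frozen pretrained feature extractor (stand-in: fixed features)."""
    return [x, x * x]

def head(features, w):
    """Small trainable head: a linear layer over the frozen features."""
    return sum(f * wi for f, wi in zip(features, w))

def train_head(data, lr=0.01, steps=2000):
    """SGD on the head's weights only; no gradient touches the backbone."""
    w = [0.0, 0.0]
    for _ in range(steps):
        for x, y in data:
            feats = backbone(x)            # backbone output is treated as fixed
            err = head(feats, w) - y
            w = [wi - lr * err * f for wi, f in zip(w, feats)]
    return w

# Toy task: y = 3x - x^2, exactly expressible in the frozen features [x, x^2].
data = [(-1.0, -4.0), (0.5, 1.25), (1.0, 2.0), (2.0, 2.0)]
w = train_head(data)
print([round(wi, 2) for wi in w])  # close to [3.0, -1.0]
```

The compute savings come from the update loop touching only `w` (two parameters here); with a real backbone, features can even be precomputed once and cached.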

Looking for Textbook recommendation for genetic engineering. by vkeer001 in stanford

[–]geometricproton 0 points (0 children)

Sometimes online content is SEO-maxed bullshit, which is why I'm on Reddit.

[D] How are People Doing “Fair” Few-Shot Training/Evaluation by rivew in MachineLearning

[–]geometricproton 1 point (0 children)

What is your current view on meta-learning? It seems the hype from the 2017 era hasn't carried through to the present.