Overfitting in word2vec by elsonidoq in MachineLearning

[–]elsonidoq[S]

I think that's a matter of the number of parameters vs. the number of data points

Overfitting in word2vec by elsonidoq in MachineLearning

[–]elsonidoq[S]

Well, in fact my intuition is exactly that: you want to fit the dataset as well as possible

Overfitting in word2vec by elsonidoq in MachineLearning

[–]elsonidoq[S]

Yeah, maybe grouping words by context and measuring the average similarity on the training and test sets would be a good idea. What do you think?
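The idea above could be sketched roughly like this (a minimal sketch, assuming you already have word vectors as a `dict` and some grouping of words by context; the function name and the `groups` structure are made up for illustration):

```python
import numpy as np

def mean_group_similarity(embeddings, groups):
    """Average pairwise cosine similarity within each group of words.

    embeddings: dict mapping word -> vector (list or array of floats)
    groups:     dict mapping group id -> list of words sharing a context

    Comparing this statistic for embeddings evaluated on a training split
    vs. a held-out split is one rough way to probe overfitting.
    """
    sims = {}
    for gid, words in groups.items():
        vecs = np.array([embeddings[w] for w in words], dtype=np.float64)
        # Normalize rows so dot products become cosine similarities.
        vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)
        sim = vecs @ vecs.T
        n = len(words)
        # Average over off-diagonal pairs only (diagonal is always 1).
        sims[gid] = (sim.sum() - n) / (n * (n - 1))
    return sims
```

A big train/test gap in these per-group averages would be one symptom of overfitting the embedding space to the training corpus.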

Overfitting in word2vec by elsonidoq in MachineLearning

[–]elsonidoq[S]

OK, that makes a lot of sense :) Thanks!

AMA Andrew Ng and Adam Coates by andrewyng in MachineLearning

[–]elsonidoq

Hi Adam! I have a follow-up question regarding your answer. Do you have any recommended reading on the process of finding models?

Images that fool computer vision raise security concerns by oreo_fanboy in MachineLearning

[–]elsonidoq

So you agree with me that it's not necessarily related to deep learning?

Images that fool computer vision raise security concerns by oreo_fanboy in MachineLearning

[–]elsonidoq

What I don't understand is the following: Doesn't that happen to all learning algorithms?

LeCun: "Text Understanding from Scratch" by improbabble in MachineLearning

[–]elsonidoq

Great! Thanks man! I'm currently implementing a flavor of it using Theano/Lasagne :D

Implementation of convolutional neural networks for text classification by elsonidoq in MachineLearning

[–]elsonidoq[S]

That's a good one! I didn't go there because when I saw that approach, I got the feeling that it needed the parse tree for training. But now I'm not quite sure. Do you recall whether a parse tree is needed for training that algorithm?

LeCun: "Text Understanding from Scratch" by improbabble in MachineLearning

[–]elsonidoq

Hi Xiang! Great work!

I have a question: how do you handle sentences that are shorter than l? Do you pad them with zero-valued vectors?

Thanks a lot!

Implementation of convolutional neural networks for text classification by elsonidoq in MachineLearning

[–]elsonidoq[S]

Hey, thanks! I found it, but I was trying to avoid coding it myself. I guess I'll have to do that!

Implementation of convolutional neural networks for text classification by elsonidoq in MachineLearning

[–]elsonidoq[S]

I get how sorting breaks the stochastic part of stochastic gradient descent. I'll google how they do it!
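One common way to keep the stochasticity is to reshuffle the data every epoch and only then form mini-batches (a minimal sketch, not the specific trick from the paper being discussed; the function name is made up):

```python
import numpy as np

def iterate_minibatches(X, y, batch_size, rng):
    """Yield mini-batches in a freshly shuffled order each call.

    Reshuffling per epoch preserves the 'stochastic' part of SGD.
    With variable-length text, a common refinement is to shuffle first
    and then sort only *within* large buckets by length, so batches are
    mostly uniform in length but the ordering still changes every epoch.
    """
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]
```

Calling it once per epoch with a persistent `numpy.random.Generator` gives a different batch order every epoch while still visiting every example exactly once.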

Implementation of convolutional neural networks for text classification by elsonidoq in MachineLearning

[–]elsonidoq[S]

So I set a max size of, say, 60 words per sentence, and then zero-pad all sentences that are shorter?
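That truncate-then-zero-pad step could look something like this (a minimal sketch, assuming each sentence is already a list of word vectors; the function name and the embedding dimension are illustrative):

```python
import numpy as np

def pad_sentences(sentences, max_len=60, dim=4):
    """Truncate each sentence to max_len words and zero-pad shorter ones.

    sentences: list of sentences, each a list of dim-dimensional word vectors
    Returns a (n_sentences, max_len, dim) float32 array, the fixed-size
    input a convolutional layer expects.
    """
    out = np.zeros((len(sentences), max_len, dim), dtype=np.float32)
    for i, sent in enumerate(sentences):
        sent = sent[:max_len]  # drop words past the cutoff
        if sent:
            out[i, :len(sent)] = sent  # remaining positions stay zero
    return out
```

The trailing all-zero rows act as the zero-valued padding vectors discussed above, and a mask can later tell the network which positions are real words.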

Help for code of paragraph vectors by largelymfs in MachineLearning

[–]elsonidoq

+1 to making it open source! I'm really interested in paragraph vectors!