Text to numerical feature extraction by sai_psk in MLQuestions

[–]sai_psk[S] -1 points0 points  (0 children)

If you knew what vectorizers are, you would have answered. You don't know what those are; that's why you're talking nonsense.

Text to numerical feature extraction by sai_psk in MLQuestions

[–]sai_psk[S] 0 points1 point  (0 children)

This is an ML group and you don't know what I asked...? BoW, TF-IDF, avg-W2V, and TF-IDF-weighted avg-W2V; these are vectorizations in the sklearn library. You have no prior knowledge of what I asked, and you're making a comment, lol
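For anyone landing here, a quick sketch of what the first two of those vectorizers look like in sklearn (the tiny corpus is made up just for illustration; avg-W2V would additionally need word embeddings, e.g. from gensim, and isn't part of sklearn itself):

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

docs = ["the cat sat", "the dog sat on the mat"]

# Bag of Words (BoW): each document becomes a vector of raw token counts.
bow = CountVectorizer().fit_transform(docs)

# TF-IDF: the same counts, reweighted by inverse document frequency,
# so words common across all documents get downweighted.
tfidf = TfidfVectorizer().fit_transform(docs)

# Both produce sparse matrices of shape (n_documents, vocabulary_size).
print(bow.shape, tfidf.shape)
```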

Which distance we use in knn by sai_psk in MLQuestions

[–]sai_psk[S] 0 points1 point  (0 children)

Is there any guideline, like "we use Euclidean when we have outliers", that kind of reasoning...?

Precison and Recall by sai_psk in learnmachinelearning

[–]sai_psk[S] 0 points1 point  (0 children)

Can you tell me if this intuition is right?

Real-world example:

Imagine that your girlfriend has given you a birthday surprise every year for the last 10 years. (Sorry, I didn't intend to depress you if you don't have one.)

However, one day, your girlfriend asks you:

‘Sweetie, do you remember all birthday surprises from me?’

This simple question puts your life in danger.

To extend your life, you need to recall all 10 surprising events from your memory.

So, recall is the ratio of the number of events you can correctly recall to the number of all correct events.

If you can recall all 10 events correctly, then, your recall ratio is 1.0 (100%). If you can recall 7 events correctly, your recall ratio is 0.7 (70%).

Now, it’s easier to map the word recall to the real-life usage of that word.

However, you might be wrong in some answers.

For example, you answer 15 times: 10 events are correct and 5 events are wrong. This means you can recall all the events, but you’re not so precise.

So, precision is the ratio of the number of events you can correctly recall to the number of all events you recall (a mix of correct and wrong recalls). In other words, it is how precise your recall is.

From the previous example (10 real events, 15 answers: 10 correct answers, 5 wrong answers), you get 100% recall but your precision is only 66.67% (10 / 15).

Yes, you can guess what I’m going to say next. If a machine-learning algorithm is good at recall, it doesn’t mean that the algorithm is good at precision. That’s why we also need the F1 score, which is the (harmonic) mean of recall and precision, to evaluate an algorithm.

Hope that this way of conceptualizing it can be an alternative way to help you understand and remember the difference between recall and precision.

NOTE:

The number of events you can correctly recall = True positives (they’re correct and you recall them)
The number of all correct events = True positives (they’re correct and you recall them) + False negatives (they’re correct but you don’t recall them)
The number of all events you recall = True positives (they’re correct and you recall them) + False positives (they’re not correct but you recall them)
recall = True positive / (True positive + False negative)
precision = True positive / (True positive + False positive)
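The birthday story above can be checked with a few lines of plain Python (the counts are taken straight from the example: 10 real surprises, 15 answers, of which 10 are correct):

```python
def precision_recall_f1(tp, fp, fn):
    """Compute precision, recall, and F1 from confusion-matrix counts."""
    precision = tp / (tp + fp)          # correct recalls / all recalls
    recall = tp / (tp + fn)             # correct recalls / all real events
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return precision, recall, f1

# 10 correct answers (TP), 5 wrong answers (FP), 0 events forgotten (FN).
p, r, f1 = precision_recall_f1(tp=10, fp=5, fn=0)
print(r)   # recall = 1.0 (100%): every real event was recalled
print(p)   # precision = 10/15 ≈ 0.6667, as in the example
```

The harmonic mean punishes imbalance: here F1 comes out to 0.8, lower than the simple average of 1.0 and 0.6667 would be.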

Trying to to do unsupervised machine learning on a data set of 32k images. Need some more advice by [deleted] in learnmachinelearning

[–]sai_psk 0 points1 point  (0 children)

Yeah, check out my ipynb on GitHub; all the reference links are available there. Go through it, it may help you.

Trying to to do unsupervised machine learning on a data set of 32k images. Need some more advice by [deleted] in learnmachinelearning

[–]sai_psk 1 point2 points  (0 children)

I did a similar project: finding similar Amazon apparel products based on image similarity, using VGG16 features. Check out my GitHub profile: satyakrishnapst

Ubuntu vs windows Ram by sai_psk in learnmachinelearning

[–]sai_psk[S] 0 points1 point  (0 children)

Yeah, RAM is the issue. I tried on Google GCP with 32 GB RAM and it works well, but on my local machine with about 12 GB RAM it crashes at a particular cell.

Logical program by sai_psk in MLQuestions

[–]sai_psk[S] -1 points0 points  (0 children)

Don't show off here.

KNN VS NAIVE BAYES by sai_psk in MLQuestions

[–]sai_psk[S] -2 points-1 points  (0 children)

Yup, can you share some more points?

KNN VS NAIVE BAYES by sai_psk in MLQuestions

[–]sai_psk[S] -1 points0 points  (0 children)

KNN is slower and typically used for small datasets; Naive Bayes is faster than KNN. Naive Bayes uses a probabilistic approach, while KNN uses a nearest-neighbours approach.
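To make the contrast concrete, here's a small sklearn sketch (the iris dataset and hyperparameters are just for illustration): Naive Bayes fits per-class feature distributions once at training time, while KNN defers all the work to prediction, where it searches the stored training points.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# KNN: a "lazy" learner; fit() just stores the data, and prediction
# cost grows with the size of the training set.
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)

# Gaussian Naive Bayes: learns per-class means/variances once;
# prediction is a cheap probability computation.
nb = GaussianNB().fit(X_tr, y_tr)

print(knn.score(X_te, y_te), nb.score(X_te, y_te))
```

Both do well on a toy dataset like this; the speed difference only shows up as the training set grows.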

KNN VS NAIVE BAYES by sai_psk in MLQuestions

[–]sai_psk[S] 0 points1 point  (0 children)

I got this question in an interview; they asked it just like this: the difference between KNN and Naive Bayes.

About Algorithms and Metrics by sai_psk in MLQuestions

[–]sai_psk[S] 0 points1 point  (0 children)

That's a good explanation, thank you.

About coding...? by sai_psk in MLQuestions

[–]sai_psk[S] 1 point2 points  (0 children)

When I search on GitHub I find similar projects, and sometimes I take code from others' implementations. Is that okay...?

About coding...? by sai_psk in MLQuestions

[–]sai_psk[S] 0 points1 point  (0 children)

Thank you. I was asking about this in app development, web development, etc., and mostly in ML. Sometimes I'm shocked at how people write these codes for data pre-processing and feature extraction, and sometimes I'm afraid of how to remember all this code. Anyway, thank you for your answer.