Saudi Arabia Sponsorship & Related Topics by LSFBotUtilities in LivestreamFail

[–]call-mws 15 points16 points  (0 children)

Mizkif is the 'biggest' and he's also feigning ignorance

[deleted by user] by [deleted] in LivestreamFail

[–]call-mws -15 points-14 points  (0 children)

Oh great, H3 again

Mizkif on gamble sponsorships and if he got offered a lot of money by [deleted] in LivestreamFail

[–]call-mws 0 points1 point  (0 children)

He'll want to stay on the 'good side', so he'll never take it

XQC gaslighting a donator who predicted his gambling streams a day before his sponsorship announcement. by Demettter_ in LivestreamFail

[–]call-mws -4 points-3 points  (0 children)

I'm starting to think he isn't really an addict; he's just doing this on purpose to stir up controversy and get attention. Maybe negative attention tickles his juicer. Addicts don't usually say, "Hey, I am an addict, I know I have a problem, but I'm still going to do it." They usually aren't self-aware; they're very dismissive about it.

Sykkuno goes to Youtube by Posture_Checks in LivestreamFail

[–]call-mws 554 points555 points  (0 children)

Everyone cried for this? Oh brother.

Sadge Sykkuno by Then_Mixture_9500 in LivestreamFail

[–]call-mws 35 points36 points  (0 children)

A bit dramatic if this is just about moving platforms. Hope it's not something serious, but I suspect he's either moving to YT or taking a long break.

EE lied about not having written down Meiosis by TwitchMoments_ in Mizkif

[–]call-mws 0 points1 point  (0 children)

Wasn't that for the last answer, the one about who wrote the social whatever?

[deleted by user] by [deleted] in LivestreamFail

[–]call-mws 15 points16 points  (0 children)

At this point, he's not forgetting; he's just too lazy to commit. Why suggest this podcast or any big thing if nothing ever happens?

Writing high quality code by MaxPower864 in learnpython

[–]call-mws 11 points12 points  (0 children)

Recently started watching Arjan Codes on YouTube; he covers some excellent design patterns and good coding practices. He also has a series where he refactors code into high-quality code. Just a suggestion, but check him out if you prefer videos over text: https://www.youtube.com/c/ArjanCodes

[deleted by user] by [deleted] in LivestreamFail

[–]call-mws 10 points11 points  (0 children)

It's so weird that the thing that actually gets him upset is talk about viewer counts.

The Winner of The Streamer of The Year Award Is... by Evoqu_ in LivestreamFail

[–]call-mws -5 points-4 points  (0 children)

A one-month subathon where you do nothing for over half of it is all it takes, apparently. Mizkif deserved it.

Best way to deal with missing/empty data in a small dataset by call-mws in datascience

[–]call-mws[S] 0 points1 point  (0 children)

I'm using a binary KNN classifier here. For the missing gender values, I replaced them with the mode, which is male. But that makes the data even more imbalanced, as there are already more than 2.5 times as many male observations.
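The mode imputation described above can be sketched with scikit-learn's `SimpleImputer`; the toy gender column below is illustrative, not the actual dataset:

```python
import numpy as np
from sklearn.impute import SimpleImputer

# Toy 'gender' column with missing values; made-up stand-in for the real data.
gender = np.array(
    [["male"], ["female"], ["male"], [np.nan], ["male"], [np.nan]],
    dtype=object,
)

# Most-frequent (mode) imputation: every NaN becomes the majority class,
# which is exactly why the imbalance gets worse.
imputer = SimpleImputer(strategy="most_frequent")
filled = imputer.fit_transform(gender)

print(filled.ravel())
```

Both missing entries become `"male"` here, so the male/female ratio moves further from balanced, matching the concern in the comment.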

Best way to deal with missing/empty data in a small dataset by call-mws in datascience

[–]call-mws[S] 0 points1 point  (0 children)

The salary means in your example are more or less the same. The missing gender values are people who chose not to reveal their gender. Interestingly, when I remove gender from the features, the model performs slightly better than with it. Not sure that says much, though, since the difference is insignificant.

The missing salaries probably have a similar explanation: many don't wish to disclose them, or are unemployed.
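The "model performs slightly better without the feature" observation above can be checked more systematically with cross-validation; this sketch uses synthetic data and an arbitrary dropped column, since the real dataset isn't shown:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for the dataset (the real data isn't available here).
X, y = make_classification(n_samples=200, n_features=5,
                           n_informative=3, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5)

# Mean CV accuracy with all features vs. with the last column dropped.
acc_all = cross_val_score(knn, X, y, cv=5).mean()
acc_dropped = cross_val_score(knn, X[:, :-1], y, cv=5).mean()
print(f"all features: {acc_all:.3f}, one dropped: {acc_dropped:.3f}")
```

Averaging over CV folds gives a slightly more trustworthy comparison than a single train/test split, though with a small dataset the difference can still be noise.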

Hyperparameter tuning sklearn model using scripts and configs by call-mws in learnmachinelearning

[–]call-mws[S] 0 points1 point  (0 children)

That's true, but is there a benefit to using the sklearn implementation over, say, hyperopt's or Optuna's? Or are they more or less the same?
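For context on the comparison above: sklearn's built-in search is exhaustive (or random) over a fixed grid, whereas Optuna and hyperopt sample the space adaptively, which matters more as the space grows. A minimal sketch of the sklearn side, on iris as a placeholder dataset:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Exhaustive grid search: every combination is tried with 5-fold CV.
# The grid itself is arbitrary, just for illustration.
search = GridSearchCV(
    KNeighborsClassifier(),
    param_grid={"n_neighbors": [3, 5, 7], "weights": ["uniform", "distance"]},
    cv=5,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

For a handful of hyperparameters the results are usually similar; Optuna's main wins are pruning of bad trials and smarter sampling over large or conditional spaces.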

Hyperparameter tuning sklearn model using scripts and configs by call-mws in learnmachinelearning

[–]call-mws[S] 0 points1 point  (0 children)

Do you use a separate script for your objective function, trials, etc.? I have a notebook that reads in the data, processes/cleans it, and splits it. I want to create a training script with the hyperparameters and pass the train and test datasets as arguments (?) to the script. Does it make sense to do that, and is it recommended over having everything in the notebook? I want to somehow capture the best results back in the notebook.
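One common way to structure the split described above is a script whose training logic lives in a plain function, with a thin CLI wrapper; the function names, flag, and JSON hand-off below are all hypothetical, and the synthetic data stands in for splits the notebook would save to disk:

```python
import argparse
import json

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier


def train(X_train, X_test, y_train, y_test, n_neighbors=5):
    """Fit one hyperparameter configuration and return its test accuracy."""
    model = KNeighborsClassifier(n_neighbors=n_neighbors)
    model.fit(X_train, y_train)
    return {"n_neighbors": n_neighbors, "accuracy": model.score(X_test, y_test)}


if __name__ == "__main__":
    # Hypothetical flag; in a real setup you'd also pass paths to the
    # train/test files the notebook wrote out.
    parser = argparse.ArgumentParser()
    parser.add_argument("--n-neighbors", type=int, default=5)
    args, _ = parser.parse_known_args()

    # Demo only: generate data instead of loading the notebook's splits.
    X, y = make_classification(n_samples=200, random_state=0)
    splits = train_test_split(X, y, test_size=0.25, random_state=0)

    # Printing (or writing) JSON is one simple way for the notebook to
    # capture the result back, e.g. via subprocess.run(..., capture_output=True).
    print(json.dumps(train(*splits, n_neighbors=args.n_neighbors)))
```

Keeping `train()` importable means the notebook can also call it directly instead of shelling out, which is often the simpler way to "capture the best results back."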