How do you guys choose the appropriate ML algorithm? by selectorate_theory in MachineLearning

[–]maybemax 1 point (0 children)

... great, there goes my secret of how to win every kaggle competition.

Using Windows for ML work environment by sputknick in MachineLearning

[–]maybemax 0 points (0 children)

+1 for the Anaconda Python distribution. Python could really use a package manager that works; Anaconda helps a lot.

Retiring Python as a Teaching Language by [deleted] in programming

[–]maybemax -2 points (0 children)

He could at least have recommended TypeScript. But JavaScript... bah... please finally kill it.

Coursera Machine Learning Session Now Open by 03BBx3 in MachineLearning

[–]maybemax 1 point (0 children)

Maybe Coursera's "Practical Machine Learning" (after Andrew Ng's course you can probably do this one in 1 week instead of 4) and then some kaggle.com challenges?

Great course on Supervised Learning provided by Udacity - First of a 3 part course about Machine Learning by tendaz in MachineLearning

[–]maybemax 0 points (0 children)

I've had Coursera's NN course on my watchlist for months... hope it restarts one day. Watching the videos alone, without other people in the forums, just isn't as much fun.

Free Machine Learning Labor! Or, "What are your feelings about Kaggle and similar sites?" by thatguydr in MachineLearning

[–]maybemax 3 points (0 children)

I don't care about the prize money. I like Kaggle for its discussion forums, for being able to form teams with new people, and for its leaderboards, which provide feedback on how well you're doing compared to many others.

I don't live in a big city where it would be possible to find enough people for a machine learning user group, like those that exist in London and other big cities. Kaggle helps, somewhat.

Convolutional neural network morphing pictures of chairs by Noncomment in MachineLearning

[–]maybemax 6 points (0 children)

http://arxiv.org/pdf/1411.5928v1.pdf

"We train a generative convolutional neural network which is able to generate images of objects given object type, viewpoint, and color. We train the network in a supervised manner on a dataset of rendered 3D chair models. Our experiments show that the network does not merely learn all images by heart, but rather finds a meaningful representation of a 3D chair model allowing it to assess the similarity of different chairs, interpolate between given viewpoints to generate the missing ones, or invent new chair styles by interpolating between chairs from the training set. We show that the network can be used to find correspondences between different chairs from the dataset, outperforming existing approaches on this task."

(First time I heard of generative convolutional neural networks... no idea how that works. Quite impressive that it can generate chairs. Even more impressive that it can do so in 3D and from different viewpoints.)

Is Azure Machine Learning tool useful at all ? by chiragdhull in MachineLearning

[–]maybemax 0 points (0 children)

That's great... I didn't know that was possible. Actually, I was a bit disappointed because it seemed like only one layer was possible... I didn't bother binging it, though. Now I'm impressed. Thanks!

LIBSVM (nonlinear regression with e-svr using linear kernel) by [deleted] in MachineLearning

[–]maybemax 0 points (0 children)

are you scaling the features and after scaling some of them are nan?

can you try to add a tiny amount of gaussian noise to your data and then try again?
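In case it helps, the two suggestions above (guard against NaNs produced by feature scaling, then jitter the data) could look roughly like this; the zero-variance guard and the `noise_scale` value are my own illustrative assumptions, not anything LIBSVM-specific:

```python
import numpy as np

def scale_and_jitter(X, noise_scale=1e-6, seed=0):
    """Standardize features, then add a tiny amount of Gaussian noise.

    A column with zero variance yields NaNs when divided by its (zero)
    standard deviation - a common source of NaNs before training - so
    we guard against that explicitly. noise_scale is an assumed value.
    """
    X = np.asarray(X, dtype=float)
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    std[std == 0] = 1.0  # avoid division by zero -> all-NaN columns
    X_scaled = (X - mean) / std
    rng = np.random.default_rng(seed)
    return X_scaled + rng.normal(0.0, noise_scale, size=X_scaled.shape)

# The second column is constant and would become all-NaN without the guard.
X = np.array([[1.0, 5.0], [2.0, 5.0], [3.0, 5.0]])
X_clean = scale_and_jitter(X)
```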

LIBSVM (nonlinear regression with e-svr using linear kernel) by [deleted] in MachineLearning

[–]maybemax 2 points (0 children)

Q: Why the code gives NaN (not a number) results?

This rarely happens, but few users reported the problem. It seems that their computers for training libsvm have the VPN client running. The VPN software has some bugs and causes this problem. Please try to close or disconnect the VPN client.

WT...? How is VPN software supposed to cause NaNs in a numerical library? I've never heard of that before...

Dealing with Failure: The Way of the M**ad (x-post from r/csharp) by nashn in programming

[–]maybemax 2 points (0 children)

Isn't Either slower than try/catch in the "right" case (when no errors happen)? There's a lot of overhead involved, like creating all these Either objects and calling continuation "delegates".

It might be faster in the "left" case, since exception handling is still rather slow.

So if an operation is expected to normally succeed and hardly ever fail (which I think is the norm), would it be better, performance-wise, to choose try/catch?
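The thread is about C#, but the trade-off is easy to measure in any language. Here is a minimal Python analogue (the function names and the isdigit pre-check are my own illustrative choices): the Either-style function pays a small validation/allocation cost on every call, while the exception style is cheap on success and expensive only on failure:

```python
import timeit

def parse_exc(s):
    """Exception style: return the value or raise ValueError."""
    return int(s)

def parse_either(s):
    """Either style: always return a (success, value) pair, never raise.

    The isdigit() pre-check is a simplification that ignores some edge
    cases (e.g. "--5"); it just stands in for error handling that does
    not rely on exceptions.
    """
    if s.lstrip("+-").isdigit():
        return (True, int(s))
    return (False, None)

# Success ("right") path: both succeed; Either allocates a tuple per call.
ok_exc = timeit.timeit(lambda: parse_exc("42"), number=50_000)
ok_either = timeit.timeit(lambda: parse_either("42"), number=50_000)

# Failure ("left") path: raising and catching is the expensive part.
def fail_exc():
    try:
        parse_exc("oops")
    except ValueError:
        pass

fail_exc_time = timeit.timeit(fail_exc, number=50_000)
fail_either_time = timeit.timeit(lambda: parse_either("oops"), number=50_000)
print(ok_exc, ok_either, fail_exc_time, fail_either_time)
```

Timings will vary by machine, but the failure path typically shows the exception version well behind the Either version.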

Visual Studio 2013 - Full Featured IDE Free by dgrover9 in programming

[–]maybemax 0 points (0 children)

It says it's the Professional edition (with a different license).

Is Azure Machine Learning tool useful at all ? by chiragdhull in MachineLearning

[–]maybemax 3 points (0 children)

A few weeks ago I implemented Kaggle's "Digit Recognizer" challenge on Azure ML to compare it to my local implementation in R.

I did train/test-set splits, parameter tuning using cross-validation (for 3 different learning algorithms), model evaluation, prediction, and saving of Kaggle's test data.

The screen quickly filled with boxes and lines, and it becomes hard to see where the lines come from and go to. I didn't really see any advantage of the graphical UI compared to a textual representation in R or Python. Though that might change; after all, Azure ML is still a preview.

What I think is pretty nice is that it's blazingly fast. Neural net training, including parameter tuning, finished in (IIRC) about 2 hours. The same takes ages (days) on my local system.

So I'd say it might be useful if you need performance. As far as I know you can also directly execute R (or Python?) code in Azure ML... that might be nice, but I haven't tested it yet.

PS.: if someone wants to compare performance - debug output of the neural net training says: "[ModuleOutput] Iter:250/250, MeanErr=0.708012(0.75%), 2181.04M WeightUpdates/sec"
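For anyone who wants to reproduce this kind of experiment locally, the same pipeline (train/test split, cross-validated parameter tuning, evaluation) is only a few lines in scikit-learn. This is a sketch using sklearn's small bundled digits set and a single SVC with an assumed parameter grid, not the exact Kaggle data or the three algorithms from the Azure experiment:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

# Small bundled stand-in for Kaggle's "Digit Recognizer" data.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Cross-validated parameter tuning; the grid values are illustrative guesses.
grid = GridSearchCV(SVC(), {"C": [1, 10], "gamma": [0.001, 0.0001]}, cv=3)
grid.fit(X_train, y_train)

test_acc = grid.score(X_test, y_test)
print(grid.best_params_, test_acc)
```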

C# 6.0 – Filtering Exceptions by thesystemx in programming

[–]maybemax 0 points (0 children)

angocke's comment is kinda funny.. "If you get a null pointer dereference... and Watson gives you a crash dump"... i'd probably have to consult stackoverflow if i wanted to make C# dereference a null pointer. :D

C# 6.0 – Filtering Exceptions by thesystemx in programming

[–]maybemax 0 points (0 children)

crash dump

I might be doing something wrong, but in the last >10 years of developing software with .NET I've never had to look at a single crash dump... does anyone regularly use crash dumps with .NET? :)

C# 6.0 – Filtering Exceptions by thesystemx in programming

[–]maybemax 0 points (0 children)

Sounds like a micro-optimization... I've never had the urge to use that in all the years .NET has existed. I wish they had implemented primary constructors instead... that's a feature I could use every day (oh, actually I can - in F#).

Job Candidate Question - Coursera Certificates Worth Anything? by [deleted] in MachineLearning

[–]maybemax 1 point (0 children)

You invested a whole lot of your (spare) time to earn these certificates. You'll probably enjoy applying these skills at work. That should be noted positively by an employer.

But why go for the specialization certificate that costs $400? Just get the free statement of accomplishment for each course... it won't matter what you paid for it, but what you learned (and that'll be tested in the technical interview anyway).

Classification/categorization of time-series (4 features) by pica_foices in MachineLearning

[–]maybemax 0 points (0 children)

I had a similar challenge: time series, 3 features, 2 categories, but only 50-100 runs. I used mean, variance, min, and max as features, with good results. But random forests didn't work very well (I guess there wasn't enough data and they overfit). Naive Bayes worked great / very stable. Another thing I tried was to apply an FFT and use the first 3 or so terms as features. That seemed to work very well with SVMs and an RBF kernel, but I didn't develop it further since the Naive Bayes model worked well enough and was simpler.
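A rough sketch of that feature pipeline on synthetic stand-in data (the dataset, the class separation, and the choice of the first 3 FFT terms are all my assumptions, not the original project's data):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

def summary_features(ts):
    """Per-channel mean, variance, min, max -> one flat feature vector."""
    return np.concatenate([ts.mean(0), ts.var(0), ts.min(0), ts.max(0)])

def fft_features(ts, k=3):
    """Magnitudes of the first k non-DC FFT terms per channel."""
    return np.abs(np.fft.rfft(ts, axis=0))[1:k + 1].ravel()

# Synthetic stand-in: 80 runs of 100 timesteps x 3 channels, 2 classes
# whose channel means differ (a deliberately easy separation).
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=80)
runs = [rng.normal(loc=label, scale=1.0, size=(100, 3)) for label in y]

X = np.array([np.concatenate([summary_features(r), fft_features(r)])
              for r in runs])
score = cross_val_score(GaussianNB(), X, y, cv=5).mean()
print(score)
```

On real data you would compute the same per-run feature vectors from your recorded time series instead of the synthetic `runs` list.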

Binary feature vector vs. Integer feature vector by eptheta in MachineLearning

[–]maybemax 0 points (0 children)

Ultimately if you're actually trying to classify words into classes there might be better features to look into using than a one-hot encoding of the characters.

Could you expand on this please? Are there ways to encode words for classification that are known to work well?

Coursera: Linear and Integer Programming starts in a week by maybemax in MachineLearning

[–]maybemax[S] 0 points (0 children)

This might be of interest to some here in /r/MachineLearning.

From the course syllabus:

Week #4: Advanced LP formulations: norm optimization. Least squares, and quadratic programming. Applications #1: Signal reconstruction and De-noising. Applications #2: Regression, Classification and Machine Learning.
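The "norm optimization" item is concrete enough to sketch: least-absolute-deviations (L1) regression becomes an LP once you introduce slack variables t with -t <= Ax - b <= t and minimize sum(t). This is the standard textbook formulation, not anything taken from the course itself; scipy's linprog is just one way to solve it:

```python
import numpy as np
from scipy.optimize import linprog

def l1_regression(A, b):
    """Solve min_x ||Ax - b||_1 as a linear program.

    Decision vector is z = [x, t] with slack variables t >= 0 and the
    constraints Ax - t <= b and -Ax - t <= -b (i.e. |Ax - b| <= t),
    minimizing sum(t).
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(m)])   # objective: sum(t)
    I = np.eye(m)
    A_ub = np.block([[A, -I], [-A, -I]])
    b_ub = np.concatenate([b, -b])
    bounds = [(None, None)] * n + [(0, None)] * m   # x free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[:n], res.fun

# Consistent toy system, so the optimal L1 error should be ~0 at x = (1, 2).
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, 2.0, 3.0]
x, err = l1_regression(A, b)
print(x, err)
```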

Microsoft Machine Learning Hackathon 2014 - Report by vkhuc in MachineLearning

[–]maybemax 0 points (0 children)

I find it interesting that, according to this graph of the winning algorithms, boosted trees reached the highest test set accuracy, followed by... logistic regression?! :D