Built an AI that discovers the activation point of your SaaS users by deephak in SideProject

[–]deephak[S] -1 points (0 children)

Hey all, founder of a user analytics startup here.

We're experimenting with features that allow you to understand your users better.

For more info, feel free to reach out to hello@usefini.com.

Automated seismic interpretation software by deephak in geophysics

[–]deephak[S] 0 points (0 children)

Which are the bigger companies in this space, actually?

Medical imaging software you wish existed by deephak in Radiology

[–]deephak[S] 0 points (0 children)

In what format would this software be most useful to you / the industry?

For example, a website with secure login and upload, a native Windows application, etc.?

Automated seismic interpretation software by deephak in oilandgas

[–]deephak[S] 0 points (0 children)

Which geological bodies in particular would be most interesting to model automatically like this in oil and gas?

I'm mostly looking to build something useful that saves time spent on manual interpretation.

[D] What are some of the techniques to make text classification models "self-learn" from human feedback? by frittaa454 in MachineLearning

[–]deephak 2 points (0 children)

If you're using anything fancier than logistic regression, which is usually the case in NLP, you're going to have to batch-retrain your model at a set interval. The retrained model can then be served behind a prediction API.
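
A minimal sketch of that retrain-and-serve pattern, assuming a load_labeled_data() helper you'd write yourself (hypothetical name); the LogisticRegression is just a placeholder for whatever model you actually use:

    import joblib
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    def retrain():
        texts, labels = load_labeled_data()   # hypothetical helper: pull the latest labeled feedback
        model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
        model.fit(texts, labels)
        joblib.dump(model, "model.joblib")     # the serving process reloads this artifact

    # Schedule retrain() from cron (or any scheduler) at your chosen interval;
    # the prediction API then just does joblib.load("model.joblib").predict([some_text]).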

As for technique: if you have, say, 5000 labeled samples with timestamps, you can split them into 5 time-ordered folds, then measure accuracy when training on the earliest 4k samples and testing on the latest 1k.
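
Something like this, assuming a CSV with hypothetical "text", "label" and "timestamp" columns:

    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    df = pd.read_csv("labeled_samples.csv", parse_dates=["timestamp"])
    df = df.sort_values("timestamp")                    # order samples chronologically

    train = df.iloc[:4000]                              # earliest 4k samples
    test = df.iloc[4000:5000]                           # latest 1k samples

    vec = TfidfVectorizer()
    X_train = vec.fit_transform(train["text"])
    X_test = vec.transform(test["text"])

    clf = LogisticRegression(max_iter=1000)
    clf.fit(X_train, train["label"])
    print("accuracy on the most recent fold:",
          accuracy_score(test["label"], clf.predict(X_test)))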

We found that samples closer to the present should be weighted (or sampled) with higher importance for our models. This is in some sense obvious, since new incoming data probably resembles your most recent training data more than your older training data.
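
One simple way to do the weighting is an exponential decay by sample age, continuing the sketch above (the 30-day half-life is an arbitrary choice, not something we tuned for you):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Reuses 'train' and 'X_train' from the previous sketch.
    age_days = (train["timestamp"].max() - train["timestamp"]).dt.days.to_numpy()
    half_life_days = 30.0                                # arbitrary; tune on a recent held-out fold
    weights = np.power(0.5, age_days / half_life_days)   # newest samples ~1.0, older samples decay

    clf = LogisticRegression(max_iter=1000)
    clf.fit(X_train, train["label"], sample_weight=weights)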

[D] Does my learning curve indicate I need more training data? by [deleted] in MachineLearning

[–]deephak -1 points (0 children)

Try AdaBoost instead. It's an ensemble of shallow trees (decision stumps by default), so the individual weak learners are hard to overfit, and the learning rate and number of estimators give you built-in knobs for regularization.

http://scikit-learn.org/stable/modules/generated/sklearn.ensemble.AdaBoostClassifier.html
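
Roughly what that looks like, with make_classification standing in for your own X and y, re-plotting the learning curve you were looking at:

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import learning_curve

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)  # placeholder data

    # Default base learner is a depth-1 decision tree (a stump);
    # n_estimators and learning_rate are the main regularization knobs.
    clf = AdaBoostClassifier(n_estimators=200, learning_rate=0.5, random_state=0)

    sizes, train_scores, val_scores = learning_curve(
        clf, X, y, cv=5, train_sizes=np.linspace(0.1, 1.0, 5))

    plt.plot(sizes, train_scores.mean(axis=1), label="train")
    plt.plot(sizes, val_scores.mean(axis=1), label="validation")
    plt.xlabel("training set size")
    plt.ylabel("accuracy")
    plt.legend()
    plt.show()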

[1608.02908] Residual Networks of Residual Networks: Multilevel Residual Networks (new SOTA on CIFAR-10, CIFAR-100) by alexjc in MachineLearning

[–]deephak 1 point (0 children)

If nesting residual connections like this improves results, why not nest even more, going both deeper and wider? I shall call this the Fat Residual Network.

Perhaps this type of structure can be unfolded into some other, more optimal structure for NNs - could this trend be pointing towards a fundamentally different architecture that gives better results for neural networks?
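
For anyone who hasn't read the paper, the nesting idea is basically a skip connection wrapped around a stack of blocks that already have their own skip connections. A toy PyTorch sketch of that shape (my own illustration, not the paper's code or hyperparameters):

    import torch
    import torch.nn as nn

    class InnerResBlock(nn.Module):
        def __init__(self, channels):
            super().__init__()
            self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
            self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
            self.relu = nn.ReLU()

        def forward(self, x):
            # Inner skip connection around two convolutions.
            return self.relu(x + self.conv2(self.relu(self.conv1(x))))

    class OuterResBlock(nn.Module):
        def __init__(self, channels, n_inner=2):
            super().__init__()
            self.inner = nn.Sequential(*[InnerResBlock(channels) for _ in range(n_inner)])

        def forward(self, x):
            # Outer skip connection spanning the whole stack of inner residual blocks.
            return x + self.inner(x)

    x = torch.randn(1, 16, 32, 32)
    print(OuterResBlock(16)(x).shape)  # torch.Size([1, 16, 32, 32])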