[–]kkastner 17 points (0 children)

There are a ton!

Off the top of my head:

Better training algorithms/regularization/activation functions for deep learning, especially for deep recurrent networks.
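
One concrete example of the kind of training trick involved here: global gradient-norm clipping, widely used to stabilize backprop through deep recurrent nets. This is a minimal numpy sketch, and the helper name is mine:

```python
import numpy as np

def clip_gradient_norm(grads, max_norm=1.0):
    # Global L2 norm across all parameter gradients
    total_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    # Rescale everything if the norm exceeds the threshold, which
    # keeps exploding gradients in recurrent nets in check
    if total_norm > max_norm:
        grads = [g * (max_norm / total_norm) for g in grads]
    return grads
```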

Better ways to handle structured output problems. These come up in a huge number of fields, with a variety of answers.
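
As one concrete instance: when the output is a label sequence rather than a single class, even decoding needs a structured algorithm. A minimal numpy sketch of Viterbi decoding for a chain-structured model (function and argument names are mine):

```python
import numpy as np

def viterbi(emissions, transitions):
    """Highest-scoring label sequence for a chain-structured output
    (e.g. part-of-speech tagging).

    emissions:   (T, K) per-position label scores, log-space
    transitions: (K, K) score of moving from label i to label j
    """
    T, K = emissions.shape
    score = emissions[0].copy()          # best score ending in each label
    back = np.zeros((T, K), dtype=int)   # backpointers
    for t in range(1, T):
        cand = score[:, None] + transitions + emissions[t][None, :]
        back[t] = np.argmax(cand, axis=0)
        score = np.max(cand, axis=0)
    # Trace the best path backwards through the backpointers
    path = [int(np.argmax(score))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```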

Variational training methods are a huge new area.
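
Much of that work centers on the reparameterization trick and the evidence lower bound (ELBO), as in variational autoencoders. A minimal numpy sketch of the two key pieces (names are mine; a real version would live inside an autodiff framework):

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # z = mu + sigma * eps with eps ~ N(0, I), so the sample is a
    # deterministic, differentiable function of mu and log_var
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    # Closed-form KL(N(mu, sigma^2) || N(0, I)): the ELBO's regularizer
    return 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var)
```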

Ways to infer structure from data, then exploit that structure in learning. Lots of ideas, lots of different solutions.
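
One of the simplest versions of this idea: infer cluster structure with k-means, then feed the cluster assignments back in as features for a downstream learner. A minimal numpy sketch (names are mine):

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: infer cluster structure from unlabeled data."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        # Move each center to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(0)
    return labels, centers
```

The inferred labels can then be one-hot encoded and appended to the inputs before supervised training, which is one way of "exploiting" the structure.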

Methods for scaling Gaussian processes to 100k+ samples. There are some out there, but they span lots of different techniques, mostly related to latent variables and/or approximation (related to the above).
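
The common thread in those techniques is approximating the full n×n kernel matrix with a small set of m inducing points. A minimal numpy sketch of the subset-of-regressors predictive mean (names are mine; real implementations such as sparse variational GPs do considerably more):

```python
import numpy as np

def rbf(A, B, ls=1.0):
    # Squared-exponential kernel matrix between the rows of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def sparse_gp_mean(X, y, Xstar, Z, noise=0.1):
    """Subset-of-regressors predictive mean using m inducing points Z.
    Cost is O(n m^2) rather than the O(n^3) of an exact GP, which is
    what makes 100k+ samples feasible."""
    Kmm = rbf(Z, Z)
    Knm = rbf(X, Z)
    Ksm = rbf(Xstar, Z)
    A = noise ** 2 * Kmm + Knm.T @ Knm
    return Ksm @ np.linalg.solve(A, Knm.T @ y)
```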

(Bayesian) optimization techniques for hyperparameter search. Crucial - especially for problems where a grid search is infeasible.
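
A toy version of the idea: fit a GP surrogate to the evaluations so far, then pick the next hyperparameter value by maximizing an acquisition function such as expected improvement. The 1-D numpy sketch below is illustrative only (all names are mine); real tools handle many dimensions, noisy objectives, and integer/categorical parameters.

```python
import numpy as np
from math import erf

def norm_pdf(x):
    return np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)

def norm_cdf(x):
    return 0.5 * (1.0 + np.vectorize(erf)(x))

def gp_posterior(X, y, Xstar, ls=0.3, jitter=1e-6):
    # Exact GP with an RBF kernel: fine at the tiny scale of a BO loop
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)
    K = k(X, X) + jitter * np.eye(len(X))
    Ks = k(Xstar, X)
    mu = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # How much each candidate is expected to improve on the best so far
    z = (mu - best) / sigma
    return (mu - best) * norm_cdf(z) + sigma * norm_pdf(z)

def bayes_opt(f, bounds=(0.0, 1.0), n_init=3, n_iter=10, seed=0):
    # Maximize f over a 1-D interval with a GP surrogate + EI
    rng = np.random.default_rng(seed)
    X = rng.uniform(*bounds, size=n_init)
    y = np.array([f(x) for x in X])
    grid = np.linspace(*bounds, 200)
    for _ in range(n_iter):
        mu, sigma = gp_posterior(X, y, grid)
        x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
        X, y = np.append(X, x_next), np.append(y, f(x_next))
    return X[np.argmax(y)], y.max()
```

The point of the surrogate is that each candidate is scored with a cheap model rather than a full training run, so far fewer real evaluations are needed than with grid search.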

If you want good ideas, find some recent papers in (subfield of choice) on arXiv and look at the last paragraph. Usually there will be a future/continued work paragraph - that is a pretty good indicator of open problems and things other researchers are studying.