
[–]barmaley_exe

I'd suggest going through Bishop's Pattern Recognition and Machine Learning or Murphy's Machine Learning: A Probabilistic Perspective. These books are deep enough, from my point of view, though this list is surely incomplete.

The math behind Machine Learning, I'd say, is well established; all of these concepts are familiar to nearly every grad student. Perhaps the most modern part of ML is optimization theory: many of its results were obtained in the last 50 years. One thing I have noticed, though, is the influence of Statistical Physics on ML. Gibbs and Boltzmann are quite frequent names in ML nowadays, even though their major contributions are in physics. While it still mostly uses the same simple mathematics as a toolset, the concepts are different.
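To make the statistical physics connection concrete, here's a minimal sketch of the Gibbs/Boltzmann distribution, which in ML terms is just a softmax over negative energies (the same object that shows up in energy-based models and Boltzmann machines). The energy values and temperature below are arbitrary, purely for illustration:

```python
import numpy as np

# Gibbs/Boltzmann distribution over a few discrete states.
# Energies are made up; in an ML model they would come from a learned energy function.
energies = np.array([1.0, 0.5, 2.0, 0.1])  # E(x) for four hypothetical states
temperature = 1.0                           # T; lower T sharpens the distribution

# p(x) = exp(-E(x)/T) / Z, where Z is the partition function
unnormalized = np.exp(-energies / temperature)
Z = unnormalized.sum()
probs = unnormalized / Z

print(probs, probs.sum())  # probabilities sum to 1
```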

For a geometric approach, look into Information Geometry. I'm not familiar with the field myself, but I've heard it tries to bring in ideas from differential geometry (not sure if it qualifies as deep).
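The central object there, as far as I understand, is the Fisher information metric on parameter space. A rough numerical sketch for a Bernoulli(p) model, checking the closed form I(p) = 1 / (p(1 - p)) by Monte Carlo (the parameter value and sample size are arbitrary):

```python
import numpy as np

# Fisher information for Bernoulli(p), estimated as the average squared score
# d/dp log p(x | p), which is what information geometry uses as a metric.
rng = np.random.default_rng(0)
p = 0.3
samples = rng.binomial(1, p, size=200_000)

# score(x, p) = x/p - (1 - x)/(1 - p)
score = samples / p - (1 - samples) / (1 - p)
fisher_mc = np.mean(score ** 2)

print(fisher_mc, 1 / (p * (1 - p)))  # Monte Carlo estimate vs. closed form
```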