Materials for my Machine Learning class
This course offers a tour of the landscape of Machine Learning through some key algorithms. Although the first session tries to cover the full span of Machine Learning techniques, the subsequent sessions focus on the Supervised Learning problem and categorize the algorithms from four distinct points of view (the Bayesian perspective, linear separation, neural networks and ensemble methods). The approach deliberately mixes hands-on practice in Python with theoretical and mathematical understanding of the methods. The course aims at giving the capacity to make an informed choice between the main families of ML algorithms, depending on the problem at hand. It gives an understanding of the algorithmic and mathematical properties of each family of methods and provides a basic practical knowledge of the Scikit-Learn and Keras Python libraries.
After completing this class, you will be able to:
- implement a generic workflow of data analysis for your application field;
- know the main bottlenecks and challenges of data-driven approaches;
- link some field problems to their formal Machine Learning counterparts;
- know the main categories of Machine Learning algorithms and which formal problem they solve;
- know the name and principles of some key methods in Machine Learning:
  - SVM and kernel methods,
  - Naive Bayes Classification,
  - Gaussian Processes,
  - Artificial Neural Networks and Deep Learning,
  - Decision Trees,
  - Ensemble methods: Boosting, Bagging, Random Forests;
- know the basics of Scikit-Learn and Keras.
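As a taste of the Scikit-Learn workflow practiced in the course, here is a minimal sketch (the dataset, kernel choice and split ratio are illustrative, not prescribed by the syllabus): fit an SVM classifier on a toy dataset and evaluate it on held-out data.

```python
# Minimal Scikit-Learn workflow: load data, split, fit, evaluate.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Toy dataset bundled with Scikit-Learn (150 samples, 3 classes).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# SVM with an RBF kernel -- one of the kernel methods covered in class.
clf = SVC(kernel="rbf")
clf.fit(X_train, y_train)

# Mean accuracy on the held-out test set.
print(f"Test accuracy: {clf.score(X_test, y_test):.2f}")
```

The same `fit`/`score` pattern applies to the other estimators seen in the course (Naive Bayes, Decision Trees, Random Forests), which is what makes comparing families of algorithms so convenient in Scikit-Learn.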
Recommended reading:
- The Elements of Statistical Learning. T. Hastie, R. Tibshirani, J. Friedman. Springer Series in Statistics. https://web.stanford.edu/~hastie/ElemStatLearn/
- Deep Learning. I. Goodfellow, Y. Bengio, A. Courville. MIT Press. https://www.deeplearningbook.org/