For our final machine learning project, we used data from a Kaggle competition---after preprocessing it---to test ML algorithms that we implemented ourselves. We ran four experiments in total, two per algorithm: elastic nets with two different optimization approaches, and decision trees with two different split criteria.
- Elastic nets with Stochastic Gradient Descent
- Elastic nets with Batch Simulated Annealing (our innovation)
- Decision tree with MSE split criterion
- Decision tree with MAE split criterion
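As a point of reference for the first two experiments, here is a minimal sketch of fitting an elastic net by stochastic gradient descent. The objective matches the standard elastic-net formulation (squared error plus mixed L1/L2 penalty); the function name, parameter names, and hyperparameter defaults are illustrative assumptions, not the project's actual code.

```python
import numpy as np

def elastic_net_sgd(X, y, alpha=0.01, l1_ratio=0.5, lr=0.01, epochs=200, seed=0):
    """Fit an elastic net with stochastic gradient descent (illustrative sketch).

    Minimizes (1/2) * (x_i @ w - y_i)^2 per sample plus the penalty
    alpha * (l1_ratio * ||w||_1 + 0.5 * (1 - l1_ratio) * ||w||_2^2).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        # Visit samples in a fresh random order each epoch.
        for i in rng.permutation(n):
            grad = (X[i] @ w - y[i]) * X[i]        # squared-error gradient
            grad += alpha * (1 - l1_ratio) * w     # L2 penalty gradient
            grad += alpha * l1_ratio * np.sign(w)  # L1 subgradient
            w -= lr * grad
    return w
```

On well-conditioned data this recovers the (slightly shrunk) least-squares weights, which is why results close to scikit-learn's `ElasticNet` are a reasonable expectation.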
For the elastic nets, our results were very close to those of the Scikit-learn implementation. However, we could not match it with our decision trees.
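To illustrate how the two split criteria differ, here is a minimal sketch of scoring candidate splits on a single feature. MSE measures squared deviation from each side's mean, while MAE measures absolute deviation from each side's median; the function names and the exhaustive-threshold scan are assumptions for illustration, not the project's actual implementation.

```python
import numpy as np

def mse_split_cost(y_left, y_right):
    """MSE criterion: total squared deviation from each side's mean."""
    return (((y_left - y_left.mean()) ** 2).sum()
            + ((y_right - y_right.mean()) ** 2).sum())

def mae_split_cost(y_left, y_right):
    """MAE criterion: total absolute deviation from each side's median."""
    return (np.abs(y_left - np.median(y_left)).sum()
            + np.abs(y_right - np.median(y_right)).sum())

def best_split(x, y, cost):
    """Scan every threshold on one feature; return (threshold, cost) minimizing cost."""
    best_t, best_c = None, np.inf
    for t in np.unique(x)[:-1]:  # last value would leave the right side empty
        mask = x <= t
        c = cost(y[mask], y[~mask])
        if c < best_c:
            best_t, best_c = t, c
    return best_t, best_c
```

The median (MAE) is more robust to target outliers than the mean (MSE), which is the usual reason for comparing the two criteria.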
More details can be found in the .
Coded by Alexander Rodriguez and Santosh Malgireddy for the Machine Learning course at the University of Oklahoma.