
Machine Learning Practice

Day 1 :

Things learned :

  • Data Selection: Consider what data is available, what data is missing and what data can be removed.
  • Data Preprocessing: Organize selected data by formatting, cleaning and sampling from it.
  • Data Transformation: Get preprocessed data ready for machine learning by engineering features with scaling, attribute decomposition and attribute aggregation.
  • Used the numpy, matplotlib and sklearn libraries to handle, visualize and manipulate data
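The scaling step in the transformation bullet can be sketched with sklearn's StandardScaler; the data below is made up for illustration:

```python
# Minimal sketch: standardize each feature to zero mean and unit variance.
import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 400.0]])   # two features on very different scales

scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)   # fit learns per-column mean/std, transform applies them
print(X_scaled.mean(axis=0))         # ~[0, 0]
```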

Day 2 :

Things learned :

  • What is SVM? Concepts and theory
  • Implementation in python: SVM and Kernel SVM
  • Different Kernel SVMs and comparison
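The kernel comparison can be seen on a toy non-linear (XOR-like) problem; the data here is synthetic and the hyperparameters are sklearn defaults:

```python
# Sketch: a linear kernel cannot separate XOR-style data, an RBF kernel can.
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.randn(200, 2)
y = (X[:, 0] * X[:, 1] > 0).astype(int)   # XOR-like labels: not linearly separable

for kernel in ("linear", "poly", "rbf"):
    clf = SVC(kernel=kernel).fit(X, y)
    print(kernel, clf.score(X, y))        # rbf should score far higher than linear
```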

Day 3 :

Things learned : (with tutorial help from machinelearningmastery)

  • How Decision Trees work
  • Splits made on the basis of entropy or Gini Index
  • Implementation in sklearn and from scratch
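The Gini Index used to score splits can be sketched from scratch; this helper is illustrative, not the repo's code:

```python
# Gini impurity of a node: 1 - sum over classes of p_k^2,
# where p_k is the proportion of samples with class k.
from collections import Counter

def gini(labels):
    n = len(labels)
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

print(gini([0, 0, 0, 0]))   # 0.0  (pure node: best possible split outcome)
print(gini([0, 0, 1, 1]))   # 0.5  (maximally mixed for two classes)
```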

Day 4 :

Things learned : (Fundamentals of Deep Learning, Chapter 3)

  • Logistic Regression on MNIST Data
  • Feed Forward Network on MNIST data and comparison
  • Tensorflow implementation : using variable scope and name scope for network
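The logistic-regression model can be sketched in plain numpy (the book's version uses Tensorflow with variable and name scopes; the shapes below are MNIST-like but the batch is random):

```python
# Forward pass of multinomial logistic regression: probs = softmax(xW + b).
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(784, 10))   # 784 pixels -> 10 digit classes
b = np.zeros(10)
x = rng.random((5, 784))                     # a fake batch of 5 "images"
probs = softmax(x @ W + b)                   # each row is a distribution over classes
print(probs.shape)                           # (5, 10)
```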

Day 5 :

Beyond Gradient Descent (Chapter 4 of Fundamentals of Deep Learning by Nikhil Buduma)

Things learned

  • Challenges with Gradient Descent: local minima and their effect on deep learning error surfaces
  • Momentum-based Optimization: keeping a memory of past gradients to smooth updates across the error surface
  • Learning Rate Adaptation: (1) Adagrad (2) RMSProp (3) Adam
  • Adagrad accumulates historical gradients and uses them to adapt the global learning rate
  • RMSProp keeps an exponentially weighted moving average of squared gradients: it lets us "toss out" measurements made a long time ago
  • Adam combines ideas from both RMSProp and momentum
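The update rules above can be sketched in numpy; the Adam implementation below is a minimal illustration on f(w) = w² (gradient 2w), with untuned hyperparameters:

```python
# Adam: momentum-style first moment + RMSProp-style second moment,
# both bias-corrected, used to scale each update.
import numpy as np

def adam_minimize(grad, w, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=200):
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g        # exponentially weighted gradient (momentum)
        v = b2 * v + (1 - b2) * g * g    # exponentially weighted squared gradient (RMSProp)
        m_hat = m / (1 - b1 ** t)        # bias correction for early steps
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

w_final = adam_minimize(lambda w: 2 * w, w=5.0)
print(w_final)   # close to the minimum at 0
```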

Day 6 :

MADL-Videos Day (Machine and Deep Learning- Videos Day)

Watched documentaries and videos relating to ML and DL

Day 7 :

Things learned

  • Implementation of Naive Bayes Classifier in python
  • Class probability and attribute probability based classifier
  • Functions for class probabilities and attribute probabilities
  • More work required on the concepts
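The class-probability / attribute-probability structure can be sketched as a tiny Gaussian Naive Bayes on made-up data (not the repo's implementation):

```python
# Per class: a prior P(class) plus per-feature mean/std for P(attribute | class);
# prediction picks the class with the highest log posterior.
import numpy as np

def fit(X, y):
    model = {}
    for c in np.unique(y):
        Xc = X[y == c]
        model[c] = (len(Xc) / len(X), Xc.mean(axis=0), Xc.std(axis=0) + 1e-9)
    return model

def predict(model, x):
    def log_posterior(prior, mu, sigma):
        log_lik = -0.5 * np.sum(np.log(2 * np.pi * sigma**2) + ((x - mu) / sigma) ** 2)
        return np.log(prior) + log_lik
    return max(model, key=lambda c: log_posterior(*model[c]))

X = np.array([[1.0, 1.1], [1.2, 0.9], [5.0, 5.2], [4.8, 5.1]])
y = np.array([0, 0, 1, 1])
m = fit(X, y)
print(predict(m, np.array([1.0, 1.0])))   # class 0
print(predict(m, np.array([5.0, 5.0])))   # class 1
```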

Day 8 :

Things learned

  • How to think about a machine learning problem
  • How to think about the output and to analyse what we want to predict from the model
  • While thinking about features and input values, think about the availability of all the data at deployment time

Day 9 :

Things learned

  • How to use the speech_recognition library in python to recognize speech from a microphone
  • Developed a small guessing game based on the speech recognized from the microphone
  • Theory of how it all works

Day 10 :

Things learned

  • What are convolutional neural networks and what was the need?
  • What are filters and feature maps, and how convolution helps extract features
  • Implementation on MNIST data using Tensorflow
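What a filter does mechanically can be sketched without Tensorflow: a valid 2-D convolution in plain numpy, run on a synthetic vertical edge:

```python
# Slide a kernel over the image; each output cell is the sum of an
# elementwise product between the kernel and the patch under it.
import numpy as np

def conv2d_valid(image, kernel):
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

image = np.zeros((6, 6))
image[:, 3] = 1.0                                              # a vertical edge
sobel_x = np.array([[1, 0, -1], [2, 0, -2], [1, 0, -1]], float)  # edge-detecting filter
fmap = conv2d_valid(image, sobel_x)   # the feature map responds along the edge
print(fmap.shape)                     # (4, 4)
```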

Day 11 :

Things learned

  • Batch normalization and how it is helpful for training
  • How CIFAR 10 dataset is handled by using batch normalization
  • Implementation of the network
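The normalization at the heart of batch norm can be sketched in numpy (training-time batch statistics only; running averages and the learnable-parameter updates are omitted):

```python
# Normalize each feature over the batch, then rescale with gamma/beta.
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit variance per feature
    return gamma * x_hat + beta             # gamma/beta restore representational power

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=3.0, size=(32, 4))   # a poorly-scaled batch
out = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=0))   # ~0 per feature
```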

Day 12

Chapter 6: Fundamentals of Deep Learning Book (Embedding and Representation Learning)

Things learned

  • Embedding and Representation Learning: A way to escape the curse of dimensionality
  • Principal Component Analysis: concepts and mathematical formulation study
  • AutoEncoders: Introduction and basic concepts
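The PCA formulation can be sketched via SVD on centered data; the data below is synthetic and nearly one-dimensional so the first component captures almost everything:

```python
# PCA: center the data, take the SVD, project onto the top-k right singular vectors.
import numpy as np

def pca(X, k):
    Xc = X - X.mean(axis=0)                            # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)  # rows of Vt are principal axes
    return Xc @ Vt[:k].T                               # k-dimensional projection

rng = np.random.default_rng(0)
t = rng.normal(size=100)
X = np.column_stack([t, 2 * t + 0.01 * rng.normal(size=100)])  # almost a line
Z = pca(X, k=1)
print(Z.shape)   # (100, 1)
```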

Day 13

Chapter 6 (continued): Fundamentals of Deep Learning Book (Embedding and Representation Learning)

Things learned

  • Denoising Autoencoders: More Robust Autoencoders
  • Introducing Sparsity in Autoencoders
  • When context is important in representations: the English language as an example, and how such context is captured in representation learning
  • Coded an MNIST autoencoder (using dense layers) in keras

Day 14

Things learned

  • Implementing Autoencoders using convolutional neural networks
  • Using convolutional layers works better than fully connected layers in terms of reconstruction quality

Day 15

Pytorch tutorial:

Things learned

  • pytorch methods for building neural networks
  • Data loading and manipulations to tensors in pytorch
  • Autograd and backprop concepts in pytorch
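The autograd workflow in a nutshell (a minimal sketch, assuming torch is installed): build a computation, call backward(), then read gradients off the leaf tensors:

```python
# Autograd records the operations on x and replays them in reverse on backward().
import torch

x = torch.tensor(2.0, requires_grad=True)
y = x ** 2 + 3 * x      # dy/dx = 2x + 3
y.backward()            # populates x.grad with dy/dx evaluated at x = 2
print(x.grad)           # tensor(7.)
```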

Day 16

Pytorch tutorial:

Things learned

  • How to create your own custom dataloader
  • How to transform your data so that all samples have the same size
  • How to use DataLoader pytorch class to enable batching, shuffling and parallel loading of data
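A minimal custom Dataset plus DataLoader sketch (synthetic data; a real dataset would load files in __getitem__):

```python
# A Dataset only needs __len__ and __getitem__; DataLoader then handles
# batching, shuffling and (optionally) parallel loading on top of it.
import torch
from torch.utils.data import Dataset, DataLoader

class SquaresDataset(Dataset):
    def __init__(self, n=100):
        self.x = torch.arange(n, dtype=torch.float32)

    def __len__(self):
        return len(self.x)

    def __getitem__(self, idx):
        return self.x[idx], self.x[idx] ** 2   # (sample, label) pair

loader = DataLoader(SquaresDataset(), batch_size=16, shuffle=True)
xb, yb = next(iter(loader))   # one shuffled batch of 16 samples
print(xb.shape)               # torch.Size([16])
```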

Day 17

Learning Pytorch with Examples

Pytorch tutorial:

Things learned

  • Main features of Pytorch: n-dimensional tensors, similar to numpy arrays but able to run on a GPU, and automatic differentiation for building and training networks
  • Deeper understanding of Autograd and how pytorch builds computational graphs to compute gradients and weight updates
  • Difference between Tensorflow and Pytorch: static vs dynamic graphs
  • How to build custom nn modules and optimizers
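A custom nn module with a manual training loop can be sketched like this (toy data fitting y = 2x; the architecture and hyperparameters are illustrative, not from the tutorial):

```python
# Subclass nn.Module, define forward(), and let an optimizer update the parameters.
import torch
from torch import nn

class TwoLayerNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, 8), nn.ReLU(), nn.Linear(8, 1))

    def forward(self, x):
        return self.net(x)

torch.manual_seed(0)
model = TwoLayerNet()
opt = torch.optim.Adam(model.parameters(), lr=0.05)
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * x
for _ in range(200):
    loss = nn.functional.mse_loss(model(x), y)
    opt.zero_grad()     # clear old gradients
    loss.backward()     # autograd computes new ones
    opt.step()          # optimizer applies the update
print(loss.item())      # should be small after training
```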

Day 18

Pytorch tutorial:

Things learned

  • What transfer learning is and why it matters: fine-tuning vs. using the network as a fixed feature extractor
  • Constraints of Transfer learning
  • Implementation of transfer learning using pytorch (help from pytorch tutorial)

Day 19

Recurrent Neural Networks and Sequence-to-Sequence (Tutorials and concepts)

Lecture

Day 20

How Google does Machine Learning (Learning google cloud platform and ML APIs)

The course took 5 days with extensive lab introduction and practice

Contributors

  • sohaib90
