This project implements a multilayer perceptron (MLP) neural network with a fully connected architecture. In a fully connected MLP, also known as a dense MLP, each neuron in one layer is connected to every neuron in the next layer. This architecture allows for complex nonlinear mappings, but its large number of parameters puts it at risk of overfitting. The network has an input layer, two hidden layers, and an output layer. In the forward pass, every neuron in every layer uses the sigmoid activation function. Instead of computing gradients with backpropagation, the model is trained by directly minimizing the cross-entropy loss with a numerical optimizer (cross-entropy is the loss being minimized, not the optimizer itself). This project is entirely experimental. The model is then used to predict customer churn for a bank, achieving the same classification metrics as the scikit-learn MLP model.
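As a rough sketch of the pieces described above, here is what a sigmoid forward pass through two hidden layers and a binary cross-entropy loss could look like. The layer sizes, parameter names, and packing of weights into a tuple are illustrative assumptions, not the notebook's actual code:

```python
import numpy as np

def sigmoid(z):
    # Logistic activation, applied element-wise
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, params):
    # Forward pass: input -> hidden 1 -> hidden 2 -> sigmoid output
    # params is an assumed (W1, b1, W2, b2, W3, b3) tuple
    W1, b1, W2, b2, W3, b3 = params
    a1 = sigmoid(X @ W1 + b1)
    a2 = sigmoid(a1 @ W2 + b2)
    return sigmoid(a2 @ W3 + b3)  # predicted churn probability

def cross_entropy(y_true, y_pred, eps=1e-12):
    # Binary cross-entropy; clipping avoids log(0)
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
```

A gradient-free routine such as `scipy.optimize.minimize` can then search for the flattened parameter vector that minimizes `cross_entropy`, which is one way to train without hand-derived backpropagation.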
For a detailed explanation of the theory behind this computation and an overview of how machines learn, check out the accompanying article on Medium.
You can find the code for this project here.
File overview:
MLP_Classification.ipynb
- the full code from this project
To follow this project, please install the following locally:
- Python 3.8+
- Python packages
- pandas
- numpy
- matplotlib
    - scikit-learn (imported as sklearn)
- scipy
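All of the packages above can be installed in one step with pip; note that scikit-learn's install name differs from its import name:

```shell
pip install pandas numpy matplotlib scikit-learn scipy
```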
The data used for this implementation is the Bank Customer Churn Dataset, originally published on Kaggle.
You can download the exact file used in this project here:
- Bank Customer Churn Dataset - the customer churn data used in this project.
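Once downloaded, the CSV can be loaded with pandas and split into train and test sets for comparing the from-scratch model against scikit-learn's. The snippet below uses a tiny synthetic stand-in frame; the real column names, including the `churn` target, are assumptions about the dataset:

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Synthetic stand-in; in practice: df = pd.read_csv("<downloaded file>.csv")
df = pd.DataFrame({
    "credit_score": [600, 700, 650, 720, 580, 690],
    "age": [42, 35, 29, 51, 46, 33],
    "balance": [0.0, 125000.5, 43000.0, 0.0, 87000.2, 15000.0],
    "churn": [1, 0, 0, 1, 0, 1],  # target column name is an assumption
})

X = df.drop(columns=["churn"])
y = df["churn"]
# Stratified hold-out split so both classes appear in train and test
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=42, stratify=y
)
```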