

Mackey_Glass_Time_Series

Implementation of a two-layer perceptron (from scratch) with four back-propagation methods in Python

Mackey-Glass Time Series

For generating the Mackey-Glass time series, I used the delay differential equation below:

dx/dt = β · x(t − τ) / (1 + x(t − τ)^n) − γ · x(t)
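Concretely, the series can be produced by Euler-integrating the delay equation. A minimal sketch — the parameter values β=0.2, γ=0.1, n=10, τ=17, the initial condition, and the step size are common defaults for this series, not necessarily the exact ones used here:

```python
import numpy as np

def mackey_glass(n_samples=1000, beta=0.2, gamma=0.1, n=10, tau=17, x0=1.2, dt=1.0):
    """Euler integration of dx/dt = beta*x(t-tau)/(1 + x(t-tau)^n) - gamma*x(t)."""
    history = int(tau / dt)                  # number of delayed steps
    x = np.zeros(n_samples + history)
    x[:history + 1] = x0                     # constant history before t = 0
    for t in range(history, n_samples + history - 1):
        x_tau = x[t - history]               # delayed value x(t - tau)
        x[t + 1] = x[t] + dt * (beta * x_tau / (1 + x_tau ** n) - gamma * x[t])
    return x[history:]
```

With τ=17 the equation is chaotic, which is what makes one-step-ahead prediction a meaningful benchmark.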

You can also see the plot of data below:
data_series

The goal is to predict x(t + 1) from x(t - 2), x(t - 1) and x(t). Thus, our neural network has three-dimensional inputs:
input: [x(t - 2), x(t - 1), x(t)]
output: x(t + 1)
I've considered 70 percent of the data as training data, 25 percent as validation data, and 5 percent as test data. For more stable training, I normalized the data with the min-max normalization method.
As a result of normalizing the data, the data range changes to [0, 1], allowing us to use the unipolar sigmoid function as our activation function.
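The windowing, normalization, and 70/25/5 split described above can be sketched as follows (the function name and the sequential split are my assumptions; the README does not say whether the split is sequential or shuffled):

```python
import numpy as np

def prepare_data(series):
    # min-max normalization to [0, 1]
    s = (series - series.min()) / (series.max() - series.min())
    # inputs [x(t-2), x(t-1), x(t)], target x(t+1)
    X = np.column_stack([s[:-3], s[1:-2], s[2:-1]])
    y = s[3:]
    n = len(X)
    n_train, n_val = int(0.70 * n), int(0.25 * n)
    X_train, y_train = X[:n_train], y[:n_train]
    X_val, y_val = X[n_train:n_train + n_val], y[n_train:n_train + n_val]
    X_test, y_test = X[n_train + n_val:], y[n_train + n_val:]
    return (X_train, y_train), (X_val, y_val), (X_test, y_test)
```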

The architecture of the two-layer perceptron

As mentioned above, the input dimension of the network is three. The hidden layer has five neurons and the output layer has one. This is the network architecture: Screenshot (545)

Feed-forward:

Screenshot (548)
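A forward pass through the 3-5-1 network with unipolar sigmoid activations might look like this (the bias terms and the sigmoid on the output unit are my assumptions; the exact formulation is in the screenshot above):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    """3 -> 5 -> 1 forward pass with unipolar sigmoid activations."""
    net1 = W1 @ x + b1      # hidden pre-activation, shape (5,)
    o1 = sigmoid(net1)      # hidden output
    net2 = W2 @ o1 + b2     # output pre-activation, shape (1,)
    o2 = sigmoid(net2)      # network output, lies in (0, 1)
    return o1, o2
```

Because the targets were min-max normalized to [0, 1], a unipolar sigmoid output can represent them directly.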

Back-propagation:

At first, a uniform distribution is used to randomly initialize the network's weights (W1, W2); then I used different methods to train the network, which can be seen below:

1: stochastic gradient descent
This method updates the network parameters (weights) after every training sample, which makes it sensitive to noisy data. Screenshot (551)
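A per-sample SGD update for this 3-5-1 network, using the standard sigmoid backprop deltas, can be sketched as follows (the learning rate and bias handling are assumptions; the author's exact rule is in the screenshot):

```python
import numpy as np

def sgd_step(x, target, W1, b1, W2, b2, lr=0.1):
    """One per-sample gradient step; weights are modified in place."""
    o1 = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))     # hidden output
    o2 = 1.0 / (1.0 + np.exp(-(W2 @ o1 + b2)))    # network output
    e = target - o2                               # prediction error
    delta2 = e * o2 * (1 - o2)                    # output-layer local gradient
    delta1 = (W2.T @ delta2) * o1 * (1 - o1)      # hidden-layer local gradient
    W2 += lr * np.outer(delta2, o1); b2 += lr * delta2
    W1 += lr * np.outer(delta1, x);  b1 += lr * delta1
    return float(e[0] ** 2)                       # squared error of this sample
```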

You can also see the results below: ezgif com-gif-maker

2: emotional learning
This method is very similar to SGD but also uses the previous step's error, helping the network learn faster and more accurately. Screenshot (552)
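The exact emotional-learning rule is given in the screenshot; one common variant replaces the current error in the update with a weighted mix of the current and previous sample errors, sketched below (the gains k1, k2 and this particular formulation are my assumptions):

```python
import numpy as np

def emotional_step(x, target, e_prev, W1, b1, W2, b2, k1=1.0, k2=0.4, lr=0.1):
    """SGD-like step whose effective error mixes current and previous errors."""
    o1 = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))
    o2 = 1.0 / (1.0 + np.exp(-(W2 @ o1 + b2)))
    e = target - o2
    e_eff = k1 * e + k2 * e_prev                  # "emotional" error signal
    delta2 = e_eff * o2 * (1 - o2)
    delta1 = (W2.T @ delta2) * o1 * (1 - o1)
    W2 += lr * np.outer(delta2, o1); b2 += lr * delta2
    W1 += lr * np.outer(delta1, x);  b1 += lr * delta1
    return e                                      # pass back as e_prev next time
```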

3: adaptive learning rate
In this method, we assign a separate learning rate to each trainable parameter: every element of each weight matrix gets its own learning rate, which is itself adapted during training. (This doubles the number of learning parameters, so the MSE may fluctuate and training may take longer.) 145407636-28acfd04-20b2-47f4-b72c-19b7c3385a7f
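One simple realization keeps a learning-rate matrix with the same shape as each weight matrix and adapts each entry from the agreement of consecutive gradient signs, growing it when the signs match and shrinking it when they flip (a delta-bar-delta-style rule; the growth/decay factors and this particular adaptation rule are my assumptions):

```python
import numpy as np

def adaptive_update(W, lr_W, grad, grad_prev, up=1.05, down=0.7):
    """Element-wise adaptive learning rates: grow where the gradient sign is
    stable across steps, shrink where it oscillates; then take the step."""
    agree = np.sign(grad) == np.sign(grad_prev)
    lr_W[agree] *= up
    lr_W[~agree] *= down
    W += lr_W * grad          # element-wise step with per-weight rates
    return W, lr_W
```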

4: Levenberg-Marquardt
This algorithm is a generalization of the Gauss-Newton algorithm, designed to increase the convergence speed of second-order optimization. If μ(t) equals zero, the algorithm reduces to Gauss-Newton, and if μ(t) is large, it behaves much like SGD. This is a batch method: unlike the methods above, the parameters are not updated as each new training sample arrives, but only after all the training samples have been seen. After each epoch, the parameters are updated based on the Jacobian matrix, which stores the whole epoch's information. (This method suits small datasets; as the data size grows, the volume of calculations increases sharply.) Screenshot (554)
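The core Levenberg-Marquardt step solves a damped normal-equation system once per epoch. A sketch for a flattened parameter vector w, the Jacobian J of the network outputs with respect to w (one row per training sample), and the error vector e (how J is assembled for this network is in the screenshot):

```python
import numpy as np

def lm_update(w, J, e, mu):
    """One Levenberg-Marquardt step: dw = (J^T J + mu*I)^-1 J^T e."""
    H = J.T @ J + mu * np.eye(w.size)   # damped Gauss-Newton Hessian approximation
    return w + np.linalg.solve(H, J.T @ e)
```

For μ → 0 this is exactly a Gauss-Newton step; for large μ the damping term dominates and the step direction approaches plain gradient descent with rate 1/μ, matching the description above.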

The image below shows that the best result for the network is achieved with the Levenberg-Marquardt algorithm: Figure 2021-12-10 002638

mackey_glass_time_series's People

Contributors

parham1998 avatar

