
Analog-Neural Network

Training Neural Networks using Analog circuits

Implementing an Analog ANN

  • In the analog circuit implemented here, the weights of the network are stored on capacitors in op-amp integrators.

  • For each training example, the circuit must directly produce a weight combination that makes the actual output nearly equal to the expected output.

  • All such weights are then collected and stored as a vector.

  • After a number of iterations, the average of these weights can be treated as an optimal solution (a minimal numerical sketch of this idea follows the list).
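
The sketch below illustrates, in plain numbers, the per-example-then-average training idea described above. It assumes a single linear neuron and uses the minimum-norm weights for each example; the actual project realizes this with analog circuitry rather than explicit arithmetic, so every name and value here is illustrative.

```python
# Minimal numerical sketch of the training idea above (assumed: one linear
# neuron, minimum-norm per-example weights; all values are illustrative).
import numpy as np

def train_by_averaging(inputs, targets):
    """For each example, find weights that reproduce the expected output,
    then average all such weight vectors."""
    per_example_weights = []
    for x, y in zip(inputs, targets):
        # Minimum-norm weights that make x . w equal the target y exactly.
        w = y * x / np.dot(x, x)
        per_example_weights.append(w)
    # After all iterations, the average of the weights is taken as the solution.
    return np.mean(per_example_weights, axis=0)

# Hypothetical toy data: two-input examples and their expected outputs.
X = np.array([[1.0, 0.5], [0.8, 1.2], [0.3, 0.9]])
y = np.array([1.5, 2.0, 1.1])
print(train_by_averaging(X, y))
```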

Neurons

  • Each neuron needs to sum up its inputs and apply a nonlinear activation function.

  • The summation can be achieved with the op-amp adder shown in the figure.

  • The ReLU function, shown in the following graph, is used as the activation function.

  • The difference is that the lower limit of our activation function has a small positive offset, and there is also an upper limit set by the supply voltage (see the sketch after the figure).

Buffer ReLU
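
A minimal sketch of that clipped activation, assuming an illustrative offset and supply voltage (the real values depend on the circuit and are not taken from the schematic):

```python
# Buffered ReLU as described above: output is clipped between a small positive
# offset and the supply voltage. v_offset and v_supply are assumed values.
def buffered_relu(v_in, v_offset=0.1, v_supply=5.0):
    return min(max(v_in, v_offset), v_supply)

# Inputs below the offset saturate low; large inputs saturate at the supply.
print([buffered_relu(v) for v in (-1.0, 0.05, 2.3, 7.0)])  # [0.1, 0.1, 2.3, 5.0]
```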

Synapses

  • The synapses store the weights and multiply them by the outputs of the previous layer's neurons.

  • The capacitor in an integrator circuit is used to store a voltage level, and the output of the integrator can be used as the weight.

  • In this way the charging speed (and hence the training rate) can be controlled directly by a voltage, and a zero input voltage holds the stored output constant.

  • Let Vw be the output voltage and Vadjust be the input voltage used to adjust the weight. The relation between them is then the op-amp integrator's transfer characteristic (see the sketch below).

This is followed by the multiplication operation, which can be achieved with an analog multiplier circuit (the IC AD633 in this case).
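
A behavioural sketch of the synapse described above. It assumes the standard inverting op-amp integrator relation dVw/dt = -Vadjust/(R*C) and the AD633's nominal (X1-X2)(Y1-Y2)/10 V transfer function; the R, C and time-step values are illustrative, not taken from the schematic.

```python
# Behavioural model of a synapse: an integrator stores the weight and an
# AD633-style multiplier multiplies it with the neuron output.
class Synapse:
    def __init__(self, r=10e3, c=1e-6):          # assumed R and C values
        self.rc = r * c                           # integrator time constant
        self.v_w = 0.0                            # weight voltage on the capacitor

    def adjust(self, v_adjust, dt):
        # Inverting op-amp integrator: dVw/dt = -Vadjust / (R*C).
        # A zero adjust voltage leaves the stored weight unchanged.
        self.v_w += -v_adjust / self.rc * dt
        return self.v_w

    def multiply(self, v_neuron):
        # AD633 nominal transfer function: W = (X1 - X2)(Y1 - Y2) / 10 V.
        return self.v_w * v_neuron / 10.0

# Example: a constant negative adjust voltage ramps the weight up over 1 ms.
syn = Synapse()
for _ in range(100):
    syn.adjust(v_adjust=-1.0, dt=1e-5)
print(syn.v_w, syn.multiply(2.0))
```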

Synapses Circuit

[Figure: synapse circuit schematic]

Feedback circuit

  • The neurons and synapses can only perform the feedforward pass of an ANN. In order to train the network, feedback circuits are needed.

  • The weight adjustment should be designed so that the feedback drives the actual output of the ANN to equal the expected output, so we can employ a voltage follower.

  • The non-inverting input of the op-amp is the expected output of the ANN and the inverting input is the actual output of the ANN.

  • However, because of the delay through the op-amp and the ANN circuit, the output of the ANN would fluctuate around the expected voltage. In particular, the gain of the op-amp is high, so even a small error would create a large weight-adjustment signal. We therefore use another feedback path, with resistors R1 and R2 in it, to reduce the gain applied to the error.

  • Let Vadjust be the adjustment signal, and let Vex and Vac be the expected and actual outputs. The relation between them follows from the circuit shown below (a behavioural sketch follows this list).
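
A behavioural sketch of that error path, assuming the adjust signal is simply the error scaled down by a resistor-set gain. The R2/R1 gain expression and the resistor values here are assumptions; the exact relation is given by the formula and schematic in the original figures.

```python
# Feedback error path: the difference between expected and actual outputs,
# scaled down by a resistor-set gain, becomes the weight-adjust signal.
# The gain expression (R2/R1) and the resistor values are assumptions.
def v_adjust(v_expected, v_actual, r1=10e3, r2=1e3):
    gain = r2 / r1          # reduced gain so small errors do not over-correct
    return gain * (v_expected - v_actual)

print(v_adjust(5.0, 4.2))   # a small, proportional correction signal
```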

Feedback Circuit

[Figure: feedback circuit schematic]

Circuit with improved gain control

[Figure: feedback circuit with improved gain control]

Full Circuit

One Neuron trainer

[Figure: one-neuron trainer schematic]

Where the "WEIGHT" block is:

[Figure: WEIGHT block schematic]

Simulation Results

LTspice was used for all simulations.

[Figure: LTspice simulation waveforms]

For an input signal of 5 V, the graph shows:

  • the weight calculated for the neuron (red curve),
  • the expected result (green curve), and
  • the obtained result (blue curve).

We observe that the neuron trains rapidly: from the input fed to it and the expected output signal, the circuit quickly settles on the appropriate weight (see the end-to-end sketch below).
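
To tie the pieces together, here is an end-to-end behavioural sketch of the one-neuron trainer: the reduced-gain feedback error drives the weight integrator, the weight is multiplied with the input (AD633-style /10 V scaling), and the product passes through the buffered ReLU. All component values, gains and the time step are illustrative assumptions; the LTspice schematic remains the authoritative model.

```python
# End-to-end behavioural sketch of the one-neuron trainer (all values assumed).
def simulate(v_in=5.0, v_expected=2.0, steps=20000, dt=1e-6,
             rc=1e-3, gain=0.5, v_offset=0.1, v_supply=5.0):
    v_w = 0.0                                            # weight voltage
    v_out = 0.0
    for _ in range(steps):
        v_mult = v_w * v_in / 10.0                       # analog multiplier (/10 V)
        v_out = min(max(v_mult, v_offset), v_supply)     # buffered ReLU
        v_adj = gain * (v_expected - v_out)              # reduced-gain feedback error
        v_w += v_adj / rc * dt                           # weight integrator (sign chosen
                                                         # so a positive error raises v_w)
    return v_w, v_out

# The weight settles so that the actual output tracks the expected 2 V.
print(simulate())
```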
