Comments (6)
Hi Matthias,
Unfortunately I don't have an example that works out of the box, but I can share some scripts that I used years ago for quantization experiments with the Distiller library. I won't be able to help much with debugging at this point, but hopefully you'll find some pointers that get you going.
(A good starting point would be low_precision/distiller/run_ptq.py for post-training quantization, or run_qat.py for quantization-aware training.)
from snn_toolbox.
Thank you very much for your fast response. I will have a look. In the meantime, however, I've run into another issue.
I use a LeNet-5 on the Fashion-MNIST dataset. The parsed model comes really close to the ANN accuracy at 94%. If I use temporal_mean_rate encoding I achieve about 92% accuracy, which is good. But if I change to "ttfs", the parsed model still has the same accuracy, yet the SNN accuracy drops to around 60%. Do you know why this happens?
Below is the config I use.
I am really confused that it works well with temporal_mean_rate but not with ttfs, because on the MNIST dataset ttfs performs quite well. Therefore, I would expect it to also work well on Fashion-MNIST.
import configparser

config = configparser.ConfigParser()
config['paths'] = {
    'path_wd': WORKING_DIR,        # working directory for intermediate files
    'dataset_path': DATASET_DIR,
    'filename_ann': MODEL_NAME,    # name of the trained ANN model file
    'runlabel': MODEL_NAME + '_' + str(NUM_STEPS_PER_SAMPLE)
}
config['tools'] = {
    'evaluate_ann': True,
    'parse': True,
    'normalize': True,
    'simulate': True,
    'convert': True
}
config['conversion'] = {
    'spike_code': 'ttfs',
    'softmax_to_relu': True
}
config['simulation'] = {
    'simulator': 'INI',
    'duration': NUM_STEPS_PER_SAMPLE,   # timesteps per sample
    'num_to_test': NUM_TEST_SAMPLES,
    'batch_size': BATCH_SIZE,
    'keras_backend': 'tensorflow'
}
config['output'] = {
    'verbose': 2,
    'plot_vars': {
        'input_image',
        'spiketrains',
        'spikerates',
        'spikecounts',
        'operations',
        'normalization_activations',
        'activations',
        'correlation',
        'v_mem',
        'error_t'
    },
    'overwrite': True
}
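For completeness, the toolbox reads its settings from an INI file on disk. A minimal sketch of that step, using hypothetical placeholder values for the user's variables (WORKING_DIR, MODEL_NAME, NUM_STEPS_PER_SAMPLE are stand-ins here, and the commented-out snn_toolbox call is the usual entry point as I understand it, not verified against this setup):

```python
import configparser
import os

# Hypothetical placeholder values standing in for the user's variables.
WORKING_DIR = '/tmp/snn_run'
MODEL_NAME = 'lenet5_fashion'
NUM_STEPS_PER_SAMPLE = 32

os.makedirs(WORKING_DIR, exist_ok=True)

config = configparser.ConfigParser()
config['paths'] = {'path_wd': WORKING_DIR,
                   'filename_ann': MODEL_NAME}
config['simulation'] = {'duration': str(NUM_STEPS_PER_SAMPLE)}

# The toolbox expects the settings as an INI file on disk.
config_filepath = os.path.join(WORKING_DIR, 'config.ini')
with open(config_filepath, 'w') as f:
    config.write(f)

# Running the conversion (requires snn_toolbox to be installed):
# from snn_toolbox.bin.run import main
# main(config_filepath)

# Sanity check: the file round-trips through configparser.
check = configparser.ConfigParser()
check.read(config_filepath)
print(check['simulation']['duration'])  # prints: 32
```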
Remember that the TTFS encoding only uses a single spike per neuron to represent a floating point activation value. Temporal mean rate uses many times that number of spikes so it can achieve higher precision / accuracy more easily at the cost of increased computation. To illustrate some of the issues when using a single spike: In our coding scheme, large activations result in fast spikes, small activations in slow spikes. So if a neuron receives as input both slow and fast spikes, it may fire an output spike due to the fast input spike without waiting for the slow input spike (which could have inhibited the firing). We explain it in a bit more detail in the paper.
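The fast-versus-slow spike problem can be illustrated with a toy integrate-and-fire neuron (a minimal sketch with made-up weights and spike times, not toolbox code): the net input is below threshold, so with full information the neuron should stay silent, but under TTFS it fires as soon as the early excitatory spike arrives, before the late inhibitory spike can cancel it.

```python
# Toy TTFS scenario: one excitatory input with a large activation (spikes
# early) and one inhibitory input with a small activation (spikes late).
threshold = 1.0
v_mem = 0.0
spike_time = None

# (arrival_time, weight): the excitatory spike arrives at t=2,
# the inhibitory spike not until t=8. Net input is 1.2 - 0.8 = 0.4,
# which is below threshold.
inputs = [(2, +1.2), (8, -0.8)]

for t in range(10):
    for arrival, weight in inputs:
        if arrival == t:
            v_mem += weight
    if spike_time is None and v_mem >= threshold:
        spike_time = t  # fires the moment the threshold is crossed

print(spike_time)  # prints: 2 -- before the inhibitory spike at t=8
```

A rate-coded neuron averages over many spikes, so this transient error washes out; with a single spike per neuron it propagates through the network.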
Unfortunately, MNIST is not a good predictor for the success of a method. Fashion MNIST was designed explicitly to be harder than MNIST while remaining small and easy to handle. We've struggled to make TTFS work with CIFAR, for example, so I'm not surprised that the accuracy dropped for Fashion MNIST. There were a couple of things we tried to improve performance: a dynamic threshold, and training with quantized or clipped activations.
Thank you for your response.
OK, so you would not expect a bug or something like that?
I was confused because a paper which uses your toolbox reported an accuracy of 88.9% with TTFS on the Fashion-MNIST dataset. Therefore, I assumed there was some mistake.
I don't think it is a bug; more likely a configuration issue, i.e. finding the right hyperparameters, or training the ANN in a certain way before conversion. It usually helps if the activations of all layers are distributed about equally between 0 and 1, or are clipped or quantized, because the converted SNN effectively quantizes and clips the spike rates anyway due to the finite simulation resolution and maximum firing rate. Perhaps you could ask the authors of that paper to share their config?
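The quantization effect described above can be sketched numerically (a hypothetical helper, not part of the toolbox): with a finite number of timesteps, the SNN can only represent a finite set of activation levels, and anything above the maximum firing rate saturates.

```python
# Hypothetical helper: clip an activation to [0, 1] and quantize it to the
# precision a rate-coded SNN can represent with `num_timesteps` timesteps.
def clip_and_quantize(activation, num_timesteps):
    clipped = min(max(activation, 0.0), 1.0)  # finite maximum firing rate
    # finite simulation resolution: num_timesteps distinguishable levels
    return round(clipped * num_timesteps) / num_timesteps

# With 32 timesteps, only 32 activation levels are distinguishable.
print(clip_and_quantize(0.42, 32))  # prints: 0.40625 (13/32)
print(clip_and_quantize(1.7, 32))   # prints: 1.0 (saturated)
```

Training the ANN with activations already clipped or quantized this way means the conversion introduces no additional mismatch.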
Thanks for your help.