
cinc-challenge2017's Introduction

  • 👋 Hi, I’m @fernandoandreotti
  • 👀 I’m interested in machine learning, generative AI, data engineering, healthcare, smart contracts, amongst others
  • 💞️ I’m looking to collaborate on interesting projects
  • 📫 How to reach me [email protected]

cinc-challenge2017's People

Contributors

fernandoandreotti, oliver-carr


cinc-challenge2017's Issues

get_hrv small problem

Hi Fernando,
There is a problem in the get_hrv function. Line 119 reads:
HRV.NN50=sum(D>0.05);
Since the calculation is done in milliseconds, I guess this should be abs(D)>50.
Iman
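For reference, the fix the reporter suggests can be sketched in Python (the toolbox itself is MATLAB; the RR values below are hypothetical, not from the toolbox):

```python
import numpy as np

# Hypothetical RR-interval series in milliseconds.
rr_ms = np.array([800.0, 810.0, 790.0, 900.0, 905.0])

d = np.diff(rr_ms)                 # successive RR differences, in ms
nn50 = int(np.sum(np.abs(d) > 50)) # NN50: count of differences exceeding 50 ms
```

The threshold must match the units of the series: 0.05 for seconds, 50 for milliseconds, and taking abs() counts both speed-ups and slow-downs, as NN50 is conventionally defined.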

total number of features

The readme states that the total number of features is 169, but when I run the extractfeatures.m file I get 174 features.

how to access the paper [1]

Is the paper available on the internet? ("Comparing Feature Based Classifiers and Convolutional Neural Networks to Detect Arrhythmia from Short Segments of ECG")

About the test data

Can you provide test data to estimate the performance of the algorithm? And could you also email me the preprint of this paper? Thanks a lot.

Unable to train the net in matlab

Hi, I'm getting an error while running TrainClassifier.m during cross-validation loop 1. The error message follows:
"
NaN's cannot be converted to logicals.

Error in nntraining.fix_nan_inputs>iCopyNaN (line 134)
t{i}(isnan(yi)) = NaN;

Error in nntraining.fix_nan_inputs (line 57)
T = iCopyNaN(Y,T);

Error in nntraining.setup>setupPerWorker (line 104)
[X,Xi,Ai,T] = nntraining.fix_nan_inputs(net,X,Xi,Ai,T,Q,TS);

Error in nntraining.setup (line 43)
[net,data,tr,err] = setupPerWorker(net,trainFcn,X,Xi,Ai,T,EW,enableConfigure);

Error in network/train (line 335)
[net,data,tr,err] = nntraining.setup(net,net.trainFcn,X,Xi,Ai,T,EW,enableConfigure,isComposite);

Error in TrainClassifier (line 123)
net = train(net,In(trainidx,:)',Outbi(trainidx,:)');
"

Error loading the model

When I try to load the model, I get this error:

UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe7 in position 30: invalid continuation byte

I have already tried several solutions, but all of them give the same error. Could you help me solve this?

Error loading model

When I use your model to predict on the data, I get the following warning:
Error in loading the saved optimizer state. As a result, your model is starting with a freshly initialized optimizer.
How can I solve it? Also, once the training set is downloaded, how do I use your train.py for training?

maxpooling at x1

Hi,
In the second convolutional block of train_model.py, line 208 reads:
x1 = MaxPooling1D(pool_size=poolsize, strides=poolstr)(x1)
followed by the max pooling on the shortcut path. But according to Fig. 2 of the Rajpurkar et al. paper, this max pooling is not part of the architecture. Could you let me know how it would affect training?
Iman

HRbpm

Hi,
Thanks for the amazing toolbox you prepared.
I assume HRbpm in ExtractFeatures should be the heart rate in bpm? But in the code:
HRbpm = median(60./(diff(qrsseg{end})));
the sampling frequency seems to be missing.
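If the QRS locations are stored in samples rather than seconds, the sampling frequency is indeed needed to get beats per minute. A minimal sketch in Python (the R-peak positions below are hypothetical; the CinC 2017 recordings are sampled at 300 Hz):

```python
import numpy as np

fs = 300.0                           # CinC 2017 sampling frequency, in Hz
qrs = np.array([10, 250, 490, 730])  # hypothetical R-peak locations, in samples

rr_s = np.diff(qrs) / fs             # RR intervals converted to seconds
hr_bpm = np.median(60.0 / rr_s)      # median heart rate in beats per minute
```

If qrsseg already holds times in seconds, the original expression is correct as written; the question only arises when the entries are sample indices.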

dimensionality mismatch

I ran the ExtractFeatures() program. It created the file allfeats_olap0.8_win10.mat containing:

allfeats: 96229×173
references: 8528×2

allfeats should contain 8528×173 entries, but it contains 96229×173 — a dimension mismatch between the feature samples and the labels. Could you please explain why?
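The filename allfeats_olap0.8_win10.mat suggests features are extracted per 10 s window with 80% overlap, so each record contributes several feature rows while the labels remain one per record. The window count per record can be sketched as follows (the record length is a hypothetical example):

```python
import math

fs = 300            # CinC 2017 sampling frequency, in Hz
L = 60 * fs         # hypothetical 60 s record, in samples
win = 10 * fs       # 10 s analysis window
step = int(win * (1 - 0.8))  # 80% overlap -> 2 s hop

# Number of full windows that fit in the record.
n_windows = math.floor((L - win) / step) + 1
```

Summed over all 8528 records, such per-window rows would plausibly explain 96229 feature rows against 8528 labels; the rows would need to be aggregated (or the labels repeated per window) before training.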

About ResNet_30s_34lay_16conv.hdf5

Hi,
I have implemented your DL method and used the ResNet_30s_34lay_16conv.hdf5 model to classify all 8528 training records, getting a mean F1 of 0.6117. But your results indicate the model should reach F1 0.92 on the training data. I am confused about the difference.
