
npd's People

Contributors

wincle


npd's Issues

Where is the feaId from?

Hi,
I would like to convert the Matlab Frontal.mat into a binary file. First I checked the LoadModel code in the C++ file, but I couldn't figure out how the feaId variable is obtained in this line:
fread(_feaId,sizeof(int),numBranchNodes,file);
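
In case it helps while porting, here is a minimal sketch of the read side, assuming the binary model simply stores a count followed by the feaId array; the real NPD model has more fields (stage thresholds, tree structure, cutpoints), so the actual order has to be taken from LoadModel itself, and the corresponding MATLAB export would fwrite the same fields in the same order.

    // Minimal sketch of reading an int array written with fwrite, assuming the
    // file stores a count followed by the feaId values. The real NPD model
    // layout may differ; this only illustrates how
    // fread(_feaId, sizeof(int), numBranchNodes, file) expects the data to be laid out.
    #include <cstdio>
    #include <vector>

    int main() {
        FILE *file = fopen("model.bin", "rb");          // hypothetical file name
        if (!file) return 1;

        int numBranchNodes = 0;
        fread(&numBranchNodes, sizeof(int), 1, file);   // count written first
        if (numBranchNodes <= 0) { fclose(file); return 1; }

        std::vector<int> feaId(numBranchNodes);
        fread(feaId.data(), sizeof(int), numBranchNodes, file);  // then the array

        fclose(file);
        return 0;
    }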

Question about the code

What is feaIds used for in the code?
feaIds.push_back(feaId[i]);
leftChilds.push_back(leftChild[i]);
rightChilds.push_back(rightChild[i]);
cutpoints.push_back(cutpoint[i << 1]);
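
For context, a sketch of how feaIds, leftChilds, rightChilds and cutpoints typically work together in an NPD-style tree: feaId selects a pixel pair, the normalized pixel difference of that pair is compared against the node's two cutpoints, and the result decides which child to descend to. The member names, the leaf encoding and the exact split rule below are assumptions based on the NPD paper, not code copied from this repository.

    // Sketch of what a branch node's feaId is typically used for in NPD-style
    // detectors. Names such as points1_/points2_ are illustrative only.
    #include <cstdint>
    #include <vector>

    struct NpdTreeSketch {
        std::vector<int> feaIds, leftChilds, rightChilds;
        std::vector<uint8_t> cutpoints;          // two thresholds per branch node
        std::vector<int> points1_, points2_;     // pixel offsets per feature id

        // Normalized pixel difference (x - y) / (x + y), remapped to [0, 255].
        static uint8_t NpdValue(int a, int b) {
            if (a == 0 && b == 0) return 128;
            double f = 0.5 * (a - b) / static_cast<double>(a + b) + 0.5;
            return static_cast<uint8_t>(f * 255.0 + 0.5);
        }

        // Walk the tree for one window given its pixel values.
        int Descend(const std::vector<uint8_t>& pixels, int node) const {
            while (node >= 0) {                           // leaves assumed negative
                int fid = feaIds[node];
                uint8_t v = NpdValue(pixels[points1_[fid]], pixels[points2_[fid]]);
                bool goLeft = v < cutpoints[2 * node] || v > cutpoints[2 * node + 1];
                node = goLeft ? leftChilds[node] : rightChilds[node];
            }
            return node;  // encoded leaf index
        }
    };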

about NMS

The OpenCV function Mat::eye is very memory-hungry here: when numCandidates is 200k, the predicate matrix needs 200000*200000 = 40,000,000,000 entries, nearly 40 GB of memory.
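
For what it's worth, a memory-light alternative is a greedy, score-sorted NMS that only keeps an O(N) suppressed flag instead of the N*N predicate matrix. This is not the grouping scheme the repository's Nms implements, just a sketch of how to avoid the quadratic allocation.

    // Greedy NMS with O(N) extra memory instead of an N x N predicate matrix.
    #include <algorithm>
    #include <numeric>
    #include <vector>
    #include <opencv2/core.hpp>

    std::vector<cv::Rect> GreedyNms(const std::vector<cv::Rect>& rects,
                                    const std::vector<float>& scores,
                                    float overlapThresh = 0.5f) {
        std::vector<int> order(rects.size());
        std::iota(order.begin(), order.end(), 0);
        std::sort(order.begin(), order.end(),
                  [&](int a, int b) { return scores[a] > scores[b]; });

        std::vector<char> suppressed(rects.size(), 0);
        std::vector<cv::Rect> kept;
        for (size_t i = 0; i < order.size(); ++i) {
            int a = order[i];
            if (suppressed[a]) continue;
            kept.push_back(rects[a]);
            for (size_t j = i + 1; j < order.size(); ++j) {
                int b = order[j];
                if (suppressed[b]) continue;
                float inter = (rects[a] & rects[b]).area();   // intersection area
                float uni = rects[a].area() + rects[b].area() - inter;
                if (uni > 0 && inter / uni > overlapThresh) suppressed[b] = 1;
            }
        }
        return kept;
    }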

frontal model

Hi,
Can you also provide a dump of the 72-stage model (model_frontal.mat) from the Matlab code, or the code you used to create your 1226model file?
Thanks!

Which training parameters should be modified?

Hello, the model I trained performs poorly. How many negative images and hard negative images did you use, and what is the negative/positive ratio? Which training parameters were modified for your 620model? Thank you very much!

Is it possible to offer a usage example?

Hello there, I have already run the make process and generated the demo binary in the detection folder. However, I don't know how to run NPD on a single image. Could you give an example of how to run NPD on NPD/detection/1.jpg?
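
While waiting for an official example, a single-image driver would look roughly like the sketch below; GAB, LoadModel and DetectFace are hypothetical names standing in for whatever the detection code in this repository actually exposes, and only the OpenCV calls are real APIs.

    // Hypothetical single-image driver. "GAB", "LoadModel" and "DetectFace"
    // are placeholders for the repository's actual detection API.
    #include <opencv2/opencv.hpp>
    #include <vector>
    // #include "LearnGAB.h"   // detector class from this repo (header name assumed)

    int main() {
        cv::Mat img = cv::imread("NPD/detection/1.jpg", cv::IMREAD_GRAYSCALE);
        if (img.empty()) return 1;

        // GAB detector;                          // hypothetical detector object
        // detector.LoadModel("620model.bin");    // hypothetical model file name
        std::vector<cv::Rect> faces;
        std::vector<float> scores;
        // detector.DetectFace(img, faces, scores);  // hypothetical call

        cv::Mat vis;
        cv::cvtColor(img, vis, cv::COLOR_GRAY2BGR);
        for (const cv::Rect& r : faces)
            cv::rectangle(vis, r, cv::Scalar(0, 255, 0), 2);  // draw detections
        cv::imwrite("result.jpg", vis);
        return 0;
    }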

MiningNeg throws an error

After a number of iterations an OpenCV Error always appears. After checking, the error is in the MiningNeg function: NegImgs[i] contains no image data, so taking the ROI fails. I have checked every image in the negative sample database and they can all be read and written normally. Has anyone else run into this problem?
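
Not a fix for the root cause, but a defensive guard like the sketch below would at least report which NegImgs[i] comes back empty (and clip the ROI) instead of letting OpenCV assert; apart from NegImgs, the names and surrounding loop are assumed.

    // Defensive sketch: skip and report empty negative images before cropping
    // a ROI from them. The structure is assumed, not copied from LearnGAB.cpp.
    #include <cstdio>
    #include <vector>
    #include <opencv2/core.hpp>

    cv::Mat SafeCrop(const std::vector<cv::Mat>& NegImgs, int i, cv::Rect roi) {
        if (i < 0 || i >= static_cast<int>(NegImgs.size()) || NegImgs[i].empty()) {
            std::fprintf(stderr, "NegImgs[%d] is empty or out of range\n", i);
            return cv::Mat();
        }
        // Clip the ROI to the image so a slightly-off rectangle cannot assert.
        roi &= cv::Rect(0, 0, NegImgs[i].cols, NegImgs[i].rows);
        if (roi.area() == 0) return cv::Mat();
        return NegImgs[i](roi).clone();
    }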

LearnGAB.cpp bug

LearnGAB.cpp lines 276 and 278:

int y0 = max(rects[idx].y - floor(3.0 * delta),0);

int x0 = max(rects[idx].x + floor(0.25 * delta),0);

Why is the y0 offset 3 times delta, while the x0 offset is only 0.25 times delta?

Shouldn't the 3 be 0.3?
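
For experimenting, the two lines can be reproduced with the padding factors pulled out as parameters, so 3.0 versus 0.3 is a one-character change; the y1/x1 part is only schematic, since lines 277/279 are not quoted here, and this sketch takes no position on which value is intended.

    // Crop-window arithmetic with the padding factors made explicit.
    #include <algorithm>
    #include <cmath>
    #include <opencv2/core.hpp>

    cv::Rect PadCrop(const cv::Rect& r, double delta,
                     double padTop, double padLeft, const cv::Size& imgSize) {
        int y0 = std::max(static_cast<int>(r.y - std::floor(padTop * delta)), 0);
        int x0 = std::max(static_cast<int>(r.x + std::floor(padLeft * delta)), 0);
        // The matching y1/x1 lines clamp against the image size; schematic only.
        int y1 = std::min(y0 + r.height, imgSize.height);
        int x1 = std::min(x0 + r.width, imgSize.width);
        return cv::Rect(x0, y0, x1 - x0, y1 - y0);
    }
    // PadCrop(rect, delta, 3.0, 0.25, img.size()) mirrors the current code;
    // PadCrop(rect, delta, 0.3, 0.25, img.size()) tests the suggested fix.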

num of negative dataset

negRatio = 0.5 means that when loading the negative dataset we only load half the size of the positive dataset. I used 24142 face images, actually 241420 samples for training, and the loaded negative set size is 120710. nNeg kept declining. In LearnGAB.cpp, void GAB::MiningNeg(int n, DataSet& neg) cannot mine any more negatives, so the program got stuck when I had trained 199 stages.
I am confused: shouldn't the negative dataset generally be much larger than the positive dataset? I tried changing negRatio to 2 or 3 to load more negatives, but it crashes at runtime. What's wrong?
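
For reference, the quoted sizes are consistent with the loader taking nNeg = negRatio * nPos; a trivial check (names are illustrative, not the repository's):

    // Quick arithmetic check of the sizes quoted above.
    #include <cstdio>

    int main() {
        const int nPos = 241420;           // training samples after augmentation
        const double negRatio = 0.5;
        const int nNeg = static_cast<int>(negRatio * nPos);
        std::printf("nPos = %d, nNeg = %d\n", nPos, nNeg);  // prints nNeg = 120710
        return 0;
    }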

About detection speed?

Hello, and thanks for providing the C++ version of the code. While using it I found detection to be very slow. My machine is an i5-6500 at 3.2 GHz with 8 GB of RAM; detecting on a 640*480 image (containing only one face) takes nearly 4.5 s, with a minimum face size of 80 and the 620model. Is this speed normal?
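
Before comparing against other machines, it may help to time the detection call itself, for example with OpenCV's tick counter; detector.DetectFace below is a hypothetical placeholder for whatever entry point the demo actually calls.

    // Minimal timing sketch around a detection call.
    #include <cstdio>
    #include <opencv2/opencv.hpp>

    int main() {
        cv::Mat img = cv::imread("test_640x480.jpg", cv::IMREAD_GRAYSCALE);  // placeholder
        if (img.empty()) return 1;

        const double t0 = static_cast<double>(cv::getTickCount());
        // detector.DetectFace(img, rects, scores);   // hypothetical call being timed
        const double t1 = static_cast<double>(cv::getTickCount());

        std::printf("detection took %.3f s\n", (t1 - t0) / cv::getTickFrequency());
        return 0;
    }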

Coefficients when using the Matlab model

As you mentioned, 'Because the difference between matlab and OpenCV. You should also change the coefficient in detection/LearnGAB.cpp:276~279 to fit the model.' If I keep the parameters as they are, I get somewhat strange outputs. Do you have any hints on how to set those parameters if I want to use a model trained with Matlab? Thanks.

Cannot detect any faces

With the window set to 32, no faces are detected;
with a 24*24 window, the Nms function crashes:
int numCandidates = rects.size();
Mat predicate = Mat::eye(numCandidates,numCandidates,IPL_DEPTH_1U); // crashes here once width/height exceed 31000
The width/height is too large, i.e. there are too many rects (a six-digit number when debugging).

The variable name FAR conflicts with a predefined macro

The variable name FAR collides with the legacy FAR macro from the Windows headers; I compiled it in VS and got errors.
By the way, the training method in the paper is a soft cascade, while your implementation uses Gentle AdaBoost?
Are there any test results?
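
Regarding the FAR name clash: on MSVC the Windows headers (windef.h) define FAR as a legacy macro, so a variable with that name breaks once <windows.h> is pulled in (e.g. via OpenCV). A minimal workaround sketch, assuming renaming the variable (the cleaner fix) is not wanted:

    // Undefine the legacy FAR macro after including the Windows headers so the
    // identifier becomes usable again; harmless on non-Windows builds.
    #ifdef _WIN32
    #include <windows.h>
    #ifdef FAR
    #undef FAR
    #endif
    #endif

    #include <cstdio>

    int main() {
        double FAR = 1e-6;  // false-accept rate; compiles once the macro is gone
        std::printf("FAR = %g\n", FAR);
        return 0;
    }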

HOW TO TRAIN?

Hi, @wincle ,
In the data folder, I created two files named FaceDB.txt and NonFaceDB.txt. Everything is done.
When I run ./demo, it prints:

[daniel@localhost NPD-c]$ ./train/demo 
NPD

train:  train a model ,if you already have, will resume it

and

[daniel@localhost NPD-c]$ ./train/demo train
Loading Pos data
Segmentation fault (core dumped)

How do I train on the data? Could you give me some help? Thank you.
