
Matlab Implementation of Supervised Descent Method

A simple Matlab implementation of the Supervised Descent Method (SDM) for face alignment.

Both training and testing modules are provided, along with a model trained on the LFPW subset of the 300-W dataset.

The implementation follows the original paper:

X. Xiong and F. De la Torre, Supervised Descent Method and its Applications to Face Alignment, CVPR 2013.

===========================================================================

Dependencies: libLinear, Vlfeat, mexopencv (installation details in step 2 below).

Dataset in use:

[300-W] http://ibug.doc.ic.ac.uk/resources/facial-point-annotations/

How to use:

  1. Download the 300-W data (e.g. LFPW) from the link above and put it into the "./data" folder, then point the dataset paths in setup.m to your dataset folder.

    mkdir -p data

    For example:

    options.trainingImageDataPath = './data/lfpw/trainset/';

    options.trainingTruthDataPath = './data/lfpw/trainset/';

    options.testingImageDataPath = './data/lfpw/testset/';

    options.testingTruthDataPath = './data/lfpw/testset/';
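The folder layout those options expect can be prepared as below. The archive name "lfpw.zip" is an assumption; use whatever file the 300-W page actually provides.

```shell
# Create the folder layout that the setup.m options above point to.
mkdir -p data/lfpw/trainset data/lfpw/testset

# After downloading the LFPW archive from the 300-W page, extract it here
# (archive name is hypothetical -- adjust to the actual file):
#   unzip lfpw.zip -d data/lfpw/

ls data/lfpw   # lists trainset and testset
```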

  2. Download and install the dependencies libLinear, Vlfeat, and mexopencv, put them into the "./lib" folder, and compile where necessary. Make sure you addpath(...) all the relevant folders in Matlab, and check/correct the library paths in setup.m.

    mkdir -p lib

    libLinear:

    • Open Matlab.
    • Go to the libLinear Matlab folder, e.g. lib/liblinear-1.96/matlab/, in the Matlab editor.
    • Run make.m to compile the *.mex files.

    Vlfeat:

    • Build the library: cd lib/vlfeat/ && make
    • In Matlab, cd into ./toolbox and run vl_setup.
    • Compile the mex HOG function:

      cd misc
      mex -L../../bin/glnx86 -lvl -I../ -I../../ vl_hog.c

    • Set up the libvl.so path. Assuming your libvl.so is located at <vlfeat_folder>/bin/glnx86, create a soft link:

      ln -s <vlfeat_folder>/bin/glnx86/libvl.so /usr/local/lib/libvl.so

      Check whether libvl.so is found:

      ldd vl_hog.mexglx

      If libvl.so is still not found, add /usr/local/lib to /etc/ld.so.conf (with sudo), then check again:

      sudo ldconfig
      ldconfig -p | grep libvl.so
      ldd vl_hog.mexglx
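Once the libraries are in "./lib", the addpath(...) calls mentioned above might look like the following sketch. The folder names (e.g. liblinear-1.96) are examples; match them to the versions you actually downloaded.

```matlab
% Sketch: add the library folders to the Matlab path
% (version numbers and folder names are assumptions).
addpath('./lib/liblinear-1.96/matlab/');   % libLinear mex files
addpath('./lib/mexopencv/');               % mexopencv
run('./lib/vlfeat/toolbox/vl_setup');      % Vlfeat setup script
```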

  3. If this is your first run, set the following parameters so the mean shape and shape variation are learned. For later runs, reset them to 0.

    options.learningShape = 1; options.learningVariation = 1;

  4. Do training:

    run_training();

  5. Do testing:

    do_testing();
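Putting steps 3 through 5 together, a first full session could look like the sketch below, assuming the learning flags are edited in setup.m as described in step 3.

```matlab
% First run: learn the mean shape and shape variation (set in setup.m).
options.learningShape     = 1;
options.learningVariation = 1;

run_training();   % trains the SDM cascade, writes the model to ./model
do_testing();     % evaluates on the test set

% Subsequent runs: reuse the learned shape/variation.
options.learningShape     = 0;
options.learningVariation = 0;
```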

Note: trained models for LFPW (68 landmarks) are provided in the "./model" folder. The program is not optimized for speed or memory during training, so memory problems may occur if you train on too much data.

