
lpo's People

Contributors

philkr, rodrigob


lpo's Issues

eval_box raises exception with demonstration models

Following the readme steps, running eval_box.py also raises an exception.

python eval_box.py test.txt ../models/lpo_VOC_0.1.dat
Traceback (most recent call last):
  File "eval_box.py", line 57, in <module>
    bo,pool_s = evaluateBox( prop, over_segs, boxes, name='(tst)' )
  File "eval_box.py", line 43, in evaluateBox
    print( "LPO %05s & %d & %0.3f & %0.3f & %0.3f & %0.3f & %0.3f \\\\"%(name,np.nanmean(pool_s),np.mean(bo),np.mean(bo>=0.5), np.mean(bo>=0.7), np.mean(bo>=0.9), np.mean(2*np.maximum(bo-0.5,0)) ) )
UnboundLocalError: local variable 'pool_s' referenced before assignment

Both bo and pool_s are dangerously left uninitialized on some code paths.

I guess something else is off, but I cannot quite figure out what, since none of the scripts seem to work as expected.
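A hypothetical sketch (not the actual eval_box.py code) of the defensive pattern that avoids this class of failure: initialize the accumulators before the loop, so the summary line at the end cannot raise UnboundLocalError when the loop body never runs (e.g. because no proposals were loaded).

```python
import numpy as np

def collect_scores(per_image_scores):
    # Initialize accumulators up front; if `per_image_scores` is empty,
    # both names still exist when we return (no UnboundLocalError).
    bo = np.zeros(0)      # best-overlap values, one per ground-truth object
    pool_s = np.zeros(0)  # proposal pool size per image
    for s in per_image_scores:
        s = np.atleast_1d(s)
        bo = np.concatenate([bo, s])
        pool_s = np.concatenate([pool_s, [s.size]])
    return bo, pool_s
```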

ImportError: No module named python.lpo

I followed the install pipeline with no errors. But when I finally ran the command 'bash eval_all.sh', the console printed "ImportError: No module named pylab". I then changed all python3 to python in the eval_all.sh script. That removed the previous error, but the console printed a new one: "ImportError: No module named python.lpo". I have no idea what to try next and hope you can help me. Thanks.

ImportError: dynamic module does not define init function

I followed the instructions in the readme and compiled with -DUSE_PYTHON=2. When I run bash eval_all.sh I get this error:

Traceback (most recent call last):
  File "train_lpo.py", line 31, in <module>
    from lpo import *
  File "/home/revathy/lpo-release/src/lpo.py", line 45, in <module>
    from python.lpo import *
ImportError: dynamic module does not define init function (PyInit_lpo)

I hope you can help me. Thanks.
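PyInit_lpo in the message is the Python 3 entry point, so the extension on disk was most likely built for Python 3 while the script ran under Python 2. A plausible remedy (not confirmed by the maintainer; paths illustrative) is to clear the cached CMake state and rebuild for the interpreter you actually use:

```shell
# Reconfigure from a clean state so no stale Python-3 objects are linked.
# Use -DUSE_PYTHON=2 with python2, or -DUSE_PYTHON=3 with python3.
cd build
rm -rf CMakeCache.txt CMakeFiles
cmake .. -DCMAKE_BUILD_TYPE=Release -DUSE_PYTHON=2
make -j4
```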

Typos in CMakeLists.txt

Hi, @philkr
I think there is a typo in matlab/CMakeLists.txt:

add_library( gop_mex SHARED gop_mex.cpp )
target_link_libraries( gop_mex util imgproc learning contour segmentation proposals gomp )

should be

add_library( lpo_mex SHARED lpo_mex.cpp )
target_link_libraries( lpo_mex util imgproc learning contour segmentation proposals gomp )

You do not have gop_mex.cpp in this project, do you?

How to fix 'File '/path/to/datasets/VOC2012/ImageSets/Segmentation/val.txt' not found'?

When I execute ./sam_eval_all.sh (which just changes the scripts to use python2.7), it shows:

sam@sam-desktop:~/code/download/Segmentation/lpo/src$ ./sam_eval_all.sh
File '/path/to/datasets/VOC2012/ImageSets/Segmentation/val.txt' not found! Check if DATA_DIR is set properly.
Traceback (most recent call last):
  File "train_lpo.py", line 139, in <module>
    over_segs,segmentations,boxes,names = loadVOCAndOverSeg( 'test', detector='mssf' )
  File "/home/sam/code/download/Segmentation/lpo/src/util.py", line 98, in loadVOCAndOverSeg
    return loadAndOverSegDataset( lambda: ldr(im_set=="train",im_set=="valid",im_set=="test"), "VOC%s_%s"%(year,im_set), detector=detector, N_SPIX=N_SPIX )
  File "/home/sam/code/download/Segmentation/lpo/src/util.py", line 70, in loadAndOverSegDataset
    data = loader()
  File "/home/sam/code/download/Segmentation/lpo/src/util.py", line 98, in <lambda>
    return loadAndOverSegDataset( lambda: ldr(im_set=="train",im_set=="valid",im_set=="test"), "VOC%s_%s"%(year,im_set), detector=detector, N_SPIX=N_SPIX )
ValueError: Failed to load dataset

(The same error and traceback repeat for each of the six train_lpo.py invocations in the script.)

sam@sam-desktop:~/code/download/Segmentation/lpo/src$ cat ./sam_eval_all.sh

# This script reproduces table 3 in the paper

python train_lpo.py -f0 0.2 ../models/lpo_VOC_0.2.dat
python train_lpo.py -f0 0.1 ../models/lpo_VOC_0.1.dat
python train_lpo.py -f0 0.05 ../models/lpo_VOC_0.05.dat
python train_lpo.py -f0 0.03 ../models/lpo_VOC_0.03.dat
python train_lpo.py -f0 0.02 ../models/lpo_VOC_0.02.dat
python train_lpo.py -f0 0.01 ../models/lpo_VOC_0.01.dat -iou 0.925 # Increase the IoU a bit to make sure the number of proposals match
sam@sam-desktop:~/code/download/Segmentation/lpo/src$

How to solve it? Thank you~
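The literal placeholder path in the error suggests DATA_DIR was never changed from its default. The OS X report elsewhere in this thread passes DATA_DIR at configure time; a sketch along those lines (the dataset path here is illustrative):

```shell
# DATA_DIR must point at the directory that CONTAINS VOC2012/, so that
# $DATA_DIR/VOC2012/ImageSets/Segmentation/val.txt exists on disk.
cmake .. -DCMAKE_BUILD_TYPE=Release -DUSE_PYTHON=2 -DDATA_DIR=/home/sam/datasets
make -j4
```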

Simple python example

It's great that you made easy-to-use code to evaluate on multiple datasets. However, I'm having a little trouble evaluating on individual images for my own application.

Could you make a simple python example showing how to use this library for evaluating on an individual image?

Thanks!

matlab cmakelists GOP is included

Hi,

Does this library require the GOP library? I ask because the CMakeLists.txt in the matlab folder contains:

add_library( gop_mex SHARED gop_mex.cpp )
target_link_libraries( gop_mex util imgproc learning contour segmentation proposals gomp )

Question on models in lpo

    Hi! The paper mentions that models are trained for both PASCAL VOC and COCO, since COCO objects tend to cover only a small part of the image and a different model is needed to segment such small regions. In the data folder there are files with VOC in the name. Are the COCO models already included in this library too, or do we need to train on COCO to generate them?

    Segmentation Fault on OS X

    On 10.10.3, with Python 2.7.10 and boost and boost-python 1.58.0. Built using:

    cmake .. -DCMAKE_BUILD_TYPE=Release -DDATA_DIR=~/test_images/coco-master/images/val2014 -DUSE_PYTHON=2
    make -j9

    Saw some warnings only, such as:
    In file included from /Users/peterwang/CPP_Resources/lpo-release/lib/crf/crf.cpp:31:
    /Users/peterwang/CPP_Resources/lpo-release/external/ibfs/ibfs.h:161:2: warning: 'Node' defined as a class here but previously declared as a struct
    [-Wmismatched-tags]
    class Node
    ^
    /Users/peterwang/CPP_Resources/lpo-release/external/ibfs/ibfs.h:150:2: note: did you mean class here?
    struct Node;
    ^~~~~~
    class

    Tried:

    python train_lpo.py -f0 0.2 ../models/lpo_VOC_0.2.dat
    and got: Segmentation fault: 11
    This appears to have crashed on "from python.lpo import *" in lpo.py.

    Crash report contained:
    ...
    Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
    0 ??? 000000000000000000 0 + 0
    1 org.python.python 0x0000000103d150dd PyEval_GetGlobals + 23
    2 org.python.python 0x0000000103d2462b PyImport_Import + 137
    3 org.python.python 0x0000000103d22d27 PyImport_ImportModule + 31
    4 lpo.so 0x00000001033f45a3 init_numpy() + 19
    5 lpo.so 0x00000001033f4779 defineUtil() + 25
    6 lpo.so 0x00000001033f4499 init_module_lpo() + 9
    7 libboost_python-mt.dylib 0x0000000103c36391 boost::python::handle_exception_impl(boost::function0<void>) + 81
    8 libboost_python-mt.dylib 0x0000000103c373b9 boost::python::detail::init_module(char const*, void (*)()) + 121
    9 org.python.python 0x0000000101836327 _PyImport_LoadDynamicModule + 140
    ...

    Saw the note in external/boost/readme.txt:
    "In order to use a non-system boost library copy the "boost" and "libs" directory of a recent boost release (eg 1.57) here."

    And in build/lib/python/CMakeFiles/lpo.dir/depend.make:
    ...
    lib/python/CMakeFiles/lpo.dir/boost.cpp.o: /usr/local/include/boost/array.hpp
    lib/python/CMakeFiles/lpo.dir/boost.cpp.o: /usr/local/include/boost/assert.hpp
    lib/python/CMakeFiles/lpo.dir/boost.cpp.o: /usr/local/include/boost/bind.hpp
    ...

    These seem to suggest the seg fault was due to boost version mismatch?

    Is it sufficient to just do:

    ln -s /usr/local/Cellar/boost/1.58.0 external/boost/
    ln -s /usr/local/Cellar/boost/1.58.0/lib external/boost/libs

    Or something else?

    BTW, boost and boost-python were installed as part of setting up Caffe. The Caffe ImageNet model ran successfully when invoked from a Python test app.

    Thanks for any light you could help shed.

    error when make: ‘sleep_for’ is not a member of ‘std::this_thread’

    [ 26%] Building CXX object lib/util/CMakeFiles/util.dir/geodesics.cpp.o
    In file included from /media/dat1/liao/lpo/lib/util/threading.cpp:27:0:
    /media/dat1/liao/lpo/lib/util/threading.h: In member function ‘void ThreadedQueue<T>::process(ThreadedQueue<T>::F, const std::vector<T>&)’:
    /media/dat1/liao/lpo/lib/util/threading.h:188:5: error: ‘sleep_for’ is not a member of ‘std::this_thread’
    make[2]: *** [lib/util/CMakeFiles/util.dir/threading.cpp.o] Error 1
    make[2]: *** Waiting for unfinished jobs....
    In file included from /media/dat1/liao/lpo/lib/util/algorithm.h:32:0,
    from /media/dat1/liao/lpo/lib/util/algorithm.cpp:27:
    /media/dat1/liao/lpo/lib/util/threading.h: In member function ‘void ThreadedQueue<T>::process(ThreadedQueue<T>::F, const std::vector<_RealType>&)’:
    /media/dat1/liao/lpo/lib/util/threading.h:188:5: error: ‘sleep_for’ is not a member of ‘std::this_thread’
    make[2]: *** [lib/util/CMakeFiles/util.dir/algorithm.cpp.o] Error 1
    make[1]: *** [lib/util/CMakeFiles/util.dir/all] Error 2
    make: *** [all] Error 2
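    A workaround commonly reported for this symptom on GCC 4.7 and earlier: libstdc++ only exposes std::this_thread::sleep_for when _GLIBCXX_USE_NANOSLEEP is defined, so either upgrade to gcc >= 4.8 or define the macro at configure time. A sketch, not verified against this project's CMakeLists:

```shell
# Define the libstdc++ feature macros that unhide sleep_for / yield
# on pre-4.8 GCC, then rebuild.
cmake .. -DCMAKE_BUILD_TYPE=Release \
         -DCMAKE_CXX_FLAGS="-D_GLIBCXX_USE_NANOSLEEP -D_GLIBCXX_USE_SCHED_YIELD"
make
```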

    error when training lpo for bounding box performance

    When I tried to train LPO on the COCO dataset using the command:
    python train_lpo.py ../models/lpo_COCO_0.02.dat -b -t -f0 0.02
    I get the following error:
    Traceback (most recent call last):
      File "train_lpo.py", line 121, in <module>
        boxes = [proposals.Proposals(s,np.eye(np.max(s)+1).astype(bool)).toBoxes() for s in segmentations]
      File "/home/anaconda/lib/python2.7/site-packages/numpy/core/fromnumeric.py", line 2135, in amax
        out=out, keepdims=keepdims)
      File "/home/anaconda/lib/python2.7/site-packages/numpy/core/_methods.py", line 26, in _amax
        return umr_maximum(a, axis, None, out, keepdims)
    ValueError: operands could not be broadcast together with shapes (10,2) (5,2)

    Without the -b argument, I was able to successfully train LPO on the COCO dataset. I'm using Mac OS X 10.10.3.

    Simple C++ example of process one image

    Hello. I was very excited by your previous approach, Geodesic Object Proposals, where you provide a good example of using the library from C++ code. Since the two libraries are very similar, could you share a comparable C++ snippet for this one?

    Thank you.

    analyze_model.py raises division by zero (with demonstration models)

    When running analyze_model.py (with Python 2.7) over the demonstration model, I get a division by zero.

    python analyze_model.py ../models/lpo_VOC_0.1.dat 
    /home/rodrigob/.local/lib/python2.7/site-packages/numpy/core/_methods.py:55: RuntimeWarning: Mean of empty slice.
      warnings.warn("Mean of empty slice.", RuntimeWarning)
    /home/rodrigob/.local/lib/python2.7/site-packages/numpy/core/_methods.py:67: RuntimeWarning: invalid value encountered in double_scalars
      ret = ret.dtype.type(ret / rcount)
    Traceback (most recent call last):
      File "analyze_model.py", line 115, in <module>
        evaluateDetailed( prop, over_segs, segmentations )
      File "analyze_model.py", line 101, in evaluateDetailed
        print( names[m], '&', np.mean(ps[m]), '&', np.mean(bo[m]>=bbo)*100, '&', np.sqrt(np.mean(ma[m])), '&', t[m]/len(ma[m]) )
    ZeroDivisionError: integer division or modulo by zero
    

    Am I not reading this right from the help and readme?

    error building the program

    When I was building the program, I got an error like :

    collect2: error: ld returned 1 exit status

    examples/CMakeFiles/example.dir/build.make:97: recipe for target 'examples/example' failed
    make[2]: *** [examples/example] Error 1
    CMakeFiles/Makefile2:729: recipe for target 'examples/CMakeFiles/example.dir/all' failed
    make[1]: *** [examples/CMakeFiles/example.dir/all] Error 2
    [ 97%] Built target gop
    Makefile:116: recipe for target 'all' failed
    make: *** [all] Error 2

    How to resolve my problem?
    Thank you very much!

    GCC requirement

    It seems gcc 4.8 is required to support the vectorized array subscripts. With gcc 4.7 I'm getting:

    lpo/lib/imgproc/color.cpp:77:31: error: invalid types ‘__m128 {aka __vector(4) float}[int]’ for array subscript

    Should the README be updated, or am I doing something wrong?

    run compile show "error"

    On Win8 64-bit, I run compile.m and modify the build line to
    cmd = ['mex -DLBFGS_FLOAT=32 -DEIGEN_DONT_PARALLELIZE -DNO_IMREAD -I../lib -I',EIGEN_DIR,' -I../external/liblbfgs-1.10/include/ lpo_mex.cpp ',' ../external/liblbfgs-1.10/lib/lbfgs.c ', all_files]
    but it still fails to build lpo_mex.cpp. Can you reproduce this bug?

    How to fix 'ValueError: Unknown model type 'GlobalCRFModel' problem?

    Hello, thanks for sharing the code!
    I compiled it successfully with Python 2.7 and changed python3 to python in the eval_all.sh file.
    When I run eval_all.sh, it shows:

    Traceback (most recent call last):
      File "train_lpo.py", line 129, in <module>
        prop.load(save_name)
    ValueError: Unknown model type 'GlobalCRFModel'!

    (The same traceback is printed six times, once per train_lpo.py invocation.)
    sam@sam-desktop:~/code/download/Segmentation/lpo/src$

    How to solve it?
    Thank you very much~

    Trying to figure out the output...

    Hey,

    I've been trying to get a small bounding-box example working, based on your propose_hf5.py code.

    Here is my code:

    import matplotlib
    import matplotlib.pyplot as plt
    import lpo
    
    imgs = [ lpo.imgproc.imread('cat.jpg') ]
    
    prop = lpo.proposals.LPO()
    prop.load( 'dats/lpo_VOC_0.02.dat' )
    
    detector = lpo.contour.MultiScaleStructuredForest()
    detector.load( 'dats/sf.dat' )
    
    over_segs = lpo.segmentation.generateGeodesicKMeans( detector, imgs, 1000 )
    
    props = prop.propose( over_segs, 0.01, True )
    props = props[0][0]
    
    fig = plt.figure()
    ax = fig.add_subplot(1, 1, 1)
    ax.imshow(imgs[0])
    for bb in props.toBoxes():
        ax.add_patch(matplotlib.patches.Rectangle((bb[0],bb[1]),bb[2],bb[3], color='red', fill=False))
    plt.show()
    
    

    I end up with:
    [image: catboxes]

    If I play around with some of the parameters, I end up getting an enormous amount of proposal boxes.

    I was hoping somebody could provide some advice to help me get this working.

    How to solve 'global name 'FileNotFoundError' is not defined" ?

    Hello,
    I run sed -e 's:python3:python:g' eval_all.sh > sam_eval_all.sh
    And here is the output when executing sam_eval_all.sh:

    Traceback (most recent call last):
      File "train_lpo.py", line 137, in <module>
        over_segs,segmentations,boxes,names = loadVOCAndOverSeg( 'test', detector='mssf' )
      File "/home/sam/code/download/Segmentation/lpo/src/util.py", line 94, in loadVOCAndOverSeg
        return loadAndOverSegDataset( lambda: ldr(im_set=="train",im_set=="valid",im_set=="test"), "VOC%s_%s"%(year,im_set), detector=detector, N_SPIX=N_SPIX )
      File "/home/sam/code/download/Segmentation/lpo/src/util.py", line 52, in loadAndOverSegDataset
        except FileNotFoundError:
    NameError: global name 'FileNotFoundError' is not defined

    (The same traceback is printed for each of the six train_lpo.py invocations.)

    Thank you~
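    FileNotFoundError only exists on Python 3 (it was added in 3.3), so the except clause in util.py cannot work unchanged under Python 2. A minimal compatibility shim one could hypothetically place near the top of util.py, using IOError as the closest Python 2 equivalent:

```python
import sys

# Python 2 has no FileNotFoundError built-in; alias it to IOError so that
# `except FileNotFoundError:` clauses run unchanged on both interpreters.
if sys.version_info[0] < 3:
    FileNotFoundError = IOError

try:
    open('/nonexistent/path/for/this/demo')
except FileNotFoundError:
    handled = True
```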

    Error with CMake

    Hey, whenever I try to compile using CMake it throws an error; the output is below. Any help is appreciated.


    CMake Error at /usr/local/share/cmake-3.2/Modules/FindPackageHandleStandardArgs.cmake:138 (message):
    Could NOT find PythonInterp: Found unsuitable version "1.4", but required
    is at least "3.3" (found /usr/bin/python3)
    Call Stack (most recent call first):
    /usr/local/share/cmake-3.2/Modules/FindPackageHandleStandardArgs.cmake:372 (_FPHSA_FAILURE_MESSAGE)
    /usr/local/share/cmake-3.2/Modules/FindPythonInterp.cmake:162 (FIND_PACKAGE_HANDLE_STANDARD_ARGS)

    lib/python/CMakeLists.txt:10 (find_package)

    I'm running Python 3.4 and CMake 3.2.3.

    example please

    @philkr

    I got the code installed and working; I ran eval_all.sh and it seemed to run fine.

    But I am a bit lost after that. Specifically, I am looking for some simple steps on how to get this working for a custom dataset (say even VOC 2007), step by step.

    To get box proposals on our own datasets, I think we need the equivalent of the sf.dat and lpo_VOC_*.dat files as inputs. But how do we generate these two for our own datasets?

    When I ran train_lpo.py -f0 0.05 -t, it only created a VOC_2007_train_mssf_1000.dat in the /tmp folder, and I couldn't load it because I got an "out of memory" error.

    ImportError

    ImportError: /home/ksivakumar/lpo/build/lib/python/lpo.so: undefined symbol: png_set_longjmp_fn

    I am getting this error even though I'm using libpng 1.5, which does have support for png_set_longjmp_fn. Any ideas?

    Convert from ndarray

    Hi @philkr, and other lpo users,

    Is there any way, convenient or not, to convert an ndarray object to the Image8u type used in this library? Using the lpo.imgproc.imread function obviously reads a file from disk and provides the proper data type. I'm wondering if I can avoid reading from disk when the image is already in memory as an ndarray?

    Thanks
