
Comments (3)

myopengit commented on June 4, 2024

Here is the full output. Could this be because I'm using a newer version of Caffe?

WARNING: Logging before InitGoogleLogging() is written to STDERR
W0315 21:17:22.549293 6844 _caffe.cpp:122] DEPRECATION WARNING - deprecated use of Python interface
W0315 21:17:22.549314 6844 _caffe.cpp:123] Use this instead (with the named "weights" parameter):
W0315 21:17:22.549316 6844 caffe.cpp:125] Net('/home/c/code/ModelCNN/cnn_age_gender_models_and_data.0.0.2/deploy_age.prototxt', 1, weights='/home/c/code/ModelCNN/cnn_age_gender_models_and_data.0.0.2/age_net.caffemodel')
I0315 21:17:22.550670 6844 upgrade_proto.cpp:53] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/c/code/ModelCNN/cnn_age_gender_models_and_data.0.0.2/deploy_age.prototxt
I0315 21:17:22.550699 6844 upgrade_proto.cpp:61] Successfully upgraded file specified using deprecated V1LayerParameter
I0315 21:17:22.550709 6844 upgrade_proto.cpp:67] Attempting to upgrade input file specified using deprecated input fields: /home/c/code/ModelCNN/cnn_age_gender_models_and_data.0.0.2/deploy_age.prototxt
I0315 21:17:22.550714 6844 upgrade_proto.cpp:70] Successfully upgraded file specified using deprecated input fields.
W0315 21:17:22.550714 6844 upgrade_proto.cpp:72] Note that future Caffe releases will only support input layers and not input fields.
I0315 21:17:22.971917 6844 net.cpp:58] Initializing net from parameters:
name: "CaffeNet"
state {
phase: TEST
level: 0
}
layer {
name: "input"
type: "Input"
top: "data"
input_param {
shape {
dim: 1
dim: 3
dim: 227
dim: 227
}
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
convolution_param {
num_output: 96
kernel_size: 7
stride: 4
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "conv1"
top: "conv1"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "norm1"
type: "LRN"
bottom: "pool1"
top: "norm1"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "norm1"
top: "conv2"
convolution_param {
num_output: 256
pad: 2
kernel_size: 5
}
}
layer {
name: "relu2"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "norm2"
type: "LRN"
bottom: "pool2"
top: "norm2"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "conv3"
type: "Convolution"
bottom: "norm2"
top: "conv3"
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
}
}
layer {
name: "relu3"
type: "ReLU"
bottom: "conv3"
top: "conv3"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv3"
top: "pool5"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "fc6"
type: "InnerProduct"
bottom: "pool5"
top: "fc6"
inner_product_param {
num_output: 512
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "InnerProduct"
bottom: "fc6"
top: "fc7"
inner_product_param {
num_output: 512
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "fc7"
top: "fc7"
}
layer {
name: "drop7"
type: "Dropout"
bottom: "fc7"
top: "fc7"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc8"
type: "InnerProduct"
bottom: "fc7"
top: "fc8"
inner_product_param {
num_output: 8
}
}
layer {
name: "prob"
type: "Softmax"
bottom: "fc8"
top: "prob"
}
I0315 21:17:22.972645 6844 layer_factory.hpp:77] Creating layer input
I0315 21:17:22.972664 6844 net.cpp:100] Creating Layer input
I0315 21:17:22.972674 6844 net.cpp:408] input -> data
I0315 21:17:22.973042 6844 net.cpp:150] Setting up input
I0315 21:17:22.973152 6844 net.cpp:157] Top shape: 1 3 227 227 (154587)
I0315 21:17:22.973203 6844 net.cpp:165] Memory required for data: 618348
I0315 21:17:22.973243 6844 layer_factory.hpp:77] Creating layer conv1
I0315 21:17:22.973295 6844 net.cpp:100] Creating Layer conv1
I0315 21:17:22.973307 6844 net.cpp:434] conv1 <- data
I0315 21:17:22.973317 6844 net.cpp:408] conv1 -> conv1
I0315 21:17:34.799007 6844 net.cpp:150] Setting up conv1
I0315 21:17:34.799028 6844 net.cpp:157] Top shape: 1 96 56 56 (301056)
I0315 21:17:34.799031 6844 net.cpp:165] Memory required for data: 1822572
I0315 21:17:34.799041 6844 layer_factory.hpp:77] Creating layer relu1
I0315 21:17:34.799049 6844 net.cpp:100] Creating Layer relu1
I0315 21:17:34.799052 6844 net.cpp:434] relu1 <- conv1
I0315 21:17:34.799055 6844 net.cpp:395] relu1 -> conv1 (in-place)
I0315 21:17:34.799383 6844 net.cpp:150] Setting up relu1
I0315 21:17:34.799389 6844 net.cpp:157] Top shape: 1 96 56 56 (301056)
I0315 21:17:34.799391 6844 net.cpp:165] Memory required for data: 3026796
I0315 21:17:34.799393 6844 layer_factory.hpp:77] Creating layer pool1
I0315 21:17:34.799398 6844 net.cpp:100] Creating Layer pool1
I0315 21:17:34.799399 6844 net.cpp:434] pool1 <- conv1
I0315 21:17:34.799402 6844 net.cpp:408] pool1 -> pool1
I0315 21:17:34.799413 6844 net.cpp:150] Setting up pool1
I0315 21:17:34.799417 6844 net.cpp:157] Top shape: 1 96 28 28 (75264)
I0315 21:17:34.799418 6844 net.cpp:165] Memory required for data: 3327852
I0315 21:17:34.799419 6844 layer_factory.hpp:77] Creating layer norm1
I0315 21:17:34.799433 6844 net.cpp:100] Creating Layer norm1
I0315 21:17:34.799435 6844 net.cpp:434] norm1 <- pool1
I0315 21:17:34.799438 6844 net.cpp:408] norm1 -> norm1
I0315 21:17:34.799562 6844 net.cpp:150] Setting up norm1
I0315 21:17:34.799567 6844 net.cpp:157] Top shape: 1 96 28 28 (75264)
I0315 21:17:34.799568 6844 net.cpp:165] Memory required for data: 3628908
I0315 21:17:34.799571 6844 layer_factory.hpp:77] Creating layer conv2
I0315 21:17:34.799576 6844 net.cpp:100] Creating Layer conv2
I0315 21:17:34.799578 6844 net.cpp:434] conv2 <- norm1
I0315 21:17:34.799582 6844 net.cpp:408] conv2 -> conv2
I0315 21:17:34.801440 6844 net.cpp:150] Setting up conv2
I0315 21:17:34.801450 6844 net.cpp:157] Top shape: 1 256 28 28 (200704)
I0315 21:17:34.801451 6844 net.cpp:165] Memory required for data: 4431724
I0315 21:17:34.801457 6844 layer_factory.hpp:77] Creating layer relu2
I0315 21:17:34.801461 6844 net.cpp:100] Creating Layer relu2
I0315 21:17:34.801463 6844 net.cpp:434] relu2 <- conv2
I0315 21:17:34.801466 6844 net.cpp:395] relu2 -> conv2 (in-place)
I0315 21:17:34.801789 6844 net.cpp:150] Setting up relu2
I0315 21:17:34.801795 6844 net.cpp:157] Top shape: 1 256 28 28 (200704)
I0315 21:17:34.801797 6844 net.cpp:165] Memory required for data: 5234540
I0315 21:17:34.801800 6844 layer_factory.hpp:77] Creating layer pool2
I0315 21:17:34.801803 6844 net.cpp:100] Creating Layer pool2
I0315 21:17:34.801805 6844 net.cpp:434] pool2 <- conv2
I0315 21:17:34.801808 6844 net.cpp:408] pool2 -> pool2
I0315 21:17:34.801815 6844 net.cpp:150] Setting up pool2
I0315 21:17:34.801816 6844 net.cpp:157] Top shape: 1 256 14 14 (50176)
I0315 21:17:34.801818 6844 net.cpp:165] Memory required for data: 5435244
I0315 21:17:34.801820 6844 layer_factory.hpp:77] Creating layer norm2
I0315 21:17:34.801823 6844 net.cpp:100] Creating Layer norm2
I0315 21:17:34.801826 6844 net.cpp:434] norm2 <- pool2
I0315 21:17:34.801828 6844 net.cpp:408] norm2 -> norm2
I0315 21:17:34.801950 6844 net.cpp:150] Setting up norm2
I0315 21:17:34.801956 6844 net.cpp:157] Top shape: 1 256 14 14 (50176)
I0315 21:17:34.801957 6844 net.cpp:165] Memory required for data: 5635948
I0315 21:17:34.801959 6844 layer_factory.hpp:77] Creating layer conv3
I0315 21:17:34.801964 6844 net.cpp:100] Creating Layer conv3
I0315 21:17:34.801966 6844 net.cpp:434] conv3 <- norm2
I0315 21:17:34.801970 6844 net.cpp:408] conv3 -> conv3
I0315 21:17:34.803786 6844 net.cpp:150] Setting up conv3
I0315 21:17:34.803794 6844 net.cpp:157] Top shape: 1 384 14 14 (75264)
I0315 21:17:34.803797 6844 net.cpp:165] Memory required for data: 5937004
I0315 21:17:34.803803 6844 layer_factory.hpp:77] Creating layer relu3
I0315 21:17:34.803807 6844 net.cpp:100] Creating Layer relu3
I0315 21:17:34.803809 6844 net.cpp:434] relu3 <- conv3
I0315 21:17:34.803812 6844 net.cpp:395] relu3 -> conv3 (in-place)
I0315 21:17:34.804126 6844 net.cpp:150] Setting up relu3
I0315 21:17:34.804132 6844 net.cpp:157] Top shape: 1 384 14 14 (75264)
I0315 21:17:34.804134 6844 net.cpp:165] Memory required for data: 6238060
I0315 21:17:34.804136 6844 layer_factory.hpp:77] Creating layer pool5
I0315 21:17:34.804157 6844 net.cpp:100] Creating Layer pool5
I0315 21:17:34.804159 6844 net.cpp:434] pool5 <- conv3
I0315 21:17:34.804162 6844 net.cpp:408] pool5 -> pool5
I0315 21:17:34.804168 6844 net.cpp:150] Setting up pool5
I0315 21:17:34.804172 6844 net.cpp:157] Top shape: 1 384 7 7 (18816)
I0315 21:17:34.804172 6844 net.cpp:165] Memory required for data: 6313324
I0315 21:17:34.804174 6844 layer_factory.hpp:77] Creating layer fc6
I0315 21:17:34.804178 6844 net.cpp:100] Creating Layer fc6
I0315 21:17:34.804179 6844 net.cpp:434] fc6 <- pool5
I0315 21:17:34.804183 6844 net.cpp:408] fc6 -> fc6
I0315 21:17:34.810413 6844 net.cpp:150] Setting up fc6
I0315 21:17:34.810448 6844 net.cpp:157] Top shape: 1 512 (512)
I0315 21:17:34.810451 6844 net.cpp:165] Memory required for data: 6315372
I0315 21:17:34.810458 6844 layer_factory.hpp:77] Creating layer relu6
I0315 21:17:34.810464 6844 net.cpp:100] Creating Layer relu6
I0315 21:17:34.810467 6844 net.cpp:434] relu6 <- fc6
I0315 21:17:34.810472 6844 net.cpp:395] relu6 -> fc6 (in-place)
I0315 21:17:34.810688 6844 net.cpp:150] Setting up relu6
I0315 21:17:34.810693 6844 net.cpp:157] Top shape: 1 512 (512)
I0315 21:17:34.810714 6844 net.cpp:165] Memory required for data: 6317420
I0315 21:17:34.810715 6844 layer_factory.hpp:77] Creating layer drop6
I0315 21:17:34.810719 6844 net.cpp:100] Creating Layer drop6
I0315 21:17:34.810721 6844 net.cpp:434] drop6 <- fc6
I0315 21:17:34.810724 6844 net.cpp:395] drop6 -> fc6 (in-place)
I0315 21:17:34.810729 6844 net.cpp:150] Setting up drop6
I0315 21:17:34.810731 6844 net.cpp:157] Top shape: 1 512 (512)
I0315 21:17:34.810732 6844 net.cpp:165] Memory required for data: 6319468
I0315 21:17:34.810734 6844 layer_factory.hpp:77] Creating layer fc7
I0315 21:17:34.810737 6844 net.cpp:100] Creating Layer fc7
I0315 21:17:34.810739 6844 net.cpp:434] fc7 <- fc6
I0315 21:17:34.810755 6844 net.cpp:408] fc7 -> fc7
I0315 21:17:34.811107 6844 net.cpp:150] Setting up fc7
I0315 21:17:34.811111 6844 net.cpp:157] Top shape: 1 512 (512)
I0315 21:17:34.811113 6844 net.cpp:165] Memory required for data: 6321516
I0315 21:17:34.811138 6844 layer_factory.hpp:77] Creating layer relu7
I0315 21:17:34.811142 6844 net.cpp:100] Creating Layer relu7
I0315 21:17:34.811144 6844 net.cpp:434] relu7 <- fc7
I0315 21:17:34.811146 6844 net.cpp:395] relu7 -> fc7 (in-place)
I0315 21:17:34.811532 6844 net.cpp:150] Setting up relu7
I0315 21:17:34.811538 6844 net.cpp:157] Top shape: 1 512 (512)
I0315 21:17:34.811540 6844 net.cpp:165] Memory required for data: 6323564
I0315 21:17:34.811560 6844 layer_factory.hpp:77] Creating layer drop7
I0315 21:17:34.811564 6844 net.cpp:100] Creating Layer drop7
I0315 21:17:34.811566 6844 net.cpp:434] drop7 <- fc7
I0315 21:17:34.811583 6844 net.cpp:395] drop7 -> fc7 (in-place)
I0315 21:17:34.811588 6844 net.cpp:150] Setting up drop7
I0315 21:17:34.811589 6844 net.cpp:157] Top shape: 1 512 (512)
I0315 21:17:34.811591 6844 net.cpp:165] Memory required for data: 6325612
I0315 21:17:34.811592 6844 layer_factory.hpp:77] Creating layer fc8
I0315 21:17:34.811595 6844 net.cpp:100] Creating Layer fc8
I0315 21:17:34.811597 6844 net.cpp:434] fc8 <- fc7
I0315 21:17:34.811600 6844 net.cpp:408] fc8 -> fc8
I0315 21:17:34.811614 6844 net.cpp:150] Setting up fc8
I0315 21:17:34.811616 6844 net.cpp:157] Top shape: 1 8 (8)
I0315 21:17:34.811617 6844 net.cpp:165] Memory required for data: 6325644
I0315 21:17:34.811621 6844 layer_factory.hpp:77] Creating layer prob
I0315 21:17:34.811625 6844 net.cpp:100] Creating Layer prob
I0315 21:17:34.811626 6844 net.cpp:434] prob <- fc8
I0315 21:17:34.811630 6844 net.cpp:408] prob -> prob
I0315 21:17:34.811761 6844 net.cpp:150] Setting up prob
I0315 21:17:34.811765 6844 net.cpp:157] Top shape: 1 8 (8)
I0315 21:17:34.811767 6844 net.cpp:165] Memory required for data: 6325676
I0315 21:17:34.811787 6844 net.cpp:228] prob does not need backward computation.
I0315 21:17:34.811789 6844 net.cpp:228] fc8 does not need backward computation.
I0315 21:17:34.811792 6844 net.cpp:228] drop7 does not need backward computation.
I0315 21:17:34.811794 6844 net.cpp:228] relu7 does not need backward computation.
I0315 21:17:34.811795 6844 net.cpp:228] fc7 does not need backward computation.
I0315 21:17:34.811796 6844 net.cpp:228] drop6 does not need backward computation.
I0315 21:17:34.811799 6844 net.cpp:228] relu6 does not need backward computation.
I0315 21:17:34.811800 6844 net.cpp:228] fc6 does not need backward computation.
I0315 21:17:34.811801 6844 net.cpp:228] pool5 does not need backward computation.
I0315 21:17:34.811803 6844 net.cpp:228] relu3 does not need backward computation.
I0315 21:17:34.811805 6844 net.cpp:228] conv3 does not need backward computation.
I0315 21:17:34.811806 6844 net.cpp:228] norm2 does not need backward computation.
I0315 21:17:34.811808 6844 net.cpp:228] pool2 does not need backward computation.
I0315 21:17:34.811810 6844 net.cpp:228] relu2 does not need backward computation.
I0315 21:17:34.811811 6844 net.cpp:228] conv2 does not need backward computation.
I0315 21:17:34.811813 6844 net.cpp:228] norm1 does not need backward computation.
I0315 21:17:34.811815 6844 net.cpp:228] pool1 does not need backward computation.
I0315 21:17:34.811817 6844 net.cpp:228] relu1 does not need backward computation.
I0315 21:17:34.811820 6844 net.cpp:228] conv1 does not need backward computation.
I0315 21:17:34.811821 6844 net.cpp:228] input does not need backward computation.
I0315 21:17:34.811822 6844 net.cpp:270] This network produces output prob
I0315 21:17:34.811831 6844 net.cpp:283] Network initialization done.
I0315 21:17:34.862948 6844 upgrade_proto.cpp:53] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/c/code/ModelCNN/cnn_age_gender_models_and_data.0.0.2/age_net.caffemodel
I0315 21:17:34.882393 6844 upgrade_proto.cpp:61] Successfully upgraded file specified using deprecated V1LayerParameter
I0315 21:17:34.882598 6844 net.cpp:761] Ignoring source layer data
I0315 21:17:34.888499 6844 net.cpp:761] Ignoring source layer loss
Traceback (most recent call last):
File "/home/c/code/eclipsecode/AgeCNN/src/main.py", line 31, in
image_dims=(256, 256))
File "/home/c/software/gpu_caffe/python/caffe/classifier.py", line 34, in init
self.transformer.set_mean(in
, mean)
File "/home/c/software/gpu_caffe/python/caffe/io.py", line 259, in set_mean
raise ValueError('Mean shape incompatible with input shape.')
ValueError: Mean shape incompatible with input shape.
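
Editor's note: the mean blob shipped with these models is 3 x 256 x 256, while the deploy prototxt declares a 1 x 3 x 227 x 227 input; newer versions of Caffe's Transformer.set_mean raise on the mismatch instead of adapting the mean as older versions did. One workaround that avoids patching Caffe is to collapse the mean to one value per channel, which set_mean accepts as a length-3 vector. A minimal sketch, with placeholder file paths adapted from the project's example script:

import caffe

# Load the mean image shipped with the models (shape 3 x 256 x 256).
# Paths are placeholders; point them at your local copies.
mean_blob = caffe.proto.caffe_pb2.BlobProto()
with open('mean.binaryproto', 'rb') as f:
    mean_blob.ParseFromString(f.read())
mean = caffe.io.blobproto_to_array(mean_blob)[0]

# Collapse to a per-channel mean (length-3 vector). set_mean accepts this
# regardless of the net's spatial input size, so 227x227 inputs are fine.
channel_mean = mean.mean(1).mean(1)

age_net = caffe.Classifier('deploy_age.prototxt', 'age_net.caffemodel',
                           mean=channel_mean,
                           channel_swap=(2, 1, 0),
                           raw_scale=255,
                           image_dims=(256, 256))

Note that this subtracts a per-channel average rather than the per-pixel mean image, so predictions may differ very slightly from the original setup.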


GilLevi commented on June 4, 2024

Hi,
This is a common issue; it is caused by a difference between the Caffe version we used and current Caffe versions. We describe how to solve it in an update from June 15, 2015 on the project page:

http://www.openu.ac.il/home/hassner/projects/cnn_agegender/


danionescu0 commented on June 4, 2024

Hello, I'm having the same issue. However, I found that the fix referenced on the project page is missing:
"Update: To adjust the code snippet to newer versions of Caffe, a small modification of the io.py file is required. A modified version is available here(https://talhassner.github.io/home/projects/cnn_agegender/io.py). This update comes in response to issues reported by several people and covered also in the answer to the following Stack Overflow question."

The linked page returns a 404.
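
Editor's note: since the modified io.py linked from the project page is no longer reachable, a workaround closer in spirit to that fix, but applied in the calling script rather than by patching Caffe, is to resize the mean image to the deploy net's input size before handing it to the Classifier. This is a sketch under the assumption that the mean is the 3 x 256 x 256 blob from the project archive and the deploy input is 227 x 227; it is not necessarily identical to the authors' modified io.py (paths are placeholders):

import caffe

# Load the 3 x 256 x 256 mean blob; adjust paths to your local copies.
mean_blob = caffe.proto.caffe_pb2.BlobProto()
with open('mean.binaryproto', 'rb') as f:
    mean_blob.ParseFromString(f.read())
mean = caffe.io.blobproto_to_array(mean_blob)[0]

# caffe.io.resize_image expects H x W x K, so transpose, resize to the
# deploy net's spatial size (227 x 227), and transpose back to K x H x W.
mean_227 = caffe.io.resize_image(mean.transpose((1, 2, 0)),
                                 (227, 227)).transpose((2, 0, 1))

age_net = caffe.Classifier('deploy_age.prototxt', 'age_net.caffemodel',
                           mean=mean_227,
                           channel_swap=(2, 1, 0),
                           raw_scale=255,
                           image_dims=(256, 256))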

