dichen-cd / nae4ps
Norm-Aware Embedding for Efficient Person Search (CVPR'20)
License: MIT License
Hi,
Can you please tell me how to reproduce the OIM-base results? Is there a separate model file for it? Could you please share it?
Regards
Bharti
Excuse me, in line 62 of prw.py, what does the _adapt_pid_to_cls function do? I want to know how the unidentified persons, i.e. those with id 5555, are handled.
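For illustration, here is a minimal, hypothetical sketch of what a pid-to-class adapter typically does in PRW-style datasets — the exact function body, the pass-through mapping, and the -1 sentinel are assumptions, not the repo's actual code:

```python
def adapt_pid_to_cls(label_pids, unlabeled_pid=5555):
    """Hypothetical sketch: remap raw PRW person ids to training labels.

    Labeled pids pass through, while the "unidentified person" marker
    (pid 5555 in PRW) is mapped to a -1 sentinel so the identity loss
    can skip those boxes. This illustrates the usual pattern only; it
    is not the repo's actual _adapt_pid_to_cls.
    """
    return [-1 if pid == unlabeled_pid else pid for pid in label_pids]
```

Under this pattern, boxes with the sentinel label still contribute to detection training but are excluded from the identity classification loss.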
Training command:
CUDA_VISIBLE_DEVICES=0 python scripts/train_NAE.py --debug --lr_warm_up -p ./logs --batch_size 5 --nw 5 --w_RCNN_loss_bbox 10.0 --epochs 22 --lr 0.003
Then I met a problem during testing.
[Errno 2] No such file or directory: './logs/PRW/query_info_2057.txt'
To solve the problem, I renamed PRW/query_info.txt
to PRW/query_info_2057.txt
But I got the following result, which is worse than the paper:
[~] Evaluating detections:
all detection:
recall = 87.24%
ap = 80.03%
labeled only detection:
recall = 90.46%
[~] Evaluating search:
search ranking:
mAP = 30.36%
top- 1 = 60.48%
top- 5 = 75.89%
top-10 = 80.26%
Could you give me some suggestions?
Hi,
Thanks for your work.
I just couldn't find the Class Weighted Similarity (CWS) in the code. Could you please help me locate it?
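For reference, CWS (from the PRW benchmark) is commonly described as the cosine similarity between query and gallery embeddings, scaled by the detector's confidence for each gallery box. A minimal NumPy sketch — the function name and array shapes are assumptions for illustration, not this repo's code:

```python
import numpy as np

def cws_score(query_feat, gallery_feats, det_scores):
    """Hypothetical sketch of Confidence Weighted Similarity (CWS):
    cosine similarity between the query embedding and each gallery
    embedding, weighted by the detector's confidence for that box.

    query_feat:    (d,)   query embedding
    gallery_feats: (n, d) gallery embeddings
    det_scores:    (n,)   detection confidences in [0, 1]
    """
    q = query_feat / np.linalg.norm(query_feat)
    g = gallery_feats / np.linalg.norm(gallery_feats, axis=1, keepdims=True)
    return (g @ q) * det_scores  # weight cosine similarity by confidence
```

One possible reason it is hard to find as a named function: in NAE the detection confidence is carried by the embedding norm, so a comparable weighting can be implicit in an un-normalized dot product.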
I ran train_NAE.py, but it ran out of memory. So I added model = torch.nn.DataParallel(model), but another error occurred.
Traceback (most recent call last):
  File "scripts/train_NAE.py", line 85, in <module>
    main(args, fn)
  File "scripts/train_NAE.py", line 70, in main
    trainer.run(train_loader, max_epochs=args.train.epochs)
  File "/opt/conda/envs/NAE4PS/lib/python2.7/site-packages/ignite/engine/engine.py", line 359, in run
    self._handle_exception(e)
  File "/opt/conda/envs/NAE4PS/lib/python2.7/site-packages/ignite/engine/engine.py", line 324, in _handle_exception
    raise e
RuntimeError: Caught RuntimeError in replica 0 on device 0.
Original Traceback (most recent call last):
  File "/opt/conda/envs/NAE4PS/lib/python2.7/site-packages/torch/nn/parallel/parallel_apply.py", line 60, in _worker
    output = module(*input, **kwargs)
  File "/opt/conda/envs/NAE4PS/lib/python2.7/site-packages/torch/nn/modules/module.py", line 547, in __call__
    result = self.forward(*input, **kwargs)
  File "/opt/conda/envs/NAE4PS/lib/python2.7/site-packages/torchvision/models/detection/generalized_rcnn.py", line 47, in forward
    images, targets = self.transform(images, targets)
  File "/opt/conda/envs/NAE4PS/lib/python2.7/site-packages/torch/nn/modules/module.py", line 547, in __call__
    result = self.forward(*input, **kwargs)
  File "/opt/conda/envs/NAE4PS/lib/python2.7/site-packages/torchvision/models/detection/transform.py", line 40, in forward
    image = self.normalize(image)
  File "/opt/conda/envs/NAE4PS/lib/python2.7/site-packages/torchvision/models/detection/transform.py", line 55, in normalize
    return (image - mean[:, None, None]) / std[:, None, None]
RuntimeError: The size of tensor a (2) must match the size of tensor b (3) at non-singleton dimension 0
How can I run the model on multiple GPUs? What should I do?
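A likely explanation of the error above, sketched here: torchvision detection models consume a Python list of 3xHxW image tensors, and nn.DataParallel scatters every tensor along dim 0 — so on two GPUs a single 3-channel image gets split channel-wise across replicas, which matches the "size of tensor a (2) must match ... b (3)" message. A small demonstration (tensor sizes are illustrative):

```python
import torch

# What DataParallel's scatter effectively does to a single 3xHxW image
# when splitting across 2 devices: chunk along dim 0 (the channel dim),
# producing 2-channel and 1-channel pieces instead of two whole images.
img = torch.randn(3, 4, 4)
chunks = torch.chunk(img, 2, dim=0)
print([tuple(c.shape) for c in chunks])  # [(2, 4, 4), (1, 4, 4)]
```

The usual remedy for detection models is DistributedDataParallel with one process per GPU (e.g. launched via torch.distributed), since each process then receives a whole list of images rather than sliced tensors.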
I downloaded this project and ran it without any changes other than the dataset path, but got the following error. What could be the cause?
RuntimeError: The expanded size of the tensor (896) must match the existing size (900) at non-singleton dimension 1. Target sizes: [3, 896, 1125]. Tensor sizes: [3, 900, 1125]
Hi, I met a problem when I tried to train the pixel-wise model.
CUDA_VISIBLE_DEVICES=0,1 python scripts/train_NAE.py --debug --lr_warm_up -p ./logs/ --batch_size 2 --nw 2 --w_RCNN_loss_bbox 10.0 --epochs 22 --lr 0.0012 --pixel_wise --NAE_pretrain
[!] Working directory: ./logs/Jun28_18-17-58_501
Traceback (most recent call last):
  File "scripts/train_NAE.py", line 80, in <module>
    main(args, fn)
  File "scripts/train_NAE.py", line 51, in main
    pretrained_backbone=True)
  File "./lib/model/faster_rcnn_pixel_wise_norm_aware.py", line 366, in get_pixel_wise_norm_aware_model
    return_res4=return_res4, GAP=False)
TypeError: resnet_backbone() got an unexpected keyword argument 'return_res4'
Paper with code is great!
I met some problems when creating the conda env with conda env create -f environment.yml
Solving environment: done
==> WARNING: A newer version of conda exists. <==
current version: 4.5.11
latest version: 4.8.3
Please update conda by running
$ conda update -n base -c defaults conda
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py==0.7.1 (from -r /home/zjli/Desktop/NAE4PS/condaenv.uw7gkqf0.requirements.txt (line 1))
Collecting atomicwrites==1.3.0 (from -r /home/zjli/Desktop/NAE4PS/condaenv.uw7gkqf0.requirements.txt (line 2))
...
Collecting lapsolver==1.0.2 (from -r /home/zjli/Desktop/NAE4PS/condaenv.uw7gkqf0.requirements.txt (line 36))
ERROR: Could not find a version that satisfies the requirement lapsolver==1.0.2 (from -r /home/zjli/Desktop/NAE4PS/condaenv.uw7gkqf0.requirements.txt (line 36)) (from versions: none)
ERROR: No matching distribution found for lapsolver==1.0.2 (from -r /home/zjli/Desktop/NAE4PS/condaenv.uw7gkqf0.requirements.txt (line 36))
CondaValueError: pip returned an error
Can you give me some suggestions on how to solve this problem?
I guess it is because Python 2 is out of date. Is the repo compatible with Python 3?
https://github.com/DeanChan/NAE4PS/blob/012fa760edda62316fd51b0f951283f6254d1c6e/lib/model/faster_rcnn_norm_aware.py#L102
Thanks a lot for sharing! I am trying to run it, but the class FastRCNNPredictorBN is missing. Could you add it? Thanks.
hi, I just have a small question about the number of identities in CUHK-SYSU.
According to "Joint Detection and Identification Feature Learning for Person Search", the training set has 5532 identities, so why is the ignore index 5554 in the OIM loss?
Hi, I can't open the links to the pretrained model and datasets.
Could you please give a BaiduYun or google cloud link?
Thanks!
Good job!
The original OIM model is denoted as OIM-origin.
Thanks.
Hello, here is the result of pixel-wise model:
[~] Evaluating detections:
all detection:
recall = 89.40%
ap = 85.05%
[~] Evaluating search:
search ranking:
mAP = 87.62%
top- 1 = 88.00%
top- 5 = 95.76%
top-10 = 96.97%
The training command is CUDA_VISIBLE_DEVICES=0,1 python scripts/train_NAE.py --debug --lr_warm_up -p ./logs/ --batch_size 2 --nw 2 --w_RCNN_loss_bbox 10.0 --epochs 22 --lr 0.0012 --pixel_wise --NAE_pretrain --embedding_feat_fuse
The testing command is CUDA_VISIBLE_DEVICES=1 python scripts/test_NAE.py -p logs/Jul02_11-50-49_501 --pixel_wise
Did I miss something? I am looking forward to your response.
Hello, I am sorry to disturb you, but I have a question about the logic of evaluator.py:
why is the AP calculated by average_precision_score() scaled by det_rate?
It does not influence the algorithm's performance; I am just curious about it.
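One plausible reading, sketched below: average_precision_score only ranks the boxes the detector actually returned, so ground-truth persons it missed never lower the raw AP. Scaling by det_rate (the fraction of ground-truth boxes that were detected) folds those misses back into the metric. The function name and call pattern here are assumptions for illustration, not the repo's evaluator:

```python
from sklearn.metrics import average_precision_score

def scaled_ap(y_true, y_score, n_gt):
    """Hypothetical sketch of recall-scaled AP for person search.

    y_true:  1/0 match labels for the *detected* gallery boxes only
    y_score: similarity scores for those boxes
    n_gt:    number of ground-truth boxes for this query, including
             ones the detector missed (which never appear in y_true)
    """
    det_rate = sum(y_true) / float(n_gt)  # detector recall on this query
    return average_precision_score(y_true, y_score) * det_rate
```

Without the det_rate factor, a detector that finds only one easy ground-truth box per query could still score a perfect AP; the scaling makes the search metric penalize detection misses, which is one way it differs from a pure re-id metric.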
Good job! Thanks for your open code. I only have one GeForce GTX 1080 Ti. Can I run the training code and get the same accuracy?
Thanks for your open code. Do you have a pretrained model for Python 3?
NAE4PS/lib/datasets/prw.py, line 208:
Why multiply the AP, which already accounts for recall, by the recall rate? Is it to account for detection performance?
If so, is the person search metric not exactly the same as the person re-id metric?
Can anyone run this? Has anyone encountered apex errors?
The following error occurred during training:
CUDA_VISIBLE_DEVICES=2 python scripts/train_NAE.py --debug --lr_warm_up -p ./logs/PRW/ --batch_size 1 --nw 1 --w_RCNN_loss_bbox 10.0 --epochs 22 --lr 0.003
[!] Working directory: ./logs/PRW/Apr17_15-28-53_amax
| ̄ ̄ ̄ ̄ ̄ ̄ ̄ ̄|
| TRAINING |
| epoch |
| 1 |
| _______|
(_/) ||
(•ㅅ•) ||
/ づ
Traceback (most recent call last):
  File "scripts/train_NAE.py", line 79, in <module>
    main(args, fn)
  File "scripts/train_NAE.py", line 64, in main
    trainer.run(train_loader, max_epochs=args.train.epochs)
  File "/home/luandong/.local/lib/python2.7/site-packages/ignite/engine/engine.py", line 359, in run
    self._handle_exception(e)
  File "/home/luandong/.local/lib/python2.7/site-packages/ignite/engine/engine.py", line 324, in _handle_exception
    raise e
KeyError: 'keypoints'
Hi, thanks for your great paper and codes.
When I trained NAE on CUHK, using:
python scripts/train_NAE.py --lr_warm_up -p jobs/nae_cuhk/logs/ --batch_size 5 --nw 5 --w_RCNN_loss_bbox 10.0 --epochs 22 --lr 0.003
I got the following results, which are quite good.
mAP = 91.92%
top- 1 = 92.97%
top- 5 = 97.62%
top-10 = 98.34%
But, when I trained NAE+ on CUHK, using
python scripts/train_NAE.py --lr_warm_up -p jobs/nae+_cuhk/logs/ --batch_size 5 --nw 5 --w_RCNN_loss_bbox 10.0 --epochs 11 --lr 0.003 --pixel_wise --NAE_pretrain --lr_decay_step 8 --embedding_feat_fuse
I only got:
mAP = 90.41%
top- 1 = 90.72%
top- 5 = 97.00%
top-10 = 98.07%
Did I use the right arguments to train NAE+?