Comments (13)

Alan-Paul commented on August 16, 2024

I encountered the same problem.
I ran selftraining.py to train a duke2market model using your pretrained model, but the performance drops: my final result is mAP 54.0%, rank-1 76.7%, whereas the results reported in your paper are mAP 58.3%, rank-1 80%. My environment is PyTorch 1.1.0 with Python 3.6.0. Here are my parameters; any suggestions would be appreciated!
'''
arch='resnet50',
batch_size=128,
combine_trainval=False,
data_dir='./data',
dce_loss=False,
dist_metric='euclidean',
dropout=0,
epochs=70,
evaluate=False,
features=128,
gpu_devices='0,1',
height=None,
iteration=30,
lambda_value=0.1,
load_dist=False,
logs_dir='logs/duke2market',
lr=6e-05,
margin=0.5,
no_rerank=False,
num_instances=4,
num_split=2,
print_freq=20,
resume='logs/pretrained_models/dukemtmc_trained.pth.tar',
rho=0.0016,
seed=1,
split=0,
src_dataset='dukemtmc',
start_save=0,
tgt_dataset='market1501',
weight_decay=0.0005,
width=None,
workers=4
'''
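
For completeness, I launched training roughly like this; the flag spellings are reconstructed from the parameter dump above and may not match selftraining.py's argparse exactly, so please check python selftraining.py -h:

'''
# Hypothetical invocation; flag names are my reconstruction, not verified against the repo.
python selftraining.py \
    --src-dataset dukemtmc --tgt-dataset market1501 \
    --resume logs/pretrained_models/dukemtmc_trained.pth.tar \
    --data-dir ./data --logs-dir logs/duke2market \
    --num-split 2 --iteration 30 --epochs 70 --batch-size 128 \
    --lr 6e-05 --rho 0.0016 --lambda-value 0.1 --num-instances 4
'''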

OasisYang commented on August 16, 2024

Sorry, I don't know the reason. Can you reproduce the performance using the provided models?
You need to use the provided model as the pre-trained model and make sure num-split is two.

jh97321 commented on August 16, 2024

The num-split in my settings is two. I used source_train.py to get the pre-trained model; it works for Duke2Market but not for Market2Duke. I will try the provided models. Thank you anyway.

geyutang commented on August 16, 2024

@jh97321 I have the same problem as you.

  • My M->D result for SSG is R-1 = 70.7, mAP = 52.4. Because the provided Market pretrained model raises a decode error, I re-trained the source model on the Market dataset myself.

In addition, my D->M result also drops.

  • When using the pre-trained Duke model, D->M before adaptation is R-1 = 50.6, mAP = 24.7; after adaptation by SSG, R-1 = 76.3, mAP = 54.1.
  • When using the source model I re-trained myself, D->M before adaptation is R-1 = 50.0, mAP = 24.3; after adaptation by SSG, R-1 = 70.9, mAP = 47.2.

Have you solved your problem? @OasisYang Any suggestions about this? Thanks!

OasisYang commented on August 16, 2024

If you cannot load the pretrained model, this link may be helpful.
And please make sure you train our code on two GPUs.
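
If the failure is the pickle decode error mentioned above, a common workaround is the sketch below (not our repo's exact code; it assumes PyTorch >= 1.0, where torch.load forwards pickle arguments, and that the weights sit under a 'state_dict' key):

'''
# Sketch: load a checkpoint pickled under Python 2 into a Python 3 environment.
# On PyTorch >= 1.0, torch.load forwards extra kwargs to pickle, so
# encoding='latin1' avoids the UnicodeDecodeError.
import torch

ckpt = torch.load('logs/pretrained_models/dukemtmc_trained.pth.tar',
                  map_location='cpu', encoding='latin1')
state_dict = ckpt.get('state_dict', ckpt)  # assumption: weights may be wrapped under 'state_dict'
# strip the 'module.' prefix left over from DataParallel training on two GPUs, if present
state_dict = {k.replace('module.', '', 1): v for k, v in state_dict.items()}
'''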

geyutang commented on August 16, 2024

OK, I will try this. Thanks!
In addition, I have another question about the DBSCAN algorithm for UDA person re-id.

Why do we need distances between both source and target samples to compute pseudo-labels for the target samples with DBSCAN? I read the original DBSCAN paper and the sklearn API, and found that the input to this clustering algorithm is just a feature matrix or a distance matrix.

I am confused about this. Any suggestions? Thanks!
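
To make the question concrete, this is how I read the sklearn API (a minimal sketch with placeholder data and arbitrary eps/min_samples, not the repo's code):

'''
# Sketch of my understanding: with metric='precomputed', DBSCAN only consumes
# a square target-target distance matrix, so I don't see where source-sample
# distances are needed for the clustering step itself.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.metrics import pairwise_distances

target_features = np.random.rand(500, 128).astype(np.float32)  # placeholder target-domain features
dist = pairwise_distances(target_features, metric='euclidean')  # (500, 500) target-target distances

labels = DBSCAN(eps=0.5, min_samples=4, metric='precomputed').fit_predict(dist)
# labels[i] is the pseudo-label assigned to target sample i; -1 marks noise
'''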

geyutang commented on August 16, 2024

@Alan-Paul, I got the same result as you. I also tried re-training on the DukeMTMC dataset with source_train.py, with the same result. The performance drop occurs for Duke->Market.

OasisYang commented on August 16, 2024

@geyutang @Alan-Paul Here are some suggestions. First, check whether the performance of our provided model matches what is reported in the paper. Also, check the performance of the pretrained model, which should be mAP 26, R-1 54 when transferring from Duke to Market (Market2Duke: 16/30). I conducted all experiments with pytorch=0.4, torchvision=0.2, and scikit-learn=0.19.1. I hope these suggestions help.
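
A quick sanity check of the environment (sketch):

'''
# Sketch: print the installed versions to compare against the ones used for the
# paper (pytorch 0.4, torchvision 0.2, scikit-learn 0.19.1).
import torch, torchvision, sklearn
print('torch:', torch.__version__)
print('torchvision:', torchvision.__version__)
print('scikit-learn:', sklearn.__version__)
'''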

geyutang commented on August 16, 2024

The D->M result at the beginning is right:

Mean AP: 26.8%
CMC Scores market1501
top-1 54.2%
top-5 70.5%
top-10 76.8%

But the model saturates from around epoch 10. Below is my log of rank-1 over the training iterations; it looks like overfitting. In addition, slightly modifying the learning rate does not bring the result up to what is reported in your paper. Any suggestions for solving this saturation problem?
[plot: rank-1 vs. training iteration]

Also, my torch version is 1.0.0, which may contribute to the mismatch.
Thanks for your kind reply.

OasisYang commented on August 16, 2024

I trained the model again with PyTorch 0.4.1 and got an adaptation result from Market to Duke of 53.3/72.4 (mAP/R-1), which is almost the same as the result reported in the paper.

yihongXU commented on August 16, 2024

I trained the model again with PyTorch 0.4.1 and got an adaptation result from Market to Duke of 53.3/72.4 (mAP/R-1), which is almost the same as the result reported in the paper.

Hi,
Did you try Duke->Market? It seems we have difficulty reaching 58.3/80.0 (mAP/R-1); I got 52.6/75.7 (mAP/R-1) instead. Thank you.

OasisYang commented on August 16, 2024

I will try it, but it may take some time since most of our computation resources are being used for another ongoing project.

beiyangxiaolaodi commented on August 16, 2024

I ran the code for Market2Duke with PyTorch 0.4.1, but the result still drops in performance.
| SSG method | rank-1 | mAP   |
| observed   | 68.7%  | 49.2% |
| reported   | 73.0%  | 53.4% |

Can you help? Thanks.
