Hi.
First of all, thank you very much for your work; this is a great pose estimation library!
When I try to test your CrowdPose-trained models, I get the following error:
```
root@lsdl030x:/workspace# python tools/inference_demo.py --cfg experiments/inference_demo.yaml --videoFile crowd_issue_example.mp4 --outputDir output --visthre 0.3 TEST.MODEL_FILE model/pose_crowdpose/pose_hrnet_w32_reg_delaysep_bg01_stn_512_adam_lr1e-3_crowdpose_x300.pth
=> loading model from model/pose_crowdpose/pose_hrnet_w32_reg_delaysep_bg01_stn_512_adam_lr1e-3_crowdpose_x300.pth
Traceback (most recent call last):
  File "tools/inference_demo.py", line 267, in <module>
    main()
  File "tools/inference_demo.py", line 190, in main
    cfg.TEST.MODEL_FILE), strict=False)
  File "/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py", line 777, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for PoseHigherResolutionNet:
	size mismatch for transition_reg.0.weight: copying a param with shape torch.Size([256, 494, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 497, 1, 1]).
	size mismatch for final_layers.0.weight: copying a param with shape torch.Size([14, 32, 1, 1]) from checkpoint, the shape in current model is torch.Size([17, 32, 1, 1]).
	size mismatch for final_layers.0.bias: copying a param with shape torch.Size([14]) from checkpoint, the shape in current model is torch.Size([17]).
	size mismatch for final_layers.1.weight: copying a param with shape torch.Size([29, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([35, 256, 1, 1]).
	size mismatch for final_layers.1.bias: copying a param with shape torch.Size([29]) from checkpoint, the shape in current model is torch.Size([35]).
```
It seems PyTorch was unable to load the checkpoint weights into the instantiated model because of mismatched layer dimensions. I followed your installation instructions thoroughly, and I can run the COCO-trained models without problems.
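If it helps narrow things down: the channel counts in the error (14 vs. 17) look like CrowdPose's 14 keypoints vs. COCO's 17, so I suspect the instantiated model is still built with the COCO keypoint count. Here is a minimal standalone sketch (not your actual model, just two 1x1 conv heads of those sizes) that reproduces the same class of failure, including with `strict=False`, since size mismatches are reported regardless of `strict`:

```python
import torch

# Hypothetical illustration: a 14-channel head checkpoint (CrowdPose keypoints)
# loaded into a 17-channel head (COCO keypoints) fails with a size mismatch,
# even when strict=False, because strict only controls missing/unexpected keys.
ckpt_head = torch.nn.Conv2d(32, 14, kernel_size=1)   # 14 joints
model_head = torch.nn.Conv2d(32, 17, kernel_size=1)  # 17 joints

try:
    model_head.load_state_dict(ckpt_head.state_dict(), strict=False)
except RuntimeError as e:
    print("size mismatch" in str(e))  # True
```

So the checkpoint itself looks consistent with CrowdPose; the mismatch seems to come from the model definition side.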
Do you have any idea why this occurs, or can you guide me toward solving this issue? Thank you in advance!
Peter