Comments (42)

stan-haochen commented on May 19, 2024

This utility is not implemented in this version yet; it is on the roadmap.

However, we have tested it, and it should be fairly easy to implement.
You can refer to the official detectron2 export script.

lucasjinreal commented on May 19, 2024

It seems the community has already explored converting FCOS to ncnn. It would be very nice if BlendMask could support export to ONNX, acceleration with TensorRT, or deployment on mobile platforms. Hoping for your updates.

blueardour commented on May 19, 2024

@stan-haochen
If you can provide a pretrained model with BN normalization in the head, I can put some effort into converting BlendMask to other libraries or SDKs such as ONNX/ncnn/Caffe.

lucasjinreal commented on May 19, 2024

@blueardour I can train a BN-head BlendMask for you to convert.

blueardour commented on May 19, 2024

@jinfagang Sorry for the late reply.

I'm very willing to do the conversion if you can provide the BN-head model.

lucasjinreal commented on May 19, 2024

@blueardour OK, let's do it. I'd like to convert it to TensorRT once it can be converted to ONNX.

snaillp commented on May 19, 2024

Thanks a lot for the work. Is there any news on this issue?

lucasjinreal commented on May 19, 2024

No, training of the BN-head model is still pending...

blueardour commented on May 19, 2024

@jinfagang @snaillp I've discussed this with @stan-haochen. We will put some effort into training and converting the model.

lucasjinreal commented on May 19, 2024

@blueardour Here is a BlendMask RT model trained with BN only: https://drive.google.com/open?id=1wMqOxOKCSeTRX_-xOomREbPNE3Ec87Ur
It uses config files like:

cat configs/BlendMask/Base-550.yaml      
_BASE_: "Base-BlendMask.yaml"
MODEL:
  FCOS:
    TOP_LEVELS: 1
    IN_FEATURES: ["p3", "p4", "p5", "p6"]
    FPN_STRIDES: [8, 16, 32, 64]
    SIZES_OF_INTEREST: [64, 128, 256]
    NUM_SHARE_CONVS: 3
    NUM_CLS_CONVS: 0
    NUM_BOX_CONVS: 0
    NORM: "SyncBN"
  BASIS_MODULE:
    NUM_CONVS: 2
INPUT:
  MIN_SIZE_TRAIN: (440, 462, 484, 506, 528, 550)
  MAX_SIZE_TRAIN: 916
  MIN_SIZE_TEST: 550
  MAX_SIZE_TEST: 916

cat configs/BlendMask/RT_R_50_4x_bn.yaml
_BASE_: "Base-550.yaml"
INPUT:
  MIN_SIZE_TRAIN: (256, 288, 320, 352, 384, 416, 448, 480, 512, 544, 576, 608)
  MAX_SIZE_TRAIN: 900
  MAX_SIZE_TEST: 736
  MIN_SIZE_TEST: 512
MODEL:
  WEIGHTS: "detectron2://ImageNetPretrained/MSRA/R-50.pkl"
  RESNETS:
    DEPTH: 50
    NORM: "SyncBN"
  BACKBONE:
    FREEZE_AT: -1
SOLVER:
  STEPS: (300000, 340000)
  MAX_ITER: 360000
OUTPUT_DIR: "output/blendmask/RT_R_50_4x"

mAP is around 30 with the training done so far; we can try to figure out the ONNX conversion first.
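
For anyone wanting to experiment with this checkpoint, loading it for export experiments looks roughly like the following (standard detectron2/AdelaiDet APIs; the paths are examples, not the exact filenames above):

# Hedged sketch: build the model from the BN-only config and load the weights.
from adet.config import get_cfg
from detectron2.checkpoint import DetectionCheckpointer
from detectron2.modeling import build_model

cfg = get_cfg()
cfg.merge_from_file("configs/BlendMask/RT_R_50_4x_bn.yaml")  # example path
cfg.MODEL.WEIGHTS = "weights/RT_R_50_4x_bn.pth"              # the BN-only checkpoint
cfg.MODEL.DEVICE = "cpu"                                     # keep export experiments on CPU

model = build_model(cfg)
DetectionCheckpointer(model).load(cfg.MODEL.WEIGHTS)
model.eval()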

stan-haochen commented on May 19, 2024

I will provide BN-head models for RT_R50 this week.

blueardour commented on May 19, 2024

@jinfagang Thanks for the training files. I'm working on the conversion.

lucasjinreal commented on May 19, 2024

Waiting for your progress.

blueardour commented on May 19, 2024

@ALL
I have almost finished the conversion to ONNX. Verification code for onnxruntime is also provided.
Here "almost" means that most of the results from the converted ONNX model are correct when tested in onnxruntime.
However, the basis_module in BlendMask uses bilinear interpolation, and that operation does not seem to be well supported.

Exporting to ONNX requires opset 11 to give results consistent with PyTorch, but onnxruntime/caffe2/... currently only support opset 9 well. A workaround is to change the bilinear mode to nearest and omit the align_corners parameter. For this issue, refer to
pytorch/pytorch#18113

As it is not 100% complete, I have only pushed the code to my own repo rather than opening a PR against this repo: https://github.com/blueardour/uofa-AdelaiDet
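
For readers hitting the same issue, a minimal, self-contained sketch of the workaround described above (illustrative, not the repo's actual export code): bilinear interpolation with align_corners needs opset 11, while a nearest-mode upsample exports cleanly at opset 9.

# Minimal sketch of the opset workaround (illustrative; not AdelaiDet's code).
import torch
import torch.nn.functional as F

class Upsample2x(torch.nn.Module):
    def __init__(self, bilinear: bool = True):
        super().__init__()
        self.bilinear = bilinear

    def forward(self, x):
        if self.bilinear:
            # matches PyTorch only when exported with opset >= 11
            return F.interpolate(x, scale_factor=2, mode="bilinear", align_corners=False)
        # opset-9-friendly fallback: nearest mode, no align_corners
        return F.interpolate(x, scale_factor=2, mode="nearest")

dummy = torch.randn(1, 128, 50, 68)
torch.onnx.export(Upsample2x(bilinear=True), dummy, "up_bilinear.onnx", opset_version=11)
torch.onnx.export(Upsample2x(bilinear=False), dummy, "up_nearest.onnx", opset_version=9)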

lucasjinreal commented on May 19, 2024

@blueardour How many ONNX ops are involved in the converted ONNX model? Can it be converted to a TensorRT engine through onnx-tensorrt?

blueardour commented on May 19, 2024

Yes, verification in both TensorRT and onnxruntime has been added. Please check the new PR.
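
For reference, a sanity check in onnxruntime looks roughly like the following (the file and tensor names are illustrative, not the exact ones used by the verification script):

# Load the exported model and run one random input through it.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("blendmask.onnx")
input_name = sess.get_inputs()[0].name
dummy = np.random.randn(1, 3, 800, 1088).astype(np.float32)
outputs = sess.run(None, {input_name: dummy})
for meta, out in zip(sess.get_outputs(), outputs):
    print(meta.name, out.shape)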

lucasjinreal commented on May 19, 2024

@blueardour Did you use onnxruntime's TensorRT execution provider to check TensorRT compatibility, or did you use onnx-tensorrt to convert the ONNX model to an engine?

blueardour commented on May 19, 2024

@jinfagang I found I had made a mistake by taking the caffe2 engine; I'm switching to the TensorRT engine.

lucasjinreal commented on May 19, 2024

TensorRT might not be able to convert several ops such as RoiAlign properly, and some surgery on the way the PyTorch code is written may be needed before export.

blueardour commented on May 19, 2024

@jinfagang
I already block those unsupported layers, such as RoiAlign and NMS, when exporting the ONNX file.

I fixed the ONNX verification script and added an all-in-one demo script, pytorch-onnx-caffe-ncnn-rt.sh, under the onnx folder. It covers verification of the ONNX file in onnxruntime/caffe2/tensorrt.

Note that I pass the test for FCOS. BlendMask fails because it contains an unsupported UpSample type.
If BlendMask is required, the code needs to be revised (for example, change the bilinear upsample to nearest upsample).

lucasjinreal commented on May 19, 2024

If ROIAlign and NMS are blocked, then it can only forward the preceding modules and cannot forward the whole model?

blueardour commented on May 19, 2024

Yes. Those layers are not standard in many frameworks, and even frameworks that do support them may use a different implementation, so layers such as ROIAlign and NMS are not exported. In the export script, one can check all the exported nodes by dumping the output_names.
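
A quick way to inspect what was actually exported (and to answer the earlier question about which ONNX ops are involved) is to load the file with the onnx package; the file name below is illustrative:

# Inspect the exported graph: inputs, outputs, and the set of op types used.
import onnx

model = onnx.load("blendmask.onnx")
print("inputs :", [i.name for i in model.graph.input])
print("outputs:", [o.name for o in model.graph.output])
# RoiAlign / NonMaxSuppression should be absent if those layers were blocked.
print("ops    :", sorted({node.op_type for node in model.graph.node}))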

lucasjinreal commented on May 19, 2024

@blueardour Did you manage to convert the ONNX model to TensorRT? I have a workaround that converts it to a TensorRT engine.
The TensorRT outputs look like FCOS outputs; the remaining work is to do the post-processing as in FCOS, plus the BlendMask blending step.

enginecheck blendmask.trt
check on engine: blendmask.trt
=> checking engine....
=> engine maxBatchSize: 32
=> engine NbBindings: 18
    => BindingName at: 0=input_image Dims=4 shape: 1,3,800,1088,
    => BindingName at: 1=bases Dims=4 shape: 1,4,200,272,
    => BindingName at: 2=P3logits Dims=4 shape: 1,80,100,136,
    => BindingName at: 3=P3centerness Dims=4 shape: 1,1,100,136,
    => BindingName at: 4=P3bbox_reg Dims=4 shape: 1,4,100,136,
    => BindingName at: 5=P3top_feats Dims=4 shape: 1,784,100,136,
    => BindingName at: 6=P4logits Dims=4 shape: 1,80,50,68,
    => BindingName at: 7=P4centerness Dims=4 shape: 1,1,50,68,
    => BindingName at: 8=P4bbox_reg Dims=4 shape: 1,4,50,68,
    => BindingName at: 9=P4top_feats Dims=4 shape: 1,784,50,68,
    => BindingName at: 10=P5logits Dims=4 shape: 1,80,25,34,
    => BindingName at: 11=P5centerness Dims=4 shape: 1,1,25,34,
    => BindingName at: 12=P5bbox_reg Dims=4 shape: 1,4,25,34,
    => BindingName at: 13=P5top_feats Dims=4 shape: 1,784,25,34,
    => BindingName at: 14=P6logits Dims=4 shape: 1,80,13,17,
    => BindingName at: 15=P6centerness Dims=4 shape: 1,1,13,17,
    => BindingName at: 16=P6bbox_reg Dims=4 shape: 1,4,13,17,
    => BindingName at: 17=P6top_feats Dims=4 shape: 1,784,13,17,
done.
engine shutdown.

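The dump above comes from the poster's own enginecheck tool; a rough equivalent using the TensorRT Python API (assuming the TensorRT 7/8 binding-style API; the engine path is illustrative) would be:

# List the bindings of a serialized TensorRT engine.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
with open("blendmask.trt", "rb") as f, trt.Runtime(logger) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

print("maxBatchSize:", engine.max_batch_size)
print("NbBindings  :", engine.num_bindings)
for i in range(engine.num_bindings):
    kind = "input " if engine.binding_is_input(i) else "output"
    print(kind, engine.get_binding_name(i), tuple(engine.get_binding_shape(i)))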

stan-haochen commented on May 19, 2024

I have trained a real-time version of the R-50 model: mask AP 35.1, 31 FPS on a 1080Ti.

You can find it here: https://github.com/aim-uofa/AdelaiDet/blob/master/configs/BlendMask/README.md#blendmask-real-time-models

blueardour commented on May 19, 2024

@jinfagang I only tested the ONNX file with the onnx_tensorrt package; refer to lines 270/271 in https://github.com/blueardour/uofa-AdelaiDet/blob/master/onnx/test_onnxruntime.py. I didn't test it in standalone TensorRT.

P.S. My PR doesn't seem to have been approved yet; refer to #80.
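
The onnx_tensorrt check mentioned above boils down to something like this (the file name and input size are illustrative):

# Run the exported model through the onnx_tensorrt backend.
import numpy as np
import onnx
import onnx_tensorrt.backend as backend

model = onnx.load("blendmask.onnx")
engine = backend.prepare(model, device="CUDA:0")
dummy = np.random.randn(1, 3, 640, 480).astype(np.float32)
outputs = engine.run(dummy)
for out in outputs:
    print(out.shape)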

lucasjinreal commented on May 19, 2024

@blueardour Will you also write a standalone inference repo based on the ONNX model, in ncnn or some other framework, so that we can run inference and visualize the results?

blueardour commented on May 19, 2024

Sorry @jinfagang, I'm currently busy preparing my conference paper; the deadline is coming. I can spare time for this issue around mid-June. If you cannot wait that long, please refer to my FCOS implementation. The only remaining work is to add the basis module branch in ncnn.

lucasjinreal commented on May 19, 2024

@blueardour I'd like to finish the BlendMask part with your help (if you don't have time to do it). Do you have Slack or Telegram to talk on? WeChat is also OK.

blueardour commented on May 19, 2024

Please leave a message at my email: [email protected]

lucasjinreal commented on May 19, 2024

@blueardour Hi, still about the BlendMask problem. I looked around and found that only this method has both high accuracy and high speed. Do you have a plan to export the ONNX model with as much of the post-processing included as possible?

blueardour commented on May 19, 2024

Sorry, I currently have no time to develop the post-processing code. You might contact me by email if you have a strong will to implement the whole network.

BTW, if speed is a concern, my new repo on model quantization might be helpful: https://github.com/blueardour/model-quantization Note that the detection/segmentation related files are still in preparation.

lucasjinreal commented on May 19, 2024

@blueardour I tried contacting you by email but got no response. Do you have a more instant contact such as Telegram or something?

blueardour commented on May 19, 2024

@jinfagang Hi, I checked my mailbox and I'm sure I replied to you. As your original email had no subject, could it have been filtered by Gmail? Anyway, I have edited the subject of the mail and replied to you again. Also, feel free to contact me via the WeChat account left in the email.

Cndbk commented on May 19, 2024

Hi. Does BlendMask support export to ONNX now?

blueardour commented on May 19, 2024

Try the following steps (paths/filenames should be adjusted for your own machine):

FASTDIR=~/workspace

# pretrained BN-head weights, downloaded from:
# https://cloudstor.aarnet.edu.au/plus/s/hI15l4ChWFqWvHp/download
pytorch_model=$PWD/weights/RT_R_50_4x_bn-head_syncbn_shtw.pth
onnx_repo=$PWD/weights/

config=configs/BlendMask/RT_R_50_4x_bn-head_syncbn_shtw.yaml
case=blendmask-bn-head
height=640
width=480

# re-export if the onnx file is missing or $update is set in the environment
if [ ! -e $onnx_repo/$case.onnx ] || [ "$update" != "" ];
then
  cd $FASTDIR/git/uofa-AdelaiDet/ # folder of the project https://github.com/aim-uofa/AdelaiDet
  pwd
  python -V # ensure python 3.x
  python onnx/export_model_to_onnx.py \
    --config-file $config \
    --output $onnx_repo/$case.onnx \
    --width $width --height $height \
    --opts MODEL.WEIGHTS $pytorch_model MODEL.DEVICE cpu
  if [ $? -ne 0 ]; then exit; fi
fi
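
After the script finishes, a quick (illustrative) check of the produced file, e.g. to confirm the opset and the fixed input resolution baked in by --width/--height:

# Inspect the exported file ($onnx_repo/$case.onnx from the script above).
import onnx

m = onnx.load("weights/blendmask-bn-head.onnx")
print("opset:", m.opset_import[0].version)
for inp in m.graph.input:
    dims = [d.dim_value for d in inp.type.tensor_type.shape.dim]
    print("input :", inp.name, dims)
for out in m.graph.output:
    print("output:", out.name)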

linhaoqi027 commented on May 19, 2024

Hello, I have converted BlendMask into an ONNX model according to the code you provided. What is the meaning of the outputs of the converted ONNX model?

Takugo commented on May 19, 2024

@linhaoqi027 Hi, could you please provide your successfully converted ONNX model? It would be very helpful. I got stuck converting any pth model into ONNX; ArrayRef errors pop up.

hanchangdong commented on May 19, 2024

> Hello, I have converted BlendMask into an ONNX model according to the code you provided. What is the meaning of the outputs of the converted ONNX model?

Hello, how do you run the script (export_model_to_onnx.py)? When I run it to export BlendMask, I get this error:
RuntimeError: ONNX export failed: Couldn't export Python operator _ModulatedDeformConv

zxcvbml commented on May 19, 2024

> @linhaoqi027 Hi, could you please provide your successfully converted ONNX model? It would be very helpful. I got stuck converting any pth model into ONNX; ArrayRef errors pop up.

Hi, have you solved it? I have the same problem.

13572320829 commented on May 19, 2024

> Hello, how do you run the script (export_model_to_onnx.py)? When I run it to export BlendMask, I get this error: RuntimeError: ONNX export failed: Couldn't export Python operator _ModulatedDeformConv

You can use another model that doesn't contain _ModulatedDeformConv, for example blendmask_550_r_50_3x.pth.
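
A hedged way to check up front whether a given config will hit the _ModulatedDeformConv problem is to look at the backbone's deformable-conv flags (standard detectron2 config keys; the config path below is only an example):

# Check whether the backbone enables modulated deformable convolutions.
from adet.config import get_cfg

cfg = get_cfg()
cfg.merge_from_file("configs/BlendMask/550_R_50_3x.yaml")  # example config
# All False means the backbone uses plain convolutions, so the ONNX exporter
# never meets the unsupported _ModulatedDeformConv operator.
print(cfg.MODEL.RESNETS.DEFORM_ON_PER_STAGE)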

xunzha commented on May 19, 2024

> Sorry, I currently have no time to develop the post-processing code. You might contact me by email if you have a strong will to implement the whole network.
>
> BTW, if speed is a concern, my new repo on model quantization might be helpful: https://github.com/blueardour/model-quantization Note that the detection/segmentation related files are still in preparation.

Hi, are you developing the post-processing code for BlendMask now?
I have contacted you by email.

