
Comments (5)

fuyawangye commented on July 20, 2024

@newExplore-hash
Modifying "networks/AugmentCE2P.py" as follows works better:

import torch.nn as nn

# from ..modules import InPlaceABNSync
# BatchNorm2d = functools.partial(InPlaceABNSync, activation='none')
BatchNorm2d = nn.BatchNorm2d


class InPlaceABNSync(BatchNorm2d):
    """BatchNorm2d with a fused LeakyReLU, as a single-module stand-in for InPlaceABNSync."""

    def __init__(self, *args, **kwargs):
        super(InPlaceABNSync, self).__init__(*args, **kwargs)
        self.act = nn.LeakyReLU()

    def forward(self, input):
        output = super(InPlaceABNSync, self).forward(input)
        output = self.act(output)
        return output


GoGoDuck912 commented on July 20, 2024

Hi @newExplore-hash,

In brief, you don't need to install the latest InPlaceABNSync yourself. The CUDA files in ./modules are compiled by PyTorch automatically.

As for your environment problems, please refer to https://github.com/mapillary/inplace_abn/tree/v0.1.1 for the required packages.


GoGoDuck912 commented on July 20, 2024

Moreover, if you only need inference, you can replace each InPlaceABNSync layer with one BatchNorm2d layer and one LeakyReLU layer in PyTorch, as sketched below.
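
For one InPlaceABNSync(256) layer, the substitution could look like the sketch below (the 256-channel width is just an example; at inference, InPlaceABNSync amounts to batch normalization followed by a leaky ReLU, since the in-place trick only matters for training memory):

import torch.nn as nn

# Inference-time stand-in for a single InPlaceABNSync(256) layer.
# InPlaceABN's default activation is leaky_relu with slope 0.01, which is
# also the default negative_slope of nn.LeakyReLU.
replacement = nn.Sequential(
    nn.BatchNorm2d(256),
    nn.LeakyReLU(),
)

Note, though, that how these two layers are spliced into an existing nn.Sequential changes the state_dict key names; the next comments run into exactly that.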


newExplore-hash commented on July 20, 2024

Moreover, if you only need inference, you can replace each InPlaceABNSync layer with one BatchNorm2d layer and one LeakyReLU layer in PyTorch.

Yes, I just use this for inference; I know the role of InPlaceABNSync is to reduce the memory required for training deep networks. So I used nn.BatchNorm2d and nn.LeakyReLU instead of InPlaceABNSync in AugmentCE2P.py for inference, but I ran into the following problem:
Traceback (most recent call last):
  File "simple_extractor.py", line 166, in <module>
    main()
  File "simple_extractor.py", line 115, in main
    model.load_state_dict(new_state_dict)
  File "/root/miniconda3/envs/human_parsing/lib/python3.6/site-packages/torch/nn/modules/module.py", line 845, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for ResNet:
    Missing key(s) in state_dict: "decoder.conv3.4.weight", "decoder.conv3.4.bias", "decoder.conv3.4.running_mean", "decoder.conv3.4.running_var".
    Unexpected key(s) in state_dict: "decoder.conv3.2.weight", "decoder.conv3.3.bias", "decoder.conv3.3.running_mean", "decoder.conv3.3.running_var".
    size mismatch for decoder.conv3.3.weight: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([256, 256, 1, 1]).

This error only happens in Decoder_Module for self.conv3; PSPModule, Edge_Module, and the other Decoder_Module layers (self.conv1 and self.conv2) load normally.

This is the code:
class Decoder_Module(nn.Module):
    """
    Parsing Branch Decoder Module.
    """

    def __init__(self, num_classes):
        super(Decoder_Module, self).__init__()
        # InPlaceABNSync is aliased to nn.BatchNorm2d here, with nn.LeakyReLU()
        # added after each use (the added lines are marked below).
        self.conv1 = nn.Sequential(
            nn.Conv2d(512, 256, kernel_size=1, padding=0, dilation=1, bias=False),
            InPlaceABNSync(256),
            nn.LeakyReLU()  # added
        )
        self.conv2 = nn.Sequential(
            nn.Conv2d(256, 48, kernel_size=1, stride=1, padding=0, dilation=1, bias=False),
            InPlaceABNSync(48),
            nn.LeakyReLU()  # added
        )
        self.conv3 = nn.Sequential(
            nn.Conv2d(304, 256, kernel_size=1, padding=0, dilation=1, bias=False),
            InPlaceABNSync(256),
            nn.LeakyReLU(),  # added
            nn.Conv2d(256, 256, kernel_size=1, padding=0, dilation=1, bias=False),
            InPlaceABNSync(256),
            nn.LeakyReLU()  # added
        )

        self.conv4 = nn.Conv2d(256, num_classes, kernel_size=1, padding=0, dilation=1, bias=True)

Thanks.
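
For context: nn.Sequential names parameters by child index, so the nn.LeakyReLU() modules inserted above shift every later index, and the checkpoint keys no longer line up. A minimal sketch reproducing the conv3 mismatch (sizes taken from the code above):

import torch.nn as nn

# conv3 as it was when the checkpoint was saved: children indexed 0..3
saved = nn.Sequential(
    nn.Conv2d(304, 256, kernel_size=1, bias=False),  # conv3.0
    nn.BatchNorm2d(256),                             # conv3.1 (InPlaceABNSync)
    nn.Conv2d(256, 256, kernel_size=1, bias=False),  # conv3.2
    nn.BatchNorm2d(256),                             # conv3.3 (InPlaceABNSync)
)

# conv3 with nn.LeakyReLU() spliced in as separate modules: children indexed 0..5
current = nn.Sequential(
    nn.Conv2d(304, 256, kernel_size=1, bias=False),  # conv3.0
    nn.BatchNorm2d(256),                             # conv3.1
    nn.LeakyReLU(),                                  # conv3.2  <- shifts the rest
    nn.Conv2d(256, 256, kernel_size=1, bias=False),  # conv3.3
    nn.BatchNorm2d(256),                             # conv3.4
    nn.LeakyReLU(),                                  # conv3.5
)

current.load_state_dict(saved.state_dict())
# RuntimeError: Missing key(s): "4.weight", ...; Unexpected key(s): "2.weight", ...;
# size mismatch for 3.weight: torch.Size([256]) vs torch.Size([256, 256, 1, 1])

This also explains why self.conv1 and self.conv2 load fine: there the extra nn.LeakyReLU() sits after the last parameterized layer, so no indices shift.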


fuyawangye commented on July 20, 2024

I modified "networks/AugmentCE2P.py" to replace the C++-extension "InPlaceABNSync" for inference on CPU, and it works:

import torch.nn as nn

# from ..modules import InPlaceABNSync
# BatchNorm2d = functools.partial(InPlaceABNSync, activation='none')
BatchNorm2d = nn.BatchNorm2d


class ReplacePlaceABNSync(BatchNorm2d):
    """BatchNorm2d with a fused LeakyReLU, as a single-module stand-in for InPlaceABNSync."""

    def __init__(self, *args, **kwargs):
        super(ReplacePlaceABNSync, self).__init__(*args, **kwargs)
        self.act = nn.LeakyReLU()

    def forward(self, input):
        output = super(ReplacePlaceABNSync, self).forward(input)
        output = self.act(output)
        return output
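
A quick check of why this works: the activation lives inside the subclass, so the nn.Sequential child indices, and therefore the state_dict keys, are exactly the ones the checkpoint was saved with (nn.LeakyReLU adds no parameters of its own):

# Reusing ReplacePlaceABNSync from above: one child at index 1, not two.
block = nn.Sequential(
    nn.Conv2d(304, 256, kernel_size=1, bias=False),
    ReplacePlaceABNSync(256),
)
print(list(block.state_dict().keys()))
# ['0.weight', '1.weight', '1.bias', '1.running_mean', '1.running_var', '1.num_batches_tracked']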

