Comments (5)

Kylin9511 commented on July 25, 2024

And if you really want to express your disapproval of module re-use that strongly, a warning is more appropriate, since it won't interrupt the program. 😄

By raising a Warning, you are generating an error, not a real warning.
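
In Python, raise Warning(...) really does raise an exception and halt the program like any other error; warnings.warn(...) is what emits a real, non-fatal warning. A minimal sketch of the difference:

import warnings

# warnings.warn emits a warning and execution continues:
warnings.warn("module is reused; its FLOPs may be double-counted")
print("still running")  # this line is reached

# raise Warning(...) aborts, exactly like raising any other exception:
raise Warning("module is reused")
print("never reached")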

Lyken17 commented on July 25, 2024

Question: does the example code you provided actually trigger the warning? Even though the ReLU is forwarded several times, the registration should only happen once.

Kylin9511 commented on July 25, 2024

@Lyken17 Sorry, I haven't dug deep into your code's logic; look at the following network design instead.

from collections import OrderedDict
import torch.nn as nn

def conv3x3_bn(in_ch, out_ch):  # assumed helper: 3x3 conv + batch norm
    return nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False),
                         nn.BatchNorm2d(out_ch))

class SomeNet(nn.Module):
    def __init__(self, in_channel=3, reduction=4):
        super(SomeNet, self).__init__()
        # a single activation instance, shared across sub-modules
        self.activation = nn.LeakyReLU(negative_slope=0.3, inplace=True)

        self.encoder_feature = nn.Sequential(OrderedDict([
            ("conv3x3_bn", conv3x3_bn(in_channel, 2)),
            ("activation", self.activation),  # the shared instance, re-used here
        ]))
        ...

The problem has nothing to do with module reuse in the forward function. THOP recursively enters each leaf nn.Module and checks whether any module is "re-defined", i.e. registered in more than one place.
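
(For illustration, a minimal sketch of the kind of duplicate-module detection described here; this is an assumption about the check's behavior, not THOP's actual code:)

import torch.nn as nn

def find_shared_modules(model: nn.Module):
    # Assumed sketch, not THOP's actual implementation: walk the module
    # tree and report any module object reachable under more than one path.
    seen, shared = {}, []

    def visit(module, path):
        if id(module) in seen:  # same object encountered before -> shared
            shared.append((seen[id(module)], path))
            return
        seen[id(module)] = path
        for name, child in module.named_children():
            visit(child, f"{path}.{name}" if path else name)

    visit(model, "")
    return shared

# For the SomeNet above this reports:
# [('activation', 'encoder_feature.activation')]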

However, this kind of "re-definition" is legal: I want to define one globally shared activation function and plant it in many sub-modules, and THOP stops me from doing that.

Like I said, even if my network definition were wrong, it is not THOP's job to point that out. And even if THOP wants to point it out, it should use warnings rather than raise an error.

As a FLOPs-counting tool, it should not perform extra judgment or sanity checks. Correct FLOPs and params output is the only thing to focus on.
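
(For reference, plain counting with THOP looks roughly like this; the model and input shape below are stand-ins for illustration:)

import torch
import torch.nn as nn
from thop import profile

model = nn.Sequential(nn.Conv2d(3, 8, kernel_size=3, padding=1),
                      nn.LeakyReLU(negative_slope=0.3))
x = torch.randn(1, 3, 32, 32)  # assumed input shape
macs, params = profile(model, inputs=(x,))  # returns MACs and parameter count
print(macs, params)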

Lyken17 commented on July 25, 2024

It is a weird usage, I have to say. For your code example, each activation should be its own nn.Module instance. The proper case for sharing would be a recurrent structure, where one layer is used at different depths. I have relaxed the check: it now prints a warning instead of raising an error.
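
(For illustration, a hypothetical example of that legitimate reuse case, where one cell is forwarded at several depths:)

import torch
import torch.nn as nn

class TinyRecurrentNet(nn.Module):
    # Hypothetical: the same cell is applied at several depths,
    # as in a recurrent structure.
    def __init__(self, dim=16, steps=3):
        super().__init__()
        self.cell = nn.Linear(dim, dim)  # one module, forwarded several times
        self.steps = steps

    def forward(self, x):
        for _ in range(self.steps):
            x = torch.relu(self.cell(x))
        return x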

Kylin9511 commented on July 25, 2024

Yet a legal usage, and a convenient one, too. I still don't see the point of checking the user's network... If I were you I would drop the redundant check, but after all it is your project 😂
