guided-attention-inference-network's People

Contributors

alokwhitewolf


guided-attention-inference-network's Issues

Trying to understand why xp.random.choice was used in the set_init_grad function when the labels are available

Thanks for your code.
I have a question about the set_init_grad code referenced here:

def set_init_grad(self, var, label):
    var.grad = self.xp.zeros_like(var.data)
    if label is None:
        # No label available: seed the gradient at the predicted class.
        class_id = F.argmax(var).data
        var.grad[0][class_id] = 1
    else:
        # Label available: seed the gradient at one randomly chosen
        # ground-truth class.
        class_id = self.xp.random.choice(label, 1)
        var.grad[0][class_id] = 1
    return class_id

On line 61, you use xp.random.choice(label, 1). Can you explain why you use a random choice when the ground-truth label indices are known? I did not see this in the original paper and am curious whether it improved your accuracy or performance. @alokwhitewolf
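If the intent was to use the ground-truth labels directly, a hypothetical variant (my own illustration, not code from this repo) would seed the gradient at every ground-truth class instead of one sampled at random:

def set_init_grad_all_labels(self, var, label):
    # Hypothetical alternative: put a 1 in the gradient at every
    # ground-truth class index instead of one sampled at random.
    var.grad = self.xp.zeros_like(var.data)
    for class_id in label:
        var.grad[0][class_id] = 1
    return label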

Do I need to install any deep-learning libraries?

Thanks a lot for your great work.
I have read the paper and am now trying to reproduce the results.
For this purpose, do I need to install any deep-learning libraries, such as PyTorch or TensorFlow?
Thanks a lot.
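Judging from the code quoted in the issue above (self.xp, F.argmax) and the Chainer snapshot issue further down, the repo appears to be built on Chainer rather than PyTorch or TensorFlow. A minimal environment check under that assumption:

# Assumption: the repo is Chainer-based (inferred from the self.xp /
# F.argmax idioms above and the Chainer snapshot issue below).
# Verify that Chainer imports before running the training scripts.
import chainer
import chainer.functions as F
print(chainer.__version__)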

No heatmaps after minimal training despite reasonable accuracy

Hi, thank you for your work.
Do you have any suggestions about this behavior?

[screenshot: attention-map visualization]

I ran the classifier training (without GAIN) for the minimal number of iterations, just enough to get a snapshot of the model, and did the same with GAIN (I tried starting from the previously pretrained model and also without pretraining; the accuracy was the same either way, with no change).

The accuracy both with and without GAIN is about 72%. The identical accuracy may be explainable (not trained enough), but shouldn't there be at least some heat in the heatmaps at 72% accuracy?

Is this expected or anomalous behavior?
I didn't use any pretrained FCN8 models, because I couldn't find them available anymore, so these results are all I can look at.
If it is anomalous, what could be the reason, and where would you suggest I look for the problem?

Thank you in advance; I appreciate your help.
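For context, GAIN builds its attention maps Grad-CAM style; a minimal numpy sketch of how such a heatmap is formed from a conv feature map and its gradients (an illustration, not the repo's code) is:

import numpy as np

def gradcam(feature_map, grad):
    # feature_map, grad: arrays of shape (channels, H, W).
    # The heatmap is a ReLU'd, gradient-weighted sum of feature maps.
    weights = grad.mean(axis=(1, 2))                   # global-average-pooled gradients
    cam = np.tensordot(weights, feature_map, axes=1)   # weighted sum over channels
    cam = np.maximum(cam, 0)                           # keep positive evidence only
    return cam / (cam.max() + 1e-8)                    # normalize to [0, 1]

If the classifier's gradients are still diffuse at 72% accuracy, the resulting maps can look flat even when classification is partly working.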

ValueError: Loss is nan.

Thank you for sharing the implementation. I tried to reproduce the results by running train_classifier.py, which produces classifier_model_14640, and then running train_GAIN.py with classifier_model_14640 as the pretrained FCN8 model. However, it throws the error 'ValueError: Loss is nan.' at line 206 of fcn8.py. All variables after conv2 appear to be nan. Could you please have a look at this problem? Many thanks.
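A first debugging step (a generic sketch, assuming a standard Chainer model; namedparams() is Chainer's built-in parameter iterator) is to check whether any parameters are already nan after loading the snapshot:

def find_nan_params(model):
    # Walk every registered parameter and report the ones containing nan.
    # `model` is any chainer.Link; param.xp is numpy or cupy as appropriate.
    for name, param in model.namedparams():
        if param.data is not None and param.xp.isnan(param.data).any():
            print('nan parameter:', name)

If conv2's outputs are the first to go nan, a too-large learning rate or a mis-loaded snapshot (see the fc6/b issue below) are common culprits.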

Pretrained model not available

The pretrained model is not available. I tried following the code with "python3 train_classifier.py --device 0", but the file is not there to use.

The implementation of the attention loss

Thanks for this great implementation. I went through your code and found that this repo implements the classification loss and the attention-mining loss from the paper. Do you have a plan to add the paper's attention loss (directly comparing the L2 distance between the attention map and the pixel-wise mask)? Meanwhile, I notice that you add a segmentation loss in this repo. Is this segmentation loss helpful for generating an accurate attention map?
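For concreteness, that attention loss is just a squared distance between the generated attention map and the pixel-wise mask; a minimal Chainer sketch of such a term (my own illustration, not code from this repo, assuming map and mask are Variables of matching shape) would be:

import chainer.functions as F

def attention_loss(attention_map, mask):
    # External supervision: penalize the squared distance between the
    # attention map and the pixel-wise ground-truth mask.
    return F.mean_squared_error(attention_map, mask)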

Some variables are assigned but never used

In the visualize.py file, some variables are assigned but never used: for example, trainer on line 47, and there is an undefined variable on line 49, etc.

Could you please check this out?

KeyError: 'fc6/b is not a file in the archive'

After training the classifier, it generates a snapshot. When I use this snapshot to train GAIN, this error occurs. Maybe fc6/b is the bias of a convolutional layer, which is not in the classifier model; in the classifier, it is a fully connected layer named fc6_cl.
I am new to Chainer; how do I load the snapshot into the GAIN model?
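One common workaround (a sketch, assuming a reasonably recent Chainer in which serializers.load_npz accepts a strict flag) is to load the snapshot non-strictly, so entries with no matching parameter, such as fc6/b, are skipped:

import chainer

# `gain_model` is hypothetical - the GAIN network being initialized.
# strict=False skips snapshot entries absent from the target model, so
# the mismatched fc6 / fc6_cl parameters are simply left at their
# initial values instead of raising a KeyError.
chainer.serializers.load_npz('classifier_model_14640', gain_model, strict=False)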
