wjnam / relative_attributing_propagation
Interpreting DNNs, Relative attributing propagation
Hi,
in the paper you say:
We implement RAP with both Pytorch and Keras and visualize the explanation as a heatmap. For the evaluation, we utilized the Keras version to fairly compare with other explaining methods.
Would it be possible for you to share the Keras implementation as well?
First of all, thanks for open-sourcing your code.
I am wondering how the code below, at lines 299 ~ 302 of resnet.py, works:
R = self.layer4.relprop(R, alpha)
R = self.layer3.relprop(R, alpha)
R = self.layer2.relprop(R, alpha)
R = self.layer1.relprop(R, alpha)
For example, in resnet18, self.layer4 contains 2 blocks, block(1) and block(2)
((1) and (2) denote the order in which they were appended).
In my understanding, relevance scores should be propagated from block(2) to block(1).
However, your code just calls self.layer4.relprop(R, alpha),
which looks as if relevance scores are propagated from block(1) to block(2).
Could you explain how this code works correctly?
Thanks.
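A likely explanation (a sketch with hypothetical names, assuming the repo's Sequential container overrides relprop to iterate its children in reverse, as most LRP-style implementations do):

```python
class Sequential:
    """Minimal stand-in (hypothetical, not the repo's actual class) showing
    how a container's relprop can walk its blocks in REVERSE order."""
    def __init__(self, *blocks):
        self.blocks = list(blocks)

    def relprop(self, R, alpha):
        # Forward order is block(1) -> block(2); relevance must flow back
        # block(2) -> block(1), so we iterate the blocks reversed.
        for block in reversed(self.blocks):
            R = block.relprop(R, alpha)
        return R


class Block:
    """Dummy block that records the order in which relprop is called."""
    def __init__(self, name, log):
        self.name, self.log = name, log

    def relprop(self, R, alpha):
        self.log.append(self.name)
        return R


log = []
layer4 = Sequential(Block("block(1)", log), Block("block(2)", log))
layer4.relprop(R=None, alpha=1)
print(log)  # ['block(2)', 'block(1)'] -- backward through the layer
```

So a single call to self.layer4.relprop(R, alpha) can still propagate from block(2) down to block(1), provided the container reverses the iteration internally.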
Could you share the evaluation metric code used to score each attribution map, so that new algorithms can be compared accurately?
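In the meantime, a sketch of one commonly used attribution-evaluation metric, pixel deletion (an assumption on my part; not necessarily the metric used in the paper, and `predict` is a hypothetical callable mapping an image to a scalar score):

```python
import numpy as np

def deletion_curve(image, attribution, predict, steps=10):
    """Pixel-deletion metric sketch: zero out the most relevant pixels
    first and record how the model's score drops. A steeper drop suggests
    the attribution map ranked truly important pixels higher."""
    order = np.argsort(-attribution.flatten())      # most relevant first
    img = image.astype(float).flatten()             # work on a copy
    scores = [predict(img.reshape(image.shape))]
    chunk = max(1, img.size // steps)
    for i in range(0, img.size, chunk):
        img[order[i:i + chunk]] = 0.0               # "delete" these pixels
        scores.append(predict(img.reshape(image.shape)))
    return np.array(scores)
```

Two attribution methods can then be compared by the area under their deletion curves (lower is better, since confidence should collapse quickly when truly relevant pixels are removed first).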
Hi, while running main.py for resnet50 with 365 output classes, I ran into the following error:
torch.Size([1, 2048]) torch.Size([365, 2048]) torch.Size([1, 1000])
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
<ipython-input-3-02abbf51b5f9> in <module>
7 # Res = model.relprop(R = output * T, alpha= 1).sum(dim=1, keepdim=True)
8 #else:
----> 9 RAP = model.RAP_relprop(R=T)
10 Res = (RAP).sum(dim=1, keepdim=True)
11 # Check relevance value preserved
/home/SharedData3/ushasi/tub/gan/modules/resnet.py in RAP_relprop(self, R)
324 return R
325 def RAP_relprop(self, R):
--> 326 R = self.fc.RAP_relprop(R)
327 R = R.reshape_as(self.avgpool.Y)
328 R = self.avgpool.RAP_relprop(R)
/home/SharedData3/ushasi/tub/gan/modules/layers.py in RAP_relprop(self, R_p)
372 pd = R_p
373
--> 374 Rp_tmp = first_prop(pd, px, nx, pw, nw)
375 A = redistribute(Rp_tmp)
376
/home/SharedData3/ushasi/tub/gan/modules/layers.py in first_prop(pd, px, nx, pw, nw)
317 #print(px,pw)
**318 print(px.shape,pw.shape,pd.shape)**
--> 319 Rpp = F.linear(px, pw) * pd
320 Rpn = F.linear(px, nw) * pd
321 Rnp = F.linear(nx, pw) * pd
RuntimeError: The size of tensor a (365) must match the size of tensor b (1000) at non-singleton dimension 1
The top 3 shapes are the result of the print statement I added (in bold).
If I change my T = (T[:, np.newaxis] == np.arange(1000)) * 1.0
to T = (T[:, np.newaxis] == np.arange(365)) * 1.0
in the compute_pred function, the error goes away.
I just wanted to confirm that this is indeed the right way to fix the error; I hope I am not doing something arbitrary to silence the error while producing wrong output.
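For what it's worth, a sketch that derives the class count instead of hard-coding it (assuming a torchvision-style model where the final layer exposes out_features, e.g. the hypothetical model.fc.out_features):

```python
import numpy as np

def one_hot_targets(labels, num_classes):
    """Build the target matrix T that seeds relevance propagation.
    Passing num_classes taken from the model itself (rather than a
    hard-coded 1000) avoids exactly the 365-vs-1000 shape mismatch
    in the trace above."""
    labels = np.asarray(labels)
    return (labels[:, np.newaxis] == np.arange(num_classes)) * 1.0

# e.g. num_classes = model.fc.out_features  (hypothetical attribute)
T = one_hot_targets([3], num_classes=365)
print(T.shape)   # (1, 365)
print(T[0, 3])   # 1.0
```

The row for each label is all zeros except a 1.0 at the label's index, so the relevance seed matches the model's output dimension by construction.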
It appears there is a bug when attempting to use the attribution method with the ResNet18/ResNet34 models. ResNet50 works fine. Perhaps there is a bug in the BasicBlock class. Could you please take a look? The error trace is below.
Traceback (most recent call last):
File "main.py", line 119, in <module>
RAP = model.RAP_relprop(R=T)
File "Relative_Attributing_Propagation/modules/resnet.py", line 335, in RAP_relprop
R = self.layer4.RAP_relprop(R)
File "Relative_Attributing_Propagation/modules/layers.py", line 207, in RAP_relprop
Rp = m.RAP_relprop(Rp)
File "Relative_Attributing_Propagation/modules/resnet.py", line 113, in RAP_relprop
return self.clone.RAP_relprop([x1, x2])
File "Relative_Attributing_Propagation/modules/layers.py", line 154, in RAP_relprop
Rp_tmp = backward(tmp_R_p[i])
File "Relative_Attributing_Propagation/modules/layers.py", line 139, in backward
for z, rp, rn in zip(Z, R_p):
ValueError: not enough values to unpack (expected 3, got 2)
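The trace points at the unpacking inside backward(); a minimal sketch of what is going wrong (my guess: the BasicBlock path hands backward() a flat 2-element R_p, where the Bottleneck path supplies a positive/negative split):

```python
# Minimal reproduction of the crash (illustrative, not the repo's code):
# zip over two plain lists yields 2-tuples, which cannot unpack into
# three names -- exactly the "expected 3, got 2" in the trace above.
Z = [1.0, 2.0]
R_p = [0.5, 0.5]
try:
    for z, rp, rn in zip(Z, R_p):
        pass
except ValueError as err:
    print(err)  # not enough values to unpack (expected 3, got 2)

# A guess at the intended shape: R_p as a list of (positive, negative)
# relevance pairs, unpacked as a nested tuple.
R_p_pairs = [(0.4, 0.1), (0.3, 0.2)]
for z, (rp, rn) in zip(Z, R_p_pairs):
    pass  # unpacks cleanly
```

If that guess is right, the fix would be to make the BasicBlock path build R_p with the same pair structure the Bottleneck path produces before calling backward().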