Comments (5)
Hi, thank you for your support!
The code for the Fourier analysis is messy at the moment, so I haven't released it yet. I will release it after re-implementing it.
The snippet below is pseudo-code for the Fourier analysis. I hope it helps.
import math
import numpy as np
import torch
import matplotlib.pyplot as plt

def fourier(x):  # 2D Fourier transform
    f = torch.fft.fft2(x)
    f = f.abs() + 1e-6  # epsilon avoids log(0)
    f = f.log()
    return f

def shift(x):  # Shift the zero-frequency component to the center
    b, c, h, w = x.shape
    return torch.roll(x, shifts=(h // 2, w // 2), dims=(2, 3))
fig, ax = plt.subplots(1, 1, figsize=(3.3, 4))
for latent in latents:  # `latents` is a list of hidden feature maps in latent spaces
    if len(latent.shape) == 3:  # For ViT: (batch, tokens, channels)
        b, n, c = latent.shape
        h, w = int(math.sqrt(n)), int(math.sqrt(n))
        latent = latent.permute(0, 2, 1).reshape(b, c, h, w)
    elif len(latent.shape) == 4:  # For CNN: (batch, channels, height, width)
        b, c, h, w = latent.shape
    else:
        raise Exception("shape: %s" % str(latent.shape))
    latent = fourier(latent)
    latent = shift(latent).mean(dim=(0, 1))
    latent = latent.diag()[h // 2:]  # Only use the half-diagonal components
    latent = latent - latent[0]  # Visualize 'relative' log amplitudes
    # Plot the Fourier-transformed relative log amplitudes
    freq = np.linspace(0, 1, len(latent))
    ax.plot(freq, latent)
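For reference, here is a NumPy-only sketch of the same pipeline (my own illustration under stated assumptions, not the released code): `np.fft.fftshift` plays the role of the `shift` helper above, and the half-diagonal of the averaged log spectrum gives the relative log amplitudes.

```python
import numpy as np

def fourier_log_amplitude(x):
    # 2D FFT over the spatial dims, then log amplitude (epsilon avoids log(0))
    f = np.fft.fft2(x, axes=(2, 3))
    return np.log(np.abs(f) + 1e-6)

def relative_half_diagonal(x):
    # Move the zero-frequency component to the center (like torch.roll above),
    # average over batch and channel, and keep the half-diagonal from the
    # center outward, normalized so the lowest frequency sits at 0
    b, c, h, w = x.shape
    shifted = np.fft.fftshift(x, axes=(2, 3))
    diag = np.diagonal(shifted.mean(axis=(0, 1)))[h // 2:]
    return diag - diag[0]

feat = np.random.randn(2, 3, 14, 14)  # hypothetical (b, c, h, w) feature map
amp = relative_half_diagonal(fourier_log_amplitude(feat))
```

For a 14x14 map this yields 7 relative log amplitudes, from the lowest frequency (always 0 by construction) to the highest.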
from how-do-vits-work.
Hi @dinhanhx, `latents` is the hidden states (latent feature maps). If you're using timm, you can get `latents` with the snippet below:
import copy
import timm
import torch
import torch.nn as nn
# Divide the pretrained timm model into blocks.
name = 'vit_tiny_patch16_224'
model = timm.create_model(name, pretrained=True)

class PatchEmbed(nn.Module):
    def __init__(self, model):
        super().__init__()
        self.model = copy.deepcopy(model)

    def forward(self, x, **kwargs):
        x = self.model.patch_embed(x)
        cls_token = self.model.cls_token.expand(x.shape[0], -1, -1)
        x = torch.cat((cls_token, x), dim=1)
        x = self.model.pos_drop(x + self.model.pos_embed)
        return x

class Residual(nn.Module):
    def __init__(self, *fn):
        super().__init__()
        self.fn = nn.Sequential(*fn)

    def forward(self, x, **kwargs):
        return self.fn(x, **kwargs) + x

class Lambda(nn.Module):
    def __init__(self, fn):
        super().__init__()
        self.fn = fn

    def forward(self, x):
        return self.fn(x)

def flatten(xs_list):
    return [x for xs in xs_list for x in xs]

# `blocks` is a sequence of blocks
blocks = [
    PatchEmbed(model),
    *flatten([[Residual(b.norm1, b.attn), Residual(b.norm2, b.mlp)]
              for b in model.blocks]),
    nn.Sequential(Lambda(lambda x: x[:, 0]), model.norm, model.head),
]
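Conceptually, this decomposition just rewrites the ViT forward pass as a flat list of callables, each residual block having the form x → f(x) + x. A framework-free sketch of the same idea (plain functions standing in for the modules above, with made-up numbers):

```python
def residual(fn):
    # Wrap fn as x -> fn(x) + x, mirroring the Residual module above
    return lambda x: fn(x) + x

def run_blocks(blocks, x, collect=None):
    # Apply blocks in order; optionally record every intermediate state
    for block in blocks:
        x = block(x)
        if collect is not None:
            collect.append(x)
    return x

demo_blocks = [residual(lambda x: 2 * x),  # like Residual(norm1, attn)
               residual(lambda x: x + 1)]  # like Residual(norm2, mlp)
demo_latents = []
out = run_blocks(demo_blocks, 1.0, collect=demo_latents)
# 1.0 -> 2*1.0 + 1.0 = 3.0 -> (3.0 + 1) + 3.0 = 7.0
```

Collecting the state after every block is exactly how `latents` is accumulated in the snippet further down.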
The snippet below builds on https://github.com/facebookresearch/mae:
import requests
import torch
import numpy as np
from PIL import Image
from einops import rearrange

imagenet_mean = np.array([0.485, 0.456, 0.406])
imagenet_std = np.array([0.229, 0.224, 0.225])

# Load an image
img_url = 'https://user-images.githubusercontent.com/11435359/147738734-196fd92f-9260-48d5-ba7e-bf103d29364d.jpg'
xs = Image.open(requests.get(img_url, stream=True).raw)
xs = xs.resize((224, 224))
xs = np.array(xs) / 255.
assert xs.shape == (224, 224, 3)

# Normalize by ImageNet mean and std
xs = xs - imagenet_mean
xs = xs / imagenet_std
xs = rearrange(torch.tensor(xs, dtype=torch.float32), 'h w c -> 1 c h w')

# Accumulate `latents` by collecting the hidden states of the model
latents = []
with torch.no_grad():
    for block in blocks:
        xs = block(xs)
        latents.append(xs)

latents = [latent[:, 1:] for latent in latents]  # Drop the CLS token
latents = latents[:-1]  # Drop the logits
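The token-to-grid reshape that the Fourier code performs on these ViT latents (the `len(latent.shape) == 3` branch earlier) can be illustrated with NumPy alone; the shapes here (1 CLS token + 196 patch tokens, embedding dim 8) are hypothetical:

```python
import numpy as np

# Hypothetical ViT hidden state: batch 1, 1 CLS token + 196 patch tokens, dim 8
tokens = np.random.randn(1, 197, 8)
patch = tokens[:, 1:]            # drop the CLS token, as above
b, n, c = patch.shape
h = w = int(np.sqrt(n))          # 196 tokens -> a 14x14 spatial grid
# Move channels first, then unfold the token axis into (h, w)
grid = patch.transpose(0, 2, 1).reshape(b, c, h, w)
```

Token i*w + j of a given channel lands at spatial position (i, j), so the grid can be fed to a 2D FFT like any CNN feature map.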
I can't give a definite timeline, but I’ll try hard to release the whole code for Fourier analysis by next Friday!
Alright, thanks for the work!
I have just released the code for the Fourier analysis! Please refer to the fourier_analysis.ipynb notebook. The code can also run on Colab (no GPU needed).
Please feel free to reopen this issue if the problem still exists.
@xxxnell when will you be able to release the code for all the Fourier stuff? (You don't need to give exact days or dates; an approximate answer like "next week" is fine.)
Also, how do I get the `latents` of a model?