gasteigerjo / gdc
Graph Diffusion Convolution, as proposed in "Diffusion Improves Graph Learning" (NeurIPS 2019)
Home Page: https://www.daml.in.tum.de/gdc
License: MIT License
Great paper!
I find your idea similar to the paper "Graph Wavelet Neural Network": it also changes the eigenvalues of the Laplacian, am I right?
Looking forward to your reply~
We want to use GDC on a large graph with more than 200,000 nodes, but directly calculating T^{-1} and expm(T) is very slow.
We also noticed your advice in "How to use GDC on large graphs":
For large graphs I'd recommend using more scalable algorithms for PPR and the heat kernel instead of directly calculating T^{-1} and expm(T), as we do in this repository.
However, we can't find these more scalable algorithms for PPR and the heat kernel in this repository.
For large graphs that are difficult to handle in the AHX format (such as some knowledge graphs), could you give us some advice or suggestions for this practical setting?
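(In case it helps other readers: one scalable alternative is the local "push" algorithm for approximate PPR from Andersen et al., which never materializes T^{-1} and only touches nodes with non-negligible residual. A minimal pure-Python sketch on a toy graph; the function name, graph, and parameters are illustrative, not from this repository.)

```python
# Hedged sketch of push-based approximate personalized PageRank
# (Andersen et al., "Local Graph Partitioning using PageRank Vectors").
# Adjacency is a plain dict: node -> list of neighbors.
from collections import defaultdict

def approx_ppr(adj, seed, alpha=0.15, eps=1e-6):
    """Approximate PPR vector for `seed` with teleport probability `alpha`.

    Only nodes whose residual exceeds eps * degree are ever processed,
    so the cost is independent of the total graph size.
    """
    p = defaultdict(float)   # approximate PPR mass
    r = defaultdict(float)   # residual mass still to be distributed
    r[seed] = 1.0
    queue = [seed]
    while queue:
        u = queue.pop()
        res, deg = r[u], len(adj[u])
        if res < eps * deg:
            continue
        p[u] += alpha * res          # keep alpha fraction at u
        r[u] = 0.0
        push = (1 - alpha) * res / deg
        for v in adj[u]:
            # enqueue v only when its residual crosses the threshold
            if r[v] < eps * len(adj[v]) <= r[v] + push:
                queue.append(v)
            r[v] += push
    return dict(p)

# Tiny example: path graph 0 - 1 - 2, seeded at node 0
adj = {0: [1], 1: [0, 2], 2: [1]}
pi = approx_ppr(adj, seed=0, alpha=0.15, eps=1e-6)
```

The returned masses sum to (almost) 1, with the missing mass bounded by eps times the total degree.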
Thanks for writing such a great paper! I noticed it because of DCRNN (Diffusion Convolutional Recurrent Neural Network: Data-Driven Traffic Forecasting). The author of DCRNN first proposed a graph diffusion convolution (based on personalized PageRank), but I found that his code is wrong while implementing DCRNN.
I'm curious about your code. Here you only show the T_rw of GDC (PPR). Is the T_rw of GDC (HK) equal to the T_rw of GDC (PPR) on an undirected graph? And why is the T_rw of GDC (PPR) \alpha (I_n - (1 - \alpha) \hat{A}) instead of D^{-1} A? More importantly, what is the formulation of GDC (PPR)? Is it \sum^{K}_{k=0} \theta_k T_{rw}^{k} x?
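For concreteness, here is a small NumPy sketch of my reading of the GDC (PPR) formulation: the closed form S = \alpha (I_n - (1 - \alpha) T_rw)^{-1} equals the series \sum_k \theta_k T_rw^k with \theta_k = \alpha (1 - \alpha)^k. The adjacency matrix below is a toy example, not from the repository.

```python
# Hedged NumPy sketch: the closed-form PPR diffusion matrix equals
# the geometric series with theta_k = alpha * (1 - alpha)^k.
import numpy as np

# Toy undirected triangle graph (illustrative only)
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
D_inv = np.diag(1.0 / A.sum(axis=0))
T_rw = A @ D_inv          # column-stochastic random-walk transition matrix
alpha = 0.15

# Closed form: S = alpha * (I - (1 - alpha) * T_rw)^{-1}
S_closed = alpha * np.linalg.inv(np.eye(3) - (1 - alpha) * T_rw)

# Truncated series: sum_k alpha * (1 - alpha)^k * T_rw^k
S_series = sum(alpha * (1 - alpha) ** k * np.linalg.matrix_power(T_rw, k)
               for k in range(200))

assert np.allclose(S_closed, S_series, atol=1e-8)
```

Since T_rw is column-stochastic and the theta_k sum to 1, each column of S also sums to 1.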
Dear author,
Could you please provide code for applying diffusion with GAT? I have tried to apply diffusion to GAT, but I can't get a performance improvement with this technique. Could you give me some guidance or tips for using diffusion with GAT?
Best
I have to say, this is an excellent model!
But I have some questions about the datasets.
Hello,
When I try to use it on my data, it is very memory-intensive. Is there a way to make it use less memory?
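(A thought that might help, assuming the memory is dominated by the dense diffusion matrix: aggressive top-k sparsification per column before any downstream use. A hypothetical NumPy sketch; the function name and values are made up.)

```python
# Hedged sketch of top-k sparsification, one way to cut memory use:
# keep only the k largest entries per column of the dense diffusion
# matrix and renormalize, so downstream layers see a sparse matrix.
import numpy as np

def topk_sparsify(S, k):
    """Zero all but the k largest entries in each column, then
    renormalize each column to sum to 1."""
    S_sp = np.zeros_like(S)
    # indices of the k largest rows in each column
    idx = np.argpartition(-S, k - 1, axis=0)[:k]
    cols = np.arange(S.shape[1])
    S_sp[idx, cols] = S[idx, cols]
    return S_sp / S_sp.sum(axis=0, keepdims=True)

# Toy 3x2 "diffusion" matrix (illustrative only)
S = np.array([[0.5, 0.1],
              [0.3, 0.7],
              [0.2, 0.2]])
S2 = topk_sparsify(S, k=2)
```

The result can then be converted to a scipy sparse format so the dense matrix never needs to be kept around.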
Dear authors,
I have one question about the way the sparsified diffusion matrix is used in the subsequent graph convolution.
To my understanding, the sparsified diffusion matrix is used as a weighted adjacency matrix when calling the function GCNConv. This is indicated by the code below. Am I right?
Line 226 in 996bc47
If so, then the graph convolution can be expressed as \mathbf{X}^{\prime} = \mathbf{\hat{D}}^{-1/2} \mathbf{\hat{\tilde{S}}} \mathbf{\hat{D}}^{-1/2} \mathbf{X} \mathbf{\Theta}, in which \mathbf{\tilde{S}} is the sparsified diffusion matrix.
However, from Eq. (3) of your paper, I see that the diffusion matrix is equivalent to the graph filter, which should be multiplied directly with the node attributes. Therefore the convolution should be \mathbf{\tilde{S}} \mathbf{X} \mathbf{\Theta}.
If my understanding is correct, then my question is why the latter form of the convolution is not used in your code.
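To make the difference concrete, here is a toy NumPy sketch of the two propagation rules (all values are made up; I'm assuming GCNConv's default behavior of adding self-loops and applying symmetric normalization):

```python
# Hedged sketch contrasting the two propagation rules discussed above:
# (a) GCN-style renormalization of the sparsified diffusion matrix, i.e.
#     X' = D^{-1/2} (S_tilde + I) D^{-1/2} X W  (what GCNConv would apply), vs
# (b) direct filtering, X' = S_tilde X W  (as in Eq. (3) of the paper).
import numpy as np

S_tilde = np.array([[0.6, 0.2],
                    [0.2, 0.6]])   # toy sparsified diffusion matrix
X = np.array([[1.0], [2.0]])       # toy node attributes
W = np.array([[1.0]])              # toy weight matrix

# (a) GCNConv-style: add self-loops, then symmetrically normalize
S_hat = S_tilde + np.eye(2)
d = S_hat.sum(axis=1)
S_norm = S_hat / np.sqrt(np.outer(d, d))
out_gcn = S_norm @ X @ W

# (b) direct filtering with the diffusion matrix
out_direct = S_tilde @ X @ W
```

The two outputs differ, which is exactly the discrepancy the question is about.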
Best regards,
Simon
The eps for PPR in the config file is blank. Does it have an original (default) value?
Hi, I checked and the accuracy of GAT on PubMed is about 79.0%, but the performance of GAT in the figure (corresponding to the "None" method) is about 76%. I am also curious why GAT performed worse than GCN on Cora/CiteSeer/PubMed.