Graph Transformer Networks

This repository is the implementation of Graph Transformer Networks (GTN) and Fast Graph Transformer Networks with Non-local Operations (FastGTN).

Seongjun Yun, Minbyul Jeong, Raehyun Kim, Jaewoo Kang, Hyunwoo J. Kim, Graph Transformer Networks, In Advances in Neural Information Processing Systems (NeurIPS 2019).

Seongjun Yun, Minbyul Jeong, Sungdong Yoo, Seunghun Lee, Sean S. Yi, Raehyun Kim, Jaewoo Kang, Hyunwoo J. Kim, Graph Transformer Networks: Learning meta-path graphs to improve GNNs, Neural Networks 2022.

Updates

  • [Sep 19, 2022] We released the source code of our FastGTN with non-local operations, which improves GTN's scalability (Fast) and performance (non-local operations).
  • [Sep 19, 2022] We updated the source code of our GTNs to address an issue where the latest version of torch_geometric removed the backward() of sparse-sparse matrix multiplication (spspmm). Specifically, we re-implemented the sparse matrix multiplication using torch.sparse.mm, which supports backward(); see the sketch below.
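
The fix can be illustrated with a minimal sketch (not the repository's actual code): torch.sparse.mm multiplies two sparse matrices and lets gradients flow back to the values of both operands, which is what the removed spspmm backward() used to provide.

    import torch

    # Two sparse 3x3 matrices in COO format; requires_grad makes their
    # values trainable, standing in for GTN's learned adjacency matrices.
    idx = torch.tensor([[0, 1, 2], [1, 2, 0]])
    A = torch.sparse_coo_tensor(idx, torch.rand(3), (3, 3), requires_grad=True)
    B = torch.sparse_coo_tensor(idx, torch.rand(3), (3, 3), requires_grad=True)

    C = torch.sparse.mm(A, B)    # sparse @ sparse -> sparse, autograd-aware
    loss = C.to_dense().sum()    # any scalar objective
    loss.backward()              # works, unlike spspmm in recent torch_geometric
    print(A.grad)                # sparse gradient w.r.t. A's values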

Installation

Install PyTorch

Install torch_geometric

To run the previous version of GTN (in the prev_GTN folder):

$ pip install torch-sparse-old

Note: the latest version of torch_geometric removed the backward() of sparse-sparse matrix multiplication (spspmm). To work around this, we uploaded the old version of torch-sparse, which still includes backward(), to pip under the name torch-sparse-old.

Data Preprocessing

We used datasets from Heterogeneous Graph Attention Networks (Xiao Wang et al.) and uploaded the preprocessing code for the ACM dataset as an example.
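
As a rough sketch of what ends up in each dataset folder, loading the preprocessed files looks like the following; the file names (node_features.pkl, edges.pkl, labels.pkl) are assumptions based on the data layout this repository expects.

    import pickle

    # Each dataset folder is assumed to hold three pickled objects:
    #   node_features: (num_nodes, feature_dim) matrix
    #   edges:         one adjacency matrix per edge type
    #   labels:        train/valid/test node labels
    with open('data/ACM/node_features.pkl', 'rb') as f:
        node_features = pickle.load(f)
    with open('data/ACM/edges.pkl', 'rb') as f:
        edges = pickle.load(f)
    with open('data/ACM/labels.pkl', 'rb') as f:
        labels = pickle.load(f)
    print(len(edges), 'edge types')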

Running the code

Note: to reproduce the best performance of GTN on the DBLP and ACM datasets, we recommend running the GTN in OpenHGNN, implemented with the DGL library. Since the newly used torch.sparse.mm requires more memory than the previous torch_sparse.spspmm, it was not possible to run the best configuration (num_layers > 1) on the DBLP and ACM datasets here.

$ mkdir data
$ cd data

Download the datasets (DBLP, ACM, IMDB) from this link and extract data.zip into the data folder.

$ cd ..
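After extraction, the data folder should contain one subfolder per dataset; the per-dataset file names below are assumptions matching the loading sketch above.

    data/
      DBLP/  node_features.pkl, edges.pkl, labels.pkl
      ACM/   node_features.pkl, edges.pkl, labels.pkl
      IMDB/  node_features.pkl, edges.pkl, labels.pkl
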
  • DBLP

    • GTN
     $ python main.py --dataset DBLP --model GTN --num_layers 1 --epoch 50 --lr 0.02 --num_channels 2
    
    • FastGTN
    1. w/ non-local operations (requires >24 GB of GPU memory)
     $ python main.py --dataset DBLP --model FastGTN --num_layers 4 --epoch 100 --lr 0.02 --channel_agg mean --num_channels 2 --non_local_weight 0 --K 3 --non_local
    
    2. w/o non-local operations
     $ python main.py --dataset DBLP --model FastGTN --num_layers 4 --epoch 100 --lr 0.02 --channel_agg mean --num_channels 2
    
  • ACM

    • GTN
     $ python main_gpu.py --dataset ACM --model GTN --num_layers 1 --epoch 100 --lr 0.02 --num_channels 2
    
    • FastGTN
    1. w/ non-local operations
     $ python main_gpu.py --dataset ACM --model FastGTN --num_layers 3 --epoch 200 --lr 0.05 --channel_agg mean --num_channels 2 --non_local_weight -1 --K 1 --non_local
    
    2. w/o non-local operations
     $ python main_gpu.py --dataset ACM --model FastGTN --num_layers 3 --epoch 200 --lr 0.05 --channel_agg mean --num_channels 2
    
  • IMDB

    • GTN
     $ python main.py --dataset IMDB --model GTN --num_layers 2 --epoch 50 --lr 0.02 --num_channels 2
    
    • FastGTN
    1. w/ non-local operations
     $ python main.py --dataset IMDB --model FastGTN --num_layers 3 --epoch 50 --lr 0.02 --channel_agg mean --num_channels 2 --non_local_weight -2 --K 2 --non_local
    
    2. w/o non-local operations
     $ python main.py --dataset IMDB --model FastGTN --num_layers 3 --epoch 50 --lr 0.02 --channel_agg mean --num_channels 2
    

Citation

If this work is useful for your research, please cite our GTN and FastGTN:

@inproceedings{yun2019GTN,
  title={Graph Transformer Networks},
  author={Yun, Seongjun and Jeong, Minbyul and Kim, Raehyun and Kang, Jaewoo and Kim, Hyunwoo J},
  booktitle={Advances in Neural Information Processing Systems},
  pages={11960--11970},
  year={2019}
}
@article{yun2022FastGTN,
  title={Graph Transformer Networks: Learning meta-path graphs to improve GNNs},
  author={Yun, Seongjun and Jeong, Minbyul and Yoo, Sungdong and Lee, Seunghun and Yi, Sean S. and Kim, Raehyun and Kang, Jaewoo and Kim, Hyunwoo J},
  journal={Neural Networks},
  volume={153},
  pages={104--119},
  year={2022}
}
