
hgnn's Introduction

News

We have released a deep learning toolbox named DHG for graph neural networks and hypergraph neural networks. You can find many interesting things in it. Many correlation structures, such as simple graphs, directed graphs, bipartite graphs, and simple hypergraphs, are supported in the toolbox, along with their visualization. For more details, refer to DHG!

Hypergraph Neural Networks

Created by Yifan Feng, Haoxuan You, Zizhao Zhang, Rongrong Ji, and Yue Gao from Xiamen University and Tsinghua University.

Pipeline (figure)

Introduction

This work will appear in AAAI 2019. We propose a novel framework (HGNN) for data representation learning that can take multi-modal data and exhibits superior performance gains compared with single-modal or graph-based multi-modal methods. You can also check our paper for a deeper introduction.

HGNN can encode high-order data correlation in a hypergraph structure. Confronting the challenge of learning representations for complex data in practice, we propose to model such data with a hypergraph, which is more flexible for data modeling, especially when dealing with complex data. In this method, a hyperedge convolution operation is designed to handle the data correlation during representation learning, so that the traditional hypergraph learning procedure can be conducted efficiently with hyperedge convolution operations. HGNN learns the hidden-layer representation while taking the high-order data structure into account, making it a general framework for complex data correlations.
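The hyperedge convolution described above can be sketched as follows. This is an illustrative NumPy version of the normalized propagation X' = Dv^(-1/2) H W De^(-1) H^T Dv^(-1/2) X, with the learnable weight matrix omitted for clarity; the function name is ours, not the repository's API.

```python
import numpy as np

def hyperedge_conv(X, H, W=None):
    """One hyperedge convolution step (learnable weight Theta omitted):
    X' = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X
    X: (n, c) node features; H: (n, m) incidence matrix;
    W: (m,) hyperedge weights, defaulting to all ones."""
    n, m = H.shape
    W = np.ones(m) if W is None else W
    Dv = H @ W                 # vertex degrees: sum_e w(e) h(v, e)
    De = H.sum(axis=0)         # hyperedge degrees: sum_v h(v, e)
    G = (np.diag(Dv ** -0.5) @ H @ np.diag(W)
         @ np.diag(1.0 / De) @ H.T @ np.diag(Dv ** -0.5))
    return G @ X

# 3 vertices, 2 hyperedges: e0 = {v0, v1}, e1 = {v1, v2}
H = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])
X = np.eye(3)
X_new = hyperedge_conv(X, H)   # with X = I this returns G itself
```

In the full layer, the propagated features are additionally multiplied by a learnable weight matrix and passed through a nonlinearity.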

In this repository, we release code and data for training a Hypergraph Neural Network for node classification on the ModelNet40 and NTU2012 datasets. The visual objects' features are extracted by MVCNN (Su et al.) and GVCNN (Feng et al.).

Citation

If you find our work useful in your research, please consider citing:

@article{feng2018hypergraph,
  title={Hypergraph Neural Networks},
  author={Feng, Yifan and You, Haoxuan and Zhang, Zizhao and Ji, Rongrong and Gao, Yue},
  journal={AAAI 2019},
  year={2018}
}

Installation

Install PyTorch 0.4.0. You also need to install PyYAML. The code has been tested with Python 3.6, PyTorch 0.4.0, and CUDA 9.0 on Ubuntu 16.04.

Usage

First, download the feature files of the ModelNet40 and NTU2012 datasets. Then configure the "data_root" and "result_root" paths in config/config.yaml.

Download the datasets for training/evaluation (they should be placed under "data_root").

To train and evaluate HGNN for node classification:

python train.py

You can select the features that contribute to constructing the hypergraph incidence matrix by changing the parameters "use_mvcnn_feature_for_structure" and "use_gvcnn_feature_for_structure" in the config.yaml file. Similarly, the parameters "use_mvcnn_feature" and "use_gvcnn_feature" control which features are fed to HGNN; setting both to True concatenates the MVCNN and GVCNN features as the node features in HGNN.

# config/config.yaml
use_mvcnn_feature_for_structure: True
use_gvcnn_feature_for_structure: True
use_mvcnn_feature: False
use_gvcnn_feature: True

To change the experimental dataset (ModelNet40 or NTU2012):

# config/config.yaml
#Model
on_dataset: &o_d ModelNet40
#on_dataset: &o_d NTU2012

License

Our code is released under MIT License (see LICENSE file for details).


hgnn's Issues

Question about feature file

Hi, the project performs very well. Thank you for your contribution.
I have a question about the data: how do you create the feature files?
Could you share the code?

Element-wise multiplication or matrix multiplication in calculating G

In your implementation, we can see that G is obtained by element-wise multiplication:
https://github.com/Yue-Group/HGNN/blob/2d19c82084e4694ec3b2e911d8755dbd9129fd6e/utils/hypergraph_utils.py#L124
However, in equations (10) and (11), it looks like matrix multiplication is used.
Also, in the original GCN paper's implementation, the normalization is realized by matrix multiplication:
https://github.com/tkipf/gcn/blob/master/gcn/utils.py#129
So is anything wrong here?
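For what it's worth, the two forms can coincide: multiplying by a diagonal degree matrix is the same as element-wise scaling of the rows or columns, so broadcasting can reproduce the matrix-multiplication form of equations (10)/(11) exactly. A small NumPy check (illustrative values, not the repository's code):

```python
import numpy as np

# Tiny incidence matrix: 3 vertices, 2 hyperedges.
H = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])
W = np.ones(2)                      # hyperedge weights
Dv = H @ W                          # vertex degrees
De = H.sum(axis=0)                  # hyperedge degrees

# Matrix-multiplication form of eq. (10)/(11):
G_mat = (np.diag(Dv ** -0.5) @ H @ np.diag(W)
         @ np.diag(1.0 / De) @ H.T @ np.diag(Dv ** -0.5))

# Element-wise form: diagonal matrices only rescale rows/columns,
# so broadcasting gives the same result.
HW = H * W                          # scale columns by w(e)
G_elem = (Dv ** -0.5)[:, None] * (HW / De) @ H.T * (Dv ** -0.5)[None, :]
```

If the two agree on your incidence matrix, the element-wise code is just an equivalent (and cheaper) way of applying the diagonal normalizations.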

Issue about code with citation dataset

Hi Yifan,

Could you please upload the code for the citation datasets? I was wondering how to transform a dataset like raw Cora (cora.content and cora.cites) to fit the hypergraph neural network. Thanks in advance.

As regards the regularizer on hypergraph

As written in the paper, \Omega(f) is a regularizer on the hypergraph. I reviewed your implementation code and only found the empirical loss in your objective function. Is it necessary to add this regularizer to the loss function? I'd appreciate it if you could answer ASAP, thanks!

How to compute the affinity matrix

Hi, I notice that in your paper there is a formula stated as $A_{ij}=\exp(-\frac{2D_{ij}^2}{\Delta})$, but in your repo the corresponding code is np.exp(-dis_vec[0, node_idx] ** 2 / (m_prob * avg_dis) ** 2), which seems to compute $A_{ij}=\exp(-\frac{D_{ij}^4}{\Delta})$. Could you please explain whether this is a mistake or whether I misunderstood your code?
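To make the discrepancy concrete, here is a small NumPy comparison of the paper's stated kernel exp(-2 d^2 / Delta) against the expression the repository computes, exp(-d^2 / (m_prob * avg_dis)^2); the distance values below are made up for illustration.

```python
import numpy as np

d = np.array([0.5, 1.0, 2.0])     # illustrative pairwise distances
avg_dis = d.mean()                # bandwidth built from the mean distance
m_prob = 1.0
delta = m_prob * avg_dis

paper_form = np.exp(-2 * d ** 2 / delta)   # A_ij = exp(-2 D_ij^2 / Delta)
code_form = np.exp(-d ** 2 / delta ** 2)   # what the repository computes
```

Both are Gaussian-style kernels that decay with distance; they differ only in how the bandwidth enters (Delta vs. Delta squared, and the factor of 2), so the relative ordering of affinities is preserved even though the absolute values differ.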

Code for citation networks

Hi, could you please include the code for the citation networks as well? Or at least the pseudocode for converting a graph to a hypergraph, and the hyperparameter settings.

New HyperGraph

Hi,
Thanks for your work! I am running your code with a different hypergraph. Unfortunately, G comes out as NaN. Are there certain properties the incidence matrix must satisfy apart from its n*m shape?
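One common cause of NaN in G (not necessarily the asker's, and assuming a dense NumPy incidence matrix) is a zero degree somewhere: an isolated vertex makes Dv^(-1/2) infinite, and an empty hyperedge makes De^(-1) infinite. A quick diagnostic sketch, with hypothetical names:

```python
import numpy as np

def check_incidence(H, eps=1e-12):
    """Return indices of isolated vertices (zero row sum) and empty
    hyperedges (zero column sum); either one makes the normalized
    G = Dv^-1/2 H W De^-1 H^T Dv^-1/2 produce inf/NaN entries."""
    Dv = H.sum(axis=1)
    De = H.sum(axis=0)
    return np.where(Dv < eps)[0], np.where(De < eps)[0]

H = np.array([[1., 0.],
              [0., 0.],   # vertex 1 belongs to no hyperedge -> Dv = 0
              [0., 1.]])
bad_vertices, bad_edges = check_incidence(H)
```

If either list is non-empty, dropping those rows/columns (or adding a self-loop hyperedge per isolated vertex) before normalization should remove the NaNs.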

features

Hello, your work is excellent and I am very interested in it. Do the extracted GVCNN features you used refer to the final shape descriptor after grouping and inter-group fusion? I know that GVCNN is also your work, but there is no relevant open-source code. Could you please provide some code? Thank you.

Get killed every time

Hi, great work!
But I got this every time:

/root/HGNN/config/config.py:21: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
cfg = yaml.load(f)
Constructing hypergraph incidence matrix!
(It may take several minutes! Please wait patiently!)
Killed

There is no other error output; can you help me?

Baseline enquiry

Hi there,

I would like to know how you obtained the GCN performance on ModelNet40, and whether you can make that code available. Also, how can I generate the adjacency matrix for ModelNet40 using the function load_feature_construct_H? Is it just by changing K_neigs=[10] to [1]?

Thanks!

Data problem?

It seems that the download links for NTU and ModelNet40 both refer to the same ModelNet40 data, so the NTU data appears to be missing.

Is your code trained on the CPU or the GPU?

Is your code trained on the CPU or the GPU? It takes only 18 s; why is it so quick?
Epoch 550/599
train Loss: 0.0548 Acc: 0.9879
val Loss: 0.6185 Acc: 0.9603
Best val Acc: 0.968801

Training complete in 0m 18s
Best val Acc: 0.968801

question about the data used to construct H

Hi. It seems you construct the hypergraph incidence matrix H from both the training data and the testing data.

In the function hypergraph_utils.construct_H_with_KNN(), the data pts fed in seems to contain both the training data (idx_train) and the testing data (idx_test).

If we use both training and testing data when we train the model, and then use the same testing data to evaluate it, is the result still accurate?
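For context, a KNN-based incidence construction like the one in hypergraph_utils.py can be sketched as below (a simplified sketch, not the repository's exact code). Feeding in all points at once makes the setup transductive: the test features shape H, but the test labels are never used during training, which is the usual justification for this design.

```python
import numpy as np

def construct_H_knn(X, k=2):
    """Build an (n, n) incidence matrix where hyperedge j connects
    node j and its k nearest neighbours (simplified sketch of a
    construct_H_with_KNN-style routine)."""
    n = X.shape[0]
    dist = np.linalg.norm(X[:, None] - X[None, :], axis=-1)  # pairwise distances
    H = np.zeros((n, n))
    for j in range(n):
        nearest = np.argsort(dist[j])[:k + 1]   # includes the centroid itself
        H[nearest, j] = 1.0
    return H

X = np.array([[0.0], [0.1], [5.0], [5.1]])   # all points, train and test together
H = construct_H_knn(X, k=1)
```

A strictly inductive variant would build H from the training points only and attach test points to existing hyperedges at evaluation time.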

Issue about hypergraph convolution

Hi, thank you very much for your contribution. I would like to ask you a question.
Assuming that I have built the hypergraph and have the original feature vectors for all nodes, I just want to refine the features with the convolution operation and then save the refined features for my subsequent tasks. I only need the convolution operation; I don't need to do the vertex classification task on the hypergraph structure. In other words, can the convolution operation be extracted separately from HGNN?
How can I achieve this? Do I still need the network structure? Could you please give me some suggestions?
Many thanks for your help and looking forward to your valuable advice.
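If all that is needed is the propagation step, it can in principle be applied on its own, without the trained network: apply only G X, i.e. the hyperedge convolution with its learnable weight replaced by the identity and no classifier head. A hedged NumPy sketch, with hypothetical names, not an API of this repository:

```python
import numpy as np

def smooth_features(X, H, W=None, steps=1):
    """Repeatedly apply X <- G X with
    G = Dv^-1/2 H W De^-1 H^T Dv^-1/2,
    i.e. the hyperedge convolution with the learnable weight dropped."""
    n, m = H.shape
    W = np.ones(m) if W is None else W
    Dv = H @ W
    De = H.sum(axis=0)
    G = (np.diag(Dv ** -0.5) @ H @ np.diag(W)
         @ np.diag(1.0 / De) @ H.T @ np.diag(Dv ** -0.5))
    for _ in range(steps):
        X = G @ X
    return X

H = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])
X = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
X_refined = smooth_features(X, H, steps=2)
```

Without a trained weight matrix this is pure neighborhood smoothing over the hypergraph; whether the refined features help a downstream task is an empirical question.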

How to test?


Epoch 500/599
train Loss: 0.0554 Acc: 0.9868
val Loss: 0.5913 Acc: 0.9611
Best val Acc: 0.968801


Epoch 550/599
train Loss: 0.0548 Acc: 0.9879
val Loss: 0.6185 Acc: 0.9603
Best val Acc: 0.968801

Training complete in 0m 18s
Best val Acc: 0.968801
How can I test on the ModelNet40 dataset?

About graph representation

Hi there,
May I ask why you use the incidence matrix (H) to represent the hypergraph instead of the adjacency matrix (A)? According to hypergraph learning algorithms, both matrices can represent a hypergraph.
Also, how do you define the values of the weight matrix (W)?
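One way to see the difference: H can always be collapsed into an adjacency matrix via the clique expansion A = H W H^T (diagonal removed), but that step discards which vertices were grouped jointly by the same hyperedge, whereas H preserves that high-order information. A tiny illustration (not the repository's code):

```python
import numpy as np

# A 4-vertex hypergraph with two hyperedges: {v0, v1, v2} and {v2, v3}.
H = np.array([[1., 0.],
              [1., 0.],
              [1., 1.],
              [0., 1.]])
W = np.ones(2)

# Clique-expansion adjacency: each hyperedge becomes a pairwise clique.
A = H @ np.diag(W) @ H.T
np.fill_diagonal(A, 0)
```

From A alone one can no longer tell whether v0, v1, v2 formed one 3-vertex hyperedge or three separate pairwise edges, which is exactly the information the incidence-matrix formulation retains.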

Thanks!
