Project author: ma-compbio

Description: hypergraph representation learning, graph neural network

Language: Python

Repository: git://github.com/ma-compbio/Hyper-SAGNN.git

Created: 2020-04-16T03:01:00Z

Project page: https://github.com/ma-compbio/Hyper-SAGNN

License: MIT License

Hyper-SAGNN: a self-attention based graph neural network for hypergraphs

This is an implementation of "Hyper-SAGNN: a self-attention based graph neural network for hypergraphs" (ICLR 2020).

The datasets included in this repo originally come from DHNE (https://github.com/tadpole/DHNE).

Requirements

python >= 3.6.8

TensorFlow >= 1.0.0 (< 2.0.0)

PyTorch >= 1.0

Usage

To run the code:

  1. cd Code
  2. python main.py --data wordnet -f adj

Change the following arguments to reproduce the corresponding results from the manuscript:

The --data argument can take "GPS", "drug", "MovieLens", or "wordnet". This argument is case sensitive.

The -f, --feature argument can take "adj" or "walk", selecting the encoder-based approach or the random-walk-based approach respectively.

Other arguments are as follows:

  parser.add_argument('--dimensions', type=int, default=64,
                      help='Number of dimensions. Default is 64.')
  parser.add_argument('-l', '--walk-length', type=int, default=40,
                      help='Length of walk per source. Default is 40.')
  parser.add_argument('-r', '--num-walks', type=int, default=10,
                      help='Number of walks per source. Default is 10.')
  parser.add_argument('-k', '--window-size', type=int, default=10,
                      help='Context size for optimization. Default is 10.')
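Putting the documented flags together, a self-contained parser might look like the sketch below. This is a hypothetical standalone reconstruction for illustration, not the repo's actual main.py; the `--data` and `--feature` definitions are assumptions based on the usage notes above.

```python
# Sketch of a parser matching the arguments documented in this README.
# Hypothetical reconstruction; main.py in the repo may differ.
import argparse

def build_parser():
    parser = argparse.ArgumentParser(description="Hyper-SAGNN arguments (sketch)")
    # Dataset name; case sensitive per the README.
    parser.add_argument('--data', type=str, default='wordnet',
                        help='Dataset: GPS, drug, MovieLens, or wordnet.')
    # Feature mode: encoder-based ("adj") or random-walk-based ("walk").
    parser.add_argument('-f', '--feature', type=str, default='adj',
                        choices=['adj', 'walk'],
                        help='adj = encoder-based, walk = random-walk-based.')
    parser.add_argument('--dimensions', type=int, default=64,
                        help='Number of dimensions. Default is 64.')
    parser.add_argument('-l', '--walk-length', type=int, default=40,
                        help='Length of walk per source. Default is 40.')
    parser.add_argument('-r', '--num-walks', type=int, default=10,
                        help='Number of walks per source. Default is 10.')
    parser.add_argument('-k', '--window-size', type=int, default=10,
                        help='Context size for optimization. Default is 10.')
    return parser

if __name__ == '__main__':
    # Equivalent to: python main.py --data wordnet -f adj
    args = build_parser().parse_args(['--data', 'wordnet', '-f', 'adj'])
    print(args.data, args.feature, args.dimensions)
```

Note that argparse maps hyphenated flags like `--walk-length` to underscored attributes (`args.walk_length`).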

Cite

To cite our paper:

  @inproceedings{zhang2020hypersagnn,
    title={Hyper-{SAGNN}: a self-attention based graph neural network for hypergraphs},
    author={Zhang, Ruochi and Zou, Yuesong and Ma, Jian},
    booktitle={International Conference on Learning Representations (ICLR)},
    year={2020}
  }