GitHub Code | MinCut Pooling | Pre-training | Self-Supervised Learning on Graphs

Posted by DrugAI


This column collects recently updated graph-network code from GitHub; some of it is older research work that has only recently received new updates. What algorithm GitHub uses for these recommendations, we honestly don't know. If you find this useful, feel free to share it with others who might need it, and if you have related work you'd like featured, get in touch.


1. Spectral Clustering with Graph Neural Networks for Graph Pooling ICML2020



This code reproduces the experimental results obtained with the MinCutPool layer as presented in the ICML 2020 paper
Spectral Clustering with Graph Neural Networks for Graph Pooling
F. M. Bianchi*, D. Grattarola*, C. Alippi
The official implementation of the MinCutPool layer can be found in Spektral.
An implementation of MinCutPool for PyTorch is also available in PyTorch Geometric.
https://github.com/FilippoMB/Spectral-Clustering-with-Graph-Neural-Networks-for-Graph-Pooling
This work was published at ICML 2020. It performs pooling via a minimum-cut objective; the advantages are that it is differentiable and requires no spectral decomposition, and it achieves strong results on image segmentation, graph classification, and clustering.
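The core of MinCutPool, a relaxed min-cut loss plus an orthogonality regularizer over soft cluster assignments, can be sketched in a few lines of numpy. This is an illustrative simplification based on the paper's equations, not the official Spektral or PyTorch Geometric code, and the function name is ours:

```python
import numpy as np

def mincut_pool(X, A, S):
    """One MinCutPool step (simplified numpy sketch).
    X: (N, F) node features, A: (N, N) adjacency matrix,
    S: (N, K) soft cluster assignments (rows sum to 1, e.g. softmax of an MLP).
    Returns pooled features, coarsened adjacency, and the unsupervised loss."""
    D = np.diag(A.sum(axis=1))  # degree matrix
    # Cut loss: maximize within-cluster edge weight relative to cluster degree
    cut_loss = -np.trace(S.T @ A @ S) / np.trace(S.T @ D @ S)
    # Orthogonality loss: push assignments toward orthogonal, equal-sized clusters
    StS = S.T @ S
    K = S.shape[1]
    ortho_loss = np.linalg.norm(StS / np.linalg.norm(StS) - np.eye(K) / np.sqrt(K))
    # Coarsened graph: cluster-wise aggregation of features and edges
    X_pool = S.T @ X
    A_pool = S.T @ A @ S
    return X_pool, A_pool, cut_loss + ortho_loss
```

Because both loss terms are smooth functions of S, the whole layer trains end-to-end by gradient descent, which is exactly why no spectral decomposition is needed. In practice you would use the official `dense_mincut_pool` from PyTorch Geometric rather than this sketch.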

2. Strategies for Pre-training Graph Neural Networks ICLR2020


This is a PyTorch implementation of the following paper:
Weihua Hu*, Bowen Liu*, Joseph Gomes, Marinka Zitnik, Percy Liang, Vijay Pande, Jure Leskovec. Strategies for Pre-training Graph Neural Networks. ICLR 2020. arXiv OpenReview
If you make use of the code/experiment in your work, please cite our paper (Bibtex below).
@inproceedings{
hu2020pretraining,
title={Strategies for Pre-training Graph Neural Networks},
author={Weihua Hu and Bowen Liu and Joseph Gomes and Marinka Zitnik and Percy Liang and Vijay Pande and Jure Leskovec},
booktitle={International Conference on Learning Representations},
year={2020},
url={https://openreview.net/forum?id=HJlWWJSFDH},
}
The paper was published at ICLR 2020. It proposes a family of pre-training strategies for graph neural networks, and the code is open source.
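One of the node-level pre-training strategies in the paper is Attribute Masking: hide the attributes of some nodes and train the GNN to predict them from context. The corruption step can be sketched in numpy. This is an illustrative version, not the authors' code (which operates on molecular graphs with a learned mask token), and the function name is ours:

```python
import numpy as np

def mask_node_attributes(X, mask_rate=0.15, seed=None):
    """Attribute Masking pretext task, corruption step (numpy sketch).
    Zeroes out a random subset of node feature rows (a stand-in for a mask
    token) and returns the corrupted features, the masked indices, and the
    original rows that the GNN must learn to reconstruct."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    n_mask = max(1, int(mask_rate * n))
    idx = rng.choice(n, size=n_mask, replace=False)
    targets = X[idx].copy()      # reconstruction targets for the pretext loss
    X_corrupt = X.copy()
    X_corrupt[idx] = 0.0         # mask token
    return X_corrupt, idx, targets
```

During pre-training, the GNN embeds the corrupted graph and a small head predicts `targets` from the embeddings of the masked nodes; the pre-trained weights are then fine-tuned on the downstream task.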


3. When Does Self-Supervision Help Graph Convolutional Networks? ICML2020


PyTorch code for When Does Self-Supervision Help Graph Convolutional Networks? [supplement]
Yuning You*, Tianlong Chen*, Zhangyang Wang, Yang Shen
In ICML 2020.

Overview

Properly designed multi-task self-supervision helps GCNs gain greater generalizability and robustness. This repository verifies that claim through experiments on several GCN architectures with three designed self-supervised tasks: node clustering, graph partitioning, and graph completion.
This paper is quite interesting and we will keep tracking it; marking it here for now. The work designs three self-supervised tasks (clustering, partitioning, and completion) to learn graph representations with better generalization and robustness.
