Journal Article Details
NEUROCOMPUTING, Vol. 470
Pyramidal Reservoir Graph Neural Network
Article
Bianchi, F. M. [1,2]; Gallicchio, Claudio [3]; Micheli, Alessio [3]
[1] UiT Arctic Univ Norway, Dept Math & Stat, Hansine Hansens Veg 18, N-9019 Tromso, Norway
[2] NORCE Norwegian Res Ctr AS, Bergen, Norway
[3] Univ Pisa, Dept Comp Sci, Largo B Pontecorvo 3, I-57127 Pisa, Italy
Keywords: Reservoir Computing; Graph Echo State Networks; Graph Neural Networks; Graph pooling
DOI: 10.1016/j.neucom.2021.04.131
Source: Elsevier
【 Abstract 】

We propose a deep Graph Neural Network (GNN) model that alternates two types of layers. The first type is inspired by Reservoir Computing (RC) and generates new vertex features by iterating a non-linear map until it converges to a fixed point. The second type of layer implements graph pooling operations that gradually reduce the support graph and the vertex features, and further improve the computational efficiency of the RC-based GNN. The architecture is, therefore, pyramidal. In the last layer, the features of the remaining vertices are combined into a single vector, which represents the graph embedding. Through a mathematical derivation introduced in this paper, we show formally how graph pooling can reduce the computational complexity of the model and speed up the convergence of the dynamical updates of the vertex features. Our proposed approach to the design of RC-based GNNs offers an advantageous and principled trade-off between accuracy and complexity, which we extensively demonstrate in experiments on a large set of graph datasets. (c) 2021 Elsevier B.V. All rights reserved.
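The core reservoir layer described in the abstract, iterating an untrained non-linear map over the graph until the vertex states reach a fixed point, then pooling the states into a graph-level vector, can be sketched as follows. This is a minimal illustration, not the paper's implementation: all names, the mean-pooling readout, and the spectral scaling used to keep the map contractive are assumptions.

```python
import numpy as np

def reservoir_graph_embedding(A, U, hidden=16, rho=0.9,
                              tol=1e-6, max_iter=200, seed=0):
    """Iterate a fixed (untrained) reservoir map over a graph until the
    vertex states converge to a fixed point, then mean-pool the states
    into a single graph embedding.

    A: (n, n) adjacency matrix; U: (n, f) input vertex features.
    The scaling of the recurrent weights by `rho` relative to the spectral
    radii of W and A is an assumed contractivity heuristic that makes the
    iteration converge; the paper derives its own stability conditions.
    """
    rng = np.random.default_rng(seed)
    n, f = U.shape
    W_in = rng.uniform(-1.0, 1.0, (f, hidden))   # input weights (random, fixed)
    W = rng.uniform(-1.0, 1.0, (hidden, hidden)) # recurrent weights (random, fixed)

    # Rescale W so that the combined graph/recurrent map stays contractive.
    sr_W = np.max(np.abs(np.linalg.eigvals(W)))
    sr_A = np.max(np.abs(np.linalg.eigvals(A)))
    W *= rho / (sr_W * max(sr_A, 1.0))

    # Fixed-point iteration of the non-linear vertex-state update.
    X = np.zeros((n, hidden))
    for _ in range(max_iter):
        X_new = np.tanh(U @ W_in + A @ X @ W)
        if np.linalg.norm(X_new - X) < tol:
            X = X_new
            break
        X = X_new

    # Readout: combine the vertex states into one graph-level vector.
    return X.mean(axis=0)
```

In the full pyramidal architecture, a graph pooling step would shrink `A` and the vertex states between successive reservoir layers, which is what reduces the cost of each subsequent fixed-point iteration.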

【 License 】

Free   

【 Preview 】
Attachments:
File | Size | Format
10_1016_j_neucom_2021_04_131.pdf | 2179 KB | PDF
Document metrics
Downloads: 4; Views: 0