Graph neural networks (GNNs) are artificial neural networks specialized for tasks whose inputs are graphs.[1][2][3][4][5]
One prominent example is molecular drug design.[6][7][8] Each input sample is a graph representation of a molecule, where atoms form the nodes and chemical bonds between atoms form the edges. In addition to the graph structure, the input also includes known chemical properties for each of the atoms. Dataset samples may thus differ in size, reflecting the varying numbers of atoms in molecules and the varying numbers of bonds between them. The task is to predict the efficacy of a given molecule for a specific medical application, such as eliminating E. coli bacteria. A minimal encoding of such an input is sketched below.
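As an illustration, a molecule can be encoded as a node-feature matrix together with an edge list. The following sketch uses the PyTorch Geometric library mentioned later in this article; the two-dimensional one-hot element encoding and the scalar efficacy label are simplifying assumptions for illustration, not a standard molecular featurization.

```python
import torch
from torch_geometric.data import Data

# Toy molecule: water (H2O), 3 atoms (nodes) and 2 bonds (edges).
# Node features: a one-hot element encoding [is_O, is_H]; in practice,
# further known chemical properties of each atom are appended.
x = torch.tensor([[1.0, 0.0],   # O
                  [0.0, 1.0],   # H
                  [0.0, 1.0]])  # H

# Edges as a 2 x num_edges index tensor; an undirected bond is stored
# as a pair of directed edges.
edge_index = torch.tensor([[0, 1, 0, 2],
                           [1, 0, 2, 0]])

# y: a per-molecule target, e.g. an (assumed) binary efficacy label.
molecule = Data(x=x, edge_index=edge_index, y=torch.tensor([1.0]))
```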
The key design element of GNNs is the use of pairwise message passing, in which graph nodes iteratively update their representations by exchanging information with their neighbors. Several GNN architectures have been proposed,[2][3][9][10][11] implementing different flavors of message passing,[12][13] beginning with recursive[2] and convolutional constructive[3] approaches. As of 2022[update], it is an open question whether it is possible to define GNN architectures that "go beyond" message passing, or whether every GNN can be built on message passing over suitably defined graphs.[14]
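The sketch below illustrates one generic message-passing round in PyTorch, assuming mean aggregation over neighbors and two hypothetical weight matrices W_self and W_neigh; concrete GNN architectures differ mainly in how messages are aggregated and how the node update is parameterized.

```python
import torch

def message_passing_step(h, edge_index, W_self, W_neigh):
    """One message-passing update: each node aggregates (here: averages)
    the representations of its neighbors and combines the result with
    its own representation through a learned transformation."""
    src, dst = edge_index                       # directed edges src -> dst
    num_nodes = h.size(0)

    # Sum the messages (neighbor representations) arriving at each node.
    agg = torch.zeros_like(h)
    agg.index_add_(0, dst, h[src])

    # Divide by the in-degree to obtain mean aggregation.
    deg = torch.zeros(num_nodes).index_add_(0, dst, torch.ones(dst.size(0)))
    agg = agg / deg.clamp(min=1).unsqueeze(-1)

    # Combine a node's own state with the aggregated neighborhood.
    return torch.relu(h @ W_self + agg @ W_neigh)

# Example usage on the 3-node molecule graph above (random weights):
# h = torch.randn(3, 2); W = torch.randn(2, 4)
# h_new = message_passing_step(h, edge_index, W, W)
```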
In the more general subject of "geometric deep learning", certain existing neural network architectures can be interpreted as GNNs operating on suitably defined graphs.[12] A convolutional neural network layer, in the context of computer vision, can be considered a GNN applied to graphs whose nodes are pixels and only adjacent pixels are connected by edges in the graph. A transformer layer, in natural language processing, can be considered a GNN applied to complete graphs whose nodes are words or tokens in a passage of natural language text.
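To make the pixel-grid interpretation concrete, the sketch below builds the edge list of a graph in which each pixel is connected to its four axis-aligned neighbors, so that a convolution over the image can be read as message passing on this graph; the 4-neighborhood and the function name are illustrative assumptions (a larger convolution kernel corresponds to a larger neighborhood).

```python
import torch

def grid_edges(height, width):
    """Edge list of a pixel-grid graph: node r*width + c is the pixel at
    row r, column c, connected to its horizontal and vertical neighbors."""
    edges = []
    for r in range(height):
        for c in range(width):
            node = r * width + c
            if c + 1 < width:   # right neighbor, both directions
                edges += [(node, node + 1), (node + 1, node)]
            if r + 1 < height:  # bottom neighbor, both directions
                edges += [(node, node + width), (node + width, node)]
    return torch.tensor(edges).t()  # shape [2, num_edges]
```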
Relevant application domains for GNNs include natural language processing,[15] social networks,[16] citation networks,[17] molecular biology,[18] chemistry,[19][20] physics[21] and NP-hard combinatorial optimization problems.[22]
Open source libraries implementing GNNs include PyTorch Geometric[23] (PyTorch), TensorFlow GNN[24] (TensorFlow), Deep Graph Library[25] (framework agnostic), jraph[26] (Google JAX), and GraphNeuralNetworks.jl[27]/GeometricFlux.jl[28] (Julia, Flux).
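As a usage sketch, the snippet below defines a small two-layer graph convolutional network with PyTorch Geometric's GCNConv layer; the class name TinyGNN and the layer sizes are arbitrary choices for illustration, not a prescribed architecture.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class TinyGNN(torch.nn.Module):
    """Two graph-convolution layers producing a per-node output."""
    def __init__(self, in_channels, hidden_channels, out_channels):
        super().__init__()
        self.conv1 = GCNConv(in_channels, hidden_channels)
        self.conv2 = GCNConv(hidden_channels, out_channels)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

# Example usage on the toy molecule defined earlier:
# model = TinyGNN(in_channels=2, hidden_channels=16, out_channels=1)
# out = model(molecule.x, molecule.edge_index)
```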