MLTea: Limits, approximation and transferability for GNNs on sparse graphs via graphops
Speaker
Thien Le
CSAIL
Host
Behrooz Tahmasebi
Abstract: We study the approximation power and transferability of graph neural networks on sparse graphs via graph limit theory. While most recent advances in applying graph limit theory to the theoretical study of graph neural networks concern the dense case via graphons, sparser graphs such as bounded-degree graphs, planar graphs, or graphs sampled from power-law degree distributions are ubiquitous in applications. In this paper, we introduce graphop neural networks: a class of graph neural networks that use graphops in place of the usual graph shift operators such as the adjacency matrix or the Laplacian. Graphops are bounded P-operators, introduced in Backhausz and Szegedy (2022) as limiting objects for action convergence of finite graphs. Action convergence generalizes both the dense convergence of graphons and the local-global convergence of graphings, so graphops cover dense graphs and bounded-degree graphs, as well as graphs with intermediate degree growth (such as hypercubes). Under the metric that metrizes action convergence, we give quantitative bounds on the distance between neural networks built from operators acting on the infinite-dimensional space L^2([0,1]) and those built from their discretizations, which are graph shift operators arising from finite graphs. As a corollary, we obtain a transferability bound between two graphop neural networks built from discretizations at different resolutions.
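The abstract describes GNN layers whose propagation operator is pluggable: an adjacency matrix, a Laplacian, or a finite discretization of an operator on L^2([0,1]). Below is a minimal sketch of that idea in Python/NumPy. The layer form sigma(S X W) is the standard shift-operator convolution; the function names (gnn_layer, discretize_kernel), the midpoint-grid discretization, and the min(x, y) kernel are illustrative assumptions, not the speaker's construction.

    # Sketch: a GNN layer with a pluggable shift operator S, and a toy
    # discretization of a kernel operator on L^2([0,1]) into a finite S.
    # Names and the specific kernel are illustrative, not from the talk.
    import numpy as np

    def gnn_layer(S, X, W, activation=np.tanh):
        """One layer X -> sigma(S X W), where S is any shift operator."""
        return activation(S @ X @ W)

    def discretize_kernel(kernel, n):
        """Discretize a kernel K: [0,1]^2 -> R on an n-point midpoint grid,
        approximating the operator f -> integral of K(x, y) f(y) dy."""
        grid = (np.arange(n) + 0.5) / n           # midpoints of n equal cells
        K = kernel(grid[:, None], grid[None, :])  # kernel values on the grid
        return K / n                              # 1/n = measure of each cell

    rng = np.random.default_rng(0)
    n, d_in, d_out = 64, 8, 4
    X = rng.standard_normal((n, d_in))
    W = rng.standard_normal((d_in, d_out))

    # A graphon-style kernel discretized at resolution n; a coarser
    # discretization of the same kernel gives a second, comparable network.
    S = discretize_kernel(lambda x, y: np.minimum(x, y), n)
    H = gnn_layer(S, X, W)
    print(H.shape)  # (64, 4)

Comparing the outputs of two such networks built from discretizations at different resolutions is the finite analogue of the transferability bound the abstract states.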