At the suggestion of some friends, I have been reading about graph convolutional neural networks. This is not a recent interest of mine: I have long been interested in graph-related problems (I developed the graphflow package) and topological data analysis (TDA; I developed the mogutda package). Graph theory is not a new field either, but it is interesting to see how it is being applied in deep learning, since a lot of real-world relational data can be expressed as graphs. (Examples: word relations, organizational structures, genes, the internet, etc.) With the increasing attention paid to neo4j, graphs are clearly becoming a hot topic.

It would be helpful to review some basic graph theory:

• A graph is represented by $\mathcal{G} = (V, E)$, where $V$ and $E$ are the nodes (vertices) and edges respectively.
• A graph can be directed or undirected. In graph convolutional neural networks, the graphs are usually undirected.
• The adjacency matrix $A$ describes how nodes are connected: $A_{ij} = 1$ if there is an edge connecting from node $i$ to node $j$, and $0$ otherwise. $A$ is a symmetric matrix for an undirected graph.
• The incidence matrix $B$ is another way to describe how nodes are connected: $B_{ij} = 1$ if node $i$ is incident to edge $j$. This is useful for undirected graphs.
• The degree matrix $D$ is a diagonal matrix whose element $D_{ii}$ is the number of neighbors of node $i$ in an undirected graph.
• The function acting on the nodes is called the filter.
• The graph Laplacian, or Kirchhoff matrix, is defined by $L = D - A$, and the normalized graph Laplacian is $\tilde{L} = I - D^{-\frac{1}{2}} A D^{-\frac{1}{2}}$.
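These definitions can be made concrete in a few lines of NumPy. A minimal sketch, using an arbitrary toy graph (the edge list below is just an illustrative choice):

```python
import numpy as np

# Toy undirected graph on 4 nodes: a path 0-1-2-3 plus the chord 0-2.
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]
n = 4

A = np.zeros((n, n))                  # adjacency matrix
for i, j in edges:
    A[i, j] = A[j, i] = 1.0           # symmetric, since the graph is undirected

D = np.diag(A.sum(axis=1))            # degree matrix
L = D - A                             # graph Laplacian (Kirchhoff matrix)

# Normalized Laplacian: I - D^{-1/2} A D^{-1/2}
D_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(D)))
L_norm = np.eye(n) - D_inv_sqrt @ A @ D_inv_sqrt
```

Both Laplacians are symmetric and positive semi-definite, and every row of $L$ sums to zero.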

The graph Laplacian is the most important matrix in graph convolutional neural networks. It is analogous to the Laplacian operator $\nabla^2$ in Euclidean space. The reader can easily verify this by constructing the graph of a 2D lattice, computing its graph Laplacian matrix, and checking that it coincides with the discretized Laplacian operator (up to a sign convention, since $L = D - A$ is positive semi-definite while $\nabla^2$ is negative semi-definite).
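A quick numerical check of this analogy on a 1D lattice (a path graph; the quadratic sample function is an arbitrary choice): on interior nodes, applying $L$ reproduces the negative of the standard second-difference stencil.

```python
import numpy as np

# Path graph on 6 nodes, i.e. a 1D lattice.
n = 6
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

f = np.arange(n, dtype=float) ** 2    # sample function f(x) = x^2 on the lattice
Lf = L @ f

# On interior nodes, (L f)_i = 2 f_i - f_{i-1} - f_{i+1}, which is the
# negative of the discretized second derivative f'' ~ f_{i-1} - 2 f_i + f_{i+1}.
second_diff = f[:-2] - 2 * f[1:-1] + f[2:]
print(Lf[1:-1])        # -> [-2. -2. -2. -2.], since f'' = 2 for f(x) = x^2
```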

We can also get some insight from the Euclidean analogue. In physics, the eigenfunctions of the Laplacian operator are the harmonics: the basis of solutions can be described in the spectral/Fourier space, as

$\nabla^2 \psi = - \omega^2 \psi$,

and hence $\psi \propto e^{\pm i \omega x}$. In graph convolutional neural networks, as Bruna et al. suggested in 2013, the convolution is computed in the graph Fourier space, instead of dealing directly with the Laplacian matrix in every layer of the network.
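The Euclidean analogy can be seen directly on a ring (cycle) graph, whose Laplacian eigenvectors are exactly the discrete Fourier modes, mirroring $\psi \propto e^{\pm i \omega x}$; its eigenvalues are known in closed form as $2 - 2\cos(2\pi k/n)$. A sketch (the ring size is arbitrary):

```python
import numpy as np

# Ring (cycle) graph on n nodes.
n = 8
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

lam = np.linalg.eigvalsh(L)   # numerically computed spectrum, ascending

# Analytic eigenvalues of the cycle Laplacian: 2 - 2 cos(2 pi k / n).
expected = np.sort(2 - 2 * np.cos(2 * np.pi * np.arange(n) / n))
print(np.allclose(lam, expected))   # -> True
```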

On the other hand, we know that for the convolution

$(f \circ g) (x) = \int dy f(x-y) g(y)$,

its Fourier transform is given by

$\tilde{( f \circ g)} (\omega) = \tilde{f} (\omega) \tilde{g} (\omega)$.
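This convolution theorem is easy to verify numerically for the discrete (circular) convolution using NumPy's FFT; the signal length and the random inputs below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
f = rng.standard_normal(n)
g = rng.standard_normal(n)

# Circular convolution computed directly from the definition...
conv_direct = np.array(
    [sum(f[(x - y) % n] * g[y] for y in range(n)) for x in range(n)]
)
# ...and via the Fourier domain: transform, multiply pointwise, invert.
conv_fourier = np.fft.ifft(np.fft.fft(f) * np.fft.fft(g)).real

print(np.allclose(conv_direct, conv_fourier))   # -> True
```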

In Fourier space, the convolution of two functions is simply their product. Similarly, in graph convolutional neural networks, convolutions can be computed in the graph Fourier space as the product of the transformed signal and filter. More specifically, the convolution of a signal $f$ with a filter $g$, with $\Phi$ being the orthogonal matrix whose columns are the eigenvectors of the graph Laplacian, is

$f \circ g = \Phi \left( (\Phi^T f) \odot (\Phi^T g) \right) = \Phi \, \textbf{diag}(\hat{g}_1, \hat{g}_2, \ldots, \hat{g}_n) \, \Phi^T f$,

where $\odot$ denotes the element-wise product and $\hat{g} = \Phi^T g$.

However, such a general description is basis-dependent, and it is still computationally expensive. Further work has been proposed to smooth out this representation, which will be covered in upcoming blog posts.
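As a concrete sketch of this spectral convolution: with $\Phi$ the eigenvector matrix of the graph Laplacian, convolving a signal $f$ with a filter $g$ amounts to transforming both into the graph Fourier basis, multiplying element-wise, and transforming back, i.e. $\Phi \, \textbf{diag}(\Phi^T g) \, \Phi^T f$. The 4-cycle and the random signals below are arbitrary illustrative choices:

```python
import numpy as np

# A 4-cycle as the underlying graph.
n = 4
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

lam, Phi = np.linalg.eigh(L)     # columns of Phi: orthonormal eigenvectors of L

rng = np.random.default_rng(1)
f = rng.standard_normal(n)       # signal on the nodes
g = rng.standard_normal(n)       # filter on the nodes

g_hat = Phi.T @ g                # filter in the graph Fourier basis
conv = Phi @ np.diag(g_hat) @ (Phi.T @ f)

# Identical to transforming both, multiplying element-wise, and inverting:
conv_alt = Phi @ ((Phi.T @ g) * (Phi.T @ f))
print(np.allclose(conv, conv_alt))   # -> True
```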

As a side note, the reader can verify that, for the (unnormalized) graph Laplacian $L = D - A$,

$\frac{1}{2} \sum_{ij} A_{ij} (f_i - f_j)^2 = f^T L f$.
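This quadratic-form identity, $\frac{1}{2} \sum_{ij} A_{ij} (f_i - f_j)^2 = f^T L f$ for the unnormalized Laplacian $L = D - A$, can be checked numerically; the toy graph and random signal below are arbitrary:

```python
import numpy as np

# Toy undirected graph and a random signal on its nodes.
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

rng = np.random.default_rng(2)
f = rng.standard_normal(n)

lhs = 0.5 * sum(A[i, j] * (f[i] - f[j]) ** 2 for i in range(n) for j in range(n))
rhs = f @ L @ f
print(np.isclose(lhs, rhs))     # -> True
```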

• Python package: graphflow. [PyPI] [Github]
• Python package: mogutda. [PyPI] [Github]
• Joan Bruna, Wojciech Zaremba, Arthur Szlam, Yann LeCun, “Spectral Networks and Locally Connected Networks on Graphs,” arXiv:1312.6203 (2013). [arXiv]
• Blake Nelson, Deve Palakkattukudy, “Driving Predictive Analytics with the Power of Neo4j,” neo4j blog (2018). [neo4j]