Syntax and Semantics of μG
The syntax and denotational semantics of μG were published in our FORTE 2023 article.
The operational semantics will be added here once it has been published in a future article.
Syntax of μG
We start by recalling the abstract syntax of μG.
Syntax of μG
Given a set $\mathcal{X}$ of variable symbols and a set $\mathcal{F}$ of function symbols, μG expressions can be composed with the following abstract syntax:

$$N ::= \psi \;\mid\; \lhd_{\varphi}^{\sigma} \;\mid\; \rhd_{\varphi}^{\sigma} \;\mid\; N ; N \;\mid\; N \parallel N \;\mid\; \mathsf{let}\ X = N\ \mathsf{in}\ N \;\mid\; \mathsf{def}\ f(\tilde{X})\{N\}\ \mathsf{in}\ N \;\mid\; \mathsf{if}\ N\ \mathsf{then}\ N\ \mathsf{else}\ N \;\mid\; \mathsf{fix}\ X = N\ \mathsf{in}\ N \;\mid\; X \;\mid\; f(\tilde{N})$$

with $\psi, \varphi, \sigma \in \mathcal{F}$ and $X, f \in \mathcal{X}$.
Given a graph $G = (V, E)$ and an (optional) edge-labeling function $\xi$, the meaning of a μG expression is a graph neural network, i.e. a function between node-labeling functions.
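To make this concrete, the following minimal Python sketch spells out the types involved; the names (`Node`, `NodeLabeling`, `GNN`) are ours, chosen for illustration, and are not taken from any library.

```python
from typing import Any, Callable, Dict, Hashable

Node = Hashable                    # a graph node
Label = Any                        # a node label
NodeLabeling = Dict[Node, Label]   # eta : V -> labels
# A muG expression denotes a GNN: a map between node-labeling functions.
GNN = Callable[[NodeLabeling], NodeLabeling]
```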
One of the basic terms is the function application $\psi$. This represents the application of a given function $f_\psi$ (referenced by the symbol $\psi$) to the input node-labeling function.
Moreover, the pre-image operator $\lhd_{\varphi}^{\sigma}$ and the post-image operator $\rhd_{\varphi}^{\sigma}$ are used to compute the labeling of a node in terms of the labels of its predecessors and successors, respectively.
Basic terms are composed by sequential composition $N_1 ; N_2$ and parallel composition $N_1 \parallel N_2$.
Local variables and functions can be defined using let expressions $\mathsf{let}\ X = N_1\ \mathsf{in}\ N_2$
and def expressions $\mathsf{def}\ f(\tilde{X})\{N_1\}\ \mathsf{in}\ N_2$.
The choice operator $\mathsf{if}\ N_1\ \mathsf{then}\ N_2\ \mathsf{else}\ N_3$ makes it possible to run different GNNs according to the result of a guard GNN.
Finally, the fixed point operator $\mathsf{fix}\ X = N_1\ \mathsf{in}\ N_2$ is used to program recursive behavior.
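As a sketch of how this abstract syntax could be represented in code, each production becomes a constructor; this encoding is ours for illustration (it is not a published implementation), and the let/def forms are omitted for brevity.

```python
from dataclasses import dataclass

# An illustrative AST for the abstract syntax above (let/def omitted).
@dataclass
class Apply:        # function application psi
    psi: str

@dataclass
class PreImage:     # pre-image with message function phi and aggregation sigma
    phi: str
    sigma: str

@dataclass
class PostImage:    # post-image with message function phi and aggregation sigma
    phi: str
    sigma: str

@dataclass
class Seq:          # N1 ; N2
    left: "Expr"
    right: "Expr"

@dataclass
class Par:          # N1 || N2
    left: "Expr"
    right: "Expr"

@dataclass
class Ite:          # if N1 then N2 else N3
    guard: "Expr"
    then: "Expr"
    orelse: "Expr"

@dataclass
class Fix:          # fix X = N1 in N2
    var: str
    body: "Expr"
    scope: "Expr"

@dataclass
class Var:          # variable X
    name: str

Expr = Apply | PreImage | PostImage | Seq | Par | Ite | Fix | Var  # Python 3.10+
```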
Typing of μG
We say that a μG expression is well-formed whenever it can be typed with the rules of the following table.
These rules guarantee that any well-formed expression can be interpreted as a GNN.

Denotational semantics of μG
We are now ready to discuss the denotational semantics of μG. In the following definition, $\eta$ denotes a node-labeling function,
$\mathit{out}(v)$ and $\mathit{in}(v)$ are the outgoing and incoming edges of node $v$, and by $\langle \eta, \xi \rangle(S)$ we denote the multiset of tuples $(\eta(u), \xi((u,w)), \eta(w))$
for each edge $(u,w) \in S$.
Denotational semantics of μG
Given a graph $G = (V, E)$, an edge-labeling function $\xi$, and an environment $\rho$ that comprises a variable evaluation function $\rho_{\mathcal{X}}$ (mapping variable symbols to GNNs)
and a function evaluation function $\rho_{\mathcal{F}}$ (mapping the function symbols $\psi, \varphi, \sigma$ to the functions $f_\psi, f_\varphi, f_\sigma$ they denote), we define the
semantic interpretation function on formulas, written $\llbracket N \rrbracket_{\rho}$ (keeping $G$ and $\xi$ implicit),
by induction, for any node-labeling function $\eta$ and any node $v \in V$. The interpretation of each operator is defined as follows:
Function application
The function symbols $\psi$ are evaluated as the graph neural networks

$$\llbracket \psi \rrbracket_{\rho}(\eta) = \lambda v.\, f_\psi(\eta(v), \{\!\{\eta(u) \mid u \in V\}\!\})$$

that map a node-labeling function $\eta$ to a new node-labeling function by applying a function $f_\psi$ to both local and
global node information. The local information consists of applying $\eta$ to the input node, while the global information is the multiset $\{\!\{\eta(u) \mid u \in V\}\!\}$ of the labels of
all the nodes in the graph. The graph neural network we obtain applies a possibly trainable function $f_\psi$ to these two pieces of information.
Two particular cases arise if $f_\psi$ decides to ignore either of the two inputs. If $f_\psi$ ignores the global information, the GNN returns a node-labeling
function $\lambda v.\, f_\psi(\eta(v))$, a purely local transformation of the node labels. On the other hand, if $f_\psi$ ignores the local information, the GNN returns a
node-labeling function that assigns to each node a label that summarizes the entire graph, emulating what in the GNN literature is known as a
global pooling operator.
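A minimal executable sketch of this case, assuming the dict-based labelings from earlier (the helper name `apply_semantics` is ours); a `Counter` stands in for the multiset:

```python
from collections import Counter

def apply_semantics(f_psi, eta):
    """[[psi]](eta) = lambda v. f_psi(eta(v), {{eta(u) | u in V}})."""
    bag = Counter(eta.values())  # multiset of all node labels
    return {v: f_psi(x, bag) for v, x in eta.items()}

# Ignoring the local input yields a global pooling operator:
eta = {"a": 1, "b": 2, "c": 3}
pooled = apply_semantics(lambda local, bag: sum(bag.elements()), eta)
assert pooled == {"a": 6, "b": 6, "c": 6}
```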
Pre-image and Post-image
The pre-image symbol $\lhd$ and the post-image symbol $\rhd$, together with function symbols $\varphi$ and $\sigma$, are
evaluated as the graph neural networks

$$\llbracket \lhd_{\varphi}^{\sigma} \rrbracket_{\rho}(\eta) = \lambda v.\, f_\sigma(\{\!\{ f_\varphi(x) \mid x \in \langle \eta, \xi \rangle(\mathit{in}(v)) \}\!\}, \eta(v)) \qquad \llbracket \rhd_{\varphi}^{\sigma} \rrbracket_{\rho}(\eta) = \lambda v.\, f_\sigma(\{\!\{ f_\varphi(x) \mid x \in \langle \eta, \xi \rangle(\mathit{out}(v)) \}\!\}, \eta(v))$$

for any node-labeling function $\eta$. In the case of the pre-image, for any symbol $\varphi$ the corresponding function $f_\varphi$ generates
a message from a tuple $(\eta(u), \xi((u,v)), \eta(v))$, and this is repeated for each $(u,v) \in \mathit{in}(v)$. Then for any symbol $\sigma$ the corresponding
function $f_\sigma$ generates a new label for a node $v$ from the multiset of messages obtained from $f_\varphi$ and the current label $\eta(v)$. The functions $f_\varphi$
and $f_\sigma$ may be trainable. The case of the post-image is analogous, with the difference that $f_\varphi$ is applied to tuples $(\eta(v), \xi((v,u)), \eta(u))$ for each $(v,u) \in \mathit{out}(v)$ instead.
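The pre-image case, sketched in the same illustrative style (a plain list stands in for the multiset of messages; the name `pre_image` is ours):

```python
def pre_image(f_phi, f_sigma, edges, xi, eta):
    """[[<|_phi^sigma]](eta): aggregate messages over each node's incoming edges."""
    new_eta = {}
    for v in eta:
        messages = [f_phi(eta[u], xi[(u, w)], eta[w])
                    for (u, w) in edges if w == v]   # edges (u, v) in in(v)
        new_eta[v] = f_sigma(messages, eta[v])
    return new_eta

# Example: each node takes the maximum label among its predecessors.
eta = {"a": 1, "b": 2, "c": 3}
edges = [("a", "c"), ("b", "c")]
xi = {e: None for e in edges}                          # unlabeled edges
out = pre_image(lambda src, e, dst: src,               # message = source label
                lambda msgs, x: max(msgs, default=x),  # keep x if no predecessors
                edges, xi, eta)
assert out == {"a": 1, "b": 2, "c": 2}
```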
Sequential composition
An expression of the form $N_1 ; N_2$ for formulas $N_1, N_2$ is evaluated as the graph neural network

$$\llbracket N_1 ; N_2 \rrbracket_{\rho}(\eta) = \llbracket N_2 \rrbracket_{\rho}(\llbracket N_1 \rrbracket_{\rho}(\eta))$$

that maps a node-labeling function $\eta$ to a new node-labeling function obtained from the function composition of
$\llbracket N_1 \rrbracket_{\rho}$ and $\llbracket N_2 \rrbracket_{\rho}$.
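In code, sequential composition is just function composition of the two GNNs; a small sketch in the same style:

```python
def seq(gnn1, gnn2):
    """[[N1 ; N2]](eta) = [[N2]]([[N1]](eta))."""
    return lambda eta: gnn2(gnn1(eta))

inc = lambda eta: {v: x + 1 for v, x in eta.items()}
dbl = lambda eta: {v: 2 * x for v, x in eta.items()}
assert seq(inc, dbl)({"a": 1}) == {"a": 4}  # (1 + 1) * 2
```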
Parallel composition
An expression of the form $N_1 \parallel N_2$ for formulas $N_1, N_2$ is evaluated as the graph neural network
that maps a node-labeling function $\eta$ to a new node-labeling function that maps a node $v$ to the tuple $(\llbracket N_1 \rrbracket_{\rho}(\eta)(v), \llbracket N_2 \rrbracket_{\rho}(\eta)(v))$. Moreover, combining this operator with the basic function symbols allows the execution of $k$-ary operations on node-labeling functions: a $k$-ary function application $f(N_1, \ldots, N_k)$ is expressible as the expression $(N_1 \parallel \cdots \parallel N_k) ; f$.
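A sketch of parallel composition together with the $k$-ary idea just described, for $k = 2$ (here the hypothetical `add` plays the role of the function applied to the resulting pairs):

```python
def par(gnn1, gnn2):
    """[[N1 || N2]](eta) maps each node v to the pair of both results at v."""
    def run(eta):
        out1, out2 = gnn1(eta), gnn2(eta)
        return {v: (out1[v], out2[v]) for v in eta}
    return run

# A binary operation as (N1 || N2) ; psi: pointwise addition of the pair.
inc = lambda eta: {v: x + 1 for v, x in eta.items()}
dbl = lambda eta: {v: 2 * x for v, x in eta.items()}
add = lambda eta: {v: pair[0] + pair[1] for v, pair in eta.items()}
assert add(par(inc, dbl)({"a": 3})) == {"a": 10}  # (3 + 1) + (3 * 2)
```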
Choice
The choice operator $\mathsf{if}\ N_1\ \mathsf{then}\ N_2\ \mathsf{else}\ N_3$ for formulas $N_1, N_2, N_3$ is evaluated as the graph neural network $\llbracket N_2 \rrbracket_{\rho}$ if $\llbracket N_1 \rrbracket_{\rho}(\eta)$ is a node-labeling function such
that $\llbracket N_1 \rrbracket_{\rho}(\eta)(v) = \mathsf{true}$ for every node $v$. Otherwise it is evaluated as the graph neural network $\llbracket N_3 \rrbracket_{\rho}$.
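The choice case in the same illustrative style (the helper name `ite` is ours); the guard's output must be true at every node for the then-branch to run:

```python
def ite(guard, then, orelse):
    """[[if N1 then N2 else N3]]: run N2 iff the guard holds at every node."""
    def run(eta):
        g = guard(eta)
        return then(eta) if all(g.values()) else orelse(eta)
    return run

positive = lambda eta: {v: x > 0 for v, x in eta.items()}
negate = lambda eta: {v: -x for v, x in eta.items()}
ident = lambda eta: eta
assert ite(positive, negate, ident)({"a": 1, "b": 2}) == {"a": -1, "b": -2}
```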
Fixed points
The fixed point operator $\mathsf{fix}\ X = N_1\ \mathsf{in}\ N_2$ is evaluated as the graph neural network that is obtained by applying $\llbracket N_2 \rrbracket$ to the least fixed point of the functional
associated to $N_1$. Formally, it is $\llbracket N_2 \rrbracket_{\rho[X \mapsto \mathit{lfp}(F)]}$ where $F = \lambda g.\, \llbracket N_1 \rrbracket_{\rho[X \mapsto g]}$ and $\mathit{lfp}(F)$ denotes the least fixed point of $F$.
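Operationally, on a finite graph the least fixed point can be reached by iterating the body from an initial labeling until it stabilizes (Kleene iteration, assuming the step function is monotone and convergence is reached). The following is a sketch of ours, not the published semantics verbatim:

```python
def fix(step, eta0):
    """Iterate a GNN from eta0 until a fixed point is reached."""
    eta = eta0
    while True:
        nxt = step(eta)
        if nxt == eta:
            return eta
        eta = nxt

# Example: propagate reachability from node "a" along the edges.
edges = [("a", "b"), ("b", "c")]
step = lambda eta: {v: eta[v] or any(eta[u] for (u, w) in edges if w == v)
                    for v in eta}
assert fix(step, {"a": True, "b": False, "c": False}) == \
       {"a": True, "b": True, "c": True}
```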