Fixed-Point Centrality for Networks
Abstract
This paper proposes a family of network centralities called fixed-point centralities, defined via the fixed points of permutation equivariant mappings associated with the underlying network. The notion extends immediately to fixed-point centralities for infinite graphs characterized by graphons. Under mild assumptions, bounds on the variation of such centralities with respect to variations of the underlying graphs and graphons are established. Fixed-point centralities connect with a variety of models on networks, including graph neural networks, static and dynamic games on networks, and Markov decision processes.
I Introduction
Centrality, which quantifies the “importance” or “influence” of nodes on networks, is a useful concept in social network analysis [1, 2, 3] and also finds applications in biological, technological and economic networks (see e.g. [4, 5, 6, 3]). A wide range of centralities with different properties has been defined for different problems (see e.g. [7, 8]), such as degree centrality, eigencentrality [9], Katz-Bonacich centrality [10, 11, 12], PageRank centrality [13], Shapley value [14], closeness centrality [15], betweenness centrality [16], and diffusion centrality [17], among others. These centralities provide a collection of quantitative measures of the “importance” or “influence” of nodes on networks in various application contexts. For instance, the quality of a website may be modelled by PageRank centrality [13], the importance of individuals on social influence networks may be reflected by eigencentrality [9], equilibrium actions of certain static network games correspond to Katz-Bonacich centrality [6], and contribution values in a coalition game may be represented by Shapley values [14]. Many social, technological and biological networks grow and vary in terms of nodes and (or) connections, and centrality values may vary accordingly. Properties of such variations of centrality values with respect to variations of the underlying graphs (see [18]) motivate the current work. A second motivation is to identify a suitable centrality notion for dynamic game problems on networks and graphons ([19, 20, 21, 22]). A third motivation is the search for a class of new centralities for the centrality-weighted opinion dynamics models proposed in [23].
I-A Related work
The formulation of fixed-point centrality in this paper follows the idea of the seminal work on graph neural network models ([24, 25]) in using fixed points of underlying mappings associated with networks. Fixed-point characterizations find applications in many problems in data science, including graph neural networks ([24, 25]), implicit neural networks ([26, 27, 28, 29]), and deep equilibrium models [30], among others [31]. A first salient feature distinguishing fixed-point centralities from the models above is that permutation equivariance properties must be satisfied. Another salient feature is that the values of fixed-point centralities are restricted to non-negative real numbers, which allows natural rankings of the nodes as well as interpretations (after normalization) as probability distributions. Furthermore, the current paper focuses on variations of the fixed point (i.e. the centrality) with respect to variations of graph structures and weights, which differs from [24, 25, 26, 27, 28, 29, 30].
Centralities and graph neural networks are generalized in [18] and [32], respectively, to infinite graphs characterized by graphons (developed in [33, 34, 35] to characterize dense graph sequences and their limits). The work [18] studies the eigencentrality, PageRank centrality and Katz-Bonacich centrality of symmetric graphs generated from graphons and establishes the rate of convergence of these centralities to the associated graphon centralities. The fixed-point centrality for graphons in the current paper provides a unified view of these centralities. Graphon versions of graph neural networks, as approximation or generalization models of graph neural networks, are proposed and analyzed in [32]. One modelling difference is that the graphon neural networks in [32] are characterized by layered structures, whereas the current paper employs fixed-point equilibrium structures.
I-B Contribution
We propose the “fixed-point centrality”, a class of centralities constructed via a (permutation equivariant) fixed-point mapping associated with the underlying graph. This class unifies many different centralities (including PageRank centrality, eigencentrality, and Katz-Bonacich centrality) and connects to a variety of problems including graph neural networks [24] and LQG mean field games on networks [20]. In addition, fixed-point centralities apply to a broad class of graphs: undirected or directed, unweighted or weighted (with possibly negative weights), finite or infinite. Moreover, bounds on the variation of fixed-point centralities with respect to variations of the underlying graphs are established under mild assumptions, following a rather simple idea based on fixed-point analysis.
Notation: $\mathbb{R}$ and $\mathbb{R}_+$ denote respectively the reals and the non-negative reals. For $A \in \mathbb{R}^{n \times n}$, $G(A)$ denotes the graph with adjacency matrix $A$ and node set $V = \{1, \dots, n\}$. $G = (V, E)$ denotes the graph with vertex set $V$ and edge set $E$. $\mathbf{1}_n$ denotes the $n$-dimensional column vector of $1$s, and $\mathbf{1}$ denotes the function defined over $[0,1]$ with $\mathbf{1}(x) = 1$ for all $x \in [0,1]$. We use the word “network” to refer to an interconnected group or system whose connection structure along with weights can be characterized by some graph $G$. For a vector $v$, $\mathrm{diag}(v)$ denotes the diagonal matrix with the elements of $v$ on the main diagonal, and $v_i$ (or $[v]_i$) denotes the $i$th element of $v$. For a matrix $M$, $m_{ij}$ (or $[M]_{ij}$) denotes the $(i,j)$th element of $M$.
II Preliminaries on Centralities
A centrality for a network characterized by a graph $G$ is a mapping from the node set to $\mathbb{R}_+$ that quantifies the “importance” or “influence” of nodes on the network. It is worth emphasizing that the “importance” or “influence” of nodes on networks is defined differently in different application contexts. Hence, for the same graph structure and graph weights, various centralities can be defined and may differ considerably from one another. The choice of $\mathbb{R}_+$ as the range of centralities allows a natural ranking of nodes. The fundamental idea of centralities is to summarize the information in a two-variable function characterized by a graph (or a matrix) into a one-variable function characterized by a centrality (or a vector).
We review several centralities related to the current paper.
II-A Centralities for Finite Networks
Consider a graph $G$ with non-negative adjacency matrix $A \in \mathbb{R}_+^{n \times n}$. Depending on $A$, the graph may be directed or undirected, and weighted or unweighted.
-
(E1)
Eigencentrality (proposed in [9]): Assume the largest eigenvalue $\lambda_{\max}(A)$ of $A$ is simple (i.e. has multiplicity $1$). Then the eigencentrality $c \in \mathbb{R}_+^n$ of $G(A)$ is given by
$$A c = \lambda_{\max}(A)\, c.$$
An equivalent form in terms of local connections is
$$c_i = \frac{1}{\lambda_{\max}(A)} \sum_{j=1}^n a_{ij}\, c_j, \quad i \in \{1, \dots, n\}.$$
-
(E2)
Katz-Bonacich centrality with $\alpha \in (0, 1/\lambda_{\max}(A))$ (proposed in [10] and generalized in [11, 12]): Let $\lambda_{\max}(A)$ denote the largest eigenvalue of $A$. One (simplest) Katz-Bonacich centrality is given by
$$c = \sum_{k=0}^{\infty} \alpha^k A^k \mathbf{1}_n,$$
where the upper bound on $\alpha$ ensures the boundedness of the infinite series. An equivalent form in terms of local connections is given by
$$c_i = \alpha \sum_{j=1}^n a_{ij}\, c_j + 1, \quad i \in \{1, \dots, n\},$$
and the equivalent explicit form is
$$c = (I - \alpha A)^{-1} \mathbf{1}_n.$$
-
(E3)
PageRank centrality (proposed in [13]): Consider a network of webpages, where each node represents a webpage, and $a_{ij} = 1$ if there is a hyperlink from webpage $i$ to $j$ and $a_{ij} = 0$ otherwise [13]. PageRank centrality with $\beta \in (0,1)$ is given by
$$c = \beta P^{\mathsf T} c + \frac{1-\beta}{n} \mathbf{1}_n,$$
where $p_{ij} = a_{ij} / \sum_{k=1}^n a_{ik}$ is the probability of jumping from node $i$ to node $j$ in the associated random walk. In equivalent form, PageRank centrality satisfies
$$c_j = \beta \sum_{i=1}^n p_{ij}\, c_i + \frac{1-\beta}{n}, \quad j \in \{1, \dots, n\},$$
and the explicit computation form is given by
$$c = \frac{1-\beta}{n} \left(I - \beta P^{\mathsf T}\right)^{-1} \mathbf{1}_n.$$
PageRank centrality can be interpreted as the steady-state distribution of a random walk on the network.
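The three centralities above can each be computed by iterating their defining fixed-point equations, which anticipates the fixed-point view taken in this paper. The following is a minimal illustrative sketch (the graph, parameter values, and function names are our own, not from the paper):

```python
import numpy as np

def katz_bonacich(A, alpha, tol=1e-12, max_iter=100_000):
    """Iterate x <- alpha*A x + 1; converges when alpha < 1/lambda_max(A)."""
    x = np.ones(A.shape[0])
    for _ in range(max_iter):
        x_new = alpha * A @ x + 1.0
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x_new

def pagerank(A, beta=0.85, tol=1e-12, max_iter=100_000):
    """Iterate x <- beta*P^T x + (1-beta)/n, with P the row-normalized
    adjacency matrix (assumes every node has at least one out-link)."""
    n = A.shape[0]
    P = A / A.sum(axis=1, keepdims=True)
    x = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        x_new = beta * P.T @ x + (1.0 - beta) / n
        if np.abs(x_new - x).sum() < tol:
            break
        x = x_new
    return x_new

def eigencentrality(A, tol=1e-12, max_iter=100_000):
    """Power iteration; requires the largest eigenvalue of A to be simple
    and dominant (e.g. A the adjacency of a connected non-bipartite graph)."""
    x = np.ones(A.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(max_iter):
        x_new = A @ x
        x_new /= np.linalg.norm(x_new)
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x_new
```

On a triangle with one pendant node, all three iterations rank the degree-3 node highest, and the Katz-Bonacich iterate agrees with the explicit form $(I - \alpha A)^{-1}\mathbf{1}_n$.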
II-B Centralities for Graphons
Graphons are defined as bounded symmetric measurable functions $W: [0,1]^2 \to [0,1]$, which, roughly speaking, can be viewed as the “adjacency matrices” of graphs with the vertex set $[0,1]$ (see [36]). Let $\mathcal{W}$ denote the set of graphons with range $[0,1]$. A graphon $W$ can be interpreted as an integral operator $T_W$ (for instance from $L^2[0,1]$ to $L^2[0,1]$) as follows:
$$(T_W f)(x) = \int_0^1 W(x, y)\, f(y)\, dy, \quad x \in [0,1].$$
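As a numerical illustration (our own, not from the paper), discretizing a graphon on a uniform grid turns the integral operator into a scaled matrix. For the rank-one graphon $W(x, y) = xy$, the operator has eigenfunction $f(x) = x$ with eigenvalue $\int_0^1 y^2\, dy = 1/3$, which the discretization recovers:

```python
import numpy as np

m = 500
grid = (np.arange(m) + 0.5) / m    # midpoints of a uniform partition of [0,1]
W = np.outer(grid, grid)           # W(x, y) = x*y sampled on the grid
T = W / m                          # matrix approximation of the operator T_W
lam_max = np.linalg.eigvalsh(T)[-1]
assert abs(lam_max - 1.0 / 3.0) < 1e-3   # eigenvalue of T_W is 1/3
```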
The definitions of eigenvector, PageRank and Katz-Bonacich centralities for graphons in [18] are summarized below. Consider a graphon .
-
(E4)
The graphon eigencentrality for is given by
where the centrality is the eigenfunction in $L^2[0,1]$ associated with the largest eigenvalue of $T_W$, which is assumed to have multiplicity $1$.
-
(E5)
The graphon Katz-Bonacich centrality with for is defined by one of the equivalent forms:
where and is the largest eigenvalue of .
-
(E6)
The graphon PageRank centrality with for is given by
(1) where and if , and zero otherwise. Equivalent representation forms are as follows: and
Proposition 1
The graphon PageRank centrality with $\beta \in (0,1)$ is a probability density function over $[0,1]$. □
Proof
From the equivalent form, we obtain that the centrality values are non-negative over $[0,1]$. Furthermore, based on (1), we verify that the centrality integrates to one over $[0,1]$. Thus it is a probability density. ■
III Fixed-Point Centrality for Finite Networks
A permutation matrix is a square matrix that has exactly one element equal to $1$ in every row and every column and $0$s elsewhere. An $n$-dimensional permutation matrix $P_\pi$ can be obtained by permuting the rows of the identity matrix according to a permutation map $\pi$. For any permutation map $\pi$, the associated permutation matrix $P_\pi$ is orthogonal, that is, $P_\pi P_\pi^{\mathsf T} = I$.
Definition 1 (Permutation Equivariance)
A mapping is permutation equivariant with respect to a permutation map if
where is the permutation matrix corresponding to . A mapping is permutation equivariant if it is permutation equivariant with respect to all permutation maps . □
Definition 2 (Permutation Invariance)
A mapping is permutation invariant with respect to permutation map if
where is the permutation matrix corresponding to . A mapping is permutation invariant if it is permutation invariant with respect to all permutation maps . □
Similarly, a mapping is permutation equivariant (resp. permutation invariant) if (resp. ) for all and all permutation maps .
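For instance, the linear map $(A, x) \mapsto Ax$ is permutation equivariant, while $x \mapsto \sum_i x_i$ is permutation invariant; a quick numerical check of the two definitions (an illustration of our own):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))   # weights may be negative
x = rng.standard_normal(n)
perm = rng.permutation(n)
P = np.eye(n)[perm]               # permutation matrix of the map perm

# Equivariance of f(A, x) = A x: relabelling the inputs relabels the output.
assert np.allclose((P @ A @ P.T) @ (P @ x), P @ (A @ x))

# Invariance of g(x) = sum_i x_i: relabelling the inputs leaves it unchanged.
assert np.isclose((P @ x).sum(), x.sum())
```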
Permutation equivariance and permutation invariance are important properties of many functions associated with DeepSets [37] and graph neural networks [38]. Consider a network whose structure is characterized by a graph $G$ with adjacency matrix $A$ (which may have negative weights) and node set $V$. Let $\mathcal{X}$ denote a set of nodal features and $\mathcal{X}^n$ denote its $n$-fold Cartesian product, and let $\mathcal{X}^n$ be associated with a metric $d$.
Definition 3 (Fixed-Point Centrality)
A centrality is a fixed-point centrality for associated with the feature space if there exists a permutation equivariant mapping , a permutation equivariant mapping , and a unique under the metric such that
(2) $x = f(A, x), \qquad c = g(A, x).$
□
We note that the (symmetric or asymmetric) adjacency matrix of $G$ is allowed to have negative elements. The choices of the mappings $f$ and $g$ are contingent on the network application context, and hence different fixed-point centralities may be associated with the same underlying graph $G$.
The existence of the fixed-point feature is assumed in the definition of fixed-point centrality; with extra assumptions, it can be established via various fixed-point theorems [39] (see, for instance, [40] based on the Kakutani fixed-point theorem and [25] based on the Banach fixed-point theorem). The uniqueness of the fixed-point feature depends on the properties of both $f$ and the feature space. Thus, for the same permutation equivariant mapping $f$, a different choice of feature space may result in non-uniqueness (or even non-existence) of the fixed-point feature. To enforce uniqueness of the fixed-point feature (when it exists), one way is to select a suitable feature set along with the metric for the product space such that uniqueness is defined up to equivalence classes (see Remark 1 below for an example).
Remark 1 (Linear Case)
When the feature space is a vector space and $f$ is linear in the feature argument, the unique fixed point in (2) should be interpreted as the unique $1$-dimensional subspace; in other words, the fixed-point feature is unique up to its linear span. A formal way to obtain uniqueness is to extend the feature vector space to the Grassmannian (i.e. the space of $1$-dimensional linear subspaces) and use a Grassmann distance (see e.g. [41]). □
Fixed-point centralities can be viewed as a specialization of the graph neural network models in [24, 25] to the case where the outputs are non-negative reals and the initial nodal labels are homogeneous. Centralities naturally allow ranking nodes according to their centrality values, whereas in general the outputs of graph neural networks require extra constructions to allow such a ranking.
III-A Examples of Fixed-Point Centrality
Proposition 2
Eigencentrality, Katz-Bonacich centrality, and PageRank centrality are fixed-point centralities. □
Proof
The proof is by identifying the mappings $f$ and $g$ following the definition of fixed-point centrality. The mapping $g$ in (2) is specialized to the identity mapping for Katz-Bonacich centrality, PageRank centrality and eigencentrality. For Katz-Bonacich centrality, $f(A, x) = \alpha A x + \mathbf{1}_n$; we observe that $\|f(A, x) - f(A, y)\|_2 \le \alpha \|A\|_2 \|x - y\|_2$, and hence $f$ is a contraction in $x$ under the vector 2-norm when $\alpha \|A\|_2 < 1$. For PageRank centrality, $f(A, x) = \beta P^{\mathsf T} x + \frac{1-\beta}{n}\mathbf{1}_n$; we note that $\|P^{\mathsf T}\|_1 = 1$, and hence $f$ is a contraction in $x$ under the vector 1-norm. The existence and uniqueness of fixed-point features for these two cases are immediate via the Banach fixed-point theorem. For eigencentrality, the largest eigenvalue of $A$ is assumed to have multiplicity $1$, and the permutation equivariant mapping is $f(A, x) = \frac{1}{\lambda_{\max}(A)} A x$. The fixed-point feature is unique up to its linear span as $f$ is linear in $x$ (see Remark 1). The permutation equivariance properties of the mappings for these centralities can be easily verified. ■
Proposition 3
Any eigenvector corresponding to a nonzero simple eigenvalue of is a fixed-point centrality for . □
The proof is omitted as readers can readily verify the result.
We emphasize that the choice of in (2) can be very general: it can be a set of vectors, matrices, functions, probability distributions, strings, etc. Below we give an example where is the space of continuous functions from to denoted by with .
Proposition 4
The equilibrium nodal cost of LQG Network Mean Field Games [21, Sec. IV-B] with homogeneous initial conditions, if the unique equilibrium exists, is a fixed-point centrality. □
Proof
Following [21, Prop. 1], the network mean field trajectory denoted by with for on a network with nodes satisfies
where denotes the space of continuous functions from to (endowed with the sup norm) and the permutation equivariant mapping
is characterized by a forward-backward coupled differential equation pair [21, Prop. 1]. The equilibrium nodal cost is
with the same for all nodes. Hence it satisfies the definition of fixed-point centrality. ■
We omit the details of the problem formulation of LQG Network Mean Field Games which requires significant space. Interested readers are referred to [21, Sec. IV-B].
III-B Properties of Fixed-Point Centrality
An automorphism of a (directed or undirected) graph $G$ with adjacency matrix $A$ is a permutation map $\pi$ that satisfies $a_{\pi(i)\pi(j)} = a_{ij}$ for all node pairs $(i, j)$; equivalently, $P_\pi A P_\pi^{\mathsf T} = A$.
Proposition 5
Any fixed-point centrality of a graph is permutation invariant with respect to any automorphism map of . □
Proof
Let denote the adjacency matrix of . Let and , where is the permutation matrix corresponding to the permutation map . By the definition of an automorphism , (that is, the adjacency matrix does not change), and hence the fixed-point feature given by satisfies that
In the definition of fixed-point centrality, such fixed-point feature is assumed to be unique. Then . That is, an automorphism does not change the fixed-point features and hence does not change the fixed-point centrality . ■
A vertex transitive graph is a graph $G$ such that for any given node pair $(i, j)$, there exists some automorphism map $\pi$ of $G$ with $\pi(i) = j$. See [42] for examples of vertex transitive graphs.
Proposition 6 (Vertex Transitive Graphs)
All nodes of a vertex transitive graph share the same fixed-point centrality value, that is, any fixed-point centrality for a vertex transitive graph is permutation invariant. □
Proof
Following Prop. 5 and the definition of vertex transitive graphs, we obtain that for each node pair $(i, j)$ there exists some automorphism (possibly a different one for each node pair rather than a single one for all node pairs) such that the fixed-point features of nodes $i$ and $j$ coincide. This implies that all nodes share the same fixed-point feature. Finally, the permutation equivariance of $g$ in (2) leads to the desired result. ■
III-C Centrality Variations with Respect to Graph Variations
Consider two graphs $G_1$ and $G_2$ with the same number of nodes and adjacency matrices $A_1$ and $A_2$. Let $c^1$ be a fixed-point centrality for $G_1$ and $c^2$ that of $G_2$ with the same mappings $f$ and $g$, that is,
(3) $x^k = f(A_k, x^k), \qquad c^k = g(A_k, x^k), \qquad k \in \{1, 2\},$
where the feature space is specialized to $\mathbb{R}^n$. (This specialization is for simplicity of presentation and can be relaxed to any normed vector space.) In this section we study the conditions under which $c^1$ and $c^2$ are close and establish upper bounds on their difference.
Let denote the set of feasible fixed-point features with in (3). Consider the following assumption.
Assumption (A1): (a) There exists such that for all ,
(4)
where the operator norm ;
(b) For any matrix and for any , there exists such that
(5)
where ;
(c) For the given matrix ,
(d) There exists such that for all ,
We call (A1)-(c) the Contraction Condition for Fixed-Point Centrality; if, furthermore, the feature space is complete under the chosen norm, it gives the existence of a unique fixed-point feature via the Banach fixed-point theorem, and one can simply apply fixed-point iterations to compute this fixed-point feature for the given graph.
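To illustrate the Contraction Condition, consider the hypothetical nonlinear map $f(A, x) = \tanh(\alpha A x + \mathbf{1})$ (our own example, not from the paper): since $\tanh$ is $1$-Lipschitz, $f$ is a contraction in $x$ whenever $\alpha \|A\|_2 < 1$, and fixed-point iteration converges geometrically:

```python
import numpy as np

def fixed_point_centrality(A, alpha, tol=1e-12, max_iter=100_000):
    """Iterate x <- tanh(alpha*A x + 1); a contraction when
    alpha*||A||_2 < 1 because tanh is 1-Lipschitz."""
    x = np.zeros(A.shape[0])
    for _ in range(max_iter):
        x_new = np.tanh(alpha * A @ x + 1.0)
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x_new

# On a vertex-transitive graph (e.g. a triangle) all nodes obtain the
# same centrality value, consistent with the vertex-transitivity result.
A_tri = np.ones((3, 3)) - np.eye(3)   # ||A_tri||_2 = 2, so alpha = 0.3 works
c = fixed_point_centrality(A_tri, 0.3)
assert np.allclose(c, c[0])
```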
Remark 2 (Different Choices of Norms)
We note that can take any vector norm, , as long as the operator norm in (A1)-(a) is compatible with the chosen vector norm (that is, ). □
For Katz-Bonacich centrality, we choose the 2-norm, and the condition holds if $\alpha \|A\|_2 < 1$. For PageRank centrality, we choose the 1-norm, and the condition holds if $\beta < 1$.
Remark 3 (Fixed-Point Centrality in the Linear Case)
The condition in (A1) is not satisfied under norm for the (normalized) eigencentrality, as for eigencentrality. To establish the error bound, further spectral properties of the graphs are required (see e.g. [18] via rotation analysis of eigenvectors by perturbations [43]). In general, for fixed-point centralities where is a linear function, one should establish the difference of two 1-dimensional subspaces characterized by and . Such difference can be characterized by the angular difference between the two subspaces as follows:
which is a specialization of a Grassmann distance (see [41]) to 1-dimensional subspaces (i.e. Grassmannian ). For characterizing such differences between and when differs from by a small perturbation, one may employ the error estimation results in [43]. □
Theorem 1
Proof
Following the definition of the fixed-point centrality and Assumption (A1)(a)-(A1)(c),
Hence subtracting and then dividing by on both sides yield
(7)
Then employing the condition in Assumption (A1)-(d) yields the desired result. ■
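For the Katz-Bonacich special case $f(A, x) = \alpha A x + \mathbf{1}_n$, the same fixed-point argument gives the explicit bound $\|c^1 - c^2\|_2 \le \alpha \|A_1 - A_2\|_2 \|c^2\|_2 / (1 - \alpha \|A_1\|_2)$, which is easy to check numerically (a sketch of our own, with arbitrarily chosen matrices):

```python
import numpy as np

def katz(A, alpha):
    """Closed-form Katz-Bonacich centrality (I - alpha*A)^{-1} 1."""
    n = A.shape[0]
    return np.linalg.solve(np.eye(n) - alpha * A, np.ones(n))

rng = np.random.default_rng(1)
n = 6
A1 = rng.random((n, n))
A2 = A1 + 0.01 * rng.random((n, n))   # a small perturbation of A1
# Choose alpha so the contraction condition holds for both matrices.
alpha = 0.9 / max(np.linalg.norm(A1, 2), np.linalg.norm(A2, 2))

c1, c2 = katz(A1, alpha), katz(A2, alpha)
lhs = np.linalg.norm(c1 - c2)
rhs = (alpha * np.linalg.norm(A1 - A2, 2) * np.linalg.norm(c2)
       / (1.0 - alpha * np.linalg.norm(A1, 2)))
assert lhs <= rhs + 1e-12   # variation bound holds
```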
Remark 4
If $f(A, x)$ does not depend on $x$, then the fixed-point equation is trivial and the centrality variation upper bounds above should be treated differently. Such examples include degree, closeness and betweenness centralities. □
Centralities can be associated with probability distributions: PageRank centrality is the steady state distribution of random walks on the graph of hyperlinks [13], and degree centrality is used as the probability distribution for forming new connections [44]. To (uniquely) associate the fixed-point centrality with a probability (mass function), we consider the following assumption.
Assumption (A2): The fixed-point centralities are normalized with nonnegative entries, that is, $c^k_i \ge 0$ for all $i$ and $\sum_{i=1}^n c^k_i = 1$ for $k \in \{1, 2\}$.
Clearly, this implies that each $c^k$ is a probability mass function over the node set.
Remark 5 (Normalization of Centralities)
If a centrality $c$ does not satisfy condition (A2) above, it can be normalized via $\bar c_i = c_i / \sum_j c_j$. This normalization is useful to associate any centrality with a probability distribution. For instance, the degree centrality with this normalization is used in scale-free network models [44] to represent the probability of forming new connections. In general, one can introduce a monotone function $h: \mathbb{R} \to \mathbb{R}_+$ and normalize via
$$\bar c_i = \frac{h(c_i)}{\sum_{j} h(c_j)}.$$
When $h(x) = e^{\beta x}$ with $\beta > 0$, this is the softmax function (i.e. the Boltzmann distribution that maximizes an associated entropy). Such normalizations can be incorporated into the permutation equivariant mapping $g$ in the definition of fixed-point centrality in (2). □
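The two normalizations above, L1 and softmax/Boltzmann, can be sketched as follows (function names are our own):

```python
import numpy as np

def l1_normalize(c):
    """Map a nonnegative centrality vector to a probability mass function."""
    return c / c.sum()

def boltzmann_normalize(c, beta=1.0):
    """Softmax / Boltzmann normalization exp(beta*c_i)/sum_j exp(beta*c_j);
    applicable even when c has negative entries."""
    z = np.exp(beta * (c - c.max()))   # shift by max for numerical stability
    return z / z.sum()
```

Both return a probability vector; the softmax form additionally handles centralities with negative values, such as those arising from graphs with negative weights.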
For a metric space and , let denote the set of all probability measures on with finite th moment. The -Wasserstein distance between two probability measures in is defined as follows:
where denotes the set of probability measures on with marginals and .
Proposition 7
Under Assumptions (A1) and (A2), the following holds for the fixed-point centrality in (3):
(8)
where the matrix operator norm is . □
Proof
Recall from Theorem 1 that
(9)
where . Furthermore, one can verify that
since is just a particular transport map. We obtain the desired result by combining the two inequalities above. ■
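When the nodes are placed on the integer grid $\{0, \dots, n-1\}$ with the usual distance, the 1-Wasserstein distance between two probability mass functions has the closed form $\sum_k |F_p(k) - F_q(k)|$ in terms of their cumulative sums. A small sketch under this (our own) choice of metric on the node set:

```python
import numpy as np

def w1_on_grid(p, q):
    """1-Wasserstein distance between pmfs p, q supported on 0..n-1 with
    unit spacing: the L1 distance between their CDFs."""
    return np.abs(np.cumsum(p - q))[:-1].sum()

# Moving all mass from node 0 to node 2 costs exactly 2.
p = np.array([1.0, 0.0, 0.0])
q = np.array([0.0, 0.0, 1.0])
assert abs(w1_on_grid(p, q) - 2.0) < 1e-12
```

The elementary bound $W_1(p, q) \le (n-1) \cdot \tfrac{1}{2}\|p - q\|_1$ (transport at most the total-variation mass over at most the diameter) mirrors how the proposition controls $W_p$ by a norm of the centrality difference.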
When the 2-norm is chosen, the operator norm is the maximum singular value of the matrix. Consider the matrix cut norm [45]
(10) $\|A\|_\square = \max_{S, T \subseteq \{1,\dots,n\}} \Big| \sum_{i \in S} \sum_{j \in T} a_{ij} \Big|$
(without the scaling factor used in [36, p.127]).
Lemma 1
The following inequality holds for any symmetric matrix with elements :
□
Proof
For any symmetric matrix $A$, the cut norm and the operator 2-norm, scaled respectively by $1/n^2$ and $1/n$, correspond to the graphon cut norm and graphon operator norm in [36] for the stepfunction graphon with uniform partitions associated with $A$. Since the cut norm is dominated by the operator norm for any graphon (see [46, Lem. E.6 and Eqn. (4.4)] or [18, Lem. 7]), we obtain the desired result. ■
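The matrix cut norm in (10) can be evaluated by brute force for very small $n$. A crude comparison of our own, by Cauchy–Schwarz $|\mathbf{1}_S^{\mathsf T} A \mathbf{1}_T| \le \|\mathbf{1}_S\|_2 \|A\|_2 \|\mathbf{1}_T\|_2 \le n \|A\|_2$, can then be checked numerically:

```python
import numpy as np
from itertools import product

def cut_norm_bruteforce(A):
    """max over S, T of |sum_{i in S, j in T} A_ij|; exponential in n,
    so suitable for tiny matrices only."""
    n = A.shape[0]
    best = 0.0
    for S in product((0.0, 1.0), repeat=n):
        row = np.asarray(S) @ A            # column sums restricted to S
        for T in product((0.0, 1.0), repeat=n):
            best = max(best, abs(row @ np.asarray(T)))
    return best

rng = np.random.default_rng(2)
n = 5
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                          # symmetric, as in the lemma
assert cut_norm_bruteforce(A) <= n * np.linalg.norm(A, 2) + 1e-12
```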
Replacing the operator norm by the cut norm in Prop. 7 via the inequality in Lemma 1 yields the following result.
Proposition 8
Consider two symmetric matrices and . Assume (A1) and (A2) for the fixed-point centrality (3) hold. If and for all , then
(11)
where , and denotes the set of all permutations from to . □
IV Fixed-point Centrality for Infinite Networks
Graphons are useful in characterizing and comparing graphs of different size and defining limits of (deterministic or random) graph sequences. This section extends the fixed-point centralities to those for graphons.
IV-A Fixed-Point Centrality for Graphons
Let $\mathcal{W}$ denote the set of symmetric measurable functions $W: [0,1]^2 \to \mathbb{R}$ with bounded range. Let $\mathcal{X}^{[0,1]}$ denote the infinite Cartesian product of the feature set $\mathcal{X}$ with the index set $[0,1]$, and let $d$ denote the metric for $\mathcal{X}^{[0,1]}$. Similar to the finite graph case, a centrality for a graphon with the vertex set $[0,1]$ is defined as a mapping from $[0,1]$ to $\mathbb{R}_+$ which characterizes the “importance” of nodes on the infinite network associated with the underlying graphon.
Definition 4 (Permutation Equivariant Operator)
An operator is permutation equivariant with respect to a measure preserving bijection if
(12)
where and for . An operator is permutation equivariant if it is permutation equivariant with respect to all measure preserving bijections . □
Definition 5 (Permutation Invariant Operator)
An operator is permutation invariant with respect to a measure preserving bijection if
(13)
where and for . An operator is permutation invariant if it is permutation invariant with respect to all measure preserving bijections . □
Similarly, a mapping is permutation equivariant (resp. permutation invariant) if for all measure preserving bijections ,
Definition 6 (Graphon Fixed-Point Centrality)
A centrality is a fixed-point centrality for a graphon associated with the feature space if there exists a permutation equivariant fixed-point mapping , a permutation equivariant mapping , and a unique function under the metric , such that
(14) $x = f(W, x), \qquad c = g(W, x).$
□
We note that the “uniqueness” of the fixed-point feature in the definition above depends on the choice of the feature space and the underlying metric, and it could mean an equivalence class of functions. For example, if we choose the feature space endowed with the $L^2$ norm, then the unique fixed point is interpreted as the equivalence class up to discrepancies on sets of Lebesgue measure zero. Another such example, similar to the finite graph case, is that the “uniqueness” when the fixed-point mapping is linear shall be interpreted as the unique subspace spanned by the fixed point (see Remark 1).
Proposition 9
Graphon eigencentrality, graphon Katz-Bonacich centrality, and graphon PageRank centrality are graphon fixed-point centralities. □
Proposition 10
Any eigenfunction of a graphon operator from to corresponding to a non-zero simple eigenvalue is a graphon fixed-point centrality. □
Proposition 11
The equilibrium nodal cost of LQG Graphon Mean Field Games [21, Sec. IV-C] with homogeneous initial conditions, if the unique equilibrium exists, is a graphon fixed-point centrality. □
In LQG Graphon Mean Field Games [21, Sec. IV-C], the product set is specialized to , where is the dimension of the local state of agents. Proofs of these propositions follow similar arguments as those in the finite network case, and hence are omitted.
IV-B Centrality Variations with Respect to Graphon Variations
Consider two graphons and in , and let and be respectively their fixed-point centralities as in (14), that is,
(15) $x^k = f(W_k, x^k), \qquad c^k = g(W_k, x^k), \qquad k \in \{1, 2\},$
where the feature space is specialized to with , and the operators and are specialized to and .
Let denote the set of feasible fixed-point features associated with in (15). Consider the following assumption.
Assumption (A3):
(a) There exists such that for all ,
(16)
where the operator norm .
(b) For any graphon and , there exists such that
(17)
where .
(c) For the given graphon ,
(d) There exists such that for all ,
Theorem 2
The proof essentially follows the same lines of arguments as those for Theorem 1.
Assumption (A4): The graphon fixed-point centrality satisfies
(19)
that is, .
Proposition 12
Under Assumptions (A3) and (A4), the following holds for the fixed-point centrality in (15):
(20)
where denotes the set of all measure preserving bijections from to and the operator norm is . □
Proposition 13
Consider two graphons and in . Assume (A3) and (A4) for the graphon fixed-point centrality (15) hold. Then the following holds
(21)
where and , and denotes the set of all measure preserving bijections . □
Remark 6
Any finite undirected graph can be represented by a stepfunction graphon [36, Chp. 7.1], and hence the characterization of centrality variations applies to finite graphs as well. Moreover, finite graphs of different sizes can be compared via their graphon representations as well as the associated fixed-point centralities. Thus, for the undirected graph case, the results in Prop. 12 and 13 generalize those in Prop. 7 and 8. □
V Conclusion
The notion of fixed-point centrality proposed in the current paper is useful in at least the following ways: (a) it helps identify properties shared by a large family of centralities and apply similar analysis techniques (e.g. in studying changes of centralities with respect to graph perturbations); (b) the well-established theoretical and numerical results of fixed-point analysis can be readily employed for such centralities; (c) learning and training methods can be readily applied to approximate fixed-point centralities due to their close connection with graph neural networks ([24, 25]).
The connection of fixed-point centralities with LQG mean field games on networks suggests collective multi-agent learning of centralities from the equilibrium cost for (dynamic or repeated) game problems. Fixed-point centralities are also related to certain Markov decision processes: if each state is viewed as a node, the value function (typically characterized as the fixed point of the Bellman operator) is then a mapping from the vertex set to the non-negative reals. Details will be discussed in future extensions.
The representation of sparse graph sequences and their limits requires extra concepts (e.g. graphings for bounded degree graphs [36]; see also [47] for sparse random graph models). Future extensions should formulate fixed-point centralities for sparse graph limit models. Other important future directions include: (a) improving the upper bounds for centrality variations by exploring further properties of the permutation equivariant mappings; (b) axiomatizing fixed-point centralities via extra properties of the mappings and the feature space, similar to [8]; (c) analyzing the change of ranking properties of fixed-point centralities with respect to modifications of networks; (d) exploring variational analysis of fixed-point centralities where the underlying graphs are characterized by vertexon-graphons [48].
References
- [1] K. Faust, “Centrality in affiliation networks,” Social Networks, vol. 19, no. 2, pp. 157–191, 1997.
- [2] A. Landherr, B. Friedl, and J. Heidemann, “A critical review of centrality measures in social networks,” Business & Information Systems Engineering, vol. 2, no. 6, pp. 371–385, 2010.
- [3] M. O. Jackson, Social and Economic Networks. Princeton University Press, 2010.
- [4] D. Koschützki and F. Schreiber, “Centrality analysis methods for biological networks and their application to gene regulatory networks,” Gene Regulation and Systems Biology, vol. 2, pp. GRSB–S702, 2008.
- [5] Z. Wang, A. Scaglione, and R. J. Thomas, “Electrical centrality measures for electric power grid vulnerability analysis,” in Proceedings of the 49th IEEE Conference on Decision and Control (CDC), 2010, pp. 5792–5797.
- [6] C. Ballester, A. Calvó-Armengol, and Y. Zenou, “Who’s who in networks. wanted: The key player,” Econometrica, vol. 74, no. 5, pp. 1403–1417, 2006.
- [7] M. Newman, Networks. Oxford University Press, 2018.
- [8] F. Bloch, M. O. Jackson, and P. Tebaldi, “Centrality measures in networks,” arXiv preprint arXiv:1608.05845, 2016.
- [9] P. Bonacich, “Technique for analyzing overlapping memberships,” Sociological Methodology, vol. 4, pp. 176–185, 1972.
- [10] L. Katz, “A new status index derived from sociometric analysis,” Psychometrika, vol. 18, no. 1, pp. 39–43, 1953.
- [11] P. Bonacich, “Power and centrality: A family of measures,” American Journal of Sociology, vol. 92, no. 5, pp. 1170–1182, 1987.
- [12] P. Bonacich and P. Lloyd, “Eigenvector-like measures of centrality for asymmetric relations,” Social Networks, vol. 23, no. 3, pp. 191–201, 2001.
- [13] S. Brin and L. Page, “The anatomy of a large-scale hypertextual web search engine,” Computer networks and ISDN systems, vol. 30, no. 1-7, pp. 107–117, 1998.
- [14] L. S. Shapley, “A value for n-person games,” in Contributions to the Theory of Games, vol. 2, pp. 307–317, 1953.
- [15] A. Bavelas, “Communication patterns in task-oriented groups,” The Journal of the Acoustical Society of America, vol. 22, no. 6, pp. 725–730, 1950.
- [16] L. C. Freeman, “A set of measures of centrality based on betweenness,” Sociometry, pp. 35–41, 1977.
- [17] A. Banerjee, A. G. Chandrasekhar, E. Duflo, and M. O. Jackson, “The diffusion of microfinance,” Science, vol. 341, no. 6144, p. 1236498, 2013.
- [18] M. Avella-Medina, F. Parise, M. T. Schaub, and S. Segarra, “Centrality measures for graphons: Accounting for uncertainty in networks,” IEEE Transactions on Network Science and Engineering, vol. 7, no. 1, pp. 520–537, 2018.
- [19] T. Başar and G. J. Olsder, Dynamic noncooperative game theory. SIAM, 1998.
- [20] S. Gao, P. E. Caines, and M. Huang, “LQG graphon mean field games: Graphon invariant subspaces,” in Proceedings of the 60th IEEE Conference on Decision and Control, Austin, Texas, USA, December 2021, pp. 5253–5260.
- [21] ——, “LQG graphon mean field games: Analysis via graphon invariant subspaces,” conditionally accepted by IEEE Transactions on Automatic Control, 2021, arXiv preprint arXiv:2004.00679.
- [22] P. E. Caines and M. Huang, “Graphon mean field games and their equations,” SIAM Journal on Control and Optimization, vol. 59, no. 6, pp. 4373–4399, 2021.
- [23] S. Gao, “Centrality-weighted opinion dynamics: Disagreement and social network partition,” in Proceedings of the 60th IEEE Conference on Decision and Control, Austin, Texas, USA, December 2021, pp. 5496–5501.
- [24] M. Gori, G. Monfardini, and F. Scarselli, “A new model for learning in graph domains,” in Proceedings of the 2005 IEEE International Joint Conference on Neural Networks, vol. 2, 2005, pp. 729–734.
- [25] F. Scarselli, M. Gori, A. C. Tsoi, M. Hagenbuchner, and G. Monfardini, “The graph neural network model,” IEEE Transactions on Neural Networks, vol. 20, no. 1, pp. 61–80, 2009.
- [26] F. Gu, H. Chang, W. Zhu, S. Sojoudi, and L. El Ghaoui, “Implicit graph neural networks,” Advances in Neural Information Processing Systems, vol. 33, pp. 11984–11995, 2020.
- [27] L. El Ghaoui, F. Gu, B. Travacca, A. Askari, and A. Tsai, “Implicit deep learning,” SIAM Journal on Mathematics of Data Science, vol. 3, no. 3, pp. 930–958, 2021.
- [28] S. Jafarpour, A. Davydov, A. Proskurnikov, and F. Bullo, “Robust implicit networks via non-Euclidean contractions,” Advances in Neural Information Processing Systems, vol. 34, pp. 9857–9868, 2021.
- [29] A. Davydov, A. V. Proskurnikov, and F. Bullo, “Non-Euclidean contractivity of recurrent neural networks,” arXiv preprint arXiv:2110.08298, 2021.
- [30] S. Bai, J. Z. Kolter, and V. Koltun, “Deep equilibrium models,” Advances in Neural Information Processing Systems, vol. 32, 2019.
- [31] P. L. Combettes and J.-C. Pesquet, “Fixed point strategies in data science,” IEEE Transactions on Signal Processing, vol. 69, pp. 3878–3905, 2021.
- [32] L. Ruiz, L. Chamon, and A. Ribeiro, “Graphon neural networks and the transferability of graph neural networks,” Advances in Neural Information Processing Systems, vol. 33, pp. 1702–1712, 2020.
- [33] L. Lovász and B. Szegedy, “Limits of dense graph sequences,” Journal of Combinatorial Theory, Series B, vol. 96, no. 6, pp. 933–957, 2006.
- [34] C. Borgs, J. T. Chayes, L. Lovász, V. T. Sós, and K. Vesztergombi, “Convergent sequences of dense graphs I: Subgraph frequencies, metric properties and testing,” Advances in Mathematics, vol. 219, no. 6, pp. 1801–1851, 2008.
- [35] ——, “Convergent sequences of dense graphs II: Multiway cuts and statistical physics,” Annals of Mathematics, vol. 176, no. 1, pp. 151–219, 2012.
- [36] L. Lovász, Large Networks and Graph Limits. American Mathematical Soc., 2012, vol. 60.
- [37] M. Zaheer, S. Kottur, S. Ravanbakhsh, B. Poczos, R. Salakhutdinov, and A. Smola, “Deep sets,” arXiv preprint arXiv:1703.06114, 2017.
- [38] M. M. Bronstein, J. Bruna, T. Cohen, and P. Veličković, “Geometric deep learning: Grids, groups, graphs, geodesics, and gauges,” arXiv preprint arXiv:2104.13478, 2021.
- [39] R. P. Agarwal, M. Meehan, and D. O’Regan, Fixed Point Theory and Applications. Cambridge University Press, 2001, vol. 141.
- [40] L. Lyu, B. Fain, K. Munagala, and K. Wang, “Centrality with diversity,” in Proceedings of the 14th ACM International Conference on Web Search and Data Mining, 2021, pp. 644–652.
- [41] K. Ye and L.-H. Lim, “Schubert varieties and distances between subspaces of different dimensions,” SIAM Journal on Matrix Analysis and Applications, vol. 37, no. 3, pp. 1176–1197, 2016.
- [42] E. W. Weisstein, “Vertex-Transitive Graph.” [Online]. Available: https://mathworld.wolfram.com/Vertex-TransitiveGraph.html
- [43] C. Davis and W. M. Kahan, “The rotation of eigenvectors by a perturbation. III,” SIAM Journal on Numerical Analysis, vol. 7, no. 1, pp. 1–46, 1970.
- [44] A.-L. Barabási and R. Albert, “Emergence of scaling in random networks,” Science, vol. 286, no. 5439, pp. 509–512, 1999.
- [45] A. Frieze and R. Kannan, “Quick approximation to matrices and applications,” Combinatorica, vol. 19, no. 2, pp. 175–220, 1999.
- [46] S. Janson, “Graphons, cut norm and distance, couplings and rearrangements,” New York Journal of Mathematics, vol. 4, 2013.
- [47] C. Borgs, J. T. Chayes, H. Cohn, and Y. Zhao, “An $L^p$ theory of sparse graph convergence II: LD convergence, quotients and right convergence,” The Annals of Probability, vol. 46, no. 1, pp. 337–396, 2018.
- [48] P. E. Caines, “Embedded vertexon-graphons and embedded GMFG systems,” Accepted for presentation at the 61st IEEE Conference on Decision and Control (CDC), 2022.