Article

The Spectral Distribution of Random Mixed Graphs

School of Mathematics and Statistics, Guangdong University of Foreign Studies, Guangzhou 510006, China
* Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
Submission received: 2 February 2022 / Revised: 8 March 2022 / Accepted: 9 March 2022 / Published: 11 March 2022
(This article belongs to the Special Issue Graph Theory with Applications)

Abstract

In this work, we propose a random mixed graph model $G_n(p(n), q(n))$ that incorporates both the classical Erdős–Rényi random graph model and the random oriented graph model. We show that the empirical spectral distribution of $G_n(p(n), q(n))$ converges to the standard semicircle law under a mild condition, and Monte Carlo simulations agree closely with our result.

1. Introduction

A mixed graph $G=(V,E)$ is a graph with vertex set $V=\{v_1,v_2,\ldots,v_n\}$ and edge set $E=\{e_1,e_2,\ldots,e_m\}$, in which some edges may be undirected and some may be directed. In this paper, the Hermitian adjacency matrix is defined in such a way that digons (i.e., pairs of arcs with the same end vertices but in opposite directions) may be thought of as undirected edges. From this point of view, digraphs are equivalent to the class of mixed graphs we consider here. The underlying graph of a mixed graph $G$, denoted by $\Gamma(G)$, is the undirected graph that keeps all vertices and edges of $G$ and changes every arc of $G$ into an undirected edge.
The notion of a mixed graph generalizes the classical approach of orienting either all edges or none. Undirected graphs, oriented graphs, and digraphs are special cases of mixed graphs. We denote an edge (whether directed or not) joining two vertices $u$ and $v$ in $G$ by $uv$. A subgraph of a mixed graph is called a mixed walk, mixed path, or mixed cycle if its underlying graph is a walk, path, or cycle, respectively. The order, size, number of components, degree of a vertex, and distance are defined to be the same as in the underlying graph. Let $W$ be a mixed walk of a mixed graph $G$. Its underlying graph $\Gamma(W)$ (the undirected graph spanned by $W$ in $G$) may contain parallel edges, since $W$ may traverse an edge or arc (in either direction) several times, and is thus not necessarily simple. If we take the edge set of $\Gamma(W)$ without multiplicities, this edge set spans a simple undirected graph, which we denote by $\bar{\Gamma}(W)$. For undefined terminology and notation, we refer the reader to [1].
Let $G_n(p,q) := G_n(p(n), q(n))$ be a random mixed graph on the vertex set $\{v_1, v_2, \ldots, v_n\}$ in which, independently for each pair of vertices $v_k, v_\ell$ (with $k \neq \ell$), there is an undirected edge $v_kv_\ell$ with probability $p$; there is an arc $v_kv_\ell$ from $v_k$ to $v_\ell$ (and the reverse arc $v_\ell v_k$ does not occur) with probability $q$; there is an arc $v_\ell v_k$ from $v_\ell$ to $v_k$ (and the reverse arc does not occur) with probability $q$; and finally there is neither an undirected edge nor an arc between $v_k$ and $v_\ell$ with probability $1-p-2q$ (with $p, q, p+2q \in [0,1]$). It is immediate to see that $G_n(p,q)$ is a mixed graph. Note that if we set $q = 0$, then the model $G_n(p,0)$ is the classical Erdős–Rényi random graph model, see [2]; if we set $p = 0$, then the model $G_n(0, q/2)$ is the random oriented graph model, see [3]; and if $q$ is coupled to $p$ as in [4] (where $q$ is a fixed function of $p$), then $G_n(p,q)$ recovers the random mixed graph model in [4]. Since $q$ is determined by $p$ in that model, it is easy to see that the random mixed graph model in [4] is not a generalization of the classical Erdős–Rényi random graph model. Thus, it is natural to build a more general random mixed graph model $G_n(p,q)$, which incorporates both the classical Erdős–Rényi random graph model and the random oriented graph model.
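To make the model concrete, here is a minimal sampling sketch in Python with NumPy (our own illustration; the paper does not prescribe any implementation, and the function name is hypothetical). For each unordered pair of vertices it draws one of the four mutually exclusive outcomes with probabilities $p$, $q$, $q$ and $1-p-2q$.

```python
import numpy as np

def sample_mixed_graph(n, p, q, seed=None):
    """Sample G_n(p, q): return the undirected edges and the arcs."""
    assert 0 <= p and 0 <= q and p + 2 * q <= 1
    rng = np.random.default_rng(seed)
    undirected, arcs = [], []
    for k in range(n):
        for l in range(k + 1, n):
            u = rng.random()
            if u < p:                      # undirected edge v_k v_l, probability p
                undirected.append((k, l))
            elif u < p + q:                # arc v_k -> v_l, probability q
                arcs.append((k, l))
            elif u < p + 2 * q:            # arc v_l -> v_k, probability q
                arcs.append((l, k))
            # otherwise: no edge between v_k and v_l, probability 1 - p - 2q
    return undirected, arcs

edges, arcs = sample_mixed_graph(n=8, p=0.3, q=0.2, seed=1)
print(edges, arcs)
```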
Note that our model differs from the three-parameter random digraph model of Łuczak and Cohen [5]: their parameters are absolute constants, whereas the parameters in our model are functions of $n$, i.e., $p = p(n)$ and $q = q(n)$. Moreover, they use their model to study the phase transition of a giant strongly connected component, while we use ours to study the empirical spectral distribution.
For a digraph $G$, its classical adjacency matrix is a $0$–$1$ matrix $A(G)$ with rows and columns indexed by the vertices of $G$, such that the $uv$-entry of $A(G)$ equals 1 if there is an arc from $u$ to $v$ and 0 otherwise. Thus, the classical adjacency matrix of a digraph is not necessarily symmetric or Hermitian, and we cannot guarantee that all its eigenvalues are real. This fact makes the study of the spectrum of digraphs or mixed graphs more difficult than that of undirected graphs. Therefore, Liu and Li [6] and Guo and Mohar [7] independently introduced the Hermitian adjacency matrices of mixed graphs or digraphs. In this work, we make use of this concept to study the spectral distribution of random mixed graphs.
For brevity, asymptotic notation is used under the assumption that $n \to \infty$. For functions $f$ and $g$ of the parameter $n$, we write $f = \omega(g)$ if $|f|/|g| \to \infty$ as $n \to \infty$.
The Hermitian adjacency matrix of $G_n(p,q)$, denoted by $H(G_n(p,q)) = (h_{k\ell})_{n\times n}$ (or $H_n$, for brevity), satisfies:
  • $H_n$ is a Hermitian matrix with $h_{k\ell} = \overline{h_{\ell k}}$ for $1 \le k, \ell \le n$ and all diagonal entries $h_{kk} = 0$, $1 \le k \le n$;
  • The upper-triangular entries $h_{k\ell}$ ($1 \le k < \ell \le n$) are independent and identically distributed (i.i.d.) copies of a random variable $\xi$, which takes the value $1$ with probability $p$, $\mathbf{i}$ with probability $q$, $-\mathbf{i}$ with probability $q$, and $0$ with probability $1-p-2q$,
where $\mathbf{i}$ is the imaginary unit with $\mathbf{i} = \sqrt{-1}$. Note that $E(\xi) = 1\cdot p + \mathbf{i}\cdot q + (-\mathbf{i})\cdot q + 0\cdot(1-p-2q) = p$, and
$$\sigma^2 := \mathrm{Var}(\xi) = E\big[(\xi - E(\xi))\overline{(\xi - E(\xi))}\big] = E(|\xi|^2) - |E(\xi)|^2 = 1\cdot[p+2q] + 0\cdot(1-p-2q) - p^2 = p + 2q - p^2.$$
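For example, with $p = 0.3$ and $q = 0.2$, this gives $E(\xi) = 0.3$ and $\sigma^2 = 0.3 + 0.4 - 0.09 = 0.61$.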
Let $\{M_n\}_{n=1}^{\infty}$ be a sequence of $n\times n$ random Hermitian matrices. Suppose that $\lambda_1(M_n), \lambda_2(M_n), \ldots, \lambda_n(M_n)$ are the eigenvalues of $M_n$. The empirical spectral distribution (ESD) of $M_n$ is defined by
$$F^{M_n}(x) = \frac{1}{n}\,\#\{\lambda_k(M_n) \mid \lambda_k(M_n) \le x,\ k = 1, 2, \ldots, n\},$$
where # { · } is the cardinality of the set.
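As an illustration (ours, not part of the paper), the ESD of a Hermitian matrix can be evaluated directly from its eigenvalues; the following sketch assumes Python with NumPy.

```python
import numpy as np

def esd(M):
    """Return the empirical spectral distribution F^M of a Hermitian matrix M."""
    eigenvalues = np.sort(np.linalg.eigvalsh(M))   # real eigenvalues, ascending
    n = len(eigenvalues)
    def F(x):
        # fraction of eigenvalues that are <= x
        return np.searchsorted(eigenvalues, x, side="right") / n
    return F

# Example: a 2 x 2 Hermitian matrix with eigenvalues -1 and 1
F = esd(np.array([[0, 1j], [-1j, 0]]))
print(F(-2), F(0), F(2))   # 0.0 0.5 1.0
```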
The distribution to which the ESD of $M_n$ converges as $n \to \infty$ is called the limiting spectral distribution (LSD) of $\{M_n\}_{n=1}^{\infty}$.
A Wigner matrix, denoted by $X_n$, is an $n\times n$ random Hermitian matrix satisfying:
  • The upper-triangular entries $x_{k\ell}$ ($1 \le k < \ell \le n$) are i.i.d. complex random variables with zero mean and unit variance;
  • The diagonal entries $x_{kk}$ ($1 \le k \le n$) are i.i.d. real random variables, independent of the upper-triangular entries, with zero mean;
  • For each positive integer $r$, $\max\{E(|x_{11}|^r),\, E(|x_{12}|^r)\} < \infty$.
Let $\{X_n\}_{n=1}^{\infty}$ be a sequence of Wigner matrices. Then Wigner's semicircle law [8] says that, with probability 1, the ESD of $n^{-1/2}X_n$ converges to the standard semicircle distribution, whose density is given by
$$\rho(x) = \begin{cases} \dfrac{1}{2\pi}\sqrt{4 - x^2}, & \text{for } |x| \le 2, \\ 0, & \text{for } |x| > 2. \end{cases}$$
Before we give our main results and their proofs, we need the following results.
Lemma 1
(See [9] Lemma 2.4). The number of closed mixed walks of length $2s$ that traverse each edge $v_kv_\ell$ of their underlying graph exactly once from $v_k$ to $v_\ell$ and exactly once in the reverse direction from $v_\ell$ to $v_k$, and whose underlying graph is a tree, is $\frac{1}{s+1}\binom{2s}{s}$.
Lemma 2
(See [9] Lemma 2.1). For a non-negative integer r, we have
$$\int_{-2}^{2} x^r \rho(x)\,dx = \begin{cases} 0, & \text{for } r = 2s+1, \\ \dfrac{1}{s+1}\dbinom{2s}{s}, & \text{for } r = 2s. \end{cases}$$
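Lemma 2 can be spot-checked numerically: the even moments of the semicircle density are the Catalan numbers $\frac{1}{s+1}\binom{2s}{s}$ and the odd moments vanish. The sketch below is our own illustration and assumes Python with NumPy.

```python
import numpy as np
from math import comb

def semicircle_moment(r, num_points=200001):
    """Approximate the integral of x^r * rho(x) over [-2, 2] by the trapezoidal rule."""
    x = np.linspace(-2.0, 2.0, num_points)
    y = x ** r * np.sqrt(4.0 - x ** 2) / (2.0 * np.pi)
    dx = x[1] - x[0]
    return float(np.sum((y[:-1] + y[1:]) * dx / 2.0))

for r in range(1, 9):
    exact = 0.0 if r % 2 else comb(r, r // 2) / (r // 2 + 1)
    print(r, round(semicircle_moment(r), 6), exact)
```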
Lemma 3
(See [9] Theorem A.43). Let $A$ and $B$ be two $n\times n$ Hermitian matrices. Then
$$\|F^{A} - F^{B}\| \le \frac{1}{n}\,\mathrm{rank}(A - B),$$
where $\|f\| := \sup_x |f(x)|$ for a function $f(x)$, and $F^{A}$ denotes the ESD of $A$.
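Lemma 3 is easy to check numerically on a small example (our own sketch, assuming Python with NumPy): perturbing a Hermitian matrix by a rank-one Hermitian matrix moves its ESD by at most $1/n$ in sup norm.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
G = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (G + G.conj().T) / 2                      # a Hermitian matrix
v = rng.standard_normal((n, 1))
B = A + 5.0 * (v @ v.T)                       # Hermitian, rank(A - B) = 1

eA = np.sort(np.linalg.eigvalsh(A))
eB = np.sort(np.linalg.eigvalsh(B))
grid = np.linspace(min(eA[0], eB[0]) - 1, max(eA[-1], eB[-1]) + 1, 2000)
FA = np.searchsorted(eA, grid, side="right") / n
FB = np.searchsorted(eB, grid, side="right") / n
print(np.max(np.abs(FA - FB)), "<=", 1 / n)   # Lemma 3 with rank(A - B) = 1
```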
Lemma 4
(Dini's theorem, see [10] p. 64). Suppose that a sequence of continuous functions $\{s_n(x)\}$ converges pointwise to a continuous function $s(x)$ on $[a,b]$. If, for every $x \in [a,b]$, the sequence $\{s_n(x)\}$ is monotone, then $\{s_n(x)\}$ converges to $s(x)$ uniformly.
The following result is from [11] p. 264, and we give a proof of it in Appendix A.
Lemma 5.
For any distribution function $H$, if $\int |x|^k\,dH < \infty$, then
$$\lim_{n\to\infty}\Big(n^{(k-s)/2}\int_{|x| < \sqrt{n}} |x|^s\,dH\Big) = 0, \qquad s \ge k+1.$$
Lemma 6
(Bernstein's inequality, see [9] p. 21). If $X_1, X_2, \ldots, X_n$ are independent random variables with mean zero and uniformly bounded by $b$, then for any $\epsilon > 0$,
$$P(|S_n| \ge \epsilon) \le 2\exp\Big(-\frac{\epsilon^2}{2(B_n^2 + b\epsilon)}\Big),$$
where $S_n = X_1 + X_2 + \cdots + X_n$ and $B_n^2 = E(S_n^2)$.
Lemma 7
(Borel–Cantelli lemma, see [12] Theorem 3.18). Let $E_1, E_2, \ldots$ be a sequence of events in some probability space. If the sum of the probabilities of the $E_n$ is finite, that is, $\sum_{n=1}^{\infty} P(E_n) < \infty$, then the probability that infinitely many of them occur is 0, that is,
$$P\Big(\limsup_{n\to\infty} E_n\Big) = 0.$$
Remark 1.
As a consequence of Lemma 7, for any $\epsilon > 0$ and any sequence $\{X_n\}$ of random variables, we have:
  • If $\sum_{n=1}^{\infty} P(|X_n| \ge \epsilon) < \infty$, then $X_n \to 0$ a.s.;
  • If $\sum_{n=1}^{\infty} E|X_n - X|^k < \infty$ for some $k > 0$, then $X_n \to X$ a.s.
Define
$$M_n := \frac{1}{\sigma}\big[H_n - p(J_n - I_n)\big] = (\eta_{k\ell}),$$
where $p := p(n)$ is the parameter from $G_n(p,q)$, $\sigma = \sqrt{p(1-p) + 2q}$, $J_n$ is the all-ones matrix of order $n$, and $I_n$ is the identity matrix of order $n$. It is easy to check that:
  • $M_n$ is a Hermitian matrix;
  • The diagonal entries $\eta_{kk} = 0$, and the upper-triangular entries $\eta_{k\ell}$ ($1 \le k < \ell \le n$) are i.i.d. copies of a random variable $\eta$, which takes the value $\frac{1-p}{\sigma}$ with probability $p$, $\frac{\mathbf{i}-p}{\sigma}$ with probability $q$, $\frac{-\mathbf{i}-p}{\sigma}$ with probability $q$, and $-\frac{p}{\sigma}$ with probability $1-p-2q$.
Note that $E(\eta) = E\big(\frac{\xi-p}{\sigma}\big) = \frac{E(\xi)-p}{\sigma} = 0$, $\mathrm{Var}(\eta) = \mathrm{Var}\big(\frac{\xi-p}{\sigma}\big) = \frac{\mathrm{Var}(\xi)}{\sigma^2} = 1$, and the expectation
$$E(|\eta|^s) = E\Big(\Big|\frac{\xi-p}{\sigma}\Big|^s\Big) = \frac{E(|\xi-p|^s)}{\sigma^s} = \frac{|1-p|^s\cdot p + |\mathbf{i}-p|^s\cdot q + |{-\mathbf{i}}-p|^s\cdot q + |0-p|^s\cdot(1-p-2q)}{(p+2q-p^2)^{s/2}} = \frac{p(1-p)^s + 2q(1+p^2)^{s/2} + (1-p-2q)p^s}{\big(p(1-p)+2q\big)^{s/2}}. \qquad (1)$$
On the one hand, since $0 \le p := p(n),\, q := q(n),\, 1-p-2q \le 1$, it is easy to see from (1) that if $\sigma^2 = p(1-p) + 2q \nrightarrow 0$ as $n \to \infty$, then $E(|\eta|^s) < \infty$ for every positive integer $s$ as $n \to \infty$. In this case $\{M_n\}_{n=1}^{\infty}$ is a sequence of Wigner matrices, and hence the LSD of $\{M_n\}_{n=1}^{\infty}$ follows immediately from Wigner's semicircle law. On the other hand, if $\sigma^2 \to 0$ as $n \to \infty$, then $E(|\eta|^s) \to \infty$ for every positive integer $s \ge 3$ as $n \to \infty$, and then $\{M_n\}_{n=1}^{\infty}$ is not a sequence of Wigner matrices. Thus, if $\sigma^2 \to 0$ as $n \to \infty$, the LSD of $M_n$ cannot be derived directly from Wigner's semicircle law. In fact, if $\sigma^2 \to 0$ as $n \to \infty$, then either $p \to 0,\, q \to 0$ or $p \to 1,\, q \to 0$.
For the case $p \to 0$, $q \to 0$, we have
$$E(|\eta|^s) \ge \frac{p(1-p)^s}{\big(p(1-p)+2q\big)^{s/2}} \ge \frac{p(1-p)^s}{p^{s/2}(1-p)^{s/2}} = \frac{(1-p)^{s/2}}{p^{s/2-1}} \to +\infty.$$
For the case $p \to 1$, $q \to 0$, we have
$$E(|\eta|^s) \ge \frac{(1-p-2q)p^s}{\big(p(1-p)+2q\big)^{s/2}} \ge \frac{(1-p)p^s}{p^{s/2}(1-p)^{s/2}} = \frac{p^{s/2}}{(1-p)^{s/2-1}} \to +\infty.$$
From the discussion above, we have:
Lemma 8.
The sequence of matrices $\{M_n\}_{n=1}^{\infty}$ is a sequence of Wigner matrices if and only if $\sigma^2 \nrightarrow 0$.
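The dichotomy in Lemma 8 can be seen numerically by evaluating $E(|\eta|^s)$ from Equation (1); the sketch below (our own, plain Python) shows the fourth moment staying bounded when $\sigma^2$ is bounded away from zero and blowing up as $p, q \to 0$.

```python
def eta_abs_moment(p, q, s):
    """E(|eta|^s) evaluated from Equation (1)."""
    sigma2 = p * (1 - p) + 2 * q
    numerator = (p * (1 - p) ** s
                 + 2 * q * (1 + p ** 2) ** (s / 2)
                 + (1 - p - 2 * q) * p ** s)
    return numerator / sigma2 ** (s / 2)

# sigma^2 bounded away from zero: the fourth moment stays bounded (Wigner regime)
print(eta_abs_moment(0.3, 0.2, 4))
# sigma^2 -> 0 (here p -> 0, q -> 0): the fourth moment blows up, roughly like 1/sigma^2
for p in (1e-2, 1e-4, 1e-6):
    print(p, eta_abs_moment(p, p, 4))
```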
Now we arrive at the main results, which say that even when $\sigma^2 \to 0$, we can still obtain the semicircle distribution as the LSD of the corresponding sequence of matrices, provided $n\sigma^2 \to \infty$.
Theorem 1.
Let $\{H_n\}_{n=1}^{\infty}$ be a sequence of Hermitian adjacency matrices of random mixed graphs $\{G_n(p,q)\}_{n=1}^{\infty}$ with $p := p(n)$, $q := q(n)$ and $\sigma^2 = p(1-p) + 2q$ $\big(0 \le p,\, q,\, 1-(p+2q) \le 1\big)$. If $\sigma^2 = \omega\big(\frac{1}{n}\big)$, then with probability 1, the ESD of $\frac{1}{\sigma\sqrt{n}}H_n$ converges to the standard semicircle distribution with density $\rho(x)$.
Note that the above result considers an ensemble of random Hermitian matrices for which the corresponding random mixed graphs are not necessarily connected. Before we proceed to the proof, we use the Monte Carlo method to demonstrate Theorem 1 (a minimal simulation sketch is given after the list below); it is easy to see from Figure 1 that:
(1)
As $n\sigma^2$ increases from a small value to a relatively large value, the ESD becomes closer and closer to the semicircle distribution.
(2)
In the first row of Figure 1, we set $p = 0$, $n = 1000$ and increase $q$ from 0.001 to 0.1 (from right to left); then $n\sigma^2$ becomes larger and larger, and the ESD gets closer and closer to the semicircle distribution.
(3)
In the second row, the same happens as in the first row, except that we fix $q = 0$, $n = 1000$ and increase $p$ from 0.001 to 0.1 (from right to left).
(4)
In the third row, we fix $n = 1000$ and increase the values of $p, q$ (from right to left). In this case, $n\sigma^2$ increases simultaneously with $p, q$, and the ESD gets closer and closer to the semicircle distribution.
(5)
In the fourth row, we fix $p = q = 0.01$ and change $n$ from 10 to 100 to 10,000 (from right to left); meanwhile, the corresponding $n\sigma^2$ increases. Finally, we recover the semicircle law.
(6)
In the fifth row, we fix $q = 0.01$, $n = 1000$ and increase the value of $p$, and we find that $n\sigma^2$ changes from 110 to 270 (from right to left). All ESDs fit the semicircle law very well.
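The following self-contained sketch (ours; the paper provides no code, and the particular parameter values are just one of the settings described above) reproduces the kind of comparison shown in Figure 1: it samples $H_n$, rescales by $\frac{1}{\sigma\sqrt{n}}$, and plots the eigenvalue histogram against the semicircle density $\rho(x)$. It assumes NumPy and Matplotlib are available.

```python
import numpy as np
import matplotlib.pyplot as plt

def sample_hermitian_adjacency(n, p, q, rng):
    """Hermitian adjacency matrix H_n of G_n(p, q): upper entries are 1, i, -i or 0."""
    values = np.array([1.0, 1j, -1j, 0.0])
    probs = [p, q, q, 1.0 - p - 2.0 * q]
    iu = np.triu_indices(n, k=1)
    upper = values[rng.choice(4, size=iu[0].size, p=probs)]
    H = np.zeros((n, n), dtype=complex)
    H[iu] = upper
    return H + H.conj().T                    # zero diagonal, h_lk = conj(h_kl)

rng = np.random.default_rng(0)
n, p, q = 1000, 0.05, 0.01                   # here n * sigma^2 is about 67
sigma = np.sqrt(p * (1 - p) + 2 * q)
eigs = np.linalg.eigvalsh(sample_hermitian_adjacency(n, p, q, rng) / (sigma * np.sqrt(n)))

x = np.linspace(-2, 2, 400)
plt.hist(eigs, bins=60, density=True, alpha=0.5, label="ESD of H_n / (sigma sqrt(n))")
plt.plot(x, np.sqrt(4 - x ** 2) / (2 * np.pi), label="semicircle density rho(x)")
plt.legend()
plt.show()
```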
Remark 2.
The result of the theorem means that if $\sigma^2 = \omega\big(\frac{1}{n}\big)$, then for any bounded continuous function $f$, we have
$$\int f(x)\,dF^{\frac{1}{\sigma\sqrt{n}}H_n(\alpha)}(x) \to \int f(x)\,dF(x) \quad a.s.$$
Clearly,
$$\frac{1}{n}\sum_{i=1}^{n} f(\lambda_i) \to \int f(x)\rho(x)\,dx \quad a.s.$$
Here, $\lambda_i := \lambda_i\big(\frac{1}{\sigma\sqrt{n}}H_n(\alpha)\big)$ is the $i$-th eigenvalue of the matrix $\frac{1}{\sigma\sqrt{n}}H_n(\alpha)$. The proof is exactly that of Theorem 1.15 in [13], and we omit it.
Remark 3.
If we set $q = 0$, then our model $G_n(p,0)$ is the classical Erdős–Rényi random graph model. Since $p = \omega\big(\frac{1}{n}\big)$ together with $1-p = \omega\big(\frac{1}{n}\big)$ implies $\sigma^2 = \omega\big(\frac{1}{n}\big)$, we obtain the result of Theorem 1.3 in [14] as a corollary. Note that the paper [14] deals with a much more difficult situation, and the original proof of Wigner [8] may be extended without difficulty to derive Theorem 1.3.
In fact, for $q = 0$, $\sigma^2 = \omega\big(\frac{1}{n}\big)$ is equivalent to having both $p = \omega\big(\frac{1}{n}\big)$ and $1-p = \omega\big(\frac{1}{n}\big)$. We can therefore say a little more.
Corollary 1
([14]). Let $\{H_n\}_{n=1}^{\infty}$ be a sequence of adjacency matrices of random graphs $\{G_n(p,0)\}_{n=1}^{\infty}$ with $p := p(n)$ and $\sigma^2 = p(1-p)$ $(0 \le p,\, 1-p \le 1)$. If $p = \omega\big(\frac{1}{n}\big)$ and $1-p = \omega\big(\frac{1}{n}\big)$, then with probability 1, the ESD of $\frac{1}{\sigma\sqrt{n}}H_n$ converges to the standard semicircle distribution with density $\rho(x)$.
Remark 4.
If we set $p = 0$, then our model $G_n(0, q/2)$ is essentially the random oriented graph model, since the skew-adjacency matrix satisfies $S(G) = -\mathbf{i}\,H(G)$ for an oriented graph $G$. Therefore, they share the same spectral distribution. Since $q = \omega\big(\frac{1}{n}\big)$ implies $\sigma^2 = \omega\big(\frac{1}{n}\big)$, we obtain the result of Theorem 3.1 in [3] as a corollary.
Corollary 2
([3]). Let $\{H_n\}_{n=1}^{\infty}$ be a sequence of Hermitian adjacency matrices of random oriented graphs $\{G_n(0, q/2)\}_{n=1}^{\infty}$ with $q := q(n)$ and $\sigma^2 = q$ $(0 \le q \le 1)$. If $q = \omega\big(\frac{1}{n}\big)$, then with probability 1, the ESD of $\frac{1}{\sigma\sqrt{n}}H_n$ converges to the standard semicircle distribution with density $\rho(x)$.
The proof of Theorem 1 will be given in the sequel. We shall first prove the following result.
Theorem 2.
If $\sigma^2 = \omega\big(\frac{1}{n}\big)$, then with probability 1, the ESD of $T_n := n^{-1/2}M_n$ converges to the standard semicircle distribution.
Proof. 
Denote by $T_{r,n}$ the $r$-th moment of the ESD of $T_n$. Since the standard semicircular distribution $F$ has finite support, it is uniquely characterized by its sequence of moments. Thus, to prove the weak convergence of $F^{T_n}$ to $F$, it is equivalent to prove the convergence of moments by the Moment Convergence Theorem (see, e.g., [9]), that is,
$$T_{r,n} = \int x^r\,dF^{T_n}(x) \to \int x^r\rho(x)\,dx \quad a.s.,\ n\to\infty,\qquad r = 1, 2, \ldots \qquad (2)$$
Since
$$T_{r,n} = \int x^r\,dF^{T_n}(x) = \frac{1}{n}\sum_{k=1}^{n}\lambda_k^r(T_n) = \frac{1}{n}\,\mathrm{trace}(T_n^r) = \frac{1}{n}\,\mathrm{trace}\Big[\Big(\frac{1}{\sqrt{n}}M_n\Big)^r\Big] = \frac{1}{n^{1+r/2}}\,\mathrm{trace}(M_n^r) = \frac{1}{n^{1+r/2}}\sum_{1\le k_1,\ldots,k_r\le n}\eta_{k_1k_2}\eta_{k_2k_3}\cdots\eta_{k_rk_1}, \qquad (3)$$
where $W := v_{k_1}v_{k_2}\cdots v_{k_{r-1}}v_{k_r}v_{k_1}$ corresponds to a closed mixed walk of length $r$ in the complete digraph $DK_n$ of order $n$, and $DK_n$ is formed by replacing every undirected edge of the complete undirected graph $K_n$ of order $n$ with a pair of arcs in opposite directions. For each edge $v_kv_\ell \in E(\bar{\Gamma}(W))$, let $q_{k\ell}^{W}$ be the number of times that the walk $W$ goes from vertex $v_k$ to vertex $v_\ell$. Let $m_{k\ell}^{W} := q_{k\ell} + q_{\ell k}$. If there is no ambiguity, we will omit the superscript $W$ for brevity.
Let
$$\eta(W) := \prod_{v_kv_\ell \in E(\bar{\Gamma}(W))} \eta_{k\ell}^{\,q_{k\ell}}\,\eta_{\ell k}^{\,q_{\ell k}}.$$
Then we rewrite (3) as
$$T_{r,n} = \frac{1}{n^{1+r/2}}\sum_{W}\eta(W) = \frac{1}{n^{1+r/2}}\sum_{W}\prod_{v_kv_\ell \in E(\bar{\Gamma}(W))} \eta_{k\ell}^{\,q_{k\ell}}\,\eta_{\ell k}^{\,q_{\ell k}}. \qquad (4)$$
Here, the summation is taken over all closed mixed walks $W$ of length $r$. Noting that all distinct edges (with different pairs of end vertices) of a mixed graph are mutually independent, we have
$$E(T_{r,n}) = \frac{1}{n^{1+r/2}}\sum_{W}E(\eta(W)) = \frac{1}{n^{1+r/2}}\sum_{W}\prod_{v_kv_\ell \in E(\bar{\Gamma}(W))} E\big(\eta_{k\ell}^{\,q_{k\ell}}\,\eta_{\ell k}^{\,q_{\ell k}}\big). \qquad (5)$$
Define $\widetilde{M}_n = (\widetilde{\eta}_{k\ell})$, where
$$\widetilde{\eta}_{k\ell} = \begin{cases} \eta_{k\ell}, & \text{if } |\eta_{k\ell}| < \sqrt{n}, \\ 0, & \text{if } |\eta_{k\ell}| \ge \sqrt{n}. \end{cases}$$
Let
$$\widetilde{T}_n = \frac{1}{\sqrt{n}}\widetilde{M}_n = \Big(\frac{\widetilde{\eta}_{k\ell}}{\sqrt{n}}\Big),$$
and let $\widetilde{T}_{r,n}$ be the $r$-th moment of the ESD of the matrix $\widetilde{T}_n$. Similarly to (3)–(5), we have
$$\widetilde{T}_{r,n} = \frac{1}{n^{1+r/2}}\sum_{W}\widetilde{\eta}(W) = \frac{1}{n^{1+r/2}}\sum_{W}\prod_{v_kv_\ell \in E(\bar{\Gamma}(W))} \widetilde{\eta}_{k\ell}^{\,q_{k\ell}}\,\widetilde{\eta}_{\ell k}^{\,q_{\ell k}},$$
and
$$E(\widetilde{T}_{r,n}) = \frac{1}{n^{1+r/2}}\sum_{W}E(\widetilde{\eta}(W)) = \frac{1}{n^{1+r/2}}\sum_{W}\prod_{v_kv_\ell \in E(\bar{\Gamma}(W))} E\big(\widetilde{\eta}_{k\ell}^{\,q_{k\ell}}\,\widetilde{\eta}_{\ell k}^{\,q_{\ell k}}\big).$$
Next, we will prove the following three facts.
Fact 1.
$$\lim_{n\to\infty} E(\widetilde{T}_{r,n}) = \int_{-2}^{2} x^r\rho(x)\,dx = \begin{cases} 0, & \text{for } r = 2s+1, \\ \dfrac{1}{s+1}\dbinom{2s}{s}, & \text{for } r = 2s. \end{cases}$$
Fact 2.
$$\lim_{n\to\infty} \widetilde{T}_{r,n} = \lim_{n\to\infty} E(\widetilde{T}_{r,n}) \quad a.s.$$
Fact 3.
$$\lim_{n\to\infty} T_{r,n} = \lim_{n\to\infty} \widetilde{T}_{r,n} \quad a.s.$$
Combining Facts 1–3 above, we obtain (2) and complete the proof. □

1.1. Proof of Fact 1

Proof. 
The second equality is just Lemma 2. Now, we consider the remaining (first) equality.
We decompose $E(\widetilde{T}_{r,n})$ into parts $E_{m,r,n}$, $m = 1, 2, \ldots, r$, containing the $m$-fold sums,
$$E(\widetilde{T}_{r,n}) = \sum_{m=1}^{r} E_{m,r,n},$$
where
$$E_{m,r,n} = \frac{1}{n^{1+r/2}}\sum_{W:\,|E(\bar{\Gamma}(W))| = m} E(\widetilde{\eta}(W)) := \frac{1}{n^{1+r/2}}\sum_{W:\,|E(\bar{\Gamma}(W))| = m}\ \prod_{v_kv_\ell \in E(\bar{\Gamma}(W))} E\big(\widetilde{\eta}_{k\ell}^{\,q_{k\ell}}\,\widetilde{\eta}_{\ell k}^{\,q_{\ell k}}\big). \qquad (9)$$
Here, the summation in (9) is taken over all closed mixed walks $W$ of length $r$ with $|E(\bar{\Gamma}(W))| = m$. Moreover, $|E(\bar{\Gamma}(W))| = m$ means that the cardinality of the edge set of $\bar{\Gamma}(W)$ is $m$, and thus $|V(\Gamma(W))| \le m+1$.
For a closed mixed walk $W := v_{k_1}v_{k_2}\cdots v_{k_{r-1}}v_{k_r}v_{k_1}$, its underlying graph $\Gamma(W)$ is connected, since there is an edge between any two consecutive vertices $v_{k_j}$ and $v_{k_{j+1}}$, $j = 1, \ldots, r-1$. Note that $\Gamma(W)$ is a multigraph and $\bar{\Gamma}(W)$ is a simple graph. For a closed mixed walk $W$ of length $r$ with $|E(\bar{\Gamma}(W))| = m$, if $W$ goes through some edge only once, then $E(\widetilde{\eta}(W)) = 0$. Hence, if $m > \frac{r}{2}$, then there must be some edge that appears only once in $W$, and thus $E(\widetilde{\eta}(W)) = 0$. In the following, we only consider closed mixed walks $W$ of length $r$ with $m \le \frac{r}{2}$ and $m_{k\ell} \ge 2$ for all edges $v_kv_\ell$ in $\bar{\Gamma}(W)$.
If $m_{k\ell} = 2$ for an edge $v_kv_\ell$ in $\bar{\Gamma}(W)$, then
$$\big|E\big(\widetilde{\eta}_{k\ell}^{\,q_{k\ell}}\,\widetilde{\eta}_{\ell k}^{\,q_{\ell k}}\big)\big| \le E(|\widetilde{\eta}_{k\ell}|^2) = E\big(|\eta|^2 I_{\{|\eta| < \sqrt{n}\}}\big) \le E(|\eta|^2) = 1, \quad \text{and} \quad E\big(|\eta|^2 I_{\{|\eta| < \sqrt{n}\}}\big) \to 1 \ \text{as } n\to\infty.$$
If $m_{k\ell} \ge 3$ for an edge $v_kv_\ell$ in $\bar{\Gamma}(W)$, then by Lemma 5 with $k = 2$ and $E(|\widetilde{\eta}|^2) \le E(|\eta|^2) = 1$, we have
$$\big|E\big(\widetilde{\eta}_{k\ell}^{\,q_{k\ell}}\,\widetilde{\eta}_{\ell k}^{\,q_{\ell k}}\big)\big| \le E\big(|\widetilde{\eta}_{k\ell}|^{q_{k\ell}}\,|\widetilde{\eta}_{\ell k}|^{q_{\ell k}}\big) = E\big(|\widetilde{\eta}_{k\ell}|^{m_{k\ell}}\big) = o\big(n^{m_{k\ell}/2 - 1}\big) \quad \text{as } n\to\infty.$$
Moreover, we have
$$\big|E\big(\widetilde{\eta}_{k\ell}^{\,q_{k\ell}}\,\widetilde{\eta}_{\ell k}^{\,q_{\ell k}}\big)\big| \le E\big(|\widetilde{\eta}_{k\ell}|^{m_{k\ell}}\big) \le n^{m_{k\ell}/2 - 1}$$
for every edge $v_kv_\ell$ in $\bar{\Gamma}(W)$ with $m_{k\ell} \ge 2$.
Therefore, if we decompose the edge set of $\bar{\Gamma}(W)$ into the sets
$$E_2(W) := \{v_kv_\ell \mid m_{k\ell} = 2 \ \text{in}\ \Gamma(W)\}, \qquad E_3(W) := \{v_kv_\ell \mid m_{k\ell} \ge 3 \ \text{in}\ \Gamma(W)\},$$
then $\sum_{v_kv_\ell \in E_2(W)} m_{k\ell} + \sum_{v_kv_\ell \in E_3(W)} m_{k\ell} = r$ and $|E_2(W)| + |E_3(W)| = m$, and we have
$$\big|E(\widetilde{\eta}(W))\big| = \prod_{v_kv_\ell \in \bar{\Gamma}(W)} \big|E\big(\widetilde{\eta}_{k\ell}^{\,q_{k\ell}}\,\widetilde{\eta}_{\ell k}^{\,q_{\ell k}}\big)\big| \le \prod_{v_kv_\ell \in \bar{\Gamma}(W)} E\big(|\widetilde{\eta}_{k\ell}|^{m_{k\ell}}\big) = \prod_{v_kv_\ell \in E_2(W)} E\big(|\widetilde{\eta}_{k\ell}|^{m_{k\ell}}\big)\prod_{v_kv_\ell \in E_3(W)} E\big(|\widetilde{\eta}_{k\ell}|^{m_{k\ell}}\big) \le \prod_{v_kv_\ell \in E_2(W)} n^{m_{k\ell}/2-1}\prod_{v_kv_\ell \in E_3(W)} o\big(n^{m_{k\ell}/2-1}\big).$$
Thus, if $|E_3(W)| = 0$, we get $|E(\widetilde{\eta}(W))| \le 1$; otherwise,
$$\big|E(\widetilde{\eta}(W))\big| \le \prod_{v_kv_\ell \in E_2(W)} n^{m_{k\ell}/2-1}\cdot\prod_{v_kv_\ell \in E_3(W)} o\big(n^{m_{k\ell}/2-1}\big) = n^{\sum_{v_kv_\ell \in E_2(W)}(m_{k\ell}/2-1)}\cdot o\Big(n^{\sum_{v_kv_\ell \in E_3(W)}(m_{k\ell}/2-1)}\Big) = o\big(n^{r/2-m}\big).$$
Note that $|E(\bar{\Gamma}(W))| = m$ and $|V(\Gamma(W))| \le m+1$, and then the number of such closed walks $W$ of length $r$ is at most $n^{m+1}\cdot(m+1)^r$.
Now, we consider the following two cases.
Case 1. The closed walks $W$ of length $r$ with $E_3(W) \ne \emptyset$.
We have
$$|E_{m,r,n}| = \Big|\frac{1}{n^{1+r/2}}\sum_{W:\,|E(\bar{\Gamma}(W))| = m}\ \prod_{v_kv_\ell \in \bar{\Gamma}(W)} E\big(\widetilde{\eta}_{k\ell}^{\,q_{k\ell}}\,\widetilde{\eta}_{\ell k}^{\,q_{\ell k}}\big)\Big| \le \frac{1}{n^{1+r/2}}\sum_{W:\,|E(\bar{\Gamma}(W))| = m}\ \prod_{v_kv_\ell \in E_2(W)} E\big(|\widetilde{\eta}_{k\ell}|^{m_{k\ell}}\big)\prod_{v_kv_\ell \in E_3(W)} E\big(|\widetilde{\eta}_{k\ell}|^{m_{k\ell}}\big) \le \frac{1}{n^{1+r/2}}\sum_{W:\,|E(\bar{\Gamma}(W))| = m}\ \prod_{v_kv_\ell \in E_2(W)} n^{m_{k\ell}/2-1}\prod_{v_kv_\ell \in E_3(W)} o\big(n^{m_{k\ell}/2-1}\big) \le \frac{n^{m+1}(m+1)^r}{n^{1+r/2}}\cdot o\big(n^{r/2-m}\big) = (m+1)^r\cdot o(1) \to 0 \quad \text{as } n\to\infty.$$
Case 2. The closed walks $W$ of length $r$ with $E_3(W) = \emptyset$.
In this case, $|E(\widetilde{\eta}(W))| \le 1$ by the discussion above. We divide this case into two subcases according to the cardinality of $V(\bar{\Gamma}(W))$.
Subcase 2.1. The closed walks $W$ of length $r$ with $|V(\bar{\Gamma}(W))| \le m$.
The number of such closed walks $W$ of length $r$ is at most $n^m\cdot m^r$, and
$$\Big|\frac{1}{n^{1+r/2}}\sum_{W:\,|E(\bar{\Gamma}(W))| = m,\ |V(\bar{\Gamma}(W))| \le m} E(\widetilde{\eta}(W))\Big| \le \frac{1}{n^{1+r/2}}\sum_{W:\,|E(\bar{\Gamma}(W))| = m,\ |V(\bar{\Gamma}(W))| \le m} 1 \le \frac{n^m\cdot m^r}{n^{1+r/2}} = \frac{m^r}{n^{1+r/2-m}} \le \frac{m^r}{n} \to 0 \quad \text{as } n\to\infty.$$
Subcase 2.2. The closed walks $W$ of length $r$ with $|V(\bar{\Gamma}(W))| = m+1$ and $E_3(W) = \emptyset$.
In this subcase, $m_{k\ell} = 2$ for each edge $v_kv_\ell \in E(\bar{\Gamma}(W))$. Since $\bar{\Gamma}(W)$ is connected, $\bar{\Gamma}(W)$ is a tree with $m+1$ vertices, and $q_{k\ell} = 1$, $q_{\ell k} = 1$ for every edge $v_kv_\ell \in \bar{\Gamma}(W)$. Then $r = 2m$ and $W$ is a closed mixed walk of length $2m$ which satisfies the condition of Lemma 1. By Lemma 1, the number of such $W$ on a given ordered set of $m+1$ vertices is $\frac{1}{m+1}\binom{2m}{m}$. The number of ways to choose the $m+1$ vertices of $W$, in the natural order of first appearance in $W$, from $n$ vertices is $n(n-1)\cdots(n-m)$. Note that $E(\widetilde{\eta}(W)) \le 1$ and $E(\widetilde{\eta}(W)) \to 1$ as $n\to\infty$. Thus,
$$\frac{1}{n^{1+m}}\sum_{W:\,|E(\bar{\Gamma}(W))| = m,\ |V(\bar{\Gamma}(W))| = m+1} E(\widetilde{\eta}(W)) \sim \frac{1}{n^{1+m}}\sum_{W:\,|E(\bar{\Gamma}(W))| = m,\ |V(\bar{\Gamma}(W))| = m+1} 1 = \frac{n(n-1)\cdots(n-m)}{n^{1+m}}\cdot\frac{1}{m+1}\binom{2m}{m} = \big(1+O(n^{-1})\big)\cdot\frac{1}{m+1}\binom{2m}{m} \to \frac{1}{m+1}\binom{2m}{m} \quad \text{as } n\to\infty.$$
Therefore,
$$\lim_{n\to\infty} E_{m,2m,n} = \frac{1}{m+1}\binom{2m}{m}.$$
From the above discussion, we have
$$\lim_{n\to\infty} E(\widetilde{T}_{r,n}) = \begin{cases} \dfrac{1}{m+1}\dbinom{2m}{m}, & \text{for } r = 2m, \\ 0, & \text{otherwise}, \end{cases}$$
which completes the proof. □

1.2. Proof of Fact 2

Proof. 
Note that $\widetilde{T}_{r,n} = \frac{1}{n^{1+r/2}}\sum_W \widetilde{\eta}(W)$, and since $\widetilde{T}_{r,n}$ is a real random variable and so is $E(\widetilde{T}_{r,n})$, we have
$$E\big[|\widetilde{T}_{r,n} - E(\widetilde{T}_{r,n})|^4\big] = E\big[(\widetilde{T}_{r,n} - E(\widetilde{T}_{r,n}))^4\big] = \frac{1}{n^{4+2r}}\sum_{W_1,\ldots,W_4} E\Big(\prod_{k=1}^{4}\big[\widetilde{\eta}(W_k) - E(\widetilde{\eta}(W_k))\big]\Big), \qquad (11)$$
where, by the symmetry of the $W_k$ ($k = 1, 2, 3, 4$), each $W_k$ ($k = 1, \ldots, 4$) corresponds to a closed mixed walk of length $r$ in the complete digraph $DK_n$ of order $n$. We aim to show that the right-hand side of Equation (11) is $O(n^{-2})$. Let $k_0 \in \{1,2,3,4\}$. If $\bar{\Gamma}(W_{k_0})$ has no common edge with $\Gamma(\bar{W}\setminus W_{k_0})$, where $\bar{W} := W_1\cup W_2\cup W_3\cup W_4$, then the corresponding expectation of the summand in Equation (11) is zero. If there is an edge $v_{k_0}v_{\ell_0}$ whose number of occurrences in $\Gamma(\bar{W})$ is 1, then the corresponding expectation of the summand in Equation (11) is also zero. Thus, below we only have to consider the case where no edge $v_{k_0}v_{\ell_0} \in E(\Gamma(\bar{W}))$ has $m_{k_0\ell_0}^{\bar{W}} = 1$ and each $\bar{\Gamma}(W_k)$ shares at least one edge with $\bar{\Gamma}(\bar{W}\setminus W_k)$ for $k = 1, 2, 3, 4$. Thus,
$$E\big(|\widetilde{T}_{r,n} - E(\widetilde{T}_{r,n})|^4\big) = \frac{1}{n^{4+2r}}\sum_{W_1,\ldots,W_4}\Big[E\Big(\prod_{\ell=1}^{4}\widetilde{\eta}(W_\ell)\Big) - 4E(\widetilde{\eta}(W_1))E\Big(\prod_{\ell=2}^{4}\widetilde{\eta}(W_\ell)\Big) + 6E\big(\widetilde{\eta}(W_1)\widetilde{\eta}(W_2)\big)E\big(\widetilde{\eta}(W_3)\widetilde{\eta}(W_4)\big) - 3\prod_{\ell=1}^{4}E(\widetilde{\eta}(W_\ell))\Big] \le \frac{1}{n^{4+2r}}\sum_{W_1,\ldots,W_4}\Big[\Big|E\Big(\prod_{\ell=1}^{4}\widetilde{\eta}(W_\ell)\Big)\Big| + 4\Big|E(\widetilde{\eta}(W_1))E\Big(\prod_{\ell=2}^{4}\widetilde{\eta}(W_\ell)\Big)\Big| + 6\Big|E\big(\widetilde{\eta}(W_1)\widetilde{\eta}(W_2)\big)E\big(\widetilde{\eta}(W_3)\widetilde{\eta}(W_4)\big)\Big| + 3\Big|\prod_{\ell=1}^{4}E(\widetilde{\eta}(W_\ell))\Big|\Big].$$
For closed mixed walks $W_k$ ($k = 1, \ldots, 4$) of length $r$, let $|E(\bar{\Gamma}(\bar{W}))| = x$, $|E(\bar{\Gamma}(W_1))| = x_1$ and $|E(\bar{\Gamma}(\bigcup_{j=2}^{4}W_j))| = x_2$; then $x \le x_1 + x_2$, $x_1 \le r/2$, $x_2 \le 3r/2$, $x \le 2r$, and
$$\sum_{v_kv_\ell \in E(\bar{\Gamma}(\bar{W}))} m_{k\ell}^{\bar{W}} = 4r,$$
which implies
$$\big|E(\widetilde{\eta}(W_1))\big| \le n^{r/2 - x_1}, \qquad \Big|E\Big(\prod_{j=2}^{4}\widetilde{\eta}(W_j)\Big)\Big| \le n^{3r/2 - x_2}.$$
Thus,
$$\Big|E(\widetilde{\eta}(W_1))\,E\Big(\prod_{j=2}^{4}\widetilde{\eta}(W_j)\Big)\Big| \le n^{2r - x_1 - x_2} \le n^{2r - x}.$$
Similarly, we have
$$\Big|E(\widetilde{\eta}(W_1))\,E(\widetilde{\eta}(W_2))\,E\Big(\prod_{j=3}^{4}\widetilde{\eta}(W_j)\Big)\Big| \le n^{2r - x}$$
and
$$\big|E(\widetilde{\eta}(W_1))E(\widetilde{\eta}(W_2))E(\widetilde{\eta}(W_3))E(\widetilde{\eta}(W_4))\big| \le n^{2r - x}.$$
Therefore,
$$E\big(|\widetilde{T}_{r,n} - E(\widetilde{T}_{r,n})|^4\big) \le \frac{1}{n^{4+2r}}\sum_{W_1,\ldots,W_4} 14\,n^{2r - x} = \frac{14}{n^4}\sum_{W_1,\ldots,W_4} n^{-x}.$$
Since each $W_j$ is a closed mixed walk of length $r$, $\Gamma(W_j)$ ($j = 1, 2, 3, 4$) is connected. Combining the previous discussion, the number of connected components of $\Gamma(\bar{W})$ is at most 2. Then there are at most $x+2$ vertices in $\Gamma(\bar{W})$, which means that the number of such $\bar{W}$ is at most $n^{x+2}\,c(x,r)$, with $c(x,r)$ a constant depending only on $x$ and $r$. Hence,
$$\frac{14}{n^4}\sum_{W_1,\ldots,W_4} n^{-x} = \frac{14}{n^4}\sum_{x=1}^{2r}\ \sum_{\substack{W_1,\ldots,W_4:\\ |E(\bar{\Gamma}(\bar{W}))| = x}} n^{-x} \le \frac{14}{n^4}\sum_{x=1}^{2r} n^{2}\,c(x,r) = O(n^{-2}).$$
Thus, for every positive integer $r$,
$$\sum_{n=1}^{\infty} E\big(|\widetilde{T}_{r,n} - E(\widetilde{T}_{r,n})|^4\big) = \sum_{n=1}^{\infty} O(n^{-2}) < \infty.$$
By Lemma 7 (see Remark 1), we have $\lim_{n\to\infty}\widetilde{T}_{r,n} = \lim_{n\to\infty} E(\widetilde{T}_{r,n})$ a.s. This finishes the proof of Fact 2. □

1.3. Proof of Fact 3

Proof. 
Note that
$$T_{r,n} = \int x^r\,dF^{T_n}(x) = \int x^r\,dF^{n^{-1/2}M_n}(x)$$
and
$$\widetilde{T}_{r,n} = \int x^r\,dF^{\widetilde{T}_n}(x) = \int x^r\,dF^{n^{-1/2}\widetilde{M}_n}(x).$$
So, by Lemma 3, we have
$$\|F^{T_n} - F^{\widetilde{T}_n}\| = \|F^{n^{-1/2}M_n} - F^{n^{-1/2}\widetilde{M}_n}\| \le \frac{1}{n}\,\mathrm{rank}\big(n^{-1/2}M_n - n^{-1/2}\widetilde{M}_n\big) = \frac{1}{n}\,\mathrm{rank}(M_n - \widetilde{M}_n).$$
Notice that $\mathrm{rank}(M_n - \widetilde{M}_n)$ is at most the number of nonzero entries of $M_n - \widetilde{M}_n$, which is bounded by $\sum_{1\le j,k\le n} I_{\{|\eta_{jk}| \ge \sqrt{n}\}}$, where
$$I_{\{|\eta_{jk}| \ge \sqrt{n}\}} = \begin{cases} 1, & \text{if } |\eta_{jk}| \ge \sqrt{n}, \\ 0, & \text{if } |\eta_{jk}| < \sqrt{n}. \end{cases}$$
Then
$$\|F^{T_n} - F^{\widetilde{T}_n}\| \le \frac{1}{n}\sum_{1\le j,k\le n} I_{\{|\eta_{jk}| \ge \sqrt{n}\}}.$$
Noting that $\eta_{jk} = \overline{\eta_{kj}}$ and that the diagonal entries vanish, we get
$$\|F^{T_n} - F^{\widetilde{T}_n}\| \le \frac{2}{n}\sum_{1\le j<k\le n} I_{\{|\eta_{jk}| \ge \sqrt{n}\}}.$$
Here, the summation contains $\frac{n^2-n}{2}$ terms. We can show that the right-hand side converges to zero almost surely using Lemma 6. In fact, set $\beta_n = P(|\eta| \ge \sqrt{n})$, and for any $\epsilon > 0$, consider
$$P\Big(\frac{1}{n}\sum_{1\le j<k\le n} I_{\{|\eta_{jk}| \ge \sqrt{n}\}} \ge \epsilon\Big) = P\Big(\sum_{1\le j<k\le n}\big(I_{\{|\eta_{jk}| \ge \sqrt{n}\}} - \beta_n\big) \ge \epsilon n - \beta_n\cdot\frac{n^2-n}{2}\Big) \le 2\exp\Bigg(-\frac{\big(\epsilon n - \beta_n\cdot\frac{n^2-n}{2}\big)^2}{2\big[\frac{n^2-n}{2}(\beta_n - \beta_n^2) + \epsilon n - \beta_n\cdot\frac{n^2-n}{2}\big]}\Bigg) = 2\exp\Bigg(-\frac{\big(\epsilon - \beta_n\cdot\frac{n-1}{2}\big)^2\,n}{2\big(\epsilon - \frac{n-1}{2}\beta_n^2\big)}\Bigg).$$
Here, we have used $E\big(I_{\{|\eta_{jk}| \ge \sqrt{n}\}} - \beta_n\big) = 0$ and $\mathrm{Var}\big(I_{\{|\eta_{jk}| \ge \sqrt{n}\}} - \beta_n\big) = \beta_n - \beta_n^2$. Note that
$$\frac{\big(\epsilon - \beta_n\cdot\frac{n-1}{2}\big)^2}{\epsilon - \frac{n-1}{2}\beta_n^2} \ge \frac{\big(\epsilon - \beta_n\cdot\frac{n-1}{2}\big)^2}{\epsilon - \frac{n-1}{2}\beta_n} = \epsilon - \beta_n\cdot\frac{n-1}{2} \ge \epsilon - \frac{n\beta_n}{2}.$$
Claim. If $\sigma^2 = \omega\big(\frac{1}{n}\big)$, then $\beta_n = o\big(\frac{1}{n}\big)$.
Let $s$ be any integer no less than 3. Then, by Markov's inequality,
$$n\beta_n = nP(|\eta| \ge \sqrt{n}) \le \frac{nE(|\eta|^s)}{(\sqrt{n})^s} = \frac{n\big[p(1-p)^s + 2q(1+p^2)^{s/2} + (1-p-2q)p^s\big]}{\big[n(p+2q-p^2)\big]^{s/2}} = \frac{np(1-p)^s + 2nq(1+p^2)^{s/2} + n(1-p-2q)p^s}{\big[np(1-p)+2nq\big]^{s/2}} = \frac{np(1-p)^s}{\big[np(1-p)+2nq\big]^{s/2}} + \frac{2nq(1+p^2)^{s/2}}{\big[np(1-p)+2nq\big]^{s/2}} + \frac{n(1-p-2q)p^s}{\big[np(1-p)+2nq\big]^{s/2}}. \qquad (12)$$
By direct calculation, each of the last three summands in (12) tends to zero if $\sigma^2 = \omega\big(\frac{1}{n}\big)$.
By the claim above, $n\beta_n \to 0$ if $\sigma^2 = \omega\big(\frac{1}{n}\big)$; hence, for all sufficiently large $n$, $n\beta_n$ is less than $\epsilon$, and then we have
$$P\Big(\frac{1}{n}\sum_{1\le j<k\le n} I_{\{|\eta_{jk}| \ge \sqrt{n}\}} \ge \epsilon\Big) \le 2\exp(-\epsilon^2 n).$$
Finally we get
$$\sum_{n=1}^{\infty} P\Big(\frac{1}{n}\sum_{1\le j<k\le n} I_{\{|\eta_{jk}| \ge \sqrt{n}\}} \ge \epsilon\Big) < \infty,$$
which completes the proof by Lemma 7. □

2. Proof of Theorem 1

Proof. 
Recall that
$$T_n = n^{-1/2}M_n = \frac{1}{\sigma\sqrt{n}}\big[(H_n + pI_n) - pJ_n\big],$$
and set
$$T_n^{*} = \frac{1}{\sigma\sqrt{n}}(H_n + pI_n).$$
Then
$$T_n^{*} - T_n = \frac{p}{\sigma\sqrt{n}}J_n.$$
By Lemma 3, we have
$$\|F^{T_n^{*}} - F^{T_n}\| \le \frac{1}{n}\cdot\mathrm{rank}\Big(\frac{p}{\sigma\sqrt{n}}J_n\Big) = \frac{1}{n}.$$
This implies that $T_n^{*}$ and $T_n$ have the same LSD. By Theorem 2, we have
$$\lim_{n\to\infty} F^{T_n^{*}}(x) = \lim_{n\to\infty} F^{T_n}(x) = F(x) := \int_{-\infty}^{x}\rho(t)\,dt \quad a.s.$$
Set $T_n^{**} = \frac{1}{\sigma\sqrt{n}}H_n$. Note that
$$T_n^{*} - T_n^{**} = \frac{p}{\sigma\sqrt{n}}I_n =: \Delta_n I_n,$$
and
$$\lim_{n\to\infty}\Delta_n^2 = \lim_{n\to\infty}\frac{p^2}{n\sigma^2} = 0, \quad \text{since } \sigma^2 = \omega\Big(\frac{1}{n}\Big).$$
Since $T_n^{*} = T_n^{**} + \Delta_n I_n$, we have that $\lambda$ is an eigenvalue of $T_n^{**}$ if and only if $\lambda + \Delta_n$ is an eigenvalue of $T_n^{*}$. By the definition of $F^{T_n^{*}}$ and $F^{T_n^{**}}$, we know
$$F^{T_n^{**}}(x) = \frac{1}{n}\#\{\lambda_i(T_n^{**}) \mid \lambda_i(T_n^{**}) \le x,\ i = 1, 2, \ldots, n\} = \frac{1}{n}\#\{\lambda_i(T_n^{**}) \mid \lambda_i(T_n^{**}) + \Delta_n \le x + \Delta_n,\ i = 1, 2, \ldots, n\} = \frac{1}{n}\#\{\lambda_i(T_n^{*}) \mid \lambda_i(T_n^{*}) \le x + \Delta_n,\ i = 1, 2, \ldots, n\} = F^{T_n^{*}}(x + \Delta_n).$$
Note that by Lemma 4 and the continuity of $F(x)$, we have
$$\lim_{n\to\infty} F^{T_n^{*}}(x + \Delta_n) = \lim_{n\to\infty}\big[F^{T_n^{*}}(x + \Delta_n) - F(x + \Delta_n) + F(x + \Delta_n)\big] = \lim_{n\to\infty}\big[F^{T_n^{*}}(x + \Delta_n) - F(x + \Delta_n)\big] + \lim_{n\to\infty} F(x + \Delta_n) = F(x) \quad a.s.$$
Finally, we get
$$\lim_{n\to\infty} F^{T_n^{**}}(x) = \lim_{n\to\infty} F^{T_n^{*}}(x + \Delta_n) = F(x) \quad a.s.$$
This completes the proof. □

3. Conclusions

In this work, we propose a random mixed graph model $G_n(p(n), q(n))$. The model generalizes the classical Erdős–Rényi random graph model and the random oriented graph model. Furthermore, we give a sufficient condition: if $\sigma^2 = \omega\big(\frac{1}{n}\big)$, then with probability 1, the ESD of $\frac{1}{\sigma\sqrt{n}}H_n$ converges to the standard semicircle distribution. The simulation results not only support our theorem but also indicate that the condition may be necessary. We would like to resolve this problem in future work.

Author Contributions

Conceptualization, J.L. and B.C.; methodology, J.L. and Y.G.; software, Y.G.; validation, M.L., M.C. and L.Z.; formal analysis and investigation, J.L., Y.G. and C.Y.; resources, M.L., J.L. and J.W.; writing—original draft preparation, J.L. and Y.G.; writing—review and editing, M.L. and B.C.; supervision, J.L.; project administration, J.L.; funding acquisition, J.L., Y.G., M.C. and M.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (Nos. 11701105, 12001117, 12001118) and the Natural Science Foundation of Guangdong Province (Nos. 2016A030313686, 2018A030313267).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The research was also funded by the Innovation Team Project: “Graph Theory, Cybernetics and Their Applications in Finance” of Guangdong University of Foreign Studies. We would like to thank the anonymous reviewers for their careful work and constructive suggestions that have helped improve this paper substantially.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Lemma A1.
For $p > 0$, if $E|X|^p < \infty$, then $\lim_{x\to\infty} x^p P(|X| > x) = 0$.
Proof. 
$$P(|X| > x) = \int_{\Omega} I_{\{|X| > x\}}\,dP \le \int_{\Omega} \frac{|X|^p}{x^p}\,I_{\{|X| > x\}}\,dP = \frac{1}{x^p}\int_{\Omega} |X|^p\,I_{\{|X| > x\}}\,dP,$$
and since $|X|^p I_{\{|X| > x\}}$ converges to zero almost surely as $x\to\infty$ and is dominated by the integrable function $|X|^p$, the dominated convergence theorem gives that $\int_{\Omega} |X|^p\,I_{\{|X| > x\}}\,dP$ tends to zero; hence $x^p P(|X| > x) \to 0$. □
The following result is from [15] with $L(x) \equiv 1$.
Lemma A2.
Suppose $n^{t+1}P(|X| > n^{1/r}) \to 0$ as $n\to\infty$. For $\alpha > 0$ and $0 < \gamma \le 1$, if $\alpha > r(t+1)$, then
$$\int_{|x| < n^{\gamma/r}} |x|^{\alpha}\,dF(x) = o\big(n^{\gamma(\alpha/r - t - 1)}\big).$$
Proof of Lemma 5.
Suppose that $H$ is the distribution function of the random variable $X$. Taking $p = k$ and $x = \sqrt{n}$ in Lemma A1, we have
$$n^{k/2}P(|X| > \sqrt{n}) = o(1).$$
Taking $r = 2$, $\gamma = 1$, $\alpha = s$ and $t = k/2 - 1$ in Lemma A2, the hypothesis $n^{t+1}P(|X| > \sqrt{n}) = o(1)$ holds, which yields $\int_{|x| < \sqrt{n}} |x|^s\,dH(x) = o(n^{s/2 - t - 1})$ if $s > 2(t+1)$. Here, $2(t+1) = k < s$ since $s \ge k+1 > k$. Meanwhile, $s/2 - t - 1 = (s-k)/2$, i.e.,
$$\int_{|x| < \sqrt{n}} |x|^s\,dH(x) = o\big(n^{(s-k)/2}\big).$$
Finally we get
$$n^{(k-s)/2}\int_{|x| < \sqrt{n}} |x|^s\,dH(x) = o(1). \qquad \square$$

References

  1. Cvetković, D.; Doob, M.; Sachs, H. Spectra of Graphs, 3rd ed.; Johann Ambrosius Barth: Heidelberg, Germany, 1995. [Google Scholar]
  2. Bollobás, B. Random Graphs; Number 73; Cambridge University Press: Cambridge, UK, 2001. [Google Scholar]
  3. Chen, X.; Li, X.; Lian, H. The skew energy of random oriented graphs. Linear Algebra Appl. 2013, 438, 4547–4556. [Google Scholar] [CrossRef]
  4. Hu, D.; Li, X.; Liu, X.; Zhang, S. The spectral distribution of random mixed graphs. Linear Algebra Appl. 2017, 519, 343–365. [Google Scholar] [CrossRef]
  5. Łuczak, T.; Cohen, J.E. Giant components in three-parameter random directed graphs. Adv. Appl. Probab. 1992, 24, 845–857. [Google Scholar] [CrossRef] [Green Version]
  6. Liu, J.; Li, X. Hermitian-adjacency matrices and hermitian energies of mixed graphs. Linear Algebra Appl. 2015, 466, 182–207. [Google Scholar] [CrossRef]
  7. Guo, K.; Mohar, B. Hermitian adjacency matrix of digraphs and mixed graphs. J. Graph Theory 2017, 85, 217–248. [Google Scholar] [CrossRef] [Green Version]
  8. Wigner, E.P. Characteristic vectors of bordered matrices with infinite dimensions. Ann. Math. 1955, 62, 548–564. [Google Scholar] [CrossRef]
  9. Bai, Z.; Silverstein, J.W. Spectral Analysis of Large Dimensional Random Matrices; Springer: Berlin/Heidelberg, Germany, 2010; Volume 20. [Google Scholar]
  10. Royden, H.; Fitzpatrick, P. Real Analysis; Macmillan: New York, NY, USA, 1988; Volume 32. [Google Scholar]
  11. Arnold, L. On the asymptotic distribution of the eigenvalues of random matrices. J. Math. Anal. Appl. 1967, 20, 262–268. [Google Scholar] [CrossRef] [Green Version]
  12. Kallenberg, O. Foundations of Modern Probability; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
  13. Guionnet, A. Large random matrices: Lectures on macroscopic asymptotics. In Lecture Notes in Mathematics; Lectures from the 36th Probability Summer School held in Saint-Flour, 2006; Springer: Berlin/Heidelberg, Germany, 2009; Volume 1957. [Google Scholar]
  14. Tran, L.; Vu, V.; Wang, K. Sparse random graphs: Eigenvalues and eigenvectors. Random Struct. Algorithms 2013, 42, 110–134. [Google Scholar] [CrossRef] [Green Version]
  15. Heyde, C.; Rohatgi, V. A pair of complementary theorems on convergence rates in the law of large numbers. In Mathematical Proceedings of the Cambridge Philosophical Society; Cambridge University Press: Cambridge, UK, 1967; Volume 63, pp. 73–82. [Google Scholar]
Figure 1. Simulation for Theorem 1.

