Article

A New Bound in the Littlewood–Offord Problem

by Friedrich Götze 1 and Andrei Yu. Zaitsev 2,3,*
1 Fakultät für Mathematik, Universität Bielefeld, Postfach 100131, D-33501 Bielefeld, Germany
2 St. Petersburg Department of Steklov Mathematical Institute, Fontanka 27, St. Petersburg 191023, Russia
3 Mathematics and Mechanics Faculty, St. Petersburg State University, 7/9 Universitetskaya nab., St. Petersburg 199034, Russia
* Author to whom correspondence should be addressed.
Submission received: 11 April 2022 / Revised: 10 May 2022 / Accepted: 10 May 2022 / Published: 19 May 2022
(This article belongs to the Special Issue Limit Theorems of Probability Theory)

Abstract: The paper studies a connection between the Littlewood–Offord problem and estimates of the concentration functions of certain symmetric infinitely divisible distributions. It is shown that the concentration function of a weighted sum of independent identically distributed random variables is bounded in terms of the concentration function of a symmetric infinitely divisible distribution whose spectral measure is concentrated on the set of plus–minus weights.

The aim of the present work is to supplement the paper of Eliseeva and Zaitsev [1], which studied a connection between the Littlewood–Offord problem and estimates of the concentration functions of certain symmetric infinitely divisible distributions. We repeat the arguments of [1], adding, at the last step, an application of Jensen’s inequality.
Let $X, X_1, \ldots, X_n$ be independent identically distributed (i.i.d.) random variables. The concentration function of an $\mathbb{R}^d$-valued random vector $Y$ with distribution $F = \mathcal{L}(Y)$ is defined by the equality
$$Q(F, \lambda) = \sup_{x \in \mathbb{R}^d} \mathbf{P}\bigl(Y \in x + \lambda B\bigr), \quad 0 \le \lambda \le \infty,$$
where $B = \{x \in \mathbb{R}^d : \|x\| \le 1/2\}$. Of course, $Q(F, \infty) = 1$. Let $a = (a_1, \ldots, a_n)$, where $a_k = (a_{k1}, \ldots, a_{kd}) \in \mathbb{R}^d$, $k = 1, \ldots, n$. In this paper, we study the behavior of the concentration functions of the weighted sums $S_a = \sum_{k=1}^{n} X_k a_k$ with respect to the properties of the vectors $a_k$. Interest in this subject has increased considerably in connection with the study of eigenvalues of random matrices (see, for instance, Friedland and Sodin [2], Rudelson and Vershynin [3,4], Tao and Vu [5,6], Nguyen and Vu [7], Vershynin [8], Tikhomirov [9], Livshyts, Tikhomirov and Vershynin [10], Campos et al. [11]). For a detailed history of the problem, we refer to the review of Nguyen and Vu [12]. The authors of the above articles (see also Halász [13]) called this question the Littlewood–Offord problem, since this problem was first considered in 1943 by Littlewood and Offord [14] in connection with the study of random polynomials. They considered the special case where the coefficients $a_k \in \mathbb{R}$ are one-dimensional and $X$ takes the values $\pm 1$ with probabilities $1/2$.
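As a sanity check (ours, not part of the paper), the concentration function can be computed exactly for small $n$ in the classical one-dimensional case $X = \pm 1$ by enumerating all $2^n$ sign patterns; the function name below is our own choice.

```python
import itertools

def concentration(a, lam):
    """Exact Q(F_a, lam) for S_a = sum(eps_k * a_k) with i.i.d. eps_k = +-1:
    the sup over x of P(S_a lies in a closed interval of length lam)."""
    sums = sorted(sum(e * w for e, w in zip(eps, a))
                  for eps in itertools.product((-1, 1), repeat=len(a)))
    total = len(sums)
    best = 0.0
    for i, s in enumerate(sums):
        # count realizations falling into the closed interval [s, s + lam]
        j = i
        while j < total and sums[j] <= s + lam:
            j += 1
        best = max(best, (j - i) / total)
    return best

# With a = (1, 1, 1, 1), S_a has atoms -4, -2, 0, 2, 4 with masses
# 1/16, 4/16, 6/16, 4/16, 1/16; an interval of length 1 captures at most
# the central atom, so Q(F_a, 1) = 6/16.
```

For equal weights this reproduces the classical bound: the largest mass an interval of length $\lambda < 2$ can capture is the central binomial probability $\binom{n}{\lfloor n/2 \rfloor} 2^{-n} \sim c/\sqrt{n}$.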
The recent achievements in estimating the singularity probabilities of random matrices [9,10,11] were based on the least common denominator method of Rudelson and Vershynin [3,4,8]. Note that the results of [2,4,8] (concerning the Littlewood–Offord problem) were improved and refined in [15,16,17].
Now, we introduce some notation. In the sequel, let $F_a$ denote the distribution of the sum $S_a$, let $E_y$ be the probability measure concentrated at a point $y$, and let $G$ be the distribution of the symmetrized random variable $\widetilde{X} = X_1 - X_2$. For $\delta \ge 0$, we denote
$$p(\delta) = G\{z : |z| > \delta\}. \qquad (1)$$
The symbol $c$ will be used for absolute positive constants which may be different, even in the same formulas.
Writing $A \ll B$ means that $|A| \le cB$. Furthermore, we write $A \asymp B$ if $A \ll B$ and $B \ll A$. We write $A \ll_d B$ if $|A| \le c(d)\,B$, where $c(d) > 0$ depends on $d$ only. Similarly, $A \asymp_d B$ if $A \ll_d B$ and $B \ll_d A$. The scalar product in $\mathbb{R}^d$ is denoted $\langle \cdot, \cdot \rangle$. In what follows, $\lfloor x \rfloor$ is the largest integer $k$ such that $k < x$. For $x = (x_1, \ldots, x_n) \in \mathbb{R}^n$, we use the norms $\|x\|^2 = x_1^2 + \cdots + x_n^2$ and $|x| = \max_j |x_j|$. We denote by $\widehat{F}(t)$, $t \in \mathbb{R}^d$, the characteristic function of a $d$-dimensional distribution $F$.
Products and powers of measures are understood in the convolution sense. For an infinitely divisible distribution $F$ and $\lambda \ge 0$, we denote by $F^{\lambda}$ the infinitely divisible distribution with characteristic function $\widehat{F}^{\lambda}(t)$.
The elementary properties of concentration functions are well studied (see, for instance, [18,19,20]). It is known that
$$Q(F, \mu) \ll_d \bigl(1 + \lfloor \mu/\lambda \rfloor\bigr)^d \, Q(F, \lambda) \qquad (2)$$
for any $\mu, \lambda > 0$. Hence,
$$Q(F, c\lambda) \asymp_d Q(F, \lambda). \qquad (3)$$
Let us formulate a generalization of the classical Esséen inequality [21] to the multivariate case ([22], see also [19]):
Lemma 1.
Let $\tau > 0$ and let $F$ be a $d$-dimensional probability distribution. Then,
$$Q(F, \tau) \ll_d \tau^d \int_{|t| \le 1/\tau} |\widehat{F}(t)| \, dt. \qquad (4)$$
In the general case, $Q(F, \tau)$ cannot be estimated from below by the right-hand side of inequality (4). However, if we assume additionally that the distribution $F$ is symmetric and its characteristic function is non-negative for all $t \in \mathbb{R}^d$, then we have the lower bound
$$Q(F, \tau) \gg_d \tau^d \int_{|t| \le 1/\tau} \widehat{F}(t) \, dt, \qquad (5)$$
and, therefore,
$$Q(F, \tau) \asymp_d \tau^d \int_{|t| \le 1/\tau} \widehat{F}(t) \, dt \qquad (6)$$
(see [23] or [18], Chapter II, Lemma 1.5, for $d = 1$). In the multivariate case, relations (5) and (6) may be found in Zaitsev [24]. The use of relation (6) allows one to simplify the arguments of Friedland and Sodin [2], Rudelson and Vershynin [4] and Vershynin [8], which were applied to the Littlewood–Offord problem (see [15,16,17]).
The main result of this paper is a general inequality which reduces the estimation of concentration functions in the Littlewood–Offord problem to the estimation of concentration functions of some infinitely divisible distributions. This result is formulated in Theorem 1.
For $z \in \mathbb{R}$, introduce the distribution $H_z$ with the characteristic function
$$\widehat{H}_z(t) = \exp\Bigl(-\frac{1}{2} \sum_{k=1}^{n} \bigl(1 - \cos(\langle t, a_k \rangle z)\bigr)\Bigr). \qquad (7)$$
It depends on the vector $a$. It is clear that $H_z$ is a symmetric infinitely divisible distribution. Therefore, its characteristic function is positive for all $t \in \mathbb{R}^d$.
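A quick numerical illustration (ours, not from the paper): in dimension $d = 1$ the function $\widehat{H}_z$ of (7) is real, equals $1$ at the origin, stays in $(0, 1]$, and satisfies the scaling relation $\widehat{H}_z(t) = \widehat{H}_1(zt)$, which is used at the very end of the proof of Theorem 1.

```python
import math

def H_hat(t, a, z):
    """Characteristic function (7) of H_z in dimension d = 1."""
    return math.exp(-0.5 * sum(1.0 - math.cos(t * ak * z) for ak in a))

a = [0.3, -1.2, 2.0, 0.7]   # arbitrary weights a_k for the illustration
for t in (-2.0, -0.5, 0.0, 0.5, 2.0):
    v = H_hat(t, a, 1.7)
    assert 0.0 < v <= 1.0                            # positivity, boundedness
    assert abs(v - H_hat(-t, a, 1.7)) < 1e-12        # symmetry of H_z
    assert abs(v - H_hat(1.7 * t, a, 1.0)) < 1e-12   # H_hat_z(t) = H_hat_1(zt)
assert H_hat(0.0, a, 1.7) == 1.0
```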
Recall that $G = \mathcal{L}(X_1 - X_2)$ and $F_a = \mathcal{L}(S_a)$, where $S_a = \sum_{k=1}^{n} X_k a_k$.
Theorem 1.
Let $V$ be an arbitrary one-dimensional Borel measure such that $\lambda = V\{\mathbb{R}\} > 0$ and $V \le G$, that is, $V\{B\} \le G\{B\}$ for any Borel set $B$. Then, for any $\tau > 0$, we have
$$Q(F_a, \tau) \ll_d \int_{z \in \mathbb{R}} Q\bigl(H_1^{\lambda}, \tau |z|^{-1}\bigr) \, W\{dz\}, \qquad (8)$$
where $W = \lambda^{-1} V$.
Corollary 1.
For any $\varepsilon, \tau > 0$, we have
$$Q(F_a, \tau) \ll_d Q\bigl(H_1^{p(\tau/\varepsilon)}, \varepsilon\bigr), \qquad (9)$$
where $p(\cdot)$ is defined in (1).
In order to verify Corollary 1, we note that the distribution $G = \mathcal{L}(\widetilde{X})$ may be represented as the mixture
$$G = p_0 G_0 + p_1 G_1, \quad \text{where } p_j = \mathbf{P}\bigl(\widetilde{X} \in A_j\bigr), \; j = 0, 1, \qquad (10)$$
$A_0 = \{x : |x| \le \tau/\varepsilon\}$, $A_1 = \{x : |x| > \tau/\varepsilon\}$, and $G_j$ are probability measures defined for $p_j > 0$ by the formula $G_j\{B\} = G\{B \cap A_j\}/p_j$ for any Borel set $B$. In fact, $G_j$ is the conditional distribution of $\widetilde{X}$ given that $\widetilde{X} \in A_j$. If $p_j = 0$, then we can take $G_j$ to be an arbitrary probability measure.
The conditions of Theorem 1 are satisfied for $V = p_1 G_1$, with $\lambda = p_1 = p(\tau/\varepsilon)$ and $W = G_1$.
Inequalities (2) and (6) imply that
$$Q(F_a, \tau) \ll_d \int_{z \in A_1} Q\bigl(H_1^{\lambda}, \tau |z|^{-1}\bigr) \, W\{dz\} \le \sup_{|z| \ge \tau/\varepsilon} Q\bigl(H_1^{p(\tau/\varepsilon)}, \tau/|z|\bigr) = Q\bigl(H_1^{p(\tau/\varepsilon)}, \varepsilon\bigr),$$
where the last equality holds since $Q(F, \cdot)$ is non-decreasing, proving (9).
Applying Theorem 1 with $V$ of the form
$$V\{dz\} = \bigl(1 + \lfloor \tau(\varepsilon|z|)^{-1} \rfloor\bigr)^{-d} \, G\{dz\}, \qquad (11)$$
and using inequality (2), we obtain the following.
Corollary 2.
For any $\varepsilon, \tau > 0$, we have
$$Q(F_a, \tau) \ll_d \lambda^{-1} Q\bigl(H_1^{\lambda}, \varepsilon\bigr), \qquad (12)$$
where
$$\lambda = \lambda(G, \tau/\varepsilon) = V\{\mathbb{R}\} = \int_{z \in \mathbb{R}} \bigl(1 + \lfloor \tau(\varepsilon|z|)^{-1} \rfloor\bigr)^{-d} \, G\{dz\}. \qquad (13)$$
It is clear that $\lfloor \tau(\varepsilon|z|)^{-1} \rfloor = 0$ if $|z| > \tau/\varepsilon$. Therefore, $\lambda = \lambda(G, \tau/\varepsilon) \ge p(\tau/\varepsilon)$ and, hence, $Q(H_1^{\lambda}, \varepsilon) \le Q(H_1^{p(\tau/\varepsilon)}, \varepsilon)$. Thus, if $\lambda \gg_d 1$, then inequality (12) of Corollary 2 is stronger than inequality (9) of Corollary 1.
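To make the comparison concrete, here is a small numerical check (our illustration, with an arbitrary discrete symmetric $G$) that $\lambda(G, \tau/\varepsilon) \ge p(\tau/\varepsilon)$ in dimension $d = 1$, using the bracket $\lfloor x \rfloor$ defined above as the largest integer strictly less than $x$.

```python
import math

def fl(x):
    """Largest integer k with k < x (the paper's bracket; fl(1.0) == 0)."""
    return math.ceil(x) - 1

def lam_and_p(G, tau, eps, d=1):
    """lambda(G, tau/eps) from (13) and p(tau/eps) from (1) for a discrete
    distribution G given as {point: mass}.  The point z = 0 is skipped,
    since there the integrand (1 + fl(tau/(eps*|z|)))^(-d) tends to 0."""
    lam = sum(m * (1 + fl(tau / (eps * abs(z)))) ** (-d)
              for z, m in G.items() if z != 0)
    p = sum(m for z, m in G.items() if abs(z) > tau / eps)
    return lam, p

# arbitrary symmetric example distribution for the symmetrized variable
G = {-4.0: 0.25, -1.0: 0.25, 1.0: 0.25, 4.0: 0.25}
lam, p = lam_and_p(G, tau=2.0, eps=1.0)
assert lam >= p   # lambda(G, tau/eps) >= p(tau/eps)
```

On this example the integrand equals $1$ on $\{|z| > \tau/\varepsilon\}$ and is positive elsewhere, which is exactly why $\lambda \ge p(\tau/\varepsilon)$ in general.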
The proof of Theorem 1 is based on elementary properties of concentration functions. We repeat the arguments of [1], adding, at the last step, an application of Jensen’s inequality. In [1], inequality (2) was used instead. The main result of [1] does not imply Corollary 2. Note that $H_1^{\lambda}$ is an infinitely divisible distribution with the Lévy spectral measure $M_{\lambda} = \frac{\lambda}{4} M^{*}$, where $M^{*} = \sum_{k=1}^{n} \bigl(E_{a_k} + E_{-a_k}\bigr)$. It is clear that the assertions of Theorem 1 and Corollaries 1 and 2 may be treated as statements about the measure $M^{*}$.
Corollary 1 was already proved earlier in [1,25]; see also [26] for the case $\tau = 0$. It was used essentially in [25,27] to show that Arak’s inequalities for concentration functions may be applied to investigations of the Littlewood–Offord problem. Arak has shown that if the concentration function of an infinitely divisible distribution is relatively large, then the spectral measure of this distribution is concentrated in a neighborhood of a set with simple arithmetical structure. Together with Corollary 1, this means that a large value of $Q(F_a, \tau)$ implies a simple arithmetical structure of the set $\{\pm a_k\}_{k=1}^{n}$. This statement is similar to the so-called “inverse principle” in the Littlewood–Offord problem (see [5,7,12]).
Note that, using the results of Arak [23,28] (see also [18]), one could derive from Corollary 1 inequalities similar to the bounds for concentration functions in the Littlewood–Offord problem obtained in a paper of Nguyen and Vu [7] (see also [12]). A detailed discussion of this fact is presented in [25,27]. We have noticed that Corollary 2 may be stronger than Corollary 1. Therefore, the results of [25,27] could be improved (in the sense of the dependence of constants on the distribution of $X_1$) by replacing an application of Corollary 1 with an application of Corollary 2. The authors are going to devote a separate publication to this topic.
Proof of Theorem 1.
Let us show that, for an arbitrary probability distribution $W$ and any $\lambda, T > 0$,
$$\log \int_{|t| \le T} \exp\Bigl(-\frac{1}{2} \sum_{k=1}^{n} \int_{z \in \mathbb{R}} \bigl(1 - \cos(\langle t, a_k \rangle z)\bigr) \, \lambda W\{dz\}\Bigr) \, dt \le \int_{z \in \mathbb{R}} \log\Bigl(\int_{|t| \le T} \exp\Bigl(-\frac{\lambda}{2} \sum_{k=1}^{n} \bigl(1 - \cos(\langle t, a_k \rangle z)\bigr)\Bigr) \, dt\Bigr) \, W\{dz\} = \int_{z \in \mathbb{R}} \log\Bigl(\int_{|t| \le T} \widehat{H}_z^{\lambda}(t) \, dt\Bigr) \, W\{dz\}. \qquad (14)$$
It suffices to prove (14) for discrete distributions $W = \sum_{j=1}^{\infty} p_j E_{z_j}$, where $0 \le p_j \le 1$, $z_j \in \mathbb{R}$, $\sum_{j=1}^{\infty} p_j = 1$. Applying in this case the generalized Hölder inequality, we have
$$\int_{|t| \le T} \exp\Bigl(-\frac{1}{2} \sum_{k=1}^{n} \int_{z \in \mathbb{R}} \bigl(1 - \cos(\langle t, a_k \rangle z)\bigr) \, \lambda W\{dz\}\Bigr) \, dt = \int_{|t| \le T} \exp\Bigl(-\frac{\lambda}{2} \sum_{j=1}^{\infty} p_j \sum_{k=1}^{n} \bigl(1 - \cos(\langle t, a_k z_j \rangle)\bigr)\Bigr) \, dt \le \prod_{j=1}^{\infty} \Bigl(\int_{|t| \le T} \exp\Bigl(-\frac{\lambda}{2} \sum_{k=1}^{n} \bigl(1 - \cos(\langle t, a_k z_j \rangle)\bigr)\Bigr) \, dt\Bigr)^{p_j}. \qquad (15)$$
Taking logarithms of the left- and right-hand sides of (15), we get (14). In the general case, we can approximate $W$ by discrete distributions in the sense of weak convergence and pass to the limit. Note also that the integrals $\int_{|t| \le T} \cdots \, dt$ may be replaced in (14) by integrals $\int \cdots \, \mu\{dt\}$ with an arbitrary Borel measure $\mu$.
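The Hölder step can be checked numerically. The sketch below (ours; all weights and atoms are arbitrary illustrative values) discretizes the $t$-integral by a Riemann sum, for which the generalized Hölder inequality $\int \prod_j g_j^{p_j} \le \prod_j \bigl(\int g_j\bigr)^{p_j}$ with $\sum_j p_j = 1$ holds verbatim.

```python
import math

a = [0.4, -1.1, 2.3]          # weights a_k (arbitrary)
zs = [0.5, 1.0, 3.0]          # atoms z_j of the discrete W
ps = [0.2, 0.5, 0.3]          # probabilities p_j, summing to 1
lam, T, m = 0.8, 5.0, 2000    # exponent lambda, cutoff T, grid size

def g(t, z):
    # integrand exp(-(lam/2) * sum_k (1 - cos(t * a_k * z)))
    return math.exp(-0.5 * lam * sum(1.0 - math.cos(t * ak * z) for ak in a))

ts = [-T + 2.0 * T * i / m for i in range(m)]
dt = 2.0 * T / m
# left-hand side of (15): integral of the p_j-weighted geometric mean
lhs = sum(math.prod(g(t, z) ** p for z, p in zip(zs, ps)) for t in ts) * dt
# right-hand side of (15): product of the integrals raised to the p_j
rhs = math.prod((sum(g(t, z) for t in ts) * dt) ** p for z, p in zip(zs, ps))
assert lhs <= rhs * (1.0 + 1e-9)
```

Taking logarithms of `lhs <= rhs` is exactly inequality (14) for this discrete $W$.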
Since, for the characteristic function $\widehat{U}(t)$ of a random vector $Y$, we have
$$|\widehat{U}(t)|^2 = \mathbf{E} \exp\bigl(i \langle t, \widetilde{Y} \rangle\bigr) = \mathbf{E} \cos\bigl(\langle t, \widetilde{Y} \rangle\bigr),$$
where $\widetilde{Y}$ is the corresponding symmetrized random vector, we obtain
$$|\widehat{U}(t)| \le \exp\Bigl(-\frac{1}{2} \bigl(1 - |\widehat{U}(t)|^2\bigr)\Bigr) = \exp\Bigl(-\frac{1}{2} \, \mathbf{E}\bigl(1 - \cos(\langle t, \widetilde{Y} \rangle)\bigr)\Bigr). \qquad (16)$$
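The elementary inequality behind (16) is $x \le e^{-(1 - x^2)/2}$ for $x \in [0, 1]$ (equivalently, $-\log x \ge (1 - x^2)/2$), applied to $x = |\widehat{U}(t)|$. A brute-force check (ours) on a fine grid:

```python
import math

# verify x <= exp(-(1 - x^2)/2) for x in [0, 1]
for i in range(100001):
    x = i / 100000.0
    assert x <= math.exp(-0.5 * (1.0 - x * x)) + 1e-15
```

Equality holds only at $x = 1$; this is the standard device for replacing $|\widehat{U}(t)|$ by an exponential of the symmetrized characteristic function.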
By Lemma 1 and the relations $V = \lambda W \le G$, (14) and (16), applying Jensen’s inequality in the form $\exp(\mathbf{E} f(\xi)) \le \mathbf{E} \exp(f(\xi))$, valid for any measurable function $f$ and any random variable $\xi$, we have
$$Q(F_a, \tau) \ll_d \tau^d \int_{\tau |t| \le 1} |\widehat{F}_a(t)| \, dt \le \tau^d \int_{\tau |t| \le 1} \exp\Bigl(-\frac{1}{2} \sum_{k=1}^{n} \mathbf{E}\bigl(1 - \cos(\langle t, a_k \rangle \widetilde{X})\bigr)\Bigr) \, dt = \tau^d \int_{\tau |t| \le 1} \exp\Bigl(-\frac{1}{2} \sum_{k=1}^{n} \int_{z \in \mathbb{R}} \bigl(1 - \cos(\langle t, a_k \rangle z)\bigr) \, G\{dz\}\Bigr) \, dt \le \tau^d \int_{\tau |t| \le 1} \exp\Bigl(-\frac{1}{2} \sum_{k=1}^{n} \int_{z \in \mathbb{R}} \bigl(1 - \cos(\langle t, a_k \rangle z)\bigr) \, \lambda W\{dz\}\Bigr) \, dt \le \exp\Bigl(\int_{z \in \mathbb{R}} \log\Bigl(\tau^d \int_{\tau |t| \le 1} \widehat{H}_z^{\lambda}(t) \, dt\Bigr) \, W\{dz\}\Bigr) \le \int_{z \in \mathbb{R}} \Bigl(\tau^d \int_{\tau |t| \le 1} \widehat{H}_z^{\lambda}(t) \, dt\Bigr) \, W\{dz\}. \qquad (17)$$
Using (6), we have
$$\tau^d \int_{\tau |t| \le 1} \widehat{H}_z^{\lambda}(t) \, dt \asymp_d Q\bigl(H_z^{\lambda}, \tau\bigr) = Q\bigl(H_1^{\lambda}, \tau |z|^{-1}\bigr). \qquad (18)$$
Substituting this formula into (17), we obtain (8). In (18), we have used that $H_z^{\lambda} = \mathcal{L}(z\eta)$, where $\eta$ is a random vector with $\mathcal{L}(\eta) = H_1^{\lambda}$. □

Author Contributions

Investigation, F.G.; Writing—original draft, A.Y.Z. All authors have read and agreed to the published version of the manuscript.

Funding

The authors were supported by SFB 1283/2 2021—317210226 and by the RFBR-DFG grant 20-51-12004. The second author was supported by grant RFBR 19-01-00356.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

We are grateful to the anonymous referees for valuable remarks.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Eliseeva, Y.S.; Zaitsev, A.Y. On the Littlewood–Offord problem. J. Math. Sci. 2016, 214, 467–473.
2. Friedland, O.; Sodin, S. Bounds on the concentration function in terms of Diophantine approximation. C. R. Math. Acad. Sci. Paris 2007, 345, 513–518.
3. Rudelson, M.; Vershynin, R. The Littlewood–Offord problem and invertibility of random matrices. Adv. Math. 2008, 218, 600–633.
4. Rudelson, M.; Vershynin, R. The smallest singular value of a random rectangular matrix. Comm. Pure Appl. Math. 2009, 62, 1707–1739.
5. Tao, T.; Vu, V. Inverse Littlewood–Offord theorems and the condition number of random discrete matrices. Ann. Math. 2009, 169, 595–632.
6. Tao, T.; Vu, V. From the Littlewood–Offord problem to the circular law: Universality of the spectral distribution of random matrices. Bull. Amer. Math. Soc. 2009, 46, 377–396.
7. Nguyen, H.; Vu, V. Optimal inverse Littlewood–Offord theorems. Adv. Math. 2011, 226, 5298–5319.
8. Vershynin, R. Invertibility of symmetric random matrices. Random Struct. Algorithms 2014, 44, 135–182.
9. Tikhomirov, K. Singularity of random Bernoulli matrices. Ann. Math. 2020, 191, 593–634.
10. Livshyts, G.V.; Tikhomirov, K.; Vershynin, R. The smallest singular value of inhomogeneous square random matrices. Ann. Probab. 2021, 49, 1286–1309.
11. Campos, M.; Jenssen, M.; Michelen, M.; Sahasrabudhe, J. The singularity probability of a random symmetric matrix is exponentially small. arXiv 2021, arXiv:2105.11384.
12. Nguyen, H.; Vu, V. Small probabilities, inverse theorems and applications. arXiv 2013, arXiv:1301.0019.
13. Halász, G. Estimates for the concentration function of combinatorial number theory and probability. Period. Math. Hung. 1977, 8, 197–211.
14. Littlewood, J.E.; Offord, A.C. On the number of real roots of a random algebraic equation. Rec. Math. [Mat. Sbornik] N.S. 1943, 12, 277–286.
15. Eliseeva, Y.S.; Zaitsev, A.Y. Estimates for the concentration functions of weighted sums of independent random variables. Theory Probab. Appl. 2013, 57, 670–678.
16. Eliseeva, Y.S. Multivariate estimates for the concentration functions of weighted sums of independent identically distributed random variables. Zap. Nauchn. Semin. POMI 2013, 412, 121–137. (In Russian)
17. Eliseeva, Y.S.; Götze, F.; Zaitsev, A.Y. Estimates for the concentration functions in the Littlewood–Offord problem. Zap. Nauchn. Semin. POMI 2013, 420, 50–69. (In Russian)
18. Arak, T.V.; Zaitsev, A.Y. Uniform limit theorems for sums of independent random variables. Proc. Steklov Inst. Math. 1988, 174, 222.
19. Hengartner, W.; Theodorescu, R. Concentration Functions; Academic Press: New York, NY, USA, 1973.
20. Petrov, V.V. Sums of Independent Random Variables; De Gruyter: Moscow, Russia, 1972.
21. Esséen, C.-G. On the Kolmogorov–Rogozin inequality for the concentration function. Z. Wahrscheinlichkeitstheorie Verw. Geb. 1966, 5, 210–216.
22. Esséen, C.-G. On the concentration function of a sum of independent random variables. Z. Wahrscheinlichkeitstheorie Verw. Geb. 1968, 9, 290–308.
23. Arak, T.V. On the approximation by the accompanying laws of n-fold convolutions of distributions with nonnegative characteristic functions. Theory Probab. Appl. 1980, 25, 221–243.
24. Zaitsev, A.Y. Multidimensional generalized method of triangular functions. J. Soviet Math. 1988, 43, 2797–2810.
25. Eliseeva, Y.S.; Götze, F.; Zaitsev, A.Y. Arak inequalities for concentration functions and the Littlewood–Offord problem. Theory Probab. Appl. 2018, 62, 196–215.
26. Zaitsev, A.Y. A bound for the maximal probability in the Littlewood–Offord problem. J. Math. Sci. 2016, 219, 743–746.
27. Götze, F.; Zaitsev, A.Y. New applications of Arak’s inequalities to the Littlewood–Offord problem. Eur. J. Math. 2018, 4, 639–663.
28. Arak, T.V. On the convergence rate in Kolmogorov’s uniform limit theorem. I. Theory Probab. Appl. 1981, 26, 219–239.
Götze, F.; Zaitsev, A.Y. A New Bound in the Littlewood–Offord Problem. Mathematics 2022, 10, 1740. https://0-doi-org.brum.beds.ac.uk/10.3390/math10101740