Introduction to Random Matrices

C. Tracy, H. Widom

Published 1992

ABSTRACT

Here τ(a) denotes the Fredholm determinant det(1 − K), where K is the integral operator with kernel sin(x − y)/(π(x − y)) χ_I(y); here I = ∪_j (a_{2j−1}, a_{2j}) is a finite union of intervals and χ_I(y) is the characteristic function of the set I. In the Gaussian Unitary Ensemble (GUE) the probability that no eigenvalues lie in I is equal to τ(a). Also, τ(a) is a tau-function, and we present a new simplified derivation of the system of nonlinear completely integrable equations (the a_j's are the independent variables) that were first derived by Jimbo, Miwa, Mori, and Sato in 1980. In the case of a single interval these equations are reducible to a Painlevé V equation. For large s we give an asymptotic formula for E_2(n; s), which is the probability in the GUE that exactly n eigenvalues lie in an interval of length s.

These notes provide an introduction to that aspect of the theory of random matrices dealing with the distribution of eigenvalues. To first orient the reader, we present in Sec. II some numerical experiments that illustrate some of the basic aspects of the subject. In Sec. III we introduce the invariant measures for the three "circular ensembles" involving unitary matrices. We also define the level spacing distributions and express these distributions in terms of a particular Fredholm determinant. In Sec. IV we explain how these measures are modified for the orthogonal polynomial ensembles. In Sec. V we discuss the universality of these level spacing distribution functions in a particular scaling limit. The discussion up to this point (with the possible exception of Sec. V) follows the well-known path pioneered by Hua, Wigner, Dyson, Mehta and others who first developed this theory (see, e.g., the reprint volume of Porter (34) and Hua (17)). This, and much more, is discussed in Mehta's book (25), the classic reference in the subject. An important development in random matrices was the discovery by Jimbo, Miwa, Mori, and Sato (21) (hereafter referred to as JMMS) that the basic Fredholm determinant mentioned above is a τ-function in the sense of the Kyoto School.
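The gap probability described above can be approximated numerically in a few lines. The following is a minimal sketch (not from the paper, and not the authors' method): it estimates E_2(0; s), the probability of no GUE eigenvalue in a bulk interval of s mean spacings, by a Nyström discretization of the Fredholm determinant det(1 − K) with Gauss–Legendre quadrature. The function name and discretization size are illustrative; the kernel here is sin π(x − y)/(π(x − y)), the density-one convention.

```python
import numpy as np

def gap_probability(s, m=50):
    """Approximate E_2(0; s) = det(I - K) for the sine kernel on (0, s).

    Convention: unit mean spacing, kernel K(x, y) = sin(pi(x-y)) / (pi(x-y)).
    Nystrom method: discretize K with an m-point Gauss-Legendre rule and take
    the determinant of the symmetrized matrix I - sqrt(w) K sqrt(w).
    """
    # Gauss-Legendre nodes/weights on [-1, 1], mapped to (0, s)
    x, w = np.polynomial.legendre.leggauss(m)
    x = 0.5 * s * (x + 1.0)
    w = 0.5 * s * w
    # np.sinc(t) = sin(pi t) / (pi t), with the removable singularity handled
    K = np.sinc(x[:, None] - x[None, :])
    sw = np.sqrt(w)
    A = sw[:, None] * K * sw[None, :]
    return float(np.linalg.det(np.eye(m) - A))
```

As a sanity check, the result should tend to 1 as s → 0 (a vanishing interval is almost surely empty) and decrease monotonically in s, reflecting the growing chance that some eigenvalue falls in a longer interval.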
Though it has been some twelve years since (21) was published, these results are not widely appreciated by practitioners of random matrices. This is due, no doubt, to the complexity of their paper. The methods of JMMS are methods of discovery; but now that we know the result, simpler proofs can be constructed. In Sec. VI we give such a proof of the JMMS equations. Our proof is a simplification and generalization of Mehta's (27) simplified proof of the single interval case. Our methods also build on the earlier work of Its, Izergin, Korepin, and Slavnov (18) and of Dyson (12). We include in this section a discussion of the connection between the JMMS equations and the integrable Hamiltonian systems that appear in the geometry of quadrics and spectral theory, as developed by Moser (31). This section concludes with a discussion of the case of a single interval (viz., the probability that exactly n eigenvalues lie in a given interval). In this case the JMMS equations can be reduced to a single ordinary differential equation, the Painlevé V equation. Finally, in Sec. VII we discuss the asymptotics, in the case of a large single interval, of the various level spacing distribution functions (4,38,28). In this analysis both the Painlevé representation and new results in Toeplitz/Wiener-Hopf theory are needed to produce these asymptotics. We also give an approach based on the asymptotics of the eigenvalues of the basic linear integral operator (14,25,35). These results are then compared with the continuum model calculations of Dyson (12).
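For orientation, the single-interval reduction mentioned above is commonly quoted in the following form (a standard statement consistent with (21); the normalization t = πs below is one common convention, and the constant c_0 is left unspecified):

```latex
% Sigma-form of Painleve V for the sine-kernel Fredholm determinant.
% With sigma(t) = t \frac{d}{dt} \log D(t) and E_2(0;s) = D(\pi s):
E_2(0;s) \;=\; \exp\!\left( \int_0^{\pi s} \frac{\sigma(t)}{t}\, dt \right),
\qquad
\bigl( t\,\sigma'' \bigr)^2
  \;+\; 4\bigl( t\,\sigma' - \sigma \bigr)
        \bigl( t\,\sigma' - \sigma + (\sigma')^2 \bigr) \;=\; 0,
\qquad
\sigma(t) \sim -\frac{t}{\pi} \quad (t \to 0^+).
% Large-s behavior (leading terms of the asymptotics discussed in Sec. VII):
\log E_2(0;s) \;=\; -\frac{\pi^2 s^2}{8} \;-\; \frac{1}{4}\log s \;+\; c_0 \;+\; o(1),
\qquad s \to \infty.
```

The small-t boundary condition follows from the trace of the sine-kernel operator on a short interval, and the quadratic-in-s² leading term of the large-s expansion is what the Toeplitz/Wiener-Hopf and continuum-model analyses of Sec. VII make precise.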

