In Fall 2021, I took a course on measure-theoretic probability with the great Persi Diaconis. Persi continually suggested books and articles throughout the class, which I have gathered here along with some of his comments.

For more in this vein, see my similar article on Nassim Taleb’s probability library.

General probability theory:

- Patrick Billingsley, *Probability and Measure*. This was the main textbook we used in class.
- Leo Breiman, *Probability*. A very good book for regular conditional probability.
- Hogg, McKean, and Craig, *Introduction to Mathematical Statistics*. Diaconis’s favorite elementary probability book.
- Billingsley, *Convergence of Probability Measures*. Very readable reference for weak convergence on metric spaces.
- Kallenberg, *Foundations of Modern Probability* (3rd ed.). “A book on the shelf of every modern probabilist.” Technical, not many stories.
- Dudley, *Real Analysis and Probability*. Good combination of history / stories and rigour.
- Feller, Volumes I and II. Full of stories and a classic.

Remarks on the Lebesgue integral:

- For “nice” functions, the Lebesgue integral agrees with Riemann.
- Sometimes (e.g. for Dirichlet’s function, the indicator of the rationals) the Lebesgue integral exists when the Riemann integral doesn’t.
- We still need Riemann for some improper integrals! \(\int_0^\infty \frac{\sin(x)}{x}\, dx\) exists as an improper Riemann integral but has no Lebesgue integral, since \(\int_0^\infty \frac{|\sin(x)|}{x}\, dx = \infty\). Similarly, \(\sum_{j=1}^\infty \frac{(-1)^j}{j}\) converges conditionally but breaks under Lebesgue integration (against counting measure), since \(\sum_{j=1}^\infty \frac{1}{j} = \infty\).
- Riemann is also what we use for computations!
- The Henstock integral combines the best of both worlds (see the American Math Monthly article on this, presumably the one by Bartle).
- Thus “no serious analytical probabilist would throw out the Riemann integral.”
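
A quick numerical check of the \(\sin(x)/x\) example (my own sketch; the midpoint-rule integrator and the step counts are ad hoc choices, not from the course):

```python
import math

def midpoint_integral(f, upper, steps=400_000):
    # midpoint-rule Riemann sum of f over (0, upper]
    dx = upper / steps
    return sum(f((k + 0.5) * dx) for k in range(steps)) * dx

sinc = lambda x: math.sin(x) / x
# The improper Riemann integral converges (to pi/2 = 1.5707...):
for upper in (10, 100, 1000):
    print(upper, midpoint_integral(sinc, upper))
# ...but the absolute integral, which Lebesgue integrability requires,
# grows without bound, roughly like (2/pi) * log(upper):
for upper in (10, 100, 1000):
    print(upper, midpoint_integral(lambda x: abs(math.sin(x)) / x, upper))
```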

Remarks on the strong law of large numbers:

- Etemadi’s elementary proof of the strong law of large numbers uses what Diaconis calls the “4 Ts argument”. Each of the Ts has “legs”, i.e. is useful in many other places. They are:
- Truncation
- Tchebyshev
- inTerpolation
- (T)subsequences

- Proving \(\frac{S_n}{n} \overset{a.s.}{\to} \mu\) is beautiful, clear, and has absolutely no real-world implication. It tells you nothing quantifiable about any particular *n*, which is what you would want in practice. The literature has next to nothing on this! At least Chebyshev tells you something for a particular *n*.

Remarks on Poisson approximation and Stein’s method:

- References:
- Arratia, Goldstein, Gordon. “Poisson Approximation and the Chen-Stein Method”, *Statistical Science*, 1990. Many good examples!
- Chatterjee, Diaconis, Meckes. “Exchangeable pairs and Poisson approximation”. *Probability Surveys*, 2005. This gives Stein’s method of exchangeable pairs.
- Sourav Chatterjee, “A short survey of Stein’s method”. ICM proceedings, 2014. A more recent readable survey.
- Chen, Goldstein, Shao. *Normal Approximation by Stein’s Method*. Springer, 2011.
- Barbour, Holst, Janson. *Poisson Approximation*. Oxford University Press, 1992.
- (For a better bound) Barbour and Eagleson. “Poisson Approximation for Some Statistics Based on Exchangeable Trials”. *Advances in Applied Probability*, 1983.


Remarks on the central limit theorem:

- Sourav Chatterjee, “A generalization of the Lindeberg principle”. *Annals of Probability*, 2006. Shows that the main idea of Lindeberg’s proof of the CLT is very general and can be extended.

- S. D. Chatterji, “Lindeberg’s central limit theorem à la Hausdorff”. *Expositiones Mathematicae*, 2007.
- Fang and Koike, “High-dimensional central limit theorems by Stein’s method”. *Annals of Applied Probability*, 2021.
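
A toy version of the Lindeberg comparison (my own illustration, with a test function chosen so both expectations are closed-form): compare \(\mathbb{E}\cos(S_n/\sqrt{n})\) for \(S_n\) a sum of \(n\) independent \(\pm 1\) signs against \(\mathbb{E}\cos(Z)\) for \(Z\) standard normal. The shrinking gap is exactly the kind of quantity Lindeberg’s swapping argument controls for smooth test functions:

```python
import math

# Since E exp(i * eps / sqrt(n)) = cos(1/sqrt(n)) for eps = +/-1,
# independence gives E cos(S_n/sqrt(n)) = cos(1/sqrt(n))**n.
# For Z standard normal, E cos(Z) = exp(-1/2).
gauss = math.exp(-0.5)
for n in (4, 16, 64, 256):
    rademacher = math.cos(1 / math.sqrt(n)) ** n
    print(n, abs(rademacher - gauss))  # gap shrinks like a constant over n
```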

Remarks on Fourier analysis:

- Diaconis is a “user, consumer, and developer” of Fourier analysis on noncommutative groups — famously for his theorem that “seven shuffles suffice” to randomize a deck of cards.
- Diaconis, *Group Representations in Probability and Statistics*.
- Feller, Volume 2, Chapter 15 is one of the best treatments of characteristic functions.
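
The “seven shuffles” computation itself is short enough to reproduce. Bayer and Diaconis showed that after \(k\) riffle shuffles of \(n\) cards, a permutation with \(r\) rising sequences has probability \(\binom{n + 2^k - r}{n} / 2^{nk}\); summing over Eulerian numbers gives the exact total variation distance to uniform. A sketch (the exact-arithmetic details are my own choices):

```python
from fractions import Fraction
from math import comb, factorial

def eulerian(n):
    # E[k] = number of permutations of {1..n} with k descents, k = 0..n-1;
    # equivalently, E[r-1] permutations have exactly r rising sequences.
    E = [1]
    for m in range(2, n + 1):
        E = [(k + 1) * (E[k] if k < len(E) else 0)
             + (m - k) * (E[k - 1] if k >= 1 else 0)
             for k in range(m)]
    return E

def riffle_tv(n=52, k=7):
    # Bayer-Diaconis: a k-fold riffle shuffle is an a-shuffle with a = 2^k,
    # under which a permutation with r rising sequences has probability
    # comb(n + a - r, n) / a^n.  Total variation distance to uniform:
    a = 2 ** k
    E = eulerian(n)
    tv = Fraction(0)
    for r in range(1, n + 1):
        p = Fraction(comb(n + a - r, n), a ** n)
        tv += E[r - 1] * abs(p - Fraction(1, factorial(n)))
    return float(tv / 2)

for k in range(1, 11):
    print(k, round(riffle_tv(52, k), 3))
```

The distance stays near 1 through four shuffles and then drops sharply, with the cutoff around \(k = 7\).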

Edgeworth corrections and small sample asymptotics:

- Uses Fourier techniques to get better bounds on CLT results. “Hard, honest work.”
- See Bhattacharya and Rao, *Normal Approximation and Asymptotic Expansions*, or Field and Ronchetti, *Small Sample Asymptotics*.
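
For orientation, the first-order Edgeworth correction (stated here from memory, so worth checking against Bhattacharya and Rao) for i.i.d. \(X_i\) with mean \(\mu\), variance \(\sigma^2\), skewness \(\rho_3 = \mathbb{E}[(X_1-\mu)^3]/\sigma^3\), and a non-lattice distribution:

\[
P\left(\frac{S_n - n\mu}{\sigma\sqrt{n}} \le x\right) = \Phi(x) - \varphi(x)\,\frac{\rho_3\,(x^2 - 1)}{6\sqrt{n}} + O\!\left(\frac{1}{n}\right).
\]

Extracting the \(1/\sqrt{n}\) term (and the higher-order ones) is where the Fourier “hard, honest work” comes in.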

Some scattered notes from office hours:

- How does one generate a random contingency table given certain marginals?
- See e.g. Diaconis and Gangolli, Rectangular Arrays with Fixed Margins.
- “Hit and run” algorithms
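
The simplest such walk can be sketched (my own sketch of the classic \(2\times 2\) “checkerboard” move; the variable names and rejection rule are my choices): repeatedly pick two rows and two columns and add \(\pm 1\) in a pattern that preserves every margin, rejecting moves that would make an entry negative. Since the proposal is symmetric, the uniform distribution on tables with the given margins is stationary.

```python
import random

random.seed(0)

def swap_step(table):
    # Pick rows i, j and columns a, b, then add the pattern
    #   +s -s
    #   -s +s
    # to that 2x2 minor, which preserves all row and column sums.
    # Reject (stay put) if any entry would go negative.
    i, j = random.sample(range(len(table)), 2)
    a, b = random.sample(range(len(table[0])), 2)
    s = random.choice((1, -1))
    if min(table[i][a] + s, table[j][b] + s,
           table[i][b] - s, table[j][a] - s) >= 0:
        table[i][a] += s; table[j][b] += s
        table[i][b] -= s; table[j][a] -= s

t = [[3, 1, 2], [2, 4, 0]]
for _ in range(10_000):
    swap_step(t)
print(t, [sum(r) for r in t])  # row sums [6, 6] are unchanged
```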

- People care about the Fisher-Yates distribution
- Diaconis and Efron, Generalized variance of the multinomial and Fisher-Yates distributions.
- Diaconis and Efron, Testing for Independence in a Two-Way Table: New Interpretations of the Chi-Square Statistic.

- Diaconis, Holmes, Shahshahani, Sampling From A Manifold.
- A perfectly legitimate question — how do you do it?

- D’Aristotile, Diaconis, and Freedman, On Merging of Probabilities.
- Diaconis and Skyrms, *10 Great Ideas About Chance*.
- Janos Galambos
- There is a kind of CLT for Brownian motion (Donsker’s theorem).

Miscellany:

- Kechris, *Classical Descriptive Set Theory*. A beautiful book about a now-fading field of research.

- Jeffrey Lagarias, “Euler’s constant: Euler’s work and modern developments”. *Bulletin of the AMS*.
- A very nice article on Euler’s constant (which we used several times in class in approximations).

- Stephen Stigler’s biographical work on Laplace — recommended.