Preface

1. Introduction to Information Theory
   1.1 Entropy and relative entropy
       1.1.1 Discrete entropy
       1.1.2 Differential entropy
       1.1.3 Relative entropy
       1.1.4 Other entropy-like quantities
       1.1.5 Axiomatic definition of entropy
   1.2 Link to thermodynamic entropy
       1.2.1 Definition of thermodynamic entropy
       1.2.2 Maximum entropy and the Second Law
   1.3 Fisher information
       1.3.1 Definition and properties
       1.3.2 Behaviour on convolution
   1.4 Previous information-theoretic proofs
       1.4.1 Rényi's method
       1.4.2 Convergence of Fisher information

2. Convergence in Relative Entropy
   2.1 Motivation
       2.1.1 Sandwich inequality
       2.1.2 Projections and adjoints
       2.1.3 Normal case
       2.1.4 Results of Brown and Barron
   2.2 Generalised bounds on projection eigenvalues
       2.2.1 Projection of functions in L2
       2.2.2 Restricted Poincaré constants
       2.2.3 Convergence of restricted Poincaré constants
   2.3 Rates of convergence
       2.3.1 Proof of O(1/n) rate of convergence
       2.3.2 Comparison with other forms of convergence
       2.3.3 Extending the Cramér-Rao lower bound

3. Non-Identical Variables and Random Vectors
   3.1 Non-identical random variables
       3.1.1 Previous results
       3.1.2 Improved projection inequalities
   3.2 Random vectors
       3.2.1 Definitions
       3.2.2 Behaviour on convolution
       3.2.3 Projection inequalities

4. Dependent Random Variables
   4.1 Introduction and notation
       4.1.1 Mixing coefficients
       4.1.2 Main results
   4.2 Fisher information and convolution
   4.3 Proof of subadditive relations
       4.3.1 Notation and definitions
       4.3.2 Bounds on densities
       4.3.3 Bounds on tails
       4.3.4 Control of the mixing coefficients

5. Convergence to Stable Laws
   5.1 Introduction to stable laws
       5.1.1 Definitions
       5.1.2 Domains of attraction
       5.1.3 Entropy of stable laws
   5.2 Parameter estimation for stable distributions
       5.2.1 Minimising relative entropy
       5.2.2 Minimising Fisher information distance
       5.2.3 Matching logarithm of density
   5.3 Extending de Bruijn's identity
       5.3.1 Partial differential equations
       5.3.2 Derivatives of relative entropy
       5.3.3 Integral form of the identities
   5.4 Relationship between forms of convergence
   5.5 Steps towards a Brown inequality

6. Convergence on Compact Groups
   6.1 Probability on compact groups
       6.1.1 Introduction to topological groups
       6.1.2 Convergence of convolutions
       6.1.3 Conditions for uniform convergence
   6.2 Convergence in relative entropy
       6.2.1 Introduction and results
       6.2.2 Entropy on compact groups
   6.3 Comparison of forms of convergence
   6.4 Proof of convergence in relative entropy
       6.4.1 Explicit rate of convergence
       6.4.2 No explicit rate of convergence

7. Convergence to the Poisson Distribution
   7.1 Entropy and the Poisson distribution
       7.1.1 The law of small numbers
       7.1.2 Simplest bounds on relative entropy
   7.2 Fisher information
       7.2.1 Standard Fisher information
       7.2.2 Scaled Fisher information
       7.2.3 Dependent variables
   7.3 Strength of bounds
   7.4 De Bruijn identity
   7.5 L2 bounds on Poisson distance
       7.5.1 L2 definitions
       7.5.2 Sums of Bernoulli variables
       7.5.3 Normal convergence

8. Free Random Variables
   8.1 Introduction to free variables
       8.1.1 Operators and algebras
       8.1.2 Expectations and Cauchy transforms
   8.2 Derivations and conjugate functions
       8.2.1 Derivations
       8.2.2 Fisher information and entropy
   8.3 Projection inequalities

Appendix A  Calculating Entropies
   A.1 Gamma distribution
   A.2 Stable distributions

Appendix B  Poincaré Inequalities
   B.1 Standard Poincaré inequalities
   B.2 Weighted Poincaré inequalities

Appendix C  de Bruijn Identity

Appendix D  Entropy Power Inequality

Appendix E  Relationships Between Different Forms of Convergence
   E.1 Convergence in relative entropy to the Gaussian
   E.2 Convergence to other variables
   E.3 Convergence in Fisher information

Bibliography

Index