
Information Theory and the Central Limit Theorem (English edition, Shannon Information Science Classics series)

  • Author: Oliver Johnson (UK) | Editors: Chen Liang / Liu Yeqing
  • Publisher: 世圖出版公司 (World Publishing Corporation)
  • ISBN: 9787519296872
  • Publication date: 2023/01/01
  • Binding: Paperback
  • Pages: 209
Price: RMB 59

Synopsis
    This book takes a seemingly unrelated line of research from information theory and uses it to give a novel proof of the central limit theorem, described here in full. It first gives the reader a basic introduction to the concepts of entropy and Fisher information, and then verifies them against a series of standard results on their behaviour. Through the author's distinctive construction and demonstration, the two apparently unconnected fields of information theory and the central limit theorem are skilfully linked, an instance of genuinely cross-disciplinary research. The book also gathers results from published and unpublished papers, showing how these techniques yield a unified view of limit theorems.
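The synopsis describes the entropy-based view of the central limit theorem: convolving i.i.d. variables drives the entropy of the standardised sum up toward the Gaussian maximum. A minimal Monte-Carlo sketch of that phenomenon (not code from the book; the histogram entropy estimator and Uniform(0,1) summands are illustrative choices):

```python
import math
import random

def entropy_estimate(samples, bins=60):
    """Histogram (plug-in) estimate of differential entropy, in nats."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in samples:
        idx = min(int((x - lo) / width), bins - 1)
        counts[idx] += 1
    total = len(samples)
    h = 0.0
    for c in counts:
        if c:
            p = c / total
            h -= p * math.log(p / width)  # equals -sum p log p + log(width)
    return h

random.seed(0)
N = 100_000
entropies = {}
for n in (1, 2, 8):
    # Standardised sum S_n = (X_1 + ... + X_n - n/2) / sqrt(n/12), X_i ~ U(0,1)
    sums = [(sum(random.random() for _ in range(n)) - n / 2) / math.sqrt(n / 12)
            for _ in range(N)]
    entropies[n] = entropy_estimate(sums)
    print(f"n={n}: h approx {entropies[n]:.3f}")

# The standard Gaussian maximises entropy among unit-variance densities:
print(f"Gaussian: h = {0.5 * math.log(2 * math.pi * math.e):.3f}")  # = 1.419
```

The estimated entropies increase with n and approach 0.5·log(2πe) ≈ 1.419 nats, the entropy of the standard Gaussian; monotonicity of entropy along convolutions is exactly the "second law" analogy the book develops.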

About the Author
Oliver Johnson (UK) | Editors: Chen Liang / Liu Yeqing

Contents
Preface
1.Introduction to Information Theory
  1.1  Entropy and relative entropy
    1.1.1  Discrete entropy
    1.1.2  Differential entropy
    1.1.3  Relative entropy
    1.1.4  Other entropy-like quantities
    1.1.5  Axiomatic definition of entropy
  1.2  Link to thermodynamic entropy
    1.2.1  Definition of thermodynamic entropy
    1.2.2  Maximum entropy and the Second Law
  1.3  Fisher information
    1.3.1  Definition and properties
    1.3.2  Behaviour on convolution
  1.4  Previous information-theoretic proofs
    1.4.1  Rényi's method
    1.4.2  Convergence of Fisher information
2.Convergence in Relative Entropy
  2.1  Motivation
    2.1.1  Sandwich inequality
    2.1.2  Projections and adjoints
    2.1.3  Normal case
    2.1.4  Results of Brown and Barron
  2.2  Generalised bounds on projection eigenvalues
    2.2.1  Projection of functions in L2
    2.2.2  Restricted Poincaré constants
    2.2.3  Convergence of restricted Poincaré constants
  2.3  Rates of convergence
    2.3.1  Proof of O(1/n) rate of convergence
    2.3.2  Comparison with other forms of convergence
    2.3.3  Extending the Cramér-Rao lower bound
3.Non-Identical Variables and Random Vectors
  3.1  Non-identical random variables
    3.1.1  Previous results
    3.1.2  Improved projection inequalities
  3.2  Random vectors
    3.2.1  Definitions
    3.2.2  Behaviour on convolution
    3.2.3  Projection inequalities
4.Dependent Random Variables
  4.1  Introduction and notation
    4.1.1  Mixing coefficients
    4.1.2  Main results
  4.2  Fisher information and convolution
  4.3  Proof of subadditive relations
    4.3.1  Notation and definitions
    4.3.2  Bounds on densities
    4.3.3  Bounds on tails
    4.3.4  Control of the mixing coefficients
5.Convergence to Stable Laws
  5.1  Introduction to stable laws
    5.1.1  Definitions
    5.1.2  Domains of attraction
    5.1.3  Entropy of stable laws
  5.2  Parameter estimation for stable distributions
    5.2.1  Minimising relative entropy
    5.2.2  Minimising Fisher information distance
    5.2.3  Matching logarithm of density
  5.3  Extending de Bruijn's identity
    5.3.1  Partial differential equations
    5.3.2  Derivatives of relative entropy
    5.3.3  Integral form of the identities
  5.4  Relationship between forms of convergence
  5.5  Steps towards a Brown inequality
6.Convergence on Compact Groups
  6.1  Probability on compact groups
    6.1.1  Introduction to topological groups
    6.1.2  Convergence of convolutions
    6.1.3  Conditions for uniform convergence
  6.2  Convergence in relative entropy
    6.2.1  Introduction and results
    6.2.2  Entropy on compact groups
  6.3  Comparison of forms of convergence
  6.4  Proof of convergence in relative entropy
    6.4.1  Explicit rate of convergence
    6.4.2  No explicit rate of convergence
7.Convergence to the Poisson Distribution
  7.1  Entropy and the Poisson distribution
    7.1.1  The law of small numbers
    7.1.2  Simplest bounds on relative entropy
  7.2  Fisher information
    7.2.1  Standard Fisher information
    7.2.2  Scaled Fisher information
    7.2.3  Dependent variables
  7.3  Strength of bounds
  7.4  De Bruijn identity
  7.5  L2 bounds on Poisson distance
    7.5.1  L2 definitions
    7.5.2  Sums of Bernoulli variables
    7.5.3  Normal convergence
8.Free Random Variables
  8.1  Introduction to free variables
    8.1.1  Operators and algebras
    8.1.2  Expectations and Cauchy transforms
  8.2  Derivations and conjugate functions
    8.2.1  Derivations
    8.2.2  Fisher information and entropy
  8.3  Projection inequalities
Appendix A  Calculating Entropies
  A.1  Gamma distribution
  A.2  Stable distributions
Appendix B  Poincaré Inequalities
  B.1  Standard Poincaré inequalities
  B.2  Weighted Poincaré inequalities
Appendix C  de Bruijn Identity
Appendix D  Entropy Power Inequality
Appendix E  Relationships Between Different Forms of Convergence
  E.1  Convergence in relative entropy to the Gaussian
  E.2  Convergence to other variables
  E.3  Convergence in Fisher information
Bibliography
Index

Copyright © 1999~2008 美商天龍國際圖書股份有限公司 臺灣分公司. All rights reserved.