Information theory and the central limit theorem
Bibliographic details
Main author: Johnson, Oliver (Oliver Thomas) (author)
Format: Electronic eBook
Language: English
Published: London: Imperial College Press, ©2004
Subjects:
Links: http://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&db=nlabk&AN=130010
Description: Includes bibliographical references (pages 199-206) and index
Table of contents:
Information Theory and The Central Limit Theorem
Preface
Contents
1. Introduction to Information Theory
2. Convergence in Relative Entropy
3. Non-Identical Variables and Random Vectors
4. Dependent Random Variables
5. Convergence to Stable Laws
6. Convergence on Compact Groups
7. Convergence to the Poisson Distribution
8. Free Random Variables
Appendix A. Calculating Entropies
Appendix B. Poincaré Inequalities
Appendix C. de Bruijn Identity
Appendix D. Entropy Power Inequality
Appendix E. Relationships Between Different Forms of Convergence
Bibliography
Index
This book provides a comprehensive description of a new method of proving the central limit theorem, through the use of apparently unrelated results from information theory. It gives a basic introduction to the concepts of entropy and Fisher information, and collects together standard results concerning their behaviour. It brings together results from a number of research papers as well as unpublished material, showing how the techniques can give a unified view of limit theorems.
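The theme summarized above, that standardized sums of independent variables converge to the normal distribution in relative entropy, can be illustrated numerically. The sketch below is not taken from the book; it uses a crude histogram estimate of the relative entropy D(S_n || Z) between a standardized sum of n i.i.d. Uniform(0,1) variables and a standard normal Z, and all function names, sample sizes, and bin counts are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the book's method): estimate the relative
# entropy D(S_n || Z) of a standardized sum S_n of i.i.d. uniforms against a
# standard normal Z, and watch it shrink as n grows.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def kl_to_standard_normal(samples, bins=200):
    """Crude histogram estimate of D(P || N(0,1)) from samples of P."""
    counts, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p = counts * widths              # empirical probability mass per bin
    q = norm.pdf(centers) * widths   # approximate normal mass per bin
    mask = (p > 0) & (q > 0)
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

num_samples = 200_000
for n in (1, 2, 5, 20, 100):
    # Uniform(0,1) has mean 1/2 and variance 1/12, so the standardized sum is
    # S_n = (sum - n/2) / sqrt(n/12).
    u = rng.uniform(size=(num_samples, n))
    s_n = (u.sum(axis=1) - n / 2) / np.sqrt(n / 12)
    print(f"n = {n:3d}: estimated D(S_n || Z) ~ {kl_to_standard_normal(s_n):.4f}")
```

Running this prints estimates that decrease toward zero as n increases, a numerical counterpart of the convergence-in-relative-entropy statement of the central limit theorem.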
Physical description: 1 online resource (xiv, 209 pages)
ISBN: 1860944736
1860945376
9781860944734
9781860945373