
The Nature of Statistical Learning Theory (Second Edition, English Edition; Shannon Information Science Classics)

  • Author: Vladimir Vapnik (USA) | Commissioning editor: Chen Liang
  • Publisher: World Publishing Corporation
  • ISBN: 9787519296858
  • Publication date: 2023/01/01
  • Binding: Paperback
  • Pages: 314
List price: RMB 99

Synopsis
    Statistical learning theory studies the laws of statistical learning in the small-sample setting. An important extension of and complement to classical statistics, it provides a theoretical framework for the theory and methods of machine learning from limited samples; its core idea is to control the generalization ability of a learning machine by controlling the machine's capacity. The support vector machine, a new general-purpose learning machine developed within this theory, has shown many theoretical and practical advantages over earlier methods. This book is the authoritative work in the field: the theory's founder presents the essence of statistical learning theory, focusing on the key ideas, results, and methods of statistical learning theory and support vector machines, as well as recent advances in the field.
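The capacity-control idea in the synopsis can be made concrete with a linear soft-margin SVM: minimizing a regularized hinge loss trades empirical fit against the norm of the weight vector, which bounds the machine's capacity. Below is a minimal sketch in plain Python; the toy dataset, step size `eta`, and regularization constant `lam` are illustrative assumptions, not values from the book.

```python
# Linear soft-margin SVM trained by batch gradient descent on the
# regularized hinge loss:  lam/2 * ||w||^2 + (1/n) * sum(max(0, 1 - y*f(x))).
# The regularizer lam * ||w||^2 / 2 is what controls the machine's capacity.

def train_linear_svm(data, lam=0.01, eta=0.05, epochs=2000):
    w, b = [0.0, 0.0], 0.0
    n = len(data)
    for _ in range(epochs):
        # gradient of the regularization term
        gw, gb = [lam * w[0], lam * w[1]], 0.0
        for (x1, x2), y in data:
            if y * (w[0] * x1 + w[1] * x2 + b) < 1:  # point inside the margin
                gw[0] -= y * x1 / n                  # hinge-loss subgradient
                gw[1] -= y * x2 / n
                gb -= y / n
        w = [w[0] - eta * gw[0], w[1] - eta * gw[1]]
        b -= eta * gb
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# Illustrative 2-D toy data: two linearly separable classes.
points = [((0.0, 0.5), -1), ((1.0, 1.0), -1), ((0.5, 1.5), -1),
          ((3.0, 3.0), 1), ((2.5, 3.5), 1), ((4.0, 2.5), 1)]
w, b = train_linear_svm(points)
```

Shrinking `lam` enlarges the admissible function class (higher capacity, tighter empirical fit), while increasing it constrains the weight norm and hence the capacity: the trade-off between empirical risk and generalization ability that the synopsis describes.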

About the Author
Vladimir Vapnik (USA) | Commissioning editor: Chen Liang

Contents
Preface to the Second Edition
Preface to the First Edition
Introduction: Four Periods in the Research of the Learning Problem
  Rosenblatt's Perceptron (The 1960s)
  Construction of the Fundamentals of Learning Theory (The 1960s–1970s)
  Neural Networks (The 1980s)
  Returning to the Origin (The 1990s)
Chapter 1  Setting of the Learning Problem
  1.1  Function Estimation Model
  1.2  The Problem of Risk Minimization
  1.3  Three Main Learning Problems
    1.3.1  Pattern Recognition
    1.3.2  Regression Estimation
    1.3.3  Density Estimation (Fisher–Wald Setting)
  1.4  The General Setting of the Learning Problem
  1.5  The Empirical Risk Minimization (ERM) Inductive Principle
  1.6  The Four Parts of Learning Theory
  1.7  The Classical Paradigm of Solving Learning Problems
    1.7.1  Density Estimation Problem (Maximum Likelihood Method)
    1.7.2  Pattern Recognition (Discriminant Analysis) Problem
    1.7.3  Regression Estimation Model
    1.7.4  Narrowness of the ML Method
  1.8  Nonparametric Methods of Density Estimation
    1.8.1  Parzen's Windows
    1.8.2  The Problem of Density Estimation Is Ill-Posed
  1.9  Main Principle for Solving Problems Using a Restricted Amount of Information
  1.10  Model Minimization of the Risk Based on Empirical Data
    1.10.1  Pattern Recognition
    1.10.2  Regression Estimation
    1.10.3  Density Estimation
  1.11  Stochastic Approximation Inference
Chapter 2  Consistency of Learning Processes
  2.1  The Classical Definition of Consistency and the Concept of Nontrivial Consistency
  2.2  The Key Theorem of Learning Theory
    2.2.1  Remark on the ML Method
  2.3  Necessary and Sufficient Conditions for Uniform Two-Sided Convergence
    2.3.1  Remark on Law of Large Numbers and Its Generalization
    2.3.2  Entropy of the Set of Indicator Functions
    2.3.3  Entropy of the Set of Real Functions
    2.3.4  Conditions for Uniform Two-Sided Convergence
  2.4  Necessary and Sufficient Conditions for Uniform One-Sided Convergence
  2.5  Theory of Nonfalsifiability
    2.5.1  Kant's Problem of Demarcation and Popper's Theory of Nonfalsifiability
  2.6  Theorems on Nonfalsifiability
    2.6.1  Case of Complete (Popper's) Nonfalsifiability
    2.6.2  Theorem on Partial Nonfalsifiability
    2.6.3  Theorem on Potential Nonfalsifiability
  2.7  Three Milestones in Learning Theory
Informal Reasoning and Comments
  2.8  The Basic Problems of Probability Theory and Statistics
    2.8.1  Axioms of Probability Theory

  2.9  Two Modes of Estimating a Probability Measure
  ……
Chapter 3  Bounds on the Rate of Convergence of Learning Processes
Chapter 4  Controlling the Generalization Ability of Learning Processes
Chapter 5  Methods of Pattern Recognition
Chapter 6  Methods of Function Estimation
Chapter 7  Direct Methods in Statistical Learning Theory
Chapter 8  The Vicinal Risk Minimization Principle and the SVMs
Chapter 9  Conclusion: What Is Important in Learning Theory?
References
Index
