
Support Vector Machines (English edition; Shannon Information Science Classics)

  • Authors: Ingo Steinwart, Andreas Christmann (Germany) | Responsible editors: Chen Liang, Liu Yeqing
  • Publisher: World Publishing Corporation (世圖出版公司)
  • ISBN: 9787519296926
  • Publication date: 2023/01/01
  • Binding: Paperback
  • Pages: 601
Price: RMB 139      Sale price:

Synopsis
    This book explains the principles that make support vector machines (SVMs) a successful modeling and prediction tool for a wide range of applications. It does so by presenting the basic concepts of SVMs together with their latest developments and current research questions. The book analyzes at least three reasons for the success of SVMs: their ability to learn well with only a small number of free parameters, their robustness against several kinds of model violations and outliers, and their computational efficiency in comparison with several other methods.
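    As a rough illustration of these points (not taken from the book), the sketch below fits a soft-margin SVM with a Gaussian (RBF) kernel to synthetic data using the scikit-learn library; the data set, the parameter values C and gamma, and the variable names are assumptions chosen only for demonstration.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Toy binary-classification data: two overlapping Gaussian blobs in the plane.
    X = np.vstack([rng.normal(loc=-1.0, scale=1.0, size=(200, 2)),
                   rng.normal(loc=+1.0, scale=1.0, size=(200, 2))])
    y = np.hstack([np.zeros(200), np.ones(200)])
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Only two free parameters: C trades margin width against training error,
    # and gamma is the inverse width of the Gaussian (RBF) kernel.
    clf = SVC(kernel="rbf", C=1.0, gamma=0.5)
    clf.fit(X_train, y_train)

    print("test accuracy:", clf.score(X_test, y_test))
    print("support vectors:", clf.n_support_.sum())

    The hinge loss and the Gaussian kernel used implicitly here correspond to the margin-based losses and kernels analyzed in Chapters 2, 4, and 8 of the book.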

About the Authors
Ingo Steinwart and Andreas Christmann (Germany) | Responsible editors: Chen Liang, Liu Yeqing

Table of Contents
Preface
Reading Guide
1  Introduction
  1.1  Statistical Learning
  1.2  Support Vector Machines: An Overview
  1.3  History of SVMs and Geometrical Interpretation
  1.4  Alternatives to SVMs
2  Loss Functions and Their Risks
  2.1  Loss Functions: Definition and Examples
  2.2  Basic Properties of Loss Functions and Their Risks
  2.3  Margin-Based Losses for Classification Problems
  2.4  Distance-Based Losses for Regression Problems
  2.5  Further Reading and Advanced Topics
  2.6  Summary
  2.7  Exercises
3  Surrogate Loss Functions (*)
  3.1  Inner Risks and the Calibration Function
  3.2  Asymptotic Theory of Surrogate Losses
  3.3  Inequalities between Excess Risks
  3.4  Surrogates for Unweighted Binary Classification
  3.5  Surrogates for Weighted Binary Classification
  3.6  Template Loss Functions
  3.7  Surrogate Losses for Regression Problems
  3.8  Surrogate Losses for the Density Level Problem
  3.9  Self-Calibrated Loss Functions
  3.10  Further Reading and Advanced Topics
  3.11  Summary
  3.12  Exercises
4  Kernels and Reproducing Kernel Hilbert Spaces
  4.1  Basic Properties and Examples of Kernels
  4.2  The Reproducing Kernel Hilbert Space of a Kernel
  4.3  Properties of RKHSs
  4.4  Gaussian Kernels and Their RKHSs
  4.5  Mercer's Theorem (*)
  4.6  Large Reproducing Kernel Hilbert Spaces
  4.7  Further Reading and Advanced Topics
  4.8  Summary
  4.9  Exercises
5  Infinite-Sample Versions of Support Vector Machines
  5.1  Existence and Uniqueness of SVM Solutions
  5.2  A General Representer Theorem
  5.3  Stability of Infinite-Sample SVMs
  5.4  Behavior for Small Regularization Parameters
  5.5  Approximation Error of RKHSs
  5.6  Further Reading and Advanced Topics
  5.7  Summary
  5.8  Exercises
6  Basic Statistical Analysis of SVMs
  6.1  Notions of Statistical Learning
  6.2  Basic Concentration Inequalities
  6.3  Statistical Analysis of Empirical Risk Minimization
  6.4  Basic Oracle Inequalities for SVMs
  6.5  Data-Dependent Parameter Selection for SVMs
  6.6  Further Reading and Advanced Topics
  6.7  Summary
  6.8  Exercises
7  Advanced Statistical Analysis of SVMs (*)
  7.1  Why Do We Need a Refined Analysis?
  7.2  A Refined Oracle Inequality for ERM
  7.3  Some Advanced Machinery
  7.4  Refined Oracle Inequalities for SVMs
  7.5  Some Bounds on Average Entropy Numbers
  7.6  Further Reading and Advanced Topics
  7.7  Summary
  7.8  Exercises
8  Support Vector Machines for Classification
  8.1  Basic Oracle Inequalities for Classifying with SVMs
  8.2  Classifying with SVMs Using Gaussian Kernels
  8.3  Advanced Concentration Results for SVMs (*)
  8.4  Sparseness of SVMs Using the Hinge Loss
  8.5  Classifying with Other Margin-Based Losses (*)
  8.6  Further Reading and Advanced Topics
  8.7  Summary
  8.8  Exercises
9  Support Vector Machines for Regression
  9.1  Introduction
  9.2  Consistency
  9.3  SVMs for Quantile Regression
  9.4  Numerical Results for Quantile Regression
  9.5  Median Regression with the ε-Insensitive Loss (*)
  9.6  Further Reading and Advanced Topics
  9.7  Summary
  9.8  Exercises
10  Robustness
  10.1  Motivation
  10.2  Approaches to Robust Statistics
  10.3  Robustness of SVMs for Classification
  10.4  Robustness of SVMs for Regression (*)
  10.5  Robust Learning from Bites (*)
  10.6  Further Reading and Advanced Topics
  10.7  Summary
  10.8  Exercises
11  Computational Aspects
  11.1  SVMs, Convex Programs, and Duality
  11.2  Implementation Techniques
  11.3  Determination of Hyperparameters
  11.4  Software Packages
  11.5  Further Reading and Advanced Topics
  11.6  Summary
  11.7  Exercises
12  Data Mining
  12.1  Introduction
  12.2  CRISP-DM Strategy
  12.3  Role of SVMs in Data Mining
  12.4  Software Tools for Data Mining
  12.5  Further Reading and Advanced Topics
  12.6  Summary
  12.7  Exercises
Appendix
  A.1  Basic Equations, Inequalities, and Functions
  A.2  Topology
  A.3  Measure and Integration Theory
    A.3.1  Some Basic Facts
    A.3.2  Measures on Topological Spaces
    A.3.3  Aumann's Measurable Selection Principle
  A.4  Probability Theory and Statistics
    A.4.1  Some Basic Facts
    A.4.2  Some Limit Theorems
    A.4.3  The Weak* Topology and Its Metrization
  A.5  Functional Analysis
    A.5.1  Essentials on Banach Spaces and Linear Operators
    A.5.2  Hilbert Spaces
    A.5.3  The Calculus in Normed Spaces
    A.5.4  Banach Space Valued Integration
    A.5.5  Some Important Banach Spaces
    A.5.6  Entropy Numbers
  A.6  Convex Analysis
    A.6.1  Basic Properties of Convex Functions
    A.6.2  Subdifferential Calculus for Convex Functions
    A.6.3  Some Further Notions of Convexity
    A.6.4  The Fenchel-Legendre Bi-conjugate
    A.6.5  Convex Programs and Lagrange Multipliers
  A.7  Complex Analysis
  A.8  Inequalities Involving Rademacher Sequences
  A.9  Talagrand's Inequality
References
Notation and Symbols
Abbreviations
Author Index
Subject Index
