
Fundamentals of Information Theory (English Edition; Ministry of Industry and Information Technology Twelfth Five-Year Plan Textbook)

  • Authors (editors): 陳傑, 孫兵, 于澤, 周蔭清
  • Publisher: 北京航空航天大學 (Beihang University Press)
  • ISBN: 9787512419728
  • Publication date: 2016/01/01
  • Binding: Paperback
  • Pages: 153
Price: RMB 29

Synopsis
    Organized around the basic model of a communication system, this textbook by 陳傑, 孫兵, 于澤 and 周蔭清 gives a systematic and comprehensive treatment of the topics that a course on the fundamentals of information theory should cover. The book is divided into two parts, the basic concepts of information theory and applications of information theory, comprising eleven chapters in total. Part 1 covers the introduction, statistical measures of information, discrete sources, lossless coding and data compression, discrete channels and their capacity, and channel coding; Part 2 covers rate distortion, continuous sources, continuous channels and their capacity, maximum entropy and spectrum estimation, and computer simulation experiments. The book distills the authors' nearly ten years of experience teaching information theory in Chinese and English and of related research, and is suitable as a textbook for international undergraduate and graduate students.

About the Authors
Editors: 陳傑, 孫兵, 于澤, 周蔭清

Contents
Chapter 1  Introduction
  1.1  Concept of information
  1.2  History of information theory
  1.3  Information, messages and signals
  1.4  Communication system model
  1.5  Information theory applications
    1.5.1  Electrical engineering (communication theory)
    1.5.2  Computer science (algorithmic complexity)
  Exercises
Chapter 2  Statistical Measure of Information
  2.1  Information of random events
    2.1.1  Self-information
    2.1.2  Conditional self-information
    2.1.3  Mutual information of events
  2.2  Information of discrete random variables
    2.2.1  Entropy of discrete random variables
    2.2.2  Joint entropy
    2.2.3  Conditional entropy
    2.2.4  Mutual information of discrete random variables
  2.3  Relationship between entropy and mutual information
  2.4  Mutual information and entropy of continuous random variables
    2.4.1  Mutual information of continuous random variables
    2.4.2  Entropy of continuous random variables
  Exercises
Chapter 3  Discrete Source and Its Entropy Rate
  3.1  Mathematical model of source
    3.1.1  Discrete source and continuous source
    3.1.2  Simple discrete source and its extension
    3.1.3  Memoryless source and source with memory
  3.2  Discrete memoryless source
    3.2.1  Definition
    3.2.2  Extension of discrete source
  3.3  Discrete stationary source
    3.3.1  Definition
    3.3.2  Entropy rate of discrete stationary source
  3.4  Discrete Markov source
    3.4.1  Markov chain
    3.4.2  Transition probability
    3.4.3  Markov source and its entropy rate
  Exercises
Chapter 4  Lossless Source Coding and Data Compression
  4.1  Asymptotic equipartition property and typical sequences
  4.2  Lossless source coding
    4.2.1  Encoder
    4.2.2  Block code
    4.2.3  Fixed length code
    4.2.4  Variable length code
  4.3  Data compression
    4.3.1  Shannon coding
    4.3.2  Huffman coding
    4.3.3  Fano coding
  Exercises
Chapter 5  Discrete Channel and Its Capacity
  5.1  Mathematical model of channel
  5.2  Discrete memoryless channel
    5.2.1  Mathematical model of discrete memoryless channel
    5.2.2  Simple DMC
    5.2.3  Extension of discrete memoryless channel
  5.3  Channel combination
  5.4  Channel capacity
    5.4.1  Concept of channel capacity
    5.4.2  Channel capacity of several special discrete channels
    5.4.3  Channel capacity of symmetric channels
    5.4.4  Channel capacity of extended DMC
    5.4.5  Channel capacity of independent parallel DMC
    5.4.6  Channel capacity of the sum channel
    5.4.7  Channel capacity of general discrete channels
  Exercises
Chapter 6  Noisy-channel Coding
  6.1  Probability of error
  6.2  Decoding rules
  6.3  Channel coding
    6.3.1  Simple repetition code
    6.3.2  Linear code
  6.4  Noisy-channel coding theorem
  Exercises
Chapter 7  Rate Distortion
  7.1  Quantization
  7.2  Distortion definition
    7.2.1  Distortion function
    7.2.2  Mean distortion
  7.3  Rate distortion function
    7.3.1  Fidelity criterion for given channel
    7.3.2  Definition of rate distortion function
    7.3.3  Property of rate distortion function
  7.4  Rate distortion theorem and the converse
  7.5  The calculation of rate distortion function
  Exercises
Chapter 8  Continuous Source and Its Entropy Rate
  8.1  Continuous source
  8.2  Entropy of continuous source
  8.3  Maximum entropy of continuous source
  8.4  Joint entropy, conditional entropy and mutual information for continuous random variables
  8.5  Entropy rate of continuous source
  8.6  Rate distortion for continuous source
  Exercises
Chapter 9  Continuous Channel and Its Capacity
  9.1  Capacity of continuous channel
    9.1.1  Capacity of discrete-time channel
    9.1.2  Capacity of continuous-time channel
  9.2  The Gaussian channel
  9.3  Band-limited channels
  9.4  Coding theorem for continuous channel
  Exercises
Chapter 10  Maximum Entropy and Spectrum Estimation
  10.1  Maximum entropy probability distribution
    10.1.1  Maximum entropy distribution
    10.1.2  Examples
  10.2  Maximum entropy spectrum estimation
    10.2.1  Burg's max entropy theorem
    10.2.2  Maximum entropy spectrum estimation
  Exercises
Chapter 11  Experiments of Information Theory
  11.1  Measure of information
    11.1.1  Information calculator
    11.1.2  Properties of entropy
  11.2  Simulation of Markov source
  11.3  Performance simulation for source coding
    11.3.1  Shannon coding
    11.3.2  Huffman coding
    11.3.3  Fano coding
  11.4  Simulation of BSC
  11.5  Simulation of the cascade channel
  11.6  Calculation of channel capacity
  11.7  Decoding rules
  11.8  Performance demonstration of channel coding
  References
