
Information Theory and Reliable Communication (English Edition)

  • Author: Robert G. Gallager (USA) | Editors: Chen Liang // Xia Dan
  • Publisher: World Publishing Corporation
  • ISBN: 9787519275945
  • Publication date: 2020/07/01
  • Binding: Paperback
  • Pages: 588
Price: RMB 148

Synopsis
    Information Theory and Reliable Communication is a bible of information theory written by Robert G. Gallager, a Nobel-caliber titan of the information field; generation after generation of information theorists grew up reading this world classic. At MIT the author studied under Claude E. Shannon, the founder of information theory, as well as two of the earliest Shannon Award winners, Robert M. Fano and Peter Elias, and he has taught at MIT ever since completing his doctorate, earning a reputation as the greatest information theorist since Shannon. The low-density parity-check (LDPC) codes proposed in his 1960 doctoral thesis are channel codes used in every 5G device today. Much of the material in this book was original work first published by the author, and it greatly advanced the development of information theory. The book studies in depth the mathematical models of sources and channels in communication systems, and explores a framework for building detailed models of real-world sources and channels. The author then develops the principles of information theory by splitting the encoder and the decoder each into two parts, and examines the mechanisms that make up an effective communication system. It is suitable as an information theory textbook for senior undergraduates and graduate students in electrical engineering, computer science, and mathematics, and as a reference for researchers and professionals. The "Shannon Classics in Information Science" series also includes two other famous books by Professor Gallager: Principles of Digital Communication and Data Networks (2nd Edition).
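The parity-check codes treated in Chapter 6 can be illustrated with their smallest nontrivial member, the (7,4) Hamming code. The sketch below uses one standard systematic choice of generator and parity-check matrices (illustrative only, not matrices taken from the book) to encode four message bits and correct a single bit error by syndrome decoding:

```python
# A minimal sketch of a (7,4) Hamming parity-check code over GF(2).
# The matrices are one standard systematic choice, not from the book.

G = [  # generator matrix, systematic form [I4 | P]
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
H = [  # parity-check matrix [P^T | I3]; H c = 0 for every codeword c
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def encode(msg):
    """Map 4 message bits to a 7-bit codeword over GF(2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def decode(word):
    """Correct at most one bit error via the syndrome; return the message."""
    syndrome = [sum(h * w for h, w in zip(row, word)) % 2 for row in H]
    if any(syndrome):
        # The syndrome equals the column of H at the error position.
        err = [list(col) for col in zip(*H)].index(syndrome)
        word = word[:]
        word[err] ^= 1
    return word[:4]

msg = [1, 0, 1, 1]
cw = encode(msg)
cw[2] ^= 1                  # introduce a single bit error
assert decode(cw) == msg    # the decoder corrects it
```

Every nonzero syndrome matches exactly one column of H, which is what makes single-error correction possible; Gallager's LDPC codes generalize this same H-matrix structure to large, sparse matrices.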

About the Author
Robert G. Gallager (USA) | Editors: Chen Liang // Xia Dan
    Professor Robert G. Gallager is a member of both the U.S. National Academy of Sciences and the National Academy of Engineering. He served as president of the IEEE Information Theory Society; he received the Shannon Award, the highest honor in information theory (comparable to a Nobel Prize of the field), in 1983, the IEEE Medal of Honor (comparable to a Nobel Prize of electrical engineering) in 1990, the Marconi Prize (comparable to a Nobel Prize of communications) in 2003, and the Japan Prize (comparable to a Nobel Prize spanning applied science) in 2020. Professor Gallager received his doctorate from MIT in 1960 and has taught there ever since. The low-density parity-check (LDPC) codes proposed in his 1960 doctoral thesis are channel codes used in every 5G device today, and his doctoral student Erdal Arikan invented the polar code, another important channel code in 5G communication.

Contents
1  Communication Systems and Information Theory
  1.1  Introduction
  1.2  Source Models and Source Coding
  1.3  Channel Models and Channel Coding
  Historical Notes and References
2  A Measure of Information
  2.1  Discrete Probability: Review and Notation
  2.2  Definition of Mutual Information
  2.3  Average Mutual Information and Entropy
  2.4  Probability and Mutual Information for Continuous Ensembles
  2.5  Mutual Information for Arbitrary Ensembles
  Summary and Conclusions
  Historical Notes and References
3  Coding for Discrete Sources
  3.1  Fixed-Length Codes
  3.2  Variable-Length Code Words
  3.3  A Source Coding Theorem
  3.4  An Optimum Variable-Length Encoding Procedure
  3.5  Discrete Stationary Sources
  3.6  Markov Sources
  Summary and Conclusions
  Historical Notes and References
4  Discrete Memoryless Channels and Capacity
  4.1  Classification of Channels
  4.2  Discrete Memoryless Channels
  4.3  The Converse to the Coding Theorem
  4.4  Convex Functions
  4.5  Finding Channel Capacity for a Discrete Memoryless Channel
  4.6  Discrete Channels with Memory
    Indecomposable Channels
  Summary and Conclusions
  Historical Notes and References
  Appendix 4A
5  The Noisy-Channel Coding Theorem
  5.1  Block Codes
  5.2  Decoding Block Codes
  5.3  Error Probability for Two Code Words
  5.4  The Generalized Chebyshev Inequality and the Chernoff Bound
  5.5  Randomly Chosen Code Words
  5.6  Many Code Words-The Coding Theorem
    Properties of the Random Coding Exponent
  5.7  Error Probability for an Expurgated Ensemble of Codes
  5.8  Lower Bounds to Error Probability
    Block Error Probability at Rates above Capacity
  5.9  The Coding Theorem for Finite-State Channels
    State Known at Receiver
  Summary and Conclusions
  Historical Notes and References
  Appendix 5A
  Appendix 5B

6  Techniques for Coding and Decoding
  6.1  Parity-Check Codes
    Generator Matrices
    Parity-Check Matrices for Systematic Parity-Check Codes
    Decoding Tables
    Hamming Codes
  6.2  The Coding Theorem for Parity-Check Codes
  6.3  Group Theory
    Subgroups
    Cyclic Subgroups
  6.4  Fields and Polynomials
    Polynomials
  6.5  Cyclic Codes
  6.6  Galois Fields
    Maximal Length Codes and Hamming Codes
    Existence of Galois Fields
  6.7  BCH Codes
    Iterative Algorithm for Finding σ(D)
  6.8  Convolutional Codes and Threshold Decoding
  6.9  Sequential Decoding
    Computation for Sequential Decoding
    Error Probability for Sequential Decoding
  6.10  Coding for Burst Noise Channels
    Cyclic Codes
    Convolutional Codes
  Summary and Conclusions
  Historical Notes and References
  Appendix 6A
  Appendix 6B
7  Memoryless Channels with Discrete Time
  7.1  Introduction
  7.2  Unconstrained Inputs
  7.3  Constrained Inputs
  7.4  Additive Noise and Additive Gaussian Noise
    Additive Gaussian Noise with an Energy Constrained Input
  7.5  Parallel Additive Gaussian Noise Channels
  Summary and Conclusions
  Historical Notes and References
8  Waveform Channels
  8.1  Orthonormal Expansions of Signals and White Gaussian Noise
    Gaussian Random Processes
    Mutual Information for Continuous-Time Channels
  8.2  White Gaussian Noise and Orthogonal Signals
    Error Probability for Two Code Words
    Error Probability for Orthogonal Code Words
  8.3  Heuristic Treatment of Capacity for Channels with Additive Gaussian Noise and Bandwidth Constraints
  8.4  Representation of Linear Filters and Nonwhite Noise
    Filtered Noise and the Karhunen-Loeve Expansion
    Low-Pass Ideal Filters

  8.5  Additive Gaussian Noise Channels with an Input Constrained in Power and Frequency
  8.6  Fading Dispersive Channels
  Summary and Conclusions
  Historical Notes and References
9  Source Coding with a Fidelity Criterion
  9.1  Introduction
  9.2  Discrete Memoryless Sources and Single-Letter Distortion Measures
  9.3  The Coding Theorem for Sources with a Fidelity Criterion
  9.4  Calculation of R(d*)
  9.5  The Converse to the Noisy-Channel Coding Theorem Revisited
  9.6  Discrete-Time Sources with Continuous Amplitudes
  9.7  Gaussian Sources with Square Difference Distortion
    Gaussian Random-Process Sources
  9.8  Discrete Ergodic Sources
  Summary and Conclusions
  Historical Notes and References
  Exercises and Problems
  References and Selected Reading
  Glossary of Symbols
Index

Copyright © 1999~2008 美商天龍國際圖書股份有限公司 臺灣分公司. All rights reserved.