Book Details
Elements of Information Theory
Authors: Thomas M. Cover and Joy A. Thomas (USA)
Publisher: Tsinghua University Press
Publication date: 2003-11-01
ISBN: 9787302072850
List price: ¥65.00
Synopsis
Elements of Information Theory systematically introduces the fundamentals of information theory and their applications in communication theory, statistics, computer science, probability theory, and investment theory. Proceeding step by step, the authors present the basic definitions of information measures, relative entropy, and mutual information, and show how these quantities arise naturally in problems of data compression, channel capacity, rate distortion, statistical hypothesis testing, and network information flow. The book also explores many topics rarely covered in other textbooks, including: the connection between the second law of thermodynamics and Markov chains; the optimality of Huffman coding; the duality of data compression; Lempel-Ziv coding; Kolmogorov complexity; portfolio theory; and information-theoretic inequalities and their mathematical consequences. It can serve as a textbook or reference for advanced undergraduates and graduate students in communications, electronics, computer science, automatic control, statistics, economics, and related fields, and as a reference for researchers and practitioners in these areas.
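To make the core quantities concrete, here is a minimal Python sketch (not from the book) that computes entropy and mutual information for a small made-up joint distribution, using the identity I(X;Y) = H(X) + H(Y) − H(X,Y):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution (iterable of probabilities)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution of two binary variables (X, Y): (x, y) -> probability.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y, obtained by summing out the other variable.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_x = entropy(px.values())      # H(X)
h_y = entropy(py.values())      # H(Y)
h_xy = entropy(joint.values())  # H(X, Y)

# Mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y); positive here because X and Y are correlated.
mi = h_x + h_y - h_xy
print(h_x, h_y, h_xy, mi)
```

For this distribution both marginals are uniform, so H(X) = H(Y) = 1 bit, and the mutual information is strictly positive, reflecting the dependence between X and Y.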
About the Author
Thomas M. Cover is a professor in the Departments of Electrical Engineering and Statistics at Stanford University. He is a past president of the IEEE Information Theory Society, a fellow of the Institute of Mathematical Statistics, and a senior member of the IEEE. His 1972 paper "Broadcast Channels" received the Information Theory Outstanding Paper Award, and in 1990 he was selected as the Shannon Lecturer, the highest honor in the field of information theory. Over the past twenty years he has focused on the relationship between information theory and statistics.
Table of Contents
List of Figures
1 Introduction and Preview
1.1 Preview of the book
2 Entropy, Relative Entropy and Mutual Information
2.1 Entropy
2.2 Joint entropy and conditional entropy
2.3 Relative entropy and mutual information
2.4 Relationship between entropy and mutual information
2.5 Chain rules for entropy, relative entropy and mutual information
2.6 Jensen's inequality and its consequences
2.7 The log sum inequality and its applications
2.8 Data processing inequality
2.9 The second law of thermodynamics
2.10 Sufficient statistics
2.11 Fano's inequality
Summary of Chapter 2
Problems for Chapter 2
Historical notes
3 The Asymptotic Equipartition Property
3.1 The AEP
3.2 Consequences of the AEP: data compression
3.3 High probability sets and the typical set
Summary of Chapter 3
Problems for Chapter 3
Historical notes
4 Entropy Rates of a Stochastic Process
4.1 Markov chains
4.2 Entropy rate
4.3 Example: Entropy rate of a random walk on a weighted graph
4.4 Hidden Markov models
Summary of Chapter 4
Problems for Chapter 4
Historical notes
5 Data Compression
5.1 Examples of codes
5.2 Kraft inequality
5.3 Optimal codes
5.4 Bounds on the optimal codelength
5.5 Kraft inequality for uniquely decodable codes
5.6 Huffman codes
5.7 Some comments on Huffman codes
5.8 Optimality of Huffman codes
5.9 Shannon-Fano-Elias coding
5.10 Arithmetic coding
5.11 Competitive optimality of the Shannon code
5.12 Generation of discrete distributions from fair coins
Summary of Chapter 5
Problems for Chapter 5
Historical notes
6 Gambling and Data Compression
6.1 The horse race
6.2 Gambling and side information
6.3 Dependent horse races and entropy rate
6.4 The entropy of English
6.5 Data compression and gambling
6.6 Gambling estimate of the entropy of English
Summary of Chapter 6
Problems for Chapter 6
Historical notes
7 Kolmogorov Complexity
7.1 Models of computation
7.2 Kolmogorov complexity: definitions and examples
7.3 Kolmogorov complexity and entropy
7.4 Kolmogorov complexity of integers
7.5 Algorithmically random and incompressible sequences
7.6 Universal probability
7.7 The halting problem and the non-computability of Kolmogorov complexity
7.8 Ω
7.9 Universal gambling
7.10 Occam's razor
7.11 Kolmogorov complexity and universal probability
7.12 The Kolmogorov sufficient statistic
Summary of Chapter 7
Problems for Chapter 7
Historical notes
8 Channel Capacity
8.1 Examples of channel capacity
8.2 Symmetric channels
8.3 Properties of channel capacity
8.4 Preview of the channel coding theorem
8.5 Definitions
8.6 Jointly typical sequences
8.7 The channel coding theorem
8.8 Zero-error codes
8.9 Fano's inequality and the converse to the coding theorem
8.10 Equality in the converse to the channel coding theorem
8.11 Hamming codes
8.12 Feedback capacity
8.13 The joint source channel coding theorem
Summary of Chapter 8
Problems for Chapter 8
Historical notes
9 Differential Entropy
9.1 Definitions
9.2 The AEP for continuous random variables
9.3 Relation of differential entropy to discrete entropy
9.4 Joint and conditional differential entropy
9.5 Relative entropy and mutual information
9.6 Properties of differential entropy, relative entropy and mutual information
9.7 Differential entropy bound on discrete entropy
Summary of Chapter 9
Problems for Chapter 9
Historical notes
10 The Gaussian Channel
10.1 The Gaussian channel: definitions
10.2 Converse to the coding theorem for Gaussian channels
10.3 Band-limited channels
10.4 Parallel Gaussian channels
10.5 Channels with colored Gaussian noise
10.6 Gaussian channels with feedback
Summary of Chapter 10
Problems for Chapter 10
Historical notes
11 Maximum Entropy and Spectral Estimation
11.1 Maximum entropy distributions
11.2 Examples
11.3 An anomalous maximum entropy problem
11.4 Spectrum estimation
11.5 Entropy rates of a Gaussian process
11.6 Burg's maximum entropy theorem
Summary of Chapter 11
Problems for Chapter 11
Historical notes
12 Information Theory and Statistics
12.1 The method of types
12.2 The law of large numbers
12.3 Universal source coding
12.4 Large deviation theory
12.5 Examples of Sanov's theorem
12.6 The conditional limit theorem
12.7 Hypothesis testing
12.8 Stein's lemma
12.9 Chernoff bound
12.10 Lempel-Ziv coding
12.11 Fisher information and the Cramér-Rao inequality
Summary of Chapter 12
Problems for Chapter 12
Historical notes
13 Rate Distortion Theory
13.1 Quantization
13.2 Definitions
13.3 Calculation of the rate distortion function
13.4 Converse to the rate distortion theorem
13.5 Achievability of the rate distortion function
13.6 Strongly typical sequences and rate distortion
13.7 Characterization of the rate distortion function
13.8 Computation of channel capacity and the rate distortion function
Summary of Chapter 13
Problems for Chapter 13
Historical notes
14 Network Information Theory
14.1 Gaussian multiple user channels
14.2 Jointly typical sequences
14.3 The multiple access channel
14.4 Encoding of correlated sources
14.5 Duality between Slepian-Wolf encoding and multiple access channels
14.6 The broadcast channel
14.7 The relay channel
14.8 Source coding with side information
14.9 Rate distortion with side information
14.10 General multiterminal networks
Summary of Chapter 14
Problems for Chapter 14
Historical notes
15 Information Theory and the Stock Market
15.1 The stock market: some definitions
15.2 Kuhn-Tucker characterization of the log-optimal portfolio
15.3 Asymptotic optimality of the log-optimal portfolio
15.4 Side information and the doubling rate
15.5 Investment in stationary markets
15.6 Competitive optimality of the log-optimal portfolio
15.7 The Shannon-McMillan-Breiman theorem
Summary of Chapter 15
Problems for Chapter 15
Historical notes
16 Inequalities in Information Theory
16.1 Basic inequalities of information theory
16.2 Differential entropy
16.3 Bounds on entropy and relative entropy
16.4 Inequalities for types
16.5 Entropy rates of subsets
16.6 Entropy and Fisher information
16.7 The entropy power inequality and the Brunn-Minkowski inequality
16.8 Inequalities for determinants
16.9 Inequalities for ratios of determinants
Overall Summary
Problems for Chapter 16
Historical notes
Bibliography
List of Symbols
Index