Book Details
Linear Estimation (English Reprint Edition)
Authors: Thomas Kailath, Ali H. Sayed, Babak Hassibi
Publisher: Xi'an Jiaotong University Press
Publication date: 2008-12-01
ISBN: 9787560529493
List price: ¥98.00
About the Book
This book treats estimation problems for finite-dimensional linear systems described by state-space models, covering many aspects of what are now the well-established fields of Wiener and Kalman filtering. Three features distinguish the book: first, a geometric point of view pervades the analysis; second, many of the algorithms are presented in square-root/array form; third, the concepts of equivalence and duality are emphasized in solving related problems in adaptive filtering, estimation, and control. The book consists of 17 chapters of main text plus six appendices (A through F), organized around the following topics:
★ Overview and fundamentals (Chapters 1-5)
★ Estimation of stationary processes (Chapters 6-8)
★ Estimation of nonstationary processes (Chapters 9-10)
★ Fast array algorithms (Chapters 11-13)
★ Continuous-time estimation (Chapter 16)
★ Advanced topics (Chapters 14, 15, 17)
The book is suitable for graduate students and researchers in control, communications, digital signal processing, geophysics, econometrics, statistics, and related fields.
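For orientation, here is a minimal sketch (in Python with NumPy, not taken from the book and not in its square-root/array form) of one predict/update cycle of the standard discrete-time Kalman filter that the text develops; the function kalman_step and the scalar random-walk example are illustrative assumptions. The measurement update is written in terms of the innovations e, the quantity around which much of the book's treatment is organized.

import numpy as np

def kalman_step(x, P, y, F, H, Q, R):
    # Time update (prediction).
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Measurement update via the innovations e and their covariance Re.
    e = y - H @ x_pred
    Re = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(Re)   # Kalman gain
    x_new = x_pred + K @ e
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_new, P_new

# Hypothetical example: a scalar random-walk state observed in white noise.
F = H = np.array([[1.0]])
Q, R = np.array([[0.01]]), np.array([[0.1]])
x, P = np.array([0.0]), np.array([[1.0]])
rng = np.random.default_rng(0)
for _ in range(5):
    y = np.array([1.0]) + 0.3 * rng.standard_normal(1)
    x, P = kalman_step(x, P, y, F, H, Q, R)
print(x, P)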
About the Authors
Dr. Thomas Kailath is a professor at Stanford University and a world-renowned authority on control and system science. He is a member of the U.S. National Academy of Sciences and the National Academy of Engineering, a member of the Third World Academy of Sciences and of the Indian National Academy of Engineering, and an IEEE Fellow. His research interests span information theory, communication systems, computation, control, linear systems, statistical signal processing, and VLSI, and he is also the author of the well-known text Linear Systems (Prentice-Hall, 1980). Professor Kailath has made lasting contributions in many research areas; he received the highest society award of the IEEE Signal Processing Society in 1991 and the Shannon Award of the IEEE Information Theory Society in 2000. He is also a distinguished educator: many of the doctoral students and postdoctoral scholars he has supervised have gone on to make outstanding contributions in their own fields.
Dr. Ali H. Sayed is a professor of electrical engineering at the University of California, Los Angeles (UCLA), and an IEEE Fellow. His research interests include adaptive filtering, statistical signal processing, and estimation algorithms.
Dr. Babak Hassibi is a professor of electrical engineering at the California Institute of Technology; he worked at Bell Laboratories from 1998 to 2000. His research interests include communications, signal processing, and control.
Contents
Preface
Symbols
1 OVERVIEW
1.1 The Asymptotic Observer
1.2 The Optimum Transient Observer
1.2.1 The Mean-Square-Error Criterion
1.2.2 Minimization via Completion of Squares
1.2.3 The Optimum Transient Observer
1.2.4 The Kalman Filter
1.3 Coming Attractions
1.3.1 Smoothed Estimators
1.3.2 Extensions to Time-Variant Models
1.3.3 Fast Algorithms for Time-Invariant Systems
1.3.4 Numerical Issues
1.3.5 Array Algorithms
1.3.6 Other Topics
1.4 The Innovations Process
1.4.1 Whiteness of the Innovations Process
1.4.2 Innovations Representations
1.4.3 Canonical Covariance Factorization
1.4.4 Exploiting State-Space Structure for Matrix Problems
1.5 Steady-State Behavior
1.5.1 Appropriate Solutions of the DARE
1.5.2 Wiener Filters
1.5.3 Convergence Results
1.6 Several Related Problems
1.6.1 Adaptive RLS Filtering
1.6.2 Linear Quadratic Control
1.6.3 H∞ Estimation
1.6.4 H∞ Adaptive Filtering
1.6.5 H∞ Control
1.6.6 Linear Algebra and Matrix Theory
1.7 Complements
Problems
2 DETERMINISTIC LEAST-SQUARES PROBLEMS
2.1 The Deterministic Least-Squares Criterion
2.2 The Classical Solutions
2.2.1 The Normal Equations
2.2.2 Weighted Least-Squares Problems
2.2.3 Statistical Assumptions on the Noise
2.3 A Geometric Formulation: The Orthogonality Condition
2.3.1 The Projection Theorem in Inner Product Spaces
2.3.2 Geometric Insights
2.3.3 Projection Matrices
2.3.4 An Application: Order-Recursive Least-Squares
2.4 Regularized Least-Squares Problems
2.5 An Array Algorithm: The QR Method
2.6 Updating Least-Squares Solutions: RLS Algorithms
2.6.1 The RLS Algorithm
2.6.2 An Array Algorithm for RLS
2.7 Downdating Least-Squares Solutions
2.8 Some Variations of Least-Squares Problems
2.8.1 The Total Least-Squares Criterion
2.8.2 Criteria with Bounds on Data Uncertainties
2.9 Complements
Problems
2.A On Systems of Linear Equations
3 STOCHASTIC LEAST-SQUARES PROBLEMS
3.1 The Problem of Stochastic Estimation
3.2 Linear Least-Mean-Squares Estimators
3.2.1 The Fundamental Equations
3.2.2 Stochastic Interpretation of Triangular Factorization
3.2.3 Singular Data Covariance Matrices
3.2.4 Nonzero-Mean Values and Centering
3.2.5 Estimators for Complex-Valued Random Variables
3.3 A Geometric Formulation
3.3.1 The Orthogonality Condition
3.3.2 Examples
3.4 Linear Models
3.4.1 Information Forms When Rx > 0 and Rv > 0
3.4.2 The Gauss-Markov Theorem
3.4.3 Combining Estimators
3.5 Equivalence to Deterministic Least-Squares
3.6 Complements
Problems
3.7 Least-Mean-Squares Estimation
3.8 Gaussian Random Variables
3.9 Optimal Estimation for Gaussian Variables
4 THE INNOVATIONS PROCESS
4.1 Estimation of Stochastic Processes
4.1.1 The Fixed Interval Smoothing Problem
4.1.2 The Causal Filtering Problem
4.1.3 The Wiener-Hopf Technique
4.1.4 A Note on Terminology: Vectors and Gramians
4.2 The Innovations Process
4.2.1 A Geometric Approach
4.2.2 An Algebraic Approach
4.2.3 The Modified Gram-Schmidt Procedure
4.2.4 Estimation Given the Innovations Process
4.2.5 The Filtering Problem via the Innovations Approach
4.2.6 Computational Issues
4.3 Innovations Approach to Deterministic Least-Squares Problems
4.4 The Exponentially Correlated Process
4.4.1 Triangular Factorization of Ry
4.4.2 Finding L⁻¹ and the Innovations
4.4.3 Innovations via the Gram-Schmidt Procedures
4.5 Complements
Problems
4.6 Linear Spaces, Modules, and Gramians
5 STATE-SPACE MODELS
5.1 The Exponentially Correlated Process
5.1.1 Finite Interval Problems; Initial Conditions for Stationarity
5.1.2 Innovations from the Process Model
5.2 Going Beyond the Stationary Case
5.2.1 Stationary Processes
5.2.2 Nonstationary Processes
5.3 Higher-Order Processes and State-Space Models
5.3.1 Autoregressive Processes
5.3.2 Handling Initial Conditions
5.3.3 State-Space Descriptions
5.3.4 The Standard State-Space Model
5.3.5 Examples of Other State-Space Models
5.4 Wide-Sense Markov Processes
5.4.1 Forwards Markovian Models
5.4.2 Backwards Markovian Models
5.4.3 Backwards Models from Forwards Models
5.4.4 Markovian Representations and the Standard Model
5.5 Complements
Problems
5.6 Some Global Formulas
6 INNOVATIONS FOR STATIONARY PROCESSES
6.1 Innovations via Spectral Factorization
6.1.1 Stationary Processes
6.1.2 Generating Functions and z-Spectra
6.2 Signals and Systems
6.2.1 The z-Transform
6.2.2 Linear Time-Invariant Systems
6.2.3 Causal, Anticausal, and Minimum-Phase Systems
6.3 Stationary Random Processes
6.3.1 Properties of the z-Spectrum
6.3.2 Linear Operations on Stationary Stochastic Processes
6.4 Canonical Spectral Factorization
6.5 Scalar Rational z-Spectra
6.6 Vector-Valued Stationary Processes
6.7 Complements
Problems
6.8 Continuous-Time Systems and Processes
7 WIENER THEORY FOR SCALAR PROCESSES
7.1 Continuous-Time Wiener Smoothing
7.1.1 The Geometric Formulation
7.1.2 Solution via Fourier Transforms
7.1.3 The Minimum Mean-Square Error
7.1.4 Filtering Signals out of Noisy Measurements
7.1.5 Comparison with the Ideal Filter
7.2 The Continuous-Time Wiener-Hopf Equation
7.3 Discrete-Time Problems
7.3.1 The Discrete-Time Wiener Smoother
7.3.2 The Discrete-Time Wiener-Hopf Equation
7.4 The Discrete-Time Wiener-Hopf Technique
7.5 Causal Parts Via Partial Fractions
7.6 Important Special Cases and Examples
7.6.1 Pure Prediction
7.6.2 Additive White Noise
……
8 RECURSIVE WIENER FILTERING
9 THE KALMAN FILTER
10 SMOOTHED ESTIMATORS
11 FAST ALGORITHMS
12 ARRAY ALGORITHMS
13 FAST ARRAY ALGORITHMS
14 ASYMPTOTIC BEHAVIOR
15 DUALITY AND EQUIVALENCE IN ESTIMATION AND CONTROL
16 CONTINUOUS-TIME STATE-SPACE ESTIMATION
17 A SCATTERING THEORY APPROACH
A USEFUL MATRIX RESULTS
B UNITARY AND J-UNITARY TRANSFORMATIONS
C SOME SYSTEM THEORY CONCEPTS
D LYAPUNOV EQUATIONS
E ALGEBRAIC RICCATI EQUATIONS
F DISPLACEMENT STRUCTURE