Book Details
Principal Component Analysis Networks and Algorithms
Authors: 孔祥玉 (Kong Xiangyu), 胡昌华 (Hu Changhua), 段战胜 (Duan Zhansheng)
Publisher: 科学出版社 (Science Press)
ISBN: 9787030602886
List price: ¥150.00
Synopsis
This book primarily studies the time-domain identification, frequency-domain identification, total least squares identification and its applications, modeling, and fault diagnosis applications of a class of nonlinear systems. The content falls broadly into three parts. The first part covers the basic theory of the Volterra series model for a class of nonlinear systems and introduces its time-domain and frequency-domain analysis methods. The second part studies identification and modeling methods for the Volterra series model, introducing a variety of iterative methods for its time-domain and frequency-domain identification. The third part studies applications of Volterra series time-domain and frequency-domain methods, as well as chaos-based methods, to parameter estimation and fault diagnosis in circuits and other complex systems. Much of the material is new and reflects the latest advances in research and applications in nonlinear system modeling and identification both in China and abroad.
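The summary mentions total least squares (TLS) identification. Purely as a reference point, and not code from the book itself (which develops adaptive neural algorithms for such problems; see Section 4.5 in the contents below), here is a minimal sketch of the classical batch TLS fit via the SVD of the augmented data matrix. The function name and the toy data are illustrative assumptions.

```python
# Minimal sketch of classical total least squares via SVD (illustrative only).
import numpy as np

def tls_fit(X, y):
    """Return the TLS parameter vector w for the errors-in-variables model y ~= X @ w."""
    n = X.shape[1]
    Z = np.column_stack([X, y])      # augmented data matrix [X | y]
    _, _, Vt = np.linalg.svd(Z)      # rows of Vt are right singular vectors
    v = Vt[-1]                       # singular vector of the smallest singular value
    return -v[:n] / v[n]             # TLS solution (assumes v[n] != 0)

# Toy usage: noisy regressors and outputs, true parameters [2.0, -1.0] (assumed data)
rng = np.random.default_rng(0)
X_true = rng.normal(size=(200, 2))
y_true = X_true @ np.array([2.0, -1.0])
X_noisy = X_true + 0.05 * rng.normal(size=X_true.shape)
y_noisy = y_true + 0.05 * rng.normal(size=y_true.shape)
print(tls_fit(X_noisy, y_noisy))     # approximately [2.0, -1.0]
```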
About the Authors
No author biography is currently available for Principal Component Analysis Networks and Algorithms.
Table of Contents
Chapter 1 Introduction 1
1.1 Feature Extraction 1
1.1.1 PCA and Subspace Tracking 1
1.1.2 PCA Neural Networks 3
1.1.3 Extension or Generalization of PCA 4
1.2 Basis for Subspace Tracking 5
1.2.1 Concept of Subspace 5
1.2.2 Subspace Tracking Method 8
1.3 Main Features of This Book 10
1.4 Organization of This Book 11
References 13
Chapter 2 Matrix Analysis Basics 19
2.1 Introduction 19
2.2 Singular Value Decomposition 20
2.2.1 Theorem and Uniqueness of SVD 20
2.2.2 Properties of SVD 22
2.3 Eigenvalue Decomposition 24
2.3.1 Eigenvalue Problem and Eigen Equation 24
2.3.2 Eigenvalue and Eigenvector 25
2.3.3 Eigenvalue Decomposition of Hermitian Matrix 29
2.3.4 Generalized Eigenvalue Decomposition 31
2.4 Rayleigh Quotient and Its Characteristics 34
2.4.1 Rayleigh Quotient 35
2.4.2 Gradient and Conjugate Gradient Algorithm for RQ 35
2.4.3 Generalized Rayleigh Quotient 37
2.5 Matrix Analysis 38
2.5.1 Differential and Integral of Matrix with Respect to Scalar 38
2.5.2 Gradient of Real Function with Respect to Real Vector 39
2.5.3 Gradient Matrix of Real Function 40
2.5.4 Gradient Matrix of Trace Function 42
2.5.5 Gradient Matrix of Determinant 43
2.5.6 Hessian Matrix 44
2.6 Summary 45
References 45
Chapter 3 Neural Networks for Principal Component Analysis 47
3.1 Introduction 47
3.2 Review of Neural-Based PCA Algorithms 48
3.3 Neural-Based PCA Algorithms Foundation 48
3.3.1 Hebbian Learning Rule 48
3.3.2 Oja's Learning Rule 50
3.4 Hebbian/Anti-Hebbian Rule based Principal Component Analysis 51
3.4.1 Subspace Learning Algorithms 52
3.4.2 Generalized Hebbian Algorithm 53
3.4.3 Learning Machine for Adaptive Feature Extraction via PCA 54
3.4.4 The Dot-Product-Decorrelation Algorithm 54
3.4.5 Anti-Hebbian Rule based Principal Component Analysis 54
3.5 Least Mean Squared Error based Principal Component Analysis 57
3.5.1 Least Mean Square Error Reconstruction Algorithm 58
3.5.2 Projection Approximation Subspace Tracking Algorithm 58
3.5.3 Robust RLS Algorithm 59
3.6 Optimization based Principal Component Analysis 60
3.6.1 Novel Information Criterion Algorithm 60
3.6.2 Coupled Principal Component Analysis 61
3.7 Nonlinear Principal Component Analysis 63
3.7.1 Kernel Principal Component Analysis 63
3.7.2 Robust/Nonlinear Principal Component Analysis 65
3.7.3 Autoassociative Network based Nonlinear PCA 67
3.8 Other PCA or Extensions of PCA 68
3.9 Summary 70
References 70
Chapter 4 Neural Networks for Minor Component Analysis 75
4.1 Introduction 75
4.2 Review of Neural Network Based MCA Algorithms 76
4.2.1 Extracting the First Minor Component 77
4.2.2 Oja's Minor Subspace Analysis 79
4.2.3 Self-stabilizing MCA 79
4.2.4 Orthogonal Oja Algorithm 79
4.2.5 Other MCA Algorithms 80
4.3 MCA EXIN Linear Neuron 81
4.3.1 The Sudden Divergence 81
4.3.2 The Instability Divergence 83
4.3.3 The Numerical Divergence 83
4.4 Novel Self-stabilizing MCA Linear Neurons 83
4.4.1 A Self-stabilizing Algorithm for Tracking One MC 84
4.4.2 MS Tracking Algorithm 90
4.4.3 Computer Simulations 92
4.5 Total Least Squares Problem Application 97
4.5.1 A Novel Neural Algorithm for Total Least Squares Filtering 97
4.5.2 Computer Simulations 104
4.6 Summary 105
References 106
Chapter 5 Dual Purpose for Principal and Minor Component Analysis 111
5.1 Introduction 111
5.2 Review of Neural Network Based Dual Purpose Methods 113
5.2.1 Chen's Unified Stabilization Approach 113
5.2.2 Hasan's Self-normalizing Dual Systems 114
5.2.3 Peng's Unified Learning Algorithm to Extract Principal and Minor Components 117
5.2.4 Manton's Dual Purpose Principal and Minor Component Flow 117
5.3 A Novel Dual Purpose Method for Principal and Minor Subspace Tracking 119
5.3.1 Preliminaries 119
5.3.2 A Novel Information Criterion and Its Landscape 121
5.3.3 Dual Purpose Subspace Gradient Flow 126
5.3.4 Global Convergence Analysis 130
5.3.5 Numerical Simulations 131
5.4 Another Novel Dual Purpose Algorithm for Principal and Minor Subspace Analysis 138
5.4.1 The Criterion for PSA and MSA and Its Landscape 138
5.4.2 Dual Purpose Algorithm for PSA and MSA 141
5.4.3 Experimental Results 141
5.5 Summary 145
References 146
Chapter 6 Deterministic Discrete Time System for the Analysis of Iterative Algorithms 149
6.1 Introduction 149
6.2 Review of Performance Analysis Methods for Neural Network Based PCA Algorithms 150
6.2.1 Deterministic Continuous-Time System Method 150
6.2.2 Stochastic Discrete-Time System Method 151
6.2.3 Lyapunov Function Approach 155
6.2.4 Deterministic Discrete-Time System Method 155
6.3 DDT System of a Novel MCA Algorithm 155
6.3.1 Sel
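Since the contents above center on neural network algorithms for PCA, the following is a minimal, self-contained sketch (not taken from the book) of Oja's learning rule (cf. Section 3.3.2), the single-neuron algorithm on which many of the listed methods build. The learning rate, sample count, and covariance matrix are illustrative assumptions.

```python
# Minimal sketch of Oja's learning rule for extracting the first principal component.
import numpy as np

rng = np.random.default_rng(1)
C = np.array([[3.0, 2.0], [2.0, 3.0]])               # assumed true covariance; top eigenvector ~ [1, 1]/sqrt(2)
X = rng.multivariate_normal(np.zeros(2), C, size=5000)

w = rng.normal(size=2)
w /= np.linalg.norm(w)                               # random unit-norm initial weights
eta = 0.01                                           # learning rate (assumed)
for x in X:
    y = w @ x                                        # linear neuron output
    w += eta * y * (x - y * w)                       # Oja's rule: Hebbian term with implicit normalization

print(w)                                             # close to +/-[0.707, 0.707], the principal eigenvector of C
```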