Principal Component Analysis Networks and Algorithms
  • ISBN: 9787030602886
  • Authors: 孔祥玉 (Xiangyu Kong), 胡昌华 (Changhua Hu), 段战胜 (Zhansheng Duan)
  • Binding: paperback
  • Format: B5
  • Pages: 311
  • Language: English
  • Publisher: 科学出版社 (Science Press)
  • List price: ¥150.00
  • Sale price: ¥150.00
  • Medium: print on demand

Contents

    Chapter 1 Introduction 1
    1.1 Feature Extraction 1
    1.1.1 PCA and Subspace Tracking 1
    1.1.2 PCA Neural Networks 3
    1.1.3 Extension or Generalization of PCA 4
    1.2 Basis for Subspace Tracking 5
    1.2.1 Concept of Subspace 5
    1.2.2 Subspace Tracking Method 8
    1.3 Main Features of This Book 10
    1.4 Organization of This Book 11
    References 13
    Chapter 2 Matrix Analysis Basics 19
    2.1 Introduction 19
    2.2 Singular Value Decomposition 20
    2.2.1 Theorem and Uniqueness of SVD 20
    2.2.2 Properties of SVD 22
    2.3 Eigenvalue Decomposition 24
    2.3.1 Eigenvalue Problem and Eigen Equation 24
    2.3.2 Eigenvalue and Eigenvector 25
    2.3.3 Eigenvalue Decomposition of Hermitian Matrix 29
    2.3.4 Generalized Eigenvalue Decomposition 31
    2.4 Rayleigh Quotient and Its Characteristics 34
    2.4.1 Rayleigh Quotient 35
    2.4.2 Gradient and Conjugate Gradient Algorithm for RQ 35
    2.4.3 Generalized Rayleigh Quotient 37
    2.5 Matrix Analysis 38
    2.5.1 Differential and Integral of Matrix with Respect to Scalar 38
    2.5.2 Gradient of Real Function with Respect to Real Vector 39
    2.5.3 Gradient Matrix of Real Function 40
    2.5.4 Gradient Matrix of Trace Function 42
    2.5.5 Gradient Matrix of Determinant 43
    2.5.6 Hessian Matrix 44
    2.6 Summary 45
    References 45
    Chapter 3 Neural Networks for Principal Component Analysis 47
    3.1 Introduction 47
    3.2 Review of Neural-Based PCA Algorithms 48
    3.3 Neural-Based PCA Algorithms Foundation 48
    3.3.1 Hebbian Learning Rule 48
    3.3.2 Oja's Learning Rule 50
    3.4 Hebbian/Anti-Hebbian Rule based Principal Component Analysis 51
    3.4.1 Subspace Learning Algorithms 52
    3.4.2 Generalized Hebbian Algorithm 53
    3.4.3 Learning Machine for Adaptive Feature Extraction via PCA 54
    3.4.4 The Dot-Product-Decorrelation Algorithm 54
    3.4.5 Anti-Hebbian Rule based Principal Component Analysis 54
    3.5 Least Mean Squared Error based Principal Component Analysis 57
    3.5.1 Least Mean Square Error Reconstruction Algorithm 58
    3.5.2 Projection Approximation Subspace Tracking Algorithm 58
    3.5.3 Robust RLS Algorithm 59
    3.6 Optimization based Principal Component Analysis 60
    3.6.1 Novel Information Criterion Algorithm 60
    3.6.2 Coupled Principal Component Analysis 61
    3.7 Nonlinear Principal Component Analysis 63
    3.7.1 Kernel Principal Component Analysis 63
    3.7.2 Robust/Nonlinear Principal Component Analysis 65
    3.7.3 Autoassociative Network based Nonlinear PCA 67
    3.8 Other PCA or Extensions of PCA 68
    3.9 Summary 70
    References 70
    Chapter 4 Neural Networks for Minor Component Analysis 75
    4.1 Introduction 75
    4.2 Review of Neural Network Based MCA Algorithms 76
    4.2.1 Extracting the First Minor Component 77
    4.2.2 Oja's Minor Subspace Analysis 79
    4.2.3 Self-stabilizing MCA 79
    4.2.4 Orthogonal Oja Algorithm 79
    4.2.5 Other MCA Algorithm 80
    4.3 MCA EXIN Linear Neuron 81
    4.3.1 The Sudden Divergence 81
    4.3.2 The Instability Divergence 83
    4.3.3 The Numerical Divergence 83
    4.4 Novel Self-stabilizing MCA Linear Neurons 83
    4.4.1 A Self-stabilizing Algorithm for Tracking One MC 84
    4.4.2 MS Tracking Algorithm 90
    4.4.3 Computer Simulations 92
    4.5 Total Least Squares Problem Application 97
    4.5.1 A Novel Neural Algorithm for Total Least Squares Filtering 97
    4.5.2 Computer Simulations 104
    4.6 Summary 105
    References 106
    Chapter 5 Dual Purpose for Principal and Minor Component Analysis 111
    5.1 Introduction 111
    5.2 Review of Neural Network Based Dual Purpose Methods 113
    5.2.1 Chen's Unified Stabilization Approach 113
    5.2.2 Hasan's Self-normalizing Dual Systems 114
    5.2.3 Peng's Unified Learning Algorithm to Extract Principal and Minor Components 117
    5.2.4 Manton's Dual Purpose Principal and Minor Component Flow 117
    5.3 A Novel Dual Purpose Method for Principal and Minor Subspace Tracking 119
    5.3.1 Preliminaries 119
    5.3.2 A Novel Information Criterion and Its Landscape 121
    5.3.3 Dual Purpose Subspace Gradient Flow 126
    5.3.4 Global Convergence Analysis 130
    5.3.5 Numerical Simulations 131
    5.4 Another Novel Dual Purpose Algorithm for Principal and Minor Subspace Analysis 138
    5.4.1 The Criterion for PSA and MSA and Its Landscape 138
    5.4.2 Dual Purpose Algorithm for PSA and MSA 141
    5.4.3 Experimental Results 141
    5.5 Summary 145
    References 146
    Chapter 6 Deterministic Discrete-Time System for the Analysis of Iterative Algorithms 149
    6.1 Introduction 149
    6.2 Review of Performance Analysis Methods for Neural Network Based PCA Algorithms 150
    6.2.1 Deterministic Continuous-Time System Method 150
    6.2.2 Stochastic Discrete-Time System Method 151
    6.2.3 Lyapunov Function Approach 155
    6.2.4 Deterministic Discrete-Time System Method 155
    6.3 DDT System of a Novel MCA Algorithm 155
    6.3.1 Self-stabilizing MCA Extraction Algorithms 155
    6.3.2 Convergence Analysis via DDT System 156
    6.3.3 Computer Simulations 165
    6.4 DDT System of a Unified PCA and MCA Algorithm 167
    6.4.1 Introduction 167
    6.4.2 A Unified Self-stabilizing Algorithm for PCA and MCA 168
    6.4.3 Convergence Analysis 168
    6.4.4 Computer Simulations 180
    6.5 Summary 182
    References 182
    Chapter 7 Generalized Principal Component Analysis 185
    7.1 Introduction 185
    7.2 Review of Generalized Feature Extraction Algorithm 188
    7.2.1 Mathew's Quasi-Newton Algorithm for Generalized Symmetric Eigenvalue Problem 188
    7.2.2 Self-organizing Algorithms for Generalized Eigen Decomposition 189
    7.2.3 Fast RLS-Like Algorithm for Generalized Eigen Decomposition 190
    7.2.4 Generalized Eigenvector Extraction Algorithm Based on RLS Method 191
    7.2.5 Fast Adaptive Algorithm for the Generalized Symmetric Eigenvalue Problem 194
    7.2.6 Fast Generalized Eigenvector Tracking Based on the Power Method 196
    7.2.7 Generalized Eigenvector Extraction Algorithm Based on Newton Method 198
    7.2.8 Online Algorithms for Extracting Minor Generalized Eigenvector 199
    7.3 A Novel Principal Generalized Eigenvector Extraction Algorithm 201
    7.3.1 Algorithm Description 201
    7.3.2 Convergence Analysis 202
    7.3.3 Computer Simulations 209
    7.4 Novel Multiple GMC Extraction Algorithm 213
    7.4.1 A Weighted Information Criterion 214
    7.4.2 Multiple GMCs Extraction Algorithm 220
    7.4.3 Simulations and Application Experiments 220
    7.5 Summary 223
    References 223
    Chapter 8 Coupled Principal Component Analysis 227
    8.1 Introduction 227
    8.2 Review of Coupled Principal Component Analysis 229
    8.2.1 Moller's Coupled PCA Algorithm 229
    8.2.2 Nguyen's Coupled Generalized Eigen-pairs Extraction Algorithm 230
    8.2.3 Coupled Singular Value Decomposition of a Cross-covariance Matrix 234
    8.3 Unified and Coupled Algorithm for Minor and Principal Eigen-pair Extraction 235
    8.3.1 Coupled Dynamical System 235
    8.3.2 The Unified and Coupled Learning Algorithms 237
    8.3.3 Analysis of Convergence and Self-stabilizing Property 241
    8.3.4 Simulation Experiments 242
    8.4 Adaptive Coupled Generalized Eigen-pairs Extraction Algorithms 248
    8.4.1 A Coupled Generalized System for GMCA and GPCA 248
    8.4.2 Adaptive Implementation of Coupled Generalized Systems 253
    8.4.3 Convergence Analysis 255
    8.4.4 Numerical Examples 261
    8.5 Summary 268
    References 268
    Chapter 9 Singular Feature Extraction and Its Neural Network 270
    9.1 Introduction 270
    9.2 Review of Cross-correlation Feature Method 272
    9.2.1 Cross-correlation Neural Networks Model and Deflation Method 272
    9.2.2 Parallel SVD Learning Algorithms on Double Stiefel Manifold 275
    9.2.3 Double Generalized Hebbian Algorithm (DGHA) for SVD 277
    9.2.4 Cross-associative Neural Network for SVD (CANN) 277
    9.2.5 Coupled SVD of a Cross-Covariance Matrix 279
    9.3 An Effective Neural Learning Algorithm for Extracting Cross-Correlation Feature 282
    9.3.1 Preliminaries 283
    9.3.2 Novel Information Criterion Formulation for PSS 284
    9.3.3 Adaptive Learning Algorithm and Performance Analysis 291
    9.3.4 Computer Simulations 294
    9.4 Coupled Cross-correlation Neural Network Algorithm for Principal Singular Triplet Extraction of a Cross-Covariance Matrix 299
    9.4.1 A Novel Information Criterion and a Coupled System 299
    9.4.2 Online Implementation and Stability Analysis 302
    9.4.3 Simulation Experiments 303
    9.5 Summary 307
    References 308
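
The table of contents above centers on Hebbian-type neural PCA learning rules, such as Oja's rule (Sect. 3.3.2) and the Generalized Hebbian Algorithm (Sect. 3.4.2). As a rough illustration of the kind of algorithm the book analyzes, below is a minimal NumPy sketch of Sanger's Generalized Hebbian Algorithm; it is not code from the book, and the function name, learning rate, epoch count, and toy data are illustrative assumptions.

```python
import numpy as np

def generalized_hebbian_algorithm(X, n_components=2, lr=0.01, n_epochs=200, seed=0):
    """Minimal sketch of Sanger's Generalized Hebbian Algorithm (GHA).

    X is an (n_samples, n_features) array of zero-mean data; the rows of the
    returned W approximate the leading eigenvectors of X's covariance matrix.
    The hyperparameters here are illustrative, not taken from the book.
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_components, n_features))
    for _ in range(n_epochs):
        for x in X:
            y = W @ x                                        # component outputs y = W x
            # Sanger's update: dW = lr * (y x^T - LT[y y^T] W), LT = lower triangular part
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Toy check: rows of W should align with the top eigenvectors of the sample covariance.
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.2])
X -= X.mean(axis=0)
W = generalized_hebbian_algorithm(X, n_components=2)
_, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
print(np.abs(W @ eigvecs[:, ::-1][:, :2]))  # approximately the 2x2 identity matrix
```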