Abstract
Principal component analysis (PCA) is a technique commonly used for fault detection and classification (FDC) in highly automated manufacturing. Because PCA model building and adaptation rely on the eigenvalue decomposition of parameter covariance matrices, the computational effort scales cubically with the number of input variables. As PCA-based FDC applications monitor systems with more variables, or trace data sampled at higher rates, the size of the PCA problem can grow faster than the FDC system infrastructure can accommodate. This paper introduces an algorithm that greatly reduces the overall size of the PCA problem by breaking the analysis of a large number of variables into multiple analyses of smaller, mutually uncorrelated blocks of variables. Summary statistics from these subanalyses are then combined into results comparable to those obtained from a complete PCA of all variables together.
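The abstract's core idea can be illustrated numerically: when variable blocks are mutually uncorrelated, the covariance matrix is block diagonal, so its eigendecomposition separates into cheaper per-block eigendecompositions, and common PCA monitoring statistics (Hotelling's T², SPE) simply add across blocks. The sketch below is an illustrative reconstruction under those assumptions, not the paper's exact algorithm; the simulated data, block partition, and component counts are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_stats(X, n_components):
    """Hotelling's T^2 and SPE (Q) per sample from a PCA of X.

    Illustrative sketch: eigendecompose the sample covariance, keep the
    leading components, and score each sample against that model.
    """
    Xc = X - X.mean(axis=0)                 # mean-center
    cov = np.cov(Xc, rowvar=False)          # sample covariance (ddof=1)
    w, V = np.linalg.eigh(cov)              # eigenvalues ascending
    w, V = w[::-1], V[:, ::-1]              # reorder descending
    P, lam = V[:, :n_components], w[:n_components]
    T = Xc @ P                              # principal component scores
    t2 = np.sum(T**2 / lam, axis=1)         # Hotelling's T^2
    resid = Xc - T @ P.T                    # residual (off-model) part
    spe = np.sum(resid**2, axis=1)          # squared prediction error
    return t2, spe

# Simulate two mutually uncorrelated blocks of variables
# (hypothetical sizes: 6 variables / 3 factors and 4 variables / 2 factors).
n = 500
A = rng.normal(size=(n, 3)) @ rng.normal(size=(3, 6))
B = rng.normal(size=(n, 2)) @ rng.normal(size=(2, 4))
X = np.hstack([A, B]) + 0.1 * rng.normal(size=(n, 10))

# Full PCA on all 10 variables vs. per-block PCAs combined by summation.
t2_full, spe_full = pca_stats(X, 5)
t2_a, spe_a = pca_stats(X[:, :6], 3)
t2_b, spe_b = pca_stats(X[:, 6:], 2)
t2_block = t2_a + t2_b                      # combined blockwise T^2
spe_block = spe_a + spe_b                   # combined blockwise SPE
```

Because the sample cross-covariance between the blocks is only approximately zero here, the combined statistics track, rather than exactly equal, the full-model ones; the computational win is that each eigendecomposition is of a much smaller matrix, which is the scaling argument the abstract makes.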
Original language | English
---|---
Article number | 5458324
Pages (from-to) | 201-209
Number of pages | 9
Journal | IEEE Transactions on Semiconductor Manufacturing
Volume | 23
Issue number | 2
DOIs |
State | Published - 1 May 2010
Externally published | Yes
Keywords
- Combined index
- Computation time
- Fault detection
- Large scale systems
- Multivariate statistical process control (MSPC)
- Principal component analysis (PCA)
- Recursive PCA
ASJC Scopus subject areas
- Electronic, Optical and Magnetic Materials
- Condensed Matter Physics
- Industrial and Manufacturing Engineering
- Electrical and Electronic Engineering