Principal component analysis: difference between revisions
Content deleted Content added
m added Category:Decomposição em Valores Singulares using HotCat | software + see also
Line 423:
While PCA finds the mathematically optimal method (as in minimizing the squared error), it is sensitive to [[outlier]]s in the data, which produce the large errors that PCA tries to avoid. It is therefore common practice to remove outliers before computing PCA. In some contexts, however, outliers can be difficult to identify. For example, in [[data mining]] algorithms like [[correlation clustering]], the assignment of points to clusters and outliers is not known beforehand. A recently proposed generalization of PCA<ref>{{cite doi | 10.1007/978-3-540-69497-7_27 }}</ref> called '''Weighted PCA''' increases robustness by assigning different weights to data objects based on their estimated relevance.
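The weighted variant can be sketched in a few lines of NumPy. This is only an illustrative sketch under the simple assumption that each observation carries a fixed non-negative weight (the function name <code>weighted_pca</code> is this sketch's own); the cited paper estimates weights from each object's relevance, so this shows the general idea rather than the published algorithm.

```python
import numpy as np

def weighted_pca(X, w, n_components=2):
    """Sketch of weighted PCA: rows of X are observations, w their non-negative weights.

    Down-weighted rows (e.g. suspected outliers) contribute less to the
    estimated mean and covariance, making the components more robust.
    """
    w = np.asarray(w, dtype=float)
    w = w / w.sum()                         # normalise weights to sum to 1
    mean = w @ X                            # weighted mean of the data
    Xc = X - mean                           # centre on the weighted mean
    cov = (Xc * w[:, None]).T @ Xc          # weighted covariance matrix
    vals, vecs = np.linalg.eigh(cov)        # eigendecomposition (ascending)
    order = np.argsort(vals)[::-1]          # sort components by variance
    components = vecs[:, order[:n_components]]
    return Xc @ components, components      # (scores, principal directions)
```

With all weights equal this reduces to ordinary covariance-based PCA; shrinking an outlier's weight toward zero removes its influence on the fitted subspace.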
== Software/source code ==
{{externallinks|date=November 2011}}
*[http://code.google.com/p/cornell-spectrum-imager/wiki/Home Cornell Spectrum Imager] - An open-source toolset built on ImageJ that enables quick and easy PCA analysis of 3D datacubes.
* In the [[NAG Numerical Library|NAG Library]], principal components analysis is implemented via the <code>g03aa</code> routine (available in both the Fortran<ref>{{ cite web | last = The Numerical Algorithms Group | first = | title = NAG Library Routine Document: nagf_mv_prin_comp (g03aaf) | date = | work = NAG Library Manual, Mark 23 | url = http://www.nag.co.uk/numeric/fl/nagdoc_fl23/pdf/G03/g03aaf.pdf | accessdate = 2012-02-16 }}</ref> and the C<ref>{{ cite web | last = The Numerical Algorithms Group | first = | title = NAG Library Routine Document: nag_mv_prin_comp (g03aac) | date = | work = NAG Library Manual, Mark 9 | url = http://www.nag.co.uk/numeric/CL/nagdoc_cl09/pdf/G03/g03aac.pdf | accessdate = 2012-02-16 }}</ref> versions of the Library).
* In [[GNU Octave|Octave]], a free software computational environment mostly compatible with MATLAB, the function [http://octave.sourceforge.net/statistics/function/princomp.html <code>princomp</code>] computes the principal components.
* In the [[free software|free]] statistical package [[R (programming language)|R]], the functions [http://stat.ethz.ch/R-manual/R-patched/library/stats/html/princomp.html <code>princomp</code>] and [http://stat.ethz.ch/R-manual/R-patched/library/stats/html/prcomp.html <code>prcomp</code>] can be used for principal component analysis; <code>prcomp</code> uses [[singular value decomposition]], which generally gives better numerical accuracy. Many R packages, often written for specific purposes, now provide their own implementations of principal component analysis; for a more complete list, see [http://cran.r-project.org/web/views/Multivariate.html the CRAN Multivariate task view].
* [[Weka (machine learning)|Weka]] computes principal components ([http://weka.sourceforge.net/doc/weka/attributeSelection/PrincipalComponents.html javadoc]).
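As a concrete illustration of the SVD route that <code>prcomp</code> takes, here is a minimal NumPy sketch (the name <code>pca_svd</code> and its return convention are this sketch's own, not any library's API): the SVD of the centred data matrix yields the same components as an eigendecomposition of the covariance matrix, but without explicitly forming that matrix, which avoids squaring its condition number.

```python
import numpy as np

def pca_svd(X, n_components):
    """PCA via SVD of the centred data matrix (the route R's prcomp takes)."""
    Xc = X - X.mean(axis=0)                            # centre each column
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)  # thin SVD
    scores = U[:, :n_components] * s[:n_components]    # projected data
    loadings = Vt[:n_components].T                     # principal directions
    explained_var = s[:n_components] ** 2 / (len(X) - 1)
    return scores, loadings, explained_var
```

The singular values of the centred matrix relate to the covariance eigenvalues by <code>lambda_i = s_i**2 / (n - 1)</code>, so both routes rank the same directions in the same order.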
== See also ==
<div style="-moz-column-count:2; column-count:2;">
* [[Multilinear principal component analysis|Multilinear PCA]]
* [[Correspondence analysis]]
* [[Eigenface]]
* [[v:Exploratory factor analysis|Exploratory factor analysis]] (Wikiversity)
* [[Geometric data analysis]]
* [[Factorial code]]
* [[Independent component analysis]]
* [[Kernel PCA]]
* [[Matrix decomposition]]
* [[Nonlinear dimensionality reduction]]
* [[Oja's rule]]
* [[Principal component regression]]
* [[wikibooks:Statistics/Multivariate Data Analysis/Principal Component Analysis|Principal component analysis]] (Wikibooks)
* [[Sparse PCA]]
* [[Transform coding]]
* [[Weighted least squares]]
* [[Dynamic mode decomposition]]
* [[Low-rank approximation]]
</div>
Line 479 ⟶ 432:
[[Category:Singular value decomposition]]
[[Category:Data analysis]]
[[Categoria:Decomposição em Valores Singulares]]
-->
{{externallinks|date=November 2011}}
* [[OpenCV]]
* [http://code.google.com/p/cornell-spectrum-imager/wiki/Home Cornell Spectrum Imager] - An open-source toolset built on ImageJ.
* [[Weka (machine learning)|Weka]]
<div style="-moz-column-count:2; column-count:2;">
* ''[[:en:Eigenface|Eigenface]]''
* ''[[:en:Point distribution model|Point distribution model]]'' (more applications of PCA to morphometrics and computer vision)
* ''[[:en:Correspondence analysis|Correspondence analysis]]''
* [[v:Exploratory factor analysis|Exploratory factor analysis]] (Wikiversity)
* ''[[:en:Geometric data analysis|Geometric data analysis]]''
* ''[[:en:Factorial code|Factorial code]]''
* ''[[:en:Independent component analysis|Independent component analysis]]''
* ''[[:en:Kernel PCA|Kernel PCA]]''
* ''[[:en:Matrix decomposition|Matrix decomposition]]''
* ''[[:en:Nonlinear dimensionality reduction|Nonlinear dimensionality reduction]]''
* ''[[:en:Oja's rule|Oja's rule]]''
* ''[[:en:Principal component regression|Principal component regression]]''
* [[wikibooks:Statistics/Multivariate Data Analysis/Principal Component Analysis|Principal component analysis]] (Wikibooks)
* ''[[:en:Singular value decomposition|Singular value decomposition]]''
* ''[[:en:Sparse PCA|Sparse PCA]]''
* ''[[:en:Transform coding|Transform coding]]''
* ''[[:en:Weighted least squares|Weighted least squares]]''
* ''[[:en:Dynamic mode decomposition|Dynamic mode decomposition]]''
* ''[[:en:Low-rank approximation|Low-rank approximation]]''
</div>