Principal Components Analysis (PCA)
• An exploratory technique used to reduce the
dimensionality of the data set to 2D or 3D
• Can be used to:
– Reduce number of dimensions in data
– Find patterns in high-dimensional data
– Visualize data of high dimensionality
• Example applications:
– Face recognition
– Image compression
– Gene expression analysis
Principal Components Analysis Ideas (PCA)
• Does the data set ‘span’ the whole of d
dimensional space?
• For a matrix of m samples x n genes, create a new
covariance matrix of size n x n.
• Transform some large number of variables into a
smaller number of uncorrelated variables called
principal components (PCs).
• Developed to capture as much of the variation in the data as possible
Principal Component Analysis
See online tutorials such as http://www.cs.otago.ac.nz/cosc453/student_tutorials/principal_components.pdf
[Figure: scatter of data points in the (X1, X2) plane with the principal directions Y1 and Y2 overlaid. Note: Y1 is the first eigenvector, Y2 is the second; Y2 is ignorable.]
Key observation: variance along Y1 = largest!
Principal Component Analysis: one attribute first
• Question: how much
spread is in the data
along the axis?
(distance to the mean)
• Variance = (standard deviation)²
Temperature: 42, 40, 24, 30, 15, 18, 15, 30, 15, 30, 35, 30, 40, 30
$$s^2 = \frac{\sum_{i=1}^{n}\left(X_i - \bar{X}\right)^2}{n-1}$$
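As a quick check, the sample variance can be computed directly from the temperature readings above; a minimal sketch in Python (NumPy assumed available):

```python
import numpy as np

# Temperature readings from the slide
temps = np.array([42, 40, 24, 30, 15, 18, 15, 30, 15, 30, 35, 30, 40, 30])

# Sample variance: squared deviations from the mean, divided by n - 1
s2 = ((temps - temps.mean()) ** 2).sum() / (len(temps) - 1)

print(s2)  # same value as np.var(temps, ddof=1)
```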
Now consider two dimensions
X=Temperature Y=Humidity
40 90
40 90
40 90
30 90
15 70
15 70
15 70
30 90
15 70
30 70
30 70
30 90
40 70
$$\mathrm{cov}(X,Y) = \frac{\sum_{i=1}^{n}\left(X_i - \bar{X}\right)\left(Y_i - \bar{Y}\right)}{n-1}$$

Covariance measures how X and Y vary together:
• cov(X,Y) = 0: X and Y are uncorrelated
• cov(X,Y) > 0: X and Y tend to move in the same direction
• cov(X,Y) < 0: X and Y tend to move in opposite directions
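The sample covariance of the temperature/humidity pairs above can be checked the same way; a short sketch:

```python
import numpy as np

# Temperature/humidity pairs from the slide
X = np.array([40, 40, 40, 30, 15, 15, 15, 30, 15, 30, 30, 30, 40])
Y = np.array([90, 90, 90, 90, 70, 70, 70, 90, 70, 70, 70, 90, 70])

# Sample covariance: average co-deviation from the means (n - 1 denominator)
cov_xy = ((X - X.mean()) * (Y - Y.mean())).sum() / (len(X) - 1)

print(cov_xy)  # matches np.cov(X, Y)[0, 1]
```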
More than two attributes: covariance matrix
• Contains covariance values between all
possible dimensions (=attributes):
• Example for three attributes (x,y,z):
$$C_{n \times n} = \left(c_{ij} \,\middle|\, c_{ij} = \mathrm{cov}(\mathrm{Dim}_i, \mathrm{Dim}_j)\right)$$

$$C = \begin{pmatrix} \mathrm{cov}(x,x) & \mathrm{cov}(x,y) & \mathrm{cov}(x,z) \\ \mathrm{cov}(y,x) & \mathrm{cov}(y,y) & \mathrm{cov}(y,z) \\ \mathrm{cov}(z,x) & \mathrm{cov}(z,y) & \mathrm{cov}(z,z) \end{pmatrix}$$
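In code, the full covariance matrix comes from a single library call; a sketch with made-up three-attribute data:

```python
import numpy as np

# Toy data: rows = observations, columns = attributes (x, y, z)
data = np.array([
    [40.0, 90.0, 1.2],
    [30.0, 90.0, 0.8],
    [15.0, 70.0, 0.5],
    [30.0, 70.0, 0.9],
])

# rowvar=False: columns are variables; np.cov divides by n - 1, as in the formula above
C = np.cov(data, rowvar=False)

print(C.shape)  # (3, 3): one entry per pair of attributes
```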
Eigenvalues & eigenvectors
• Vectors x having the same direction as Ax are called eigenvectors of A (A is an n by n matrix).
• In the equation Ax = λx, λ is called an eigenvalue of A.
$$\begin{pmatrix} 2 & 3 \\ 2 & 1 \end{pmatrix}\begin{pmatrix} 3 \\ 2 \end{pmatrix} = \begin{pmatrix} 12 \\ 8 \end{pmatrix} = 4\begin{pmatrix} 3 \\ 2 \end{pmatrix}$$

Here x = (3, 2)ᵀ is an eigenvector with eigenvalue λ = 4.
Eigenvalues & eigenvectors
• Ax = λx ⇔ (A − λI)x = 0
• How to calculate x and λ:
– Calculate det(A − λI); this yields a polynomial in λ (degree n)
– Determine the roots of det(A − λI) = 0; the roots are the eigenvalues λ
– Solve (A − λI)x = 0 for each λ to obtain the eigenvectors x
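In practice a library routine does this; a sketch applying NumPy to the 2×2 example above:

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [2.0, 1.0]])  # the example matrix from the previous slide

# np.linalg.eig returns (eigenvalues, matrix whose columns are unit eigenvectors)
vals, vecs = np.linalg.eig(A)

print(vals)  # 4 and -1 (order not guaranteed)
print(vecs)  # the column for eigenvalue 4 is proportional to (3, 2)
```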
Principal components
• First principal component (PC1)
– The eigenvector whose eigenvalue has the largest absolute value; the data have their largest variance along this direction, the direction of greatest variation
• Second principal component (PC2)
– the direction with the maximum variation left in the data, orthogonal to PC1
• In general, only a few directions capture most of the variability in the data.
Steps of PCA
• Let X̄ be the mean vector (taking the mean of all rows)
• Adjust the original data by the mean: X' = X − X̄
• Compute the covariance matrix C of the adjusted X
• Find the eigenvectors and eigenvalues of C:
– the eigenvectors of C are the column vectors e having the same direction as Ce, i.e. Ce = λe,
– λ is called an eigenvalue of C
• Ce = λe ⇔ (C − λI)e = 0
– Most data mining packages do this for you (a sketch follows below).
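Putting the steps together, a minimal NumPy sketch (function and variable names are mine, not from the slides):

```python
import numpy as np

def pca(X, p):
    """Project the rows of X onto the top-p principal components.

    X: (m samples) x (n attributes) array; p: number of components to keep.
    """
    mean = X.mean(axis=0)             # mean vector over all rows
    X_adj = X - mean                  # mean-adjusted data X' = X - mean
    C = np.cov(X_adj, rowvar=False)   # n x n covariance matrix

    vals, vecs = np.linalg.eigh(C)    # eigh: C is symmetric, eigenvalues ascending
    order = np.argsort(vals)[::-1]    # re-sort eigenvalues in descending order
    E = vecs[:, order[:p]]            # top-p eigenvectors as columns (n x p)

    return X_adj @ E, E, mean         # transformed data, components, mean
```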
Eigenvalues
• Calculate the eigenvalues λ and eigenvectors x of the covariance matrix:
– The eigenvalues λ_j are used to calculate the percentage of total variance V_j captured by each component j:

$$V_j = 100 \cdot \frac{\lambda_j}{\sum_{x=1}^{n} \lambda_x}$$
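The same ratio in code, using the two eigenvalues from the worked example later in the deck (51.8 and 560.2); a sketch:

```python
import numpy as np

lams = np.array([560.2, 51.8])   # eigenvalues, sorted descending

V = 100.0 * lams / lams.sum()    # percent of total variance per component

print(V)  # roughly [91.5, 8.5]: the first component dominates
```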
Principal components - Variance
[Bar chart: variance (%) captured by each component PC1–PC10; y-axis from 0 to 25.]
Transformed Data
• Eigenvalue λ_j corresponds to the variance along component j
• Thus, sort the eigenvectors by λ_j (descending)
• Take the first p eigenvectors e_i, where p is the number of top eigenvalues kept
• These are the directions with the largest variances
$$\begin{pmatrix} y_{i1} \\ y_{i2} \\ \vdots \\ y_{ip} \end{pmatrix} = \begin{pmatrix} e_1 \\ e_2 \\ \vdots \\ e_p \end{pmatrix} \begin{pmatrix} x_{i1} - \bar{x}_1 \\ x_{i2} - \bar{x}_2 \\ \vdots \\ x_{in} - \bar{x}_n \end{pmatrix}$$

(each e_k is written as a row vector)
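Restated as code, the per-sample transform is one matrix-vector product; a sketch (names are mine):

```python
import numpy as np

def transform(x, mean, E):
    # x: one sample (length n); mean: mean vector; E: p x n matrix whose
    # rows are the top-p eigenvectors e_1..e_p
    return E @ (x - mean)   # y_i, a length-p vector
```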
An Example
X1 X2 X1' X2'
19 63 -5.1 9.25
39 74 14.9 20.25
30 87 5.9 33.25
30 23 5.9 -30.75
15 35 -9.1 -18.75
15 43 -9.1 -10.75
15 32 -9.1 -21.75
Mean1 = 24.1, Mean2 = 53.8
[Two scatter plots: the original (X1, X2) data, and the mean-adjusted (X1', X2') data centered on the origin.]
Covariance Matrix

$$C = \begin{pmatrix} 75 & 106 \\ 106 & 482 \end{pmatrix}$$

• Using MATLAB, we find out:
– Eigenvectors:
– e1 = (−0.98, −0.21), λ1 = 51.8
– e2 = (0.21, −0.98), λ2 = 560.2
– Thus the second eigenvector is more important!
If we only keep one dimension: e2
• We keep the dimension of e2 = (0.21, −0.98)
• We can obtain the final data as

$$y_i = \begin{pmatrix} 0.21 & -0.98 \end{pmatrix}\begin{pmatrix} x_{i1} \\ x_{i2} \end{pmatrix} = 0.21\,x_{i1} - 0.98\,x_{i2}$$

where (x_{i1}, x_{i2}) are the mean-adjusted values (X1', X2'). The projected values y_i for the seven samples:

y_i: −10.14, −16.72, −31.35, 31.374, 16.464, 8.624, 19.404

[Plot: the projected values y_i spread along a single axis.]
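The projection is easy to reproduce; a sketch using the mean-adjusted example data:

```python
import numpy as np

# Mean-adjusted (X1', X2') data from the example
X_adj = np.array([
    [-5.1,   9.25],
    [14.9,  20.25],
    [ 5.9,  33.25],
    [ 5.9, -30.75],
    [-9.1, -18.75],
    [-9.1, -10.75],
    [-9.1, -21.75],
])

e2 = np.array([0.21, -0.98])  # the kept eigenvector

y = X_adj @ e2                # one projected value per sample

print(y)  # -10.14, -16.72, -31.35, 31.37, 16.46, 8.62, 19.40
```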
PCA –> Original Data
• Retrieving the old data (e.g. in data compression):
– RetrievedRowData = (RowFeatureVector^T × FinalData) + OriginalMean
– Yields the original data using the chosen components
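A sketch of the retrieval step, under the same naming assumptions as the pipeline sketch above (with row-oriented data the transposes fold into one matrix product):

```python
import numpy as np

def reconstruct(Y, E, mean):
    # Y: transformed data (m x p); E: n x p matrix of kept eigenvectors
    # (as returned by the pca sketch above); mean: the original mean vector
    return Y @ E.T + mean  # approximate original data; exact if p = n
```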
Principal components
• In general, principal components are:
– summary variables
– linear combinations of the original variables
– uncorrelated with each other
– capture as much of the original variance as
possible
Applications – Gene expression analysis
• Reference: Raychaudhuri et al. (2000)
• Purpose: Determine core set of conditions for useful
gene comparison
• Dimensions: conditions, observations: genes
• Yeast sporulation dataset (7 conditions, 6118 genes)
• Result: Two components capture most of variability
(90%)
• Issues: uneven data intervals, data dependencies
• PCA is commonly applied prior to clustering (a sketch follows below)
• Crisp clustering questioned: genes may correlate with multiple clusters
• Alternative: determination of a gene's closest neighbours
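A sketch of the PCA-then-cluster pattern using scikit-learn (the data here are random placeholders, not the sporulation or leukemia sets):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(34, 8973))  # placeholder samples x genes matrix

Z = PCA(n_components=2).fit_transform(X)                 # 8973 dims -> 2
labels = KMeans(n_clusters=2, n_init=10).fit_predict(Z)  # cluster in PC space
```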
Two Way (Angle) Data Analysis
[Diagram: two views of the same gene expression matrix.]
• Sample space analysis: matrix of genes (10^3–10^4) × samples (10^1–10^2)
• Gene space analysis: matrix of conditions (10^1–10^2) × genes (10^3–10^4)
PCA - example

PCA on all Genes
Leukemia data, precursor B and T: plot of 34 patients, dimension of 8973 genes reduced to 2

PCA on 100 top significant genes
Leukemia data, precursor B and T: plot of 34 patients, dimension of 100 genes reduced to 2

PCA of genes (Leukemia data)
Plot of 8973 genes, dimension of 34 patients reduced to 2