Course Calendar
Class DATE Contents
1 Sep. 26 Course information & Course overview
2 Oct. 4 Bayes Estimation
3 〃 11 Classical Bayes Estimation - Kalman Filter -
4 〃 18 Simulation-based Bayesian Methods
5 〃 25 Modern Bayesian Estimation: Particle Filter
6 Nov. 1 HMM(Hidden Markov Model)
Nov. 8 No Class
7 〃 15 Bayesian Decision
8 〃 29 Nonparametric Approaches
9 Dec. 6 PCA(Principal Component Analysis)
10 〃 13 ICA(Independent Component Analysis)
11 〃 20 Applications of PCA and ICA
12 〃 27 Clustering, k-means et al.
13 Jan. 17 Other Topics 1 Kernel machine.
14 〃 22(Tue) Other Topics 2
Lecture Plan
Independent Component Analysis
-1. Whitening by PCA
1. Introduction: Blind Source Separation (BSS)
2. Problem Formulation and Independence
3. Whitening + ICA Approach
4. Non-Gaussianity Measure
References:
[1] A. Hyvärinen, J. Karhunen, and E. Oja, Independent Component Analysis, Wiley-Interscience, 2001.
- 1. Whitening by PCA (Preparation for the ICA approach)
Whitened := uncorrelatedness + unit variance*
PCA is a very useful tool for transforming a random vector x into an
uncorrelated, or whitened, vector z. PCA gives one solution of the
whitening problem, but the matrix V is not uniquely defined, so a free
parameter remains (in the 2-d case: a rotation angle).
* Here we assume all random variables are zero mean.

z = Vx, where x is an n-vector with covariance matrix C_x and V is an
n×n matrix; the whitened z satisfies C_z = E[z z^T] = I (the identity matrix).
(Fig. 1)
PCA whitening method:
- Define the covariance matrix:
    C_x = E[x x^T]                                              (1)
- Compute the {eigenvalue, eigenvector} pairs of C_x:
    {λ_i, e_i}, i = 1, ..., n
- Representation of C_x:
    C_x = E Λ E^T,  E = (e_1, e_2, ..., e_n),  Λ = diag(λ_1, ..., λ_n)   (2)
  * The matrix E is an orthogonal matrix that satisfies E E^T = E^T E = I.  (*)
- Whitening matrix transformation:
    z = Vx,  V = E Λ^{-1/2} E^T,  Λ^{-1/2} = diag(λ_1^{-1/2}, ..., λ_n^{-1/2})
    ⇒ C_z = E[z z^T] = I                                        (3)
1. Introduction
Blind Source Separation (BSS)
Ex. Source signals s_1(t), ..., s_3(t) → mixed signals:
    x_i(t) = Σ_{j=1}^{3} a_ij s_j(t),  i = 1, 2, 3      (Fig. 2)
BSS problem: recover or separate the source signals with no prior
information on the mixing matrix [a_ij]. A typical real-world BSS problem
is known as the "cocktail party problem".
- Independent Component Analysis (ICA) exploits the independence of the
source signals to solve the BSS problem.
Fig. 3: ICA solution of BSS. Sources s1(t) and s2(t) pass through an unknown
mixing process to microphones mic1 and mic2; a separation process, tuned by a
degree-of-independence measure, outputs the recovered signals y1(t) and y2(t).
2. Problem Formulation and Independence
- Source signals (zero mean): s_j(t), j = 1, ..., n
- Recorded signals: x_i(t), i = 1, ..., n
- Linear mixing process (no-delay model*):
    x_i(t) = Σ_{j=1}^{n} a_ij s_j(t),  i = 1, ..., n          (4)
  Vector-matrix form:
    x = As                                                     (5)
  where A = [a_ij] is an n×n constant matrix,
  x = (x_1, ..., x_n)^T and s = (s_1, ..., s_n)^T.
(* In a real environment, the arrival-time differences between microphones
should be included in the mixing model.)
Goal: recover the sources s_j(t), j = 1, ..., n from the mixed signals
x_i(t), i = 1, ..., n.
- The a_ij are unknown.
- We want to obtain both a_ij and s_i(t) (a sign flip, (-a_ij, -s_i(t)), is equally valid).
Under the following assumptions:
Assumption 1: The source waveforms are statistically independent.
Assumption 2: The sources have non-Gaussian distributions.
Assumption 3: The matrix A is square and invertible (for simplicity).
The estimated A is used to recover the original signals by the inverse
(de-mixing) operation:
    s = Bx, where B = A^{-1}
Ambiguities of the ICA solution:
- The variance (amplitude) of the recovered signals cannot be determined:
  if the pair (a_ij, s_j(t)) is a solution of the underlying BSS problem,
  then (K a_ij, (1/K) s_j(t)) is also a solution for any K ≠ 0.
  The variances of the source signals are therefore assumed to be unity:
      E[s_j^2] = 1                                             (6)
- The order of the recovered signals cannot be determined (permutation ambiguity).
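The scaling ambiguity can be verified numerically. A quick sketch (the random matrix, sources, and the constant K are arbitrary choices for the demo):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 2))        # some mixing matrix
s = rng.standard_normal((2, 1000))     # some source signals
x = A @ s                              # observed mixtures

# Multiply column j of A by K and divide source j by K.
K = 3.7
A2 = A.copy()
A2[:, 0] *= K
s2 = s.copy()
s2[0, :] /= K

# The observed mixtures are identical, so the amplitude is unidentifiable
# from x alone; hence the unit-variance convention E[s_j^2] = 1.
print(np.allclose(x, A2 @ s2))
```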
Basics: Independence and Uncorrelatedness
Statistical independence of two random variables x and y:
knowing the value of x provides no information about the distribution of y,
    p_{y|x}(y|x) = p_y(y),  i.e.  p_{x,y}(x, y) = p_x(x) p_y(y)    (7)
(Example: Fig. 4)
Uncorrelatedness of two random variables x and y:
their covariance is zero, i.e.
    E[(x - m_x)(y - m_y)] = 0
- If the variables x_1, ..., x_n are mutually uncorrelated (zero mean), then
  C_x = E[x x^T] is a diagonal matrix.
- Independence implies uncorrelatedness, but the converse does not hold in general.
- In the Gaussian density case, independence and uncorrelatedness are equivalent.
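A standard counterexample makes the one-way implication concrete: with the illustrative choice x uniform on [-1, 1] and y = x^2, the pair is uncorrelated yet clearly dependent.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1.0, 1.0, 1_000_000)
y = x ** 2                  # y is completely determined by x -> not independent

# Sample covariance is ~0 (E[x^3] = 0 for a symmetric density):
cov = np.mean((x - x.mean()) * (y - y.mean()))
print(abs(cov) < 1e-2)

# So x and y are uncorrelated but dependent. Only for jointly Gaussian
# variables does uncorrelatedness imply independence.
```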
Examples of Sources and Mixed Signals
[Uniform densities (sub-Gaussian)]
- Two independent components s1, s2 with the same uniform density of unit variance:
    p(s_i) = 1/(2√3)  for |s_i| ≤ √3,  0 otherwise  (i = 1, 2)
    p(s_1, s_2) = p(s_1) p(s_2)                          (Fig. 5)
- Mixing matrix and mixed signals:
    x = As,  A = (a_1 a_2) = | 5  10 |
                             | 10  2 |
    i.e.  x_1 = 5 s_1 + 10 s_2,  x_2 = 10 s_1 + 2 s_2    (8)   (Fig. 6)
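This example can be reproduced directly. A short sketch using the uniform density and the mixing matrix from (8) (the sample size is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000

# Two independent uniform sources on [-sqrt(3), sqrt(3)]: zero mean, unit variance
s = rng.uniform(-np.sqrt(3), np.sqrt(3), (2, n))

A = np.array([[5.0, 10.0],
              [10.0, 2.0]])
x = A @ s    # x1 = 5 s1 + 10 s2,  x2 = 10 s1 + 2 s2

# The sources have (approximately) unit variance and zero covariance,
# while the mixtures are strongly correlated (theoretical cov = 5*10 + 10*2 = 70).
print(np.var(s, axis=1))   # each close to 1
print(np.cov(x)[0, 1])     # far from 0
```

Plotting s and x as scatter plots reproduces the square-shaped source distribution of Fig. 5 and the sheared parallelogram of Fig. 6.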
[Super-Gaussian densities]
- Two independent components s1, s2 with super-Gaussian densities
  (source joint distribution: Fig. 7)
- Mixed signals distribution: Fig. 8
3. Whitening (PCA) + ICA Approach
    x --(whitening)--> z --(ICA)--> s
The observed signals are first whitened:
    z = E Λ^{-1/2} E^T x = Vx
Then
    z = Vx = VAs = Ãs,  Ã := VA,
and the new mixing matrix Ã is orthogonal (**):
    E[z z^T] = Ã E[s s^T] Ã^T = Ã Ã^T = I          (**)
Question*: Is this a unique solution?
Answer*: No. For any orthogonal matrix U, y = Uz satisfies
    C_y = E[y y^T] = U E[z z^T] U^T = U I U^T = I,
which means y is also a whitened signal.
Conclusion: whitening determines the independent components only
up to an orthogonal transformation.
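The residual orthogonal ambiguity is easy to demonstrate: rotating whitened data leaves it whitened. A sketch, using standard-normal samples as a stand-in for already-whitened data and an arbitrary rotation angle:

```python
import numpy as np

rng = np.random.default_rng(4)
z = rng.standard_normal((2, 100_000))   # stand-in for whitened data, E[z z^T] = I

# Any orthogonal U (here a 2-d rotation) preserves whiteness:
# E[(Uz)(Uz)^T] = U I U^T = I
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
y = U @ z

Cy = y @ y.T / y.shape[1]
print(np.allclose(Cy, np.eye(2), atol=0.05))   # identity up to sampling noise
```

Second-order statistics therefore cannot fix the rotation; that is why ICA needs the non-Gaussianity measure of the next section.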
Why Gaussian variables are forbidden (Assumption 2)
Suppose the two independent source signals s1 and s2 are Gaussian (zero mean,
unit variance):
    p(s_1, s_2) = (1/2π) exp(-(s_1^2 + s_2^2)/2) = (1/2π) exp(-||s||^2/2)
With z = As (A an orthogonal mixing matrix, A^{-1} = A^T, s = A^T z) we have
    ||s||^2 = ||A^T z||^2 = z^T A A^T z = ||z||^2,
and we obtain
    p(z_1, z_2) = (1/2π) exp(-||z||^2/2)              (9)
The joint Gaussian distribution is invariant with respect to orthogonal
transformations: the same density is observed, so no information arises from
the orthogonal transformation. This means that we cannot find (identify) the
orthogonal matrix A from the mixed data.
Maximization of Non-Gaussianity
We need to answer the following two questions:
1) How can the non-Gaussianity of y be measured?
2) How can we compute the value of b that maximizes the measure?
For a given density p(x), we define a measure of non-Gaussianity NG(p(x)),
non-negative, with NG = 0 if p(x) is Gaussian.
[Intuitive interpretation of ICA as non-Gaussianity maximization]
    x = As, s non-Gaussian;  s = A^{-1} x = Bx, where B = A^{-1} is unknown.
Consider one candidate component, formed with a weight vector b:
    y = b^T x = b^T A s = q^T s,  q := A^T b          (10)
NG(p(y)), as a function of q, is maximized when y = ±s_i, i.e. when q has
exactly one nonzero entry.                             (11)
Mixing reduces non-Gaussianity: for y = q_1 s_1 + q_2 s_2, NG(p_y) is smaller
than for y = q_1 s_1 or y = q_2 s_2 alone (by the central limit theorem, sums
of independent variables move toward Gaussianity, NG(p_y) → 0). Hence,
maximizing NG by varying b drives y = b^T As toward a single source q_i s_i.
4. Measure of Non-Gaussianity
Kurtosis is a classical measure of non-Gaussianity:
    kurt(y) := E[y^4] - 3 (E[y^2])^2                  (12)
    kurt(y) = 0 for Gaussian, > 0 for super-Gaussian, and < 0 for
    sub-Gaussian densities.
The absolute value of the kurtosis can be used as a measure of
non-Gaussianity, and the optimization problem
    max_b J(b),  J(b) := |kurt(y)|,  y = b^T x        (13)
is solved as an ICA solution.
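Equations (12)-(13) suggest a very simple one-unit ICA sketch: whiten the mixtures, then search the remaining rotation angle for the direction maximizing |kurt|. This is a toy grid search, not the lecture's algorithm (practical methods such as FastICA use fixed-point iterations instead); the sources, mixing matrix, and grid resolution are illustrative choices.

```python
import numpy as np

def kurt(y):
    # kurt(y) = E[y^4] - 3 (E[y^2])^2, eq. (12)
    return np.mean(y ** 4) - 3 * np.mean(y ** 2) ** 2

rng = np.random.default_rng(5)
n = 50_000

# Two independent unit-variance uniform (sub-Gaussian) sources, mixed, then whitened
s = rng.uniform(-np.sqrt(3), np.sqrt(3), (2, n))
x = np.array([[5.0, 10.0], [10.0, 2.0]]) @ s
lam, E = np.linalg.eigh(x @ x.T / n)
z = E @ np.diag(lam ** -0.5) @ E.T @ x        # whitened mixtures, C_z ~ I

# Grid search over the free rotation angle: b = (cos t, sin t)^T,
# maximizing J(b) = |kurt(b^T z)|, eq. (13)
angles = np.linspace(0.0, np.pi, 1800)
scores = [abs(kurt(np.cos(t) * z[0] + np.sin(t) * z[1])) for t in angles]
t_best = angles[int(np.argmax(scores))]
y = np.cos(t_best) * z[0] + np.sin(t_best) * z[1]

# The best projection should line up with one source
# (up to the sign and permutation ambiguities discussed earlier).
corr = np.corrcoef(np.vstack([y, s]))[0, 1:]
print(np.max(np.abs(corr)))
```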

Contenu connexe

Tendances

NUMERICAL METHODS -Iterative methods(indirect method)
NUMERICAL METHODS -Iterative methods(indirect method)NUMERICAL METHODS -Iterative methods(indirect method)
NUMERICAL METHODS -Iterative methods(indirect method)krishnapriya R
 
Principal Component Analysis for Tensor Analysis and EEG classification
Principal Component Analysis for Tensor Analysis and EEG classificationPrincipal Component Analysis for Tensor Analysis and EEG classification
Principal Component Analysis for Tensor Analysis and EEG classificationTatsuya Yokota
 
linear system of solutions
linear system of solutionslinear system of solutions
linear system of solutionsLama Rulz
 
system of algebraic equation by Iteration method
system of algebraic equation by Iteration methodsystem of algebraic equation by Iteration method
system of algebraic equation by Iteration methodAkhtar Kamal
 
Tensor Train decomposition in machine learning
Tensor Train decomposition in machine learningTensor Train decomposition in machine learning
Tensor Train decomposition in machine learningAlexander Novikov
 
PosterPresentations.com-3 6x48-Template-V5 - 副本
PosterPresentations.com-3 6x48-Template-V5 - 副本PosterPresentations.com-3 6x48-Template-V5 - 副本
PosterPresentations.com-3 6x48-Template-V5 - 副本Yijun Zhou
 
Independent Component Analysis
Independent Component AnalysisIndependent Component Analysis
Independent Component AnalysisTatsuya Yokota
 
Nonnegative Matrix Factorization
Nonnegative Matrix FactorizationNonnegative Matrix Factorization
Nonnegative Matrix FactorizationTatsuya Yokota
 
E-Cordial Labeling of Some Mirror Graphs
E-Cordial Labeling of Some Mirror GraphsE-Cordial Labeling of Some Mirror Graphs
E-Cordial Labeling of Some Mirror GraphsWaqas Tariq
 
Third-kind Chebyshev Polynomials Vr(x) in Collocation Methods of Solving Boun...
Third-kind Chebyshev Polynomials Vr(x) in Collocation Methods of Solving Boun...Third-kind Chebyshev Polynomials Vr(x) in Collocation Methods of Solving Boun...
Third-kind Chebyshev Polynomials Vr(x) in Collocation Methods of Solving Boun...IOSR Journals
 
Regularity and complexity in dynamical systems
Regularity and complexity in dynamical systemsRegularity and complexity in dynamical systems
Regularity and complexity in dynamical systemsSpringer
 
Point Collocation Method used in the solving of Differential Equations, parti...
Point Collocation Method used in the solving of Differential Equations, parti...Point Collocation Method used in the solving of Differential Equations, parti...
Point Collocation Method used in the solving of Differential Equations, parti...Suddhasheel GHOSH, PhD
 
Conformable Chebyshev differential equation of first kind
Conformable Chebyshev differential equation of first kindConformable Chebyshev differential equation of first kind
Conformable Chebyshev differential equation of first kindIJECEIAES
 
Numerical analysis m2 l4slides
Numerical analysis  m2 l4slidesNumerical analysis  m2 l4slides
Numerical analysis m2 l4slidesSHAMJITH KM
 
Nonparametric approach to multiple regression
Nonparametric approach to multiple regressionNonparametric approach to multiple regression
Nonparametric approach to multiple regressionAlexander Decker
 

Tendances (20)

Numerical Methods Solving Linear Equations
Numerical Methods Solving Linear EquationsNumerical Methods Solving Linear Equations
Numerical Methods Solving Linear Equations
 
NUMERICAL METHODS -Iterative methods(indirect method)
NUMERICAL METHODS -Iterative methods(indirect method)NUMERICAL METHODS -Iterative methods(indirect method)
NUMERICAL METHODS -Iterative methods(indirect method)
 
Section4 stochastic
Section4 stochasticSection4 stochastic
Section4 stochastic
 
Principal Component Analysis for Tensor Analysis and EEG classification
Principal Component Analysis for Tensor Analysis and EEG classificationPrincipal Component Analysis for Tensor Analysis and EEG classification
Principal Component Analysis for Tensor Analysis and EEG classification
 
linear system of solutions
linear system of solutionslinear system of solutions
linear system of solutions
 
system of algebraic equation by Iteration method
system of algebraic equation by Iteration methodsystem of algebraic equation by Iteration method
system of algebraic equation by Iteration method
 
Tensor Train decomposition in machine learning
Tensor Train decomposition in machine learningTensor Train decomposition in machine learning
Tensor Train decomposition in machine learning
 
PosterPresentations.com-3 6x48-Template-V5 - 副本
PosterPresentations.com-3 6x48-Template-V5 - 副本PosterPresentations.com-3 6x48-Template-V5 - 副本
PosterPresentations.com-3 6x48-Template-V5 - 副本
 
Independent Component Analysis
Independent Component AnalysisIndependent Component Analysis
Independent Component Analysis
 
Matlab Assignment Help
Matlab Assignment HelpMatlab Assignment Help
Matlab Assignment Help
 
Nsm
Nsm Nsm
Nsm
 
Nonnegative Matrix Factorization
Nonnegative Matrix FactorizationNonnegative Matrix Factorization
Nonnegative Matrix Factorization
 
Shape1 d
Shape1 dShape1 d
Shape1 d
 
E-Cordial Labeling of Some Mirror Graphs
E-Cordial Labeling of Some Mirror GraphsE-Cordial Labeling of Some Mirror Graphs
E-Cordial Labeling of Some Mirror Graphs
 
Third-kind Chebyshev Polynomials Vr(x) in Collocation Methods of Solving Boun...
Third-kind Chebyshev Polynomials Vr(x) in Collocation Methods of Solving Boun...Third-kind Chebyshev Polynomials Vr(x) in Collocation Methods of Solving Boun...
Third-kind Chebyshev Polynomials Vr(x) in Collocation Methods of Solving Boun...
 
Regularity and complexity in dynamical systems
Regularity and complexity in dynamical systemsRegularity and complexity in dynamical systems
Regularity and complexity in dynamical systems
 
Point Collocation Method used in the solving of Differential Equations, parti...
Point Collocation Method used in the solving of Differential Equations, parti...Point Collocation Method used in the solving of Differential Equations, parti...
Point Collocation Method used in the solving of Differential Equations, parti...
 
Conformable Chebyshev differential equation of first kind
Conformable Chebyshev differential equation of first kindConformable Chebyshev differential equation of first kind
Conformable Chebyshev differential equation of first kind
 
Numerical analysis m2 l4slides
Numerical analysis  m2 l4slidesNumerical analysis  m2 l4slides
Numerical analysis m2 l4slides
 
Nonparametric approach to multiple regression
Nonparametric approach to multiple regressionNonparametric approach to multiple regression
Nonparametric approach to multiple regression
 

Similaire à 2012 mdsp pr10 ica

Sequential Extraction of Local ICA Structures
Sequential Extraction of Local ICA StructuresSequential Extraction of Local ICA Structures
Sequential Extraction of Local ICA Structurestopujahin
 
Litv_Denmark_Weak_Supervised_Learning.pdf
Litv_Denmark_Weak_Supervised_Learning.pdfLitv_Denmark_Weak_Supervised_Learning.pdf
Litv_Denmark_Weak_Supervised_Learning.pdfAlexander Litvinenko
 
Independent Component Analysis
Independent Component Analysis Independent Component Analysis
Independent Component Analysis Ibrahim Amer
 
Principal Component Analysis
Principal Component AnalysisPrincipal Component Analysis
Principal Component AnalysisSumit Singh
 
Ch9-Gauss_Elimination4.pdf
Ch9-Gauss_Elimination4.pdfCh9-Gauss_Elimination4.pdf
Ch9-Gauss_Elimination4.pdfRahulUkhande
 
GATE Mathematics Paper-2000
GATE Mathematics Paper-2000GATE Mathematics Paper-2000
GATE Mathematics Paper-2000Dips Academy
 
Paper computer
Paper computerPaper computer
Paper computerbikram ...
 
Paper computer
Paper computerPaper computer
Paper computerbikram ...
 
Sisteme de ecuatii
Sisteme de ecuatiiSisteme de ecuatii
Sisteme de ecuatiiHerpy Derpy
 
ISI MSQE Entrance Question Paper (2008)
ISI MSQE Entrance Question Paper (2008)ISI MSQE Entrance Question Paper (2008)
ISI MSQE Entrance Question Paper (2008)CrackDSE
 
IIT Jam math 2016 solutions BY Trajectoryeducation
IIT Jam math 2016 solutions BY TrajectoryeducationIIT Jam math 2016 solutions BY Trajectoryeducation
IIT Jam math 2016 solutions BY TrajectoryeducationDev Singh
 
Constant strain triangular
Constant strain triangular Constant strain triangular
Constant strain triangular rahul183
 
Pydata Katya Vasilaky
Pydata Katya VasilakyPydata Katya Vasilaky
Pydata Katya Vasilakyknv4
 
MVPA with SpaceNet: sparse structured priors
MVPA with SpaceNet: sparse structured priorsMVPA with SpaceNet: sparse structured priors
MVPA with SpaceNet: sparse structured priorsElvis DOHMATOB
 
Non-Gaussian Methods for Learning Linear Structural Equation Models: Part I
Non-Gaussian Methods for Learning Linear Structural Equation Models: Part INon-Gaussian Methods for Learning Linear Structural Equation Models: Part I
Non-Gaussian Methods for Learning Linear Structural Equation Models: Part IShiga University, RIKEN
 

Similaire à 2012 mdsp pr10 ica (20)

Sequential Extraction of Local ICA Structures
Sequential Extraction of Local ICA StructuresSequential Extraction of Local ICA Structures
Sequential Extraction of Local ICA Structures
 
Litv_Denmark_Weak_Supervised_Learning.pdf
Litv_Denmark_Weak_Supervised_Learning.pdfLitv_Denmark_Weak_Supervised_Learning.pdf
Litv_Denmark_Weak_Supervised_Learning.pdf
 
Independent Component Analysis
Independent Component Analysis Independent Component Analysis
Independent Component Analysis
 
Principal Component Analysis
Principal Component AnalysisPrincipal Component Analysis
Principal Component Analysis
 
Presentation on matrix
Presentation on matrixPresentation on matrix
Presentation on matrix
 
Estimation rs
Estimation rsEstimation rs
Estimation rs
 
Multivariate Methods Assignment Help
Multivariate Methods Assignment HelpMultivariate Methods Assignment Help
Multivariate Methods Assignment Help
 
Ch9-Gauss_Elimination4.pdf
Ch9-Gauss_Elimination4.pdfCh9-Gauss_Elimination4.pdf
Ch9-Gauss_Elimination4.pdf
 
Lecture_note2.pdf
Lecture_note2.pdfLecture_note2.pdf
Lecture_note2.pdf
 
GATE Mathematics Paper-2000
GATE Mathematics Paper-2000GATE Mathematics Paper-2000
GATE Mathematics Paper-2000
 
Paper computer
Paper computerPaper computer
Paper computer
 
Paper computer
Paper computerPaper computer
Paper computer
 
Sisteme de ecuatii
Sisteme de ecuatiiSisteme de ecuatii
Sisteme de ecuatii
 
ISI MSQE Entrance Question Paper (2008)
ISI MSQE Entrance Question Paper (2008)ISI MSQE Entrance Question Paper (2008)
ISI MSQE Entrance Question Paper (2008)
 
IIT Jam math 2016 solutions BY Trajectoryeducation
IIT Jam math 2016 solutions BY TrajectoryeducationIIT Jam math 2016 solutions BY Trajectoryeducation
IIT Jam math 2016 solutions BY Trajectoryeducation
 
Constant strain triangular
Constant strain triangular Constant strain triangular
Constant strain triangular
 
Pydata Katya Vasilaky
Pydata Katya VasilakyPydata Katya Vasilaky
Pydata Katya Vasilaky
 
MVPA with SpaceNet: sparse structured priors
MVPA with SpaceNet: sparse structured priorsMVPA with SpaceNet: sparse structured priors
MVPA with SpaceNet: sparse structured priors
 
Non-Gaussian Methods for Learning Linear Structural Equation Models: Part I
Non-Gaussian Methods for Learning Linear Structural Equation Models: Part INon-Gaussian Methods for Learning Linear Structural Equation Models: Part I
Non-Gaussian Methods for Learning Linear Structural Equation Models: Part I
 
Ica group 3[1]
Ica group 3[1]Ica group 3[1]
Ica group 3[1]
 

Plus de nozomuhamada

2012 mdsp pr08 nonparametric approach
2012 mdsp pr08 nonparametric approach2012 mdsp pr08 nonparametric approach
2012 mdsp pr08 nonparametric approachnozomuhamada
 
2012 mdsp pr05 particle filter
2012 mdsp pr05 particle filter2012 mdsp pr05 particle filter
2012 mdsp pr05 particle filternozomuhamada
 
2012 mdsp pr04 monte carlo
2012 mdsp pr04 monte carlo2012 mdsp pr04 monte carlo
2012 mdsp pr04 monte carlonozomuhamada
 
2012 mdsp pr03 kalman filter
2012 mdsp pr03 kalman filter2012 mdsp pr03 kalman filter
2012 mdsp pr03 kalman filternozomuhamada
 
2012 mdsp pr02 1004
2012 mdsp pr02 10042012 mdsp pr02 1004
2012 mdsp pr02 1004nozomuhamada
 
2012 mdsp pr01 introduction 0921
2012 mdsp pr01 introduction 09212012 mdsp pr01 introduction 0921
2012 mdsp pr01 introduction 0921nozomuhamada
 
招待講演(鶴岡)
招待講演(鶴岡)招待講演(鶴岡)
招待講演(鶴岡)nozomuhamada
 

Plus de nozomuhamada (9)

2012 mdsp pr08 nonparametric approach
2012 mdsp pr08 nonparametric approach2012 mdsp pr08 nonparametric approach
2012 mdsp pr08 nonparametric approach
 
2012 mdsp pr05 particle filter
2012 mdsp pr05 particle filter2012 mdsp pr05 particle filter
2012 mdsp pr05 particle filter
 
2012 mdsp pr04 monte carlo
2012 mdsp pr04 monte carlo2012 mdsp pr04 monte carlo
2012 mdsp pr04 monte carlo
 
2012 mdsp pr03 kalman filter
2012 mdsp pr03 kalman filter2012 mdsp pr03 kalman filter
2012 mdsp pr03 kalman filter
 
2012 mdsp pr02 1004
2012 mdsp pr02 10042012 mdsp pr02 1004
2012 mdsp pr02 1004
 
2012 mdsp pr01 introduction 0921
2012 mdsp pr01 introduction 09212012 mdsp pr01 introduction 0921
2012 mdsp pr01 introduction 0921
 
Ieice中国地区
Ieice中国地区Ieice中国地区
Ieice中国地区
 
招待講演(鶴岡)
招待講演(鶴岡)招待講演(鶴岡)
招待講演(鶴岡)
 
最終講義
最終講義最終講義
最終講義
 

Dernier

Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsJoaquim Jorge
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonAnna Loughnan Colquhoun
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Drew Madelung
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdfhans926745
 
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot TakeoffStrategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoffsammart93
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfEnterprise Knowledge
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Igalia
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking MenDelhi Call girls
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024The Digital Insurer
 
GenAI Risks & Security Meetup 01052024.pdf
GenAI Risks & Security Meetup 01052024.pdfGenAI Risks & Security Meetup 01052024.pdf
GenAI Risks & Security Meetup 01052024.pdflior mazor
 
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...apidays
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Enterprise Knowledge
 
Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)wesley chun
 
Strategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherStrategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherRemote DBA Services
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsEnterprise Knowledge
 
Partners Life - Insurer Innovation Award 2024
Partners Life - Insurer Innovation Award 2024Partners Life - Insurer Innovation Award 2024
Partners Life - Insurer Innovation Award 2024The Digital Insurer
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking MenDelhi Call girls
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationMichael W. Hawkins
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 

Dernier (20)

Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and Myths
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt Robison
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf
 
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot TakeoffStrategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024
 
GenAI Risks & Security Meetup 01052024.pdf
GenAI Risks & Security Meetup 01052024.pdfGenAI Risks & Security Meetup 01052024.pdf
GenAI Risks & Security Meetup 01052024.pdf
 
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...
 
Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)
 
Strategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherStrategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a Fresher
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
Partners Life - Insurer Innovation Award 2024
Partners Life - Insurer Innovation Award 2024Partners Life - Insurer Innovation Award 2024
Partners Life - Insurer Innovation Award 2024
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day Presentation
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 

2012 mdsp pr10 ica

  • 1. Course Calendar Class DATE Contents 1 Sep. 26 Course information & Course overview 2 Oct. 4 Bayes Estimation 3 〃 11 Classical Bayes Estimation - Kalman Filter - 4 〃 18 Simulation-based Bayesian Methods 5 〃 25 Modern Bayesian Estimation :Particle Filter 6 Nov. 1 HMM(Hidden Markov Model) Nov. 8 No Class 7 〃 15 Bayesian Decision 8 〃 29 Non parametric Approaches 9 Dec. 6 PCA(Principal Component Analysis) 10 〃 13 ICA(Independent Component Analysis) 11 〃 20 Applications of PCA and ICA 12 〃 27 Clustering, k-means et al. 13 Jan. 17 Other Topics 1 Kernel machine. 14 〃 22(Tue) Other Topics 2
  • 2. Lecture Plan Independent Component Analysis -1.Whitening by PCA 1. Introduction Blind Source Separation(BSS) 2. Problem Formulation and independence 3. Whitening + ICA Approach 4. Non-Gaussianity Measure References: [1] A. Hyvarinen et al. “Independent Component Analysis”Wiley-InterScience, 2001
  • 3. 3 - 1. Whitening by PCA (Preparation for ICA approach) Whitened := Uncorrelatedness + Unity variance* PCA is a very useful tool to transform a random vector x to an uncorrelated or whitened z. Matrix V is not uniquely defined, so we have free parameter. (In 2-d case: rotation parameter) * Here we assume all random variables are zero mean PCA gives one solution of whitening issue. z Vx n-vector matrix (unity matrix)z z Vx n n   C = I n-vector covariance matrix x n n Fig. 1
  • 4. 4 PCA whitening method: - Define the covariance matrix - {Eigenvalue, Eigenvector} pairs of Cx - Representation of Cx (*) - Whitening matrix transformation * The matrix E is orthogonal matrix that satisfies EET=ETE=I  T x E xxC 1 2 1 2 T T T x n T n                              e e C e e e EΛE e   { , 1 }i i i n e   1 1 2 2 1 2 1 1 : ( , , )T T z x diag E         z Vx , V = EΛ E C zz I  (1) (2) (3)
  • 5. 1. Introduction: Blind Source Separation (BSS)
    Ex. Source signals and mixed signals (Fig. 2): three sources s_j(t) are observed only through the mixtures
      x_i(t) = sum_{j=1}^{3} a_ij s_j(t),  i = 1, 2, 3.
    BSS problem: recover (separate) the source signals with no prior information on the mixing matrix [a_ij]. A typical BSS problem in the real world is known as the "cocktail party problem."
    Independent component analysis (ICA) exploits the independence of the source signals to solve the BSS problem.
  • 6. ICA solution of BSS (Fig. 3): sources s1(t) and s2(t) pass through the mixing process to mic1 and mic2; the separation process then produces outputs y1(t) and y2(t), adjusted according to their degree of independence.
  • 7. 2. Problem Formulation and Independence
    - Source signals (zero mean): s_j(t), j = 1, ..., n
    - Recorded signals: x_i(t), i = 1, ..., n
    - Linear mixing process (no-delay model*):
        x_i(t) = sum_{j=1}^{n} a_ij s_j(t),  i = 1, ..., n  (4)
      Vector-matrix form:
        x = A s,  where A = [a_ij] is an unknown constant n x n matrix, x = (x_1, ..., x_n)^T, s = (s_1, ..., s_n)^T  (5)
      (* In a real environment, the arrival-time differences between the microphones should be included in the mixing model.)
    Goal: recover the sources s_j(t), j = 1, ..., n, from the mixed signals x_i(t), i = 1, ..., n.
    - The a_ij are unknown.
    - We want to estimate both the a_ij and the s_i(t), under the following assumptions.
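Eqs. (4)-(5) can be simulated directly. The particular sources (a square wave and a sawtooth) and the 2 x 2 matrix A below are illustrative assumptions, not taken from the slides; the point is only that each recording is an instantaneous linear mixture.

```python
import numpy as np

T = 1000
t = np.arange(T)

# Hypothetical non-Gaussian sources s_j(t): a square wave and a sawtooth.
s = np.vstack([np.sign(np.sin(0.05 * t)),   # s_1(t)
               (t % 50) / 25.0 - 1.0])      # s_2(t)
s = s - s.mean(axis=1, keepdims=True)       # zero mean, as assumed in the model

# Eq. (5): x = A s with an unknown, constant n x n mixing matrix A (no delays).
A = np.array([[1.0, 0.8],
              [0.4, 1.0]])
x = A @ s

# Each row of x is one recorded signal x_i(t) = sum_j a_ij s_j(t).
print(x.shape)
```

In a real room the no-delay model would break down (hence the footnote about arrival-time differences), and a convolutive mixing model would be needed instead.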
  • 8. Assumption 1: The source waveforms are statistically independent.
    Assumption 2: The sources have non-Gaussian distributions.
    Assumption 3: The matrix A is square and invertible (for simplicity).
    The estimated A is used to recover the original signals by the inverse (de-mixing) operation:
      s = B x, where B = A^{-1}  (6)
    Possible solutions by ICA: two ambiguities.
    - The variance (amplitude) of the recovered signals cannot be determined: if the pair (a_ij, s_j(t)) is a solution of the underlying BSS problem, then (K a_ij, (1/K) s_j(t)) is also a solution. The variances of the source signals are therefore assumed to be unity: E[s_j^2] = 1.
    - The order of the recovered signals cannot be determined (permutation ambiguity).
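A short sketch of Eq. (6) and the scaling ambiguity. The Laplace sources and the matrices A and K below are assumptions chosen for illustration; the invariance x = (A K^{-1})(K s) holds for any invertible diagonal K, which is why unit source variance is imposed.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical unit-variance non-Gaussian sources (Laplace scale 1/sqrt(2) gives Var = 1).
s = rng.laplace(scale=1.0 / np.sqrt(2), size=(2, 5000))
A = np.array([[1.0, 0.6],
              [0.3, 1.0]])                 # assumed square, invertible mixing matrix
x = A @ s

# Eq. (6): with the true A known, de-mixing B = A^{-1} recovers the sources exactly.
B = np.linalg.inv(A)
print(np.allclose(B @ x, s))

# Scaling ambiguity: the pair (A K^{-1}, K s) produces the identical observations x,
# so amplitudes are unidentifiable and the sources are normalized to E[s_j^2] = 1.
K = np.diag([3.0, -0.5])
print(np.allclose(A @ np.linalg.inv(K) @ (K @ s), x))
```

The sign of each K entry is also free, which is why recovered components are determined only up to sign as well as scale and order.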
  • 9. Basics: Independence and Uncorrelatedness
    Statistical independence of two random variables x and y:
      p_{x,y}(x, y) = p_x(x) p_y(y)  (7)
    Knowing the value of x provides no information about the distribution of y: p(y | x) = p_y(y).
    Uncorrelatedness of two random variables x and y: their covariance is zero, i.e. E[x y] - E[x] E[y] = 0.
    Example: if the zero-mean variables x_1, ..., x_n are mutually uncorrelated, then C_x = E[x x^T] is a diagonal matrix. (Fig. 4)
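A numerical illustration of the gap between the two notions. The example y = x^2 is a standard one (an assumption added here, not from the slides): the covariance vanishes because E[x^3] = 0 for a symmetric x, yet y is a deterministic function of x, so they are maximally dependent.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(100_000)
y = x ** 2                          # y is completely determined by x: dependent

# Covariance E[(x - m_x)(y - m_y)] is (numerically) zero: x and y are uncorrelated.
cov = np.mean((x - x.mean()) * (y - y.mean()))
print(abs(cov) < 0.05)

# But independence fails: conditioning on x changes the distribution of y,
# e.g. the conditional mean of y given |x| > 2 is far above its overall mean.
print(y[np.abs(x) > 2].mean() > y.mean())
```

This is exactly why ICA needs more than decorrelation (whitening): uncorrelatedness constrains only second-order statistics, while independence constrains the full joint density of Eq. (7).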
  • 10. Statistical independence:  p_{x,y}(x, y) = p_x(x) p_y(y)
    Uncorrelatedness:  E[(x - m_x)(y - m_y)] = 0
    - Independence implies uncorrelatedness (the converse does not hold in general).
    - In the Gaussian density case, independence = uncorrelatedness.
  • 11. Examples of Sources and Mixed Signals [uniform densities (sub-Gaussian)]
    - Two independent components s_1, s_2 with the same uniform density:
        p(s_i) = 1/(2 sqrt(3)) for |s_i| <= sqrt(3), and 0 otherwise (i = 1, 2),
      so that Var(s_i) = 1 and p(s_1, s_2) = p(s_1) p(s_2). (Fig. 5)
    - Mixing matrix:
        x = A s,  where A = [a_1 a_2] = [[5, 10], [10, 2]],  i.e. x_1 = 5 s_1 + 10 s_2,  x_2 = 10 s_1 + 2 s_2  (8)
      (Fig. 6: the square joint support of (s_1, s_2) is mapped onto a parallelogram spanned by the columns a_1 and a_2 of A.)
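A quick check of this example: a uniform density on [-sqrt(3), sqrt(3)] has variance (b - a)^2 / 12 = 12/12 = 1, and the mixtures of Eq. (8) are strongly correlated even though the sources are independent. The sample size is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(4)

# Uniform density p(s) = 1/(2 sqrt(3)) on [-sqrt(3), sqrt(3)]: unit variance by design.
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 200_000))
print(np.round(s.var(axis=1), 2))

# Mixing with A from Eq. (8): the square support becomes a parallelogram (Fig. 6)
# and the mixture components acquire a substantial cross-correlation.
A = np.array([[5.0, 10.0],
              [10.0, 2.0]])
x = A @ s
print(np.round(np.corrcoef(x)[0, 1], 2))
```

Analytically the correlation is cov(x_1, x_2) / sqrt(Var(x_1) Var(x_2)) = 70 / sqrt(125 * 104), about 0.61, which the sample estimate reproduces.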
  • 12. [Super-Gaussian densities]
    - Two independent components s_1, s_2 with super-Gaussian densities (sharply peaked at zero, heavy-tailed). (Fig. 7: super-Gaussian source signals and their joint distribution)
    - Mixed signals. (Fig. 8: distribution of the mixed signals; the column directions a_1, a_2 of the mixing matrix appear as the dominant edges of the support)
  • 13. 3. Whitening (PCA) + ICA Approach
    Observed signals: x --whitening--> z --ICA--> s
      z = E D^{-1/2} E^T x = V x
    New mixing matrix: z = V x = V A s = Ã s, where Ã = V A is an orthogonal matrix.**
      ( ** E[z z^T] = Ã E[s s^T] Ã^T = Ã Ã^T = I )
    Question*: Is this a unique solution?
    Answer*: No. For any orthogonal matrix U, the signal y = U z satisfies
      C_y = E[y y^T] = U E[z z^T] U^T = U I U^T = I,
    so y is also a whitened signal.
    Conclusion: whitening gives the independent components only up to an orthogonal transformation.
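The non-uniqueness argument can be verified numerically. Here z is simply a stand-in for an already-whitened signal, and U is an arbitrary 2-d rotation (the free parameter from slide 3); the rotated signal stays white.

```python
import numpy as np

rng = np.random.default_rng(5)
z = rng.standard_normal((2, 50_000))     # stand-in for a whitened signal, E[z z^T] ~ I

# Any orthogonal U keeps the covariance white: C_y = U E[z z^T] U^T = U U^T = I.
theta = 0.7                              # arbitrary rotation angle
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
y = U @ z
C_y = y @ y.T / y.shape[1]
print(np.round(C_y, 2))                  # ~ identity: whitening cannot fix the rotation
```

Second-order statistics are therefore exhausted after whitening; the remaining rotation must be found from higher-order statistics, which is the role of the non-Gaussianity measure below.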
  • 14. Why Gaussian Variables Are Forbidden (Assumption 2)
    Suppose two independent signals s_1 and s_2 are Gaussian, and z = A s with A an orthogonal matrix (so A^{-1} = A^T and s = A^T z):
      p(s_1, s_2) = (1 / (2 pi)) exp(-(s_1^2 + s_2^2) / 2) = (1 / (2 pi)) exp(-||s||^2 / 2)  (9)
      p(z_1, z_2) = (1 / (2 pi)) exp(-(z^T A A^T z) / 2) = (1 / (2 pi)) exp(-||z||^2 / 2)
    The joint Gaussian distribution is invariant with respect to orthogonal transformations: the same density is observed, so no information arises from the orthogonal transformation. This means we cannot identify the orthogonal matrix A from the mixed data.
  • 15. Maximization of Non-Gaussianity
    For a given density p(x), we define a measure of non-Gaussianity NG(p(x)) (non-negative, and = 0 if p(x) is Gaussian).  (10)
    [Intuitive interpretation of ICA as non-Gaussianity maximization]
    x = A s, where the s_i are non-Gaussian and A is unknown. Consider a linear combination of the observations:
      y = b^T x = b^T A s = q^T s, where q = A^T b  (11)
    NG(p_y), viewed as a function of b, is maximized when y = q^T s coincides with one of the sources s_i.
    We need to answer the following two questions:
    1) How can the non-Gaussianity of y be measured?
    2) How can we compute the values of b (the rows of B) that maximize the measure?
  • 16. Mixing reduces non-Gaussianity: for y = q_1 s_1 + q_2 s_2, NG(p_y) is smaller than for y = q_1 s_1 or y = q_2 s_2 alone. Maximizing NG by adjusting b therefore drives y toward q_i s_i, i.e. toward a single source.
    4. Measure of Non-Gaussianity
    Kurtosis is a classical measure of non-Gaussianity:
      kurt(p(y)) := E[y^4] - 3 (E[y^2])^2  (12)
      kurt > 0: super-Gaussian;  kurt = 0: Gaussian;  kurt < 0: sub-Gaussian
    The absolute value of the kurtosis can be used as a measure of non-Gaussianity, and the optimization problem
      max_b J(b),  J(b) := |kurt(p_y)|  (13)
    is solved as an ICA solution.
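A toy end-to-end run of Eqs. (12)-(13) in two dimensions. The uniform sources and the mixing matrix reuse the slide 11 example; the brute-force scan over rotation angles of the whitened data is an illustrative substitute for a proper gradient or fixed-point algorithm (e.g. FastICA), workable only because after whitening b can be parameterized by a single angle.

```python
import numpy as np

rng = np.random.default_rng(6)

def kurt(y):
    # Eq. (12) for zero-mean y: kurt = E[y^4] - 3 (E[y^2])^2
    return np.mean(y ** 4) - 3 * np.mean(y ** 2) ** 2

# Two independent sub-Gaussian (uniform, unit-variance) sources, mixed as in Eq. (8).
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 100_000))
x = np.array([[5.0, 10.0], [10.0, 2.0]]) @ s

# PCA whitening (slide 13): z = E D^{-1/2} E^T x.
lam, E = np.linalg.eigh(x @ x.T / x.shape[1])
z = E @ np.diag(lam ** -0.5) @ E.T @ x

# Eq. (13): scan unit vectors b = (cos a, sin a) for maximal J(b) = |kurt(b^T z)|.
angles = np.linspace(0.0, np.pi, 361)
J = [abs(kurt(np.array([np.cos(a), np.sin(a)]) @ z)) for a in angles]
b = angles[int(np.argmax(J))]
y = np.array([np.cos(b), np.sin(b)]) @ z

# y should align with one source, up to sign and scale (the ICA ambiguities).
c = np.abs([np.corrcoef(y, s[0])[0, 1], np.corrcoef(y, s[1])[0, 1]])
print(np.round(c.max(), 2))
```

For unit-variance uniform sources the kurtosis is -1.2, so after whitening |kurt(q^T s)| = 1.2 (q_1^4 + q_2^4) with ||q|| = 1, which is maximized exactly when q lands on a coordinate axis, i.e. when y is a single source.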