Consensual gene co-expression network inference with multiple samples

Nathalie Villa-Vialaneix (1,2)
http://www.nathalievilla.org
nathalie.villa@univ-paris1.fr

Joint work with Magali SanCristobal and Laurence Liaubet

Biostatistics working group - 19 March 2013

Consensus LASSO (INRA de Toulouse, MIAT) Nathalie Villa-Vialaneix Toulouse, 19 mars 2013 1 / 21
Outline
1 Overview on network inference
2 Graphical Gaussian Models
3 Inference with multiple samples
4 Illustration
Overview on network inference

Framework
Data: large-scale gene expression data. The matrix X has n ≃ 30-50 rows (individuals) and p ≃ 10^3-10^4 columns (variables: gene expressions), with entries X_i^j.

What we want to obtain: a graph/network with
• nodes: genes;
• edges: "significant" and direct co-expression between two genes (to track transcription regulations).
Modeling multiple interactions between genes with a network

Co-expression networks
• nodes: genes;
• edges: "direct" co-expression between two genes.

Method: "correlations" → thresholding → graph
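The "correlations → thresholding → graph" pipeline fits in a few lines. A minimal Python sketch (the toy data, the 0.5 threshold, and the use of numpy are illustrative assumptions, not the tooling actually used on real data):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 6
X = rng.normal(size=(n, p))     # expression matrix: n individuals x p genes
X[:, 1] += 2 * X[:, 0]          # make genes 0 and 1 strongly co-expressed

C = np.corrcoef(X, rowvar=False)   # step 1: "correlations"
A = np.abs(C) > 0.5                # step 2: thresholding (0.5 is arbitrary)
np.fill_diagonal(A, False)         # step 3: graph, as an adjacency matrix
print(A.astype(int))
```

The adjacency matrix is symmetric with no self-loops; the edge between genes 0 and 1 survives the threshold.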
Correlations/Partial correlations

A strong indirect correlation: y and z are both driven by x, so they are strongly correlated even though they do not interact directly.

set.seed(2807); x <- runif(100)
y <- 2*x+1 + rnorm(100,0,0.1); cor(x,y)  # [1] 0.9870407
z <- -x+2 + rnorm(100,0,0.1); cor(x,z)   # [1] -0.9443082
cor(y,z)                                 # [1] -0.9336924

Partial correlation Cor(z, y | x): the correlation between the residuals once x is accounted for.

cor(lm(y ~ x)$residuals, lm(z ~ x)$residuals)  # [1] -0.03071178
Advantages of a network approach
1 over raw data and correlation networks (relevance networks, [Butte and Kohane, 1999]): focuses on direct links;
2 over raw data (again): focuses on "significant" links (more robust);
3 over bibliographic networks: can handle interactions with yet unknown (not annotated) genes.
Graphical Gaussian Models
Theoretical framework
Gaussian Graphical Models (GGM): the gene expressions are modeled as X ∼ N(0, Σ).

Seminal work [Schäfer and Strimmer, 2005], R package GeneNet: estimation of the partial correlations

  π_{jj'} = Cor(X^j, X^{j'} | X^k, k ≠ j, j')

from the concentration matrix S = Σ^{-1}:

  π_{jj'} = − S_{jj'} / √(S_{jj} S_{j'j'}).

Main issue: p ≫ n ⇒ Σ is badly conditioned ⇒ estimating S from Σ^{-1} is a bad idea...

Schäfer & Strimmer's proposal:
1 use Σ + λI rather than Σ to estimate S;
2 select only the most significant S_{jj'} (Bayesian test):

  S ∼ (1 − η0) fA + η0 f0

with f0 the distribution of the "null" edges and η0 the proportion of null edges among the partial correlation values (close to 1).
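The two displayed formulas can be checked numerically in a few lines. A minimal Python sketch of the idea (the λ value, sample size, and toy chain data are arbitrary choices; this is not the actual GeneNet estimator, which adds the Bayesian edge test):

```python
import numpy as np

rng = np.random.default_rng(2807)
n = 500
x = rng.normal(size=n)
y = 2 * x + rng.normal(scale=0.5, size=n)   # y driven by x
z = -x + rng.normal(scale=0.5, size=n)      # z driven by x; y, z only indirectly linked
X = np.column_stack([x, y, z])

sigma = np.cov(X, rowvar=False)
lam = 0.01                                   # regularize before inverting: Sigma + lambda*I
S = np.linalg.inv(sigma + lam * np.eye(3))   # concentration matrix S

d = np.sqrt(np.diag(S))
pcor = -S / np.outer(d, d)                   # pi_jj' = -S_jj' / sqrt(S_jj S_j'j')
np.fill_diagonal(pcor, 1.0)

print(np.corrcoef(y, z)[0, 1])  # strong marginal (indirect) correlation
print(pcor[1, 2])               # near-zero partial correlation of y and z given x
```

The marginal correlation between y and z is large, while their partial correlation given x is close to zero: exactly the distinction motivating GGMs.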
Sparse regression approach
[Meinshausen and Bühlmann, 2006, Friedman et al., 2008] Partial correlations can also be estimated by using linear models: ∀ j,

  X^j = β_j^T X^{-j} + ε.

In the Gaussian framework: β_{jj'} = − S_{jj'} / S_{jj}.

Independent regressions:

  max_{(β_{jj'})_{j'}}  log ML_j − λ Σ_{j'≠j} |β_{jj'}|

with log ML_j ∼ − Σ_{i=1}^n ( X_i^j − Σ_{j'≠j} β_{jj'} X_i^{j'} )².

Global approach: graphical LASSO (R package glasso):

  max_{(β_{jj'})_{jj'}}  Σ_j log ML_j − λ Σ_{j'≠j} |β_{jj'}|

Consequence: the sparse penalty yields β_{jj'} = 0 for most coefficients ("all-in-one" approach: no thresholding step needed).
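The "independent regressions" idea maps directly onto any off-the-shelf LASSO solver. A Python sketch of Meinshausen-Bühlmann-style neighbourhood selection (scikit-learn's Lasso stands in for the R tooling; the toy hub data, the alpha value, and the AND symmetrization rule are illustrative choices, not prescribed by the slides):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n = 200
hub = rng.normal(size=n)
# gene 0 is a hub; genes 1-4 are noisy copies of it (a "star" network)
X = np.column_stack([hub] + [hub + rng.normal(scale=0.5, size=n) for _ in range(4)])

p = X.shape[1]
selected = np.zeros((p, p), dtype=bool)
for j in range(p):
    others = np.delete(np.arange(p), j)
    # sparse regression of X^j on all the other genes
    beta = Lasso(alpha=0.1).fit(X[:, others], X[:, j]).coef_
    selected[j, others] = beta != 0

# AND rule: keep an edge only if it is selected in both directions
edges = selected & selected.T
print(edges.astype(int))
```

No thresholding step is needed: the L1 penalty itself sets most coefficients exactly to zero, and the surviving nonzero coefficients propose the edges.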
Other methods/packages to infer networks
• relevance (correlation) networks: R package WGCNA;
• Bayesian networks: R package bnlearn [Pearl, 1998, Pearl and Russel, 2002, Scutari, 2010];
• networks based on mutual information: R package minet [Meyer et al., 2008];
• networks based on random forests [Huynh-Thu et al., 2010].

See also:
• http://cran.r-project.org/web/views/gR.html (CRAN task view on graphical methods)
• https://www.coursera.org/course/pgm (Daphne Koller's on-line course on "Probabilistic Graphical Models", starts on April 8th)
• https://www.coursera.org/course/netsysbio (on-line course on "Network Analysis in Systems Biology")
Inference with multiple samples
Multiple networks inference
Transcriptomic data coming from several different conditions. Examples:
• gene expression from pig muscle in Landrace and Large White breeds;
• gene expression from obese humans before and after a diet.

• Assumption: a common functioning exists regardless of the condition;
• Question: which genes are correlated independently from / depending on the condition?
Dataset description
"DeLiSus" dataset
• variables: expression of 81 genes (selected by Laurence);
• conditions: two breeds (33 "Landrace" and 51 "Large White"; 84 pigs).
Multiple networks
Independent estimations: if c = 1, ..., C are different samples (or "conditions", e.g., breeds or before/after diet...):

  max_{(β^c_{jk})_{k≠j}, c=1,...,C}  Σ_c log ML^c_j − λ Σ_{k≠j} |β^c_{jk}|.

Joint estimations: implemented in the R package simone [Chiquet et al., 2011]:
GroupLasso: consensual network between conditions (enforces identical edges by a group LASSO penalty);
CoopLasso: sign-coherent network between conditions (prevents edges that correspond to partial correlations having different signs; thus allows one to obtain a few differences between the conditions);
Intertwined: in GLasso, replace Σ^c by (1/2) Σ^c + (1/2) Σ̃ where Σ̃ = (1/C) Σ_c Σ^c.
Consensus LASSO
Proposal: infer multiple networks by forcing them toward a consensual network.

Original optimization:

  max_{(β^c_{jk})_{k≠j}, c=1,...,C}  Σ_c log ML^c_j − λ Σ_{k≠j} |β^c_{jk}|.

Add a constraint to force inference toward a consensus β^cons:

  max_{(β^c_{jk})_{k≠j}, c=1,...,C}  Σ_c log ML^c_j − λ Σ_{k≠j} |β^c_{jk}| − µ Σ_c w_c ‖β^c_j − β^cons_j‖²

Examples:
• β^cons_j = β^{c*}_j with c* = arg min_c |β^c_j| (network intersection);
• β^cons_j = Σ_c (n_c / n) β^c_j ("average" network).
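To make the two consensus choices concrete, here is a small numeric sketch (the coefficient vectors are invented toy numbers; using the L1 norm for the "intersection" choice and w_c = n_c/n as weights are assumptions about notation the slide leaves implicit):

```python
import numpy as np

# toy per-condition coefficient vectors beta^c_j for a single gene j
betas = {"Landrace":    np.array([0.8, 0.0, -0.3, 0.5]),
         "Large White": np.array([0.6, 0.1, -0.4, 0.0])}
n_c = {"Landrace": 33, "Large White": 51}
n = sum(n_c.values())

# "average" network: beta_cons_j = sum_c (n_c / n) beta^c_j
beta_avg = sum((n_c[c] / n) * b for c, b in betas.items())

# "intersection" network: beta_cons_j = beta^{c*}_j, c* = argmin_c |beta^c_j|
c_star = min(betas, key=lambda c: np.linalg.norm(betas[c], 1))
beta_inter = betas[c_star]

# the mu-term of the objective penalizes distance to the consensus
penalty = sum(n_c[c] / n * np.sum((b - beta_avg) ** 2) for c, b in betas.items())
print(beta_avg, c_star, penalty)
```

Whichever consensus is chosen, the µ-penalty pulls the per-condition networks toward it while the λ-penalty keeps each of them sparse.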
In practice...
β^cons_j = Σ_c (n_c / n) β^c_j is a good choice because:
• ∂β^cons_j / ∂β^c_j exists;
• thus, solving the optimization problem is equivalent to minimizing

  (1/2) β_j^T S_j(µ) β_j + β_j^T Σ_{j,\j} + λ Σ_c (1/n_c) ‖β^c_j‖_1

with Σ_{j,\j} the jth row of the empirical covariance matrix deprived of its jth column, and S_j(µ) = Σ_{\j,\j} + 2µ A^T A, where Σ_{\j,\j} is the empirical covariance matrix deprived of its jth row and column and A is a matrix that does not depend on j.

This is a standard LASSO problem that can be solved using a sub-gradient method (as described in [Chiquet et al., 2011] and already implemented in the beta R package therese).
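The per-gene problem above is an L1-penalized quadratic. As a generic illustration of why such problems produce exact zeros (this is proximal gradient / ISTA on invented numbers, not the sub-gradient scheme of [Chiquet et al., 2011] implemented in therese):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_quadratic(Q, q, lam, iters=500):
    """Minimize 0.5 * b'Qb + q'b + lam * ||b||_1 by proximal gradient (ISTA)."""
    L = np.linalg.eigvalsh(Q).max()   # step size 1/L, L = Lipschitz constant
    b = np.zeros(len(q))
    for _ in range(iters):
        b = soft_threshold(b - (Q @ b + q) / L, lam / L)
    return b

Q = np.array([[2.0, 0.3], [0.3, 1.0]])   # plays the role of S_j(mu)
q = np.array([-1.0, 0.05])               # plays the role of the covariance row
b = lasso_quadratic(Q, q, lam=0.2)
print(b)  # the second coefficient is driven exactly to zero
```

The soft-thresholding step is what sets small coefficients exactly to zero, which is how the sparse penalty removes edges without any explicit thresholding.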
Illustration
Datasets description
"DeLiSus" dataset
• variables: expression of 26 genes (selected by Laurence);
• conditions: two breeds (33 "Landrace" and 51 "Large White"; 84 pigs).

Methodology
• package GeneNet: networks are estimated independently by a GGM approach (edges selected based on the p-value of a Bayesian test);
• consensus LASSO: µ fixed and λ varied along a regularization path; an instance of the path is selected based on the number of edges (similar to GeneNet).
Conclusion
... much is left to do:
• biological validation,
• selecting λ (AIC and BIC are way too restrictive...),
• tuning µ,
• other comparisons...
References
Butte, A. and Kohane, I. (1999).
Unsupervised knowledge discovery in medical databases using relevance networks.
In Proceedings of the AMIA Symposium, pages 711–715.
Chiquet, J., Grandvalet, Y., and Ambroise, C. (2011).
Inferring multiple graphical structures.
Statistics and Computing, 21(4):537–553.
Friedman, J., Hastie, T., and Tibshirani, R. (2008).
Sparse inverse covariance estimation with the graphical lasso.
Biostatistics, 9(3):432–441.
Huynh-Thu, V., Irrthum, A., Wehenkel, L., and Geurts, P. (2010).
Inferring regulatory networks from expression data using tree-based methods.
PLoS ONE, 5(9):e12776.
Meinshausen, N. and Bühlmann, P. (2006).
High dimensional graphs and variable selection with the lasso.
Annals of Statistics, 34(3):1436–1462.
Meyer, P., Lafitte, F., and Bontempi, G. (2008).
minet: a R/Bioconductor package for inferring large transcriptional networks using mutual information.
BMC Bioinformatics, 9(461).
Pearl, J. (1998).
Probabilistic reasoning in intelligent systems: networks of plausible inference.
Morgan Kaufmann, San Francisco, California, USA.
Pearl, J. and Russel, S. (2002).
Bayesian Networks.
Bradford Books (MIT Press), Cambridge, Massachusetts, USA.
Schäfer, J. and Strimmer, K. (2005).
An empirical Bayes approach to inferring large-scale gene association networks.
Bioinformatics, 21(6):754–764.
Scutari, M. (2010).
Learning Bayesian networks with the bnlearn R package.
Journal of Statistical Software, 35(3):1–22.