Maximizing the spectral gap of networks produced by node removal

Naoki Masuda (University of Tokyo, Japan)

Refs:
1. Watanabe & Masuda, Physical Review E, 82, 046102 (2010)
2. Masuda, Fujie & Murota, In: Complex Networks IV, Studies in Computational Intelligence, 476, 155-163 (2013)

Collaborators:
Takamitsu Watanabe (University of Tokyo, Japan)
Tetsuya Fujie (University of Hyogo, Japan)
Kazuo Murota (University of Tokyo, Japan)
Laplacian of a network

$\dot{x}(t) = -L x(t)$

Example (4-node network with links 1-2, 1-4, 2-4, 3-4):

$\dot{x}_1 = -2x_1 + x_2 + x_4 = (x_2 - x_1) + (x_4 - x_1)$

$L = \begin{pmatrix} 2 & -1 & 0 & -1 \\ -1 & 2 & 0 & -1 \\ 0 & 0 & 1 & -1 \\ -1 & -1 & -1 & 3 \end{pmatrix}$

Eigenvalues: $\lambda_1 = 0 < \lambda_2 \le \lambda_3 \le \cdots \le \lambda_N$
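As a numerical check, the Laplacian and spectral gap of the 4-node example can be computed with NumPy (a minimal sketch; the adjacency matrix encodes the links 1-2, 1-4, 2-4, 3-4 read off from L):

```python
import numpy as np

# Adjacency matrix of the 4-node example network (links 1-2, 1-4, 2-4, 3-4).
A = np.array([[0, 1, 0, 1],
              [1, 0, 0, 1],
              [0, 0, 0, 1],
              [1, 1, 1, 0]])

# Unnormalized Laplacian L = D - A.
L = np.diag(A.sum(axis=1)) - A

# L is symmetric, so eigvalsh returns real eigenvalues in ascending order.
eigs = np.linalg.eigvalsh(L)
spectral_gap = eigs[1]  # lambda_2

print(eigs)          # smallest eigenvalue is 0 (connected network)
print(spectral_gap)
```

For this network the spectrum is {0, 1, 3, 4}, so the spectral gap is λ2 = 1.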
Spectral gap

• If λ2 is large, diffusive dynamical processes on networks occur faster. Examples: synchronization, collective opinion formation, random walks.
• Note: the unnormalized Laplacian is used here.
• Problem: maximize λ2 by removing Ndel out of the N nodes, by two methods:
  • sequential node removal + perturbative method (Watanabe & Masuda, 2010);
  • semidefinite programming (Masuda, Fujie & Murota, 2013).
• Note: removal of links always decreases λ2 (Milanese, Sun & Nishikawa, 2010; Nishikawa & Motter, 2010).
Perturbative method

• Extends the analogous method for adjacency matrices (Restrepo, Ott & Hunt, 2008).
• Much faster than the brute-force method.

$L u = \lambda_2 u$

$(L + \Delta L)(u + \Delta u) = (\lambda_2 + \Delta\lambda_2)(u + \Delta u)$

Removing node i corresponds to $u \to u - u_i \hat{e}_i$, where $\hat{e}_i \equiv (0, \ldots, 0, \underset{i}{1}, 0, \ldots, 0)$.

$\Longrightarrow \quad \Delta\lambda_2 \approx \frac{\sum_{j \in N_i} u_j (u_i - u_j)}{1 - u_i^2}$

Select the node i that maximizes Δλ2.
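A minimal sketch of the perturbative selection rule in NumPy, assuming the first-order estimate $\Delta\lambda_2 \approx \sum_{j \in N_i} u_j(u_i - u_j)/(1 - u_i^2)$; the function name is illustrative:

```python
import numpy as np

def perturbative_node_choice(A):
    """Pick the node whose removal is predicted, to first order,
    to increase the spectral gap lambda_2 the most.

    A: symmetric 0/1 adjacency matrix of a connected network.
    """
    L = np.diag(A.sum(axis=1)) - A
    _, eigvecs = np.linalg.eigh(L)
    u = eigvecs[:, 1]  # normalized eigenvector of lambda_2 (Fiedler vector)

    best_i, best_gain = None, -np.inf
    for i in range(len(A)):
        neighbors = np.nonzero(A[i])[0]
        # First-order estimate of Delta lambda_2 on removing node i.
        gain = np.sum(u[neighbors] * (u[i] - u[neighbors])) / (1 - u[i] ** 2)
        if gain > best_gain:
            best_i, best_gain = i, gain
    return best_i, best_gain

# Example: the 4-node network used earlier (links 1-2, 1-4, 2-4, 3-4).
A = np.array([[0, 1, 0, 1],
              [1, 0, 0, 1],
              [0, 0, 0, 1],
              [1, 1, 1, 0]])
i, gain = perturbative_node_choice(A)
print(i, gain)
```

In the sequential scheme, this selection would be repeated Ndel times, recomputing the eigenvector after each actual removal.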
Results: model networks (N = 250, <k> = 10)

[Figure: normalized λ2 as a function of the fraction f of removed nodes (0 ≤ f ≤ 0.5) for four model networks (Goh, WS, HKBA, ER), comparing perturbative, betweenness-based, degree-based, and optimal sequential node removal.]
Results: real networks

[Figure: λ2 as a function of the fraction f of removed nodes (0 ≤ f ≤ 0.5) for four real networks — C. elegans (N = 279, <k> = 16.4), e-mail (N = 1133, <k> = 9.62), macaque (N = 71, <k> = 12.3), E. coli (N = 2268, <k> = 4.96) — comparing perturbative, betweenness-based, degree-based, and optimal sequential node removal.]
Conclusions

• Careful node removal can increase the spectral gap.
• For a variety of networks, the perturbative strategy works well at a reduced computational cost.
• Ref: Watanabe & Masuda, Physical Review E, 82, 046102 (2010)
However,

• The sequentially optimal removal may not be optimal for Ndel ≥ 2.
• Pursuing the truly optimal solution is an obvious combinatorial problem.
Semidefinite programming

$F_0, \ldots, F_n$: symmetric matrices

$\min \sum_{i=1}^{n} c_i x_i \quad \text{subject to} \quad F_0 + \sum_{i=1}^{n} x_i F_i \succeq 0$

Eigenvalue minimization using SDP

$F(x_1, \ldots, x_n) = F_0 + \sum_{i=1}^{n} x_i F_i \quad (\text{eigenvalues: } \lambda_1 \le \cdots \le \lambda_n)$

min t subject to

$tI - F(x_1, \ldots, x_n) \succeq 0 \quad (\text{eigenvalues: } t - \lambda_n \le \cdots \le t - \lambda_1)$
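For a fixed x, the smallest feasible t in the eigenvalue-minimization form is exactly the largest eigenvalue of F(x); a quick numerical check with illustrative matrices (not taken from the slides):

```python
import numpy as np

# F(x) = F0 + x1*F1 + x2*F2 with illustrative symmetric matrices.
F0 = np.array([[2.0, 0.5], [0.5, 1.0]])
F1 = np.array([[1.0, 0.0], [0.0, -1.0]])
F2 = np.array([[0.0, 1.0], [1.0, 0.0]])

def F(x):
    return F0 + x[0] * F1 + x[1] * F2

x = np.array([0.3, -0.2])
lam_max = np.linalg.eigvalsh(F(x))[-1]

# t*I - F(x) is positive semidefinite exactly when t >= lam_max(F(x)).
def feasible(t, x, tol=1e-9):
    return np.linalg.eigvalsh(t * np.eye(2) - F(x))[0] >= -tol

print(feasible(lam_max, x), feasible(lam_max - 0.01, x))  # → True False
```

Minimizing t over both t and x therefore minimizes the largest eigenvalue of F(x), which is what an SDP solver exploits.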
Difficulties in our case

• Discreteness: xi ∈ {0, 1}.
• Ndel irrelevant zero eigenvalues appear (one for each removed node).
• We are not interested in the trivial zero eigenvalue λ1 = 0 either.
• So, let's start with the following problem:
max t subject to

$-tI + \sum_{i<j;\,(i,j)\in E} x_i x_j \tilde{L}_{ij} + \alpha J + \beta \sum_{i=1}^{N} (1 - x_i) E_i \succeq 0$

$\sum_{i=1}^{N} x_i = N - N_{\mathrm{del}}, \quad x_i \in \{0, 1\}$

where $E_i = \mathrm{diag}(0, \ldots, 0, \underset{i}{1}, 0, \ldots, 0)$ and $L = \sum_{1 \le i < j \le N;\,(i,j)\in E} \tilde{L}_{ij}$.

λ1 = 0 → λ1′ = α
New zero eigenvalue → β
But: a nonlinear constraint.
(Lovász, 1979; Grötschel, Lovász & Schrijver, 1986; Lovász & Schrijver, 1991)
• Xij, where (i, j) is not a link, is a "free" variable.
• We can reduce the number of variables using Xii = xi. But O(N²) terms still exist, and the algorithm runs slowly.
• For a technical reason, we set α = β/N.
• Challenges:
  • discreteness of xi → "relax" the problem;
  • nonlinear constraint → introduce new variables $X_{ij} \equiv x_i x_j$.
SDP1: max t subject to

$-tI + \sum_{i<j;\,(i,j)\in E} X_{ij} \tilde{L}_{ij} + \alpha J + \beta \sum_{i=1}^{N} (1 - x_i) E_i \succeq 0$

$\sum_{i=1}^{N} x_i = N - N_{\mathrm{del}}$

$Y \equiv \begin{pmatrix} 1 & x^{\top} \\ x & X \end{pmatrix} \succeq 0$

$0 \le x_i (= X_{ii}) \le 1 \quad (1 \le i \le N)$ ← actually not needed

This relaxes the original nonlinear problem, in which $X_{ij}$ stood for the product $x_i x_j$.
An improved method, SDP2: "local relaxation"

$-tI + \sum_{i<j;\,(i,j)\in E} X_{ij} \tilde{L}_{ij} + \alpha J + \beta \sum_{i=1}^{N} (1 - x_i) E_i \succeq 0$

For a link (1, 2), for example:

$x_1 x_2 \ge 0 \;\Rightarrow\; X_{12} \ge 0$
$x_1 (1 - x_2) \ge 0 \;\Rightarrow\; x_1 - X_{12} \ge 0$
$(1 - x_1) x_2 \ge 0 \;\Rightarrow\; x_2 - X_{12} \ge 0$
$(1 - x_1)(1 - x_2) \ge 0 \;\Rightarrow\; 1 - x_1 - x_2 + X_{12} \ge 0$
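These four products yield the linear inequalities imposed on each triple (x1, x2, X12). A quick check that they are valid for every binary assignment, yet loose for fractional values (a sketch):

```python
from itertools import product

def local_relaxation_ok(x1, x2, X12):
    """The four linear inequalities that replace the nonlinear
    identity X12 = x1 * x2 in the local relaxation."""
    return (X12 >= 0
            and x1 - X12 >= 0
            and x2 - X12 >= 0
            and 1 - x1 - x2 + X12 >= 0)

# For binary x1, x2, the true product X12 = x1*x2 always satisfies
# them, so the relaxation never cuts off a feasible 0/1 solution.
for x1, x2 in product([0, 1], repeat=2):
    assert local_relaxation_ok(x1, x2, x1 * x2)

# The relaxation is not exact for fractional values: X12 = 0.5 is
# allowed with x1 = x2 = 0.5, although the true product is 0.25.
print(local_relaxation_ok(0.5, 0.5, 0.5))  # → True
```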
Intuitive comparison

• Consider N = 1 (unrealistic, but illustrative).
• SDP1:

$Y = \begin{pmatrix} 1 & x^{\top} \\ x & X \end{pmatrix} = \begin{pmatrix} 1 & x_1 \\ x_1 & X_{11} \end{pmatrix} \succeq 0 \iff X_{11} \ge x_1^2$

• Note: in fact, X11 = x1.
• SDP2:

$\begin{cases} X_{ij} \ge 0 \\ x_i - X_{ij} \ge 0 \\ x_j - X_{ij} \ge 0 \\ 1 - x_i - x_j + X_{ij} \ge 0 \end{cases}$ with $i = j = 1$ $\Longrightarrow$ $\begin{cases} X_{11} \ge 0 \\ X_{11} \le x_1 \\ X_{11} \ge 2x_1 - 1 \end{cases}$

• Linear!
• Number of variables reduced.
• Size of the SDP part reduced.
• Constraint 0 ≤ xi ≤ 1 unnecessary.

SDP2: max t subject to

$-tI + \sum_{i<j;\,(i,j)\in E} X_{ij} \tilde{L}_{ij} + \alpha J + \beta \sum_{i=1}^{N} (1 - x_i) E_i \succeq 0,$

$\sum_{i=1}^{N} x_i = N - N_{\mathrm{del}},$

for links $(i, j)$: $\begin{cases} X_{ij} \ge 0 \\ x_i - X_{ij} \ge 0 \\ x_j - X_{ij} \ge 0 \\ 1 - x_i - x_j + X_{ij} \ge 0 \end{cases}$
Small networks

Karate club (N = 34, 78 links, β = 2). Data: Zachary (1977)
Macaque cortical network (N = 71, 438 links, β = 2). Data: Sporns & Zwi (2004)

[Figure: (a) karate club, (b) macaque cortical network; λ2 as a function of Ndel (0-20), comparing sequential, SDP1, and SDP2.]
Relatively large networks

BA model (scale-free network) (N = 150, 297 links, β = 2)
C. elegans neural network (N = 297, 2287 links, β = 2.5). Data: Chen et al. (2006)

[Figure: (c) BA model, (d) C. elegans; λ2 as a function of Ndel, comparing sequential and SDP2.]

Observation: SDP1/SDP2 may work better for sparse networks.
Possible directions

• Deliberately violate convexity:
  • replace (1 − xi) with (1 − xi)^p in the constraint below, and increase p gradually from p = 1, solving by the Newton method;
  • parameter tuning?

$-tI + \sum_{i<j;\,(i,j)\in E} X_{ij} \tilde{L}_{ij} + \alpha J + \beta \sum_{i=1}^{N} (1 - x_i) E_i \succeq 0$

Contenu connexe

Tendances

Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...Leo Asselborn
 
Back Propagation in Deep Neural Network
Back Propagation in Deep Neural NetworkBack Propagation in Deep Neural Network
Back Propagation in Deep Neural NetworkVARUN KUMAR
 
Random Matrix Theory and Machine Learning - Part 2
Random Matrix Theory and Machine Learning - Part 2Random Matrix Theory and Machine Learning - Part 2
Random Matrix Theory and Machine Learning - Part 2Fabian Pedregosa
 
Random Matrix Theory and Machine Learning - Part 4
Random Matrix Theory and Machine Learning - Part 4Random Matrix Theory and Machine Learning - Part 4
Random Matrix Theory and Machine Learning - Part 4Fabian Pedregosa
 
Zero relations
Zero relationsZero relations
Zero relationsSchool
 
Random Matrix Theory and Machine Learning - Part 1
Random Matrix Theory and Machine Learning - Part 1Random Matrix Theory and Machine Learning - Part 1
Random Matrix Theory and Machine Learning - Part 1Fabian Pedregosa
 
Random Matrix Theory and Machine Learning - Part 3
Random Matrix Theory and Machine Learning - Part 3Random Matrix Theory and Machine Learning - Part 3
Random Matrix Theory and Machine Learning - Part 3Fabian Pedregosa
 
Convolution Neural Network
Convolution Neural NetworkConvolution Neural Network
Convolution Neural NetworkAmit Kushwaha
 
Kdd12 tutorial-inf-part-iii
Kdd12 tutorial-inf-part-iiiKdd12 tutorial-inf-part-iii
Kdd12 tutorial-inf-part-iiiLaks Lakshmanan
 
Generalization of Compositons of Cellular Automata on Groups
Generalization of Compositons of Cellular Automata on GroupsGeneralization of Compositons of Cellular Automata on Groups
Generalization of Compositons of Cellular Automata on GroupsYoshihiro Mizoguchi
 
類神經網路、語意相似度(一個不嫌少、兩個恰恰好)
類神經網路、語意相似度(一個不嫌少、兩個恰恰好)類神經網路、語意相似度(一個不嫌少、兩個恰恰好)
類神經網路、語意相似度(一個不嫌少、兩個恰恰好)Ming-Chi Liu
 
Table 1
Table 1Table 1
Table 1butest
 
Lesson 27: Integration by Substitution (Section 4 version)
Lesson 27: Integration by Substitution (Section 4 version)Lesson 27: Integration by Substitution (Section 4 version)
Lesson 27: Integration by Substitution (Section 4 version)Matthew Leingang
 

Tendances (20)

Eye deep
Eye deepEye deep
Eye deep
 
18 Sampling Mean Sd
18 Sampling Mean Sd18 Sampling Mean Sd
18 Sampling Mean Sd
 
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...
 
Back Propagation in Deep Neural Network
Back Propagation in Deep Neural NetworkBack Propagation in Deep Neural Network
Back Propagation in Deep Neural Network
 
Random Matrix Theory and Machine Learning - Part 2
Random Matrix Theory and Machine Learning - Part 2Random Matrix Theory and Machine Learning - Part 2
Random Matrix Theory and Machine Learning - Part 2
 
2018 MUMS Fall Course - Mathematical surrogate and reduced-order models - Ral...
2018 MUMS Fall Course - Mathematical surrogate and reduced-order models - Ral...2018 MUMS Fall Course - Mathematical surrogate and reduced-order models - Ral...
2018 MUMS Fall Course - Mathematical surrogate and reduced-order models - Ral...
 
Random Matrix Theory and Machine Learning - Part 4
Random Matrix Theory and Machine Learning - Part 4Random Matrix Theory and Machine Learning - Part 4
Random Matrix Theory and Machine Learning - Part 4
 
Zero relations
Zero relationsZero relations
Zero relations
 
DCT
DCTDCT
DCT
 
Random Matrix Theory and Machine Learning - Part 1
Random Matrix Theory and Machine Learning - Part 1Random Matrix Theory and Machine Learning - Part 1
Random Matrix Theory and Machine Learning - Part 1
 
Random Matrix Theory and Machine Learning - Part 3
Random Matrix Theory and Machine Learning - Part 3Random Matrix Theory and Machine Learning - Part 3
Random Matrix Theory and Machine Learning - Part 3
 
2018 MUMS Fall Course - Sampling-based techniques for uncertainty propagation...
2018 MUMS Fall Course - Sampling-based techniques for uncertainty propagation...2018 MUMS Fall Course - Sampling-based techniques for uncertainty propagation...
2018 MUMS Fall Course - Sampling-based techniques for uncertainty propagation...
 
Convolution Neural Network
Convolution Neural NetworkConvolution Neural Network
Convolution Neural Network
 
Kdd12 tutorial-inf-part-iii
Kdd12 tutorial-inf-part-iiiKdd12 tutorial-inf-part-iii
Kdd12 tutorial-inf-part-iii
 
Generalization of Compositons of Cellular Automata on Groups
Generalization of Compositons of Cellular Automata on GroupsGeneralization of Compositons of Cellular Automata on Groups
Generalization of Compositons of Cellular Automata on Groups
 
類神經網路、語意相似度(一個不嫌少、兩個恰恰好)
類神經網路、語意相似度(一個不嫌少、兩個恰恰好)類神經網路、語意相似度(一個不嫌少、兩個恰恰好)
類神經網路、語意相似度(一個不嫌少、兩個恰恰好)
 
09 Normal Trans
09 Normal Trans09 Normal Trans
09 Normal Trans
 
Table 1
Table 1Table 1
Table 1
 
Lesson 27: Integration by Substitution (Section 4 version)
Lesson 27: Integration by Substitution (Section 4 version)Lesson 27: Integration by Substitution (Section 4 version)
Lesson 27: Integration by Substitution (Section 4 version)
 
05 Random Variables
05 Random Variables05 Random Variables
05 Random Variables
 

En vedette

Return times of random walk on generalized random graphs
Return times of random walk on generalized random graphsReturn times of random walk on generalized random graphs
Return times of random walk on generalized random graphsNaoki Masuda
 
Magazine cover build up
Magazine cover build upMagazine cover build up
Magazine cover build upDanHall1996
 
Bài phải nộp
Bài phải nộpBài phải nộp
Bài phải nộpThanh Huệ
 
Participation costs dismiss the advantage of heterogeneous networks in evolut...
Participation costs dismiss the advantage of heterogeneous networks in evolut...Participation costs dismiss the advantage of heterogeneous networks in evolut...
Participation costs dismiss the advantage of heterogeneous networks in evolut...Naoki Masuda
 
Tag-based indirect reciprocity
Tag-based indirect reciprocityTag-based indirect reciprocity
Tag-based indirect reciprocityNaoki Masuda
 
Global network structure of dominance hierarchy of ant workersAntnet slides-s...
Global network structure of dominance hierarchy of ant workersAntnet slides-s...Global network structure of dominance hierarchy of ant workersAntnet slides-s...
Global network structure of dominance hierarchy of ant workersAntnet slides-s...Naoki Masuda
 
10 common misconceptions in feng shui
10 common misconceptions in feng shui10 common misconceptions in feng shui
10 common misconceptions in feng shuiStanley Tan
 
Preparing Students for Collaborative Leadership: Lowering the walls and cross...
Preparing Students for Collaborative Leadership: Lowering the walls and cross...Preparing Students for Collaborative Leadership: Lowering the walls and cross...
Preparing Students for Collaborative Leadership: Lowering the walls and cross...Lyle Birkey
 
Pengujian vigor benih
Pengujian vigor benihPengujian vigor benih
Pengujian vigor benihUnhy Doel
 
Vigor dan viabilitas benih
Vigor dan viabilitas benihVigor dan viabilitas benih
Vigor dan viabilitas benihUnhy Doel
 
Murtad Bukan Hak Asasi
Murtad Bukan Hak AsasiMurtad Bukan Hak Asasi
Murtad Bukan Hak AsasiYumie Mie
 
SalesClic for Salesforce Essentials
SalesClic for Salesforce EssentialsSalesClic for Salesforce Essentials
SalesClic for Salesforce EssentialsSalesClic
 
How to make a perfect cupcake
How to make a perfect cupcakeHow to make a perfect cupcake
How to make a perfect cupcakeCarrieRoseW
 
Metaalvak_2016_nr5
Metaalvak_2016_nr5Metaalvak_2016_nr5
Metaalvak_2016_nr5Tim Buyle
 

En vedette (19)

Return times of random walk on generalized random graphs
Return times of random walk on generalized random graphsReturn times of random walk on generalized random graphs
Return times of random walk on generalized random graphs
 
Magazine cover build up
Magazine cover build upMagazine cover build up
Magazine cover build up
 
Bài phải nộp
Bài phải nộpBài phải nộp
Bài phải nộp
 
Participation costs dismiss the advantage of heterogeneous networks in evolut...
Participation costs dismiss the advantage of heterogeneous networks in evolut...Participation costs dismiss the advantage of heterogeneous networks in evolut...
Participation costs dismiss the advantage of heterogeneous networks in evolut...
 
Tag-based indirect reciprocity
Tag-based indirect reciprocityTag-based indirect reciprocity
Tag-based indirect reciprocity
 
Global network structure of dominance hierarchy of ant workersAntnet slides-s...
Global network structure of dominance hierarchy of ant workersAntnet slides-s...Global network structure of dominance hierarchy of ant workersAntnet slides-s...
Global network structure of dominance hierarchy of ant workersAntnet slides-s...
 
10 common misconceptions in feng shui
10 common misconceptions in feng shui10 common misconceptions in feng shui
10 common misconceptions in feng shui
 
Preparing Students for Collaborative Leadership: Lowering the walls and cross...
Preparing Students for Collaborative Leadership: Lowering the walls and cross...Preparing Students for Collaborative Leadership: Lowering the walls and cross...
Preparing Students for Collaborative Leadership: Lowering the walls and cross...
 
Pengujian vigor benih
Pengujian vigor benihPengujian vigor benih
Pengujian vigor benih
 
Vigor dan viabilitas benih
Vigor dan viabilitas benihVigor dan viabilitas benih
Vigor dan viabilitas benih
 
Murtad Bukan Hak Asasi
Murtad Bukan Hak AsasiMurtad Bukan Hak Asasi
Murtad Bukan Hak Asasi
 
SalesClic for Salesforce Essentials
SalesClic for Salesforce EssentialsSalesClic for Salesforce Essentials
SalesClic for Salesforce Essentials
 
Face forward deborah alessi
Face forward deborah alessiFace forward deborah alessi
Face forward deborah alessi
 
PRONOUNCIATION
PRONOUNCIATION PRONOUNCIATION
PRONOUNCIATION
 
How to make a perfect cupcake
How to make a perfect cupcakeHow to make a perfect cupcake
How to make a perfect cupcake
 
PC Pro
PC ProPC Pro
PC Pro
 
Diploma_LTS-B
Diploma_LTS-BDiploma_LTS-B
Diploma_LTS-B
 
LOR_KimVith
LOR_KimVithLOR_KimVith
LOR_KimVith
 
Metaalvak_2016_nr5
Metaalvak_2016_nr5Metaalvak_2016_nr5
Metaalvak_2016_nr5
 

Similaire à Maximizing the spectral gap of networks produced by node removal

Neural Processes Family
Neural Processes FamilyNeural Processes Family
Neural Processes FamilyKota Matsui
 
Test Problems in Optimization
Test Problems in OptimizationTest Problems in Optimization
Test Problems in OptimizationXin-She Yang
 
Ch 04 Arithmetic Coding (Ppt)
Ch 04 Arithmetic Coding (Ppt)Ch 04 Arithmetic Coding (Ppt)
Ch 04 Arithmetic Coding (Ppt)anithabalaprabhu
 
Ch 04 Arithmetic Coding ( P P T)
Ch 04  Arithmetic  Coding ( P P T)Ch 04  Arithmetic  Coding ( P P T)
Ch 04 Arithmetic Coding ( P P T)anithabalaprabhu
 
On estimating the integrated co volatility using
On estimating the integrated co volatility usingOn estimating the integrated co volatility using
On estimating the integrated co volatility usingkkislas
 
Solutions_Chapter3.pdf
Solutions_Chapter3.pdfSolutions_Chapter3.pdf
Solutions_Chapter3.pdfTessydee
 
Circuit Network Analysis - [Chapter4] Laplace Transform
Circuit Network Analysis - [Chapter4] Laplace TransformCircuit Network Analysis - [Chapter4] Laplace Transform
Circuit Network Analysis - [Chapter4] Laplace TransformSimen Li
 
Newton's forward & backward interpolation
Newton's forward & backward interpolationNewton's forward & backward interpolation
Newton's forward & backward interpolationHarshad Koshti
 
Lect аі 2 n net p2
Lect аі 2 n net p2Lect аі 2 n net p2
Lect аі 2 n net p2Halyna Melnyk
 
The Analytical Nature of the Greens Function in the Vicinity of a Simple Pole
The Analytical Nature of the Greens Function in the Vicinity of a Simple PoleThe Analytical Nature of the Greens Function in the Vicinity of a Simple Pole
The Analytical Nature of the Greens Function in the Vicinity of a Simple Poleijtsrd
 
NIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learningNIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learningzukun
 
11.generalized and subset integrated autoregressive moving average bilinear t...
11.generalized and subset integrated autoregressive moving average bilinear t...11.generalized and subset integrated autoregressive moving average bilinear t...
11.generalized and subset integrated autoregressive moving average bilinear t...Alexander Decker
 
APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...
APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...
APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...Wireilla
 
APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...
APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...
APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...ijfls
 
Introduction to Artificial Neural Networks
Introduction to Artificial Neural NetworksIntroduction to Artificial Neural Networks
Introduction to Artificial Neural NetworksStratio
 
Digit recognizer by convolutional neural network
Digit recognizer by convolutional neural networkDigit recognizer by convolutional neural network
Digit recognizer by convolutional neural networkDing Li
 

Similaire à Maximizing the spectral gap of networks produced by node removal (20)

Neural Processes Family
Neural Processes FamilyNeural Processes Family
Neural Processes Family
 
Test Problems in Optimization
Test Problems in OptimizationTest Problems in Optimization
Test Problems in Optimization
 
Ch 04 Arithmetic Coding (Ppt)
Ch 04 Arithmetic Coding (Ppt)Ch 04 Arithmetic Coding (Ppt)
Ch 04 Arithmetic Coding (Ppt)
 
Ch 04 Arithmetic Coding ( P P T)
Ch 04  Arithmetic  Coding ( P P T)Ch 04  Arithmetic  Coding ( P P T)
Ch 04 Arithmetic Coding ( P P T)
 
Artificial neural networks
Artificial neural networks Artificial neural networks
Artificial neural networks
 
On estimating the integrated co volatility using
On estimating the integrated co volatility usingOn estimating the integrated co volatility using
On estimating the integrated co volatility using
 
Solutions_Chapter3.pdf
Solutions_Chapter3.pdfSolutions_Chapter3.pdf
Solutions_Chapter3.pdf
 
5.n nmodels i
5.n nmodels i5.n nmodels i
5.n nmodels i
 
Circuit Network Analysis - [Chapter4] Laplace Transform
Circuit Network Analysis - [Chapter4] Laplace TransformCircuit Network Analysis - [Chapter4] Laplace Transform
Circuit Network Analysis - [Chapter4] Laplace Transform
 
Newton's forward & backward interpolation
Newton's forward & backward interpolationNewton's forward & backward interpolation
Newton's forward & backward interpolation
 
Fuzzy and nn
Fuzzy and nnFuzzy and nn
Fuzzy and nn
 
Lect аі 2 n net p2
Lect аі 2 n net p2Lect аі 2 n net p2
Lect аі 2 n net p2
 
The Analytical Nature of the Greens Function in the Vicinity of a Simple Pole
The Analytical Nature of the Greens Function in the Vicinity of a Simple PoleThe Analytical Nature of the Greens Function in the Vicinity of a Simple Pole
The Analytical Nature of the Greens Function in the Vicinity of a Simple Pole
 
Randomized algorithms ver 1.0
Randomized algorithms ver 1.0Randomized algorithms ver 1.0
Randomized algorithms ver 1.0
 
NIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learningNIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learning
 
11.generalized and subset integrated autoregressive moving average bilinear t...
11.generalized and subset integrated autoregressive moving average bilinear t...11.generalized and subset integrated autoregressive moving average bilinear t...
11.generalized and subset integrated autoregressive moving average bilinear t...
 
APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...
APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...
APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...
 
APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...
APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...
APPROXIMATE CONTROLLABILITY RESULTS FOR IMPULSIVE LINEAR FUZZY STOCHASTIC DIF...
 
Introduction to Artificial Neural Networks
Introduction to Artificial Neural NetworksIntroduction to Artificial Neural Networks
Introduction to Artificial Neural Networks
 
Digit recognizer by convolutional neural network
Digit recognizer by convolutional neural networkDigit recognizer by convolutional neural network
Digit recognizer by convolutional neural network
 

Dernier

Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native ApplicationsWSO2
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfsudhanshuwaghmare1
 
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemkeProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemkeProduct Anonymous
 
Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)Zilliz
 
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamDEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamUiPathCommunity
 
Exploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusExploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusZilliz
 
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...apidays
 
DBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDropbox
 
AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAndrey Devyatkin
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MIND CTI
 
Polkadot JAM Slides - Token2049 - By Dr. Gavin Wood
Polkadot JAM Slides - Token2049 - By Dr. Gavin WoodPolkadot JAM Slides - Token2049 - By Dr. Gavin Wood
Polkadot JAM Slides - Token2049 - By Dr. Gavin WoodJuan lago vázquez
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
EMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWER
EMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWEREMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWER
EMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWERMadyBayot
 
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...apidays
 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...DianaGray10
 
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ..."I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...Zilliz
 
MS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectorsMS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectorsNanddeep Nachan
 
Biography Of Angeliki Cooney | Senior Vice President Life Sciences | Albany, ...
Biography Of Angeliki Cooney | Senior Vice President Life Sciences | Albany, ...Biography Of Angeliki Cooney | Senior Vice President Life Sciences | Albany, ...
Biography Of Angeliki Cooney | Senior Vice President Life Sciences | Albany, ...Angeliki Cooney
 
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, AdobeApidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobeapidays
 
[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdfSandro Moreira
 

Dernier (20)

Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native Applications
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdf
 
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemkeProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
 
Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)
 
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamDEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
Maximizing the spectral gap of networks produced by node removal

• 1. Maximizing the spectral gap of networks produced by node removal
Naoki Masuda (University of Tokyo, Japan)
Refs:
1. Watanabe & Masuda, Physical Review E, 82, 046102 (2010)
2. Masuda, Fujie & Murota, In: Complex Networks IV, Studies in Computational Intelligence, 476, 155-163 (2013)
Collaborators:
Takamitsu Watanabe (University of Tokyo, Japan)
Tetsuya Fujie (University of Hyogo, Japan)
Kazuo Murota (University of Tokyo, Japan)
• 2. Laplacian of a network
Example network (nodes 1-2-3-4; links 1-2, 1-4, 2-4, 3-4). Diffusion:
$\dot{x}(t) = -Lx(t)$, e.g. $\dot{x}_1 = -2x_1 + x_2 + x_4 = (x_2 - x_1) + (x_4 - x_1)$
$L = \begin{pmatrix} 2 & -1 & 0 & -1 \\ -1 & 2 & 0 & -1 \\ 0 & 0 & 1 & -1 \\ -1 & -1 & -1 & 3 \end{pmatrix}$
Eigenvalues: $\lambda_1 = 0 < \lambda_2 \le \lambda_3 \le \cdots \le \lambda_N$
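The 4-node example can be checked numerically. A minimal NumPy sketch (mine, not from the slides) builds the unnormalized Laplacian L = D − A and confirms λ1 = 0 < λ2 for this connected graph:

```python
import numpy as np

# Adjacency matrix of the example network: links 1-2, 1-4, 2-4, 3-4
# (1-indexed on the slide; 0-indexed here).
A = np.array([[0, 1, 0, 1],
              [1, 0, 0, 1],
              [0, 0, 0, 1],
              [1, 1, 1, 0]], dtype=float)

L = np.diag(A.sum(axis=1)) - A   # unnormalized Laplacian D - A
eigvals = np.linalg.eigvalsh(L)  # symmetric matrix: ascending eigenvalues

assert abs(eigvals[0]) < 1e-9    # lambda_1 = 0 (connected graph)
assert eigvals[1] > 1e-9         # spectral gap lambda_2 > 0
```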
• 3. Spectral gap
• If λ2 is large, diffusive dynamical processes on networks occur faster. Ex: synchronization, collective opinion formation, random walk.
• Note: unnormalized Laplacian here.
• Problem: maximize λ2 by removing Ndel out of N nodes, by two methods:
• Sequential node removal + perturbative method (Watanabe & Masuda, 2010)
• Semidefinite programming (Masuda, Fujie & Murota, 2013)
• Note: removal of links always decreases λ2 (Milanese, Sun & Nishikawa 2010; Nishikawa & Motter 2010).
• 4. Perturbative method
• Extends the same method for adjacency matrices (Restrepo, Ott & Hunt, 2008).
• Much faster than the brute-force method.
$Lu = \lambda_2 u$
$(L + \Delta L)(u + \Delta u) = (\lambda_2 + \Delta\lambda_2)(u + \Delta u)$
Removal of node i: $\Delta u = \delta u - u_i \hat{e}_i$, where $\hat{e}_i \equiv (0, \ldots, 0, \underset{i}{1}, 0, \ldots, 0)$
$\Longrightarrow \Delta\lambda_2 \approx \frac{\sum_{j \in N_i} u_j (u_i - u_j)}{1 - u_i^2}$
Select i that maximizes Δλ2.
• 5. Results: model networks (N = 250, <k> = 10)
[Figure: normalized λ2 vs. fraction f of removed nodes (0 ≤ f ≤ 0.5) for five model networks (Goh, WS, HK, BA, ER), comparing the perturbative, betweenness-based, degree-based, optimal, and sequential strategies.]
• 6. Results: real networks
[Figure: normalized λ2 vs. fraction f of removed nodes (0 ≤ f ≤ 0.5), same strategies as above, for four real networks: C. elegans (N = 279, <k> = 16.4), e-mail (N = 1133, <k> = 9.62), macaque (N = 71, <k> = 12.3), E. coli (N = 2268, <k> = 4.96).]
• 7. Conclusions
• Careful node removal can increase the spectral gap.
• For a variety of networks, the perturbative strategy works well with a reduced computational cost.
• Ref: Watanabe & Masuda, Physical Review E, 82, 046102 (2010)
• 8. However,
• Sequential optimal removal may not be optimal for Ndel ≥ 2.
• An obvious combinatorial problem if we pursue the optimal solution.
• 9. Semidefinite programming
$\min \sum_{i=1}^n c_i x_i$ subject to $F_0 + \sum_{i=1}^n x_i F_i \succeq 0$
($F_0, \ldots, F_n$: symmetric matrices)
Eigenvalue minimization using SDP:
min t subject to $tI - F(x_1, \ldots, x_n) \succeq 0$, where $F(x_1, \ldots, x_n) = F_0 + \sum_{i=1}^n x_i F_i$
(eigenvalues of F: $\lambda_1 \le \cdots \le \lambda_n$; eigenvalues of $tI - F$: $t - \lambda_n \le \cdots \le t - \lambda_1$)
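A quick numerical sanity check (mine, using NumPy rather than an actual SDP solver): the smallest feasible t in "min t s.t. tI − F ⪰ 0" equals the largest eigenvalue of F, since tI − F is positive semidefinite exactly when t ≥ λ_max(F).

```python
import numpy as np

def is_psd(M, tol=1e-9):
    """Positive semidefiniteness test via the spectrum of a symmetric matrix."""
    return np.linalg.eigvalsh(M).min() >= -tol

# A symmetric matrix standing in for F(x) = F0 + sum_i x_i F_i at a fixed x
# (entries chosen arbitrarily for the demo).
F = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

lam_max = np.linalg.eigvalsh(F).max()

# t = lambda_max(F) is feasible; any smaller t violates the PSD constraint.
assert is_psd(lam_max * np.eye(3) - F)
assert not is_psd((lam_max - 0.1) * np.eye(3) - F)
```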
• 10. Difficulties in our case
• Discreteness: x_i ∈ {0, 1}
• Ndel (irrelevant) zero eigenvalues appear.
• Not interested in the zero eigenvalue λ1 = 0.
So, let's start with the following problem:
max t subject to
$-tI + \sum_{i<j;\,(i,j)\in E} x_i x_j \tilde{L}_{ij} + \alpha J + \beta \sum_{i=1}^N (1 - x_i) E_i \succeq 0$
$\sum_{i=1}^N x_i = N - N_{\mathrm{del}}, \quad x_i \in \{0, 1\}$
where $E_i = \mathrm{diag}(0, \ldots, 0, \underset{i}{1}, 0, \ldots, 0)$ and $L = \sum_{1 \le i < j \le N;\,(i,j)\in E} \tilde{L}_{ij}$
• The αJ term: λ1 = 0 → λ1' = α; the β(1 − x_i)E_i terms move each new zero eigenvalue to β.
• But, a nonlinear constraint (the products x_i x_j).
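The decomposition L = Σ L̃_ij is easy to verify directly (my own sketch): each L̃_ij is the Laplacian of the single edge (i, j), i.e. +1 on the two diagonal entries and −1 on the two off-diagonal ones, and adding αJ (J the all-ones matrix) lifts the zero mode of a connected graph away from 0.

```python
import numpy as np

def edge_laplacian(n, i, j):
    """Laplacian of the single edge (i, j), embedded in an n x n matrix."""
    Lij = np.zeros((n, n))
    Lij[i, i] = Lij[j, j] = 1.0
    Lij[i, j] = Lij[j, i] = -1.0
    return Lij

# The 4-node example from the earlier slide: links 1-2, 1-4, 2-4, 3-4
# (0-indexed here).
edges = [(0, 1), (0, 3), (1, 3), (2, 3)]
n = 4

L_sum = sum(edge_laplacian(n, i, j) for i, j in edges)

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

assert np.allclose(L_sum, L)  # L equals the sum of its edge Laplacians

# Adding alpha * J shifts the zero mode of the connected graph away from 0.
alpha = 0.5
shifted = np.linalg.eigvalsh(L + alpha * np.ones((n, n)))
assert shifted.min() > 1e-9
```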
• 11. SDP1: relaxation
• Challenges:
• Discreteness of x_i → "relax" the problem
• Nonlinear constraint → introduce new variables $X_{ij} \equiv x_i x_j$
(SDP1) max t subject to
$-tI + \sum_{i<j;\,(i,j)\in E} X_{ij} \tilde{L}_{ij} + \alpha J + \beta \sum_{i=1}^N (1 - x_i) E_i \succeq 0$
$\sum_{i=1}^N x_i = N - N_{\mathrm{del}}$
$Y \equiv \begin{pmatrix} 1 & x^\top \\ x & X \end{pmatrix} \succeq 0$
$0 \le x_i (= X_{ii}) \le 1 \quad (1 \le i \le N)$ ← actually not needed
(Lovász, 1979; Grötschel, Lovász & Schrijver, 1986; Lovász & Schrijver, 1991)
• X_ij, where (i, j) is not a link, is a "free" variable.
• We can reduce the number of variables using X_ii = x_i. But still O(N²) terms exist, and the algorithm runs slowly.
• For a technical reason, we set α = β/N.
• 12. An improved method SDP2: "local relaxation"
$-tI + \sum_{i<j;\,(i,j)\in E} X_{ij} \tilde{L}_{ij} + \alpha J + \beta \sum_{i=1}^N (1 - x_i) E_i \succeq 0$
Linearize the binary products locally, e.g. for the link (1, 2):
$x_1 x_2 \ge 0 \;\Rightarrow\; X_{12} \ge 0$
$x_1(1 - x_2) \ge 0 \;\Rightarrow\; x_1 - X_{12} \ge 0$
$(1 - x_1)x_2 \ge 0 \;\Rightarrow\; x_2 - X_{12} \ge 0$
$(1 - x_1)(1 - x_2) \ge 0 \;\Rightarrow\; 1 - x_1 - x_2 + X_{12} \ge 0$
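These four inequalities are the standard McCormick envelope of the product x_i x_j on [0, 1]²; a small check (my own, not from the slides) confirms they are exact at binary points and merely bracket the product at fractional points.

```python
import itertools

def mccormick_interval(xi, xj):
    """Range of X allowed by the four local-relaxation inequalities:
    X >= 0, X <= xi, X <= xj, X >= xi + xj - 1."""
    lo = max(0.0, xi + xj - 1.0)
    hi = min(xi, xj)
    return lo, hi

# At binary points the relaxation is exact: the interval collapses to xi * xj.
for xi, xj in itertools.product([0.0, 1.0], repeat=2):
    lo, hi = mccormick_interval(xi, xj)
    assert lo == hi == xi * xj

# At fractional points it only brackets the product.
lo, hi = mccormick_interval(0.5, 0.5)
assert lo <= 0.25 <= hi  # lo = 0.0, hi = 0.5
```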
• 13. Intuitive comparison
• Consider N = 1 (unrealistic though).
• SDP1: $Y = \begin{pmatrix} 1 & x_1 \\ x_1 & X_{11} \end{pmatrix} \succeq 0 \iff X_{11} \ge x_1^2$
• Note: in fact X_11 = x_1.
• SDP2, with i = j = 1:
$X_{11} \ge 0, \quad x_1 - X_{11} \ge 0, \quad 1 - 2x_1 + X_{11} \ge 0$
$\Longrightarrow X_{11} \ge 0, \quad X_{11} \le x_1, \quad X_{11} \ge 2x_1 - 1$
• Linear!
• 14. SDP2
• Number of variables reduced.
• Size of the SDP part reduced.
• Constraint 0 ≤ x_i ≤ 1 unnecessary.
(SDP2) max t subject to
$-tI + \sum_{i<j;\,(i,j)\in E} X_{ij} \tilde{L}_{ij} + \alpha J + \beta \sum_{i=1}^N (1 - x_i) E_i \succeq 0, \quad \sum_{i=1}^N x_i = N - N_{\mathrm{del}}$
for links (i, j): $X_{ij} \ge 0, \quad x_i - X_{ij} \ge 0, \quad x_j - X_{ij} \ge 0, \quad 1 - x_i - x_j + X_{ij} \ge 0$
• 15. Small networks
• Karate club (N = 34, 78 links, β = 2). Data: Zachary (1977)
• Macaque cortical net (N = 71, 438 links, β = 2). Data: Sporns & Zwi (2004)
[Figure: panels (a), (b): λ2 vs. Ndel (0 to 20) for the sequential, SDP1, and SDP2 methods.]
• 16. Relatively large networks
• BA model (scale-free net) (N = 150, 297 links, β = 2)
• C. elegans neural net (N = 297, 2287 links, β = 2.5). Data: Chen et al. (2006)
[Figure: panels (c), (d): λ2 vs. Ndel for the sequential and SDP2 methods.]
Observation: SDP1/SDP2 may work better for sparse networks.
• 17. Possible directions
• Go violate convexity: replace $(1 - x_i) \to (1 - x_i)^p$ in
$-tI + \sum_{i<j;\,(i,j)\in E} X_{ij} \tilde{L}_{ij} + \alpha J + \beta \sum_{i=1}^N (1 - x_i) E_i \succeq 0$
and increase p gradually from p = 1, by the Newton method.
• Parameter tuning?