On Convolution of Graph Signals
And Deep Learning on Graph Domains
December 13th 2018
Candidate: Jean-Charles Vialatte
Advisors: Vincent Gripon, Mathias Herberts
Supervisor: Gilles Coppin
Jury: Pierre Borgnat∗, Matthias Löwe∗,
Paulo Gonçalves, Juliette Mattioli
∗: examiners
December 13th 2018 1 / 50
Outline
1 Motivation and problem statement
2 Literature overview
3 Convolution of graph signals
4 Deep learning on graph domains
5 Experiments
6 Conclusion
Definitions: Signals and Graphs
Signal:
Graph: (figure: a graph on 5 vertices whose edges carry weights a, b, c, d, e)

A (adjacency matrix) =
    | 0  a  0  d  e |
    | a  0  0  b  0 |
    | 0  0  0  c  0 |
    | d  b  c  0  0 |
    | e  0  0  0  0 |
Examples of signals
Most domains can be represented with a graph:
(figure: examples of signals on Euclidean domains and on non-Euclidean domains)
Deep learning performances
On Euclidean domains:
(bar chart: classification error (in %) on digits and on tiny images, without vs with convolutions)
A key factor: convolution
Defined on Euclidean domains:
Connectivity pattern: MLP vs CNN
Dense layer
fully connected
no tied weights
ex: 81 different weights here
Convolutional layer
locally connected
with weight sharing
ex: 3 different weights here
Problem: How to extend convolutions?
(figure: a convolution is defined on the Euclidean structure; what is its counterpart on the non-Euclidean structure?)
Supervised vs semi-supervised application
Let X be a dataset. We classify its rows. Two different kinds of graph structures:

Supervised classification of graph-structured data: X is b × n (a batch of b signals over the n vertices of the graph).

Semi-supervised classification of nodes: X is n × p (one p-dimensional feature vector per vertex of the graph).
Outline
1 Motivation and problem statement
2 Literature overview
3 Convolution of graph signals
4 Deep learning on graph domains
5 Experiments
6 Conclusion
Spectral approaches
Convolutions are defined in the graph spectral domain.
L = D − A = UΛUᵀ
gft(X) = UX

Figure: Example of signals from the Laplacian eigenbasis

Using the GFT, convolution amounts to a pointwise multiplication in the spectral domain:
X ⊗ Θ = Uᵀ(UX ⊙ UΘ)
December 13th 2018 10 / 50
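This spectral convolution can be sketched in a few lines of NumPy. Note that here the columns of U are the Laplacian eigenvectors, so the GFT is Uᵀs; the U of the slides plays the role of this transpose. A minimal sketch:

```python
import numpy as np

def spectral_graph_convolution(A, x, theta):
    """Convolve graph signals x and theta pointwise in the Laplacian eigenbasis."""
    L = np.diag(A.sum(axis=1)) - A      # combinatorial Laplacian L = D - A
    _, U = np.linalg.eigh(L)            # columns of U: eigenvectors (L = U Λ Uᵀ)
    gft = lambda s: U.T @ s             # graph Fourier transform
    return U @ (gft(x) * gft(theta))    # inverse GFT of the pointwise product
```

Because the pointwise product commutes, so does this convolution: swapping x and theta gives the same result.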
11/50
Spectral approaches
X ⊗ Θ = Uᵀ(UX ⊙ UΘ)
Pros
Elegant and fast under some approximations
Can be used off the shelf: no need to specify any weight sharing
Cons
Introduce isotropic symmetries
Do not match Euclidean convolutions on grid graphs
Figure: Example of translation defined as convolution with a Dirac delta (courtesy of Pasdeloup, B.)
Vertex-domain approaches
Convolutions are defined as a sum over a neighborhood, usually a sum of dot products (cf. references in the thesis manuscript):
(X ⊗ Θ)(vᵢ) = Σ_{j ∈ N(vᵢ)} θᵢⱼ X(vⱼ)
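A direct NumPy sketch of this vertex-domain definition, assuming a dense weight matrix Theta masked by the neighborhoods read off the adjacency matrix:

```python
import numpy as np

def vertex_domain_convolution(A, Theta, X):
    """(X ⊗ Θ)(v_i) = Σ_{j ∈ N(v_i)} θ_ij X(v_j), with N(v_i) read off A."""
    n = len(X)
    out = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if A[i, j] != 0:                # v_j is a neighbor of v_i
                out[i] += Theta[i, j] * X[j]
    return out
```

Vectorized, this is just the masked product `((A != 0) * Theta) @ X`.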
Vertex-domain approaches
(X ⊗ Θ)(vᵢ) = Σ_{j ∈ N(vᵢ)} θᵢⱼ X(vⱼ)
Pros
Match Euclidean convolutions on grid graphs
Locally connected
Cons
Weight sharing is not always explicit
A few important references
Bruna et al., 2013: spectral filters with O(1) weights; a smoother matrix K is used to interpolate more weights.
g_θ(X) = Uᵀ(UX ⊙ Kθ)

Defferrard et al., 2016: filters based on Chebyshev polynomials (Tᵢ)ᵢ.
g_θ(L) = Σ_{i=0}^{k} θᵢ Tᵢ(L)

Kipf et al., 2016: application to semi-supervised settings.
Y = AXΘ

Velickovic et al., 2017: introduction of attention coefficients (Aₖ)ₖ.
Y = Σ_{k=1}^{K} Aₖ X Θₖ

Du et al., 2017: convolution from the GSP field (Sandryhaila et al., 2013).
Y = Σ_{k=1}^{K} Aᵏ X Θₖ
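For instance, the propagation Y = AXΘ of Kipf et al. can be sketched as follows; the symmetric normalization of A with self-loops is the detail from the original paper, and the ReLU nonlinearity is an assumption made here for illustration:

```python
import numpy as np

def gcn_layer(A, X, Theta):
    """One GCN-style layer: Y = h(Â X Θ), Â the normalized adjacency with self-loops."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]  # D^{-1/2} Â D^{-1/2}
    return np.maximum(A_norm @ X @ Theta, 0.0)     # h = ReLU
```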
Outline
1 Motivation and problem statement
2 Literature overview
3 Convolution of graph signals
4 Deep learning on graph domains
5 Experiments
6 Conclusion
Recall: Euclidean convolution
Definition
Convolution on S(Z²)
The (discrete) convolution s1 ∗ s2 is a binary operation on S(Z²) defined as:
∀(a, b) ∈ Z², (s1 ∗ s2)[a, b] = Σ_i Σ_j s1[i, j] s2[a − i, b − j]

A convolution operator f is a function parameterized by a signal w ∈ S(Z²) s.t.:
f = · ∗ w (right operator)
f = w ∗ · (left operator)

Some notable properties:
Linearity
Locality and weight sharing
Commutativity (optional)
Equivariance to translations (i.e. commutes with them)
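A literal NumPy sketch of this definition (full discrete convolution of two finitely supported signals stored as arrays):

```python
import numpy as np

def conv2d_full(s1, s2):
    """(s1 * s2)[a, b] = Σ_i Σ_j s1[i, j] s2[a - i, b - j], over the full support."""
    h1, w1 = s1.shape
    h2, w2 = s2.shape
    out = np.zeros((h1 + h2 - 1, w1 + w2 - 1))
    for a in range(out.shape[0]):
        for b in range(out.shape[1]):
            for i in range(h1):
                for j in range(w1):
                    if 0 <= a - i < h2 and 0 <= b - j < w2:  # stay in s2's support
                        out[a, b] += s1[i, j] * s2[a - i, b - j]
    return out
```

This full convolution is commutative, matching the "commutativity (optional)" property above.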
Characterization by translational equivariance
Theorem
Characterization of convolution operators on S(Z²)
A linear transformation f is equivariant to translations ⇔ it is a convolution operator.

(commutative diagram: translating then applying f equals applying f then translating)
Convolution of signals with graph domains
Can use the Euclidean convolution
How to extend the convolution here?
A few notions of representation theory
A group is a set equipped with an operation (satisfying associativity, identity, and inverses) which can act on other sets.
Let Γ be a group, g ∈ Γ, and V be a set. Examples of group actions:
L_g : Γ → Γ, h ↦ gh (action of Γ on itself)
g(·) : V → V, v ↦ g(v) (action on V)
Group convolution
Definition
Group convolution
Let Γ be a group; the group convolution between two signals s1 and s2 ∈ S(Γ) is defined as:
∀h ∈ Γ, (s1 ∗ s2)[h] = Σ_{g∈Γ} s1[g] s2[g⁻¹h]
provided at least one of the signals has finite support if Γ is not finite.
Theorem
Characterization of group convolution operators
Let Γ be a group and let f ∈ L(S(Γ)). Then:
1 f is a group convolution right operator ⇔ f is equivariant to left multiplications,
2 f is a group convolution left operator ⇔ f is equivariant to right multiplications,
3 f is a group convolution commutative operator ⇔ f is equivariant to multiplications.
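On the cyclic group Γ = Z_n (where g⁻¹h is (h − g) mod n), the group convolution is the familiar circular convolution. A minimal sketch:

```python
import numpy as np

def cyclic_group_convolution(s1, s2):
    """(s1 * s2)[h] = Σ_{g ∈ Z_n} s1[g] s2[(h - g) mod n]."""
    n = len(s1)
    return np.array([sum(s1[g] * s2[(h - g) % n] for g in range(n))
                     for h in range(n)])
```

By the convolution theorem, this coincides with FFT-based circular convolution.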
Goal: convolution on the vertex set
Let G = (V, E) be a graph, with E ⊆ V².
Starting point: a bijective map ϕ between a group Γ and V (or a subgraph).

(diagram: ϕ maps Γ to V and extends linearly to a map from S(Γ) to S(V))
Convolution on the vertex set from group convolutions
(Goal: convolution on the vertex set)
ϕ: bijective map

(diagram: an operator f on S(Γ) is transported to f̄ = ϕ ∘ f ∘ ϕ⁻¹ on S(V))

The equivariance theorem holds for operators of the form ϕ ∘ L_g ∘ ϕ⁻¹,
but not necessarily for actions of Γ on V (i.e. of the form g(·)).
Needed condition: equivariant map
(Goal: equivariance theorem holds)
Condition: ϕ is a bijective equivariant map.
Denote g_v = ϕ⁻¹(v).

(diagram: L_{g_v} sends g_u to g_v g_u; ϕ sends g_u to u and g_v g_u to ϕ(g_v g_u), so g_v(·) should send u to ϕ(g_v g_u))

We need g_v(·) = ϕ ∘ L_{g_v} ∘ ϕ⁻¹, i.e. ∀u ∈ V, g_v(u) = ϕ(g_v g_u).
ϕ-convolution
ϕ: bijective equivariant map i.e. gv(u) = ϕ(gvgu).
Definition
ϕ-convolution
∀s1, s2 ∈ S(V):
s1 ∗ϕ s2 = Σ_{v ∈ V} s1[v] g_v(s2)   (1)
         = Σ_{g ∈ Γ} s1[ϕ(g)] g(s2)   (2)
Theorem
Characterization of ϕ-convolution right operators
f is a ϕ-convolution right operator ⇔ f is equivariant to Γ
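As a concrete instance (an assumption made here for illustration): take G a ring graph on n vertices, Γ = Z_n acting by cyclic shifts, and ϕ the identity on indices. Then g_v(s) is a rotation of s, and the ϕ-convolution reduces to circular convolution:

```python
import numpy as np

def phi_convolution(s1, s2):
    """s1 ∗ϕ s2 = Σ_v s1[v] g_v(s2), where g_v is a cyclic shift by v on a ring graph."""
    return sum(s1[v] * np.roll(s2, v) for v in range(len(s1)))
```

Since `np.roll(s2, v)[h] == s2[(h - v) % n]`, this agrees with the group convolution on Z_n.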
Mixed domain formulation
Let Γ be an abelian group. No need to exhibit ϕ in this case:
Definition
Mixed domain convolution
∀r ∈ S(Γ) and ∀s ∈ S(V):
r ∗m s = Σ_{g ∈ Γ} r[g] g(s) ∈ S(V)
Equivariance theorem holds:
Corollary
f is a m-convolution left operator ⇔ f is equivariant to Γ
(The converse direction still requires a bijection between Γ and V.)
Inclusion and role of the edge set
Edge constrained (EC):
(figure: g may move vertex a only along edges, e.g. g(a) ∈ {c, d} but g(a) ∉ {b, e})

Locality preserving (LP):
(figure: g(·) maps the neighborhood {a, b, c} onto the neighborhood {a′, b′, c′})
Cayley graphs
(Goal: description of EC and LP convolutions)
Definition
Cayley graph and subgraph
Let Γ be a group and U one of its generating sets. The Cayley graph generated by U is the
digraph G = (V, E) such that V = Γ and E satisfies either:
∀a, b ∈ Γ, a → b ⇔ ∃g ∈ U, ga = b (left Cayley graph)
∀a, b ∈ Γ, a → b ⇔ ∃g ∈ U, ag = b (right Cayley graph)
or both of the above (abelian Cayley graph)
A Cayley subgraph is a subgraph that is isomorphic to a Cayley graph.
Figure: an edge a → b of a Cayley graph, carried by a generator g
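A small sketch enumerating the edges of a Cayley graph for a finite group given by its composition law (the function name is illustrative):

```python
def cayley_edges(elements, op, generators, side="left"):
    """Edges a → b of the left (ga = b) or right (ag = b) Cayley graph."""
    edges = set()
    for a in elements:
        for g in generators:
            b = op(g, a) if side == "left" else op(a, g)
            edges.add((a, b))
    return edges
```

On Z_4 with generating set {1}, both the left and right Cayley graphs are the directed 4-cycle, illustrating that the two notions coincide for abelian groups.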
Characterization of EC and LP convolutions
Theorem
Characterization by Cayley subgraphs
Let G = (V, E) be a graph. Then:
1 its left Cayley subgraphs characterize its EC ϕ-convolutions,
2 its right Cayley subgraphs characterize its LP ϕ-convolutions,
3 its abelian Cayley subgraphs characterize its EC and LP m-convolutions.
Corollary
Properties of convolutions that are both EC and LP
1 If a ϕ-convolution of group Γ is EC and LP then Γ is abelian;
2 an m-convolution is EC if, and only if, it is also LP.
Other results in the manuscript
Description with smaller kernels
The weight sharing is preserved
More detailed results depending on the laterality of the operator and of the equivariance
Analysis of limitations due to the algebraic structure of the Cayley subgraphs
The above theorems hold for groupoids of partial transformations under mild conditions
They also hold for groupoids based on paths under restrictive conditions
Outline
1 Motivation and problem statement
2 Literature overview
3 Convolution of graph signals
4 Deep learning on graph domains
5 Experiments
6 Conclusion
Propagational representation of a layer
(figure: propagation from layer L_d to layer L_{d+1})
Definition
Edge-constrained layer
Connections (dotted lines) are constrained by edges (red lines) in a local receptive field.
Theorem
Characterization by local receptive fields (LRF)
There is a graph for which a layer is EC ⇔ its LRF are intertwined.
Connectivity matrix W
(Goal: generalized layer representation)
y = h(W · x + b)
Scheme tensor S
(Goal: generalized layer representation)
W = Θ · S
y = h(Θ · S · x + b)
(figure: connectivity pattern between 5 input and 5 output neurons, with its scheme entries)

    | s11 s12 0   0   0   |
    | s21 s22 s23 0   0   |
    | 0   s32 s33 s34 0   |
    | 0   0   s43 s44 s45 |
    | 0   0   0   s54 s55 |

W45 = Σ_{k=1}^{ω} Θk S45k
Neural contraction
(Θ̆SX)[j, q, b] = Σ_{k=1}^{ω} Σ_{p=1}^{N} Σ_{i=1}^{n} Θ[k, p, q] S[k, i, j] X[i, p, b]

g(X) = Θ̆SX, where W^{pq}_{ij} = Σ_k Θ^{pq}_k S^k_{ij} and g(X)^{jq}_b = Σ_{i,p} W^{jq}_{ip} X^{ip}_b

index | size | description
  i   |  n   | input neuron
  j   |  m   | output neuron
  p   |  N   | input channel
  q   |  M   | feature map
  k   |  ω   | kernel weight
  b   |  B   | batch instance
Table: indices

tensor | shape
  Θ    | ω × N × M
  S    | ω × n × m
  X    | n × N × B
  ΘS   | n × m × N × M
  SX   | ω × m × N × B
  ΘX   | ω × n × M × B
 Θ̆SX  | m × M × B
Table: shapes
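Under the index conventions of the tables above, the neural contraction maps directly onto a single einsum; a minimal sketch:

```python
import numpy as np

def neural_contraction(Theta, S, X):
    """out[j, q, b] = Σ_{k,p,i} Θ[k, p, q] S[k, i, j] X[i, p, b]."""
    return np.einsum('kpq,kij,ipb->jqb', Theta, S, X)
```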
Properties
This formulation is:
Linear
Associative
Commutative
Generic (next slide)
It is explainable as a convolution of graph signals when:
in supervised applications,
there is either 0 or 1 weight per connection (the s_ij are one-hot vectors).
In other cases it can be seen as a linear combination of convolutions.
Genericity of ternary representation
Given an adequate specification of the weight-sharing scheme S, we can obtain, e.g.:
a dense layer
a partially connected layer
a convolutional layer
a graph convolutional layer (GCN, Kipf et al.)
a graph attention layer (GAT, Velickovic et al.)
a topology-adaptive graph convolution layer (TAGCN, Du et al.)
a mixture model convolutional layer (MOnet, Monti et al.)
a generalized convolution under sparse priors
any partial connectivity pattern, sparse or not
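For illustration, here are two such specializations of S under the contraction out[j, q, b] = Σ_{k,p,i} Θ[k, p, q] S[k, i, j] X[i, p, b]; the helper names are hypothetical:

```python
import numpy as np

def dense_scheme(n, m):
    """S for a dense layer: one independent weight per connection (ω = n·m)."""
    S = np.zeros((n * m, n, m))
    for i in range(n):
        for j in range(m):
            S[i * m + j, i, j] = 1.0
    return S

def conv1d_scheme(n, kernel=3):
    """S for a 1-D 'same' convolution on n neurons: weight k is shared along a diagonal."""
    S = np.zeros((kernel, n, n))
    for k in range(kernel):
        for i in range(n):
            j = i + k - kernel // 2
            if 0 <= j < n:
                S[k, i, j] = 1.0
    return S
```

With `conv1d_scheme` and an all-ones Θ, the contraction computes a moving sum, as a convolution with a constant kernel should.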
Discussion
(diagram: X ↦ Y)
Y = h(Θ̆SX)
The propagation logic is in the scheme S. For example, it can be either:
given
randomized
learned
inferred
Outline
1 Motivation and problem statement
2 Literature overview
3 Convolution of graph signals
4 Deep learning on graph domains
5 Experiments
6 Conclusion
Outline
Experiments:
Study of influence of symmetries using S
Learning S when masked by an adjacency matrix and its powers
Monte Carlo simulations with random realizations of S
Learning S in semi-supervised applications
Inferring S from translations
Datasets
Influence of symmetries
Y = h(Θ̆SX)

(figure: schemes S defined as expectations E[·] over perturbed supports, with spreads µ and (2p + 1)µ)
Influence of symmetries: results on MNIST
Y = h(Θ̆SX)

(figure: classification error (in %) on MNIST as a function of σ (in units of initial pixel separation), with MLP and CNN error levels shown for reference; σ ranges from 10⁻¹ to 10²)
Learning both S and Θ on MNIST and scrambled MNIST
Y = h(Θ̆SX)

Ordering | Conv5x5 | A¹    | A²    | A³    | A⁴    | A⁵    | A⁶
no prior |    /    | 1.24% | 1.02% | 0.93% | 0.90% | 0.93% | 1.00%
prior    |  0.87%  | 1.21% | 0.91% | 0.91% | 0.87% | 0.80% | 0.74%
Table: Grid graphs on MNIST

MLP   | Conv5x5 | Thresholded (p = 3%) | k-NN (k = 25)
1.44% |  1.39%  |        1.06%         |     0.96%
Table: Covariance graphs on Scrambled MNIST
Experiments on text categorization
Y = h(Θ̆SX)

Figure: Diagram of the MCNet architecture used:
input → MC64 → MC64 → TrivialConv1 (+) → FC500 → FC20 → output

MNB   | FC2500 | FC2500-500 | ChebNet32 | FC500         | MCNet
68.51 | 64.64  |   65.76    |   68.26   | 71.46 (72.25) | 70.74 (72.62)
Table: Accuracies (in %) on 20NEWS, given as mean (max)
Benchmarks on citation networks
Y = h(Θ̆SX)
Comparison of
Graph Convolution Network (GCN),
Graph Attention Network (GAT),
Topology Adaptive GCN (TAGCN).
With our models:
Addition of graph dropout to GCN (GCN*),
Graph Contraction Network (GCT).
Dataset  | MLP        | GCN        | GAT        | TAGCN      | GCN*       | GCT
Cora     | 58.8 ± 0.9 | 81.8 ± 0.9 | 83.3 ± 0.6 | 82.9 ± 0.7 | 83.4 ± 0.7 | 83.3 ± 0.7
Citeseer | 56.7 ± 1.1 | 72.2 ± 0.6 | 72.1 ± 0.6 | 71.7 ± 0.7 | 72.5 ± 0.8 | 72.7 ± 0.5
Pubmed   | 72.6 ± 0.9 | 79.0 ± 0.5 | 78.3 ± 0.7 | 78.9 ± 0.5 | 78.2 ± 0.7 | 79.2 ± 0.4
Table: Mean accuracy (in %) and standard deviation over 100 runs
Another approach: finding translations in graphs to construct S
Step 0: infer a graph from example signals x0, x1, ..., xm, ...

Step 1: infer translations on the inferred graph
Another approach: finding translations in graphs to construct S
Step 2: design the convolution weight-sharing (a kernel w0, w1, w2, w3, w4 whose weights are tied across the inferred translations)

Step 3: design data augmentation (moving each input x0 along the inferred translations)
Another approach: finding translations in graphs to construct S
Step 4: design graph subsampling and the corresponding convolution weight-sharing (a smaller kernel w0, w1, w2 on the subsampled graph)
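Step 0 can be sketched, for scrambled images or fMRI-like data, by thresholding feature covariances into a k-NN graph. This is a sketch under that assumption; the inference method actually used in the manuscript may differ:

```python
import numpy as np

def covariance_knn_graph(X, k):
    """Infer a k-NN graph over features; rows of X are sample signals x_0, x_1, ..."""
    C = np.abs(np.cov(X, rowvar=False))    # feature-feature covariance magnitudes
    np.fill_diagonal(C, 0.0)               # no self-loops
    A = np.zeros_like(C)
    for i in range(C.shape[0]):
        A[i, np.argsort(C[i])[-k:]] = 1.0  # keep the k strongest covariances
    return np.maximum(A, A.T)              # symmetrize
```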
Architecture
We used a variant of deep residual networks (ResNet).
We swap its operations (data augmentation, convolutions, subsampling) with their graph counterparts.
Results on CIFAR-10, scrambled CIFAR-10 and PINES fMRI
Y = h(Θ̆SX)

Support                     | MLP          | CNN        | Grid graph: ChebNet (c) | Grid graph: proposed | Covariance graph: proposed
Full Data Augmentation      | 78.62% (a,b) | 93.80%     | 85.13%                  | 93.94%               | 92.57%
Data Augmentation w/o Flips | ——           | 92.73%     | 84.41%                  | 92.94%               | 91.29%
Graph Data Augmentation     | ——           | 92.10% (d) | ——                      | 92.81%               | 91.07% (a)
None                        | 69.62%       | 87.78%     | ——                      | 88.83%               | 85.88% (a)

(a) No priors about the structure
(b) Lin et al., 2015
(c) Defferrard et al., 2016
(d) Data augmentation done with the covariance graph
Table: CIFAR-10 and scrambled CIFAR-10

Support  | None   | None              | Neighborhood graph | Neighborhood graph
Method   | MLP    | CNN (1x1 kernels) | ChebNet (c)        | Proposed
Accuracy | 82.62% | 84.30%            | 82.80%             | 85.08%
Table: PINES fMRI
Outline
1 Motivation and problem statement
2 Literature overview
3 Convolution of graph signals
4 Deep learning on graph domains
5 Experiments
6 Conclusion
Summary
We studied convolutions of graph signals and used them to build and understand
extensions of CNN on graph domains.
Convolution of graph signals:
Algebraic description of convolutions of graph signals: ϕ- and m-convolutions,
constructed as the class of linear operators that are equivariant to actions of a group.
Strong characterization results for graphs with Cayley subgraphs.
Extension with groupoids.
Deep learning on graphs:
Novel representation based on weight sharing: the neural contraction
Monte-Carlo Neural Networks (MCNN)
Graph Contraction Networks (GCT)
Graph dropout (GCN*)
Translation-Convolutional Neural Network (TCNN)
Final words
Perspectives:
In the literature of this domain: semi-supervised >> supervised.
Both tasks can be abstracted to a more general case.
Y = h(Θ̆SX)
There can be more than one tensor rank whose relations can be represented by a graph:
Y = h(g(X, A1, A2, ..., Ar))
Extended range of applications for deep learning architectures.
Thinking in AI might be about creating connections (captured by S) rather than about updating weights.
Thank you for your attention!
Contributions
Generalizing the convolution operator to extend CNNs to irregular domains, Jean-Charles
Vialatte, Vincent Gripon, Grégoire Mercier, arXiv preprint 2016.
Neighborhood-preserving translations on graphs, Nicolas Grelier, Bastien Pasdeloup,
Jean-Charles Vialatte, Vincent Gripon, IEEE GlobalSIP 2016.
Learning local receptive fields and their weight sharing scheme on graphs, Jean-Charles
Vialatte, Vincent Gripon, Gilles Coppin, IEEE GlobalSIP 2017.
A study of deep learning robustness against computation failures, Jean-Charles Vialatte,
François Leduc-Primeau, ACTA 2017.
Convolutional neural networks on irregular domains through approximate translations on
inferred graphs, Bastien Pasdeloup, Vincent Gripon, Jean-Charles Vialatte, Dominique
Pastor, arXiv preprint 2017.
Translations on graphs with neighborhood preservation, Bastien Pasdeloup, Vincent
Gripon, Nicolas Grelier, Jean-Charles Vialatte, Dominique Pastor, arXiv preprint 2017.
Matching CNNs without Priors about data, Carlos-Eduardo Rosar Kos Lassance,
Jean-Charles Vialatte, Vincent Gripon, IEEE DSW 2018.
On convolution of graph signals and deep learning on graph domains, Jean-Charles
Vialatte, thesis, unpublished.
Convolution of graph signals, Jean-Charles Vialatte, Vincent Gripon, Gilles Coppin,
unpublished.
Graph contraction networks, Graph dropout, Monte-Carlo Networks, Jean-Charles
Vialatte, Vincent Gripon, Gilles Coppin, unpublished.
 
Divergence clustering
Divergence clusteringDivergence clustering
Divergence clustering
 
On the equality of the grundy numbers of a graph
On the equality of the grundy numbers of a graphOn the equality of the grundy numbers of a graph
On the equality of the grundy numbers of a graph
 
Divergence center-based clustering and their applications
Divergence center-based clustering and their applicationsDivergence center-based clustering and their applications
Divergence center-based clustering and their applications
 
graphs
graphsgraphs
graphs
 
New Classes of Odd Graceful Graphs
New Classes of Odd Graceful GraphsNew Classes of Odd Graceful Graphs
New Classes of Odd Graceful Graphs
 
Strong (Weak) Triple Connected Domination Number of a Fuzzy Graph
Strong (Weak) Triple Connected Domination Number of a Fuzzy GraphStrong (Weak) Triple Connected Domination Number of a Fuzzy Graph
Strong (Weak) Triple Connected Domination Number of a Fuzzy Graph
 
Notes on Spectral Clustering
Notes on Spectral ClusteringNotes on Spectral Clustering
Notes on Spectral Clustering
 
Nodal Domain Theorem for the p-Laplacian on Graphs and the Related Multiway C...
Nodal Domain Theorem for the p-Laplacian on Graphs and the Related Multiway C...Nodal Domain Theorem for the p-Laplacian on Graphs and the Related Multiway C...
Nodal Domain Theorem for the p-Laplacian on Graphs and the Related Multiway C...
 
Goodfellow, Bengio, Couville (2016) "Deep Learning", Chap. 7
Goodfellow, Bengio, Couville (2016) "Deep Learning", Chap. 7Goodfellow, Bengio, Couville (2016) "Deep Learning", Chap. 7
Goodfellow, Bengio, Couville (2016) "Deep Learning", Chap. 7
 
Lego like spheres and tori, enumeration and drawings
Lego like spheres and tori, enumeration and drawingsLego like spheres and tori, enumeration and drawings
Lego like spheres and tori, enumeration and drawings
 

Dernier

POGONATUM : morphology, anatomy, reproduction etc.
POGONATUM : morphology, anatomy, reproduction etc.POGONATUM : morphology, anatomy, reproduction etc.
POGONATUM : morphology, anatomy, reproduction etc.Silpa
 
Biogenic Sulfur Gases as Biosignatures on Temperate Sub-Neptune Waterworlds
Biogenic Sulfur Gases as Biosignatures on Temperate Sub-Neptune WaterworldsBiogenic Sulfur Gases as Biosignatures on Temperate Sub-Neptune Waterworlds
Biogenic Sulfur Gases as Biosignatures on Temperate Sub-Neptune WaterworldsSérgio Sacani
 
PSYCHOSOCIAL NEEDS. in nursing II sem pptx
PSYCHOSOCIAL NEEDS. in nursing II sem pptxPSYCHOSOCIAL NEEDS. in nursing II sem pptx
PSYCHOSOCIAL NEEDS. in nursing II sem pptxSuji236384
 
Phenolics: types, biosynthesis and functions.
Phenolics: types, biosynthesis and functions.Phenolics: types, biosynthesis and functions.
Phenolics: types, biosynthesis and functions.Silpa
 
Digital Dentistry.Digital Dentistryvv.pptx
Digital Dentistry.Digital Dentistryvv.pptxDigital Dentistry.Digital Dentistryvv.pptx
Digital Dentistry.Digital Dentistryvv.pptxMohamedFarag457087
 
Proteomics: types, protein profiling steps etc.
Proteomics: types, protein profiling steps etc.Proteomics: types, protein profiling steps etc.
Proteomics: types, protein profiling steps etc.Silpa
 
Genome sequencing,shotgun sequencing.pptx
Genome sequencing,shotgun sequencing.pptxGenome sequencing,shotgun sequencing.pptx
Genome sequencing,shotgun sequencing.pptxSilpa
 
FAIRSpectra - Enabling the FAIRification of Spectroscopy and Spectrometry
FAIRSpectra - Enabling the FAIRification of Spectroscopy and SpectrometryFAIRSpectra - Enabling the FAIRification of Spectroscopy and Spectrometry
FAIRSpectra - Enabling the FAIRification of Spectroscopy and SpectrometryAlex Henderson
 
LUNULARIA -features, morphology, anatomy ,reproduction etc.
LUNULARIA -features, morphology, anatomy ,reproduction etc.LUNULARIA -features, morphology, anatomy ,reproduction etc.
LUNULARIA -features, morphology, anatomy ,reproduction etc.Silpa
 
Genetics and epigenetics of ADHD and comorbid conditions
Genetics and epigenetics of ADHD and comorbid conditionsGenetics and epigenetics of ADHD and comorbid conditions
Genetics and epigenetics of ADHD and comorbid conditionsbassianu17
 
Bhiwandi Bhiwandi ❤CALL GIRL 7870993772 ❤CALL GIRLS ESCORT SERVICE In Bhiwan...
Bhiwandi Bhiwandi ❤CALL GIRL 7870993772 ❤CALL GIRLS  ESCORT SERVICE In Bhiwan...Bhiwandi Bhiwandi ❤CALL GIRL 7870993772 ❤CALL GIRLS  ESCORT SERVICE In Bhiwan...
Bhiwandi Bhiwandi ❤CALL GIRL 7870993772 ❤CALL GIRLS ESCORT SERVICE In Bhiwan...Monika Rani
 
GBSN - Microbiology (Unit 3)Defense Mechanism of the body
GBSN - Microbiology (Unit 3)Defense Mechanism of the body GBSN - Microbiology (Unit 3)Defense Mechanism of the body
GBSN - Microbiology (Unit 3)Defense Mechanism of the body Areesha Ahmad
 
biology HL practice questions IB BIOLOGY
biology HL practice questions IB BIOLOGYbiology HL practice questions IB BIOLOGY
biology HL practice questions IB BIOLOGY1301aanya
 
Grade 7 - Lesson 1 - Microscope and Its Functions
Grade 7 - Lesson 1 - Microscope and Its FunctionsGrade 7 - Lesson 1 - Microscope and Its Functions
Grade 7 - Lesson 1 - Microscope and Its FunctionsOrtegaSyrineMay
 
Role of AI in seed science Predictive modelling and Beyond.pptx
Role of AI in seed science  Predictive modelling and  Beyond.pptxRole of AI in seed science  Predictive modelling and  Beyond.pptx
Role of AI in seed science Predictive modelling and Beyond.pptxArvind Kumar
 
THE ROLE OF BIOTECHNOLOGY IN THE ECONOMIC UPLIFT.pptx
THE ROLE OF BIOTECHNOLOGY IN THE ECONOMIC UPLIFT.pptxTHE ROLE OF BIOTECHNOLOGY IN THE ECONOMIC UPLIFT.pptx
THE ROLE OF BIOTECHNOLOGY IN THE ECONOMIC UPLIFT.pptxANSARKHAN96
 
Cyathodium bryophyte: morphology, anatomy, reproduction etc.
Cyathodium bryophyte: morphology, anatomy, reproduction etc.Cyathodium bryophyte: morphology, anatomy, reproduction etc.
Cyathodium bryophyte: morphology, anatomy, reproduction etc.Silpa
 
TransientOffsetin14CAftertheCarringtonEventRecordedbyPolarTreeRings
TransientOffsetin14CAftertheCarringtonEventRecordedbyPolarTreeRingsTransientOffsetin14CAftertheCarringtonEventRecordedbyPolarTreeRings
TransientOffsetin14CAftertheCarringtonEventRecordedbyPolarTreeRingsSérgio Sacani
 
FAIRSpectra - Enabling the FAIRification of Analytical Science
FAIRSpectra - Enabling the FAIRification of Analytical ScienceFAIRSpectra - Enabling the FAIRification of Analytical Science
FAIRSpectra - Enabling the FAIRification of Analytical ScienceAlex Henderson
 

Dernier (20)

POGONATUM : morphology, anatomy, reproduction etc.
POGONATUM : morphology, anatomy, reproduction etc.POGONATUM : morphology, anatomy, reproduction etc.
POGONATUM : morphology, anatomy, reproduction etc.
 
Biogenic Sulfur Gases as Biosignatures on Temperate Sub-Neptune Waterworlds
Biogenic Sulfur Gases as Biosignatures on Temperate Sub-Neptune WaterworldsBiogenic Sulfur Gases as Biosignatures on Temperate Sub-Neptune Waterworlds
Biogenic Sulfur Gases as Biosignatures on Temperate Sub-Neptune Waterworlds
 
PSYCHOSOCIAL NEEDS. in nursing II sem pptx
PSYCHOSOCIAL NEEDS. in nursing II sem pptxPSYCHOSOCIAL NEEDS. in nursing II sem pptx
PSYCHOSOCIAL NEEDS. in nursing II sem pptx
 
Phenolics: types, biosynthesis and functions.
Phenolics: types, biosynthesis and functions.Phenolics: types, biosynthesis and functions.
Phenolics: types, biosynthesis and functions.
 
Digital Dentistry.Digital Dentistryvv.pptx
Digital Dentistry.Digital Dentistryvv.pptxDigital Dentistry.Digital Dentistryvv.pptx
Digital Dentistry.Digital Dentistryvv.pptx
 
Proteomics: types, protein profiling steps etc.
Proteomics: types, protein profiling steps etc.Proteomics: types, protein profiling steps etc.
Proteomics: types, protein profiling steps etc.
 
Genome sequencing,shotgun sequencing.pptx
Genome sequencing,shotgun sequencing.pptxGenome sequencing,shotgun sequencing.pptx
Genome sequencing,shotgun sequencing.pptx
 
FAIRSpectra - Enabling the FAIRification of Spectroscopy and Spectrometry
FAIRSpectra - Enabling the FAIRification of Spectroscopy and SpectrometryFAIRSpectra - Enabling the FAIRification of Spectroscopy and Spectrometry
FAIRSpectra - Enabling the FAIRification of Spectroscopy and Spectrometry
 
LUNULARIA -features, morphology, anatomy ,reproduction etc.
LUNULARIA -features, morphology, anatomy ,reproduction etc.LUNULARIA -features, morphology, anatomy ,reproduction etc.
LUNULARIA -features, morphology, anatomy ,reproduction etc.
 
Genetics and epigenetics of ADHD and comorbid conditions
Genetics and epigenetics of ADHD and comorbid conditionsGenetics and epigenetics of ADHD and comorbid conditions
Genetics and epigenetics of ADHD and comorbid conditions
 
Clean In Place(CIP).pptx .
Clean In Place(CIP).pptx                 .Clean In Place(CIP).pptx                 .
Clean In Place(CIP).pptx .
 
Bhiwandi Bhiwandi ❤CALL GIRL 7870993772 ❤CALL GIRLS ESCORT SERVICE In Bhiwan...
Bhiwandi Bhiwandi ❤CALL GIRL 7870993772 ❤CALL GIRLS  ESCORT SERVICE In Bhiwan...Bhiwandi Bhiwandi ❤CALL GIRL 7870993772 ❤CALL GIRLS  ESCORT SERVICE In Bhiwan...
Bhiwandi Bhiwandi ❤CALL GIRL 7870993772 ❤CALL GIRLS ESCORT SERVICE In Bhiwan...
 
GBSN - Microbiology (Unit 3)Defense Mechanism of the body
GBSN - Microbiology (Unit 3)Defense Mechanism of the body GBSN - Microbiology (Unit 3)Defense Mechanism of the body
GBSN - Microbiology (Unit 3)Defense Mechanism of the body
 
biology HL practice questions IB BIOLOGY
biology HL practice questions IB BIOLOGYbiology HL practice questions IB BIOLOGY
biology HL practice questions IB BIOLOGY
 
Grade 7 - Lesson 1 - Microscope and Its Functions
Grade 7 - Lesson 1 - Microscope and Its FunctionsGrade 7 - Lesson 1 - Microscope and Its Functions
Grade 7 - Lesson 1 - Microscope and Its Functions
 
Role of AI in seed science Predictive modelling and Beyond.pptx
Role of AI in seed science  Predictive modelling and  Beyond.pptxRole of AI in seed science  Predictive modelling and  Beyond.pptx
Role of AI in seed science Predictive modelling and Beyond.pptx
 
THE ROLE OF BIOTECHNOLOGY IN THE ECONOMIC UPLIFT.pptx
THE ROLE OF BIOTECHNOLOGY IN THE ECONOMIC UPLIFT.pptxTHE ROLE OF BIOTECHNOLOGY IN THE ECONOMIC UPLIFT.pptx
THE ROLE OF BIOTECHNOLOGY IN THE ECONOMIC UPLIFT.pptx
 
Cyathodium bryophyte: morphology, anatomy, reproduction etc.
Cyathodium bryophyte: morphology, anatomy, reproduction etc.Cyathodium bryophyte: morphology, anatomy, reproduction etc.
Cyathodium bryophyte: morphology, anatomy, reproduction etc.
 
TransientOffsetin14CAftertheCarringtonEventRecordedbyPolarTreeRings
TransientOffsetin14CAftertheCarringtonEventRecordedbyPolarTreeRingsTransientOffsetin14CAftertheCarringtonEventRecordedbyPolarTreeRings
TransientOffsetin14CAftertheCarringtonEventRecordedbyPolarTreeRings
 
FAIRSpectra - Enabling the FAIRification of Analytical Science
FAIRSpectra - Enabling the FAIRification of Analytical ScienceFAIRSpectra - Enabling the FAIRification of Analytical Science
FAIRSpectra - Enabling the FAIRification of Analytical Science
 

On Convolution of Graph Signals and Deep Learning on Graph Domains

  • 1. 1/50 On Convolution of Graph Signals And Deep Learning on Graph Domains December 13th 2018 Candidate: Jean-Charles Vialatte Advisors: Vincent Gripon, Mathias Herberts Supervisor: Gilles Coppin Jury: Pierre Borgnat∗, Matthias Löwe∗, Paulo Goncalves, Juliette Mattioli ∗: examiners December 13th 2018 1 / 50
  • 2. 2/50 Outline 1 Motivation and problem statement 2 Literature overview 3 Convolution of graph signals 4 Deep learning on graph domains 5 Experiments 6 Conclusion December 13th 2018 2 / 50
  • 3. 2/50 Outline 1 Motivation and problem statement 2 Literature overview 3 Convolution of graph signals 4 Deep learning on graph domains 5 Experiments 6 Conclusion December 13th 2018 2 / 50
  • 4. 3/50 Definitions: Signals and Graphs Signal: December 13th 2018 3 / 50
  • 5. 3/50 Definitions: Signals and Graphs Signal: Graph: five vertices 1–5 linked by edges with weights a, b, c, d, e; A: adjacency matrix, A = [[0, a, 0, d, e], [a, 0, 0, b, 0], [0, 0, 0, c, 0], [d, b, c, 0, 0], [e, 0, 0, 0, 0]]
  • 8. 4/50 Examples of signals Most domains can be represented with a graph: Euclidean domains Non-Euclidean domains December 13th 2018 4 / 50
  • 9. 5/50 Deep learning performances On Euclidean domains: bar chart of classification error (in %), without vs with convolutions — digits: 1% vs 0.5%; tiny images: 22% vs 3%
  • 10. 6/50 A key factor: convolution Defined on Euclidean domains: December 13th 2018 6 / 50
  • 11. 7/50 Connectivity pattern: MLP vs CNN Dense layer fully connected no tied weights ex: 81 different weights here December 13th 2018 7 / 50
  • 12. 7/50 Connectivity pattern: MLP vs CNN Dense layer fully connected no tied weights ex: 81 different weights here Convolutional layer locally connected with weight sharing ex: 3 different weights here December 13th 2018 7 / 50
  • 13. 8/50 Problem: How to extend convolutions? Euclidean structure Non-Euclidean structure convolution
  • 14. 8/50 Problem: How to extend convolutions? Euclidean structure Non-Euclidean structure convolution ? ? December 13th 2018 8 / 50
  • 15. 9/50 Supervised vs semi-supervised application Let X be a dataset. We classify its rows. Two different kinds of graph structures: supervised classification of graph-structured data, X of shape b × n with the graph on the n columns; semi-supervised classification of nodes, X of shape n × p with the graph on the n rows
  • 16. 9/50 Outline 1 Motivation and problem statement 2 Literature overview 3 Convolution of graph signals 4 Deep learning on graph domains 5 Experiments 6 Conclusion December 13th 2018 9 / 50
  • 17. 10/50 Spectral approaches Convolutions are defined in the graph spectral domain. L = D − A = UΛU^T, gft(X) = UX. Figure: example of signals of the Laplacian eigenbasis. Using the GFT, convolution amounts to a pointwise multiplication in the spectral domain: X ⊗ Θ = U^T(UX ⊙ UΘ)
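The spectral pipeline on this slide can be sketched numerically. The graph below (unit weights on the edge pattern of the introductory slide) and the filter Θ are illustrative choices, not values from the thesis:

```python
import numpy as np

# Spectral graph convolution, as on the slide: L = D - A = U Λ U^T,
# gft(X) = U X (eigenvectors of L as rows of U), and
# X ⊗ Θ = U^T (U X ⊙ U Θ).
A = np.array([[0., 1., 0., 1., 1.],
              [1., 0., 0., 1., 0.],
              [0., 0., 0., 1., 0.],
              [1., 1., 1., 0., 0.],
              [1., 0., 0., 0., 0.]])
D = np.diag(A.sum(axis=1))
L = D - A
lam, V = np.linalg.eigh(L)   # columns of V are eigenvectors of L
U = V.T                      # rows as eigenvectors, so that gft(X) = U X

def graph_conv(X, Theta):
    """Pointwise multiplication in the spectral domain."""
    return U.T @ (U @ X * (U @ Theta))

X = np.random.randn(5)
Theta = np.random.randn(5)
out = graph_conv(X, Theta)
```

Because the spectral product is pointwise, this operation is commutative in its two arguments, one of the properties the later slides contrast with vertex-domain definitions.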
  • 18. 11/50 Spectral approaches X ⊗ Θ = U^T(UX ⊙ UΘ) Pros: elegant and fast under some approximations; can be used off the shelf: no need to specify any weight sharing. Cons: introduces isotropic symmetries; does not match Euclidean convolutions on grid graphs. Figure: example of translation defined as convolution with a dirac (courtesy of Pasdeloup, B.)
  • 19. 12/50 Vertex-domain approaches Convolutions are defined as a sum over a neighborhood, usually a sum of dot products (cf. references in thesis manuscript): (X ⊗ Θ)(v_i) = Σ_{j ∈ N_{v_i}} θ_{ij} X(v_j)
  • 20. 13/50 Vertex-domain approaches (X ⊗ Θ)(v_i) = Σ_{j ∈ N_{v_i}} θ_{ij} X(v_j) Pros: match Euclidean convolutions on grid graphs; locally connected. Cons: weight sharing is not always explicit
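The neighborhood sum can be implemented directly. The dict-based storage of the θ_ij and the toy path graph below are illustrative choices, not the thesis's data structures:

```python
import numpy as np

# Vertex-domain convolution: (X ⊗ Θ)(v_i) = Σ_{j ∈ N(v_i)} θ_ij X(v_j).
# Weights θ_ij are stored edge-wise in a dict keyed by (i, j).
def vertex_conv(X, neighbors, theta):
    out = np.zeros_like(X)
    for i, nbrs in neighbors.items():
        out[i] = sum(theta[(i, j)] * X[j] for j in nbrs)
    return out

# Path graph 0 - 1 - 2 with unit weights on closed neighborhoods.
neighbors = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2]}
theta = {(i, j): 1.0 for i, nbrs in neighbors.items() for j in nbrs}
X = np.array([1.0, 2.0, 3.0])
out = vertex_conv(X, neighbors, theta)  # [3., 6., 5.]
```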
  • 21. 14/50 A few important references Bruna et al., 2013: spectral filters with O(1) weights, a smoother matrix K used to interpolate more weights: g_θ(X) = U^T(UX ⊙ Kθ). Defferrard et al., 2016: filters based on Chebyshev polynomials (T_i)_i: g_θ(L) = Σ_{i=0}^{k} θ_i T_i(L). Kipf et al., 2016: application to semi-supervised settings: Y = AXΘ. Velickovic et al., 2017: introduction of attention coefficients (A_k)_k: Y = Σ_{k=1}^{K} A_k X Θ_k. Du et al., 2017: convolution from the GSP field (Sandryhaila et al., 2013): Y = Σ_{k=1}^{K} A^k X Θ_k
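The semi-supervised rule Y = AXΘ of Kipf et al. is usually applied with a renormalized adjacency. A minimal sketch, assuming the standard Â = D̃^{-1/2}(A + I)D̃^{-1/2} renormalization and an illustrative toy graph:

```python
import numpy as np

# One GCN propagation step in the spirit of Y = A X Θ (Kipf et al.),
# using the renormalized adjacency with self-loops.
def gcn_layer(A, X, Theta):
    A_tilde = A + np.eye(A.shape[0])            # add self-loops
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # symmetric normalization
    return np.maximum(A_hat @ X @ Theta, 0.0)   # ReLU nonlinearity

A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.random.randn(3, 4)      # 3 nodes, 4 input features
Theta = np.random.randn(4, 2)  # 4 input channels -> 2 feature maps
Y = gcn_layer(A, X, Theta)     # shape (3, 2)
```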
  • 22. 14/50 Outline 1 Motivation and problem statement 2 Literature overview 3 Convolution of graph signals 4 Deep learning on graph domains 5 Experiments 6 Conclusion December 13th 2018 14 / 50
  • 23. 15/50 Recall: Euclidean convolution Definition: Convolution on S(Z²). The (discrete) convolution s1 ∗ s2 is a binary operation in S(Z²) defined as: ∀(a, b) ∈ Z², (s1 ∗ s2)[a, b] = Σ_i Σ_j s1[i, j] s2[a − i, b − j]. A convolution operator f is a function parameterized by a signal w ∈ S(Z²) s.t.: f = . ∗ w (right operator), f = w ∗ . (left operator). Some notable properties: linearity; locality and weight sharing; commutativity (optional); equivariance to translations (i.e. commutes with them)
  • 24. 16/50 Characterization by translational equivariance Theorem Characterization of convolution operators on S(Z2 ) A linear transformation f is equivariant to translations ⇔ it is a convolution operator. translation f f translation December 13th 2018 16 / 50
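The characterization can be checked numerically on the finite analogue Z_n × Z_n, where convolution is circular and np.roll plays the translation. This is a sketch of the equivariance property, not the thesis's formal proof:

```python
import numpy as np

# A convolution operator commutes with translations: applying a
# translation before or after the convolution gives the same result.
def conv2d_circular(s, w):
    """Direct-sum circular convolution on Z_n x Z_n."""
    n = s.shape[0]
    out = np.zeros_like(s)
    for a in range(n):
        for b in range(n):
            for i in range(n):
                for j in range(n):
                    out[a, b] += w[i, j] * s[(a - i) % n, (b - j) % n]
    return out

rng = np.random.default_rng(0)
s = rng.standard_normal((4, 4))
w = rng.standard_normal((4, 4))
# translate then convolve  vs  convolve then translate
shifted_then_conv = conv2d_circular(np.roll(s, (1, 2), axis=(0, 1)), w)
conv_then_shifted = np.roll(conv2d_circular(s, w), (1, 2), axis=(0, 1))
```

The two paths agree up to floating-point error, which is exactly the commuting diagram drawn on the slide.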
  • 25. 17/50 Convolution of signals with graph domains Can use the Euclidean convolution How to extend the convolution here ? December 13th 2018 17 / 50
  • 26. 18/50 A few notions of representation theory A group is a set (defined by some properties) which can act on other sets. Let Γ be a group, g ∈ Γ, and V be a set. Example of group actions: Lg : Γ → Γ (auto-action) g(.) : V → V (action on V ) g h → gh v → g(v) Lg g(.) December 13th 2018 18 / 50
  • 27. 19/50 Group convolution Definition: Group convolution. Let a group Γ; the group convolution between two signals s1 and s2 ∈ S(Γ) is defined as: ∀h ∈ Γ, (s1 ∗ s2)[h] = Σ_{g∈Γ} s1[g] s2[g⁻¹h], provided at least one of the signals has finite support if Γ is not finite. Theorem: Characterization of group convolution operators. Let a group Γ and f ∈ L(S(Γ)): 1. f is a group convolution right operator ⇔ f is equivariant to left multiplications; 2. f is a group convolution left operator ⇔ f is equivariant to right multiplications; 3. f is a group convolution commutative operator ⇔ f is equivariant to multiplications
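On the cyclic group Γ = Z_n, where g⁻¹h is subtraction mod n, the group convolution reduces to ordinary circular convolution. A minimal sketch with illustrative signals:

```python
import numpy as np

# Group convolution on Γ = Z_n:
# (s1 ∗ s2)[h] = Σ_g s1[g] s2[(h - g) mod n].
def group_conv_Zn(s1, s2):
    n = len(s1)
    return np.array([sum(s1[g] * s2[(h - g) % n] for g in range(n))
                     for h in range(n)])

s1 = np.array([1.0, 2.0, 0.0, 0.0])
s2 = np.array([0.0, 1.0, 0.0, 0.0])  # "dirac" at 1: shifts s1 one step
out = group_conv_Zn(s1, s2)          # [0., 1., 2., 0.]
```

Since Z_n is abelian, the operation is commutative, matching point 3 of the characterization theorem.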
  • 28. 20/50 Goal: convolution on the vertex set Let a graph G = V, E s.t. E ⊂ V 2 . Starting point: bijective map ϕ between a group Γ and G (or a subgraph). Γ V S(Γ) S(V ) ϕ lin. ext. lin. ext. ϕ December 13th 2018 20 / 50
  • 29. 21/50 Convolution on the vertex set from group convolutions (Goal: convolution on the vertex set) ϕ: bijective map S(Γ) S(V ) S(Γ) S(V ) f f=ϕ◦f◦ϕ−1 ϕ−1 ϕ Equivariance theorem to operators of the form ϕ ◦ Lg ◦ ϕ−1 holds. But not necessarily to actions of Γ on V (i.e. of the form g(.)). December 13th 2018 21 / 50
  • 30. 22/50 Needed condition: equivariant map (Goal: equivariance theorem holds) Condition: ϕ is a bijective equivariant map Denote gv = ϕ−1 (v). gu gvgu u ϕ(gvgu) Lgv ϕ ϕ gv(.) We need gv(.) = ϕ ◦ Lgv ◦ ϕ−1 i.e. ∀u ∈ V, gv(u) = ϕ(gvgu). December 13th 2018 22 / 50
  • 31. 23/50 ϕ-convolution ϕ: bijective equivariant map, i.e. g_v(u) = ϕ(g_v g_u). Definition: ϕ-convolution. ∀s1, s2 ∈ S(V): s1 ∗ϕ s2 = Σ_{v∈V} s1[v] g_v(s2) (1) = Σ_{g∈Γ} s1[ϕ(g)] g(s2) (2). Theorem: Characterization of ϕ-convolution right operators. f is a ϕ-convolution right operator ⇔ f is equivariant to Γ
  • 32. 24/50 Mixed domain formulation Let Γ be an abelian group. No need to exhibit ϕ in this case. Definition: Mixed domain convolution. ∀r ∈ S(Γ) and ∀s ∈ S(V): r ∗m s = Σ_{g∈Γ} r[g] g(s) ∈ S(V). Equivariance theorem holds. Corollary: f is an m-convolution left operator ⇔ f is equivariant to Γ (the converse sense still requires bijectivity between Γ and V)
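A sketch of the mixed-domain convolution, assuming Γ = Z_n acting on a ring graph by rotation (np.roll) — an illustrative choice of abelian group action. The corollary's equivariance can then be checked directly:

```python
import numpy as np

# Mixed-domain convolution r ∗m s = Σ_{g∈Γ} r[g] g(s), with r a signal
# on the group Γ and s a signal on the vertex set V.
def m_conv(r, s):
    n = len(s)
    return sum(r[g] * np.roll(s, g) for g in range(n))

s = np.array([1.0, 0.0, 0.0, 0.0])
r = np.array([0.5, 0.5, 0.0, 0.0])  # average of identity and one rotation
out = m_conv(r, s)                   # [0.5, 0.5, 0., 0.]
```

Applying a rotation to s before or after the m-convolution yields the same result, which is the equivariance stated by the corollary.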
  • 33. 25/50 Inclusion and role of the edge set Edge constrained (EC) d a b c e g(a) ∈ {c, d} g(a) /∈ {b, e} Locality Preserving (LP) a b c a’ b’ c’ g(.) December 13th 2018 25 / 50
  • 34. 26/50 Cayley graphs (Goal: description of EC and LP convolutions) Definition: Cayley graph and subgraph. Let a group Γ and one of its generating sets U. The Cayley graph generated by U is the digraph G = ⟨V, E⟩ such that V = Γ and E is such that, either: ∀a, b ∈ Γ, a → b ⇔ ∃g ∈ U, ga = b (left Cayley graph); ∀a, b ∈ Γ, a → b ⇔ ∃g ∈ U, ag = b (right Cayley graph); both points above (abelian Cayley graph). A Cayley subgraph is a subgraph that is isomorphic to a Cayley graph. Figure: an edge of a Cayley graph
  • 35. 27/50 Characterization of EC and LP convolutions Theorem Characterization by Cayley subgraphs Let a graph G = V, E , then: 1 its left Cayley subgraphs characterize its EC ϕ-convolutions, 2 its right Cayley subgraphs characterize its LP ϕ-convolutions, 3 its abelian Cayley subgraphs characterize its EC and LP m-convolutions. Corollary Properties of convolutions that are both EC and LP 1 If a ϕ-convolution of group Γ is EC and LP then Γ is abelian; 2 an m-convolution is EC if, and only if, it is also LP. December 13th 2018 27 / 50
  • 36. 28/50 Other results in the manuscript Description with smaller kernels The weight sharing is preserved More detailed results depending on laterality of operator and equivariance Analysis of limitations due to algebraic structure of the Cayley subgraphs Above theorems hold for groupoids of partial transformation under mild conditions They also hold for groupoids based on paths under restrictive conditions December 13th 2018 28 / 50
  • 37. 28/50 Outline 1 Motivation and problem statement 2 Literature overview 3 Convolution of graph signals 4 Deep learning on graph domains 5 Experiments 6 Conclusion December 13th 2018 28 / 50
  • 38. 29/50 Propagational representation of a layer Ld Ld+1 Definition Edge-constrained layer Connections (dotted lines) are constrained by edges (red lines) in a local receptive field. December 13th 2018 29 / 50
  • 39. 29/50 Propagational representation of a layer Ld Ld+1 Definition Edge-constrained layer Connections (dotted lines) are constrained by edges (red lines) in a local receptive field. Theorem Characterization by local receptive fields (LRF) There is a graph for which a layer is EC ⇔ its LRF are intertwined. December 13th 2018 29 / 50
  • 40. 30/50 Connectivity matrix W (Goal: generalized layer representation) y = h(W · x + b) December 13th 2018 30 / 50
  • 41. 31/50 Scheme tensor S (Goal: generalized layer representation) W = Θ · S, y = h(Θ · S · x + b). Example of a 5×5 scheme with a tridiagonal support: nonzero entries s11, s12; s21, s22, s23; s32, s33, s34; s43, s44, s45; s54, s55. W_{45} = Σ_{k=1}^{ω} Θ_k S_{45k}
  • 42. 32/50 Neural contraction (Θ̆SX)[j, q, b] = Σ_{k=1}^{ω} Σ_{p=1}^{N} Σ_{i=1}^{n} Θ[k, p, q] S[k, i, j] X[i, p, b]; g(X) = Θ̆SX where W^{pq}_{ij} = Θ^{pq}_k S^k_{ij} and g(X)^{jq}_b = W^{jq}_{ip} X^{ip}_b. Table of indices (index, size, description): i, n, input neuron; j, m, output neuron; p, N, input channel; q, M, feature map; k, ω, kernel weight; b, B, batch instance. Table of shapes: Θ: ω × N × M; S: ω × n × m; X: n × N × B; ΘS: n × m × N × M; SX: ω × m × N × B; ΘX: ω × n × M × B; Θ̆SX: m × M × B
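The neural contraction maps directly onto a single einsum call; shapes follow the slide's tables, and the sizes below are illustrative:

```python
import numpy as np

# Neural contraction:
# ΘSX[j, q, b] = Σ_{k,p,i} Θ[k, p, q] S[k, i, j] X[i, p, b].
omega, n, m, N, M, B = 3, 5, 4, 2, 6, 7
Theta = np.random.randn(omega, N, M)  # ω × N × M  (weights)
S = np.random.randn(omega, n, m)      # ω × n × m  (weight-sharing scheme)
X = np.random.randn(n, N, B)          # n × N × B  (batched input signals)
Y = np.einsum('kpq,kij,ipb->jqb', Theta, S, X)  # m × M × B
```

With S encoding a grid's weight-sharing pattern this reproduces a convolutional layer, and with a dense S it recovers a fully connected layer, which is the genericity claimed two slides later.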
  • 43. 33/50 Properties This formulation is: Linear Associative Commutative Generic (next slide) It is explainable as a convolution of graph signals when: in supervised application Either 0 or 1 weight per connection (sij are one-hot vectors) In other cases it can be seen as a linear combination of convolutions. December 13th 2018 33 / 50
  • 44. 34/50 Genericity of ternary representation Given adequate specification of the weight sharing scheme S, we can obtain, e.g.: a dense layer; a partially connected layer; a convolutional layer; a graph convolutional layer (GCN, Kipf et al.); a graph attention layer (GAT, Velickovic et al.); a topology-adaptive graph convolution layer (TAGCN, Du et al.); a mixture model convolutional layer (MoNet, Monti et al.); a generalized convolution under sparse priors; any partial connectivity pattern, sparse or not
  • 45. 35/50 Discussion X Y Y = h(˙ΘSX) The propagation logic is in the scheme S. For example, it can be either: given randomized learned inferred December 13th 2018 35 / 50
  • 46. 35/50 Outline 1 Motivation and problem statement 2 Literature overview 3 Convolution of graph signals 4 Deep learning on graph domains 5 Experiments 6 Conclusion December 13th 2018 35 / 50
  • 47. 36/50 Outline Experiments: Study of influence of symmetries using S Learning S when masked by an adjacency matrix and its powers Monte Carlo simulations with random realizations of S Learning S in semi-supervised applications Inferring S from translations December 13th 2018 36 / 50
  • 49. 38/50 Influence of symmetries Y = h(Θ̆SX). Figure: schemes S built as expectations E over supports of (2p + 1) positions perturbed by µ
  • 50. 39/50 Influence of symmetries: results on MNIST Y = h(Θ̆SX). Figure: classification error (in %) as a function of σ (in units of initial pixel separation, log scale from 10⁻¹ to 10²), for MLP and CNN
  • 51. 40/50 Learning both S and Θ on MNIST and scrambled MNIST Y = h(˙ΘSX) Ordering Conv5x5 A1 A2 A3 A4 A5 A6 no prior / 1.24% 1.02% 0.93% 0.90% 0.93% 1.00% prior 0.87% 1.21% 0.91% 0.91% 0.87% 0.80% 0.74% Table: Grid graphs on MNIST December 13th 2018 40 / 50
  • 52. 40/50 Learning both S and Θ on MNIST and scrambled MNIST Y = h(˙ΘSX) Ordering Conv5x5 A1 A2 A3 A4 A5 A6 no prior / 1.24% 1.02% 0.93% 0.90% 0.93% 1.00% prior 0.87% 1.21% 0.91% 0.91% 0.87% 0.80% 0.74% Table: Grid graphs on MNIST MLP Conv5x5 Thresholded (p = 3%) k-NN (k = 25) 1.44% 1.39% 1.06% 0.96% Table: Covariance graphs on Scrambled MNIST December 13th 2018 40 / 50
  • 53. 41/50 Experiments on text categorization Y = h(Θ̆SX). Figure: diagram of the MCNet architecture used: input → MC64 → MC64 → TrivialConv1 + FC500 → FC20 → output. Table: accuracies (in %) on 20NEWS, given as mean (max): MNB 68.51; FC2500 64.64; FC2500-500 65.76; ChebNet32 68.26; FC500 71.46 (72.25); MCNet 70.74 (72.62)
  • 54. 42/50 Benchmarks on citation networks Y = h(Θ̆SX). Comparison of Graph Convolutional Network (GCN), Graph Attention Network (GAT), Topology Adaptive GCN (TAGCN) with our models: addition of graph dropout to GCN (GCN*), Graph Contraction Network (GCT). Table: mean accuracy (in %) and standard deviation after 100 runs. Cora: MLP 58.8 ± 0.9, GCN 81.8 ± 0.9, GAT 83.3 ± 0.6, TAGCN 82.9 ± 0.7, GCN* 83.4 ± 0.7, GCT 83.3 ± 0.7. Citeseer: MLP 56.7 ± 1.1, GCN 72.2 ± 0.6, GAT 72.1 ± 0.6, TAGCN 71.7 ± 0.7, GCN* 72.5 ± 0.8, GCT 72.7 ± 0.5. Pubmed: MLP 72.6 ± 0.9, GCN 79.0 ± 0.5, GAT 78.3 ± 0.7, TAGCN 78.9 ± 0.5, GCN* 78.2 ± 0.7, GCT 79.2 ± 0.4
  • 55. 43/50 Another approach: finding translations in graphs to construct S Step 0 : infer a graph x0 x1 ... xm ... ⇒ 1 2 43 December 13th 2018 43 / 50
  • 56. 43/50 Another approach: finding translations in graphs to construct S Step 0 : infer a graph x0 x1 ... xm ... ⇒ 1 2 43 Step 1: infer translations 1 2 43 ⇒ December 13th 2018 43 / 50
  • 57. 44/50 Another approach: finding translations in graphs to construct S Step 2: design convolution weight-sharing w1× + w2× + w3× + w4× + w0× December 13th 2018 44 / 50
  • 58. 44/50 Another approach: finding translations in graphs to construct S Step 2: design convolution weight-sharing w1× + w2× + w3× + w4× + w0× Step 3: design data-augmentation x0 ⇒ December 13th 2018 44 / 50
  • 59. 45/50 Another approach: finding translations in graphs to construct S Step 4: design graph subsampling and convolution weight-sharing v0 1 2 43 ⇒ 1 2 43 ⇒ w0× + w1× + w2× December 13th 2018 45 / 50
  • 60. 46/50 Architecture We used a variant of deep residual networks (ResNet). We swap operations (data augmentation, convolutions, subsampling) with their counterparts. December 13th 2018 46 / 50
  • 61. 47/50 Results on CIFAR-10, scrambled CIFAR-10 and PINES fMRI Y = h(Θ̆SX). Table: CIFAR-10 and scrambled CIFAR-10. Columns: MLP; CNN; grid graph ChebNet^c; grid graph Proposed; covariance graph Proposed. Full data augmentation: 78.62%^{a,b}; 93.80%; 85.13%; 93.94%; 92.57%. Data augmentation w/o flips: ——; 92.73%; 84.41%; 92.94%; 91.29%. Graph data augmentation: ——; 92.10%^d; ——; 92.81%; 91.07%^a. None: 69.62%; 87.78%; ——; 88.83%; 85.88%^a. Notes: (a) no priors about the structure; (b) Lin et al., 2015; (c) Defferrard et al., 2016; (d) data augmentation done with covariance graph
  • 62. 47/50 Results on CIFAR-10, scrambled CIFAR-10 and PINES fMRI (continued) Table: PINES fMRI. Support: none — MLP 82.62%, CNN (1×1 kernels) 84.30%; neighborhood graph — ChebNet^c 82.80%, Proposed 85.08%
  • 63. 47/50 Outline 1 Motivation and problem statement 2 Literature overview 3 Convolution of graph signals 4 Deep learning on graph domains 5 Experiments 6 Conclusion December 13th 2018 47 / 50
  • 64. 48/50 Summary We studied convolutions of graph signals and used them to build and understand extensions of CNNs on graph domains. Convolution of graph signals: algebraic description of convolution of graph signals via ϕ- and m-convolutions, constructed as the class of linear operators that are equivariant to actions of a group; strong characterization results for graphs with Cayley subgraphs; extension with groupoids. Deep learning on graphs: novel representation based on weight sharing: the neural contraction; Monte-Carlo Neural Networks (MCNN); Graph Contraction Networks (GCT); graph dropout (GCN*); Translation-Convolutional Neural Networks (TCNN)
  • 65. 49/50 Final words Perspectives: in the literature of this domain, semi-supervised >> supervised; both tasks can be abstracted to a more general case: Y = h(Θ̆SX). There can be more than one tensor rank whose relations can be represented by a graph: Y = h(g(X, A1, A2, ..., Ar)). Extended range of applications for deep learning architectures. Thinking in AI might be about creating connections (captured by S) rather than about updating weights. Thank you for your attention!
  • 66. 50/50 Contributions Generalizing the convolution operator to extend CNNs to irregular domains, Jean-Charles Vialatte, Vincent Gripon, Grégoire Mercier, arXiv preprint 2016. Neighborhood-preserving translations on graphs, Nicolas Grelier, Bastien Pasdeloup, Jean-Charles Vialatte, Vincent Gripon, IEEE GlobalSIP 2016. Learning local receptive fields and their weight sharing scheme on graphs, Jean-Charles Vialatte, Vincent Gripon, Gilles Coppin, IEEE GlobalSIP 2017. A study of deep learning robustness against computation failures, Jean-Charles Vialatte, François Leduc-Primeau, ACTA 2017. Convolutional neural networks on irregular domains through approximate translations on inferred graphs, Bastien Pasdeloup, Vincent Gripon, Jean-Charles Vialatte, Dominique Pastor, arXiv preprint 2017. Translations on graphs with neighborhood preservation, Bastien Pasdeloup, Vincent Gripon, Nicolas Grelier, Jean-Charles Vialatte, Dominique Pastor, arXiv preprint 2017. Matching CNNs without priors about data, Carlos-Eduardo Rosar Kos Lassance, Jean-Charles Vialatte, Vincent Gripon, IEEE DSW 2018. On convolution of graph signals and deep learning on graph domains, Jean-Charles Vialatte, thesis, unpublished. Convolution of graph signals, Jean-Charles Vialatte, Vincent Gripon, Gilles Coppin, unpublished. Graph contraction networks, graph dropout, Monte-Carlo networks, Jean-Charles Vialatte, Vincent Gripon, Gilles Coppin, unpublished.