Information	in	the	Weights	
Mark	Chang
2020/06/10
Outline
• Traditional Machine Learning vs. Deep Learning
• Basic	Concepts	in	Information	Theory
• Information	in	the	Weights
Traditional Machine Learning vs. Deep Learning
• VC	Bound
• Generalization	in	Deep	Learning
• PAC-Bayesian	Bound	for	Deep	Learning
VC	Bound
• What causes over-fitting?
• Too many parameters -> over-fitting?
• Too many parameters -> high VC dimension -> over-fitting?
• … ?
[Figure: three hypotheses h fitted to training and testing data — under-fitting (too few parameters), appropriate fitting (adequate parameters), over-fitting (too many parameters).]
VC	Bound
• Over-fitting is caused by high VC Dimension
• For	a	given	dataset	(n	is	constant),	search	for	the	best	VC	Dimension
[Figure: error vs. d (VC dimension, i.e. model complexity). Training error decreases with d while testing error is U-shaped; the best VC dimension lies below d = n (shatter), beyond which the model over-fits.]

ε(h) ≤ ε̂(h) + √( (8/n) · log( 4(2n)^d / δ ) )

where ε(h) is the testing error, ε̂(h) the training error, n the number of training instances, and d the VC dimension.
• VC	Dimension	of	linear	model:	
• O(W)
• W	=	number	of	parameters	
• VC	Dimension	of	fully-connected	
neural	networks:	
• O(LW	log	W)
• L	=	number	of	layers
• W	=	number	of	parameters
• The VC dimension is independent of the data distribution; it depends only on the model
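To see why a huge VC dimension makes the classical bound vacuous, here is a minimal sketch that evaluates the complexity term √((8/n)·log(4(2n)^d/δ)) of the VC bound in log space; the δ and the parameter counts below are illustrative assumptions, not values from the slides:

```python
import math

def vc_gap(n, d, delta=0.05):
    """Complexity term of the VC bound, computed in log space:
    sqrt((8/n) * log(4 * (2n)^d / delta))."""
    log_term = math.log(4.0) + d * math.log(2 * n) - math.log(delta)
    return math.sqrt(8.0 / n * log_term)

# A small model (d = 100) on n = 50,000 samples: a non-trivial bound.
small = vc_gap(n=50_000, d=100)

# A 26M-parameter network on the same data: the bound is far above 1 (vacuous).
large = vc_gap(n=50_000, d=26_000_000)
```

The log-space form avoids overflowing when computing (2n)^d directly for d in the millions.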
Generalization	in	Deep	Learning
• Considering	a	toy	example:
A fully-connected neural network (input: 780, hidden: 600) has d ≈ 26M, trained on a dataset with n = 50,000.
d >> n, but the testing error is < 0.1.
Generalization	in	Deep	Learning
• However,	when	you	are	solving	your	problem	…
A neural network with the same architecture (input: 780, hidden: 600), trained on your dataset (n = 50,000, 10 classes), gets testing error = 0.6: over-fitting!
The classical remedy is to reduce the VC dimension (e.g. shrink the hidden layer from 600 to 200, and so on) …
Generalization	in	Deep	Learning
[Figure: error vs. d (VC dimension), with the bound ε(h) ≤ ε̂(h) + √( (8/n) · log( 4(2n)^d / δ ) ). Beyond d = n lies the over-fitting region, and beyond that the over-parameterization regime: models with extremely high VC dimension that nevertheless generalize.]
Generalization	in	Deep	Learning
(Zhang et al., Understanding Deep Learning Requires Rethinking Generalization, ICLR 2017)
Generalization	in	Deep	Learning
Deep neural networks (Inception) can fit the original dataset (CIFAR), the same features with random labels, and even random noise features, all with training error ε̂(h) ≈ 0: they shatter the training set.
Generalization	in	Deep	Learning
The same three settings — original dataset (CIFAR), random labels, random noise features — all reach training error ε̂(h) ≈ 0, but their testing errors differ sharply: ε(h) ≈ 0.14 on the original dataset, ε(h) ≈ 0.9 with random labels, and ε(h) ≈ 0.9 with random noise features.
Generalization	in	Deep	Learning
• Testing error depends on the data distribution
• However, the VC bound does not depend on the data distribution
Original dataset: ε(h) ≈ 0.14. Random labels / random noise features: ε(h) ≈ 0.9. The model — and hence the VC bound — is the same in every case.
Generalization in Deep Learning
• High sharpness of the minimum -> high testing error
PAC-Bayesian	Bound	for	Deep	Learning
(Dziugaite & Roy, Computing Nonvacuous Generalization Bounds for Deep (Stochastic) Neural Networks, UAI 2017)
PAC-Bayesian	Bound	for	Deep	Learning
• Deterministic model: a single hypothesis h maps the data to an error ε(h).
• Stochastic model (Gibbs classifier): a hypothesis h is sampled from a distribution Q over hypotheses; the Gibbs error is ε(Q) = E_{h∼Q}[ε(h)].
PAC-Bayesian	Bound	for	Deep	Learning
• Considering the sharpness of local minima: replace a single hypothesis h with a distribution Q over hypotheses.
• Flat minimum: low ε̂(h) and low ε̂(Q) (neighboring hypotheses also fit well).
• Sharp minimum: low ε̂(h) but high ε̂(Q) (neighboring hypotheses fit poorly).

KL( ε̂(Q) ‖ ε(Q) ) ≤ ( KL(Q‖P) + log(n/δ) ) / (n − 1)
PAC-Bayesian	Bound	for	Deep	Learning
• With probability at least 1 − δ, the following inequality (PAC-Bayesian bound) is satisfied:

KL( ε̂(Q) ‖ ε(Q) ) ≤ ( KL(Q‖P) + log(n/δ) ) / (n − 1)

where n is the number of training instances, P is the distribution of models before training (prior), Q is the distribution of models after training (posterior), KL(Q‖P) is the KL divergence between them, and KL(ε̂(Q)‖ε(Q)) is the KL divergence between training error and testing error.
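A hedged sketch of how a bound of this form is used numerically: given KL(Q‖P), n, and δ, compute the right-hand side and invert the binary KL divergence by bisection to get an upper bound on ε(Q). The specific KL values below are hypothetical, not from the slides:

```python
import math

def kl_bernoulli(q, p):
    """KL divergence between Bernoulli(q) and Bernoulli(p), in nats."""
    eps = 1e-12
    q = min(max(q, eps), 1 - eps)
    p = min(max(p, eps), 1 - eps)
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

def pac_bayes_test_bound(train_err, kl_qp, n, delta=0.05):
    """Invert KL(eps_hat || eps) <= (KL(Q||P) + log(n/delta)) / (n-1)
    by bisection, returning an upper bound on the testing error eps(Q)."""
    rhs = (kl_qp + math.log(n / delta)) / (n - 1)
    lo, hi = train_err, 1.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if kl_bernoulli(train_err, mid) <= rhs:
            lo = mid
        else:
            hi = mid
    return hi

# Small KL(Q||P): posterior stays near the prior -> tight bound.
tight = pac_bayes_test_bound(train_err=0.03, kl_qp=1e3, n=50_000)
# Large KL(Q||P): posterior moves far from the prior -> loose bound.
loose = pac_bayes_test_bound(train_err=0.03, kl_qp=5e4, n=50_000)
```

The bisection works because the binary KL is monotone increasing in its second argument above the training error.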
PAC-Bayesian	Bound	for	Deep	Learning
ε(Q) ≤ ε̂(Q) + √( ( KL(Q‖P) + log(n/δ) + 2 ) / (2n − 1) )

• Under-fitting: low KL(Q‖P), high ε̂(Q) -> high bound on ε(Q).
• Appropriate fitting: moderate KL(Q‖P), moderate ε̂(Q) -> moderate bound on ε(Q).
• Over-fitting: high KL(Q‖P), low ε̂(Q) -> high bound on ε(Q).

[Figure: training and testing data fitted in the three regimes, with prior P and posterior Q. The farther Q moves from P, the larger KL(Q‖P), and by KL(ε̂(Q)‖ε(Q)) ≤ (KL(Q‖P) + log(n/δ))/(n − 1), the larger the gap between training and testing error can be: low, moderate, and high KL(ε̂(Q)‖ε(Q)) respectively.]
PAC-Bayesian	Bound	for	Deep	Learning
• The PAC-Bayesian bound is data-dependent:

KL( ε̂(Q) ‖ ε(Q) ) ≤ ( KL(Q‖P) + log(n/δ) ) / (n − 1)

• High VC dimension, but clean data -> low KL(Q‖P)
• High VC dimension, and noisy data -> high KL(Q‖P)
PAC-Bayesian	Bound	for	Deep	Learning
• Data: MNIST (binary classification, class 0: digits 0–4, class 1: digits 5–9)
• Model: 2- or 3-layer NN

Varying the width of the hidden layer (2-layer NN):
  width 600:  training 0.028, testing 0.034, VC bound 26m,  PAC-Bayesian bound 0.161
  width 1200: training 0.027, testing 0.035, VC bound 56m,  PAC-Bayesian bound 0.179

Varying the number of hidden layers:
  2 layers: training 0.028, testing 0.034, VC bound 26m,  PAC-Bayesian bound 0.161
  3 layers: training 0.028, testing 0.033, VC bound 66m,  PAC-Bayesian bound 0.186
  4 layers: training 0.027, testing 0.032, VC bound 121m, PAC-Bayesian bound 0.201

Original MNIST vs. random labels:
  original: training 0.028, testing 0.034, VC bound 26m, PAC-Bayesian bound 0.161
  random:   training 0.112, testing 0.503, VC bound 26m, PAC-Bayesian bound 1.352
PAC-Bayesian	Bound	for	Deep	Learning
• The PAC-Bayesian bound is data-dependent:

KL( ε̂(Q) ‖ ε(Q) ) ≤ ( KL(Q‖P) + log(n/δ) ) / (n − 1)

• Clean data (original MNIST labels) -> small KL(Q‖P) -> small ε(Q)
• Noisy data (random labels) -> large KL(Q‖P) -> large ε(Q)
Basic	Concepts	in	Information	Theory
• Entropy
• Joint	Entropy	
• Conditional	Entropy
• Mutual	Information
• Cross	Entropy
Entropy
• The	uncertainty	of	a	random	variable	X	
H(X) = −Σ_{x∈X} p(x) log p(x)    (log base 2)

• P(X=1) = 0.5, P(X=2) = 0.5:  H(X) = −2 × 0.5 log(0.5) = 1
• P(X=1) = 0.9, P(X=2) = 0.1:  H(X) = −0.9 log(0.9) − 0.1 log(0.1) = 0.469
• P(X=x) = 0.25 for x ∈ {1,2,3,4}:  H(X) = −4 × 0.25 log(0.25) = 2
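The three examples above can be checked with a few lines (a minimal sketch; entropy in bits, log base 2):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum p(x) * log2 p(x)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

h_fair = entropy([0.5, 0.5])   # fair coin: 1 bit
h_skew = entropy([0.9, 0.1])   # skewed coin: ~0.469 bits
h_four = entropy([0.25] * 4)   # uniform over 4 outcomes: 2 bits
```

The skewed distribution has lower entropy than the fair one: less uncertainty, fewer bits.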
Joint	Entropy	
• The	uncertainty	of	a	joint	distribution	involving	two	random	variables	
X,	Y	
H(X,Y) = −Σ_{x∈X, y∈Y} p(x,y) log p(x,y)

P(X,Y):  Y=1   Y=2
  X=1    0.25  0.25
  X=2    0.25  0.25

H(X,Y) = −4 × 0.25 log(0.25) = 2
Conditional	Entropy
• The	uncertainty	of	a	random	variable	Y	given	the	value	of		another	
random	variable	X
H(Y|X) = −Σ_{x∈X} p(x) Σ_{y∈Y} p(y|x) log p(y|x)

• X and Y are independent (P(X,Y): X=1: 0.4, 0.4; X=2: 0.1, 0.1):
  H(X,Y) = 1.722, H(Y|X) = 1
• Y is a stochastic function of X (P(X,Y): X=1: 0.4, 0.1; X=2: 0.1, 0.4):
  H(X,Y) = 1.722, H(Y|X) = 0.722
• Y is a deterministic function of X (P(X,Y): X=1: 0.5, 0; X=2: 0, 0.5):
  H(X,Y) = 1, H(Y|X) = 0
Conditional	Entropy
Information	Diagram
[Information diagram: H(X) and H(Y) as overlapping circles; H(X,Y) is their union, and H(Y|X) is the part of H(Y) outside H(X).]

H(Y|X) = −Σ_{x∈X} p(x) Σ_{y∈Y} p(y|x) log p(y|x)
       = −Σ_{x∈X} p(x) Σ_{y∈Y} p(y|x) ( log p(x,y) − log p(x) )
       = −Σ_{x∈X,y∈Y} p(x,y) log p(x,y) + Σ_{x∈X} p(x) log p(x)
       = H(X,Y) − H(X)
Conditional	Entropy
• X and Y are independent (P(X,Y): X=1: 0.4, 0.4; X=2: 0.1, 0.1):
  H(X,Y) = 1.722, H(X) = 0.722, H(Y) = 1
  H(Y|X) = H(X,Y) − H(X) = 1 = H(Y)
• Y is a stochastic function of X (P(X,Y): X=1: 0.4, 0.1; X=2: 0.1, 0.4):
  H(X,Y) = 1.722, H(X) = 1, H(Y) = 1
  H(Y|X) = H(X,Y) − H(X) = 0.722
• Y is a deterministic function of X (P(X,Y): X=1: 0.5, 0; X=2: 0, 0.5):
  H(X,Y) = 1, H(X) = 1, H(Y) = 1
  H(Y|X) = H(X,Y) − H(X) = 0

[Information diagrams: when X and Y are independent, the circles are disjoint and H(Y|X) = H(Y); when Y is a deterministic function of X, H(Y) lies inside H(X) and H(Y|X) = 0.]
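The chain rule H(Y|X) = H(X,Y) − H(X) makes these three cases easy to verify numerically (a sketch over the same joint tables, in bits):

```python
import math

def entropy(probs):
    """H in bits: -sum p * log2 p, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def conditional_entropy(joint):
    """H(Y|X) = H(X,Y) - H(X) for a joint table {(x, y): p}."""
    h_xy = entropy(joint.values())
    p_x = {}
    for (x, _), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
    return h_xy - entropy(p_x.values())

indep = {(1, 1): 0.4, (1, 2): 0.4, (2, 1): 0.1, (2, 2): 0.1}  # H(Y|X) = 1
stoch = {(1, 1): 0.4, (1, 2): 0.1, (2, 1): 0.1, (2, 2): 0.4}  # H(Y|X) ~ 0.722
deter = {(1, 1): 0.5, (1, 2): 0.0, (2, 1): 0.0, (2, 2): 0.5}  # H(Y|X) = 0
```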
Mutual	Information
• The	mutual	dependence	between	two	variables	X,	Y
I(X;Y) = Σ_{x∈X,y∈Y} p(x,y) log [ p(x,y) / ( p(x) p(y) ) ]

• X and Y are independent (P(X,Y): X=1: 0.4, 0.4; X=2: 0.1, 0.1): I(X;Y) = 0
• Y is a stochastic function of X (P(X,Y): X=1: 0.4, 0.1; X=2: 0.1, 0.4): I(X;Y) = 0.278
• Y is a deterministic function of X (P(X,Y): X=1: 0.5, 0; X=2: 0, 0.5): I(X;Y) = 1
Mutual	Information
[Information diagram: I(X;Y) is the overlap of H(X) and H(Y) inside H(X,Y).]

I(X;Y) = Σ_{x∈X,y∈Y} p(x,y) log [ p(x,y) / ( p(x) p(y) ) ]
       = Σ_{x∈X,y∈Y} p(x,y) ( −log p(x) − log p(y) + log p(x,y) )
       = −Σ_{x∈X} p(x) log p(x) − Σ_{y∈Y} p(y) log p(y) + Σ_{x∈X,y∈Y} p(x,y) log p(x,y)
       = H(X) + H(Y) − H(X,Y)
Mutual	Information
• X and Y are independent (P(X,Y): X=1: 0.4, 0.4; X=2: 0.1, 0.1):
  H(X,Y) = 1.722, H(X) = 0.722, H(Y) = 1
  I(X;Y) = H(X) + H(Y) − H(X,Y) = 0
• Y is a stochastic function of X (P(X,Y): X=1: 0.4, 0.1; X=2: 0.1, 0.4):
  H(X,Y) = 1.722, H(X) = 1, H(Y) = 1
  I(X;Y) = H(X) + H(Y) − H(X,Y) = 0.278
• Y is a deterministic function of X (P(X,Y): X=1: 0.5, 0; X=2: 0, 0.5):
  H(X,Y) = 1, H(X) = 1, H(Y) = 1
  I(X;Y) = H(X) + H(Y) − H(X,Y) = 1

[Information diagrams: when X and Y are independent the circles do not overlap and I(X;Y) = 0; when Y is a deterministic function of X, H(Y) lies fully inside H(X) and I(X;Y) = H(Y) = 1.]
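The identity I(X;Y) = H(X) + H(Y) − H(X,Y) makes these numbers easy to check (a sketch over the same joint tables, in bits):

```python
import math

def entropy(probs):
    """H in bits: -sum p * log2 p, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint table {(x, y): p}."""
    p_x, p_y = {}, {}
    for (x, y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    return entropy(p_x.values()) + entropy(p_y.values()) - entropy(joint.values())

stoch = {(1, 1): 0.4, (1, 2): 0.1, (2, 1): 0.1, (2, 2): 0.4}  # I(X;Y) ~ 0.278
deter = {(1, 1): 0.5, (2, 2): 0.5}                            # I(X;Y) = 1
```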
Entropy,	Joint	Entropy	,	Conditional	Entropy	&	
Mutual	Information
[Information diagram for three variables: H(X), H(Y), H(Z) as three overlapping circles, partitioned into H(X|Y,Z), H(Y|X,Z), H(Z|X,Y), I(X;Y|Z), I(X;Z|Y), I(Y;Z|X), and the triple overlap I(X;Y;Z).]
Cross	Entropy
H_{p,q}(X) = −Σ_{x∈X} p(x) log q(x)
           = −Σ_{x∈X} p(x) log p(x) + Σ_{x∈X} p(x) ( log p(x) − log q(x) )
           = H_p(X) + KL( p(X) ‖ q(X) )
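A quick numerical check of the decomposition H_{p,q}(X) = H_p(X) + KL(p‖q); the distributions p and q below are illustrative, not from the slides:

```python
import math

def entropy(p):
    """H_p(X) in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """H_{p,q}(X) = -sum p(x) * log2 q(x)."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl(p, q):
    """KL(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.9, 0.1]  # true distribution
q = [0.6, 0.4]  # model distribution
lhs = cross_entropy(p, q)
rhs = entropy(p) + kl(p, q)  # decomposition: equal to lhs
```

Since KL(p‖q) ≥ 0, the cross entropy is always at least the entropy of p, with equality iff q = p.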
Information	in	the	Weights
• Cause	of	Over-fitting	
• Information	in	the	Weights	as a Regularizer
• Bounding	the	Information	in	the	Weights	
• Connection	with	Flat	Minimum
• Connection	with	PAC-Bayesian	Bound
• Experiments
Information	in	the	Weights
Cause	of	Over-fitting	
• Training loss (cross-entropy):

H_{p,q}(y|x,w) = E_{x,y}[ −p(y|x,w) log q(y|x,w) ]
               = H_p(y|x,w) + E_{x,w} KL( p(y|x,w) ‖ q(y|x,w) )

• p: probability density function of the data
• q: probability density function predicted by the model
• x: input features of the training data
• y: labels of the training data
• w: weights of the model
• θ: latent parameters of the data distribution
Cause	of	Over-fitting	
[Information diagram over H_p(x), H_p(y), and H_p(w): H_p(y|x,w) is the uncertainty of y given w and x.]
Cause	of	Over-fitting	
• Lower H_p(y|x,w) -> lower uncertainty of y given w and x -> lower training error.
• Example: given an input x and a fixed w:
  p(y=1|x,w) = 0.9, p(y=2|x,w) = 0.1, p(y=3|x,w) = 0.0, p(y=4|x,w) = 0.0  ->  lower H_p(y|x,w)
  p(y=1|x,w) = 0.3, p(y=2|x,w) = 0.3, p(y=3|x,w) = 0.2, p(y=4|x,w) = 0.2  ->  higher H_p(y|x,w)

H_{p,q}(y|x,w) = H_p(y|x,w) + E_{x,w} KL( p(y|x,w) ‖ q(y|x,w) )
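The two predictive distributions in the example give exactly this ordering (a sketch, log base 2):

```python
import math

def entropy(probs):
    """H in bits: -sum p * log2 p, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Confident prediction -> lower H_p(y|x, w); spread-out prediction -> higher.
confident = entropy([0.9, 0.1, 0.0, 0.0])
spread = entropy([0.3, 0.3, 0.2, 0.2])
```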
Cause	of	Over-fitting	
H_{p,q}(y|x,w) = H_p(y|x,w) + E_{x,w} KL( p(y|x,w) ‖ q(y|x,w) )
H_p(y|x,w) = H_p(y|x,θ) + I(y;θ|x,w) − I(y;w|x,θ)

• θ: latent parameters of the (training and testing) data distribution
[Information diagram over H_p(y|x), H_p(y_test|x_test), and H_p(θ).]
Cause	of	Over-fitting	
• θ: latent parameters of the (training and testing) data distribution
[Figure: information diagram over H_p(y|x), H_p(y_test|x_test), and H_p(θ), illustrated with small x→y examples:]
• H_p(y|x,θ): the noise — noisy information and outliers in the training data, and noise and outliers in the testing data.
• I_p(y;θ|x): the useful information in the training data, and the normal samples not in the training data.
Cause	of	Over-fitting	
H_p(y|x,w) = H_p(y|x,θ) + I(y;θ|x,w) − I(y;w|x,θ)

[Information diagram over H_p(x), H_p(y), H_p(θ), and H_p(w):]
• H_p(y|x,w): the uncertainty of y given w and x
• H_p(y|x,θ): noisy information and outliers in the training data
• I(y;θ|x,w): useful information not learned by the weights
• I(y;w|x,θ): noisy information and outliers learned by the weights
Cause	of	Over-fitting	
H_p(y|x,w) = H_p(y|x,θ) + I(y;θ|x,w) − I(y;w|x,θ)

[Information diagram over H_p(x), H_p(y), H_p(θ), H_p(w): the term H_p(y|x,θ) highlighted — the noisy information and outliers in the training data.]
Cause	of	Over-fitting	
• Lower H_p(y|x,θ) -> less noise and fewer outliers in the training data.
[Figure: lower H_p(y|x,θ): a clean x→y mapping; higher H_p(y|x,θ): the same mapping with conflicting labels for similar inputs.]

H_p(y|x,w) = H_p(y|x,θ) + I(y;θ|x,w) − I(y;w|x,θ)
Cause	of	Over-fitting	
H_p(y|x,w) = H_p(y|x,θ) + I(y;θ|x,w) − I(y;w|x,θ)

[Information diagram over H_p(x), H_p(y), H_p(θ), H_p(w): the term I(y;θ|x,w) highlighted — the useful information not learned by the weights.]
Cause	of	Over-fitting	
• Lower I(y;θ|x,w)
  -> more useful information learned by the weights
  -> lower H_p(y_test|x_test,w)
  -> lower testing error

I(y;θ|x,w₂) < I(y;θ|x,w₁)  ⇒  H_p(y_test|x_test,w₂) < H_p(y_test|x_test,w₁)

H_p(y|x,w) = H_p(y|x,θ) + I(y;θ|x,w) − I(y;w|x,θ)
Cause	of	Over-fitting	
H_p(y|x,w) = H_p(y|x,θ) + I(y;θ|x,w) − I(y;w|x,θ)

[Information diagram: the term I(y;w|x,θ) highlighted — the noisy information and outliers learned by the weights.]
• Cause of over-fitting: the weights memorize the noisy information in the training data.
Cause	of	Over-fitting	
• High VC dimension, but clean data -> little noise to memorize.
• High VC dimension, and noisy data -> much noise to memorize.
[Figure: the same high-capacity model fitted to clean vs. noisy training and testing data.]

H_p(y|x,w) = H_p(y|x,θ) + I(y;θ|x,w) − I(y;w|x,θ)
Information in the Weights as a Regularizer
• θ is unknown, so I(y;w|x,θ) cannot be computed.
• I(D;w), the information in the weights, is an upper bound of I(y;w|x,θ):

I(y;w|x,θ) ≤ I(y,x;w|θ) = I(D;w|θ) ≤ I(D;w)

where D denotes the training dataset (x, y), so H_p(x,y) = H_p(D).
[Information diagram over H_p(x), H_p(y), H_p(θ), H_p(w), showing I(y;w|x,θ) inside I(D;w|θ), inside I(D;w).]
Information	in	the	Weights	as a Regularizer
• The actual data distribution p is unknown.
• Estimate I(D;w) by I_q(D;w).
• New loss function, with I_q(D;w) as a regularizer:

L(q(w|D)) = H_{p,q}(y|x,w) + I_q(D;w)
Connection	with	Flat	Minimum
• A flat minimum has low information in the weights:

I_q(w;D) ≤ (1/2) K [ log‖ŵ‖²₂ + log‖H‖_* − K log(K²/2) ]

where ‖H‖_* is the nuclear norm of the Hessian at the local minimum.
• Flat minimum -> low nuclear norm of the Hessian -> low information.
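To make the nuclear-norm term concrete: for a diagonalized Hessian, ‖H‖_* is the sum of the absolute eigenvalues, so flatter curvature directly means a smaller term in the bound. The eigenvalues below are hypothetical examples, not values from the slides:

```python
def nuclear_norm_diag(eigenvalues):
    """For a diagonalized Hessian, ||H||_* is the sum of the
    absolute eigenvalues (singular values of a symmetric matrix)."""
    return sum(abs(e) for e in eigenvalues)

# Hypothetical curvatures (Hessian eigenvalues) at two local minima:
flat = nuclear_norm_diag([0.1, 0.2, 0.1])      # shallow curvature: flat minimum
sharp = nuclear_norm_diag([50.0, 80.0, 60.0])  # steep curvature: sharp minimum
```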
Connection	with	PAC-Bayesian	Bound
• Given a prior distribution p(w) (the distribution of weights before training) and the posterior q(w|D) (the distribution of weights after training on dataset D):

I_q(w;D) = E_D KL( q(w|D) ‖ q(w) )
         = E_D KL( q(w|D) ‖ p(w) ) − KL( q(w) ‖ p(w) )
         ≤ E_D KL( q(w|D) ‖ p(w) )
Connection	with	PAC-Bayesian	Bound
• Loss function with the regularizer (using I_p(D;w) ≈ I_q(D;w)):

L(q(w|D)) = H_{p,q}(y|x,w) + I_q(D;w)
          ≤ H_{p,q}(y|x,w) + E_D KL( q(w|D) ‖ p(w) )

• PAC-Bayesian bound:

E_D[ L_test(q(w|D)) ] ≤ ( H_{p,q}(y|x,w) + L_max E_D[ KL( q(w|D) ‖ p(w) ) ] ) / ( n(1 − 1/2) )

where L_test is the test error of the network with weights sampled from q(w|D), and L_max is the maximum per-sample loss.
Experiments
• Random labels
[Plot: information complexity vs. dataset size, under the loss L(q(w|D)) = H_{p,q}(y|x,w) + I_q(D;w).]
Experiments
• Real labels
[Plot: information complexity vs. dataset size, under the loss L(q(w|D)) = H_{p,q}(y|x,w) + I_q(D;w).]
Experiments
• Information in the weights vs. percentage of corrupted labels
"Federated learning: out of reach no matter how close",Oleksandr LapshynFwdays
 
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Wonjun Hwang
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr BaganFwdays
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupFlorian Wilhelm
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Commit University
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii SoldatenkoFwdays
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024BookNet Canada
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Enterprise Knowledge
 

Dernier (20)

Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage CostLeverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptx
 
Search Engine Optimization SEO PDF for 2024.pdf
Search Engine Optimization SEO PDF for 2024.pdfSearch Engine Optimization SEO PDF for 2024.pdf
Search Engine Optimization SEO PDF for 2024.pdf
 
Vector Databases 101 - An introduction to the world of Vector Databases
Vector Databases 101 - An introduction to the world of Vector DatabasesVector Databases 101 - An introduction to the world of Vector Databases
Vector Databases 101 - An introduction to the world of Vector Databases
 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQL
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clash
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
 
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project Setup
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024
 

Information in the Weights