Arthur CHARPENTIER - Probit transformation for nonparametric kernel estimation of the copula

Probit transformation for nonparametric kernel estimation of the copula
A. Charpentier (Université de Rennes 1 & UQAM),
joint work with G. Geenens (UNSW) & D. Paindaveine (ULB)
Journées de Statistiques, Lille, June 2015
http://arxiv.org/abs/1404.4414
url: http://freakonometrics.hypotheses.org
email: charpentier.arthur@uqam.ca
Twitter: @freakonometrics
Motivation

Consider an i.i.d. sample {(X_i, Y_i)} of size n, with joint cumulative distribution function F_{XY} and joint density f_{XY}. Let F_X and F_Y denote the marginal distributions, and C the copula, so that

F_{XY}(x, y) = C(F_X(x), F_Y(y))

and

f_{XY}(x, y) = f_X(x) f_Y(y) c(F_X(x), F_Y(y)).

We want a nonparametric estimate of c on [0, 1]^2.

[Figure: scatter plot of the raw sample, both axes on a logarithmic scale.]
Notations

Define the uniformized i.i.d. sample {(U_i, V_i)},

U_i = F_X(X_i) and V_i = F_Y(Y_i),

or the uniformized pseudo-sample {(\hat U_i, \hat V_i)},

\hat U_i = \frac{n}{n+1} \hat F_{X,n}(X_i) \quad \text{and} \quad \hat V_i = \frac{n}{n+1} \hat F_{Y,n}(Y_i),

where \hat F_{X,n} and \hat F_{Y,n} denote the empirical c.d.f.s.

[Figure: scatter plot of the pseudo-sample on [0, 1]^2.]
Standard Kernel Estimate

The standard kernel estimator for c, say \hat c^*, at (u, v) \in I would be (see Wand & Jones (1995))

\hat c^*(u, v) = \frac{1}{n |H_{UV}|^{1/2}} \sum_{i=1}^n K\!\left( H_{UV}^{-1/2} \begin{pmatrix} u - U_i \\ v - V_i \end{pmatrix} \right), \qquad (1)

where K : \mathbb{R}^2 \to \mathbb{R} is a kernel function and H_{UV} is a bandwidth matrix.
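As a concrete illustration, estimator (1) with a product Gaussian kernel and the diagonal bandwidth matrix H_{UV} = h^2 I reduces to a product of two univariate kernels. A minimal NumPy sketch (illustrative names only, not the authors' code), evaluated on a simulated independence-copula sample:

```python
import numpy as np

def standard_kernel_copula(u, v, U, V, h=0.1):
    """Evaluate c-hat*(u, v) at a single point from the sample (U_i, V_i)."""
    # product Gaussian kernel, H^{-1/2} = diag(1/h, 1/h)
    ku = np.exp(-0.5 * ((u - U) / h) ** 2) / np.sqrt(2 * np.pi)
    kv = np.exp(-0.5 * ((v - V) / h) ** 2) / np.sqrt(2 * np.pi)
    # 1/(n |H|^{1/2}) sum_i K(...), with |H|^{1/2} = h^2
    return np.mean(ku * kv) / h**2

rng = np.random.default_rng(0)
U, V = rng.uniform(size=500), rng.uniform(size=500)  # independence copula
est = standard_kernel_copula(0.5, 0.5, U, V)  # interior point: close to c = 1
```

At an interior point such as (0.5, 0.5) the estimate is consistent; the boundary bias discussed on the next slide appears only near the edges of the unit square.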
Standard Kernel Estimate

However, this estimator is not consistent along the boundaries of [0, 1]^2:

E(\hat c^*(u, v)) = \frac{1}{4} c(u, v) + O(h) at the corners,

E(\hat c^*(u, v)) = \frac{1}{2} c(u, v) + O(h) on the borders,

if K is symmetric and H_{UV} is symmetric.

Corrections have been proposed, e.g. mirror reflection (Gijbels & Mielniczuk (1990)) or the use of boundary kernels (Chen (2007)), but with mixed results.

Remark: the graph on the bottom shows \hat c^* along the (first) diagonal.

[Figure: \hat c^* along the diagonal of the unit square.]
Mirror Kernel Estimate

Use an enlarged sample: instead of only {(\hat U_i, \hat V_i)}, add {(-\hat U_i, \hat V_i)}, {(\hat U_i, -\hat V_i)}, {(-\hat U_i, -\hat V_i)}, {(\hat U_i, 2 - \hat V_i)}, {(2 - \hat U_i, \hat V_i)}, {(-\hat U_i, 2 - \hat V_i)}, {(2 - \hat U_i, -\hat V_i)} and {(2 - \hat U_i, 2 - \hat V_i)}.

See Gijbels & Mielniczuk (1990).

This estimator will be used as a benchmark in the simulation study.

[Figure: mirror-reflection estimate along the diagonal.]
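The eight reflections above, together with the original points, are the nine images of each pseudo-observation under {u, -u, 2-u} x {v, -v, 2-v}. A small sketch of the augmentation step (illustrative, not the authors' code):

```python
import numpy as np

def mirror_sample(U, V):
    """Return the 9n-point mirrored sample used by the mirror kernel estimate."""
    Us, Vs = [], []
    # the original points plus the 8 reflections across edges and corners
    for ru in (lambda u: u, lambda u: -u, lambda u: 2.0 - u):
        for rv in (lambda v: v, lambda v: -v, lambda v: 2.0 - v):
            Us.append(ru(U))
            Vs.append(rv(V))
    return np.concatenate(Us), np.concatenate(Vs)

U = np.array([0.1, 0.9])
V = np.array([0.2, 0.8])
Um, Vm = mirror_sample(U, V)  # 9 images of each of the 2 points
```

A standard kernel density estimate is then computed on the enlarged sample and restricted to [0, 1]^2, which removes most of the boundary bias of \hat c^*.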
Using Beta Kernels

Use a kernel which is a product of beta kernels,

K_{x_i}(u) \propto u_1^{x_{1,i}/b} (1 - u_1)^{(1 - x_{1,i})/b} \cdot u_2^{x_{2,i}/b} (1 - u_2)^{(1 - x_{2,i})/b}.

See Chen (1999).

[Figure: beta-kernel estimate along the diagonal.]
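Up to normalization, u^{x/b} (1 - u)^{(1-x)/b} is a Beta(x/b + 1, (1 - x)/b + 1) density, so the product kernel can be evaluated with scipy's beta distribution. A sketch (illustrative, assuming Chen's "+1" parametrization; names are not from the paper's code):

```python
import numpy as np
from scipy.stats import beta

def beta_kernel_copula(u1, u2, U, V, b=0.05):
    """Beta-kernel estimate of c at (u1, u2); b is the bandwidth."""
    # beta density with parameters driven by the data points (U_i, V_i),
    # evaluated at the target point (u1, u2); normalization is exact here
    k1 = beta.pdf(u1, U / b + 1.0, (1.0 - U) / b + 1.0)
    k2 = beta.pdf(u2, V / b + 1.0, (1.0 - V) / b + 1.0)
    return np.mean(k1 * k2)

rng = np.random.default_rng(0)
U, V = rng.uniform(size=500), rng.uniform(size=500)  # independence copula
est = beta_kernel_copula(0.5, 0.5, U, V)  # interior point: close to c = 1
```

The beta kernel adapts its shape near 0 and 1, so no mass leaks outside the unit square and no boundary correction is needed.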
Probit Transformation

See Devroye & Györfi (1985) and Marron & Ruppert (1994).

Define the normalized i.i.d. sample {(S_i, T_i)},

S_i = \Phi^{-1}(U_i) and T_i = \Phi^{-1}(V_i),

or the normalized pseudo-sample {(\hat S_i, \hat T_i)},

\hat S_i = \Phi^{-1}(\hat U_i) and \hat T_i = \Phi^{-1}(\hat V_i),

where \Phi^{-1} is the quantile function of N(0, 1) (probit transformation).

[Figure: scatter plot of the probit-transformed sample.]
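The whole chain from raw data to the probit pseudo-sample is just ranks, the n/(n+1) rescaling, and the N(0, 1) quantile function. A minimal sketch (scipy's norm.ppf plays the role of \Phi^{-1}; names are illustrative):

```python
import numpy as np
from scipy.stats import norm, rankdata

def probit_pseudo_sample(X, Y):
    """Ranks -> rescaled pseudo-observations in (0,1) -> probit scale."""
    n = len(X)
    # rank/(n+1) equals (n/(n+1)) * F-hat_X(X_i), so values stay inside (0,1)
    U_hat = rankdata(X) / (n + 1.0)
    V_hat = rankdata(Y) / (n + 1.0)
    return norm.ppf(U_hat), norm.ppf(V_hat)

rng = np.random.default_rng(0)
X, Y = rng.normal(size=200), rng.normal(size=200)
S_hat, T_hat = probit_pseudo_sample(X, Y)
```

The n/(n+1) factor matters: without it the largest observation would map to \Phi^{-1}(1) = +infinity.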
Probit Transformation

F_{ST}(x, y) = C(\Phi(x), \Phi(y)),

so that

f_{ST}(x, y) = \varphi(x) \varphi(y) c(\Phi(x), \Phi(y)).

Thus

c(u, v) = \frac{f_{ST}(\Phi^{-1}(u), \Phi^{-1}(v))}{\varphi(\Phi^{-1}(u)) \varphi(\Phi^{-1}(v))}.

So use

\hat c^{(\tau)}(u, v) = \frac{\hat f_{ST}(\Phi^{-1}(u), \Phi^{-1}(v))}{\varphi(\Phi^{-1}(u)) \varphi(\Phi^{-1}(v))}.

[Figure: density estimate on the probit scale.]
The Naive Estimator

Since (S_i, T_i) is not observable, we cannot use

\hat f^*_{ST}(s, t) = \frac{1}{n |H_{ST}|^{1/2}} \sum_{i=1}^n K\!\left( H_{ST}^{-1/2} \begin{pmatrix} s - S_i \\ t - T_i \end{pmatrix} \right),

where K is a kernel function and H_{ST} is a bandwidth matrix; instead, use the pseudo-sample version

\hat f_{ST}(s, t) = \frac{1}{n |H_{ST}|^{1/2}} \sum_{i=1}^n K\!\left( H_{ST}^{-1/2} \begin{pmatrix} s - \hat S_i \\ t - \hat T_i \end{pmatrix} \right),

and the copula density estimate is

\hat c^{(\tau)}(u, v) = \frac{\hat f_{ST}(\Phi^{-1}(u), \Phi^{-1}(v))}{\varphi(\Phi^{-1}(u)) \varphi(\Phi^{-1}(v))}.

[Figure: naive probit-transformation estimate along the diagonal.]
The Naive Estimator

\hat c^{(\tau)}(u, v) = \frac{1}{n |H_{ST}|^{1/2} \varphi(\Phi^{-1}(u)) \varphi(\Phi^{-1}(v))} \sum_{i=1}^n K\!\left( H_{ST}^{-1/2} \begin{pmatrix} \Phi^{-1}(u) - \Phi^{-1}(\hat U_i) \\ \Phi^{-1}(v) - \Phi^{-1}(\hat V_i) \end{pmatrix} \right),

as suggested in Charpentier, Fermanian & Scaillet (2007) and López-Paz et al. (2013).

Note that Omelka et al. (2009) obtained theoretical properties on the convergence of \hat C^{(\tau)}(u, v) (not c).
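The naive estimator is thus just a Gaussian KDE on the probit scale divided back by \varphi(\Phi^{-1}(u)) \varphi(\Phi^{-1}(v)). A sketch assuming K(z_1, z_2) = \varphi(z_1)\varphi(z_2) and H_{ST} = h^2 I, tested on independent standard normals standing in for the probit pseudo-sample of an independence copula (illustrative names throughout):

```python
import numpy as np
from scipy.stats import norm

def naive_probit_copula(u, v, S_hat, T_hat, h=0.3):
    """c-hat^(tau)(u, v): Gaussian KDE of (S-hat, T-hat), divided back."""
    s, t = norm.ppf(u), norm.ppf(v)
    # product Gaussian KDE on the probit scale, |H|^{1/2} = h^2
    f_st = np.mean(norm.pdf((s - S_hat) / h) * norm.pdf((t - T_hat) / h)) / h**2
    return f_st / (norm.pdf(s) * norm.pdf(t))

rng = np.random.default_rng(0)
S_hat, T_hat = rng.normal(size=2000), rng.normal(size=2000)
est = naive_probit_copula(0.5, 0.5, S_hat, T_hat)  # close to c = 1
```

Because the smoothing happens on an unbounded support, no mass leaks outside the domain; the price is the h^2-order bias terms described on the asymptotics slides.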
Improved probit-transformation copula density estimators

When estimating a density from a pseudo-sample, Loader (1996) and Hjort & Jones (1996) define a local likelihood estimator.

Around (s, t) \in \mathbb{R}^2, use a polynomial approximation of order p for \log f_{ST}:

\log f_{ST}(\check s, \check t) \approx a_{1,0}(s, t) + a_{1,1}(s, t)(\check s - s) + a_{1,2}(s, t)(\check t - t) \doteq P_{a_1}(\check s - s, \check t - t),

\log f_{ST}(\check s, \check t) \approx a_{2,0}(s, t) + a_{2,1}(s, t)(\check s - s) + a_{2,2}(s, t)(\check t - t) + a_{2,3}(s, t)(\check s - s)^2 + a_{2,4}(s, t)(\check t - t)^2 + a_{2,5}(s, t)(\check s - s)(\check t - t) \doteq P_{a_2}(\check s - s, \check t - t).
Improved probit-transformation copula density estimators

Remark: vectors a_1(s, t) \doteq (a_{1,0}(s, t), a_{1,1}(s, t), a_{1,2}(s, t)) and a_2(s, t) \doteq (a_{2,0}(s, t), \ldots, a_{2,5}(s, t)) are then estimated by solving a weighted maximum likelihood problem:

\tilde a_p(s, t) = \arg\max_{a_p} \left\{ \sum_{i=1}^n K\!\left( H_{ST}^{-1/2} \begin{pmatrix} s - \hat S_i \\ t - \hat T_i \end{pmatrix} \right) P_{a_p}(\hat S_i - s, \hat T_i - t) - n \int_{\mathbb{R}^2} K\!\left( H_{ST}^{-1/2} \begin{pmatrix} s - \check s \\ t - \check t \end{pmatrix} \right) \exp\!\big( P_{a_p}(\check s - s, \check t - t) \big) \, d\check s \, d\check t \right\}.

The estimate of f_{ST} at (s, t) is then \tilde f^{(p)}_{ST}(s, t) = \exp(\tilde a_{p,0}(s, t)), for p = 1, 2.

The improved probit-transformation kernel copula density estimators are

\tilde c^{(\tau, p)}(u, v) = \frac{\tilde f^{(p)}_{ST}(\Phi^{-1}(u), \Phi^{-1}(v))}{\varphi(\Phi^{-1}(u)) \varphi(\Phi^{-1}(v))}.
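For the Gaussian kernel with H_{ST} = h^2 I and p = 1, the integral term above has a closed form: substituting u = \check s - s gives n h^2 exp(a_0 + (a_1^2 + a_2^2) h^2 / 2), so the local log-linear fit at one point is a small smooth optimization. The sketch below relies on that Gaussian-kernel simplification and is purely illustrative (in practice one would use a local likelihood package such as locfit):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def local_loglinear_density(s, t, S, T, h=0.4):
    """Local log-linear (p = 1) estimate of f_ST at a single point (s, t)."""
    n = len(S)
    # kernel weights K(H^{-1/2}(s - S_i, t - T_i)) with K = phi * phi
    w = norm.pdf((s - S) / h) * norm.pdf((t - T) / h)

    def neg_objective(a):
        a0, a1, a2 = a
        fit = a0 + a1 * (S - s) + a2 * (T - t)  # P_{a_1}(S_i - s, T_i - t)
        # closed-form integral penalty, valid for the Gaussian kernel only
        penalty = n * h**2 * np.exp(a0 + 0.5 * (a1**2 + a2**2) * h**2)
        return -(np.dot(w, fit) - penalty)

    a_hat = minimize(neg_objective, np.zeros(3), method="BFGS").x
    return np.exp(a_hat[0])  # f-tilde^(1)_ST(s, t) = exp(a-tilde_{1,0})

rng = np.random.default_rng(0)
S, T = rng.normal(size=2000), rng.normal(size=2000)
val = local_loglinear_density(0.0, 0.0, S, T)  # true f_ST(0,0) = phi(0)^2
```

When the fitted slopes are zero the formula collapses to the plain kernel density estimate, which is a convenient sanity check.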
Improved probit-transformation copula density estimators

For the local log-linear (p = 1) approximation,

\tilde c^{(\tau,1)}(u, v) = \frac{\exp\!\big(\tilde a_{1,0}(\Phi^{-1}(u), \Phi^{-1}(v))\big)}{\varphi(\Phi^{-1}(u)) \varphi(\Phi^{-1}(v))}.

[Figure: local log-linear estimate along the diagonal.]
Improved probit-transformation copula density estimators

For the local log-quadratic (p = 2) approximation,

\tilde c^{(\tau,2)}(u, v) = \frac{\exp\!\big(\tilde a_{2,0}(\Phi^{-1}(u), \Phi^{-1}(v))\big)}{\varphi(\Phi^{-1}(u)) \varphi(\Phi^{-1}(v))}.

[Figure: local log-quadratic estimate along the diagonal.]
Asymptotic properties

A1. The sample {(X_i, Y_i)} is an i.i.d. sample of size n from the joint distribution F_{XY}, an absolutely continuous distribution with marginals F_X and F_Y strictly increasing on their support (which guarantees uniqueness of the copula).
Asymptotic properties

A2. The copula C of F_{XY} is such that (\partial C/\partial u)(u, v) and (\partial^2 C/\partial u^2)(u, v) exist and are continuous on {(u, v) : u \in (0, 1), v \in [0, 1]}, and (\partial C/\partial v)(u, v) and (\partial^2 C/\partial v^2)(u, v) exist and are continuous on {(u, v) : u \in [0, 1], v \in (0, 1)}. In addition, there are constants K_1 and K_2 such that

\left| \frac{\partial^2 C}{\partial u^2}(u, v) \right| \le \frac{K_1}{u(1 - u)} \quad \text{for } (u, v) \in (0, 1) \times [0, 1];

\left| \frac{\partial^2 C}{\partial v^2}(u, v) \right| \le \frac{K_2}{v(1 - v)} \quad \text{for } (u, v) \in [0, 1] \times (0, 1);

A3. The density c of C exists, is positive and admits continuous second-order partial derivatives on the interior of the unit square I. In addition, there is a constant K_{00} such that

c(u, v) \le K_{00} \min\!\left\{ \frac{1}{u(1 - u)}, \frac{1}{v(1 - v)} \right\} \quad \forall (u, v) \in (0, 1)^2.

See Segers (2012).
Asymptotic properties

Assume that K(z_1, z_2) = \varphi(z_1)\varphi(z_2) and H_{ST} = h^2 I with h \sim n^{-a} for some a \in (0, 1/4). Under Assumptions A1-A3, the 'naive' probit-transformation kernel copula density estimator at any (u, v) \in (0, 1)^2 is such that

\sqrt{n h^2}\, \big( \hat c^{(\tau)}(u, v) - c(u, v) - h^2 b(u, v) \big) \xrightarrow{L} N\!\big(0, \sigma^2(u, v)\big),

where

b(u, v) = \frac{1}{2} \left( \frac{\partial^2 c}{\partial u^2}(u, v)\, \varphi^2(\Phi^{-1}(u)) + \frac{\partial^2 c}{\partial v^2}(u, v)\, \varphi^2(\Phi^{-1}(v)) \right)
 - 3 \left( \frac{\partial c}{\partial u}(u, v)\, \Phi^{-1}(u)\, \varphi(\Phi^{-1}(u)) + \frac{\partial c}{\partial v}(u, v)\, \Phi^{-1}(v)\, \varphi(\Phi^{-1}(v)) \right)
 + c(u, v) \left( \{\Phi^{-1}(u)\}^2 + \{\Phi^{-1}(v)\}^2 - 2 \right) \qquad (2)

and

\sigma^2(u, v) = \frac{c(u, v)}{4\pi\, \varphi(\Phi^{-1}(u))\, \varphi(\Phi^{-1}(v))}.
The Amended version

The last, unbounded term in b can easily be adjusted for:

\hat c^{(\tau \mathrm{am})}(u, v) = \frac{\hat f_{ST}(\Phi^{-1}(u), \Phi^{-1}(v))}{\varphi(\Phi^{-1}(u)) \varphi(\Phi^{-1}(v))} \times \frac{1}{1 + \frac{1}{2} h^2 \left( \{\Phi^{-1}(u)\}^2 + \{\Phi^{-1}(v)\}^2 - 2 \right)}.

The asymptotic bias becomes proportional to

b^{(\mathrm{am})}(u, v) = \frac{1}{2} \left( \frac{\partial^2 c}{\partial u^2}(u, v)\, \varphi^2(\Phi^{-1}(u)) + \frac{\partial^2 c}{\partial v^2}(u, v)\, \varphi^2(\Phi^{-1}(v)) \right)
 - 3 \left( \frac{\partial c}{\partial u}(u, v)\, \Phi^{-1}(u)\, \varphi(\Phi^{-1}(u)) + \frac{\partial c}{\partial v}(u, v)\, \Phi^{-1}(v)\, \varphi(\Phi^{-1}(v)) \right).
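The amendment is a purely deterministic multiplicative factor, so it costs nothing to apply on top of the naive estimate. A one-function sketch (illustrative names):

```python
import numpy as np
from scipy.stats import norm

def amend_factor(u, v, h):
    """Correction 1 / (1 + (h^2/2)({Phi^{-1}(u)}^2 + {Phi^{-1}(v)}^2 - 2))."""
    s, t = norm.ppf(u), norm.ppf(v)
    return 1.0 / (1.0 + 0.5 * h**2 * (s**2 + t**2 - 2.0))

# at the centre (s = t = 0) the factor is 1/(1 - h^2) > 1,
# while far in the corners it shrinks the naive estimate
centre = amend_factor(0.5, 0.5, h=0.3)
corner = amend_factor(0.9, 0.9, h=0.3)
```

Multiplying \hat c^{(\tau)}(u, v) by this factor removes the term c(u, v)({\Phi^{-1}(u)}^2 + {\Phi^{-1}(v)}^2 - 2) from the h^2-order bias, which is the only term in (2) that is unbounded over (0, 1)^2.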
A local log-linear probit-transformation kernel estimator

\tilde c^{*(\tau,1)}(u, v) = \tilde f^{*(1)}_{ST}(\Phi^{-1}(u), \Phi^{-1}(v)) \big/ \big( \varphi(\Phi^{-1}(u))\, \varphi(\Phi^{-1}(v)) \big).

Then

\sqrt{n h^2}\, \big( \tilde c^{*(\tau,1)}(u, v) - c(u, v) - h^2 b^{(1)}(u, v) \big) \xrightarrow{L} N\!\big(0, \sigma^{(1)\,2}(u, v)\big),

where

b^{(1)}(u, v) = \frac{1}{2} \left( \frac{\partial^2 c}{\partial u^2}(u, v)\, \varphi^2(\Phi^{-1}(u)) + \frac{\partial^2 c}{\partial v^2}(u, v)\, \varphi^2(\Phi^{-1}(v)) \right)
 - \frac{1}{c(u, v)} \left( \left( \frac{\partial c}{\partial u}(u, v) \right)^2 \varphi^2(\Phi^{-1}(u)) + \left( \frac{\partial c}{\partial v}(u, v) \right)^2 \varphi^2(\Phi^{-1}(v)) \right)
 - \left( \frac{\partial c}{\partial u}(u, v)\, \Phi^{-1}(u)\, \varphi(\Phi^{-1}(u)) + \frac{\partial c}{\partial v}(u, v)\, \Phi^{-1}(v)\, \varphi(\Phi^{-1}(v)) \right) - 2 c(u, v) \qquad (3)

and

\sigma^{(1)\,2}(u, v) = \frac{c(u, v)}{4\pi\, \varphi(\Phi^{-1}(u))\, \varphi(\Phi^{-1}(v))}.
Using a higher order polynomial approximation

Locally fitting a polynomial of a higher degree is known to reduce the asymptotic bias of the estimator, here from order O(h^2) to order O(h^4), see Loader (1996) or Hjort & Jones (1996), under sufficient smoothness conditions.

If f_{ST} admits continuous fourth-order partial derivatives and is positive at (s, t), then

\sqrt{n h^2}\, \big( \tilde f^{*(2)}_{ST}(s, t) - f_{ST}(s, t) - h^4 b^{(2)}_{ST}(s, t) \big) \xrightarrow{L} N\!\big(0, \sigma^{(2)\,2}_{ST}(s, t)\big),

where

\sigma^{(2)\,2}_{ST}(s, t) = \frac{5}{2}\, \frac{f_{ST}(s, t)}{4\pi}

and

b^{(2)}_{ST}(s, t) = -\frac{1}{8} f_{ST}(s, t) \left( \frac{\partial^4 g}{\partial s^4} + \frac{\partial^4 g}{\partial t^4} + 4 \left( \frac{\partial^3 g}{\partial s^3} \frac{\partial g}{\partial s} + \frac{\partial^3 g}{\partial t^3} \frac{\partial g}{\partial t} + \frac{\partial^3 g}{\partial s^2 \partial t} \frac{\partial g}{\partial t} + \frac{\partial^3 g}{\partial s \partial t^2} \frac{\partial g}{\partial s} \right) + 2\, \frac{\partial^4 g}{\partial s^2 \partial t^2} \right)(s, t),

with g(s, t) = \log f_{ST}(s, t).
Using a higher order polynomial approximation

A4. The copula density c(u, v) = (\partial^2 C/\partial u \partial v)(u, v) admits continuous fourth-order partial derivatives on the interior of the unit square [0, 1]^2.

Then

\sqrt{n h^2}\, \big( \tilde c^{*(\tau,2)}(u, v) - c(u, v) - h^4 b^{(2)}(u, v) \big) \xrightarrow{L} N\!\big(0, \sigma^{(2)\,2}(u, v)\big),

where

\sigma^{(2)\,2}(u, v) = \frac{5}{2}\, \frac{c(u, v)}{4\pi\, \varphi(\Phi^{-1}(u))\, \varphi(\Phi^{-1}(v))}.
Improving Bandwidth choice

Consider the principal components decomposition of the (n \times 2) matrix [\hat S, \hat T] = M. Let W_1 = (W_{11}, W_{12})^T and W_2 = (W_{21}, W_{22})^T be the eigenvectors of M^T M. Set

\begin{pmatrix} Q \\ R \end{pmatrix} = \begin{pmatrix} W_{11} & W_{12} \\ W_{21} & W_{22} \end{pmatrix} \begin{pmatrix} S \\ T \end{pmatrix} = W \begin{pmatrix} S \\ T \end{pmatrix},

which is only a linear reparametrization of \mathbb{R}^2, so an estimate of f_{ST} can be readily obtained from an estimate of the density of (Q, R).

Since {\hat Q_i} and {\hat R_i} are empirically uncorrelated, consider a diagonal bandwidth matrix H_{QR} = diag(h^2_Q, h^2_R).

[Figure: scatter plot of the rotated sample (\hat Q_i, \hat R_i).]
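The rotation is two lines of linear algebra: the eigenvectors of M^T M diagonalize the sample covariance, so the rotated coordinates are exactly uncorrelated in the sample. A NumPy sketch (illustrative; the data are centred first, a harmless practical detail since the probit pseudo-sample has near-zero means):

```python
import numpy as np

def decorrelate(S, T):
    """Rotate (S, T) onto empirically uncorrelated coordinates (Q, R)."""
    M = np.column_stack([S - S.mean(), T - T.mean()])
    _, W = np.linalg.eigh(M.T @ M)  # columns of W are the eigenvectors
    QR = M @ W                      # (M W)^T (M W) is diagonal by construction
    return QR[:, 0], QR[:, 1], W

rng = np.random.default_rng(1)
S = rng.normal(size=1000)
T = 0.8 * S + 0.6 * rng.normal(size=1000)   # strongly correlated pair
Q, R, W = decorrelate(S, T)                  # corr(Q, R) ~ 0 up to rounding
```

With (Q, R) uncorrelated, a diagonal bandwidth matrix is reasonable, and each of h_Q and h_R can be chosen by a univariate procedure, as on the next slide.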
Improving Bandwidth choice

Use univariate procedures to select h_Q and h_R independently.

Denote by \tilde f^{(p)}_Q and \tilde f^{(p)}_R (p = 1, 2) the local log-polynomial estimators of the densities of Q and R.

h_Q can be selected via cross-validation (see Section 5.3.3 in Loader (1999)):

h_Q = \arg\min_{h > 0} \left\{ \int_{-\infty}^{\infty} \big( \tilde f^{(p)}_Q(q) \big)^2 \, dq - \frac{2}{n} \sum_{i=1}^n \tilde f^{(p)}_{Q(-i)}(\hat Q_i) \right\},

where \tilde f^{(p)}_{Q(-i)} is the 'leave-one-out' version of \tilde f^{(p)}_Q.
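The same least-squares cross-validation criterion can be sketched with a plain one-dimensional Gaussian KDE standing in for the local log-polynomial fit \tilde f^{(p)}_Q (an illustrative simplification, not the estimator actually used); the squared-density integral is approximated on a grid:

```python
import numpy as np
from scipy.stats import norm

def cv_score(h, Q):
    """Least-squares CV criterion for a 1-D Gaussian KDE with bandwidth h."""
    n = len(Q)
    grid = np.linspace(Q.min() - 4 * h, Q.max() + 4 * h, 400)
    f = norm.pdf((grid[:, None] - Q[None, :]) / h).sum(axis=1) / (n * h)
    int_f2 = np.sum(f**2) * (grid[1] - grid[0])  # integral of f-hat^2
    # leave-one-out density at each Q_i: drop the self contribution phi(0)
    ki = norm.pdf((Q[:, None] - Q[None, :]) / h).sum(axis=1) - norm.pdf(0.0)
    loo = ki / ((n - 1) * h)
    return int_f2 - 2.0 * np.mean(loo)

rng = np.random.default_rng(0)
Q = rng.normal(size=300)
hs = np.linspace(0.1, 1.0, 19)
h_Q = hs[np.argmin([cv_score(h, Q) for h in hs])]  # selected bandwidth
```

The criterion is an unbiased-up-to-a-constant estimate of the integrated squared error, so minimizing it over a grid of h values mimics the selection rule above.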
Graphical Comparison (loss-ALAE dataset)

[Figure: contour plots of the estimated copula densities on the loss-ALAE dataset, for the estimators \tilde c^{(\tau,2)}, \hat c^\beta, \hat c^b and \hat c^p, with Loss (X) and ALAE (Y) on the axes and contour levels ranging from 0.25 to 4.]
Simulation Study

M = 1,000 independent random samples {(U_i, V_i)}^n_{i=1} of sizes n = 200, n = 500 and n = 1000 were generated from each of the following copulas:

· the independence copula (i.e., the U_i's and V_i's drawn independently);
· the Gaussian copula, with parameters \rho = 0.31, \rho = 0.59 and \rho = 0.81;
· the Student t-copula with 4 degrees of freedom, with parameters \rho = 0.31, \rho = 0.59 and \rho = 0.81;
· the Frank copula, with parameters \theta = 1.86, \theta = 4.16 and \theta = 7.93;
· the Gumbel copula, with parameters \theta = 1.25, \theta = 1.67 and \theta = 2.5;
· the Clayton copula, with parameters \theta = 0.5, \theta = 1.67 and \theta = 2.5.

The table below reports the (approximated) MISE relative to the MISE of the mirror-reflection estimator, for n = 1000. Bold values show the minimum MISE for the corresponding copula (non-significantly different values are highlighted as well).
n = 1000   ĉ(τ)   ĉ(τam)  c̃(τ,1)  c̃(τ,2)  ĉ(β)1  ĉ(β)2  ĉ(B)1  ĉ(B)2  ĉ(p)1  ĉ(p)2  ĉ(p)3
Indep      3.57   2.80    2.89    1.40    7.96   11.65  1.69   3.43   1.62   0.50   0.14
Gauss2     2.03   1.52    1.60    0.76    4.63   6.06   1.10   1.82   0.98   0.66   0.89
Gauss4     0.63   0.49    0.44    0.21    1.72   1.60   0.75   0.58   0.62   0.99   2.93
Gauss6     0.21   0.20    0.11    0.05    0.74   0.33   0.77   0.37   0.72   1.21   2.83
Std(4)2    0.61   0.56    0.50    0.40    1.57   1.80   0.78   0.67   0.75   1.01   1.88
Std(4)4    0.21   0.27    0.17    0.15    0.88   0.51   0.75   0.42   0.75   1.12   2.07
Std(4)6    0.09   0.17    0.08    0.09    0.70   0.19   0.82   0.47   0.90   1.17   1.90
Frank2     3.31   2.42    2.57    1.35    7.16   9.63   1.70   2.95   1.31   0.45   0.49
Frank4     2.35   1.45    1.51    0.99    4.42   4.89   1.49   1.65   0.60   0.72   6.14
Frank6     0.96   0.52    0.45    0.44    1.51   1.19   1.35   0.76   0.65   1.58   7.25
Gumbel2    0.65   0.62    0.56    0.43    1.77   1.97   0.82   0.75   0.83   1.03   1.52
Gumbel4    0.18   0.28    0.16    0.19    0.89   0.41   0.78   0.47   0.81   1.10   1.78
Gumbel6    0.09   0.21    0.10    0.15    0.78   0.29   0.85   0.58   0.94   1.12   1.63
Clayton2   0.63   0.60    0.51    0.34    1.78   1.99   0.78   0.70   0.79   1.04   1.79
Clayton4   0.11   0.26    0.10    0.15    0.79   0.27   0.83   0.56   0.90   1.10   1.50
Clayton6   0.11   0.28    0.08    0.15    0.82   0.35   0.88   0.67   0.96   1.09   1.36

 
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17  How to Extend Models Using Mixin ClassesMixin Classes in Odoo 17  How to Extend Models Using Mixin Classes
Lundi 16h15-copules-charpentier

Corrections have been proposed, e.g. mirror reflection (Gijbels & Mielniczuk (1990)) or the use of boundary kernels (Chen (2007)), but with mixed results.

Remark: the graph at the bottom of the slide shows ˆc∗ along the (first) diagonal.
Mirror Kernel Estimate

Use an enlarged sample: instead of {(ˆUi, ˆVi)} only, add the eight reflected copies {(−ˆUi, ˆVi)}, {(ˆUi, −ˆVi)}, {(−ˆUi, −ˆVi)}, {(ˆUi, 2 − ˆVi)}, {(2 − ˆUi, ˆVi)}, {(−ˆUi, 2 − ˆVi)}, {(2 − ˆUi, −ˆVi)} and {(2 − ˆUi, 2 − ˆVi)}. See Gijbels & Mielniczuk (1990).

That estimator will be used as a benchmark in the simulation study.
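The enlarged-sample idea can be sketched in a few lines of Python (an illustrative sketch with a product Gaussian kernel and scalar bandwidth h, not the authors' implementation; the function name is ours):

```python
import numpy as np

def mirror_kernel_copula_density(U, V, u, v, h=0.05):
    """Mirror-reflection kernel estimate of the copula density c(u, v).

    U, V : pseudo-observations in (0, 1); (u, v) : evaluation point.
    The sample is reflected across the four edges of the unit square
    (9 copies in total) before standard Gaussian-kernel smoothing,
    which removes the boundary bias of the plain estimator.
    """
    Us, Vs = [], []
    # the 9 copies {(su*U + cu, sv*V + cv)}: identity, -x, and 2-x per axis
    for su, cu in [(1, 0), (-1, 0), (-1, 2)]:
        for sv, cv in [(1, 0), (-1, 0), (-1, 2)]:
            Us.append(su * U + cu)
            Vs.append(sv * V + cv)
    Us, Vs = np.concatenate(Us), np.concatenate(Vs)
    # product Gaussian kernel over the enlarged sample, normalized by n
    w = np.exp(-0.5 * ((u - Us) / h) ** 2 - 0.5 * ((v - Vs) / h) ** 2)
    return w.sum() / (len(U) * 2 * np.pi * h ** 2)
```

On an independence-copula sample the estimate stays close to 1 both in the middle of the square and at the corners, which is exactly the failure point of the uncorrected estimator.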
Using Beta Kernels

Use a kernel which is a product of beta kernels,

Kxi(u) ∝ u1^(x1,i/b) [1 − u1]^((1−x1,i)/b) · u2^(x2,i/b) [1 − u2]^((1−x2,i)/b)

See Chen (1999).
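A minimal sketch of the beta-kernel estimator, assuming the standard Chen (1999) parametrisation in which the beta parameters are driven by the evaluation point (b is the smoothing parameter; the function names are ours):

```python
import numpy as np
from math import lgamma

def _beta_logpdf(x, a, b):
    # log density of Beta(a, b) at the points x (scalar a, b; array x)
    log_norm = lgamma(a) + lgamma(b) - lgamma(a + b)
    return (a - 1) * np.log(x) + (b - 1) * np.log(1 - x) - log_norm

def beta_kernel_copula_density(U, V, u, v, b=0.02):
    """Beta-kernel estimate of c(u, v), after Chen (1999).

    Each margin is smoothed with a Beta(u/b + 1, (1 - u)/b + 1)-type
    kernel whose support [0, 1] matches the unit square, so no
    boundary correction is needed.
    """
    ku = np.exp(_beta_logpdf(U, u / b + 1, (1 - u) / b + 1))
    kv = np.exp(_beta_logpdf(V, v / b + 1, (1 - v) / b + 1))
    return np.mean(ku * kv)
```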
Probit Transformation

See Devroye & Györfi (1985) and Marron & Ruppert (1994). Define the normalized n-i.i.d. sample {(Si, Ti)},

Si = Φ−1(Ui) and Ti = Φ−1(Vi)

or the normalized n-i.i.d. pseudo-sample {(ˆSi, ˆTi)},

ˆSi = Φ−1(ˆUi) and ˆTi = Φ−1(ˆVi)

where Φ−1 is the quantile function of N(0, 1) (the probit transformation).
Probit Transformation

FST(x, y) = C(Φ(x), Φ(y)) so that fST(x, y) = φ(x)φ(y) c(Φ(x), Φ(y)).

Thus

c(u, v) = fST(Φ−1(u), Φ−1(v)) / [φ(Φ−1(u)) φ(Φ−1(v))].

So use

ˆc(τ)(u, v) = ˆfST(Φ−1(u), Φ−1(v)) / [φ(Φ−1(u)) φ(Φ−1(v))]
The naive estimator

Since the (Si, Ti) are not observed, we cannot use

ˆf∗ST(s, t) = (1 / (n |HST|^(1/2))) Σi=1..n K( H−1/2ST (s − Si, t − Ti) ),

where K is a kernel function and HST is a bandwidth matrix; use instead

ˆfST(s, t) = (1 / (n |HST|^(1/2))) Σi=1..n K( H−1/2ST (s − ˆSi, t − ˆTi) ),

and the copula density estimate is

ˆc(τ)(u, v) = ˆfST(Φ−1(u), Φ−1(v)) / [φ(Φ−1(u)) φ(Φ−1(v))]
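Putting the pieces together (pseudo-observations, probit transform, Gaussian kernel density estimate, back-transform), a sketch of the naive estimator ˆc(τ), assuming HST = h²I (the function name is ours):

```python
import numpy as np
from statistics import NormalDist

_nd = NormalDist()
_probit = np.vectorize(_nd.inv_cdf)  # Phi^{-1}

def naive_probit_copula_density(X, Y, u, v, h=0.3):
    """'Naive' probit-transformation kernel copula density estimate.

    Steps: (1) pseudo-observations Uhat_i = n/(n+1) Fhat_X(X_i),
    (2) probit transform to (Shat_i, That_i), (3) product-Gaussian
    kernel density estimate of f_ST, (4) divide by
    phi(Phi^{-1}(u)) phi(Phi^{-1}(v)) to recover c(u, v).
    """
    n = len(X)
    # ranks/(n+1) equal n/(n+1) * empirical cdf at the data points
    Uhat = (np.argsort(np.argsort(X)) + 1.0) / (n + 1)
    Vhat = (np.argsort(np.argsort(Y)) + 1.0) / (n + 1)
    S, T = _probit(Uhat), _probit(Vhat)
    s, t = _nd.inv_cdf(u), _nd.inv_cdf(v)
    # bivariate KDE of f_ST at (s, t) with bandwidth matrix h^2 I
    f_hat = np.mean(np.exp(-0.5 * (((s - S) / h) ** 2 + ((t - T) / h) ** 2)))
    f_hat /= 2 * np.pi * h ** 2
    return f_hat / (_nd.pdf(s) * _nd.pdf(t))
```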
The naive estimator

ˆc(τ)(u, v) = (1 / (n |HST|^(1/2) φ(Φ−1(u)) φ(Φ−1(v)))) Σi=1..n K( H−1/2ST (Φ−1(u) − Φ−1(ˆUi), Φ−1(v) − Φ−1(ˆVi)) ),

as suggested in C., Fermanian & Scaillet (2007) and Lopez-Paz et al. (2013).

Note that Omelka et al. (2009) obtained theoretical properties on the convergence of ˆC(τ)(u, v) (the copula, not its density c).
Improved probit-transformation copula density estimators

When estimating a density from a pseudo-sample, Loader (1996) and Hjort & Jones (1996) define a local likelihood estimator. Around (s, t) ∈ R², use a polynomial approximation of order p for log fST:

log fST(ˇs, ˇt) ≈ a1,0(s, t) + a1,1(s, t)(ˇs − s) + a1,2(s, t)(ˇt − t) =: Pa1(ˇs − s, ˇt − t)

log fST(ˇs, ˇt) ≈ a2,0(s, t) + a2,1(s, t)(ˇs − s) + a2,2(s, t)(ˇt − t) + a2,3(s, t)(ˇs − s)² + a2,4(s, t)(ˇt − t)² + a2,5(s, t)(ˇs − s)(ˇt − t) =: Pa2(ˇs − s, ˇt − t).
Improved probit-transformation copula density estimators

Remark: the vectors a1(s, t) = (a1,0(s, t), a1,1(s, t), a1,2(s, t)) and a2(s, t) = (a2,0(s, t), . . . , a2,5(s, t)) are then estimated by solving a weighted maximum likelihood problem,

˜ap(s, t) = arg max over ap of { Σi=1..n K( H−1/2ST (s − ˆSi, t − ˆTi) ) Pap(ˆSi − s, ˆTi − t) − n ∫∫R² K( H−1/2ST (s − ˇs, t − ˇt) ) exp( Pap(ˇs − s, ˇt − t) ) dˇs dˇt }.

The estimate of fST at (s, t) is then ˜f(p)ST(s, t) = exp(˜ap,0(s, t)), for p = 1, 2. The improved probit-transformation kernel copula density estimators are

˜c(τ,p)(u, v) = ˜f(p)ST(Φ−1(u), Φ−1(v)) / [φ(Φ−1(u)) φ(Φ−1(v))]
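For p = 1 with the product Gaussian kernel and HST = h²I, the integral term reduces to h² exp(a0 + h²(a1² + a2²)/2) (a Gaussian moment-generating-function identity), so the score equations solve in closed form. A sketch under that simplification (the closed form is our derivation, not stated on the slide; the function name is ours):

```python
import numpy as np

def local_loglinear_density(S, T, s, t, h=0.4):
    """Local log-linear (p = 1) estimate of f_ST at (s, t), Gaussian kernel.

    Score equations: sum(w) = n h^2 exp(a0 + h^2(a1^2 + a2^2)/2),
    sum(w (S - s)) = that integral times h^2 a1 (same for a2), so
    a1, a2 and then a0 are available explicitly.
    """
    # Gaussian kernel weights K(H^{-1/2}(s - S_i, t - T_i))
    w = np.exp(-0.5 * (((s - S) / h) ** 2 + ((t - T) / h) ** 2)) / (2 * np.pi)
    sw = w.sum()
    a1 = np.sum(w * (S - s)) / (h ** 2 * sw)
    a2 = np.sum(w * (T - t)) / (h ** 2 * sw)
    a0 = np.log(sw / (len(S) * h ** 2)) - 0.5 * h ** 2 * (a1 ** 2 + a2 ** 2)
    return np.exp(a0)  # f_tilde^{(1)}_{ST}(s, t)
```

When a1 = a2 = 0 (e.g. at a symmetry point of the density) this collapses to the ordinary kernel density estimate, as it should.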
Improved probit-transformation copula density estimators

For the local log-linear (p = 1) approximation,

˜c(τ,1)(u, v) = exp(˜a1,0(Φ−1(u), Φ−1(v))) / [φ(Φ−1(u)) φ(Φ−1(v))]
Improved probit-transformation copula density estimators

For the local log-quadratic (p = 2) approximation,

˜c(τ,2)(u, v) = exp(˜a2,0(Φ−1(u), Φ−1(v))) / [φ(Φ−1(u)) φ(Φ−1(v))]
Asymptotic properties

A1. The sample {(Xi, Yi)} is an n-i.i.d. sample from the joint distribution FXY, an absolutely continuous distribution with marginals FX and FY strictly increasing on their support (which guarantees uniqueness of the copula).
Asymptotic properties

A2. The copula C of FXY is such that (∂C/∂u)(u, v) and (∂²C/∂u²)(u, v) exist and are continuous on {(u, v) : u ∈ (0, 1), v ∈ [0, 1]}, and (∂C/∂v)(u, v) and (∂²C/∂v²)(u, v) exist and are continuous on {(u, v) : u ∈ [0, 1], v ∈ (0, 1)}. In addition, there are constants K1 and K2 such that

|∂²C/∂u²(u, v)| ≤ K1 / (u(1 − u)) for (u, v) ∈ (0, 1) × [0, 1];
|∂²C/∂v²(u, v)| ≤ K2 / (v(1 − v)) for (u, v) ∈ [0, 1] × (0, 1).

A3. The density c of C exists, is positive and admits continuous second-order partial derivatives on the interior of the unit square I. In addition, there is a constant K00 such that

c(u, v) ≤ K00 min( 1/(u(1 − u)), 1/(v(1 − v)) ) for all (u, v) ∈ (0, 1)².

See Segers (2012).
Asymptotic properties

Assume that K(z1, z2) = φ(z1)φ(z2) and HST = h²I with h ∼ n^(−a) for some a ∈ (0, 1/4). Under Assumptions A1-A3, the ‘naive’ probit-transformation kernel copula density estimator at any (u, v) ∈ (0, 1)² is such that

√(nh²) ( ˆc(τ)(u, v) − c(u, v) − h² b(u, v) ) →L N(0, σ²(u, v)),

where

b(u, v) = ½ [ ∂²c/∂u²(u, v) φ²(Φ−1(u)) + ∂²c/∂v²(u, v) φ²(Φ−1(v))
    − 3 ( ∂c/∂u(u, v) Φ−1(u)φ(Φ−1(u)) + ∂c/∂v(u, v) Φ−1(v)φ(Φ−1(v)) )
    + c(u, v) ( {Φ−1(u)}² + {Φ−1(v)}² − 2 ) ]    (2)

and

σ²(u, v) = c(u, v) / (4π φ(Φ−1(u)) φ(Φ−1(v))).
The amended version

The last, unbounded term in b can easily be adjusted for:

ˆc(τam)(u, v) = ˆfST(Φ−1(u), Φ−1(v)) / [φ(Φ−1(u)) φ(Φ−1(v))] × 1 / (1 + ½ h² ({Φ−1(u)}² + {Φ−1(v)}² − 2)).

The asymptotic bias becomes proportional to

b(am)(u, v) = ½ [ ∂²c/∂u²(u, v) φ²(Φ−1(u)) + ∂²c/∂v²(u, v) φ²(Φ−1(v))
    − 3 ( ∂c/∂u(u, v) Φ−1(u)φ(Φ−1(u)) + ∂c/∂v(u, v) Φ−1(v)φ(Φ−1(v)) ) ].
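The amendment is just a deterministic multiplicative factor applied to the naive estimate; a small sketch (the function name is ours):

```python
from statistics import NormalDist

_nd = NormalDist()

def amended_factor(u, v, h):
    """Correction 1 / (1 + (h^2/2)(probit(u)^2 + probit(v)^2 - 2))
    turning the naive c_hat^(tau)(u, v) into c_hat^(tau am)(u, v)."""
    s, t = _nd.inv_cdf(u), _nd.inv_cdf(v)
    return 1.0 / (1.0 + 0.5 * h * h * (s * s + t * t - 2.0))
```

At the centre of the unit square the factor slightly inflates the naive estimate (the term in parentheses is −2 there), while towards the corners it deflates it, removing the unbounded c(u, v)-proportional part of the bias in (2).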
A local log-linear probit-transformation kernel estimator

˜c∗(τ,1)(u, v) = ˜f∗(1)ST(Φ−1(u), Φ−1(v)) / [φ(Φ−1(u)) φ(Φ−1(v))].

Then

√(nh²) ( ˜c∗(τ,1)(u, v) − c(u, v) − h² b(1)(u, v) ) →L N(0, σ(1)²(u, v)),

where

b(1)(u, v) = ½ [ ∂²c/∂u²(u, v) φ²(Φ−1(u)) + ∂²c/∂v²(u, v) φ²(Φ−1(v))
    − (1/c(u, v)) ( (∂c/∂u(u, v))² φ²(Φ−1(u)) + (∂c/∂v(u, v))² φ²(Φ−1(v)) )
    − ( ∂c/∂u(u, v) Φ−1(u)φ(Φ−1(u)) + ∂c/∂v(u, v) Φ−1(v)φ(Φ−1(v)) ) − 2c(u, v) ]    (3)

and σ(1)²(u, v) = c(u, v) / (4π φ(Φ−1(u)) φ(Φ−1(v))).
Using a higher-order polynomial approximation

Locally fitting a polynomial of a higher degree is known to reduce the asymptotic bias of the estimator, here from order O(h²) to order O(h⁴), see Loader (1996) or Hjort (1996), under sufficient smoothness conditions. If fST admits continuous fourth-order partial derivatives and is positive at (s, t), then

√(nh²) ( ˜f∗(2)ST(s, t) − fST(s, t) − h⁴ b(2)ST(s, t) ) →L N(0, σ(2)ST²(s, t)),

where σ(2)ST²(s, t) = (5/2) fST(s, t) / (4π) and

b(2)ST(s, t) = −(1/8) fST(s, t) × [ ∂⁴g/∂s⁴ + ∂⁴g/∂t⁴ + 4 ( ∂³g/∂s³ ∂g/∂s + ∂³g/∂t³ ∂g/∂t + ∂³g/∂s²∂t ∂g/∂t + ∂³g/∂s∂t² ∂g/∂s ) + 2 ∂⁴g/∂s²∂t² ](s, t),

with g(s, t) = log fST(s, t).
Using a higher-order polynomial approximation

A4. The copula density c(u, v) = (∂²C/∂u∂v)(u, v) admits continuous fourth-order partial derivatives on the interior of the unit square [0, 1]². Then

√(nh²) ( ˜c∗(τ,2)(u, v) − c(u, v) − h⁴ b(2)(u, v) ) →L N(0, σ(2)²(u, v)),

where σ(2)²(u, v) = (5/2) c(u, v) / (4π φ(Φ−1(u)) φ(Φ−1(v))).
Improving the bandwidth choice

Consider the principal components decomposition of the (n × 2) matrix M = [ˆS, ˆT]. Let W1 = (W11, W12)ᵀ and W2 = (W21, W22)ᵀ be the eigenvectors of MᵀM, and set

(Q, R)ᵀ = [ W11 W12 ; W21 W22 ] (S, T)ᵀ = W (S, T)ᵀ,

which is only a linear reparametrization of R², so an estimate of fST can be readily obtained from an estimate of the density of (Q, R). Since the {ˆQi} and {ˆRi} are empirically uncorrelated, consider a diagonal bandwidth matrix HQR = diag(h²Q, h²R).
Improving the bandwidth choice

Use univariate procedures to select hQ and hR independently. Denote ˜f(p)Q and ˜f(p)R (p = 1, 2) the local log-polynomial estimators of the densities of Q and R. Then hQ can be selected via cross-validation (see Section 5.3.3 in Loader (1999)):

hQ = arg min over h > 0 of { ∫ (˜f(p)Q(q))² dq − (2/n) Σi=1..n ˜f(p)Q(−i)(ˆQi) },

where ˜f(p)Q(−i) is the ‘leave-one-out’ version of ˜f(p)Q.
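A sketch of both steps: rotation onto the principal axes, then a univariate least-squares cross-validation bandwidth per axis. The CV criterion below is the plain Gaussian-KDE version (a simplification of the local-likelihood one on the slide), and the function names are ours:

```python
import numpy as np

def pca_rotate(S, T):
    """Rotate the probit pseudo-sample onto its principal axes, so the
    coordinates (Q, R) are empirically uncorrelated and a diagonal
    bandwidth matrix diag(h_Q^2, h_R^2) is reasonable."""
    M = np.column_stack([S, T])
    M = M - M.mean(axis=0)
    _, _, Wt = np.linalg.svd(M, full_matrices=False)  # rows = eigenvectors
    QR = M @ Wt.T
    return QR[:, 0], QR[:, 1], Wt

def lscv_bandwidth(x, grid=np.linspace(0.05, 1.0, 40)):
    """Least-squares cross-validation bandwidth for a 1-D Gaussian KDE:
    minimize int f_hat^2 - (2/n) sum_i f_hat_{(-i)}(x_i) over the grid."""
    n = len(x)
    d = x[:, None] - x[None, :]
    best_h, best_score = None, np.inf
    for h in grid:
        # closed form: int K_h(.-xi) K_h(.-xj) = N(xi - xj; 0, 2 h^2)
        int_f2 = np.exp(-d ** 2 / (4 * h ** 2)).sum() / (n ** 2 * 2 * h * np.sqrt(np.pi))
        K = np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2 * np.pi))
        loo = (K.sum(axis=1) - K.diagonal()) / (n - 1)  # leave-one-out f_hat
        score = int_f2 - 2 * np.mean(loo)
        if score < best_score:
            best_h, best_score = h, score
    return best_h
```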
Graphical Comparison (Loss-ALAE dataset)

[Contour plots of the copula density estimates ˜c(τ,2), ˆcβ, ˆcb and ˆcp for Loss (X) vs ALAE (Y), with contour levels from 0.25 to 4.]
Simulation Study

M = 1,000 independent random samples {(Ui, Vi)}, i = 1, . . . , n, of sizes n = 200, n = 500 and n = 1000 were generated from each of the following copulas:

· the independence copula (i.e., Ui's and Vi's drawn independently);
· the Gaussian copula, with parameters ρ = 0.31, ρ = 0.59 and ρ = 0.81;
· the Student t-copula with 4 degrees of freedom, with parameters ρ = 0.31, ρ = 0.59 and ρ = 0.81;
· the Frank copula, with parameter θ = 1.86, θ = 4.16 and θ = 7.93;
· the Gumbel copula, with parameter θ = 1.25, θ = 1.67 and θ = 2.5;
· the Clayton copula, with parameter θ = 0.5, θ = 1.67 and θ = 2.5.

The table on the next slide reports the (approximated) MISE relative to the MISE of the mirror-reflection estimator, for n = 1000. Bold values show the minimum MISE for the corresponding copula (non-significantly different values are highlighted as well).
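For instance, the Gaussian-copula samples in this design can be drawn by probability-integral-transforming correlated normals (a sketch; the other families would need their own samplers, and the function name is ours):

```python
import numpy as np
from statistics import NormalDist

def gaussian_copula_sample(n, rho, rng):
    """Draw {(U_i, V_i)} from the Gaussian copula with correlation rho,
    as used in the simulation design (rho = 0.31, 0.59, 0.81)."""
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    cdf = np.vectorize(NormalDist().cdf)  # map each margin back to Uniform(0, 1)
    return cdf(z[:, 0]), cdf(z[:, 1])
```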
n = 1000   ˆc(τ)  ˆc(τam)  ˜c(τ,1)  ˜c(τ,2)  ˆc(β)1  ˆc(β)2  ˆc(B)1  ˆc(B)2  ˆc(p)1  ˆc(p)2  ˆc(p)3
Indep      3.57   2.80     2.89     1.40     7.96    11.65   1.69    3.43    1.62    0.50    0.14
Gauss2     2.03   1.52     1.60     0.76     4.63    6.06    1.10    1.82    0.98    0.66    0.89
Gauss4     0.63   0.49     0.44     0.21     1.72    1.60    0.75    0.58    0.62    0.99    2.93
Gauss6     0.21   0.20     0.11     0.05     0.74    0.33    0.77    0.37    0.72    1.21    2.83
Std(4)2    0.61   0.56     0.50     0.40     1.57    1.80    0.78    0.67    0.75    1.01    1.88
Std(4)4    0.21   0.27     0.17     0.15     0.88    0.51    0.75    0.42    0.75    1.12    2.07
Std(4)6    0.09   0.17     0.08     0.09     0.70    0.19    0.82    0.47    0.90    1.17    1.90
Frank2     3.31   2.42     2.57     1.35     7.16    9.63    1.70    2.95    1.31    0.45    0.49
Frank4     2.35   1.45     1.51     0.99     4.42    4.89    1.49    1.65    0.60    0.72    6.14
Frank6     0.96   0.52     0.45     0.44     1.51    1.19    1.35    0.76    0.65    1.58    7.25
Gumbel2    0.65   0.62     0.56     0.43     1.77    1.97    0.82    0.75    0.83    1.03    1.52
Gumbel4    0.18   0.28     0.16     0.19     0.89    0.41    0.78    0.47    0.81    1.10    1.78
Gumbel6    0.09   0.21     0.10     0.15     0.78    0.29    0.85    0.58    0.94    1.12    1.63
Clayton2   0.63   0.60     0.51     0.34     1.78    1.99    0.78    0.70    0.79    1.04    1.79
Clayton4   0.11   0.26     0.10     0.15     0.79    0.27    0.83    0.56    0.90    1.10    1.50
Clayton6   0.11   0.28     0.08     0.15     0.82    0.35    0.88    0.67    0.96    1.09    1.36