The word "optimal" is used in different ways in mesh generation. It could mean that the output is, in some sense, "the best mesh," or that the algorithm is, by some measure, "the best algorithm." One might hope that the best algorithm also produces the best mesh, but some tradeoffs may be necessary. In this talk, I will survey several different notions of optimality in mesh generation and explore the tradeoffs between them. The bias will be towards Delaunay/Voronoi methods.
Mesh Generation
Bias: Delaunay/Voronoi refinement.
Why: We want theoretical guarantees.
Everything will be in d dimensions, where d is a constant. Constants that depend only on d will be hidden by (really) big-O's.
Optimality
Three competing goals:
- Quality: Maximize element quality (many choices for what this means).
- Mesh Size: Minimize the number of vertices/simplices; also, graded according to a density/sizing function.
- Running Time: O(n log n + m) time, where n is the number of input points and m is the output size.
The emphasis will be on asymptotic bounds and minimum requirements so as to produce the most general lower bounds.
Quality
Many different/competing notions of quality. We will focus on those that yield theoretical guarantees.
This talk: Voronoi aspect ratio, the ratio R_v/r_v of the out-radius to the in-radius of the Voronoi cell of a vertex v.
Issues: slivers, geometric stability, post-processing/smoothing.
[Figure: a Voronoi cell with in-radius r_v and out-radius R_v.]
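To make the definition concrete, here is a minimal sketch (ours, not from the talk) that estimates the Voronoi aspect ratio R_v/r_v of every bounded Voronoi cell of a 2D point set. It assumes scipy is available and uses the fact that the cell boundary nearest to v lies on the bisector with v's nearest neighboring site.

```python
import numpy as np
from scipy.spatial import Voronoi

def voronoi_aspect_ratios(points):
    """Estimate R_v / r_v for every bounded Voronoi cell."""
    vor = Voronoi(points)
    ratios = {}
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if not region or -1 in region:   # skip unbounded cells
            continue
        v = points[i]
        # Out-radius R_v: farthest corner of the cell from its site.
        R = max(np.linalg.norm(vor.vertices[j] - v) for j in region)
        # In-radius r_v: half the distance to the nearest other site,
        # since the nearest cell boundary lies on that bisector.
        d = np.linalg.norm(points - v, axis=1)
        d[i] = np.inf
        r = d.min() / 2.0
        ratios[i] = R / r
    return ratios

pts = np.random.default_rng(0).random((100, 2))
print(max(voronoi_aspect_ratios(pts).values()))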
Mesh Size
[Figure: a Voronoi cell Vor(v) with in-radius r_v and out-radius R_v.]
Prove your algorithm achieves this (algorithm-specific; not for this talk):
∀x ∈ Vor(v) : r_v ≤ f_P(x) ≤ K·R_v.
Writing µ_P(A) = ∫_A dx / f_P(x)^d for the feature size measure, letting b_v and B_v be the balls of radii r_v and R_v centered at v, V the volume of the unit d-ball, and τ = R_v/r_v:
V·(r_v/(K·R_v))^d = ∫_{b_v} dx/(K·R_v)^d ≤ ∫_{b_v} dx/f_P(x)^d ≤ ∫_{Vor(v)} dx/f_P(x)^d ≤ ∫_{Vor(v)} dx/r_v^d ≤ ∫_{B_v} dx/r_v^d = V·(R_v/r_v)^d.
Hiding constants that depend only on d:
(1/(Kτ))^d ≲ µ_P(Vor(v)) ≲ τ^d.
There is at most and at least some constant amount of mass in each Voronoi cell!
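The feature size measure can also be estimated numerically. A minimal Monte Carlo sketch (ours, not from the talk), assuming the usual definition f_P(x) = distance from x to its second-nearest point of P, and taking Ω to be the bounding box of P:

```python
import numpy as np
from scipy.spatial import cKDTree

def feature_size_measure(points, n_samples=200_000, seed=1):
    """Monte Carlo estimate of mu_P(Omega) = integral of dx / f_P(x)^d."""
    n, d = points.shape
    lo, hi = points.min(axis=0), points.max(axis=0)
    x = np.random.default_rng(seed).uniform(lo, hi, size=(n_samples, d))
    dist, _ = cKDTree(points).query(x, k=2)   # f_P(x): 2nd-nearest distance
    vol = np.prod(hi - lo)
    return vol * np.mean(1.0 / dist[:, 1] ** d)

pts = np.random.default_rng(0).random((500, 2))
print(feature_size_measure(pts))
```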
Mesh Size
Tight per-instance bounds on the mesh size can be expressed in terms of the "pacing".
Order the points. For the ith point p_i, let
a = ‖p_i − NN(p_i)‖ and b = ‖p_i − 2NN(p_i)‖,
where NN(p_i) and 2NN(p_i) denote the nearest and second-nearest neighbors of p_i among the points that precede it in the ordering.
The pacing of the ith point is φ_i = b/a.
Let φ be the geometric mean of the φ_i, so that Σ_i log φ_i = n log φ; φ is the pacing of the ordering.
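With this reading (nearest and second-nearest predecessors), the pacing of an ordering can be computed directly; a short quadratic-time sketch, with names of our choosing:

```python
import numpy as np

def pacing(points):
    """Geometric mean of phi_i = b/a over an ordered point array."""
    logs = []
    for i in range(2, len(points)):
        d = np.sort(np.linalg.norm(points[:i] - points[i], axis=1))
        a, b = d[0], d[1]          # nearest / second-nearest predecessor
        logs.append(np.log(b / a))
    return float(np.exp(np.mean(logs)))

pts = np.random.default_rng(0).random((200, 2))
print(pacing(pts))                 # pacing of this (arbitrary) ordering
```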
Mesh Size
We can write the feature size measure as a telescoping sum. Let P_i = {p_1, . . . , p_i}, so that µ_{P_i} − µ_{P_{i−1}} is the effect of adding the ith point:
µ_P = µ_{P_2} + Σ_{i=3}^{n} (µ_{P_i} − µ_{P_{i−1}}).
When the boundary is "simple" and the first two points are not too close compared to the diameter,
µ_{P_i}(Ω) − µ_{P_{i−1}}(Ω) = Θ(1 + log φ_i).
Thus,
µ_P(Ω) = µ_{P_2}(Ω) + Θ(n + n log φ),
where µ_{P_2} is the measure induced by just the first two points and µ_P(Ω) determines the output mesh size.
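A rough numerical illustration (our construction, not from the talk): estimate µ_P(Ω) by Monte Carlo over the unit box and compare it with Σ(1 + log φ_i) ≈ n + n log φ; the bound says the ratio stays bounded by constants, up to the µ_{P_2} term and boundary effects.

```python
import numpy as np
from scipy.spatial import cKDTree

def mu_P(points, n_samples=100_000, seed=3):
    d = points.shape[1]
    x = np.random.default_rng(seed).uniform(0, 1, (n_samples, d))
    dist, _ = cKDTree(points).query(x, k=2)   # f_P = 2nd-nearest distance
    return np.mean(1.0 / dist[:, 1] ** d)     # Omega = unit box, volume 1

def pacing_sum(points):                       # ~ n + n*log(phi)
    total = 0.0
    for i in range(2, len(points)):
        d = np.sort(np.linalg.norm(points[:i] - points[i], axis=1))
        total += 1.0 + np.log(d[1] / d[0])
    return total

for n in (100, 400, 1600):
    pts = np.random.default_rng(n).random((n, 2))
    print(n, mu_P(pts) / pacing_sum(pts))     # ratio should stay roughly flat
```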
Mesh Size
The previous bound implies there is only one necessary (but not sufficient) condition for the output size to be superlinear in the number of input points: large pacing.
[ Picture of bad case: a big empty annulus ]
Running Time
In an incremental construction, the points are added one at a time.
Where is the work?
1. Point location: O(log n) per input vertex.
2. Local updates: O(1) per vertex.
Goal: O(n log n + m).
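For illustration only: scipy's incremental Delaunay mode exhibits the same insert-one-at-a-time structure (scipy does its own point location internally; this is not the talk's algorithm):

```python
import numpy as np
from scipy.spatial import Delaunay

pts = np.random.default_rng(2).random((64, 2))
tri = Delaunay(pts[:4], incremental=True)
for p in pts[4:]:
    # Each insertion = point location + local updates to the triangulation.
    tri.add_points(p[None, :])
tri.close()
print(len(tri.simplices))
```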
Running Time
1. Keep it quality. Keep it sparse.
2. Avoid the one bad case. Use hierarchical structure.
3. Preprocess the input vertices for fast point location.
Running Time
1. Keep it quality. Keep it sparse.
Incremental construction:
- Recover input (vertices or features)
- Refine
- Loop
Since the mesh has good quality at all times, we avoid the worst case for Voronoi diagrams; insertions require only a constant number of local updates.
Running Time
2. Avoid the one bad case.
If you see a big empty annulus, do something different:
- hierarchies
- delayed input
Linear-size meshes are possible by relaxing the quality condition for this one case. [MPS08, HMPS09, MPS11, S12, MSV13]
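One way to detect the bad case, sketched under our reading that a big empty annulus around an insertion shows up as a large b/a ratio in the pacing sense (the threshold below is an arbitrary choice of ours):

```python
import numpy as np

def big_annulus_insertions(points, threshold=8.0):
    """Indices i whose nearest/second-nearest predecessor distances differ a lot."""
    flagged = []
    for i in range(2, len(points)):
        d = np.sort(np.linalg.norm(points[:i] - points[i], axis=1))
        if d[1] / d[0] > threshold:   # annulus of radii (d[0], d[1]) is empty
            flagged.append(i)
    return flagged
```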
Running Time
3. Preprocess the input vertices for fast point location.
How many steps? If we start the walk from the nearest already-inserted input point, we only need to take a constant number of steps.
Ordering the input points takes O(n log n) time.
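One common way to get such an ordering in O(n log n) time (not necessarily the ordering used in the talk) is to sort the points along a space-filling curve so that consecutive points are usually geometric neighbors; a 2D Morton-order sketch:

```python
import numpy as np

def morton_order(points, bits=16):
    """Indices that sort 2D points along a Z-order (Morton) curve."""
    span = np.ptp(points, axis=0).max() + 1e-12
    q = ((points - points.min(axis=0)) / span * (2**bits - 1)).astype(np.uint64)

    def spread(x):  # insert a 0 bit between consecutive bits of a 16-bit value
        x = (x | (x << np.uint64(8))) & np.uint64(0x00FF00FF)
        x = (x | (x << np.uint64(4))) & np.uint64(0x0F0F0F0F)
        x = (x | (x << np.uint64(2))) & np.uint64(0x33333333)
        x = (x | (x << np.uint64(1))) & np.uint64(0x55555555)
        return x

    keys = spread(q[:, 0]) | (spread(q[:, 1]) << np.uint64(1))
    return np.argsort(keys)

pts = np.random.default_rng(4).random((1000, 2))
order = morton_order(pts)   # insert points in this order
```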
Overview
A Defense of Theory:
- General lower bounds
- Theory can guide practice
Mesh Quality:
- Many choices.
- We focused on Voronoi aspect ratio.
Optimal Mesh Size:
- The feature size measure determines mesh size.
- The pacing determines the feature size measure.
Algorithmic suggestions for optimal running time:
- Use the Sparse Meshing paradigm.
- Adapt to large pacing.
- Preprocess for walk-based point location.

Thank You