Chapter 13 Discrete Image Transforms

13.1 Introduction
13.2 Linear transformations


One-dimensional discrete linear transformations

Definition. If x is an N-by-1 vector and T is an N-by-N matrix, then y = Tx is a linear transformation of the vector x, i.e.,

$$y_i = \sum_{j=0}^{N-1} t_{ij}\, x_j$$

T is called the kernel matrix of the transformation.
Linear Transformations

Rotation and scaling are examples of linear transformations. A rotation through angle θ, for example, is

$$\begin{bmatrix} y_1 \\ y_2 \end{bmatrix} = \begin{bmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}$$

If T is a nonsingular matrix, the inverse linear transformation is

$$\mathbf{x} = T^{-1}\mathbf{y}$$
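As a small added sketch (not part of the original slides), the rotation kernel and its inverse can be checked numerically with NumPy:

```python
import numpy as np

theta = np.pi / 6
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation kernel matrix

x = np.array([1.0, 2.0])
y = T @ x                          # forward transformation y = Tx
x_back = np.linalg.inv(T) @ y      # inverse transformation x = T^{-1} y
assert np.allclose(x_back, x)
```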
Unitary Transforms

If the kernel matrix of a linear system is a unitary matrix, the linear system is called a unitary transformation. A matrix T is unitary if

$$T^{-1} = (T^{*})^{t}, \quad \text{i.e.,} \quad T(T^{*})^{t} = (T^{*})^{t}\,T = I$$

If T is unitary and real, then T is an orthogonal matrix:

$$T^{-1} = T^{t}, \quad \text{i.e.,} \quad TT^{t} = T^{t}T = I$$
Unitary Transforms

The rows (also the columns) of an orthogonal matrix form a set of orthogonal vectors. The one-dimensional DFT is an example of a unitary transform:

$$F_k = \frac{1}{\sqrt{N}} \sum_{i=0}^{N-1} f_i \exp\!\left(-j2\pi\frac{ik}{N}\right)$$

or

$$\mathbf{F} = W\mathbf{f}$$

where the matrix W is a unitary matrix (the 1/√N normalization is what makes W unitary).
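A minimal NumPy sketch (added for illustration) builds W and verifies that it is unitary and agrees with np.fft.fft up to the 1/√N normalization:

```python
import numpy as np

N = 8
i, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
W = np.exp(-2j * np.pi * i * k / N) / np.sqrt(N)   # unitary DFT kernel

assert np.allclose(W @ W.conj().T, np.eye(N))      # W (W*)^t = I

f = np.random.rand(N)
F = W @ f
assert np.allclose(F, np.fft.fft(f) / np.sqrt(N))  # same result, rescaled
```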
Unitary Transforms

Interpretation:

- The rows (or columns) of a unitary matrix form an orthonormal basis for the N-dimensional vector space.
- A unitary transform can be viewed as a coordinate transformation, rotating the vector in N-space without changing its length.
- A unitary linear transformation converts an N-dimensional vector to an N-dimensional vector of transform coefficients, each of which is computed as the inner product of the input vector x with one of the rows of the transform matrix T.
- The forward transformation is referred to as analysis, and the backward transformation is referred to as synthesis.
Two-dimensional discrete linear transformations

A two-dimensional linear transformation is

$$G_{mn} = \sum_{i=0}^{N-1} \sum_{k=0}^{N-1} F_{ik}\, t(i,k;m,n)$$

where t(i, k; m, n) forms an N²-by-N² block matrix having N-by-N blocks, each of which is an N-by-N matrix.

If t(i, k; m, n) = t_r(i, m) t_c(k, n), the transformation is called separable:

$$G_{mn} = \sum_{i=0}^{N-1} \left[\sum_{k=0}^{N-1} F_{ik}\, t_c(k,n)\right] t_r(i,m)$$
2-D separable symmetric unitary transforms

If t(i, k; m, n) = t(i, m) t(k, n), the transformation is symmetric:

$$G_{mn} = \sum_{i=0}^{N-1} \left[\sum_{k=0}^{N-1} F_{ik}\, t(k,n)\right] t(i,m)$$

or

$$G = TFT$$

The inverse transform is

$$F = T^{-1} G\, T^{-1} = (T^{*})^{t}\, G\, (T^{*})^{t}$$

Example: the two-dimensional DFT,

$$G = WFW, \qquad F = (W^{*})^{t}\, G\, (W^{*})^{t}$$
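As a numerical sanity check (an added sketch, reusing the unitary W defined earlier), G = WFW agrees with NumPy's 2-D FFT up to a factor of 1/N:

```python
import numpy as np

N = 8
i, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
W = np.exp(-2j * np.pi * i * k / N) / np.sqrt(N)

F = np.random.rand(N, N)
G = W @ F @ W                               # separable 2-D unitary DFT
assert np.allclose(G, np.fft.fft2(F) / N)   # matches the FFT, rescaled

F_back = W.conj().T @ G @ W.conj().T        # inverse: F = (W*)^t G (W*)^t
assert np.allclose(F_back, F)
```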
Orthogonal transformations

If the matrix T is real, the linear transformation is called an orthogonal transformation. Its inverse transformation is

$$F = T^{t}\, G\, T^{t}$$

If T is also a symmetric matrix, the forward and inverse transformations are identical:

$$G = TFT \quad \text{and} \quad F = TGT$$
Basis functions and basis images

Basis functions

The rows of a unitary matrix form an orthonormal basis for an N-dimensional vector space. Different unitary transformations make different choices of basis vectors. A unitary transformation corresponds to a rotation of a vector in an N-dimensional (N² for the two-dimensional case) vector space.

Basis Images

The inverse unitary transform can be viewed as the weighted sum of N² basis images, each of which is the inverse transformation of a matrix G = {δ_{i-p,k-q}}:

$$F_{mn} = \sum_{i=0}^{N-1} t(i,m) \left[\sum_{k=0}^{N-1} \delta_{i-p,k-q}\, t(k,n)\right] = t(p,m)\, t(q,n)$$

This means that each basis image is the outer product of two rows of the transform matrix. Any image can be decomposed into a set of basis images by the forward transformation, and the inverse transformation reconstitutes the image by summing the basis images.
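To make this concrete, here is a brief added sketch (using the unitary DFT matrix W as the example kernel) that forms basis images as outer products and reconstitutes an image from them:

```python
import numpy as np

N = 8
i, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
W = np.exp(-2j * np.pi * i * k / N) / np.sqrt(N)
A = W.conj().T                          # inverse kernel (W*)^t

p, q = 2, 3
basis_pq = np.outer(A[:, p], A[q, :])   # one basis image

F = np.random.rand(N, N)
G = W @ F @ W                           # forward transform coefficients
F_sum = sum(G[m, n] * np.outer(A[:, m], A[n, :])
            for m in range(N) for n in range(N))
assert np.allclose(F_sum, F)            # weighted sum of basis images
```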
13.4 Sinusoidal Transforms
13.4.1 The discrete Fourier transform

The forward and inverse DFTs are

$$\mathbf{F} = W\mathbf{f} \quad \text{and} \quad \mathbf{f} = (W^{*})^{t}\,\mathbf{F}$$

where

$$W = \begin{bmatrix} w_{0,0} & \cdots & w_{0,N-1} \\ \vdots & \ddots & \vdots \\ w_{N-1,0} & \cdots & w_{N-1,N-1} \end{bmatrix}, \qquad w_{i,k} = \frac{1}{\sqrt{N}}\, e^{-j2\pi\frac{ik}{N}}$$
13.4 Sinusoidal Transforms

The spectrum vector

The frequency corresponding to the ith element of F is

$$s_i = \begin{cases} \dfrac{2i}{N}\, f_N & 0 \le i \le N/2 \\[6pt] \dfrac{2(N-i)}{N}\, f_N & N/2+1 \le i \le N-1 \end{cases}$$

where f_N is the Nyquist (highest) frequency. The frequency components are arranged as

element:    0    1            ...   i              ...   N/2   ...   i                   ...   N-1
frequency:  0    f_N/(N/2)    ...   i f_N/(N/2)    ...   f_N   ...   -(N-i) f_N/(N/2)    ...   -f_N/(N/2)
13.4 Sinusoidal Transforms

The frequencies are symmetric about the highest frequency component. Using a circular right shift by N/2, we can place the zero frequency at N/2, with frequency increasing in both directions from there and the Nyquist (highest) frequency at F_0. This can be done by changing the signs of the odd-numbered elements of f(x) prior to computing the DFT, because

$$F(u) \Leftrightarrow f(x) \;\Rightarrow\; F(u - N/2) \Leftrightarrow \exp\!\left(j2\pi x\,\frac{N/2}{N}\right) f(x) = \exp(j\pi x)\, f(x) = (-1)^{x} f(x)$$

The two-dimensional DFT

For the two-dimensional DFT, changing the sign of half the elements of the image matrix shifts its zero-frequency component to the center of the spectrum:

$$F(u,v) \Leftrightarrow f(x,y) \;\Rightarrow\; F(u - N/2,\, v - N/2) \Leftrightarrow (-1)^{x+y} f(x,y)$$

(Diagram: the four quadrants of the spectrum, numbered 1-4, are swapped diagonally by the shift.)
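A short added sketch (illustrative only) verifies the sign-flip centering against np.fft.fftshift:

```python
import numpy as np

N = 8
x, y = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
f = np.random.rand(N, N)

centered = np.fft.fft2(f * (-1.0) ** (x + y))   # sign flip before the DFT
shifted = np.fft.fftshift(np.fft.fft2(f))       # explicit circular shift
assert np.allclose(centered, shifted)
```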
13.4.2 Discrete Cosine Transform

The two-dimensional discrete cosine transform (DCT) is defined as

$$G_c(m,n) = \alpha(m)\,\alpha(n) \sum_{i=0}^{N-1} \sum_{k=0}^{N-1} g(i,k)\, \cos\!\left[\frac{\pi(2i+1)m}{2N}\right] \cos\!\left[\frac{\pi(2k+1)n}{2N}\right]$$

and its inverse is

$$g(i,k) = \sum_{m=0}^{N-1} \sum_{n=0}^{N-1} \alpha(m)\,\alpha(n)\, G_c(m,n)\, \cos\!\left[\frac{\pi(2i+1)m}{2N}\right] \cos\!\left[\frac{\pi(2k+1)n}{2N}\right]$$

where

$$\alpha(0) = \sqrt{\frac{1}{N}} \quad \text{and} \quad \alpha(m) = \sqrt{\frac{2}{N}} \;\;\text{for}\;\; 1 \le m \le N-1$$
The discrete cosine transform

The DCT can be expressed in unitary matrix form as

$$G_c = C^{t} g\, C$$

where the kernel matrix has elements

$$C_{i,m} = \alpha(m)\, \cos\!\left[\frac{\pi(2i+1)m}{2N}\right]$$

(rows indexed by the spatial variable i, columns by the frequency variable m). The DCT is useful in image compression.
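A minimal NumPy sketch (added here, following the C_{i,m} convention above) builds the kernel and round-trips an image:

```python
import numpy as np

def dct_kernel(N):
    # C[i, m] = alpha(m) * cos(pi * (2i + 1) * m / (2N))
    i = np.arange(N).reshape(-1, 1)
    m = np.arange(N)
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * i + 1) * m / (2 * N))
    C[:, 0] = np.sqrt(1.0 / N)           # alpha(0) = sqrt(1/N)
    return C

C = dct_kernel(8)
assert np.allclose(C.T @ C, np.eye(8))   # columns are orthonormal

g = np.random.rand(8, 8)
G = C.T @ g @ C                          # forward 2-D DCT
assert np.allclose(C @ G @ C.T, g)       # inverse reconstructs g
```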
The sine transform

The discrete sine transform (DST) is defined as

$$G_s(m,n) = \frac{2}{N+1} \sum_{i=0}^{N-1} \sum_{k=0}^{N-1} g(i,k)\, \sin\!\left[\frac{\pi(i+1)(m+1)}{N+1}\right] \sin\!\left[\frac{\pi(k+1)(n+1)}{N+1}\right]$$

and

$$g(i,k) = \frac{2}{N+1} \sum_{m=0}^{N-1} \sum_{n=0}^{N-1} G_s(m,n)\, \sin\!\left[\frac{\pi(i+1)(m+1)}{N+1}\right] \sin\!\left[\frac{\pi(k+1)(n+1)}{N+1}\right]$$

The DST has the unitary kernel

$$t_{i,k} = \sqrt{\frac{2}{N+1}}\, \sin\!\left[\frac{\pi(i+1)(k+1)}{N+1}\right]$$
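Since the DST kernel is real, symmetric, and unitary, it is its own inverse; a small added check (illustrative only) confirms this:

```python
import numpy as np

N = 8
i, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
T = np.sqrt(2.0 / (N + 1)) * np.sin(np.pi * (i + 1) * (k + 1) / (N + 1))

assert np.allclose(T @ T, np.eye(N))   # symmetric and self-inverse

g = np.random.rand(N, N)
G = T @ g @ T                          # forward 2-D DST
assert np.allclose(T @ G @ T, g)       # identical inverse
```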
The Hartley transform

The forward two-dimensional discrete Hartley transform (DHT) is

$$G_{m,n} = \frac{1}{N} \sum_{i=0}^{N-1} \sum_{k=0}^{N-1} g_{i,k}\, \mathrm{cas}\!\left[\frac{2\pi}{N}(im + kn)\right]$$

and the inverse DHT is

$$g_{i,k} = \frac{1}{N} \sum_{m=0}^{N-1} \sum_{n=0}^{N-1} G_{m,n}\, \mathrm{cas}\!\left[\frac{2\pi}{N}(im + kn)\right]$$

where the basis function is

$$\mathrm{cas}(\theta) = \cos(\theta) + \sin(\theta) = \sqrt{2}\,\cos(\theta - \pi/4)$$
The Hartley transform

The unitary kernel matrix of the Hartley transform has elements

$$t_{i,k} = \frac{1}{\sqrt{N}}\, \mathrm{cas}\!\left(2\pi\frac{ik}{N}\right)$$

The Hartley transform is the real part minus the imaginary part of the corresponding Fourier transform, and the Fourier transform is the even part minus j times the odd part of the Hartley transform.
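An added sketch (assuming a real input) checks both the self-inverse property of the cas kernel and its relation to the DFT:

```python
import numpy as np

N = 8
i, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
theta = 2 * np.pi * i * k / N
T = (np.cos(theta) + np.sin(theta)) / np.sqrt(N)   # cas kernel

assert np.allclose(T @ T, np.eye(N))               # symmetric, self-inverse

f = np.random.rand(N)
F = np.fft.fft(f) / np.sqrt(N)
assert np.allclose(T @ f, F.real - F.imag)         # DHT = Re(DFT) - Im(DFT)
```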
13.5 Rectangular wave transforms

13.5.1 The Hadamard Transform

Also called the Walsh transform, the Hadamard transform is a symmetric, separable orthogonal transformation that has only +1 and -1 as elements in its kernel matrix. It exists only for N = 2^n.

For the two-by-two case,

$$H_2 = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}$$

and for the general case,

$$H_N = \frac{1}{\sqrt{N}} \begin{bmatrix} H_{N/2} & H_{N/2} \\ H_{N/2} & -H_{N/2} \end{bmatrix}$$

where the H_{N/2} blocks are taken without their normalizing factor.
The Hadamard Transform

For N = 8, in natural (Hadamard) order, with the sequency (number of sign changes) of each row listed at the right:

$$H_8 = \frac{1}{2\sqrt{2}} \begin{bmatrix}
 1 &  1 &  1 &  1 &  1 &  1 &  1 &  1 \\
 1 & -1 &  1 & -1 &  1 & -1 &  1 & -1 \\
 1 &  1 & -1 & -1 &  1 &  1 & -1 & -1 \\
 1 & -1 & -1 &  1 &  1 & -1 & -1 &  1 \\
 1 &  1 &  1 &  1 & -1 & -1 & -1 & -1 \\
 1 & -1 &  1 & -1 & -1 &  1 & -1 &  1 \\
 1 &  1 & -1 & -1 & -1 & -1 &  1 &  1 \\
 1 & -1 & -1 &  1 & -1 &  1 &  1 & -1
\end{bmatrix}
\begin{matrix} 0 \\ 7 \\ 3 \\ 4 \\ 1 \\ 6 \\ 2 \\ 5 \end{matrix}$$

Reordering the rows by increasing sequency gives the ordered Hadamard transform:

$$H_8 = \frac{1}{2\sqrt{2}} \begin{bmatrix}
 1 &  1 &  1 &  1 &  1 &  1 &  1 &  1 \\
 1 &  1 &  1 &  1 & -1 & -1 & -1 & -1 \\
 1 &  1 & -1 & -1 & -1 & -1 &  1 &  1 \\
 1 &  1 & -1 & -1 &  1 &  1 & -1 & -1 \\
 1 & -1 & -1 &  1 &  1 & -1 & -1 &  1 \\
 1 & -1 & -1 &  1 & -1 &  1 &  1 & -1 \\
 1 & -1 &  1 & -1 & -1 &  1 & -1 &  1 \\
 1 & -1 &  1 & -1 &  1 & -1 &  1 & -1
\end{bmatrix}
\begin{matrix} 0 \\ 1 \\ 2 \\ 3 \\ 4 \\ 5 \\ 6 \\ 7 \end{matrix}$$
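A compact recursive construction (an added sketch; it applies the 1/√2 normalization at each level, which is equivalent to the single 1/√N factor) is:

```python
import numpy as np

def hadamard(N):
    # Natural-order Hadamard kernel, normalized so that H @ H.T = I
    if N == 1:
        return np.array([[1.0]])
    H = hadamard(N // 2)
    return np.block([[H, H], [H, -H]]) / np.sqrt(2)

H8 = hadamard(8)
assert np.allclose(H8 @ H8.T, np.eye(8))
assert np.allclose(H8, H8.T)      # symmetric: forward and inverse coincide

g = np.random.rand(8, 8)
G = H8 @ g @ H8                   # 2-D Hadamard transform
assert np.allclose(H8 @ G @ H8, g)
```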
13.5.3 The Slant Transform

The orthogonal kernel matrix for the slant transform is obtained iteratively, starting from

$$S_2 = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}$$

and, for N = 4,

$$S_4 = \frac{1}{\sqrt{2}} \begin{bmatrix}
 1 & 0 & 1 & 0 \\
 \frac{2}{\sqrt{5}} & \frac{1}{\sqrt{5}} & -\frac{2}{\sqrt{5}} & \frac{1}{\sqrt{5}} \\
 0 & 1 & 0 & -1 \\
 -\frac{1}{\sqrt{5}} & \frac{2}{\sqrt{5}} & \frac{1}{\sqrt{5}} & \frac{2}{\sqrt{5}}
\end{bmatrix}
\begin{bmatrix} S_2 & 0 \\ 0 & S_2 \end{bmatrix}
= \frac{1}{\sqrt{2}} \begin{bmatrix}
 \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\
 \frac{3}{\sqrt{10}} & \frac{1}{\sqrt{10}} & -\frac{1}{\sqrt{10}} & -\frac{3}{\sqrt{10}} \\
 \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\
 \frac{1}{\sqrt{10}} & -\frac{3}{\sqrt{10}} & \frac{3}{\sqrt{10}} & -\frac{1}{\sqrt{10}}
\end{bmatrix}$$
13.5.3 The Slant Transform

In general,

$$S_N = \frac{1}{\sqrt{2}} \begin{bmatrix}
 1 & 0 & \mathbf{0}^{t} & 1 & 0 & \mathbf{0}^{t} \\
 a_N & b_N & \mathbf{0}^{t} & -a_N & b_N & \mathbf{0}^{t} \\
 \mathbf{0} & \mathbf{0} & I & \mathbf{0} & \mathbf{0} & I \\
 0 & 1 & \mathbf{0}^{t} & 0 & -1 & \mathbf{0}^{t} \\
 -b_N & a_N & \mathbf{0}^{t} & b_N & a_N & \mathbf{0}^{t} \\
 \mathbf{0} & \mathbf{0} & I & \mathbf{0} & \mathbf{0} & -I
\end{bmatrix}
\begin{bmatrix} S_{N/2} & 0 \\ 0 & S_{N/2} \end{bmatrix}$$

where I is the identity matrix of order N/2 - 2 and

$$a_{2N}^{2} = \frac{3N^{2}}{4N^{2} - 1}, \qquad b_{2N}^{2} = \frac{N^{2} - 1}{4N^{2} - 1}$$
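As a quick added check (using the closed-form S_4 above; a_4 = 2/√5 and b_4 = 1/√5 follow from these formulas with N = 2), the rows of S_4 are orthonormal:

```python
import numpy as np

s2, s10 = np.sqrt(2), np.sqrt(10)
S4 = (1 / s2) * np.array([
    [1 / s2,   1 / s2,   1 / s2,   1 / s2],
    [3 / s10,  1 / s10, -1 / s10, -3 / s10],
    [1 / s2,  -1 / s2,  -1 / s2,   1 / s2],
    [1 / s10, -3 / s10,  3 / s10, -1 / s10],
])
assert np.allclose(S4 @ S4.T, np.eye(4))   # orthonormal rows
```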
13.5.3 The Slant Transform

(Figure: the eight slant basis functions for N = 8, panels numbered 0-7.)
13.5.4 The Haar Transform

The basis functions of the Haar transform: for any integer 0 ≤ k ≤ N-1, let

$$k = 2^{p} + q - 1, \qquad p \ge 0,\; q \ge 1$$

where 2^p is the largest power of 2 such that 2^p ≤ k, and q - 1 is the remainder. The Haar function of index zero is

$$h_0(x) = \frac{1}{\sqrt{N}}$$
13.5.4 The Haar Transform

For k > 0,

$$h_k(x) = \frac{1}{\sqrt{N}} \begin{cases}
 2^{p/2} & \dfrac{q-1}{2^{p}} \le x < \dfrac{q-1/2}{2^{p}} \\[8pt]
 -2^{p/2} & \dfrac{q-1/2}{2^{p}} \le x < \dfrac{q}{2^{p}} \\[8pt]
 0 & \text{otherwise}
\end{cases}$$

The 8-by-8 Haar orthogonal kernel matrix is

$$H_r = \frac{1}{2\sqrt{2}} \begin{bmatrix}
 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 \\
 1 & 1 & 1 & 1 & -1 & -1 & -1 & -1 \\
 \sqrt{2} & \sqrt{2} & -\sqrt{2} & -\sqrt{2} & 0 & 0 & 0 & 0 \\
 0 & 0 & 0 & 0 & \sqrt{2} & \sqrt{2} & -\sqrt{2} & -\sqrt{2} \\
 2 & -2 & 0 & 0 & 0 & 0 & 0 & 0 \\
 0 & 0 & 2 & -2 & 0 & 0 & 0 & 0 \\
 0 & 0 & 0 & 0 & 2 & -2 & 0 & 0 \\
 0 & 0 & 0 & 0 & 0 & 0 & 2 & -2
\end{bmatrix}$$
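The kernel can be generated directly from the definition of h_k(x); this added sketch (sampling at x = i/N) reproduces the 8-by-8 matrix:

```python
import numpy as np

def haar_matrix(N):
    # Rows are the sampled Haar functions h_k(i / N)
    H = np.zeros((N, N))
    H[0, :] = 1.0 / np.sqrt(N)
    for k in range(1, N):
        p = int(np.log2(k))        # largest power of 2 with 2**p <= k
        q = k - 2**p + 1           # so that k = 2**p + q - 1
        for i in range(N):
            x = i / N
            if (q - 1) / 2**p <= x < (q - 0.5) / 2**p:
                H[k, i] = 2 ** (p / 2) / np.sqrt(N)
            elif (q - 0.5) / 2**p <= x < q / 2**p:
                H[k, i] = -(2 ** (p / 2)) / np.sqrt(N)
    return H

H8 = haar_matrix(8)
assert np.allclose(H8 @ H8.T, np.eye(8))   # orthonormal rows
```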
13.5.4 The Haar Transform

(Figures: Haar basis functions for N = 8; basis images of the Haar transform, the Hadamard transform, and the DCT.)
13.6 Eigenvector-based transforms

Eigenvalues and eigenvectors

For an N-by-N matrix A, a scalar λ satisfying

$$|A - \lambda I| = 0$$

is called an eigenvalue of A. A vector v that satisfies

$$A\mathbf{v} = \lambda\mathbf{v}$$

is called an eigenvector of A.
13.6 Eigenvector-based transforms

13.6.2 Principal-Component Analysis

Suppose x is an N-by-1 random vector. Its mean vector can be estimated from L samples as

$$\mathbf{m}_x \approx \frac{1}{L} \sum_{l=1}^{L} \mathbf{x}_l$$

and its covariance matrix can be estimated by

$$C_x = E\{(\mathbf{x} - \mathbf{m}_x)(\mathbf{x} - \mathbf{m}_x)^{t}\} \approx \frac{1}{L} \sum_{l=1}^{L} \mathbf{x}_l \mathbf{x}_l^{t} - \mathbf{m}_x \mathbf{m}_x^{t}$$

The matrix C_x is real and symmetric.
13.6 Eigenvector-based transforms

Let A be a matrix whose rows are the eigenvectors of C_x. Then C_y = A C_x A^t is a diagonal matrix having the eigenvalues of C_x along its diagonal, i.e.,

$$C_y = \begin{bmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_N \end{bmatrix}$$

Let the matrix A define a linear transformation by

$$\mathbf{y} = A(\mathbf{x} - \mathbf{m}_x)$$
13.6 Eigenvector-based transforms

It can be shown that the covariance matrix of the vector y is C_y = A C_x A^t. Since C_y is a diagonal matrix, its off-diagonal elements are zero, and the elements of y are uncorrelated. Thus the linear transformation removes the correlation among the variables. The reverse transform

$$\mathbf{x} = A^{-1}\mathbf{y} + \mathbf{m}_x = A^{t}\mathbf{y} + \mathbf{m}_x$$

reconstructs x from y.
13.6 Dimension Reduction

We can reduce the dimensionality of the y vector by ignoring one or more of the eigenvectors that have small eigenvalues. Let B be the M-by-N matrix (M < N) formed by discarding the lower N - M rows of A, and let m_x = 0 for simplicity. Then the transformed vector has smaller dimension:

$$\hat{\mathbf{y}} = B\mathbf{x}$$
13.6 Dimension Reduction

The vector x can be reconstructed (approximately) by

$$\hat{\mathbf{x}} = B^{t}\hat{\mathbf{y}}$$

The mean square error is

$$\mathrm{MSE} = \sum_{k=M+1}^{N} \lambda_k$$

The vector ŷ is called the principal component of the vector x.
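An end-to-end added sketch (toy data; eigenvalues sorted so the discarded ones are the smallest) shows that the reconstruction MSE matches the sum of the discarded eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
N, L, M = 4, 100000, 2
x = rng.standard_normal((N, L)) * np.array([[3.0], [2.0], [0.3], [0.1]])

m_x = x.mean(axis=1, keepdims=True)
C_x = (x - m_x) @ (x - m_x).T / L        # covariance estimate

lam, V = np.linalg.eigh(C_x)             # ascending eigenvalues
A = V[:, ::-1].T                         # rows = eigenvectors, largest first

B = A[:M, :]                             # keep the M principal rows
y_hat = B @ (x - m_x)                    # reduced-dimension coefficients
x_hat = B.T @ y_hat + m_x                # approximate reconstruction

mse = np.mean(np.sum((x - x_hat) ** 2, axis=0))
print(mse, lam[:N - M].sum())            # MSE ~= sum of discarded eigenvalues
```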
13.6.3 The Karhunen-Loeve Transform

The K-L transform is defined as

$$\mathbf{y} = A(\mathbf{x} - \mathbf{m}_x)$$

The dimension-reducing capability of the K-L transform makes it quite useful for image compression. When the image is a first-order Markov process, where the correlation between pixels decreases exponentially with their separation distance, the basis images for the K-L transform can be written explicitly.
13.6.3 The Karhunen-Loeve Transform

When the correlation between adjacent pixels approaches unity, the K-L basis functions approach those of the discrete cosine transform. Thus the DCT is a good approximation to the K-L transform.
13.6.4 The SVD Transform

Singular value decomposition: any N-by-N matrix A can be decomposed as

$$A = U \Lambda V^{t}$$

where the columns of U and V are the eigenvectors of AA^t and A^tA, respectively, and Λ is an N-by-N diagonal matrix containing the singular values of A.

The forward singular value decomposition (SVD) transform is

$$\Lambda = U^{t} A V$$

and the inverse SVD transform is

$$A = U \Lambda V^{t}$$
13.6.4 The SVD transform

For the SVD transform, the kernel matrices are image-dependent. The SVD has very high compression power: lossless compression by a factor of at least N is possible, and even higher lossy compression ratios can be achieved by ignoring some of the small singular values.
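A brief added sketch of the lossy case (keeping only the k largest singular values of a random test matrix):

```python
import numpy as np

A = np.random.rand(64, 64)
U, s, Vt = np.linalg.svd(A)                    # A = U @ diag(s) @ Vt

k = 8                                          # keep the k largest singular values
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]    # rank-k approximation

err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(f"relative error keeping {k}/64 singular values: {err:.3f}")
```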
The SVD transform

(Illustration: Figure 13-7.)
13.7 Transform domain filtering

As in the Fourier transform domain, filters can be designed in other transform domains. Transform-domain filtering involves modifying the weighting coefficients prior to reconstruction of the image via the inverse transform. If either the desired components or the undesired components of the image resemble one or a few of the basis images of a particular transform, then that transform will be useful in separating the two.
13.7 Transform domain filtering

The Haar transform is a good candidate for detecting vertical and horizontal lines and edges.
13.7 Transform domain filtering

(Illustrations: Figures 13-8 and 13-9.)