Vector Quantization

  Aniruddh Tyagi
     02-06-12
Voronoi Region
• Blocks:
   o   A sequence of audio samples.
   o   A block of image pixels.
       Formally, a block is a vector, e.g. (0.2, 0.3, 0.5, 0.1).
• A vector quantizer maps k-dimensional vectors in the vector
  space R^k into a finite set of vectors Y = {y_i : i = 1, 2, ..., N}. Each
  vector y_i is called a code vector or a codeword, and the set of all
  the codewords is called a codebook. Associated with each
  codeword y_i is a nearest-neighbor region called its Voronoi region,
  defined by:

      V_i = { x ∈ R^k : ||x − y_i|| ≤ ||x − y_j|| for all j ≠ i }

• The set of Voronoi regions partitions the entire space R^k (a
  nearest-codeword lookup sketch follows below).
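A minimal nearest-codeword lookup in Python, to make the mapping concrete (the 2-D codebook values below are made up for illustration, not taken from the slides):

    import numpy as np

    # Codebook Y = {y_1, ..., y_N}: here N = 4 codewords in k = 2 dimensions.
    codebook = np.array([[0.0, 0.0],
                         [1.0, 0.0],
                         [0.0, 1.0],
                         [1.0, 1.0]])

    def quantize(x, codebook):
        """Return the index of the codeword closest to x (Euclidean distance),
        i.e. the index of the Voronoi region that contains x."""
        distances = np.linalg.norm(codebook - x, axis=1)
        return int(np.argmin(distances))

    x = np.array([0.2, 0.3])
    print(quantize(x, codebook))   # -> 0, since (0, 0) is the nearest codeword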
Two Dimensional Voronoi Diagram




Codewords in 2-dimensional space. Input vectors are marked with an
x, codewords are marked with red circles, and the Voronoi regions are
separated with boundary lines.
The Schematic of a Vector Quantizer
Compression Formula
• Amount of compression:
   o Codebook size is K; input vectors have dimension L.
   o To inform the decoder of which code vector is selected, we
     need to use ⌈log2 K⌉ bits.
        E.g. 8 bits are needed to index 256 code vectors.
   o Rate: each code vector carries the reconstruction values of L
     source output samples, so the number of bits per sample is
     ⌈log2 K⌉ / L (see the quick check below this list).
   o Sample: a scalar value in a vector.
   o K: the number of levels (codebook size) of the vector quantizer.
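A quick numeric check of the formula above (K and L are assumed example values):

    import math

    K, L = 256, 4                                # assumed codebook size and vector dimension
    bits_per_vector = math.ceil(math.log2(K))    # bits needed to index one codeword
    rate = bits_per_vector / L                   # bits per source sample
    print(bits_per_vector, rate)                 # -> 8 bits/vector, 2.0 bits/sample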
VQ vs SQ
Advantages of vector quantization (VQ) over scalar quantization (SQ):
• For a given rate, VQ results in lower distortion than
  SQ.
• If the source output is correlated, vectors of source
  output values will tend to fall in clusters.
   o   E.g. Example 9.3.1 in Sayood’s book.
• Even with no dependency, VQ offers greater flexibility.
   o   E.g. Example 9.3.2 in Sayood’s book.
Algorithms
• Lloyd algorithm: pdf-optimized
  quantizer; assumes the distribution is
  known.
• LBG (the vector-quantization counterpart):
  o   Continuous version (requires integral operations).
  o   Modified version: works with a training set.
LBG Algorithm
1. Determine the number of codewords, N, i.e. the size of the
   codebook.
2. Select N codewords at random and let them be the initial
   codebook. The initial codewords can be randomly chosen from the set
   of input vectors.
3. Using the Euclidean distance measure, cluster the vectors
   around each codeword. This is done by taking each input vector and
   finding the Euclidean distance between it and each codeword. The
   input vector belongs to the cluster of the codeword that yields the
   minimum distance.
LBG Algorithm (contd.)
4. Compute the new set of codewords. This is done by taking the
average of each cluster: add the vectors in the cluster component by
component and divide by the number of vectors in the cluster,

      y_i = (1/m) · Σ_{j=1..m} x_{ji}

where i indexes the components of each vector (x, y, z, ... directions) and m is
the number of vectors in the cluster.

5. Repeat steps 3 and 4 until either the codewords don't change or
the change in the codewords is small (a code sketch follows below).
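A compact sketch of the iteration just described, written in Python/NumPy (variable names and stopping tolerance are illustrative choices, not from the slides; the empty-cell problem discussed on the next slide is simply ignored here):

    import numpy as np

    def lbg(training, N, tol=1e-6, max_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        # Step 2: pick N initial codewords at random from the training set.
        codebook = training[rng.choice(len(training), N, replace=False)].copy()
        for _ in range(max_iter):
            # Step 3: assign each training vector to its nearest codeword.
            dist = np.linalg.norm(training[:, None, :] - codebook[None, :, :], axis=2)
            labels = dist.argmin(axis=1)
            # Step 4: replace each codeword by the average of its cluster.
            new_codebook = codebook.copy()
            for i in range(N):
                members = training[labels == i]
                if len(members) > 0:              # empty clusters are left unchanged
                    new_codebook[i] = members.mean(axis=0)
            # Step 5: stop when the codewords barely change.
            if np.linalg.norm(new_codebook - codebook) < tol:
                codebook = new_codebook
                break
            codebook = new_codebook
        return codebook, labels

    training = np.random.default_rng(1).normal(size=(1000, 2))
    codebook, labels = lbg(training, N=8)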
Other Algorithms
• Problem: LBG is a greedy algorithm and may fall into a
  local minimum.
• Four methods for selecting the initial vectors:
  o   Random selection.
  o   Splitting (with a perturbation vector); a sketch follows this list.
  o   Training with different subsets.
  o   PNN (pairwise nearest neighbor).
• Empty cell problem:
  o   No input vectors correspond to an output vector.
  o   Solution: give that codeword to another cluster, e.g. the most populated cluster.
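A small sketch of the splitting initialization (the perturbation factor eps is an assumed value; in practice a few LBG iterations, e.g. with the lbg() sketch above, would refine the codebook after every split):

    import numpy as np

    def split_initialize(training, N, eps=0.01):
        # Start from a single codeword: the centroid of the training set.
        codebook = training.mean(axis=0, keepdims=True)
        while len(codebook) < N:
            # Split every codeword into two slightly perturbed copies.
            codebook = np.vstack([codebook * (1 + eps),
                                  codebook * (1 - eps)])
        return codebook[:N]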
LBG for image compression
• Take blocks of the image as vectors of dimension L = N×M (a
  blocking sketch follows this list).
• If there are K vectors in the codebook:
   o We need to use ⌈log2 K⌉ bits per block.
   o Rate: ⌈log2 K⌉ / (N×M) bits per pixel.
• The higher the value of K, the better the quality, but the lower the
  compression ratio.
• Overhead to transmit the codebook: the K codewords of dimension L
  must also be sent to the decoder, which adds to the effective rate.
• Train with a set of images.
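A sketch of turning an image into training vectors by cutting it into N×M blocks, so that each vector has dimension L = N·M (block size and image values below are assumptions for illustration):

    import numpy as np

    def image_to_vectors(img, N=4, M=4):
        H, W = img.shape
        H, W = H - H % N, W - W % M               # crop so the image tiles exactly
        blocks = (img[:H, :W]
                  .reshape(H // N, N, W // M, M)
                  .swapaxes(1, 2)
                  .reshape(-1, N * M))
        return blocks                              # one row per block: an L-dimensional vector

    img = np.random.randint(0, 256, size=(64, 64))
    vectors = image_to_vectors(img)                # 256 vectors of dimension 16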
Rate-Dimension Product
• Rate-dimension product
  o The size of the codebook increases exponentially with the rate.
  o Suppose we want to encode a source using R bits/sample. If
    we use an L-dimensional quantizer, we would group L samples together
    into vectors. This means that we would have RL bits available
    to represent each vector.
  o With RL bits, we can represent 2^(RL) output vectors (a
    numeric check follows below).
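A quick numeric check of the rate-dimension product (R and L are assumed example values):

    R, L = 2, 8                  # bits per sample, vector dimension
    codebook_size = 2 ** (R * L)
    print(codebook_size)         # -> 65536 codewords needed for 2 bits/sample at L = 8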
Tree structured VQ
• Place the output vectors in different quadrants; then only the signs of the
  input vector's components need to be compared. This reduces the number of
  comparisons by a factor of 2^L for the L-dimensional problem.
• This works well for symmetric distributions, but not when we lose
  more and more symmetry.
Tree Structured Vector Quantizer
•   Extend to the non-symmetric case:
    o   Divide the set of output points into two groups, g0 and g1, and assign to
        each group a test vector such that the output points in each group are closer
        to the test vector assigned to that group than to the test vector assigned to
        the other group.
    o   Label the two test vectors 0 and 1.
    o   When we get an input vector, we compare it against the two test vectors.
        Depending on the outcome, the input is then compared to the output points
        associated with the test vector closest to the input.
    o   After these two comparisons, we can discard half of the output points.
    o   Comparison with the test vectors takes the place of looking at the signs of
        the components to decide which set of output points to discard from
        contention.
    o   If the total number of output points is K, we make (K/2) + 2 comparisons
        instead of K comparisons.
    o   We can continue to expand the number of groups, eventually making 2·log2(K)
        comparisons instead of K (2 comparisons with the test vectors at each of the
        log2(K) stages).
Tree Structured VQ (continued)
• Since the test vectors are assigned to groups labeled 0, 1,
  00, 01, 10, 11, 000, 001, 010, 011, 100, 101, 110, 111, etc.,
  which are the nodes of a binary tree, this VQ is called a
  “Tree Structured VQ” (a search sketch follows below).
• Penalty:
  o Possible increase in distortion: it is possible that at some
    stage the input is closer to one test vector while at the same
    time being closest to an output point belonging to the rejected
    group.
  o Increased storage: the output points of the VQ codebook plus the
    test vectors.
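A minimal sketch of the tree-structured search described above (the nested-dict tree layout is an assumed representation, not something defined in the slides; building the tree, i.e. choosing the test vectors, is not shown):

    import numpy as np

    def tsvq_encode(x, tree):
        """tree is either a leaf codeword (ndarray) or a dict
        {'tests': (t0, t1), 'children': (subtree0, subtree1)}."""
        node, bits = tree, []
        while isinstance(node, dict):
            t0, t1 = node['tests']
            b = int(np.linalg.norm(x - t1) < np.linalg.norm(x - t0))
            bits.append(b)                 # the 0/1 path labels form the codeword index
            node = node['children'][b]
        return bits, node                  # binary index and the selected codeword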
Additional Links
• Slides are adapted from:
  http://www.data-compression.com
  and
  http://www.geocities.com/mohamedqasem/vectorquantization/vq.html
