17. Support Vector Machine

2. History
Pre-1980s
- Almost all learning methods learned linear decision surfaces.
- Linear learning methods have nice theoretical properties.
1980s
- Learning methods were still dominated by linear decision surfaces, with the same nice theoretical properties.
1990s
- Efficient learning algorithms for non-linear functions, based on computational learning theory, were developed.
- These also have nice theoretical properties.
3. History
Two independent developments within the last decade:
– Computational learning theory
– New, efficient handling of non-linear separability using "kernel functions"
The resulting learning algorithm is an optimization algorithm rather than a greedy search. (A greedy search commits to the locally best choice at each step and can get stuck in a local optimum; the SVM formulation instead solves a convex optimization problem with a global optimum.)
7. Learning Theory
A system receives data x as input and outputs a function f that can predict some features of future data.
(Figure: input x and tunable parameters α feed the learned function f, which produces the estimate yest.)
For a linear classifier, this estimate is
f(x, w, b) = sign(w · x - b)
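The decision rule above can be sketched in a few lines of Python. The weights w and bias b here are illustrative values chosen by hand, not parameters learned by an SVM.

```python
# Minimal sketch of the linear decision rule f(x, w, b) = sign(w . x - b).

def dot(u, v):
    """Inner product of two equal-length vectors."""
    return sum(ui * vi for ui, vi in zip(u, v))

def f(x, w, b):
    """Linear classifier: +1 on one side of the hyperplane, -1 on the other."""
    return 1 if dot(w, x) - b >= 0 else -1

# Example boundary: x1 + x2 = 3, i.e. w = (1, 1), b = 3.
w, b = (1.0, 1.0), 3.0
print(f((4.0, 2.0), w, b))   # point above the boundary -> 1
print(f((0.0, 1.0), w, b))   # point below the boundary -> -1
```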
8. Features of SVMs
- Not affected by local minima.
- Do not suffer from the curse of dimensionality.
- Have a modular design that allows the components to be designed and implemented separately.
- Various properties of the SVM solution help avoid overfitting, even in very high-dimensional feature spaces.
9. Support Vectors
Support vectors are the data points that lie closest to the decision surface.
They are the most difficult points to classify.
They have a direct bearing on the optimum location of the decision surface.
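"Closest to the decision surface" can be made concrete with the point-to-hyperplane distance |w · x - b| / ||w||. The hyperplane and points below are made-up illustrative values, not the output of SVM training.

```python
# Sketch: find which data point lies closest to a given decision surface.
import math

def distance_to_plane(x, w, b):
    """Perpendicular distance from point x to the hyperplane w . x - b = 0."""
    d = sum(wi * xi for wi, xi in zip(w, x))
    norm = math.sqrt(sum(wi * wi for wi in w))
    return abs(d - b) / norm

w, b = (1.0, 1.0), 3.0
points = [(4.0, 2.0), (0.0, 1.0), (2.0, 2.0), (5.0, 5.0)]
dists = {p: distance_to_plane(p, w, b) for p in points}
closest = min(dists, key=dists.get)
print(closest)  # (2.0, 2.0): at distance 1/sqrt(2), the "support vector" candidate
```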
10. Vector Space – Primer
Let d1 and d2 be two vectors; their sum is q = d1 + d2.
For example, if d1 = 2x + 5y and d2 = 3x + 2y, then
q = d1 + d2 = 5x + 7y
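Component-wise, the same sum looks like this, writing d1 and d2 as coordinate tuples in the (x, y) basis:

```python
# Vector addition, component by component: d1 = 2x + 5y, d2 = 3x + 2y.

def vec_add(u, v):
    """Sum of two equal-length vectors."""
    return tuple(ui + vi for ui, vi in zip(u, v))

d1 = (2, 5)   # 2x + 5y
d2 = (3, 2)   # 3x + 2y
q = vec_add(d1, d2)
print(q)  # (5, 7), i.e. 5x + 7y
```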
11. Kernel – Primer
A kernel computes the inner products between the vectors in the feature space.
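A minimal sketch of the idea: a kernel returns the inner product of two inputs in some feature space without building the feature vectors explicitly. The degree-2 polynomial kernel below is a standard textbook example, not one taken from these slides.

```python
# Kernels as inner products in a feature space.

def linear_kernel(u, v):
    """Plain inner product: the feature space is the input space itself."""
    return sum(ui * vi for ui, vi in zip(u, v))

def poly2_kernel(u, v):
    """(u . v)^2 equals the inner product of the degree-2 monomial feature maps,
    computed without ever forming those feature vectors."""
    return linear_kernel(u, v) ** 2

u, v = (1.0, 2.0), (3.0, 1.0)
print(linear_kernel(u, v))  # 5.0
print(poly2_kernel(u, v))   # 25.0
```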
18. Classifier Margin
The margin of a linear classifier is the width by which the decision boundary could be increased before hitting a data point.
(Figure: points of the two classes, denoted +1 and -1, on either side of the boundary.)
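That definition can be computed directly: grow a band around the boundary symmetrically until it touches the nearest data point, so the margin is twice the smallest point-to-hyperplane distance. The hyperplane and data here are illustrative values, not a trained classifier.

```python
# Margin of a fixed linear boundary: twice the distance to the nearest point.
import math

def distance_to_plane(x, w, b):
    """Perpendicular distance from point x to the hyperplane w . x - b = 0."""
    d = sum(wi * xi for wi, xi in zip(w, x))
    return abs(d - b) / math.sqrt(sum(wi * wi for wi in w))

w, b = (1.0, 0.0), 2.0                       # vertical boundary x1 = 2
points = [(0.0, 0.0), (1.0, 3.0), (4.0, 1.0), (3.5, 2.0)]
margin = 2 * min(distance_to_plane(p, w, b) for p in points)
print(margin)  # 2.0: the nearest point, (1, 3), is 1 unit from x1 = 2
```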
19. Maximum Margin
The maximum margin linear classifier is the linear classifier with the maximum margin.
This is the simplest kind of SVM, called a linear SVM (LSVM).
(Figure: points denoted +1 and -1 separated by the maximum-margin boundary of a linear SVM.)
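A toy illustration of "maximum margin" in one dimension: for separable 1-D data, the maximum-margin threshold is simply the midpoint between the closest pair of opposite-class points. The data below are made up for illustration.

```python
# Maximum-margin threshold for separable 1-D data.

neg = [0.5, 1.0, 1.8]   # class -1
pos = [4.2, 5.0, 6.3]   # class +1

threshold = (max(neg) + min(pos)) / 2   # midpoint -> maximum margin
margin = min(pos) - max(neg)            # width of the empty band around it
print(threshold)         # 3.0
print(round(margin, 1))  # 2.4
```

Placing the threshold anywhere other than the midpoint would shrink the distance to one of the two closest points, which is why the midpoint maximizes the margin.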
25. License Plate Recognition
∗ Pre-process the image of the number plate.
∗ Segment the image into several parts, each containing only a single character.
26. License Plate Recognition
∗ Extract the feature vector of each normalized candidate.
∗ Recognize each single character (a digit or a letter) with the set of SVMs trained in advance.
31. License Plate Recognition
∗ If there are no more unclassified samples, STOP. Otherwise, repeat the character-recognition process.
∗ Add these test samples to their corresponding database for further training.
∗ Recognize the number plate by putting all the recognized characters together.
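The pipeline in slides 25–31 can be sketched end to end. Every helper below is a toy stand-in labeled as an assumption, not an API from the slides or from any library: here a "plate image" is just a string and the "bank of SVMs" is a lookup table, so the control flow is runnable while the real vision and SVM steps stay hypothetical.

```python
# Hedged sketch of the recognition loop in slides 25-31, with toy stand-ins.
from collections import defaultdict

def preprocess(image):
    """Toy pre-processing (slide 25): normalize case, strip whitespace."""
    return image.strip().upper()

def segment_characters(image):
    """Toy segmentation (slide 25): one character per segment."""
    return list(image)

def extract_features(segment):
    """Toy feature extraction (slide 26): here, just the character code."""
    return ord(segment)

def svm_predict(model, features):
    """Stand-in for the set of trained SVMs (slide 26)."""
    return model[features]

def recognize_plate(image, model, training_db):
    """Slides 25-31: segment, classify each character, store samples, assemble."""
    chars = []
    for segment in segment_characters(preprocess(image)):
        features = extract_features(segment)
        label = svm_predict(model, features)
        training_db[label].append(features)   # slide 31: keep for further training
        chars.append(label)
    return "".join(chars)                     # slide 31: assemble the plate

model = {ord(c): c for c in "ABC0123456789"}  # toy "trained classifier"
db = defaultdict(list)
print(recognize_plate(" ab12 ", model, db))   # AB12
```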
32. Conclusion
SVMs are widely used in practice, for example for spam detection.
They support both linear and non-linear classification.