CS 446: Machine Learning. Gerald DeJong, [email_address], 3-0491, 3320 SC. Recent approval for a TA, to be named later.
Please answer these and hand in now
Approx. Course Overview / Topics
What to Learn
What to Learn?
Supervised Learning
Example and Hypothesis Spaces
X: Example Space – the set of all well-formed inputs (with a distribution)
H: Hypothesis Space – the set of all well-formed outputs
Supervised Learning: Examples
A Learning Problem
An unknown function y = f(x1, x2, x3, x4) maps four inputs to an output; x1, x2, x3, x4 and f are Boolean.
Training Set
y = f(x1, x2, x3, x4); the seven labeled examples:

Example  x1  x2  x3  x4  y
1        0   0   1   0   0
2        0   1   0   0   0
3        0   0   1   1   1
4        1   0   0   1   1
5        0   1   1   0   0
6        1   1   0   0   0
7        0   1   0   1   0
Hypothesis Space
The complete truth table over x1..x4: the seven training examples fix 7 of the 16 rows, and the remaining 9 rows (marked ?) are left unconstrained.

x1  x2  x3  x4  y
0   0   0   0   ?
0   0   0   1   ?
0   0   1   0   0
0   0   1   1   1
0   1   0   0   0
0   1   0   1   0
0   1   1   0   0
0   1   1   1   ?
1   0   0   0   ?
1   0   0   1   1
1   0   1   0   ?
1   0   1   1   ?
1   1   0   0   0
1   1   0   1   ?
1   1   1   0   ?
1   1   1   1   ?
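To make the point of the truth table concrete, here is a small brute-force sketch (my own illustration, not from the slides): it encodes every Boolean function of four variables as a 16-bit truth table and counts how many agree with the seven training examples. The answer is 2^9 = 512, one for every way of filling in the ? rows, so the unrestricted hypothesis space says nothing at all about unseen inputs.

```python
from itertools import product

# Training set from the slide: (x1, x2, x3, x4) -> y
train = {
    (0, 0, 1, 0): 0,
    (0, 1, 0, 0): 0,
    (0, 0, 1, 1): 1,
    (1, 0, 0, 1): 1,
    (0, 1, 1, 0): 0,
    (1, 1, 0, 0): 0,
    (0, 1, 0, 1): 0,
}

inputs = list(product((0, 1), repeat=4))      # all 16 possible inputs
index = {x: i for i, x in enumerate(inputs)}  # input -> bit position

def consistent(h):
    """h encodes one Boolean function: bit i of h is its output on inputs[i]."""
    return all(((h >> index[x]) & 1) == y for x, y in train.items())

n_consistent = sum(consistent(h) for h in range(2 ** 16))
print(n_consistent)   # 512 == 2 ** (16 - 7): the 9 unseen rows are unconstrained
```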
Another Hypothesis Space
Simple conjunctive rules (y = a conjunction of input variables). Every such rule is contradicted by some training example:

Rule                     Counterexample
y = x1                   1100  0
y = x2                   0100  0
y = x3                   0110  0
y = x4                   0101  1
y = x1 ∧ x2              1100  0
y = x1 ∧ x3              0011  1
y = x1 ∧ x4              0011  1
y = x2 ∧ x3              0011  1
y = x2 ∧ x4              0011  1
y = x3 ∧ x4              1001  1
y = x1 ∧ x2 ∧ x3         0011  1
y = x1 ∧ x2 ∧ x4         0011  1
y = x1 ∧ x3 ∧ x4         0011  1
y = x2 ∧ x3 ∧ x4         0011  1
y = x1 ∧ x2 ∧ x3 ∧ x4    0011  1
Third Hypothesis Space
m-of-n rules: y = 1 iff at least m of a chosen set of n variables are 1. Each table entry gives the number of a counterexample from the training set; "-" marks m-of-n combinations that do not exist for that variable set. The only entry with no counterexample is 2-of {x1, x3, x4}.

variables            1-of   2-of   3-of   4-of
{x1}                 3      -      -      -
{x2}                 2      -      -      -
{x3}                 1      -      -      -
{x4}                 7      -      -      -
{x1, x2}             2      3      -      -
{x1, x3}             1      3      -      -
{x1, x4}             6      3      -      -
{x2, x3}             2      3      -      -
{x2, x4}             2      3      -      -
{x3, x4}             4      4      -      -
{x1, x2, x3}         1      3      3      -
{x1, x2, x4}         2      3      3      -
{x1, x3, x4}         1      none   3      -
{x2, x3, x4}         1      5      3      -
{x1, x2, x3, x4}     1      5      3      3
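A quick way to check this table is to search the hypothesis space directly. The sketch below (my own illustration, not from the slides) enumerates every m-of-n rule over subsets of {x1, x2, x3, x4} and prints those consistent with all seven training examples; on this data it finds exactly one, "at least 2 of {x1, x3, x4}".

```python
from itertools import combinations

# Training set from the slide, keyed by example number
train = {
    1: ((0, 0, 1, 0), 0),
    2: ((0, 1, 0, 0), 0),
    3: ((0, 0, 1, 1), 1),
    4: ((1, 0, 0, 1), 1),
    5: ((0, 1, 1, 0), 0),
    6: ((1, 1, 0, 0), 0),
    7: ((0, 1, 0, 1), 0),
}

def m_of_n(m, vars_, x):
    """Predict 1 iff at least m of the chosen variables (0-based indices) are 1 in x."""
    return int(sum(x[i] for i in vars_) >= m)

for n in range(1, 5):
    for vars_ in combinations(range(4), n):
        for m in range(1, n + 1):
            mistakes = [ex for ex, (x, y) in train.items() if m_of_n(m, vars_, x) != y]
            if not mistakes:
                names = ", ".join(f"x{i + 1}" for i in vars_)
                print(f"{m}-of {{{names}}}: consistent with all training examples")
```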
Views of Learning
General Strategy for Machine Learning
Terminology
Key Issues in Machine Learning
Key Issues in Machine Learning (continued)
Example: Generalization vs. Overfitting
Self-organize into Groups of 4 or 5
Linear Discriminators
Learning Protocol? Supervised? Unsupervised?
What’s Good?
Exclusive-OR (XOR)
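XOR is the standard example of a Boolean function that no single linear threshold unit can represent. The short check below (my own sketch, not from the slides) searches a grid of weights and thresholds and finds none that classifies all four XOR cases correctly; the usual algebraic argument is in the comments.

```python
import itertools

# XOR truth table: (x1, x2) -> y
xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def ltu(w1, w2, theta, x):
    """Linear threshold unit: predict 1 iff w1*x1 + w2*x2 >= theta."""
    return int(w1 * x[0] + w2 * x[1] >= theta)

grid = [i / 4 for i in range(-20, 21)]          # weights/thresholds in [-5, 5], step 0.25
found = any(
    all(ltu(w1, w2, th, x) == y for x, y in xor.items())
    for w1, w2, th in itertools.product(grid, repeat=3)
)
print(found)   # False: no LTU on this grid realizes XOR
# Algebraically: (0,1) and (1,0) require w2 >= theta and w1 >= theta, while (0,0)
# requires theta > 0; then w1 + w2 >= 2*theta > theta, contradicting (1,1) -> 0.
```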
Sometimes Functions Can be Made Linear
y3 ∨ y4 ∨ y7: the new discriminator is functionally simpler.
Feature Space
Blown Up Feature Space
(Figure: data replotted with axes x and x^2.) Key issue: what features to use. Computationally, this can be done implicitly (kernels).
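As an illustration of the idea (my own sketch, with an assumed 1-D example rather than the slide's figure): points labeled positive exactly when |x| > 1 cannot be separated by any threshold on x alone, but after blowing x up to the feature pair (x, x^2) the single linear rule x^2 >= 1 separates them.

```python
# 1-D data labeled positive iff |x| > 1
data = [(-2.0, 1), (-1.5, 1), (-0.5, 0), (0.0, 0), (0.7, 0), (1.2, 1), (2.5, 1)]

# No single threshold on x works: check every threshold placed at a data point,
# in both orientations. Positives lie on both sides of the negatives.
separable_1d = any(
    all((x >= t) == bool(y) for x, y in data) or
    all((x <= t) == bool(y) for x, y in data)
    for t in [x for x, _ in data]
)
print(separable_1d)   # False

# Blow up the feature space: phi(x) = (x, x**2). Now the linear rule x**2 >= 1 works.
blown_up = [((x, x * x), y) for x, y in data]
w, theta = (0.0, 1.0), 1.0                     # weight vector and threshold
print(all((w[0] * f[0] + w[1] * f[1] >= theta) == bool(y) for f, y in blown_up))  # True
```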
A General Framework for Learning
Simply: # of mistakes. […] is an indicator function.
A General Framework for Learning (II)
Learning as an Optimization Problem
A continuous convex loss function also allows a conceptually simple optimization algorithm. (Figure: loss plotted as a function of f(x) - y.)
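To make the contrast concrete, here is a small sketch (my own, with made-up numbers) comparing the 0-1 loss, which simply counts mistakes, with the convex, differentiable squared loss written as a function of f(x) - y:

```python
def zero_one_loss(pred, y):
    """0-1 loss: 1 for a mistake, 0 otherwise (counts mistakes, but is not convex or differentiable)."""
    return int(pred != y)

def squared_loss(score, y):
    """Squared loss on the real-valued score f(x): convex and differentiable in f(x) - y."""
    return (score - y) ** 2

# Labels in {0, 1}, a real-valued predictor score f(x), and its thresholded prediction
examples = [(0.9, 1), (0.2, 0), (0.4, 1), (1.3, 1)]   # (f(x), y) pairs
preds = [int(s >= 0.5) for s, _ in examples]

print(sum(zero_one_loss(p, y) for p, (_, y) in zip(preds, examples)))   # 1 mistake
print(sum(squared_loss(s, y) for s, y in examples))                     # 0.01 + 0.04 + 0.36 + 0.09 = 0.5
```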
How to Learn?
Learning Linear Separators (LTU)
Expressivity
Probabilistic classifiers as well.
Canonical Representation
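The usual canonical construction (assumed here to be what this slide covers) folds the threshold into the weight vector by appending a constant feature: sgn(w · x - θ) becomes sgn(w' · x') with x' = (x, 1) and w' = (w, -θ). A minimal sketch:

```python
def ltu(w, theta, x):
    """Original form: predict 1 iff w . x >= theta."""
    return int(sum(wi * xi for wi, xi in zip(w, x)) >= theta)

def canonical(w, theta):
    """Fold the threshold in: x -> (x, 1), w -> (w, -theta); the threshold becomes 0."""
    return list(w) + [-theta]

def ltu0(w_prime, x):
    """Canonical form: predict 1 iff w' . (x, 1) >= 0."""
    x_prime = list(x) + [1]
    return int(sum(wi * xi for wi, xi in zip(w_prime, x_prime)) >= 0)

w, theta = [2.0, -1.0, 0.5], 1.5
for x in [(1, 0, 1), (0, 1, 0), (1, 1, 1)]:
    assert ltu(w, theta, x) == ltu0(canonical(w, theta), x)
print("both representations agree")
```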
LMS: An online, local search algorithm
Notation: i (subscript) indexes vector components; j (superscript) indexes time; d indexes examples.
Assumption: x ∈ R^n; u ∈ R^n is the target weight vector; the target (label) is t_d = u · x_d. Noise has been added, so possibly no weight vector is consistent with the data.
Gradient Descent
(Figure: the error E(w) as a function of w, with a sequence of weight iterates w1, w2, w3, w4.)
Gradient Descent: LMS
Incremental Gradient Descent: LMS
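For reference, here is a minimal sketch of the two LMS variants these slides cover, under standard assumptions (squared error E(w) = 1/2 Σ_d (t_d - w · x_d)^2, learning rate η; the batch gradient below is averaged over examples). Batch gradient descent updates w once per pass over the data; the incremental (online) version, the classic Widrow-Hoff rule, updates after every example.

```python
import numpy as np

def lms_batch(X, t, eta=0.1, epochs=200):
    """Batch gradient descent on E(w) = 0.5 * mean_d (t_d - w . x_d)^2."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = t - X @ w                    # one residual per example
        w += eta * X.T @ errors / len(t)      # step along the negative gradient
    return w

def lms_incremental(X, t, eta=0.05, epochs=20):
    """Incremental (online) LMS / Widrow-Hoff rule: update after every example."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_d, t_d in zip(X, t):
            w += eta * (t_d - w @ x_d) * x_d
    return w

# Noisy linear target t_d = u . x_d + noise, matching the slides' assumption
# that u is the (unknown) target weight vector.
rng = np.random.default_rng(0)
u = np.array([1.5, -2.0, 0.5])
X = rng.normal(size=(200, 3))
t = X @ u + 0.1 * rng.normal(size=200)

print(lms_batch(X, t))          # both estimates should be close to u = [1.5, -2.0, 0.5]
print(lms_incremental(X, t))
```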
Learning Rates and Convergence
Computational Issues
Assume the data is linearly separable.
Sample complexity: suppose we want to ensure that our LTU has an error rate (on new examples) of less than ε, with high probability (at least 1 - δ). How large must m (the number of examples) be in order to achieve this? It can be shown that for n-dimensional problems, m = O((1/ε) [ln(1/δ) + (n+1) ln(1/ε)]).
Computational complexity: what can be said? It can be shown that there exists a polynomial-time algorithm for finding a consistent LTU (via linear programming). (Online algorithms have inverse quadratic dependence on the margin.)
Other methods for LTUs
Summary of LMS algorithms for LTUs
Local search: begins with an initial weight vector and modifies it iteratively to minimize an error function. The error function is only loosely related to the goal of minimizing the number of classification errors.
Memory: the classifier is constructed from the training examples; the examples can then be discarded.
Online or batch: both online and batch variants of the algorithms can be used.
Fisher Linear Discriminant
Fisher Linear Discriminant (all vectors are column vectors)
Finding a Good Direction (want a large difference)
Finding a Good Direction (2)
J as an explicit function of w (1)
J as an explicit function of w (2)
J as an explicit function of w (3)
Fisher Linear Discriminant - Summary
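Since the slide bodies did not survive the export, here is a minimal sketch of the standard Fisher construction these slides cover, under the usual definitions (my own illustration): project the data onto a direction w, score directions by J(w) = (m1 - m2)^2 / (s1^2 + s2^2) computed on the projected data, and take the closed-form maximizer w ∝ S_W^{-1}(m1 - m2), where m1, m2 are the class means and S_W is the within-class scatter matrix.

```python
import numpy as np

def fisher_direction(X1, X2):
    """Fisher linear discriminant direction w proportional to S_W^{-1} (m1 - m2)."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter: sum over both classes of (x - m)(x - m)^T
    S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    w = np.linalg.solve(S_W, m1 - m2)
    return w / np.linalg.norm(w)

def J(w, X1, X2):
    """Fisher criterion on the projected data: (m1 - m2)^2 / (s1^2 + s2^2)."""
    p1, p2 = X1 @ w, X2 @ w
    between = (p1.mean() - p2.mean()) ** 2
    within = ((p1 - p1.mean()) ** 2).sum() + ((p2 - p2.mean()) ** 2).sum()
    return between / within

# Two made-up Gaussian classes in 2-D
rng = np.random.default_rng(1)
X1 = rng.normal([0, 0], [1.0, 0.3], size=(100, 2))
X2 = rng.normal([2, 1], [1.0, 0.3], size=(100, 2))

w_star = fisher_direction(X1, X2)
w_rand = np.array([1.0, 0.0])
print(J(w_star, X1, X2) >= J(w_rand, X1, X2))   # True: the Fisher direction scores at least as high
```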
Introduction - Summary
Speaker notes
Notes 1-3, 16-22: As we said, this is the game we are playing; in NLP it has always been clear that the raw information in a sentence is not sufficient, as is, to represent a good predictor. Better functions of the input were generated, and learning was done in those terms.
Notes 4-15, 24-32: Badges game. Don't give me the answer; start thinking about how to write a program that will figure out whether my name has + or – next to it.
Note 23: Good treatment in Bishop, Ch. 3. Classic Wiener filtering solution; the text omits the 0.5 factor. In any case we use the gradient and η (text) or R (these notes) to modulate the step size.