04/26/10 Intelligent Systems and Soft Computing. Lecture 7: Artificial neural networks: Supervised learning
Introduction, or how the brain works. Machine learning involves adaptive mechanisms that enable computers to learn from experience, learn by example, and learn by analogy. Learning capabilities can improve the performance of an intelligent system over time. The most popular approaches to machine learning are artificial neural networks and genetic algorithms. This lecture is dedicated to neural networks.
Biological neural network
Architecture of a typical artificial neural network (figure: input signals flow through the network layers to the output signals)
Analogy between biological and artificial neural networks
The neuron as a simple computing element. Diagram of a neuron.
Activation functions of a neuron
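The slide's figure is not reproduced here, but the four activation functions it typically shows (step, sign, sigmoid, and linear) can be sketched as plain Python functions; the function names and the zero threshold are illustrative choices:

```python
import math

def step(x):
    # Step function: 1 if the input reaches the threshold of 0, else 0.
    return 1 if x >= 0 else 0

def sign(x):
    # Sign function: +1 for non-negative input, -1 otherwise.
    return 1 if x >= 0 else -1

def sigmoid(x):
    # Sigmoid function: squashes any input into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def linear(x):
    # Linear function: passes the input through unchanged.
    return x
```

The step and sign functions are hard limiters used in classification; the sigmoid's smoothness is what makes gradient-based training possible later in this lecture.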
Can a single neuron learn a task? In 1958, Frank Rosenblatt introduced a training algorithm that provided the first procedure for training a simple ANN: a perceptron.
Single-layer two-input perceptron
The Perceptron. The perceptron is the simplest form of a neural network: a single neuron with adjustable synaptic weights and a hard limiter. The weighted sum of the inputs is applied to the hard limiter, and the perceptron classifies inputs into one of two classes.
Linear separability in the perceptron
How does the perceptron learn its classification tasks? By making small adjustments to the weights to reduce the difference between the actual and desired outputs of the perceptron. The initial weights are randomly assigned, usually in the range [−0.5, 0.5], and then updated to obtain outputs consistent with the training examples.
If at iteration p the actual output is Y(p) and the desired output is Yd(p), then the error is given by: e(p) = Yd(p) − Y(p), where p = 1, 2, 3, . . . Iteration p here refers to the pth training example presented to the perceptron.
The perceptron learning rule: w_i(p + 1) = w_i(p) + α · x_i(p) · e(p), where p = 1, 2, 3, . . . and α is the learning rate, a positive constant less than unity. The perceptron learning rule was first proposed by Rosenblatt in 1960. Using this rule we can derive the perceptron training algorithm for classification tasks.
Perceptron's training algorithm. Step 1: Initialisation. Set initial weights w1, w2, …, wn and threshold θ to random numbers in the range [−0.5, 0.5]. If the error, e(p), is positive, we need to increase perceptron output Y(p); but if it is negative, we need to decrease Y(p).
Perceptron's training algorithm (continued). Step 2: Activation. Activate the perceptron by applying inputs x1(p), x2(p), …, xn(p) and desired output Yd(p). Calculate the actual output at iteration p = 1: Y(p) = step[ Σ_{i=1..n} x_i(p) · w_i(p) − θ ], where n is the number of the perceptron inputs, and step is the step activation function.
Perceptron's training algorithm (continued). Step 3: Weight training. Update the weights of the perceptron: w_i(p + 1) = w_i(p) + Δw_i(p), where Δw_i(p) is the weight correction at iteration p. The weight correction is computed by the delta rule: Δw_i(p) = α · x_i(p) · e(p). Step 4: Iteration. Increase iteration p by one, go back to Step 2 and repeat the process until convergence.
Example of perceptron learning: the logical operation AND
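Steps 1-4 above can be sketched in Python for the AND example. The learning rate of 0.1 and the random seed are illustrative choices, and the threshold is updated as an extra weight with fixed input −1 (a common variant; the slide's worked example keeps θ fixed):

```python
import random

random.seed(1)
alpha = 0.1                                    # learning rate (illustrative)
theta = random.uniform(-0.5, 0.5)              # Step 1: random threshold
w = [random.uniform(-0.5, 0.5) for _ in range(2)]  # Step 1: random weights

# Truth table of the logical operation AND
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

# AND is linearly separable, so the perceptron converges quickly
for epoch in range(100):
    errors = 0
    for x, y_d in data:
        # Step 2: activation with the step function
        y = 1 if sum(xi * wi for xi, wi in zip(x, w)) - theta >= 0 else 0
        e = y_d - y
        errors += abs(e)
        # Step 3: delta rule, treating theta as a weight with input -1
        for i in range(2):
            w[i] += alpha * x[i] * e
        theta -= alpha * e
    if errors == 0:                            # Step 4: repeat until convergence
        break
```

After convergence the perceptron reproduces the AND truth table exactly; running the same loop on Exclusive-OR data would never converge, as the next slides explain.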
Two-dimensional plots of basic logical operations. A perceptron can learn the operations AND and OR, but not Exclusive-OR. (Plots: (a) AND (x1 ∧ x2); (b) OR (x1 ∨ x2); (c) Exclusive-OR (x1 ⊕ x2).)
Multilayer neural networks. A multilayer perceptron is a feedforward neural network with one or more hidden layers between the input layer and the output layer.
Multilayer perceptron with two hidden layers (figure: input signals propagate through two hidden layers to the output signals)
What does the middle layer hide? A hidden layer "hides" its desired output: neurons in the hidden layer cannot be observed through the input/output behaviour of the network.
Back-propagation neural network. Learning in a back-propagation network has two phases: first, the input signals are propagated forward through the network to compute the output; then, the error is propagated backward from the output layer to adjust the weights.
Three-layer back-propagation neural network
The back-propagation training algorithm. Step 1: Initialisation. Set all the weights and threshold levels of the network to random numbers uniformly distributed inside a small range: ( −2.4 / F_i , +2.4 / F_i ), where F_i is the total number of inputs of neuron i in the network. The weight initialisation is done on a neuron-by-neuron basis.
Step 2: Activation. Activate the back-propagation neural network by applying inputs x1(p), x2(p), …, xn(p) and desired outputs yd,1(p), yd,2(p), …, yd,n(p). (a) Calculate the actual outputs of the neurons in the hidden layer: y_j(p) = sigmoid[ Σ_{i=1..n} x_i(p) · w_{ij}(p) − θ_j ], where n is the number of inputs of neuron j in the hidden layer, and sigmoid is the sigmoid activation function.
Step 2: Activation (continued). (b) Calculate the actual outputs of the neurons in the output layer: y_k(p) = sigmoid[ Σ_{j=1..m} y_j(p) · w_{jk}(p) − θ_k ], where m is the number of inputs of neuron k in the output layer.
Step 3: Weight training. Update the weights in the back-propagation network propagating backward the errors associated with output neurons. (a) Calculate the error gradient for the neurons in the output layer: δ_k(p) = y_k(p) · [1 − y_k(p)] · e_k(p), where e_k(p) = y_{d,k}(p) − y_k(p). Calculate the weight corrections: Δw_{jk}(p) = α · y_j(p) · δ_k(p). Update the weights at the output neurons: w_{jk}(p + 1) = w_{jk}(p) + Δw_{jk}(p).
Step 3: Weight training (continued). (b) Calculate the error gradient for the neurons in the hidden layer: δ_j(p) = y_j(p) · [1 − y_j(p)] · Σ_{k=1..l} δ_k(p) · w_{jk}(p), where l is the number of neurons in the output layer. Calculate the weight corrections: Δw_{ij}(p) = α · x_i(p) · δ_j(p). Update the weights at the hidden neurons: w_{ij}(p + 1) = w_{ij}(p) + Δw_{ij}(p).
Step 4: Iteration. Increase iteration p by one, go back to Step 2 and repeat the process until the selected error criterion is satisfied. As an example, we may consider the three-layer back-propagation network. Suppose that the network is required to perform the logical operation Exclusive-OR. Recall that a single-layer perceptron could not do this operation. Now we will apply the three-layer net.
Three-layer network for solving the Exclusive-OR operation
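Steps 1-4 of the back-propagation algorithm can be sketched for the Exclusive-OR network above (two inputs, two hidden neurons, one output neuron). The learning rate, random seed, and epoch count are illustrative, and the final weights reached depend on the random initialisation:

```python
import math
import random

random.seed(0)
sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))

alpha, n, h = 0.5, 2, 2      # learning rate (illustrative), inputs, hidden neurons

# Step 1: initialise weights and thresholds to small random numbers
w_ih = [[random.uniform(-0.5, 0.5) for _ in range(h)] for _ in range(n)]
w_ho = [random.uniform(-0.5, 0.5) for _ in range(h)]
th_h = [random.uniform(-0.5, 0.5) for _ in range(h)]
th_o = random.uniform(-0.5, 0.5)

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    # Step 2: propagate inputs through the hidden layer to the output neuron
    y_h = [sigmoid(sum(x[i] * w_ih[i][j] for i in range(n)) - th_h[j])
           for j in range(h)]
    return y_h, sigmoid(sum(y_h[j] * w_ho[j] for j in range(h)) - th_o)

sse_start = sum((y_d - forward(x)[1]) ** 2 for x, y_d in data)

for epoch in range(10000):
    for x, y_d in data:
        y_h, y_o = forward(x)
        d_o = y_o * (1 - y_o) * (y_d - y_o)            # Step 3a: output gradient
        d_h = [y_h[j] * (1 - y_h[j]) * d_o * w_ho[j]   # Step 3b: hidden gradients
               for j in range(h)]                      # (uses pre-update w_ho)
        for j in range(h):
            w_ho[j] += alpha * y_h[j] * d_o            # delta-rule corrections;
            th_h[j] += alpha * (-1) * d_h[j]           # thresholds learn as weights
            for i in range(n):                         # with fixed input -1
                w_ih[i][j] += alpha * x[i] * d_h[j]
        th_o += alpha * (-1) * d_o

sse_end = sum((y_d - forward(x)[1]) ** 2 for x, y_d in data)
```

After training, the sum-squared error over the four XOR patterns is far below its initial value, mirroring the learning curve shown a few slides below.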
Learning curve for operation Exclusive-OR (plot of sum-squared error versus epoch)
Final results of three-layer network learning
Network represented by McCulloch-Pitts model for solving the Exclusive-OR operation
Decision boundaries
Accelerated learning in multilayer neural networks. A multilayer network learns faster when the sigmoidal activation function is represented by a hyperbolic tangent: Y_tanh = 2a / (1 + e^(−bX)) − a, where a and b are constants. Suitable values for a and b are: a = 1.716 and b = 0.667.
We can also accelerate training by including a momentum term in the delta rule: Δw_{jk}(p) = β · Δw_{jk}(p − 1) + α · y_j(p) · δ_k(p), where β is a positive number (0 ≤ β < 1) called the momentum constant. Typically, the momentum constant is set to 0.95. This equation is called the generalised delta rule.
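The generalised delta rule can be expressed as a small helper function; the function and parameter names are mine:

```python
def delta_with_momentum(prev_delta, alpha, y_j, gradient, beta=0.95):
    """Generalised delta rule: the new weight correction is the momentum
    constant beta times the previous correction, plus the usual delta-rule
    term alpha * (presynaptic output) * (error gradient)."""
    return beta * prev_delta + alpha * y_j * gradient
```

Setting β = 0 recovers the plain delta rule Δw_jk(p) = α · y_j(p) · δ_k(p); values near 1 let previous corrections carry the update through flat regions of the error surface.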
Learning with momentum for operation Exclusive-OR (learning-curve plot)
Learning with adaptive learning rate. To accelerate convergence and yet avoid the danger of instability, we can apply two heuristics: Heuristic 1: if the change of the sum of squared errors has the same algebraic sign for several consecutive epochs, then the learning rate parameter, α, should be increased. Heuristic 2: if the algebraic sign of the change of the sum of squared errors alternates for several consecutive epochs, then the learning rate parameter, α, should be decreased.
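The two heuristics can be sketched as a simple learning-rate controller. The window of three epochs and the scaling factors 1.05 and 0.7 are my own illustrative assumptions, not values from the lecture:

```python
def adapt_learning_rate(lr, error_changes, grow=1.05, shrink=0.7, run=3):
    """Apply the two heuristics to the learning rate `lr`, given the recent
    epoch-to-epoch changes of the sum of squared errors."""
    recent = error_changes[-run:]
    if len(recent) < run:
        return lr                      # not enough history yet
    signs = [1 if c > 0 else -1 for c in recent]
    if all(s == signs[0] for s in signs):
        return lr * grow               # Heuristic 1: steady progress, speed up
    if all(signs[i] != signs[i + 1] for i in range(run - 1)):
        return lr * shrink             # Heuristic 2: oscillation, slow down
    return lr
```

In practice such a controller is run once per epoch, feeding it the change in sum-squared error since the previous epoch.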
Learning with adaptive learning rate (plots of sum-squared error and learning rate versus epoch)
Learning with momentum and adaptive learning rate (plots of sum-squared error and learning rate versus epoch)
The Hopfield Network. A Hopfield network is a single-layer recurrent network that operates as an autoassociative memory: the outputs of the neurons are fed back as inputs, and the network iterates until it settles into a stable state corresponding to a stored memory.
Single-layer n-neuron Hopfield network (figure: each neuron's output is fed back to the inputs of all the others)
In the Hopfield network, synaptic weights are represented in matrix form as: W = Σ_{m=1..M} Y_m · Y_m^T − M · I, where M is the number of states to be memorised by the network, Y_m is the n-dimensional binary vector, I is the n × n identity matrix, and superscript T denotes matrix transposition.
Possible states for the three-neuron Hopfield network
Suppose, for instance, that our network is required to memorise two opposite states, (1, 1, 1) and (−1, −1, −1). Thus Y1 = [1 1 1]^T and Y2 = [−1 −1 −1]^T, where Y1 and Y2 are the three-dimensional vectors.
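Under the assumption that the two memorised states are (1, 1, 1) and (−1, −1, −1), the weight-matrix formula and a recall step can be sketched in plain Python; breaking ties at zero toward +1 in the sign function is a simplification (a Hopfield neuron normally keeps its previous state there):

```python
# Two fundamental memories for a three-neuron Hopfield network
Y1 = [1, 1, 1]
Y2 = [-1, -1, -1]
M, n = 2, 3

# W = sum over stored states of the outer products Y_m Y_m^T, minus M * I
W = [[Y1[i] * Y1[j] + Y2[i] * Y2[j] - (M if i == j else 0)
      for j in range(n)] for i in range(n)]

def recall(state):
    # One synchronous update with the sign activation function
    # (ties at zero broken toward +1 for simplicity).
    return [1 if sum(W[i][j] * state[j] for j in range(n)) >= 0 else -1
            for i in range(n)]
```

Both stored states are stable (recalling them returns them unchanged), and a probe such as (1, 1, −1) settles to the nearer memory (1, 1, 1).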
Storage capacity of the Hopfield network. Storage capacity is the largest number of fundamental memories that can be stored and retrieved correctly. The maximum number of fundamental memories M_max that can be stored in an n-neuron recurrent network is limited to M_max = 0.15 n.
Bidirectional associative memory (BAM). The Hopfield network is autoassociative: it can retrieve a corrupted or incomplete memory, but it cannot associate one memory with a different one. The bidirectional associative memory (BAM), first proposed by Bart Kosko, is a heteroassociative network: it associates patterns from one set, set A, with patterns from another set, set B, and vice versa.
BAM operation
The basic idea behind the BAM is to store pattern pairs so that when an n-dimensional vector X from set A is presented as input, the BAM recalls the m-dimensional vector Y from set B; but when Y is presented as input, the BAM recalls X.
To develop the BAM, we create a correlation matrix for each pattern pair we want to store, and sum them to obtain the weight matrix: W = Σ_{m=1..M} X_m · Y_m^T, where M is the number of pattern pairs to be stored in the BAM.
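The correlation-matrix construction and recall in both directions can be sketched for a single illustrative pattern pair (the vectors below are my own example, not from the slide):

```python
# One illustrative pattern pair stored in a BAM
X1 = [1, -1, 1, -1]   # n-dimensional bipolar pattern from set A
Y1 = [1, 1, -1]       # m-dimensional bipolar pattern from set B
n, m = len(X1), len(Y1)

# Weight matrix: sum of correlation matrices X_m Y_m^T (a single pair here)
W = [[X1[i] * Y1[j] for j in range(m)] for i in range(n)]

def sign(x):
    # Bipolar sign activation; ties at zero broken toward +1 for simplicity.
    return 1 if x >= 0 else -1

def recall_forward(x):
    # Present X from set A: the BAM recalls the associated Y from set B.
    return [sign(sum(x[i] * W[i][j] for i in range(n))) for j in range(m)]

def recall_backward(y):
    # Present Y from set B: the BAM recalls the associated X from set A.
    return [sign(sum(W[i][j] * y[j] for j in range(m))) for i in range(n)]
```

Presenting X1 recalls Y1 and presenting Y1 recalls X1, which is exactly the bidirectional behaviour described above.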
Stability and storage capacity of the BAM. The BAM is unconditionally stable: any set of associations can be learned without risk of instability. The maximum number of associations to be stored in the BAM should not exceed the number of neurons in the smaller layer. A more serious problem with the BAM is incorrect convergence: the BAM may not always produce the closest association.

Contenu connexe

Tendances

Deep Learning Tutorial
Deep Learning TutorialDeep Learning Tutorial
Deep Learning TutorialAmr Rashed
 
Machine Learning - Convolutional Neural Network
Machine Learning - Convolutional Neural NetworkMachine Learning - Convolutional Neural Network
Machine Learning - Convolutional Neural NetworkRichard Kuo
 
Convolutional Neural Networks
Convolutional Neural NetworksConvolutional Neural Networks
Convolutional Neural NetworksAshray Bhandare
 
Decision trees in Machine Learning
Decision trees in Machine Learning Decision trees in Machine Learning
Decision trees in Machine Learning Mohammad Junaid Khan
 
Image classification using CNN
Image classification using CNNImage classification using CNN
Image classification using CNNNoura Hussein
 
Feed forward ,back propagation,gradient descent
Feed forward ,back propagation,gradient descentFeed forward ,back propagation,gradient descent
Feed forward ,back propagation,gradient descentMuhammad Rasel
 
Image sampling and quantization
Image sampling and quantizationImage sampling and quantization
Image sampling and quantizationBCET, Balasore
 
Homomorphic filtering
Homomorphic filteringHomomorphic filtering
Homomorphic filteringGautam Saxena
 
Artificial Neural Network
Artificial Neural NetworkArtificial Neural Network
Artificial Neural NetworkPrakash K
 
Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...
Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...
Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...Simplilearn
 
backpropagation in neural networks
backpropagation in neural networksbackpropagation in neural networks
backpropagation in neural networksAkash Goel
 
Feedforward neural network
Feedforward neural networkFeedforward neural network
Feedforward neural networkSopheaktra YONG
 
Introduction to Deep Learning
Introduction to Deep LearningIntroduction to Deep Learning
Introduction to Deep LearningOswald Campesato
 
Neural Networks: Self-Organizing Maps (SOM)
Neural Networks:  Self-Organizing Maps (SOM)Neural Networks:  Self-Organizing Maps (SOM)
Neural Networks: Self-Organizing Maps (SOM)Mostafa G. M. Mostafa
 
Neural Networks: Rosenblatt's Perceptron
Neural Networks: Rosenblatt's PerceptronNeural Networks: Rosenblatt's Perceptron
Neural Networks: Rosenblatt's PerceptronMostafa G. M. Mostafa
 
Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...
Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...
Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...Simplilearn
 
Activation function
Activation functionActivation function
Activation functionAstha Jain
 

Tendances (20)

HOPFIELD NETWORK
HOPFIELD NETWORKHOPFIELD NETWORK
HOPFIELD NETWORK
 
Deep Learning Tutorial
Deep Learning TutorialDeep Learning Tutorial
Deep Learning Tutorial
 
Machine Learning - Convolutional Neural Network
Machine Learning - Convolutional Neural NetworkMachine Learning - Convolutional Neural Network
Machine Learning - Convolutional Neural Network
 
Convolutional Neural Networks
Convolutional Neural NetworksConvolutional Neural Networks
Convolutional Neural Networks
 
Decision trees in Machine Learning
Decision trees in Machine Learning Decision trees in Machine Learning
Decision trees in Machine Learning
 
Image classification using CNN
Image classification using CNNImage classification using CNN
Image classification using CNN
 
Feed forward ,back propagation,gradient descent
Feed forward ,back propagation,gradient descentFeed forward ,back propagation,gradient descent
Feed forward ,back propagation,gradient descent
 
Image sampling and quantization
Image sampling and quantizationImage sampling and quantization
Image sampling and quantization
 
Homomorphic filtering
Homomorphic filteringHomomorphic filtering
Homomorphic filtering
 
K Nearest Neighbors
K Nearest NeighborsK Nearest Neighbors
K Nearest Neighbors
 
Artificial Neural Network
Artificial Neural NetworkArtificial Neural Network
Artificial Neural Network
 
Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...
Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...
Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...
 
backpropagation in neural networks
backpropagation in neural networksbackpropagation in neural networks
backpropagation in neural networks
 
Feedforward neural network
Feedforward neural networkFeedforward neural network
Feedforward neural network
 
Introduction to Deep Learning
Introduction to Deep LearningIntroduction to Deep Learning
Introduction to Deep Learning
 
Neural Networks: Self-Organizing Maps (SOM)
Neural Networks:  Self-Organizing Maps (SOM)Neural Networks:  Self-Organizing Maps (SOM)
Neural Networks: Self-Organizing Maps (SOM)
 
Neural Networks: Rosenblatt's Perceptron
Neural Networks: Rosenblatt's PerceptronNeural Networks: Rosenblatt's Perceptron
Neural Networks: Rosenblatt's Perceptron
 
Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...
Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...
Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...
 
02 Fundamental Concepts of ANN
02 Fundamental Concepts of ANN02 Fundamental Concepts of ANN
02 Fundamental Concepts of ANN
 
Activation function
Activation functionActivation function
Activation function
 

Similaire à Supervised Learning

lecture07.ppt
lecture07.pptlecture07.ppt
lecture07.pptbutest
 
Neural Networks Ver1
Neural  Networks  Ver1Neural  Networks  Ver1
Neural Networks Ver1ncct
 
SOFT COMPUTERING TECHNICS -Unit 1
SOFT COMPUTERING TECHNICS -Unit 1SOFT COMPUTERING TECHNICS -Unit 1
SOFT COMPUTERING TECHNICS -Unit 1sravanthi computers
 
Soft Computing-173101
Soft Computing-173101Soft Computing-173101
Soft Computing-173101AMIT KUMAR
 
Neural network final NWU 4.3 Graphics Course
Neural network final NWU 4.3 Graphics CourseNeural network final NWU 4.3 Graphics Course
Neural network final NWU 4.3 Graphics CourseMohaiminur Rahman
 
Neuralnetwork 101222074552-phpapp02
Neuralnetwork 101222074552-phpapp02Neuralnetwork 101222074552-phpapp02
Neuralnetwork 101222074552-phpapp02Deepu Gupta
 
BACKPROPOGATION ALGO.pdfLECTURE NOTES WITH SOLVED EXAMPLE AND FEED FORWARD NE...
BACKPROPOGATION ALGO.pdfLECTURE NOTES WITH SOLVED EXAMPLE AND FEED FORWARD NE...BACKPROPOGATION ALGO.pdfLECTURE NOTES WITH SOLVED EXAMPLE AND FEED FORWARD NE...
BACKPROPOGATION ALGO.pdfLECTURE NOTES WITH SOLVED EXAMPLE AND FEED FORWARD NE...DurgadeviParamasivam
 
Neural networks
Neural networksNeural networks
Neural networksBasil John
 
Economic Load Dispatch (ELD), Economic Emission Dispatch (EED), Combined Econ...
Economic Load Dispatch (ELD), Economic Emission Dispatch (EED), Combined Econ...Economic Load Dispatch (ELD), Economic Emission Dispatch (EED), Combined Econ...
Economic Load Dispatch (ELD), Economic Emission Dispatch (EED), Combined Econ...cscpconf
 
Fuzzy Logic Final Report
Fuzzy Logic Final ReportFuzzy Logic Final Report
Fuzzy Logic Final ReportShikhar Agarwal
 
Artificial Neural Networks ppt.pptx for final sem cse
Artificial Neural Networks  ppt.pptx for final sem cseArtificial Neural Networks  ppt.pptx for final sem cse
Artificial Neural Networks ppt.pptx for final sem cseNaveenBhajantri1
 
Dr. Syed Muhammad Ali Tirmizi - Special topics in finance lec 13
Dr. Syed Muhammad Ali Tirmizi - Special topics in finance   lec 13Dr. Syed Muhammad Ali Tirmizi - Special topics in finance   lec 13
Dr. Syed Muhammad Ali Tirmizi - Special topics in finance lec 13Dr. Muhammad Ali Tirmizi., Ph.D.
 
Artificial neural networks
Artificial neural networksArtificial neural networks
Artificial neural networksMohamed Arif
 
Artificial Neural Network in Medical Diagnosis
Artificial Neural Network in Medical DiagnosisArtificial Neural Network in Medical Diagnosis
Artificial Neural Network in Medical DiagnosisAdityendra Kumar Singh
 

Similaire à Supervised Learning (20)

lecture07.ppt
lecture07.pptlecture07.ppt
lecture07.ppt
 
Neural Networks Ver1
Neural  Networks  Ver1Neural  Networks  Ver1
Neural Networks Ver1
 
SOFT COMPUTERING TECHNICS -Unit 1
SOFT COMPUTERING TECHNICS -Unit 1SOFT COMPUTERING TECHNICS -Unit 1
SOFT COMPUTERING TECHNICS -Unit 1
 
19_Learning.ppt
19_Learning.ppt19_Learning.ppt
19_Learning.ppt
 
10-Perceptron.pdf
10-Perceptron.pdf10-Perceptron.pdf
10-Perceptron.pdf
 
Soft Computing-173101
Soft Computing-173101Soft Computing-173101
Soft Computing-173101
 
ANN.ppt
ANN.pptANN.ppt
ANN.ppt
 
Neural network final NWU 4.3 Graphics Course
Neural network final NWU 4.3 Graphics CourseNeural network final NWU 4.3 Graphics Course
Neural network final NWU 4.3 Graphics Course
 
Neuralnetwork 101222074552-phpapp02
Neuralnetwork 101222074552-phpapp02Neuralnetwork 101222074552-phpapp02
Neuralnetwork 101222074552-phpapp02
 
BACKPROPOGATION ALGO.pdfLECTURE NOTES WITH SOLVED EXAMPLE AND FEED FORWARD NE...
BACKPROPOGATION ALGO.pdfLECTURE NOTES WITH SOLVED EXAMPLE AND FEED FORWARD NE...BACKPROPOGATION ALGO.pdfLECTURE NOTES WITH SOLVED EXAMPLE AND FEED FORWARD NE...
BACKPROPOGATION ALGO.pdfLECTURE NOTES WITH SOLVED EXAMPLE AND FEED FORWARD NE...
 
Neural networks
Neural networksNeural networks
Neural networks
 
Economic Load Dispatch (ELD), Economic Emission Dispatch (EED), Combined Econ...
Economic Load Dispatch (ELD), Economic Emission Dispatch (EED), Combined Econ...Economic Load Dispatch (ELD), Economic Emission Dispatch (EED), Combined Econ...
Economic Load Dispatch (ELD), Economic Emission Dispatch (EED), Combined Econ...
 
Neural Networks
Neural NetworksNeural Networks
Neural Networks
 
Neural network
Neural networkNeural network
Neural network
 
Fuzzy Logic Final Report
Fuzzy Logic Final ReportFuzzy Logic Final Report
Fuzzy Logic Final Report
 
Artificial Neural Networks ppt.pptx for final sem cse
Artificial Neural Networks  ppt.pptx for final sem cseArtificial Neural Networks  ppt.pptx for final sem cse
Artificial Neural Networks ppt.pptx for final sem cse
 
Dr. Syed Muhammad Ali Tirmizi - Special topics in finance lec 13
Dr. Syed Muhammad Ali Tirmizi - Special topics in finance   lec 13Dr. Syed Muhammad Ali Tirmizi - Special topics in finance   lec 13
Dr. Syed Muhammad Ali Tirmizi - Special topics in finance lec 13
 
Artificial neural networks
Artificial neural networksArtificial neural networks
Artificial neural networks
 
Artificial Neural Network in Medical Diagnosis
Artificial Neural Network in Medical DiagnosisArtificial Neural Network in Medical Diagnosis
Artificial Neural Network in Medical Diagnosis
 
Neural Networks
Neural NetworksNeural Networks
Neural Networks
 

Plus de butest

EL MODELO DE NEGOCIO DE YOUTUBE
EL MODELO DE NEGOCIO DE YOUTUBEEL MODELO DE NEGOCIO DE YOUTUBE
EL MODELO DE NEGOCIO DE YOUTUBEbutest
 
1. MPEG I.B.P frame之不同
1. MPEG I.B.P frame之不同1. MPEG I.B.P frame之不同
1. MPEG I.B.P frame之不同butest
 
LESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALLESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALbutest
 
Timeline: The Life of Michael Jackson
Timeline: The Life of Michael JacksonTimeline: The Life of Michael Jackson
Timeline: The Life of Michael Jacksonbutest
 
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...butest
 
LESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALLESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALbutest
 
Com 380, Summer II
Com 380, Summer IICom 380, Summer II
Com 380, Summer IIbutest
 
The MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
The MYnstrel Free Press Volume 2: Economic Struggles, Meet JazzThe MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
The MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazzbutest
 
MICHAEL JACKSON.doc
MICHAEL JACKSON.docMICHAEL JACKSON.doc
MICHAEL JACKSON.docbutest
 
Social Networks: Twitter Facebook SL - Slide 1
Social Networks: Twitter Facebook SL - Slide 1Social Networks: Twitter Facebook SL - Slide 1
Social Networks: Twitter Facebook SL - Slide 1butest
 
Facebook
Facebook Facebook
Facebook butest
 
Executive Summary Hare Chevrolet is a General Motors dealership ...
Executive Summary Hare Chevrolet is a General Motors dealership ...Executive Summary Hare Chevrolet is a General Motors dealership ...
Executive Summary Hare Chevrolet is a General Motors dealership ...butest
 
Welcome to the Dougherty County Public Library's Facebook and ...
Welcome to the Dougherty County Public Library's Facebook and ...Welcome to the Dougherty County Public Library's Facebook and ...
Welcome to the Dougherty County Public Library's Facebook and ...butest
 
NEWS ANNOUNCEMENT
NEWS ANNOUNCEMENTNEWS ANNOUNCEMENT
NEWS ANNOUNCEMENTbutest
 
C-2100 Ultra Zoom.doc
C-2100 Ultra Zoom.docC-2100 Ultra Zoom.doc
C-2100 Ultra Zoom.docbutest
 
MAC Printing on ITS Printers.doc.doc
MAC Printing on ITS Printers.doc.docMAC Printing on ITS Printers.doc.doc
MAC Printing on ITS Printers.doc.docbutest
 
Mac OS X Guide.doc
Mac OS X Guide.docMac OS X Guide.doc
Mac OS X Guide.docbutest
 
WEB DESIGN!
WEB DESIGN!WEB DESIGN!
WEB DESIGN!butest
 

Plus de butest (20)

EL MODELO DE NEGOCIO DE YOUTUBE
EL MODELO DE NEGOCIO DE YOUTUBEEL MODELO DE NEGOCIO DE YOUTUBE
EL MODELO DE NEGOCIO DE YOUTUBE
 
1. MPEG I.B.P frame之不同
1. MPEG I.B.P frame之不同1. MPEG I.B.P frame之不同
1. MPEG I.B.P frame之不同
 
LESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALLESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIAL
 
Timeline: The Life of Michael Jackson
Timeline: The Life of Michael JacksonTimeline: The Life of Michael Jackson
Timeline: The Life of Michael Jackson
 
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
Popular Reading Last Updated April 1, 2010 Adams, Lorraine The ...
 
LESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIALLESSONS FROM THE MICHAEL JACKSON TRIAL
LESSONS FROM THE MICHAEL JACKSON TRIAL
 
Com 380, Summer II
Com 380, Summer IICom 380, Summer II
Com 380, Summer II
 
PPT
PPTPPT
PPT
 
The MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
The MYnstrel Free Press Volume 2: Economic Struggles, Meet JazzThe MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
The MYnstrel Free Press Volume 2: Economic Struggles, Meet Jazz
 
MICHAEL JACKSON.doc
MICHAEL JACKSON.docMICHAEL JACKSON.doc
MICHAEL JACKSON.doc
 
Social Networks: Twitter Facebook SL - Slide 1
Social Networks: Twitter Facebook SL - Slide 1Social Networks: Twitter Facebook SL - Slide 1
Social Networks: Twitter Facebook SL - Slide 1
 
Facebook
Facebook Facebook
Facebook
 
Executive Summary Hare Chevrolet is a General Motors dealership ...
Executive Summary Hare Chevrolet is a General Motors dealership ...Executive Summary Hare Chevrolet is a General Motors dealership ...
Executive Summary Hare Chevrolet is a General Motors dealership ...
 
Welcome to the Dougherty County Public Library's Facebook and ...
Welcome to the Dougherty County Public Library's Facebook and ...Welcome to the Dougherty County Public Library's Facebook and ...
Welcome to the Dougherty County Public Library's Facebook and ...
 
NEWS ANNOUNCEMENT
NEWS ANNOUNCEMENTNEWS ANNOUNCEMENT
NEWS ANNOUNCEMENT
 
C-2100 Ultra Zoom.doc
C-2100 Ultra Zoom.docC-2100 Ultra Zoom.doc
C-2100 Ultra Zoom.doc
 
MAC Printing on ITS Printers.doc.doc
MAC Printing on ITS Printers.doc.docMAC Printing on ITS Printers.doc.doc
MAC Printing on ITS Printers.doc.doc
 
Mac OS X Guide.doc
Mac OS X Guide.docMac OS X Guide.doc
Mac OS X Guide.doc
 
hier
hierhier
hier
 
WEB DESIGN!
WEB DESIGN!WEB DESIGN!
WEB DESIGN!
 

Supervised Learning

  • 1.
  • 2. 04/26/10 Intelligent Systems and Soft Computing Introduction, or how the brain works Machine learning involves adaptive mechanisms that enable computers to learn from experience, learn by example and learn by analogy. Learning capabilities can improve the performance of an intelligent system over time. The most popular approaches to machine learning are artificial neural networks and genetic algorithms . This lecture is dedicated to neural networks.
  • 3.
  • 4.
  • 5. 04/26/10 Intelligent Systems and Soft Computing Biological neural network
  • 6.
  • 7.
  • 8. 04/26/10 Intelligent Systems and Soft Computing Architecture of a typical artificial neural network I n p u t S i g n a l s O u t p u t S i g n a l s
  • 9. 04/26/10 Intelligent Systems and Soft Computing Analogy between biological and artificial neural networks
  • 10. 04/26/10 Intelligent Systems and Soft Computing The neuron as a simple computing element Diagram of a neuron
  • 11.
  • 12. 04/26/10 Intelligent Systems and Soft Computing Activation functions of a neuron
  • 13.
  • 14. 04/26/10 Intelligent Systems and Soft Computing Single-layer two-input perceptron
  • 15.
  • 16.
  • 17. 04/26/10 Intelligent Systems and Soft Computing Linear separability in the perceptrons
  • 18. 04/26/10 Intelligent Systems and Soft Computing How does the perceptron learn its classification tasks? This is done by making small adjustments in the weights to reduce the difference between the actual and desired outputs of the perceptron. The initial weights are randomly assigned, usually in the range [  0.5, 0.5], and then updated to obtain the output consistent with the training examples.
  • 19.
  • 20. 04/26/10 Intelligent Systems and Soft Computing where p = 1, 2, 3, . . .  is the learning rate , a positive constant less than unity. The perceptron learning rule was first proposed by Rosenblatt in 1960. Using this rule we can derive the perceptron training algorithm for classification tasks. The perceptron learning rule   
  • 21. 04/26/10 Intelligent Systems and Soft Computing Perceptron’s training algorithm Step 1 : Initialisation Set initial weights w 1 , w 2 ,…, w n and threshold  to random numbers in the range [  0.5, 0.5]. If the error, e ( p ), is positive, we need to increase perceptron output Y ( p ), but if it is negative, we need to decrease Y ( p ).
  • 22. 04/26/10 Intelligent Systems and Soft Computing Perceptron’s training algorithm (continued) Step 2 : Activation Activate the perceptron by applying inputs x 1 ( p ), x 2 ( p ),…, x n ( p ) and desired output Y d ( p ). Calculate the actual output at iteration p = 1 where n is the number of the perceptron inputs, and step is a step activation function.
  • 23. 04/26/10 Intelligent Systems and Soft Computing Perceptron’s training algorithm (continued) Step 3 : Weight training Update the weights of the perceptron where  w i ( p ) is the weight correction at iteration p . The weight correction is computed by the delta rule : Step 4 : Iteration Increase iteration p by one, go back to Step 2 and repeat the process until convergence. .
  • 24. 04/26/10 Intelligent Systems and Soft Computing Example of perceptron learning: the logical operation AND
  • 25. 04/26/10 Intelligent Systems and Soft Computing Two-dimensional plots of basic logical operations A perceptron can learn the operations AND and OR , but not Exclusive-OR . ( a ) AND ( x 1  x 2 )
  • 26.
  • 27. 04/26/10 Intelligent Systems and Soft Computing Multilayer perceptron with two hidden layers I n p u t S i g n a l s O u t p u t S i g n a l s
  • 28.
  • 29.
  • 30.
  • 31. 04/26/10 Intelligent Systems and Soft Computing Three-layer back-propagation neural network
  • 32. The back-propagation training algorithm Step 1: Initialization Set all the weights and threshold levels of the network to random numbers uniformly distributed inside a small range: (−2.4 / F_i, +2.4 / F_i), where F_i is the total number of inputs of neuron i in the network. The weight initialization is done on a neuron-by-neuron basis.
  • 33. Step 2: Activation Activate the back-propagation neural network by applying inputs x_1(p), x_2(p), …, x_n(p) and desired outputs y_{d,1}(p), y_{d,2}(p), …, y_{d,n}(p). (a) Calculate the actual outputs of the neurons in the hidden layer: y_j(p) = sigmoid[ Σ_{i=1..n} x_i(p) · w_{ij}(p) − θ_j ], where n is the number of inputs of neuron j in the hidden layer, and sigmoid is the sigmoid activation function.
  • 34. Step 2: Activation (continued) (b) Calculate the actual outputs of the neurons in the output layer: y_k(p) = sigmoid[ Σ_{j=1..m} x_{jk}(p) · w_{jk}(p) − θ_k ], where m is the number of inputs of neuron k in the output layer.
  • 35. Step 3: Weight training Update the weights in the back-propagation network propagating backward the errors associated with output neurons. (a) Calculate the error gradient for the neurons in the output layer: δ_k(p) = y_k(p) · [1 − y_k(p)] · e_k(p), where e_k(p) = y_{d,k}(p) − y_k(p). Calculate the weight corrections: Δw_{jk}(p) = α · y_j(p) · δ_k(p). Update the weights at the output neurons: w_{jk}(p + 1) = w_{jk}(p) + Δw_{jk}(p).
  • 36. Step 3: Weight training (continued) (b) Calculate the error gradient for the neurons in the hidden layer: δ_j(p) = y_j(p) · [1 − y_j(p)] · Σ_{k=1..l} δ_k(p) · w_{jk}(p), where l is the number of neurons in the output layer. Calculate the weight corrections: Δw_{ij}(p) = α · x_i(p) · δ_j(p). Update the weights at the hidden neurons: w_{ij}(p + 1) = w_{ij}(p) + Δw_{ij}(p).
  • 37. Step 4: Iteration Increase iteration p by one, go back to Step 2 and repeat the process until the selected error criterion is satisfied. As an example, we may consider the three-layer back-propagation network. Suppose that the network is required to perform logical operation Exclusive-OR. Recall that a single-layer perceptron could not do this operation. Now we will apply the three-layer net.
  • 38. Three-layer network for solving the Exclusive-OR operation
  • 39.
  • 40.
  • 41.
  • 42.
  • 43.
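The three-layer Exclusive-OR network described above can be sketched in code. The 2-2-1 layer sizes match the example; the learning rate, epoch count and random seed are illustrative assumptions.

```python
import numpy as np

# Minimal three-layer back-propagation network for XOR (Steps 1-4).
rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_in, n_hid, n_out = 2, 2, 1
# Step 1: weights uniform in (-2.4/Fi, +2.4/Fi), Fi = inputs to the neuron
W1 = rng.uniform(-2.4 / n_in, 2.4 / n_in, (n_in, n_hid))
W2 = rng.uniform(-2.4 / n_hid, 2.4 / n_hid, (n_hid, n_out))
th1 = rng.uniform(-2.4 / n_in, 2.4 / n_in, n_hid)
th2 = rng.uniform(-2.4 / n_hid, 2.4 / n_hid, n_out)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
Yd = np.array([[0], [1], [1], [0]], float)
alpha = 0.5                                   # learning rate (illustrative)

for epoch in range(20000):
    for x, yd in zip(X, Yd):
        yh = sigmoid(x @ W1 - th1)            # Step 2(a): hidden outputs
        yo = sigmoid(yh @ W2 - th2)           # Step 2(b): output-layer outputs
        dk = yo * (1 - yo) * (yd - yo)        # Step 3(a): output gradients
        dj = yh * (1 - yh) * (W2 @ dk)        # Step 3(b): hidden gradients
        W2 += alpha * np.outer(yh, dk)
        th2 += alpha * (-1) * dk
        W1 += alpha * np.outer(x, dj)
        th1 += alpha * (-1) * dj

print(np.round(sigmoid(sigmoid(X @ W1 - th1) @ W2 - th2)).ravel())
```

With an unlucky initialisation a 2-2-1 network can settle in a local minimum on XOR; re-seeding, or adding a third hidden neuron, usually resolves this.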
  • 44. Learning curve for operation Exclusive-OR (sum-squared error versus epoch)
  • 45. Final results of three-layer network learning
  • 46. Network represented by McCulloch-Pitts model for solving the Exclusive-OR operation
  • 47.
  • 48.
  • 49.
  • 50. Learning with momentum for operation Exclusive-OR
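The momentum method is commonly implemented as the generalised delta rule, in which a fraction β of the previous weight change is carried forward into the current one. This sketch assumes β = 0.95 and α = 0.1, typical illustrative values rather than figures from the slides.

```python
# Generalised delta rule with momentum:
#   delta_w(p) = beta * delta_w(p-1) + alpha * y_j(p) * delta_k(p)
beta, alpha = 0.95, 0.1

def momentum_update(w, delta_w_prev, y_j, delta_k):
    """Return the updated weight and the weight change just applied."""
    delta_w = beta * delta_w_prev + alpha * y_j * delta_k
    return w + delta_w, delta_w
```

Because past changes keep contributing, successive updates in a consistent direction accumulate speed, which is what smooths and accelerates the Exclusive-OR learning curve shown here.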
  • 51. Learning with adaptive learning rate To accelerate the convergence and yet avoid the danger of instability, we can apply two heuristics: Heuristic 1: If the change of the sum of squared errors has the same algebraic sign for several consecutive epochs, then the learning rate parameter, α, should be increased. Heuristic 2: If the algebraic sign of the change of the sum of squared errors alternates for several consecutive epochs, then the learning rate parameter, α, should be decreased.
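The two heuristics can be sketched as a small helper that inspects the recent sum-of-squared-errors history. The growth and decay factors (1.05 and 0.7) and the window length are illustrative assumptions, not values from the slides.

```python
def adapt_learning_rate(alpha, sse_history, window=4):
    """Adjust alpha from the signs of recent changes in sum-squared error."""
    if len(sse_history) < window + 1:
        return alpha                      # not enough history yet
    recent = sse_history[-(window + 1):]
    signs = [1 if b > a else -1 for a, b in zip(recent, recent[1:])]
    if all(s == signs[0] for s in signs):
        return alpha * 1.05               # Heuristic 1: same sign -> increase
    if all(s != t for s, t in zip(signs, signs[1:])):
        return alpha * 0.7                # Heuristic 2: alternating -> decrease
    return alpha
```

For example, a steadily decreasing error history such as [5, 4, 3, 2, 1] grows α by 5%, while an oscillating history such as [1, 2, 1, 2, 1] shrinks it to 70% of its value.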
  • 52.
  • 53. Learning with adaptive learning rate (sum-squared error and learning rate versus epoch)
  • 54. Learning with momentum and adaptive learning rate (sum-squared error and learning rate versus epoch)
  • 55.
  • 56.
  • 57.
  • 58. Single-layer n-neuron Hopfield network
  • 59.
  • 60.
  • 61.
  • 62. Possible states for the three-neuron Hopfield network
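A three-neuron Hopfield network like this one can be sketched with the standard outer-product storage rule, W = Σ_m Y_m · Y_mᵀ − M·I, and sign-function recall. The two stored memories below are illustrative choices.

```python
import numpy as np

# Three-neuron Hopfield network with bipolar (+1/-1) states.
memories = np.array([[1, 1, 1], [-1, -1, -1]])   # fundamental memories
M, n = memories.shape
W = memories.T @ memories - M * np.eye(n)        # outer-product rule

def recall(y, steps=10):
    """Iterate y <- sign(W y) until the state stops changing."""
    for _ in range(steps):
        y_new = np.where(W @ y >= 0, 1, -1)
        if np.array_equal(y_new, y):
            break
        y = y_new
    return y

print(recall(np.array([1, 1, -1])))   # corrupted probe -> [1 1 1]
```

The probe (1, 1, −1) differs from the memory (1, 1, 1) in one bit, and the network settles into that nearest stored state; the other six corners of the state cube behave analogously.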
  • 63.
  • 64.
  • 65.
  • 66.
  • 67.
  • 68.
  • 69.
  • 70. BAM operation
  • 71. The basic idea behind the BAM is to store pattern pairs so that when n-dimensional vector X from set A is presented as input, the BAM recalls m-dimensional vector Y from set B, but when Y is presented as input, the BAM recalls X.
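This bidirectional recall can be sketched with the standard BAM correlation matrix, W = Σ_m X_m · Y_mᵀ: presenting X recalls Y via sign(Wᵀ X), and presenting Y recalls X via sign(W Y). The bipolar pattern pairs below are arbitrary examples, not data from the lecture.

```python
import numpy as np

# BAM storing two pattern pairs (X_m from set A, Y_m from set B).
X = np.array([[1, 1, 1, -1], [-1, -1, -1, 1]])   # set A, n = 4
Y = np.array([[1, -1], [-1, 1]])                  # set B, m = 2
W = sum(np.outer(x, y) for x, y in zip(X, Y))     # n x m correlation matrix

sign = lambda v: np.where(v >= 0, 1, -1)

print(sign(W.T @ X[0]))   # X[0] recalls its pair -> [ 1 -1]
print(sign(W @ Y[1]))     # Y[1] recalls its pair -> [-1 -1 -1  1]
```

Recall works in both directions with the same matrix W, which is what makes the memory bidirectional.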
  • 72.
  • 73.
