ARTIFICIAL NEURAL NETWORKS
Sai Anjaneya
WHAT ARE NEURAL NETWORKS?
• In machine learning and cognitive
science, artificial neural networks (ANNs)
are a family of models inspired by biological
neural networks (the central nervous
systems of animals, in particular the brain)
and are used to estimate or approximate
functions that can depend on a large
number of inputs and are generally
unknown. The brain consists of a densely
interconnected set of nerve cells, or basic
information-processing units, called neurons.
Neural Networks in the Brain
Biological Neural Network
In the human brain, a neuron is an information-processing unit that
receives several input signals from the environment, computes a new
activation level, and sends an output signal to other neurons or
body muscles through output links.
• Neural networks exhibit plasticity.
• In response to the stimulation
pattern, neurons demonstrate long-
term changes in the strength of their
connections.
• Neurons can also form new
connections with other neurons. Even
entire collections of neurons may
sometimes migrate from one place
to another.
• These mechanisms form the basis for
learning in the brain.
MACHINE LEARNING
• In general, machine learning involves
adaptive mechanisms that enable
computers to learn from experience,
learn by example and learn by analogy.
• Learning capabilities can improve the
performance of an intelligent system
over time.
• Machine learning mechanisms form the
basis for adaptive systems.
WHY MACHINE LEARNING?
• The techniques we have seen before rely on expert knowledge to set the rules.
Once the rules are set, decision making is automated.
• What happens if the rules become obsolete or new information is gathered?
• The change then needs to happen at a very basic level and must be done
manually.
• ANNs attempt to automate the process.
• The objective is to come up with a model that predicts a set of outputs Y <y1, y2,…, yn>
from a given set of inputs X <x1, x2,…, xm>, given a training dataset with records of the
form (X, Y).
• The result must be a function f(X) that approximates Y for values of X not in the
dataset.
THE PERCEPTRON
Analogous to a single neuron in the brain, the PERCEPTRON is the simplest form of a
neural network in an ANN. It consists of a single “neuron” which computes an output
function by assigning weights to each of the links to the n inputs.
How does the Perceptron learn from experience?
Weights (w1, w2, …) are initially assigned to the
perceptron's inputs in the range [-0.5, 0.5] and
then updated to make the output consistent
with the training examples. At each training
step, the linear combiner computes the weighted
sum of the inputs, which is then passed through
a hard limiter.
Hard Limiters: Step and Sign
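For illustration, here is a minimal Python sketch of the two hard limiters and of a perceptron output built from a linear combiner and a threshold; the weight and threshold values (and the name theta) are assumptions, not from the slides:

```python
import numpy as np

def step(z):
    # Step hard limiter: output 1 if the combined input is >= 0, else 0
    return 1 if z >= 0 else 0

def sign(z):
    # Sign hard limiter: output +1 if the combined input is >= 0, else -1
    return 1 if z >= 0 else -1

def perceptron_output(x, w, theta, limiter=step):
    # Linear combiner minus threshold, passed through the chosen hard limiter
    return limiter(np.dot(w, x) - theta)

# Illustrative values: two inputs, weights in [-0.5, 0.5], threshold 0.2
x = np.array([1.0, 0.0])
w = np.array([0.3, -0.1])
print(perceptron_output(x, w, theta=0.2))   # prints 1
```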
UPDATE RULES
• Error function for pth training example:
• e(p) = Yd(p) – Y(p); where p = 1, 2, 3, . . .
• Update:
• wi(p+1) = wi(p) + α × xi(p) × e(p);
• where α is the learning rate, a positive constant less than unity
• Only works on Linearly Separable data.
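A minimal sketch of this update rule as a training loop, assuming a step hard limiter and a fixed threshold; the dataset, learning rate, and epoch count are illustrative assumptions:

```python
import numpy as np

def train_perceptron(X, Yd, alpha=0.1, epochs=100, theta=0.2):
    # Initialise the weights in the range [-0.5, 0.5]
    rng = np.random.default_rng(0)
    w = rng.uniform(-0.5, 0.5, size=X.shape[1])
    for _ in range(epochs):
        for x, yd in zip(X, Yd):
            y = 1 if np.dot(w, x) - theta >= 0 else 0   # step hard limiter
            e = yd - y                                  # e(p) = Yd(p) - Y(p)
            w += alpha * x * e                          # wi(p+1) = wi(p) + alpha * xi(p) * e(p)
    return w

# Illustrative use: the linearly separable logical AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Yd = np.array([0, 0, 0, 1])
print(train_perceptron(X, Yd))
```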
LIMITATIONS
MULTILAYERED NETWORKS
BACKPROPAGATION
• The indices i, j and k here refer to neurons in the input, hidden
and output layers, respectively.
• Input signals, x1, x2, . . . , xn, are propagated through the network
from left to right, and error signals, e1, e2, . . .,el, from right to left.
The symbol wij denotes the weight for the connection between
neuron i in the input layer and neuron j in the hidden layer, and
the symbol wjk the weight between neuron j in the hidden layer
and neuron k in the output layer.
The error signal at the output of neuron
k at iteration p is defined as e_k(p) = y_d,k(p) − y_k(p), where y_d,k(p) is the desired output of neuron k.
Update rule
Output Layer Hidden Layer(s)
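The deck's update-rule equations are not reproduced here, so as a hedged sketch the following implements the standard back-propagation updates for a single hidden layer with sigmoid activations; the network size, learning rate, and bias-free layout are assumptions made for brevity:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_step(x, yd, W_ih, W_ho, alpha=0.1):
    # Forward pass: signals propagate input -> hidden -> output (left to right)
    y_hidden = sigmoid(W_ih @ x)
    y_out = sigmoid(W_ho @ y_hidden)

    # Error signal at the output of neuron k at iteration p: e_k = y_d,k - y_k
    e = yd - y_out

    # Error gradients (deltas) for the output layer and, propagated right to
    # left, for the hidden layer (sigmoid derivative is y * (1 - y))
    delta_out = y_out * (1.0 - y_out) * e
    delta_hidden = y_hidden * (1.0 - y_hidden) * (W_ho.T @ delta_out)

    # Weight updates: delta rule, alpha * (input to the connection) * delta
    W_ho += alpha * np.outer(delta_out, y_hidden)
    W_ih += alpha * np.outer(delta_hidden, x)
    return W_ih, W_ho

# Illustrative sizes: 2 inputs, 2 hidden neurons, 1 output, weights in [-0.5, 0.5]
rng = np.random.default_rng(0)
W_ih = rng.uniform(-0.5, 0.5, size=(2, 2))   # w_ij: input i -> hidden j
W_ho = rng.uniform(-0.5, 0.5, size=(1, 2))   # w_jk: hidden j -> output k
W_ih, W_ho = backprop_step(np.array([1.0, 0.0]), np.array([1.0]), W_ih, W_ho)
```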
WHEN TO STOP
IMPROVEMENTS
A multilayer network, in general, learns much faster when
the sigmoidal activation function is represented by a
hyperbolic tangent.
We can also accelerate training by including a momentum
term in the earlier expression for delta. According to the
observations made by Watrous (1987) and Jacobs (1988),
the inclusion of momentum in the back-propagation
algorithm has a stabilising effect on training.
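A hedged sketch of both improvements, the hyperbolic-tangent activation and a momentum term added to the weight update; the constants a, b, and beta are common textbook choices, assumed here for illustration rather than taken from the slides:

```python
import numpy as np

def tanh_activation(z, a=1.716, b=0.667):
    # Hyperbolic-tangent activation: y = 2a / (1 + exp(-b*z)) - a
    # (a and b are commonly quoted constants; assumed values here)
    return 2.0 * a / (1.0 + np.exp(-b * z)) - a

def momentum_update(w, grad_term, prev_delta_w, alpha=0.1, beta=0.95):
    # Generalised delta rule with momentum:
    #   delta_w(p) = beta * delta_w(p-1) + alpha * (gradient term at step p)
    delta_w = beta * prev_delta_w + alpha * grad_term
    return w + delta_w, delta_w

# Example: carry the previous weight change between iterations
w, prev = np.zeros(3), np.zeros(3)
w, prev = momentum_update(w, grad_term=np.array([0.2, -0.1, 0.05]), prev_delta_w=prev)
```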
LINKS
• Intro
• Forward Propagation [until 3:45]
• Gradient Descent [Cost, Curse of Dimensionality not covered]
• Backpropagation
SELF-ORGANISING MAPS
(LEARNING WITHOUT A ‘TEACHER’)
• Hebb’s Law:
• If two neurons on either side of a connection are activated synchronously,
then the weight of that connection is increased.
• If two neurons on either side of a connection are activated asynchronously,
then the weight of that connection is decreased.
Forgetting Factor (To allow for the
possibility of decreasing connection
strength):
HOW TO CHOOSE THE FORGETTING
FACTOR
• The forgetting factor specifies the weight decay in a single learning cycle. It usually falls in
the interval between 0 and 1.
• If the forgetting factor is 0, the neural network is capable only of strengthening its
synaptic weights, and as a result, these weights grow towards infinity. On the other hand,
if the forgetting factor is close to 1, the network remembers very little of what it learns.
• Therefore, a rather small forgetting factor should be chosen, typically between 0.01 and
0.1, to allow only a little ‘forgetting’ while limiting the weight growth.
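As a sketch, under the assumption that the learning rule referred to above is the standard generalised activity product rule Δw_ij = α·y_j·x_i − φ·y_j·w_ij with forgetting factor φ, Hebbian learning with forgetting could look like this (the values are illustrative):

```python
import numpy as np

def hebbian_update(w, x, y, alpha=0.1, phi=0.05):
    # Hebb's law with a forgetting factor phi:
    # strengthen weights whose input and output are active together, while
    # phi lets connection strength decay instead of growing without bound.
    # delta_w_i = alpha * y * x_i - phi * y * w_i   (assumed rule form)
    return w + alpha * y * x - phi * y * w

# Illustrative use: one output neuron with three inputs
w = np.array([0.1, 0.2, 0.3])
x = np.array([1.0, 0.0, 1.0])
y = 1.0                          # post-synaptic activation
w = hebbian_update(w, x, y)
print(w)                         # active inputs strengthen; all weights decay slightly
```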