Neural Networks
Dr. Randa Elanwar
Lecture 6
Lecture Content
• Non Linearly separable functions: XOR gate
implementation
– MLP data transformation
– mapping implementation
– graphical solution
Non linear problems
• XOR problem
• The only way to separate the positive from the negative examples is to draw two lines (i.e., we need two straight-line equations) or a nonlinear region that captures one class only
[Figure: the four XOR patterns in the (x1, x2) plane marked +ve/–ve; they can be separated by two straight lines or by a single nonlinear boundary]
Non linear problems
• To implement the nonlinearity we need to insert one or more extra layers of nodes between the input layer and the output layer (hidden layers)
Non linear problems
2-layer feed-forward network: example XOR solution
MLP data transformation and mapping
implementation
• Need for hidden units:
• If there is one layer of enough hidden units, the input can be
recoded (memorized) → multilayer perceptron (MLP)
• This recoding allows any problem to be mapped/represented
(e.g., x2, x3, etc.)
• Question: how can the weights of the hidden units be trained?
• Answer: Learning algorithms e.g., back propagation
• The term ‘back propagation’ refers to propagating the error backwards to adapt the weights layer by layer, starting from the last layer and moving back to the first, i.e., the weights of the last layer are updated before those of the earlier layers
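The slides do not spell out the algorithm itself; as a rough illustration of the idea (the output-layer error is computed first and then propagated back to adapt the hidden-layer weights), the following Python sketch trains a tiny one-hidden-layer network on XOR by gradient descent. The sigmoid activation, learning rate, iteration count, and all variable names are illustrative assumptions, not part of the lecture.

```python
import numpy as np

# Minimal backpropagation sketch (illustrative, not the lecture's derivation):
# a 2-2-1 network with sigmoid units trained on XOR by gradient descent.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)        # XOR targets

W1 = rng.normal(size=(2, 2)); b1 = np.zeros(2)         # hidden layer
W2 = rng.normal(size=(2, 1)); b2 = np.zeros(1)         # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5                                               # assumed learning rate

for _ in range(20000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # backward pass: error of the last layer first ...
    delta_out = (y - t) * y * (1 - y)
    # ... then propagated back to the hidden layer
    delta_hid = (delta_out @ W2.T) * h * (1 - h)
    # weight adaptation, last layer before the earlier one
    W2 -= lr * h.T @ delta_out;  b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_hid;  b1 -= lr * delta_hid.sum(axis=0)

print(np.round(y.squeeze(), 2))  # typically approaches the XOR targets [0, 1, 1, 0]
```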
Learning Non Linearly Separable Functions
• Back propagation effectively transforms the training patterns so that they become (almost) linearly separable, and a linear network can then classify them
• In other words, if we need more than 1 straight line to
separate +ve and –ve patterns, we solve the problem in
two phases:
– In phase 1: we first represent each straight line with a single
perceptron and classify/map the training patterns (output)
– In phase 2: these outputs are transformed to new patterns which
are now linearly separable and can be classified by an additional
perceptron giving the final result.
Multi-layer Networks and Perceptrons
- Have one or more layers of hidden units.
- With two possibly very large hidden layers, it is possible to implement any function.
- Networks without a hidden layer are called perceptrons.
- Perceptrons are very limited in what they can represent, but this makes their learning problem much simpler.
Solving Non Linearly Separable Functions
• Example: XOR problem
• Phase 1: we draw arbitrary lines
• We find the line equations g1(x) = 0 and g2(x) = 0 using
arbitrary intersections on the axes (yellow points p1, p2, p3, p4).
• We assume the +ve and –ve directions for each line.
• We classify the given patterns as +ve/-ve with respect to both g1(x) & g2(x)
• Phase 2: we transform the patterns we have
• Let the patterns whose signs with respect to g1(x) and g2(x) agree (both +ve or both –ve) belong to class B, and those with different signs belong to class A.
• We find the line equation g(y) = 0 using arbitrary intersections on the new axes
[Figure: the four XOR patterns plotted in the (x1, x2) plane with class labels A/B, the two arbitrary lines g1(x) and g2(x) with their +ve sides, and the axis-intersection points p1, p2, p3, p4]

x1  x2  XOR  Class
0   0   0    B
0   1   1    A
1   0   1    A
1   1   0    B
Solving Non Linearly Separable Functions
• Let p1 = (0.5,0), p2 = (0,0.5), p3(1.5,0), p4(0,1.5)
• Constructing g1(x) = 0
• g1(x) = x1 + x2 – 0.5 = 0
• Constructing g2(x) = 0
• g2(x) = x1 + x2 – 1.5 = 0
Two-point form through p1 = (0.5, 0) and p2 = (0, 0.5):
(x2 – p1(2))/(x1 – p1(1)) = (p2(2) – p1(2))/(p2(1) – p1(1))
(x2 – 0)/(x1 – 0.5) = (0.5 – 0)/(0 – 0.5)  ⇒  x1 + x2 – 0.5 = 0

Two-point form through p3 = (1.5, 0) and p4 = (0, 1.5):
(x2 – p3(2))/(x1 – p3(1)) = (p4(2) – p3(2))/(p4(1) – p3(1))
(x2 – 0)/(x1 – 1.5) = (1.5 – 0)/(0 – 1.5)  ⇒  x1 + x2 – 1.5 = 0
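The derivation above is just the two-point form of a straight line. As a small sketch (the function name and layout are my own, not from the slides), the coefficients of a·x1 + b·x2 + c = 0 can be computed directly from the two chosen intersection points:

```python
def line_through(p, q):
    """Coefficients (a, b, c) of a*x1 + b*x2 + c = 0 passing through points p and q."""
    (px, py), (qx, qy) = p, q
    a = qy - py
    b = -(qx - px)
    c = -(a * px + b * py)
    return a, b, c

# g1(x) through p1 = (0.5, 0) and p2 = (0, 0.5):
print(line_through((0.5, 0), (0, 0.5)))   # (0.5, 0.5, -0.25), i.e. x1 + x2 - 0.5 = 0 after scaling
# g2(x) through p3 = (1.5, 0) and p4 = (0, 1.5):
print(line_through((1.5, 0), (0, 1.5)))   # (1.5, 1.5, -2.25), i.e. x1 + x2 - 1.5 = 0 after scaling
```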
Solving Non Linearly Separable Functions
• Assume x1>p1(1) is the positive direction for g1(x)
• Assume x1>p3(1) is the positive direction for g2(x)
• Classifying the given patterns with respect to g1(x) and g2(x):
• We represent the +ve and –ve values by a step (threshold) function, i.e.,
y1 = f(g1(x)) = 0 if g1(x) is –ve and 1 otherwise
and y2 = f(g2(x)) = 0 if g2(x) is –ve and 1 otherwise
• We now have only three distinct transformed patterns, and they are linearly separable; the extra pattern that caused the problem is gone, since two of the original patterns coincide after the transformation
x1 x2 g1(x) g2(x) y1 y2 Class
0 0 -ve -ve 0 0 B
0 1 +ve -ve 1 0 A
1 0 +ve -ve 1 0 A
1 1 +ve +ve 1 1 B
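A short sketch of this phase-1 mapping (the step thresholding of g1(x) and g2(x) over the four XOR patterns; variable names are illustrative):

```python
def step(v):
    """Unipolar step: 0 for negative activations, 1 otherwise."""
    return 0 if v < 0 else 1

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    y1 = step(x1 + x2 - 0.5)   # first hidden unit, g1(x)
    y2 = step(x1 + x2 - 1.5)   # second hidden unit, g2(x)
    print((x1, x2), '->', (y1, y2))
# (0,0)->(0,0), (0,1)->(1,0), (1,0)->(1,0), (1,1)->(1,1): only three distinct points remain
```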
Solving Non Linearly Separable Functions
• Let p1 = (0.5,0), p2 = (0,-0.25)
• Constructing g(y) = 0
• g(y) = y1 – 2 y2 – 0.5 = 0
Two-point form through p1 = (0.5, 0) and p2 = (0, –0.25):
(y2 – p1(2))/(y1 – p1(1)) = (p2(2) – p1(2))/(p2(1) – p1(1))
(y2 – 0)/(y1 – 0.5) = (–0.25 – 0)/(0 – 0.5)  ⇒  y1 – 2·y2 – 0.5 = 0

[Figure: the transformed patterns in the (y1, y2) plane, one class-A point and two class-B points, separated by the line g(y) with its +ve side]

[Network diagram: input layer (x1, x2); hidden layer with two units computing g1(x), weights (1, 1) and bias –0.5, and g2(x), weights (1, 1) and bias –1.5; output layer with one unit computing g(y), weights (1, –2) and bias –0.5]
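Putting the two phases together, a minimal check (hypothetical helper names) that the network sketched above, with the weights just listed and a unipolar step activation, reproduces XOR:

```python
def step(v):
    return 0 if v < 0 else 1

def xor_net(x1, x2):
    y1 = step(1 * x1 + 1 * x2 - 0.5)    # hidden unit implementing g1(x)
    y2 = step(1 * x1 + 1 * x2 - 1.5)    # hidden unit implementing g2(x)
    return step(1 * y1 - 2 * y2 - 0.5)  # output unit implementing g(y)

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, xor_net(*x))               # prints 0, 1, 1, 0
```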
Solving Non Linearly Separable Functions
• Example: The linearly non separable patterns x1 = [3 0],
x2 = [5 2], x3 = [1 3], x4 = [2 4], x5 = [1 1], x6 = [3 3]
have to be classified into two categories C1 = {x1, x2,
x3, x4} and C2 = {x5, x6} using a feed forward 2-layer
neural network.
– Select a suitable number of partitioning straight lines.
– Consequently design the first stage (hidden layer) of the
network with bipolar discrete perceptrons.
– Using this layer, transform the six samples.
– Design the output layer of the network with a bipolar
discrete perceptron using the transformed samples.
Solving Non Linearly Separable Functions
• Let p1 = (0,1), p2 = (1,2),
• p3 = (2,0), p4 = (3,1)
• Constructing g1(x) = 0
• g1(x) = x1 - x2 + 1 = 0
• Constructing g2(x) = 0
• g2(x) = x1 - x2 – 2 = 0
[Figure: the six samples x1…x6 plotted in the (x1, x2) plane, with the two partitioning lines g1(x) and g2(x), their +ve sides, and the chosen points p1, p2, p3, p4]
Two-point form through p1 = (0, 1) and p2 = (1, 2):
(x2 – p1(2))/(x1 – p1(1)) = (p2(2) – p1(2))/(p2(1) – p1(1))
(x2 – 1)/(x1 – 0) = (2 – 1)/(1 – 0)  ⇒  x1 – x2 + 1 = 0

Two-point form through p3 = (2, 0) and p4 = (3, 1):
(x2 – p3(2))/(x1 – p3(1)) = (p4(2) – p3(2))/(p4(1) – p3(1))
(x2 – 0)/(x1 – 2) = (1 – 0)/(3 – 2)  ⇒  x1 – x2 – 2 = 0
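The same two-point construction as in the XOR example applies here; a quick sketch (helper name illustrative) checking the two line equations:

```python
def line_through(p, q):
    """Coefficients (a, b, c) of a*x1 + b*x2 + c = 0 through points p and q."""
    (px, py), (qx, qy) = p, q
    a, b = qy - py, -(qx - px)
    return a, b, -(a * px + b * py)

print(line_through((0, 1), (1, 2)))  # (1, -1, 1):  x1 - x2 + 1 = 0
print(line_through((2, 0), (3, 1)))  # (1, -1, -2): x1 - x2 - 2 = 0
```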
Solving Non Linearly Separable Functions
x1 x2 g1(x) g2(x) y1 y2 Class
3 0 +ve +ve 1 1 B
5 2 +ve +ve 1 1 B
1 3 -ve -ve -1 -1 B
2 4 -ve -ve -1 -1 B
1 1 +ve -ve 1 -1 A
3 3 +ve -ve 1 -1 A
Assume x2<p1(2) is the positive direction for g1(x)
Assume x1>p3(1) is the positive direction for g2(x)
Classifying the given patterns with respect to g1(x) and g2(x):
We represent the +ve and –ve values by a bipolar activation function:
y1 = f(g1(x)) = –1 if g1(x) is –ve and 1 otherwise
y2 = f(g2(x)) = –1 if g2(x) is –ve and 1 otherwise
We now have only three distinct transformed patterns, and they are linearly separable; the samples that coincide after the transformation collapse into the same point
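A small sketch of this bipolar phase-1 transformation over the six samples (names illustrative):

```python
def bipolar(v):
    """Bipolar activation: -1 for negative activations, +1 otherwise."""
    return -1 if v < 0 else 1

samples = [(3, 0), (5, 2), (1, 3), (2, 4), (1, 1), (3, 3)]
for x1, x2 in samples:
    y1 = bipolar(x1 - x2 + 1)   # g1(x)
    y2 = bipolar(x1 - x2 - 2)   # g2(x)
    print((x1, x2), '->', (y1, y2))
# x1..x4 map to (1, 1) or (-1, -1) (class B = C1); x5 and x6 map to (1, -1) (class A = C2)
```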
Solving Non Linearly Separable Functions
• Let p1 = (1,0), p2 = (0,-1)
• Constructing g(y) = 0
• g(y) = y1 – y2 – 1 = 0
Two-point form through p1 = (1, 0) and p2 = (0, –1):
(y2 – p1(2))/(y1 – p1(1)) = (p2(2) – p1(2))/(p2(1) – p1(1))
(y2 – 0)/(y1 – 1) = (–1 – 0)/(0 – 1)  ⇒  y1 – y2 – 1 = 0

[Figure: the transformed samples in the (y1, y2) plane, x1/x2 at (1, 1), x3/x4 at (–1, –1), x5/x6 at (1, –1), separated by the line g(y) with its +ve side]

[Network diagram: input layer (x1, x2); hidden layer with two bipolar units computing g1(x), weights (1, –1) and bias +1, and g2(x), weights (1, –1) and bias –2; output layer with one bipolar unit computing g(y), weights (1, –1) and bias –1]
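Finally, a hedged sketch (assumed naming) of the complete two-layer bipolar network drawn above, checking that it separates C1 = {x1, x2, x3, x4} from C2 = {x5, x6}:

```python
def bipolar(v):
    return -1 if v < 0 else 1

def classify(x1, x2):
    y1 = bipolar(1 * x1 - 1 * x2 + 1)    # hidden unit implementing g1(x)
    y2 = bipolar(1 * x1 - 1 * x2 - 2)    # hidden unit implementing g2(x)
    return bipolar(1 * y1 - 1 * y2 - 1)  # output unit implementing g(y)

for x in [(3, 0), (5, 2), (1, 3), (2, 4), (1, 1), (3, 3)]:
    print(x, classify(*x))               # -1 for x1..x4 (C1), +1 for x5 and x6 (C2)
```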