Brief Introduction to Deep Learning +
Solving XOR using ANN
MENOUFIA UNIVERSITY
FACULTY OF COMPUTERS AND INFORMATION
Ahmed Fawzy Gad
ahmed.fawzy@ci.menofia.edu.eg
Classification Example (XOR)
A B | Y
0 1 | 1
1 0 | 1
0 0 | 0
1 1 | 0
Neural Networks
Input → Hidden → Output
Neural Networks
[Scatter plots of the four XOR points on the A–B plane: the class-1 points (0,1) and (1,0) cannot be separated from the class-0 points (0,0) and (1,1) by a single straight line.]
Network Architecture!!
Input → Hidden → Output
Building the network, one neuron at a time:
• A single output neuron takes inputs A and B through weights W1 and W2 and produces an output (1/0).
• A second neuron takes the same inputs A and B through weights W3 and W4.
• The outputs of the two neurons feed a final output neuron through weights W5 and W6, giving a network with a hidden layer of two neurons.
Input → Hidden → Output
Weights = Wi
Inputs A and B feed two hidden neurons through weights W1–W4; the hidden neurons feed the output neuron through W5 and W6, followed by the Activation Function, which produces the output Yj (1/0).
Components

Each neuron has Inputs, a Sum Of Products (SOP), an Activation Function, and an Output.

Inputs: X1, X2 (Xi = Inputs, Wi = Weights)

SOP: s = SOP(Xi, Wi) = Σ (i = 1 to m) Xi·Wi

Each Hidden/Output Layer Neuron has its own SOP:
S1 = X1·W1 + X2·W3
S2 = X1·W2 + X2·W4
S3 = S1·W5 + S2·W6

Output: F(s) gives the class label Yj.
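The SOP above can be sketched in a few lines of Python. The helper name `sop` is an illustration, and the weight values are borrowed from the worked example later in the deck:

```python
def sop(inputs, weights):
    """Sum Of Products: s = sum of Xi * Wi over all input/weight pairs."""
    return sum(x * w for x, w in zip(inputs, weights))

# One SOP per hidden/output-layer neuron, as in the slides
X1, X2 = 1, 0
W1, W2, W3, W4, W5, W6 = 1, 1, 1, 1, -2, 1
S1 = sop([X1, X2], [W1, W3])  # first hidden neuron:  X1*W1 + X2*W3
S2 = sop([X1, X2], [W2, W4])  # second hidden neuron: X1*W2 + X2*W4
S3 = sop([S1, S2], [W5, W6])  # output neuron:        S1*W5 + S2*W6
```

Note that each neuron simply reuses the same `sop` helper with its own weight pair, mirroring "Each Hidden/Output Layer Neuron has its own SOP."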
Activation Functions
Candidates: Piecewise Linear, Sigmoid, Binary.

Which activation function to use?
The activation function's outputs map to the class labels, so the number of outputs must match the number of class labels. XOR has TWO class labels, so we need a function that gives TWO outputs.

Chosen: Binary (bin), which gives exactly two outputs.
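The binary activation chosen above is a simple threshold; this one-line sketch matches the bin(s) definition used later in the worked example (the function name is illustrative):

```python
def bin_activation(s):
    """Binary activation: 1 if the SOP is non-negative, otherwise 0."""
    return 1 if s >= 0 else 0
```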
Bias
Each hidden-layer neuron gets a bias input: a constant +1 weighted by b1 (first hidden neuron) and b2 (second hidden neuron). The output-layer neuron gets a bias input +1 weighted by b3.

Add Bias to SOP:
S1 = +1·b1 + X1·W1 + X2·W3
S2 = +1·b2 + X1·W2 + X2·W4
S3 = +1·b3 + S1·W5 + S2·W6
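With the bias terms included, the SOPs can be sketched as below. The helper name is an illustration; the bias and weight values are the ones used in the training example later in the deck:

```python
def sop_with_bias(bias, inputs, weights):
    # +1 * b  +  sum of Xi * Wi
    return bias + sum(x * w for x, w in zip(inputs, weights))

b1, b2, b3 = -1.5, -0.5, -0.5           # bias values from the worked example
W1, W2, W3, W4, W5, W6 = 1, 1, 1, 1, -2, 1
X1, X2 = 1, 0
S1 = sop_with_bias(b1, [X1, X2], [W1, W3])  # = -1.5 + 1 + 0
S2 = sop_with_bias(b2, [X1, X2], [W2, W4])  # = -0.5 + 1 + 0
```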
Learning Rate
0 ≤ η ≤ 1
Other Parameters
Step n
n = 0, 1, 2, …
s = X0·W0 + X1·W1 + X2·W2 + …
X(n) = (X0, X1, X2, …)
W(n) = (W0, W1, W2, …)
Other Parameters
Desired Output dj
d(n) = 1 if x(n) belongs to C1 (class 1)
d(n) = 0 if x(n) belongs to C2 (class 0)
Neural Networks Training Steps
1. Weights Initialization
2. Inputs Application
3. Sum of Inputs-Weights Products
4. Activation Function Response Calculation
5. Weights Adaptation
6. Back to Step 2
Regarding the 5th Step: Weights Adaptation
• If the predicted output Y is not the same as the desired output d, then the weights are adapted according to the following equation:
W(n + 1) = W(n) + η·(d(n) − Y(n))·X(n)
where
W(n) = [b(n), W1(n), W2(n), W3(n), …, Wm(n)]
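The adaptation rule is a single element-wise vector update. A minimal sketch, assuming the weight and input vectors are plain Python lists (the function name is illustrative):

```python
def adapt_weights(W, X, d, Y, eta):
    """W(n+1) = W(n) + eta * (d(n) - Y(n)) * X(n).

    When the prediction matches the desired output (Y == d), the
    correction term is zero and the weights are left unchanged.
    """
    return [w + eta * (d - Y) * x for w, x in zip(W, X)]

# Prediction matches the desired output -> no adaptation:
W = [-1.5, 1.0, 1.0]
assert adapt_weights(W, [1, 1, 0], d=1, Y=1, eta=0.001) == W
```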
Neural Networks Training Example

Step n=0
• In each step of the solution, the parameters of the neural network must be known.
• Parameters of step n=0:
η = .001
X(n) = X(0) = (+1, +1, +1, 1, 0)
W(n) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1)
d(n) = d(0) = 1

Step n=0 – SOP and Output – S1
S1 = +1·b1 + X1·W1 + X2·W3 = +1·(−1.5) + 1·1 + 0·1 = −.5
Y(S1) = BIN(S1) = BIN(−.5) = 0
where bin(s) = +1 if s ≥ 0, and 0 if s < 0

Step n=0 – SOP and Output – S2
S2 = +1·b2 + X1·W2 + X2·W4 = +1·(−.5) + 1·1 + 0·1 = .5
Y(S2) = BIN(S2) = BIN(.5) = 1

Step n=0 – SOP and Output – S3
The output neuron's SOP uses the hidden neurons' activated outputs Y(S1) and Y(S2):
S3 = +1·b3 + Y(S1)·W5 + Y(S2)·W6 = +1·(−.5) + 0·(−2) + 1·1 = .5
Y(S3) = BIN(S3) = BIN(.5) = 1

Step n=0 – Output
Y(n) = Y(0) = Y(S3) = 1

Step n=0 – Predicted vs. Desired
Y(n) = Y(0) = 1
d(n) = d(0) = 1
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
Step n=1
• Parameters of step n=1:
η = .001
X(n) = X(1) = (+1, +1, +1, 0, 1)
W(n) = W(1) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1)
d(n) = d(1) = 1

Step n=1 – SOP and Output – S1
S1 = +1·b1 + X1·W1 + X2·W3 = +1·(−1.5) + 0·1 + 1·1 = −.5
Y(S1) = BIN(−.5) = 0

Step n=1 – SOP and Output – S2
S2 = +1·b2 + X1·W2 + X2·W4 = +1·(−.5) + 0·1 + 1·1 = .5
Y(S2) = BIN(.5) = 1

Step n=1 – SOP and Output – S3
S3 = +1·b3 + Y(S1)·W5 + Y(S2)·W6 = +1·(−.5) + 0·(−2) + 1·1 = .5
Y(S3) = BIN(.5) = 1

Step n=1 – Output
Y(n) = Y(1) = Y(S3) = 1

Step n=1 – Predicted vs. Desired
Y(n) = Y(1) = 1
d(n) = d(1) = 1
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
Step n=2
• Parameters of step n=2:
η = .001
X(n) = X(2) = (+1, +1, +1, 0, 0)
W(n) = W(2) = W(1) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1)
d(n) = d(2) = 0

Step n=2 – SOP and Output – S1
S1 = +1·b1 + X1·W1 + X2·W3 = +1·(−1.5) + 0·1 + 0·1 = −1.5
Y(S1) = BIN(−1.5) = 0

Step n=2 – SOP and Output – S2
S2 = +1·b2 + X1·W2 + X2·W4 = +1·(−.5) + 0·1 + 0·1 = −.5
Y(S2) = BIN(−.5) = 0

Step n=2 – SOP and Output – S3
S3 = +1·b3 + Y(S1)·W5 + Y(S2)·W6 = +1·(−.5) + 0·(−2) + 0·1 = −.5
Y(S3) = BIN(−.5) = 0

Step n=2 – Output
Y(n) = Y(2) = Y(S3) = 0

Step n=2 – Predicted vs. Desired
Y(n) = Y(2) = 0
d(n) = d(2) = 0
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
Step n=3
• Parameters of step n=3:
η = .001
X(n) = X(3) = (+1, +1, +1, 1, 1)
W(n) = W(3) = W(2) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1)
d(n) = d(3) = 0

Step n=3 – SOP and Output – S1
S1 = +1·b1 + X1·W1 + X2·W3 = +1·(−1.5) + 1·1 + 1·1 = .5
Y(S1) = BIN(.5) = 1

Step n=3 – SOP and Output – S2
S2 = +1·b2 + X1·W2 + X2·W4 = +1·(−.5) + 1·1 + 1·1 = 1.5
Y(S2) = BIN(1.5) = 1

Step n=3 – SOP and Output – S3
S3 = +1·b3 + Y(S1)·W5 + Y(S2)·W6 = +1·(−.5) + 1·(−2) + 1·1 = −1.5
Y(S3) = BIN(−1.5) = 0

Step n=3 – Output
Y(n) = Y(3) = Y(S3) = 0

Step n=3 – Predicted vs. Desired
Y(n) = Y(3) = 0
d(n) = d(3) = 0
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
Final Weights
The current weights predicted the desired outputs for all four XOR samples, so no adaptation was needed at any step. The final weights are therefore the initial ones: b1 = −1.5, b2 = −.5, b3 = −.5, W1 = W2 = W3 = W4 = +1, W5 = −2, W6 = +1.
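The whole worked example can be checked end-to-end. This sketch re-runs the network with the slides' structure and final weights over all four XOR inputs (the function names are illustrative):

```python
def bin_act(s):
    # bin(s) = 1 if s >= 0, else 0
    return 1 if s >= 0 else 0

def predict(A, B):
    b1, b2, b3 = -1.5, -0.5, -0.5
    W1, W2, W3, W4, W5, W6 = 1, 1, 1, 1, -2, 1
    S1 = bin_act(b1 + A * W1 + B * W3)      # hidden neuron 1: fires only for (1,1)
    S2 = bin_act(b2 + A * W2 + B * W4)      # hidden neuron 2: fires for any input with a 1
    return bin_act(b3 + S1 * W5 + S2 * W6)  # output neuron combines them

# The network reproduces the XOR truth table
for (A, B), d in {(0, 1): 1, (1, 0): 1, (0, 0): 0, (1, 1): 0}.items():
    assert predict(A, B) == d
```

The hidden layer is what makes XOR solvable here: one hidden neuron behaves like AND, the other like OR, and the output neuron computes OR AND-NOT AND, which no single neuron could do.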
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.
 
psychiatric nursing HISTORY COLLECTION .docx
psychiatric  nursing HISTORY  COLLECTION  .docxpsychiatric  nursing HISTORY  COLLECTION  .docx
psychiatric nursing HISTORY COLLECTION .docx
 
Advance Mobile Application Development class 07
Advance Mobile Application Development class 07Advance Mobile Application Development class 07
Advance Mobile Application Development class 07
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
 
PROCESS RECORDING FORMAT.docx
PROCESS      RECORDING        FORMAT.docxPROCESS      RECORDING        FORMAT.docx
PROCESS RECORDING FORMAT.docx
 

Brief Introduction to Deep Learning + Solving XOR using ANNs

  • 1. Brief Introduction to Deep Learning + Solving XOR using ANN. Menoufia University, Faculty of Computers and Information. Ahmed Fawzy Gad, ahmed.fawzy@ci.menofia.edu.eg
  • 3. Neural Networks: Input, Hidden, and Output layers. XOR truth table (A, B -> output): (0, 1) -> 1; (1, 0) -> 1; (0, 0) -> 0; (1, 1) -> 0
  • 12. [Plots: the XOR decision region expressed as the sum of two linearly separable regions]
  • 13. Network Architecture!! [plots of the XOR points]
  • 14. [Plots: the same decomposition of the XOR region into two linearly separable regions]
  • 17-19. Input, Hidden, Output [the same plot repeated across three build-up slides]
  • 20-22. Input, Output. XOR truth table: (0, 1) -> 1; (1, 0) -> 1; (0, 0) -> 0; (1, 1) -> 0 [plot]
  • 23-27. Input, Output. Inputs A and B feed a single output neuron producing 1/0 [plot; XOR truth table]
  • 28. Inputs A and B with weights W1 and W2 feed an output neuron producing 1/0 [plot]
  • 29. [Plots: the XOR region again expressed as the sum of two linearly separable regions]
  • 32. Inputs A and B with weights W3 and W4 feed an output neuron producing 1/0 [plot]
  • 33. Two single-neuron networks side by side: A, B with W1, W2 -> 1/0, and A, B with W3, W4 -> 1/0 [plots]
  • 34. The outputs of the two neurons are combined through weights W5 and W6 [plot]
  • 35. A, B with W1, W2; A, B with W3, W4; combined via W5 and W6
  • 36. Merging the duplicated inputs: A feeds both hidden neurons through W1 and W2, B through W3 and W4, and the hidden outputs are weighted by W5 and W6
  • 37. The resulting network: A with W1, W2; B with W3, W4; hidden outputs combined via W5 and W6
  • 52. The full network: Input (A, B); Hidden layer with weights Wi (W1, W2, W3, W4); output neuron reached through W5 and W6, producing Yj = 1/0. XOR truth table as before.
  • 53-55. Activation Function [network diagram; XOR truth table]
  • 56. Activation Function Components [network diagram]
  • 57. Activation Function Inputs: s [network diagram]
  • 58. s = SOP(Xi, Wi), the sum of the input-weight products
  • 59. Xi = Inputs, Wi = Weights
  • 60. The inputs are X1 and X2
  • 61. S = sum over i = 1..m of Xi*Wi
  • 62. Each Hidden/Output layer neuron has its own SOP.
  • 63. S1 = X1*W1 + X2*W3
  • 64. S2 = X1*W2 + X2*W4
  • 65. S3 = S1*W5 + S2*W6
  • 66-67. Activation Function Outputs: F(s) produces the class label Yj
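The three SOPs on the slides above can be sketched directly in Python. This is a minimal sketch using the slide formulas as written (activation comes later in the deck); the input and weight values below are placeholders for illustration, not values from the deck:

```python
# Sum-of-products (SOP) for each hidden/output neuron, before bias is added.
# X1, X2 are the two inputs; W1..W6 are placeholder weight values.
X1, X2 = 1, 0
W1, W2, W3, W4, W5, W6 = 1, 1, 1, 1, -2, 1

S1 = X1 * W1 + X2 * W3   # SOP of the first hidden neuron
S2 = X1 * W2 + X2 * W4   # SOP of the second hidden neuron
S3 = S1 * W5 + S2 * W6   # SOP of the output neuron
```

Each neuron in the hidden and output layers computes its own SOP over the signals reaching it, which is all the `S1`, `S2`, `S3` lines express.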
  • 69. Which activation function to use? The outputs are class labels; there are TWO class labels, so we need TWO outputs: use an activation function that gives two outputs.
  • 71. The bin (binary threshold) activation is used: F(s) = bin(s) [network diagram; XOR truth table]
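The bin activation named on the slide can be written straight from its definition later in the deck (1 when the SOP is non-negative, 0 otherwise); the function name here is mine, not from the slides:

```python
def bin_activation(s):
    """Binary threshold activation: 1 if the SOP s is non-negative, else 0."""
    return 1 if s >= 0 else 0
```

With two possible outputs (1 and 0) it matches the two class labels of the XOR problem.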
  • 72. Bias [network diagram with bin activation]
  • 73. Bias for the Hidden Layer Neurons: constant +1 inputs with weights b1 and b2
  • 74. Bias for the Output Layer Neuron: a constant +1 input with weight b3
  • 75. All Bias Values: b1, b2, b3
  • 76. Add the bias to each SOP: S1 = X1*W1 + X2*W3; S2 = X1*W2 + X2*W4; S3 = S1*W5 + S2*W6
  • 77. S1 = +1*b1 + X1*W1 + X2*W3
  • 78. S2 = +1*b2 + X1*W2 + X2*W4
  • 79. S3 = +1*b3 + S1*W5 + S2*W6
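With the +1 bias inputs added, the three SOPs can be sketched as below. The numeric values are the ones that appear in the training example later in the deck; note that in the worked steps the hidden SOPs S1 and S2 are passed through the bin activation before feeding the output neuron, which is made explicit here:

```python
def bin_activation(s):
    # 1 if s >= 0, else 0 (the deck's bin activation)
    return 1 if s >= 0 else 0

X1, X2 = 1, 0                              # one input pattern
b1, b2, b3 = -1.5, -0.5, -0.5              # bias weights on the +1 inputs
W1, W2, W3, W4, W5, W6 = 1, 1, 1, 1, -2, 1

S1 = 1 * b1 + X1 * W1 + X2 * W3            # -1.5 + 1 + 0 = -0.5
S2 = 1 * b2 + X1 * W2 + X2 * W4            # -0.5 + 1 + 0 =  0.5
# The activated hidden outputs, not the raw SOPs, reach the output neuron:
S3 = 1 * b3 + bin_activation(S1) * W5 + bin_activation(S2) * W6
```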
  • 80. Learning Rate: 0 <= η <= 1
  • 81. Other Parameters, Step n: n = 0, 1, 2, ...; X(n) = (X0, X1, X2, ...); W(n) = (W0, W1, W2, ...); s = X0*W0 + X1*W1 + X2*W2 + ...
  • 82. Other Parameters, Desired Output dj: d(n) = 1 if x(n) belongs to C1 (class 1), 0 if x(n) belongs to C2 (class 0)
  • 83. Neural Networks Training Steps: 1. Weights Initialization; 2. Inputs Application; 3. Sum of Inputs-Weights Products; 4. Activation Function Response Calculation; 5. Weights Adaptation; 6. Back to Step 2
  • 84. Regarding the 5th Step, Weights Adaptation: if the predicted output Y is not the same as the desired output d, the weights are adapted according to W(n + 1) = W(n) + η(d(n) − Y(n))X(n), where W(n) = [b(n), W1(n), W2(n), W3(n), ..., Wm(n)]
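The training steps and the adaptation rule W(n+1) = W(n) + η(d(n) − Y(n))X(n) can be sketched for a single neuron as below. This is a sketch of the rule as stated on the slide, applied to one neuron; the data and initial weights are placeholders, and the convention (as on the slides) is that X carries the constant +1 bias input:

```python
def bin_activation(s):
    # 1 if s >= 0, else 0
    return 1 if s >= 0 else 0

def train_step(W, X, d, eta):
    """One pass of steps 2-5: apply input, SOP, activation, adapt on error."""
    s = sum(x * w for x, w in zip(X, W))                    # step 3: SOP
    y = bin_activation(s)                                   # step 4: activation
    if y != d:                                              # step 5: adapt only when Y != d
        W = [w + eta * (d - y) * x for x, w in zip(X, W)]
    return W, y

# Placeholder example: X[0] is the +1 bias input, W[0] its bias weight.
W = [-0.5, 1.0, 1.0]
W, y = train_step(W, [1, 0, 0], d=0, eta=0.001)             # correct already: no change
```

Step 6 is just repeating `train_step` over the input patterns until no more adaptation occurs.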
  • 85. Neural Networks Training Example, Step n=0. In each step of the solution, the parameters of the neural network must be known. Parameters of step n=0: η = .001; X(n) = X(0) = (+1, +1, +1, 1, 0); W(n) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1); d(n) = d(0) = 1
  • 86. Step n=0 [network diagram with the numeric weights: biases −1.5, −.5, −.5; input weights +1; hidden-to-output weights −2, +1]
  • 87. Step n=0, SOP S1: S1 = +1*b1 + X1*W1 + X2*W3 = 1*(−1.5) + 1*1 + 0*1 = −.5
  • 88. Step n=0, Output of S1: Y(S1) = BIN(S1) = BIN(−.5) = 0, where bin(s) = 1 if s >= 0, 0 if s < 0
  • 89. Step n=0, SOP S2: S2 = +1*b2 + X1*W2 + X2*W4 = 1*(−.5) + 1*1 + 0*1 = .5
  • 90. Step n=0, Output of S2: Y(S2) = BIN(S2) = BIN(.5) = 1
  • 91. Step n=0, SOP S3: S3 = +1*b3 + S1*W5 + S2*W6 = 1*(−.5) + 0*(−2) + 1*1 = .5 (using the activated hidden outputs 0 and 1)
  • 92. Step n=0, Output of S3: Y(S3) = BIN(S3) = BIN(.5) = 1
  • 93. Step n=0, Output: Y(n) = Y(0) = Y(S3) = 1
  • 94. Step n=0, Predicted vs. Desired: Y(n) = Y(0) = 1 and d(n) = d(0) = 1. Since Y(n) = d(n), the weights are correct; no adaptation.
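The step n=0 computation above can be checked directly with the values from the slides (biases −1.5, −.5, −.5; weights 1, 1, 1, 1, −2, 1; input A=1, B=0):

```python
def bin_activation(s):
    # 1 if s >= 0, else 0
    return 1 if s >= 0 else 0

b1, b2, b3 = -1.5, -0.5, -0.5
W1, W2, W3, W4, W5, W6 = 1, 1, 1, 1, -2, 1
X1, X2 = 1, 0                      # input pattern of step n=0

S1 = 1 * b1 + X1 * W1 + X2 * W3    # -0.5, so bin gives 0
S2 = 1 * b2 + X1 * W2 + X2 * W4    #  0.5, so bin gives 1
S3 = 1 * b3 + bin_activation(S1) * W5 + bin_activation(S2) * W6   # 0.5
Y = bin_activation(S3)             # 1, matching the desired output d(0) = 1
```

Steps n=1, 2, 3 follow the same pattern with the remaining three input patterns.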
  • 95. Neural Networks Training Example, Step n=1. Parameters of step n=1: η = .001; X(n) = X(1) = (+1, +1, +1, 0, 1); W(n) = W(1) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1); d(n) = d(1) = 1
  • 96. Step n=1 [network diagram with the same numeric weights]
  • 97. Step n=1, SOP S1: S1 = +1*b1 + X1*W1 + X2*W3 = 1*(−1.5) + 0*1 + 1*1 = −.5
  • 98. Step n=1, Output of S1: Y(S1) = BIN(−.5) = 0
  • 99. Step n=1, SOP S2: S2 = +1*b2 + X1*W2 + X2*W4 = 1*(−.5) + 0*1 + 1*1 = .5
  • 100. Step n=1, Output of S2: Y(S2) = BIN(.5) = 1
  • 101. Step n=1, SOP S3: S3 = +1*b3 + S1*W5 + S2*W6 = 1*(−.5) + 0*(−2) + 1*1 = .5
  • 102. Step n=1, Output of S3: Y(S3) = BIN(.5) = 1
  • 103. Step n=1, Output: Y(n) = Y(1) = Y(S3) = 1
  • 104. Step n=1, Predicted vs. Desired: Y(n) = Y(1) = 1 and d(n) = d(1) = 1. Since Y(n) = d(n), the weights are correct; no adaptation.
  • 105. Neural Networks Training Example, Step n=2. Parameters of step n=2: η = .001; X(n) = X(2) = (+1, +1, +1, 0, 0); W(n) = W(2) = W(1) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1); d(n) = d(2) = 0
  • 106. Step n=2 [network diagram with the same numeric weights]
  • 107. Step n=2, SOP S1: S1 = +1*b1 + X1*W1 + X2*W3 = 1*(−1.5) + 0*1 + 0*1 = −1.5
  • 108. Step n=2, Output of S1: Y(S1) = BIN(−1.5) = 0
  • 109. Step n=2, SOP S2: S2 = +1*b2 + X1*W2 + X2*W4 = 1*(−.5) + 0*1 + 0*1 = −.5
  • 110. Step n=2, Output of S2: Y(S2) = BIN(−.5) = 0
  • 111. Step n=2, SOP S3: S3 = +1*b3 + S1*W5 + S2*W6 = 1*(−.5) + 0*(−2) + 0*1 = −.5
  • 112. Step n=2, Output of S3: Y(S3) = BIN(−.5) = 0
  • 113. Step n=2, Output: Y(n) = Y(2) = Y(S3) = 0
  • 114. Step n=2, Predicted vs. Desired: Y(n) = Y(2) = 0 and d(n) = d(2) = 0. Since Y(n) = d(n), the weights are correct; no adaptation.
  • 115. Neural Networks Training Example, Step n=3. Parameters of step n=3: η = .001; X(n) = X(3) = (+1, +1, +1, 1, 1); W(n) = W(3) = W(2) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1); d(n) = d(3) = 0
  • 116. Step n=3 [network diagram with the same numeric weights]
  • 117. Step n=3, SOP S1: S1 = +1*b1 + X1*W1 + X2*W3 = 1*(−1.5) + 1*1 + 1*1 = .5
  • 118. Step n=3, Output of S1: Y(S1) = BIN(.5) = 1
  • 119. Step n=3, SOP S2: S2 = +1*b2 + X1*W2 + X2*W4 = 1*(−.5) + 1*1 + 1*1 = 1.5
  • 120. Step n=3, Output of S2: Y(S2) = BIN(1.5) = 1
  • 121. Step n=3, SOP S3: S3 = +1*b3 + S1*W5 + S2*W6 = 1*(−.5) + 1*(−2) + 1*1 = −1.5
  • 122. Step n=3, Output of S3: Y(S3) = BIN(−1.5) = 0
  • 123. Step n=3, Output: Y(n) = Y(3) = Y(S3) = 0
  • 124. Step n=3, Predicted vs. Desired: Y(n) = Y(3) = 0 and d(n) = d(3) = 0. Since Y(n) = d(n), the weights are correct; no adaptation.
  • 125. Final Weights: the current weights predicted the desired outputs for all four XOR input patterns, so no adaptation was needed at any step. [network diagram with the final weights]
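As a final check, a forward pass with the deck's final weights reproduces the XOR truth table for all four input patterns:

```python
def bin_activation(s):
    # 1 if s >= 0, else 0
    return 1 if s >= 0 else 0

def forward(A, B):
    """Forward pass of the 2-2-1 network with the deck's final weights."""
    b1, b2, b3 = -1.5, -0.5, -0.5
    W1, W2, W3, W4, W5, W6 = 1, 1, 1, 1, -2, 1
    h1 = bin_activation(1 * b1 + A * W1 + B * W3)  # fires only when A = B = 1 (AND-like)
    h2 = bin_activation(1 * b2 + A * W2 + B * W4)  # fires when A or B is 1 (OR-like)
    return bin_activation(1 * b3 + h1 * W5 + h2 * W6)

xor_table = {(0, 1): 1, (1, 0): 1, (0, 0): 0, (1, 1): 0}
results = {inputs: forward(*inputs) for inputs in xor_table}
```

The hidden layer realizes XOR as "OR but not AND", which is exactly the decomposition of the XOR region into two linearly separable regions shown earlier in the deck.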