1
Machine Learning by
using Python
Neural Network
Lesson 2
By: Professor Lili Saghafi
proflilisaghafi@gmail.com
@Lili_PLS
2
Overview
• Machine learning is a kind of programming that gives
computers the ability to learn automatically from data
without being explicitly programmed.
• In other words, these programs change
their behavior by learning from data.
• In this course we will cover various aspects of machine
learning.
• Everything will be related to Python, so it is Machine
Learning by using Python.
• What is the best programming language for machine
learning?
• Python is clearly one of the top players!
3
Neural Networks
• When we say "neural networks", we mean artificial
neural networks (ANN). The idea of an ANN is based on
biological neural networks like the brain.
• The basic building block of a neural network is the
neuron. A neuron in biology consists of three major parts:
the soma (cell body), the dendrites, and the axon.
• The dendrites branch off from the soma in a tree-like way,
getting thinner with every branch. They receive
signals (impulses) from other neurons at synapses. The
axon (there is always only one) also leaves the soma
and usually extends for longer distances than the
dendrites. The axon is used for sending the output of the
neuron to other neurons or, more precisely, to the
synapses of other neurons.
4
Neural Networks
5
The following image by Quasar
Jarosz, courtesy of Wikipedia,
illustrates this:
6
Even though this image is already an abstraction for a
biologist, we can abstract it further:
7
A perceptron in an artificial neural
network simulates a biological
neuron.
8
Neural Networks
• What is going on inside the body of a perceptron or
neuron is amazingly simple. The input signals get
multiplied by weight values, i.e. each input has its
corresponding weight. This way the input can be
adjusted individually for every input xi. We can see all
the inputs as an input vector and the corresponding
weights as the weights vector.
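• As a minimal sketch (the numbers below are made up for illustration), the input vector, the weight vector and the per-input weighting can be written with NumPy like this:

```python
import numpy as np

# Hypothetical example values for a neuron with three inputs
x = np.array([0.5, 1.2, -0.3])   # input vector (x1, x2, x3)
w = np.array([0.8, -0.4, 0.2])   # weight vector (w1, w2, w3)

# Each input is adjusted individually by its own weight
weighted_inputs = w * x           # element-wise product
print(weighted_inputs)            # [ 0.4  -0.48 -0.06]
```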
9
Neural Networks
• When a signal comes in, it gets multiplied by the weight
value that is assigned to this particular input. That is, if a
neuron has three inputs, then it has three weights that
can be adjusted individually. The weights usually get
adjusted during the learning phase.
After this, the weighted input signals are summed up. It is
also possible to add a so-called bias b to this
sum. The bias is a value which can also be adjusted
during the learning phase.
• Finally, the actual output has to be determined. For this
purpose an activation or step function Φ is applied to the
weighted sum of the input values.
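• Continuing the sketch above with an assumed bias value, the weighted sum plus bias is just a dot product plus a scalar:

```python
import numpy as np

x = np.array([0.5, 1.2, -0.3])   # example inputs
w = np.array([0.8, -0.4, 0.2])   # example weights
b = 0.1                          # example bias, adjustable during the learning phase

z = np.dot(w, x) + b             # weighted sum of the inputs plus the bias
print(z)                         # -0.04 (up to floating-point rounding)
```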
10
The actual output has to be determined.
For this purpose, an activation or step
function Φ is applied to the weighted sum
of the input values.
11
The simplest form of an activation
function is a binary function
• The simplest form of
an activation function
is a binary function. If
the result of the
summation is greater
than some threshold
s, the result of Φ will
be 1, otherwise 0.
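• A minimal sketch of such a binary step function with an assumed threshold s, applied to the weighted sum from the earlier example:

```python
import numpy as np

def phi(z, s=0.0):
    """Binary step activation: 1 if z is greater than the threshold s, otherwise 0."""
    return 1 if z > s else 0

x = np.array([0.5, 1.2, -0.3])
w = np.array([0.8, -0.4, 0.2])
b = 0.1

print(phi(np.dot(w, x) + b))     # the weighted sum is about -0.04, so the output is 0
```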
12
A Simple Neural Network
• The following image shows the general
building principle of a simple artificial
neural network:
13
• We will write a very
simple Neural
Network
implementing the
logical "And" and "Or"
functions.
• Let's start with the
"And" function. It is
defined for two inputs:
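• The truth table of the logical "And", together with one possible perceptron that realizes it (the weights and the threshold below are an illustrative choice, not necessarily the ones used in the slides):

```python
import numpy as np

# Truth table of the logical "And" for two inputs:
#   x1 x2 | And
#    0  0 |  0
#    0  1 |  0
#    1  0 |  0
#    1  1 |  1

def and_perceptron(x1, x2):
    """One possible perceptron for "And": both weights 0.5, threshold 0.7."""
    weights = np.array([0.5, 0.5])
    weighted_sum = np.dot(weights, np.array([x1, x2]))
    return 1 if weighted_sum > 0.7 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", and_perceptron(x1, x2))
```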
14
Line Separation
• You could imagine that you have two attributes
describing an edible object like a fruit, for
example: "sweetness" and "sourness".
• We could describe this by points in a two-
dimensional space: the x-axis for the sweetness
and the y-axis for the sourness. Imagine now that
we have two fruits as points in this space, e.g. an
orange at position (3.5, 1.8) and a lemon at (1.1,
3.9).
15
Line Separation
• We could draw dividing lines to separate the
points which are more lemon-like from those which
are more orange-like. The following program
calculates and renders a bunch of such lines. The red
ones are completely unusable for this purpose,
because they do not separate the classes.
Yet, it is obvious that not all of the green ones are
useful either.
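• The original program is shown only as an image in the slides; the following is a minimal sketch in the same spirit, with made-up candidate lines:

```python
import numpy as np
import matplotlib.pyplot as plt

# The two fruits as points in (sweetness, sourness) space
orange = (3.5, 1.8)
lemon = (1.1, 3.9)

plt.scatter(*orange, color="orange", label="orange")
plt.scatter(*lemon, color="yellow", edgecolors="black", label="lemon")

X = np.linspace(0, 5, 100)

# A few candidate dividing lines y = m*x + c (slopes and intercepts are made up)
candidates = [(-1.0, 5.2), (-0.8, 4.0), (0.1, 4.5), (0.4, 0.5), (1.5, -1.0)]
for m, c in candidates:
    # A line separates the two fruits if they lie on opposite sides of it
    separates = (orange[1] - (m * orange[0] + c)) * (lemon[1] - (m * lemon[0] + c)) < 0
    plt.plot(X, m * X + c, color="green" if separates else "red")

plt.xlabel("sweetness")
plt.ylabel("sourness")
plt.legend()
plt.show()
```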
16
Line Separation
17
Line Separation
18
Line Separation
19
Neural Network Using Python
and Numpy
20
Neural Network Using Python
and Numpy
21
Neural Network Using Python
and Numpy
• We have introduced the basic ideas about
neural networks in the previous chapter of our
tutorial.
• We pointed out the similarity between neurons
and neural networks in biology. We also
introduced very small artificial neural networks
and introduced decision boundaries and the
XOR problem.
• The focus in our previous chapter had not been
on efficiency.
22
Neural Network Using Python
and Numpy
• We will introduce a Neural Network class in Python in
this chapter, which will use the powerful and efficient
data structures of Numpy. This way, we get a more
efficient network than in our previous chapter. When we
say "more efficient", we do not mean that the artificial
neural networks encountered in this lesson are efficient
and ready for real life usage.
• They are still quite slow compared to implementations
from sklearn for example. The focus is to implement a
very basic neural network and by doing this explaining
the basic ideas.
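• For comparison, this is roughly what using a ready-made implementation from sklearn looks like; the tiny "And" dataset and the parameters below are made up for illustration and are not part of the original slides:

```python
from sklearn.neural_network import MLPClassifier

# A tiny illustrative dataset: the logical "And" function
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 0, 1]

clf = MLPClassifier(hidden_layer_sizes=(4,), max_iter=2000, random_state=1)
clf.fit(X, y)
print(clf.predict(X))
```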
23
Neural Network Using Python
and Numpy
• Ideas like how the signal flow inside a
network works, how to implement weights,
how to initialize weight matrices, and what
activation functions can be used.
• We will start with a simple neural network
consisting of three layers, i.e. the input
layer, a hidden layer and an output layer.
24
A Simple Artificial Neural
Network Structure
• You can see a simple neural network structure in
the following diagram. We have an input layer
with three nodes i1, i2, i3. These nodes get
the corresponding input values x1, x2, x3.
The middle or hidden layer has four
nodes h1, h2, h3, h4.
• The input of this layer stems from the input layer.
We will discuss the mechanism soon. Finally,
our output layer consists of the two nodes o1, o2.
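• A sketch of the weight matrices this 3-4-2 structure implies (random initialization is just one common choice, not necessarily the one used later in the course):

```python
import numpy as np

n_input, n_hidden, n_output = 3, 4, 2

# One weight matrix between the input and the hidden layer, and one
# between the hidden and the output layer
weights_in_hidden = np.random.randn(n_hidden, n_input)    # shape (4, 3)
weights_hidden_out = np.random.randn(n_output, n_hidden)  # shape (2, 4)

print(weights_in_hidden.shape, weights_hidden_out.shape)
```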
25
Neural Network Using Python
and Numpy
• We have to note that some would call this
a two-layer network, because they don't
count the input layer as a layer.
26
Neural Network Using Python
and Numpy
• The input layer consists of the
nodes i1, i2 and i3. In principle the
input is a one-dimensional vector, like (2,
4, 11). A one-dimensional vector is
represented in numpy like this:
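• A minimal sketch of this representation, using the values (2, 4, 11) from the text:

```python
import numpy as np

input_vector = np.array([2, 4, 11])
print(input_vector)          # [ 2  4 11]
print(input_vector.shape)    # (3,)  -> a one-dimensional array
```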
27
A Simple Artificial Neural
Network Structure
28
A Simple Artificial Neural
Network Structure
• In the algorithm, which we will write later,
we will have to transpose it into a column
vector, i.e. a two-dimensional array with
just one column:
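• One way to do this in NumPy (a sketch; the original slides show this step only as an image):

```python
import numpy as np

input_vector = np.array([2, 4, 11])

# Turn the flat vector into a two-dimensional array with just one column
column_vector = np.array(input_vector, ndmin=2).T
print(column_vector.shape)   # (3, 1)
print(column_vector)
```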
29
A Simple Artificial Neural
Network Structure
30
Backpropagation in Neural
Networks
31
Backpropagation in Neural
Networks
32
Backpropagation in Neural
Networks
• We have already written Neural Networks
in Python in the previous chapters of our
tutorial. We could train these networks, but
we didn't explain the mechanism used for
training. We used backpropagation without
saying so. Backpropagation is a
commonly used method for training
artificial neural networks, especially deep
neural networks.
33
Backpropagation in Neural
Networks
• Backpropagation is needed to calculate
the gradient, which we need to adapt the
weights of the weight matrices. The weights
of the neurons (nodes) of our network are
adjusted by calculating the gradient of the
loss function. For this purpose a gradient
descent optimization algorithm is used.
Backpropagation is also called backward
propagation of errors.
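• The core update step looks like the following sketch; the weight matrix, the gradient and the learning rate below are made up, and in a real network the gradient comes from backpropagation:

```python
import numpy as np

# Made-up example: one weight matrix W and the gradient of the loss
# with respect to W
W = np.array([[0.2, -0.5, 0.1],
              [0.7,  0.3, -0.2]])
grad_W = np.array([[ 0.05, -0.02, 0.00],
                   [-0.10,  0.04, 0.01]])

learning_rate = 0.1
W = W - learning_rate * grad_W   # step against the gradient to reduce the loss
print(W)
```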
34
Backpropagation in Neural
Networks
• Many articles and tutorials start explaining
gradient descent with mountains.
Imagine you are put on a mountain, not
necessarily at the top, by a helicopter at
night or in heavy fog. Let's further imagine
that this mountain is on an island and you
want to reach sea level. You have to go
down, but you can hardly see anything, maybe
just a few metres.
35
Backpropagation in Neural
Networks
• Your task is to find your way down, but you
cannot see the path. You can use the method of
gradient descent. This means that you are
examining the steepness at your current
position. You will proceed in the direction with
the steepest descent. You take only a few steps
and then you stop again to reorient yourself.
This means you apply the previously described
procedure again, i.e. you look for the direction of
steepest descent.
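• As a sketch of this procedure on a made-up one-dimensional "landscape" (the function, starting position and step size are purely illustrative):

```python
# Gradient descent on a toy landscape f(x) = (x - 2)**2 + 1
# with derivative f'(x) = 2 * (x - 2)

def gradient(x):
    return 2 * (x - 2)

x = -3.0           # arbitrary starting position on the "mountain"
step_size = 0.1    # how far we walk before stopping to reorient

for _ in range(50):
    x = x - step_size * gradient(x)   # walk in the direction of steepest descent

print(x)           # close to the minimum at x = 2
```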
36
Backpropagation in Neural
Networks
• This procedure is depicted in the following
diagram in a two-dimensional space.
37
Backpropagation in Neural
Networks
38
Backpropagation in Neural
Networks
• Going on like this, you will arrive at a position where
there is no further descent.
• Every direction goes upwards. You may have reached
the deepest level, the global minimum, but you might
as well be stuck in a basin. If you start at the position on
the right side of our image, everything works out fine, but
from the left side, you will be stuck in a local minimum. If
you now imagine, not very realistically, that you are dropped
many times at random places on this island, you will find
ways down to sea level. This is what we actually
do when we train a neural network.
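• A sketch of this "dropped many times at random places" idea, on a made-up one-dimensional landscape that has both a global and a local minimum (all numbers are illustrative):

```python
import numpy as np

def f(x):
    """Toy landscape with a deeper (global) and a shallower (local) valley."""
    return (x**2 - 1)**2 + 0.3 * x

def grad(x):
    return 4 * x * (x**2 - 1) + 0.3

rng = np.random.default_rng(0)
best_x, best_f = None, float("inf")

# "Drop" the hiker at several random places and descend from each one
for start in rng.uniform(-2, 2, size=10):
    x = start
    for _ in range(500):
        x = x - 0.01 * grad(x)
    if f(x) < best_f:
        best_x, best_f = x, f(x)

print(best_x, best_f)   # the best run should end up near the deeper valley
```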
39
Machine Learning by
using Python
Neural Network
Lesson 2
By: Professor Lili Saghafi