Data Mining
Classification
Prithwis Mukerjee, Ph.D.
Classification
Definition
- The separation or ordering of objects (or things) in classes
A Priori Classification
- When the classification is done before you have looked at the data
Post Priori Classification
- When the classification is done after you have looked at the data
General approach
You decide on the classes without looking at the data
- For example : high risk, medium risk, low risk classes
You “train” the system
- Take a small set of objects – the training set
- Each object has a set of attributes
- Classify the objects in this small (“training”) set into the three classes, without looking at the attributes
- You will need human expertise here to classify the objects
- Now find a set of rules based on the attributes such that the system classifies the objects just as you have done without looking at the attributes
Use these rules to classify the full set of objects
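As a minimal illustration of this train-then-classify workflow, a sketch in Python is given below; the use of scikit-learn is an assumption for illustration only, since the slides do not name any tool.

```python
# Sketch only: scikit-learn is an assumption, not something the slides prescribe.
from sklearn.tree import DecisionTreeClassifier

# A tiny training set: attribute vectors (Eggs, Pouch, Flies, Feathers as 0/1)
# plus class labels assigned by a human expert.
X_train = [
    [1, 0, 1, 1],   # lays eggs, no pouch, flies, has feathers
    [0, 1, 0, 0],   # no eggs, has pouch, does not fly, no feathers
    [0, 0, 0, 0],   # no eggs, no pouch, does not fly, no feathers
]
y_train = ["Bird", "Marsupial", "Mammal"]

# "Train" the system: derive rules over the attributes that reproduce
# the expert's classification of the training set.
model = DecisionTreeClassifier(criterion="entropy")
model.fit(X_train, y_train)

# Use the learned rules to classify objects outside the training set.
print(model.predict([[1, 0, 0, 1]]))   # a new, unlabelled object
```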
If we have this data ...

Name       Eggs  Pouch  Flies  Feathers  Class
Cockatoo   Yes   No     Yes    Yes       Bird
Dugong     No    No     No     No        Mammal
Echidna    Yes   Yes    No     No        Marsupial
Emu        Yes   No     No     Yes       Bird
Kangaroo   No    Yes    No     No        Marsupial
Koala      No    Yes    No     No        Marsupial
Kokkabura  Yes   No     Yes    Yes       Bird
Owl        Yes   No     Yes    Yes       Bird
Penguin    Yes   No     No     Yes       Bird
Platypus   Yes   No     No     No        Mammal
Possum     No    Yes    No     No        Marsupial
Wombat     No    Yes    No     No        Marsupial
We need to build a decision tree like ....

Pouch ?
  YES -> Marsupial
  NO  -> Feathers ?
           YES -> Bird
           NO  -> Mammal
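Equivalently, the tree above can be written directly as code; a minimal sketch (the function name is illustrative):

```python
def classify_animal(pouch: bool, feathers: bool) -> str:
    """The decision tree sketched above: POUCH first, then FEATHERS."""
    if pouch:            # Pouch? YES
        return "Marsupial"
    if feathers:         # Pouch? NO, Feathers? YES
        return "Bird"
    return "Mammal"      # Pouch? NO, Feathers? NO

print(classify_animal(pouch=True, feathers=False))    # Kangaroo -> Marsupial
print(classify_animal(pouch=False, feathers=True))    # Owl      -> Bird
```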
Question is ...
Why did we ignore two attributes ?
- Eggs ?
- Flies ?
Why did we use the attribute called POUCH first ?
- And only then the attribute called FEATHERS ?
A rigorous classification process should tell us
- If there are lots of attributes to be looked at, which are the important ones ?
- In which order should we look at the attributes ?
So that the classification arrived at is very similar to the classification done with the training set
Decision Tree : Tree Induction Algorithm
Step 1 : Place all members into one node
- If all members belong to the same class
  - Stop : there is nothing to be done
Step 2 : Else
- Choose one attribute and, based on its value, split the node into two nodes
- For each of the two nodes
  - If all members belong to the same class : Stop
  - Else : recursively go to Step 1
Big question : How do you choose which attribute to split a node on ?
- Information Theory
- GINI Index
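A compact sketch of this recursive induction loop, assuming the records are Python dictionaries with a 'Class' key; the attribute-selection rule is passed in as a function, since how to choose it (information theory or the GINI index) is exactly the question taken up next:

```python
from collections import Counter

def induce_tree(rows, attributes, choose_attribute):
    """rows: list of dicts, each with a 'Class' key plus attribute values.
    choose_attribute(rows, attributes) picks the attribute to split on,
    e.g. by information gain or the GINI index."""
    classes = {row["Class"] for row in rows}
    if len(classes) == 1:                 # Step 1: all in one class -> stop
        return classes.pop()              # leaf node
    if not attributes:                    # nothing left to split on
        return Counter(r["Class"] for r in rows).most_common(1)[0][0]
    attr = choose_attribute(rows, attributes)          # Step 2: pick a split
    rest = [a for a in attributes if a != attr]
    node = {attr: {}}
    for value in {row[attr] for row in rows}:          # one branch per value
        subset = [row for row in rows if row[attr] == value]
        node[attr][value] = induce_tree(subset, rest, choose_attribute)
    return node
```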
Information Theory : Recapitulate
Information Content I
- Of an event E
- That has n possible outcomes
- Where outcome i happens with probability p_i
- Is defined as I = Σ_i ( -p_i log2 p_i )
Example :
- Event E_A has two possible outcomes
  - P1 = 1 , P2 = 0 : outcome 1 is a certainty
  - I = 0 because there is NO information in the outcome
- Event E_B has two possible outcomes
  - P1 = 0.5 , P2 = 0.5 : both outcomes are equally likely
  - I = -0.5 log2(0.5) – 0.5 log2(0.5) = 1
  - This is the maximum possible information for an event with two outcomes
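A small helper, assuming Python, that computes this quantity and reproduces the two examples:

```python
from math import log2

def information(probs):
    """I = sum over i of ( -p_i * log2(p_i) ); zero-probability terms contribute 0."""
    return sum(-p * log2(p) for p in probs if p > 0)

print(information([1.0, 0.0]))   # 0.0 -> a certain outcome carries no information
print(information([0.5, 0.5]))   # 1.0 -> the maximum for an event with two outcomes
```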
Information in the roll of a die
Fair die
- All numbers 1 – 6 equally probable ( p_i = 1/6 )
- I = 6 x (-1/6) log2(1/6) = 2.585
Loaded die, case 1
- P6 = 0.5 ; P1 = P2 = P3 = P4 = P5 = 0.1
- I = 5 x (-0.1) log2(0.1) – 0.5 x log2(0.5) = 2.16
Loaded die, case 2
- P6 = 0.75 ; P1 = P2 = P3 = P4 = P5 = 0.05
- I = 5 x (-0.05) log2(0.05) – 0.75 x log2(0.75) = 1.39
Point to note ...
- We can change the information in the roll of the die by changing the probabilities of the various outcomes !
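The dice figures can be checked the same way (a short, self-contained sketch in Python):

```python
from math import log2

def information(probs):
    return sum(-p * log2(p) for p in probs if p > 0)

print(round(information([1/6] * 6), 3))              # 2.585 : fair die
print(round(information([0.1] * 5 + [0.5]), 2))      # 2.16  : loaded, P6 = 0.5
print(round(information([0.05] * 5 + [0.75]), 2))    # 1.39  : loaded, P6 = 0.75
```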
How do we change the information ?
In a die
- We make mechanical modifications so that the probability of each outcome changes
- This is highly illegal
In a set of individuals
- We regroup the individuals into classes so that the probability of each class changes
- This is perfectly legal in our algorithm
Consider the following scenario ..
Probability of each outcome ( or class )
- P(A) = 3/10 , P(B) = 3/10 , P(C) = 4/10
Total Information Content of Set S
- -(3/10) log2(3/10) – (3/10) log2(3/10) – (4/10) log2(4/10) = 1.57

ID  Home  Married  Gender  Employed  Credit  Class
1   Yes   Yes      Male    Yes       A       B
2   No    No       Female  Yes       A       A
3   Yes   Yes      Female  Yes       B       C
4   Yes   No       Male    No        B       B
5   No    Yes      Female  Yes       B       C
6   No    No       Female  Yes       B       A
7   No    No       Male    No        B       B
8   Yes   No       Female  Yes       A       A
9   No    Yes      Female  Yes       A       C
10  Yes   Yes      Female  Yes       A       C
Suppose we split this set on HOME
I1 : Information in set S1 ( Home = No )
- P1(A) = 2/5 , P1(B) = 1/5 , P1(C) = 2/5
- -(2/5) log2(2/5) – (1/5) log2(1/5) – (2/5) log2(2/5) = 1.52
I2 : Information in set S2 ( Home = Yes )
- P2(A) = 1/5 , P2(B) = 2/5 , P2(C) = 2/5
- -(1/5) log2(1/5) – (2/5) log2(2/5) – (2/5) log2(2/5) = 1.52
Total Information in S1 and S2
- 0.5 I1 + 0.5 I2 = 0.5 x 1.52 + 0.5 x 1.52 = 1.52

S1 ( Home = No )
ID  Home  Married  Gender  Employed  Credit  Class
2   No    No       Female  Yes       A       A
5   No    Yes      Female  Yes       B       C
6   No    No       Female  Yes       B       A
7   No    No       Male    No        B       B
9   No    Yes      Female  Yes       A       C

S2 ( Home = Yes )
ID  Home  Married  Gender  Employed  Credit  Class
1   Yes   Yes      Male    Yes       A       B
3   Yes   Yes      Female  Yes       B       C
4   Yes   No       Male    No        B       B
8   Yes   No       Female  Yes       A       A
10  Yes   Yes      Female  Yes       A       C
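These numbers can be verified with a short sketch; only the Home and Class columns are needed:

```python
from math import log2
from collections import Counter

def information(labels):
    """Information content of a list of class labels."""
    n = len(labels)
    return sum(-c / n * log2(c / n) for c in Counter(labels).values())

home = ["Yes", "No", "Yes", "Yes", "No", "No", "No", "Yes", "No", "Yes"]  # IDs 1..10
cls  = ["B",   "A",  "C",   "B",   "C",  "A",  "B",  "A",   "C",  "C"]

s1 = [c for h, c in zip(home, cls) if h == "No"]    # S1 : Home = No
s2 = [c for h, c in zip(home, cls) if h == "Yes"]   # S2 : Home = Yes
i_s     = information(cls)                                            # 1.57
i_split = len(s1) / 10 * information(s1) + len(s2) / 10 * information(s2)

print(round(i_s, 2), round(i_split, 2), round(i_s - i_split, 2))      # 1.57 1.52 0.05
```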
Impact of HOME attribute
Within each of the sets S1 and S2 the attribute HOME has a single value
But in the full set S the attribute HOME varies, so it is of some significance
What is the significance of the HOME attribute ?
Knowing the HOME attribute reduces the information still needed to pin down the class
- FROM : 1.57 ( in S )
- TO : 1.52 ( weighted over S1 and S2 )
So the HOME attribute contributes 0.05 to the overall information content
- Or : the HOME attribute reduces uncertainty by 0.05
Let us go back to the original set S ..
Probability of each outcome ( or class )
- P(A) = 3/10 , P(B) = 3/10 , P(C) = 4/10
Total Information Content of Set S
- -(3/10) log2(3/10) – (3/10) log2(3/10) – (4/10) log2(4/10) = 1.57

ID  Home  Married  Gender  Employed  Credit  Class
1   Yes   Yes      Male    Yes       A       B
2   No    No       Female  Yes       A       A
3   Yes   Yes      Female  Yes       B       C
4   Yes   No       Male    No        B       B
5   No    Yes      Female  Yes       B       C
6   No    No       Female  Yes       B       A
7   No    No       Male    No        B       B
8   Yes   No       Female  Yes       A       A
9   No    Yes      Female  Yes       A       C
10  Yes   Yes      Female  Yes       A       C
This time we split on GENDER
I1 : Information in set S1 ( Gender = Female )
- P1(A) = 3/7 , P1(B) = 0/7 , P1(C) = 4/7
- -(3/7) log2(3/7) – (4/7) log2(4/7) = 0.985
I2 : Information in set S2 ( Gender = Male )
- P2(A) = 0/3 , P2(B) = 3/3 , P2(C) = 0/3
- I2 = 0 ( every member is in Class B )
Total Information in S1 and S2
- (7/10) I1 + (3/10) I2 = 7/10 x 0.985 + 3/10 x 0 = 0.69

S1 ( Gender = Female )
ID  Home  Married  Gender  Employed  Credit  Class
2   No    No       Female  Yes       A       A
3   Yes   Yes      Female  Yes       B       C
5   No    Yes      Female  Yes       B       C
6   No    No       Female  Yes       B       A
8   Yes   No       Female  Yes       A       A
9   No    Yes      Female  Yes       A       C
10  Yes   Yes      Female  Yes       A       C

S2 ( Gender = Male )
ID  Home  Married  Gender  Employed  Credit  Class
1   Yes   Yes      Male    Yes       A       B
4   Yes   No       Male    No        B       B
7   No    No       Male    No        B       B
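The key point is that the Male subset is pure (all Class B), so it contributes zero information; a short check of the 0.69 figure:

```python
from math import log2
from collections import Counter

def information(labels):
    n = len(labels)
    return sum(-c / n * log2(c / n) for c in Counter(labels).values())

gender = ["M", "F", "F", "M", "F", "F", "M", "F", "F", "F"]   # IDs 1..10
cls    = ["B", "A", "C", "B", "C", "A", "B", "A", "C", "C"]

female = [c for g, c in zip(gender, cls) if g == "F"]   # 3 x A, 4 x C -> 0.985
male   = [c for g, c in zip(gender, cls) if g == "M"]   # 3 x B        -> 0.0
print(round(7 / 10 * information(female) + 3 / 10 * information(male), 2))   # 0.69
```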
Impact of GENDER attribute
Within each of the sets S1 and S2 the attribute GENDER has a single value
But in the full set S the attribute GENDER varies, so it is of some significance
What is the significance of the GENDER attribute ?
Knowing the GENDER attribute reduces the information still needed to pin down the class
- FROM : 1.57 ( in S )
- TO : 0.69 ( weighted over S1 and S2 )
So the GENDER attribute contributes 0.88 to the overall information content
- Or : the GENDER attribute reduces uncertainty by 0.88
If we were to do this for all attributes ...
We would observe that GENDER is the best candidate for the split

Attribute   Information    Information   Information
            before Split   after Split   Gain
Home        1.57           1.52          0.05
Married     1.57           0.85          0.72
Gender      1.57           0.69          0.88
Employed    1.57           1.12          0.45
Credit      1.57           1.52          0.05
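A self-contained sketch that reproduces this table from the ten records (helper names are illustrative):

```python
from math import log2
from collections import Counter

def information(labels):
    n = len(labels)
    return sum(-c / n * log2(c / n) for c in Counter(labels).values())

def split_information(rows, attr):
    """Weighted information of the subsets obtained by splitting on attr."""
    subsets = {}
    for row in rows:
        subsets.setdefault(row[attr], []).append(row["Class"])
    return sum(len(s) / len(rows) * information(s) for s in subsets.values())

rows = [  # the ten records from the slides
    dict(Home="Y", Married="Y", Gender="M", Employed="Y", Credit="A", Class="B"),
    dict(Home="N", Married="N", Gender="F", Employed="Y", Credit="A", Class="A"),
    dict(Home="Y", Married="Y", Gender="F", Employed="Y", Credit="B", Class="C"),
    dict(Home="Y", Married="N", Gender="M", Employed="N", Credit="B", Class="B"),
    dict(Home="N", Married="Y", Gender="F", Employed="Y", Credit="B", Class="C"),
    dict(Home="N", Married="N", Gender="F", Employed="Y", Credit="B", Class="A"),
    dict(Home="N", Married="N", Gender="M", Employed="N", Credit="B", Class="B"),
    dict(Home="Y", Married="N", Gender="F", Employed="Y", Credit="A", Class="A"),
    dict(Home="N", Married="Y", Gender="F", Employed="Y", Credit="A", Class="C"),
    dict(Home="Y", Married="Y", Gender="F", Employed="Y", Credit="A", Class="C"),
]

before = information([r["Class"] for r in rows])               # 1.57
for attr in ["Home", "Married", "Gender", "Employed", "Credit"]:
    after = split_information(rows, attr)
    print(f"{attr:8s} {before:.2f} {after:.2f} {before - after:.2f}")
# Home 1.57 1.52 0.05 / Married 1.57 0.85 0.72 / Gender 1.57 0.69 0.88
# Employed 1.57 1.12 0.45 / Credit 1.57 1.52 0.05
```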
And the first part of our tree would be ...

Gender ?
  Male   -> Class B
  Female -> What next ?
Remove GENDER and Class B and continue

ID  Home  Married  Employed  Credit  Class
2   No    No       Yes       A       A
3   Yes   Yes      Yes       B       C
5   No    Yes      Yes       B       C
6   No    No       Yes       B       A
8   Yes   No       Yes       A       A
9   No    Yes      Yes       A       C
10  Yes   Yes      Yes       A       C

Probability of each outcome ( or class ) in this reduced set
- P(A) = 3/7 , P(C) = 4/7
Total Information Content of the reduced set
- -(3/7) log2(3/7) – (4/7) log2(4/7) = 0.985
We split this set on HOME ...
I1 : Information in set S1 ( Home = No )
- P1(A) = 2/4 , P1(C) = 2/4
- -(2/4) log2(2/4) – (2/4) log2(2/4) = 1.00
I2 : Information in set S2 ( Home = Yes )
- P2(A) = 1/3 , P2(C) = 2/3
- -(1/3) log2(1/3) – (2/3) log2(2/3) = 0.92
Total Information in S1 and S2
- (4/7) I1 + (3/7) I2 = 4/7 x 1.00 + 3/7 x 0.92 = 0.96
Gain = 0.985 – 0.96 ≈ 0.02

S1 ( Home = No )
ID  Home  Married  Employed  Credit  Class
2   No    No       Yes       A       A
5   No    Yes      Yes       B       C
6   No    No       Yes       B       A
9   No    Yes      Yes       A       C

S2 ( Home = Yes )
ID  Home  Married  Employed  Credit  Class
3   Yes   Yes      Yes       B       C
8   Yes   No       Yes       A       A
10  Yes   Yes      Yes       A       C
But if we were to split on MARRIED
I1 : Information in set S1 ( Married = No )
- P1(A) = 3/3 , P1(C) = 0/3
- I1 = 0.0 ( all three members are in Class A )
I2 : Information in set S2 ( Married = Yes )
- P2(A) = 0/4 , P2(C) = 4/4
- I2 = 0.0 ( all four members are in Class C )
Total Information in S1 and S2
- = 0.0
Gain = 0.985 – 0 = 0.985

S1 ( Married = No )
ID  Home  Married  Employed  Credit  Class
2   No    No       Yes       A       A
6   No    No       Yes       B       A
8   Yes   No       Yes       A       A

S2 ( Married = Yes )
ID  Home  Married  Employed  Credit  Class
3   Yes   Yes      Yes       B       C
5   No    Yes      Yes       B       C
9   No    Yes      Yes       A       C
10  Yes   Yes      Yes       A       C
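A short check of these last two splits on the reduced set; its class entropy is 0.985 (the same 3/7, 4/7 distribution computed earlier), so MARRIED gains the full 0.985 while HOME gains only about 0.02:

```python
from math import log2
from collections import Counter

def information(labels):
    n = len(labels)
    return sum(-c / n * log2(c / n) for c in Counter(labels).values())

# Remaining records (Gender = Female, Class B gone): (Home, Married, Class)
rows = [("N", "N", "A"), ("Y", "Y", "C"), ("N", "Y", "C"), ("N", "N", "A"),
        ("Y", "N", "A"), ("N", "Y", "C"), ("Y", "Y", "C")]

def split_information(index):
    """Weighted information after splitting the reduced set on column `index`."""
    subsets = {}
    for row in rows:
        subsets.setdefault(row[index], []).append(row[-1])
    return sum(len(s) / len(rows) * information(s) for s in subsets.values())

total = information([row[-1] for row in rows])
print(round(total, 3))                          # 0.985
print(round(total - split_information(0), 2))   # gain from HOME    ~ 0.02
print(round(total - split_information(1), 2))   # gain from MARRIED = 0.985 (prints 0.99)
```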
Two things have happened
With MARRIED
- We have hit the upper limit of information gain : the gain ( 0.985 ) equals all of the remaining uncertainty
- No other attribute can do any better than this
In the TWO subsets
- All members belong to the same class
- Either A or C
Hence we STOP here and observe ...
That our DECISION TREE looks like

Gender ?
  Male   -> Class B
  Female -> Married ?
              YES -> Class C
              NO  -> Class A
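As code, the finished classifier is just two nested tests (a minimal sketch; function and argument names are illustrative):

```python
def classify(gender: str, married: str) -> str:
    """The finished tree: Gender at the root, Married below the Female branch."""
    if gender == "Male":
        return "B"
    return "C" if married == "Yes" else "A"

print(classify("Male",   "No"))    # B (e.g. record 4)
print(classify("Female", "Yes"))   # C (e.g. record 10)
print(classify("Female", "No"))    # A (e.g. record 2)
```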