
COMP24111 Machine Learning

Naïve Bayes Classifier



Ke Chen


Outline
Background
Probability Basics
Probabilistic Classification
Naïve Bayes
Example: Play Tennis
Relevant Issues
Conclusions


Background
There are three methods to establish a classifier
a) Model a classification rule directly
Examples: k-NN, decision trees, perceptron, SVM
b) Model the probability of class memberships given input data
Example: perceptron with the cross-entropy cost
c) Make a probabilistic model of data within each class
Examples: naïve Bayes, model-based classifiers
a) and b) are examples of discriminative classification
c) is an example of generative classification
b) and c) are both examples of probabilistic classification
Probability Basics


Prior, conditional and joint probability for random variables
Prior probability: $P(X)$
Conditional probability: $P(X_1 \mid X_2)$, $P(X_2 \mid X_1)$
Joint probability: $\mathbf{X} = (X_1, X_2)$, $P(\mathbf{X}) = P(X_1, X_2)$
Relationship: $P(X_1, X_2) = P(X_2 \mid X_1)\,P(X_1) = P(X_1 \mid X_2)\,P(X_2)$
Independence: $P(X_2 \mid X_1) = P(X_2)$, $P(X_1 \mid X_2) = P(X_1)$, $P(X_1, X_2) = P(X_1)\,P(X_2)$
Bayesian Rule
$$P(C \mid \mathbf{X}) = \frac{P(\mathbf{X} \mid C)\,P(C)}{P(\mathbf{X})}
\qquad \text{Posterior} = \frac{\text{Likelihood} \times \text{Prior}}{\text{Evidence}}$$
Probability Basics


Quiz: We have two six-sided dice. When they are rolled, the following events could
occur: (A) die 1 lands on side 3, (B) die 2 lands on side 1, and (C) the two dice
sum to eight. Answer the following questions:
1) $P(A) = ?$
2) $P(B) = ?$
3) $P(C) = ?$
4) $P(A \mid B) = ?$
5) $P(C \mid A) = ?$
6) $P(A, B) = ?$
7) $P(A, C) = ?$
8) Is $P(A, C)$ equal to $P(A)\,P(C)$?
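One way to check the answers is simply to enumerate the 36 equally likely outcomes. A minimal Python sketch (the `prob` helper and event names are just for illustration); it also verifies the Bayesian rule on these events:

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of rolling two six-sided dice.
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event (a predicate over (die1, die2)) by counting."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A = lambda o: o[0] == 3            # die 1 lands on side 3
B = lambda o: o[1] == 1            # die 2 lands on side 1
C = lambda o: o[0] + o[1] == 8     # the two dice sum to eight

P_A, P_B, P_C = prob(A), prob(B), prob(C)
P_AB = prob(lambda o: A(o) and B(o))
P_AC = prob(lambda o: A(o) and C(o))

print(P_A, P_B, P_C)             # 1/6 1/6 5/36
print(P_AB / P_B, P_AC / P_A)    # P(A|B) = 1/6, P(C|A) = 1/6
print(P_AB, P_AC)                # 1/36 1/36
print(P_AC == P_A * P_C)         # False: A and C are not independent
print((P_AC / P_A) * P_A / P_C)  # Bayesian rule: P(A|C) = P(C|A)P(A)/P(C) = 1/5
```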
Probabilistic Classification


Establishing a probabilistic model for classification
Discriminative model

$$P(C \mid \mathbf{X}), \quad C = c_1, \ldots, c_L, \quad \mathbf{X} = (X_1, \ldots, X_n)$$
[Figure: a discriminative probabilistic classifier takes the inputs $x_1, x_2, \ldots, x_n$ of an instance $\mathbf{x} = (x_1, x_2, \ldots, x_n)$ and outputs $P(c_1 \mid \mathbf{x}), P(c_2 \mid \mathbf{x}), \ldots, P(c_L \mid \mathbf{x})$.]
Probabilistic Classification


Establishing a probabilistic model for classification (cont.)
Generative model
$$P(\mathbf{X} \mid C), \quad C = c_1, \ldots, c_L, \quad \mathbf{X} = (X_1, \ldots, X_n)$$
[Figure: one generative probabilistic model per class; for an instance $\mathbf{x} = (x_1, x_2, \ldots, x_n)$, the model for class $c_i$ takes $x_1, x_2, \ldots, x_n$ and outputs the likelihood $P(\mathbf{x} \mid c_i)$, for $i = 1, \ldots, L$.]
Probabilistic Classification


MAP classification rule
MAP: Maximum A Posteriori
Assign $\mathbf{x}$ to $c^*$ if
$$P(C = c^* \mid \mathbf{X} = \mathbf{x}) > P(C = c \mid \mathbf{X} = \mathbf{x}), \quad c \neq c^*, \ c = c_1, \ldots, c_L$$

Generative classification with the MAP rule
Apply the Bayesian rule to convert likelihoods into posterior probabilities
$$P(C = c_i \mid \mathbf{X} = \mathbf{x}) = \frac{P(\mathbf{X} = \mathbf{x} \mid C = c_i)\,P(C = c_i)}{P(\mathbf{X} = \mathbf{x})} \propto P(\mathbf{X} = \mathbf{x} \mid C = c_i)\,P(C = c_i), \quad i = 1, 2, \ldots, L$$
Then apply the MAP rule
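In code, the rule is just an argmax over unnormalised posterior scores, since the evidence $P(\mathbf{X} = \mathbf{x})$ is the same for every class. A minimal sketch, assuming hypothetical `likelihood` and `prior` functions supplied by some generative model:

```python
def map_classify(x, classes, likelihood, prior):
    """Return the class c* maximising P(x | c) * P(c), i.e. the MAP decision.

    likelihood(x, c) and prior(c) are assumed to come from a generative
    model; the evidence P(x) is ignored because it is constant across classes.
    """
    return max(classes, key=lambda c: likelihood(x, c) * prior(c))
```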
Naïve Bayes


Bayes classification
$$P(C \mid \mathbf{X}) \propto P(\mathbf{X} \mid C)\,P(C) = P(X_1, \ldots, X_n \mid C)\,P(C)$$
Difficulty: learning the joint probability $P(X_1, \ldots, X_n \mid C)$
Naïve Bayes classification
Assumption: all input attributes are conditionally independent given the class!
$$\begin{aligned}
P(X_1, X_2, \ldots, X_n \mid C) &= P(X_1 \mid X_2, \ldots, X_n, C)\,P(X_2, \ldots, X_n \mid C) \\
&= P(X_1 \mid C)\,P(X_2, \ldots, X_n \mid C) \\
&= P(X_1 \mid C)\,P(X_2 \mid C) \cdots P(X_n \mid C)
\end{aligned}$$
MAP classification rule: for $\mathbf{x} = (x_1, x_2, \ldots, x_n)$, assign $\mathbf{x}$ to $c^*$ if
$$[P(x_1 \mid c^*) \cdots P(x_n \mid c^*)]\,P(c^*) > [P(x_1 \mid c) \cdots P(x_n \mid c)]\,P(c), \quad c \neq c^*, \ c = c_1, \ldots, c_L$$
Naïve Bayes


Naïve Bayes Algorithm (for discrete input attributes)
Learning Phase: Given a training set S,
For each target value $c_i$ ($c_i = c_1, \ldots, c_L$)
    $\hat{P}(C = c_i) \leftarrow$ estimate $P(C = c_i)$ with examples in S;
For every attribute value $x_{jk}$ of each attribute $X_j$ ($j = 1, \ldots, n$; $k = 1, \ldots, N_j$)
    $\hat{P}(X_j = x_{jk} \mid C = c_i) \leftarrow$ estimate $P(X_j = x_{jk} \mid C = c_i)$ with examples in S;
Output: conditional probability tables; for $X_j$, $N_j \times L$ elements
Test Phase: Given an unknown instance $\mathbf{x}' = (a'_1, \ldots, a'_n)$,
Look up tables to assign the label $c^*$ to $\mathbf{x}'$ if
$$[\hat{P}(a'_1 \mid c^*) \cdots \hat{P}(a'_n \mid c^*)]\,\hat{P}(c^*) > [\hat{P}(a'_1 \mid c) \cdots \hat{P}(a'_n \mid c)]\,\hat{P}(c), \quad c \neq c^*, \ c = c_1, \ldots, c_L$$
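The two phases translate almost line for line into code. A minimal sketch for discrete attributes (plain frequency estimates, no smoothing; the class and attribute encodings are illustrative, not from the slides):

```python
from collections import Counter, defaultdict

class DiscreteNaiveBayes:
    """Naive Bayes for discrete attributes with raw frequency estimates."""

    def fit(self, X, y):
        # Learning phase: one prior per class, and one conditional
        # probability table entry per (attribute index, value, class).
        n = len(y)
        class_counts = Counter(y)
        self.priors = {c: cnt / n for c, cnt in class_counts.items()}
        joint = Counter()
        for xi, c in zip(X, y):
            for j, v in enumerate(xi):
                joint[(j, v, c)] += 1
        self.cond = defaultdict(float)  # unseen (j, v, c) -> 0.0
        for (j, v, c), cnt in joint.items():
            self.cond[(j, v, c)] = cnt / class_counts[c]
        return self

    def predict(self, x):
        # Test phase: MAP rule over P(c) * prod_j P(x_j | c).
        def score(c):
            p = self.priors[c]
            for j, v in enumerate(x):
                p *= self.cond[(j, v, c)]
            return p
        return max(self.priors, key=score)
```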
Example


Example: Play Tennis
Example


Learning Phase
Outlook      Play=Yes   Play=No
Sunny        2/9        3/5
Overcast     4/9        0/5
Rain         3/9        2/5

Temperature  Play=Yes   Play=No
Hot          2/9        2/5
Mild         4/9        2/5
Cool         3/9        1/5

Humidity     Play=Yes   Play=No
High         3/9        4/5
Normal       6/9        1/5

Wind         Play=Yes   Play=No
Strong       3/9        3/5
Weak         6/9        2/5

P(Play=Yes) = 9/14    P(Play=No) = 5/14
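These tables can be reproduced by counting, assuming the standard 14-example Play-Tennis training set (Mitchell, 1997), whose counts match the fractions above; a sketch:

```python
from collections import Counter
from fractions import Fraction

# Standard Play-Tennis training set (Mitchell, 1997).
# Columns: (Outlook, Temperature, Humidity, Wind) -> Play.
data = [
    (("Sunny",    "Hot",  "High",   "Weak"),   "No"),
    (("Sunny",    "Hot",  "High",   "Strong"), "No"),
    (("Overcast", "Hot",  "High",   "Weak"),   "Yes"),
    (("Rain",     "Mild", "High",   "Weak"),   "Yes"),
    (("Rain",     "Cool", "Normal", "Weak"),   "Yes"),
    (("Rain",     "Cool", "Normal", "Strong"), "No"),
    (("Overcast", "Cool", "Normal", "Strong"), "Yes"),
    (("Sunny",    "Mild", "High",   "Weak"),   "No"),
    (("Sunny",    "Cool", "Normal", "Weak"),   "Yes"),
    (("Rain",     "Mild", "Normal", "Weak"),   "Yes"),
    (("Sunny",    "Mild", "Normal", "Strong"), "Yes"),
    (("Overcast", "Mild", "High",   "Strong"), "Yes"),
    (("Overcast", "Hot",  "Normal", "Weak"),   "Yes"),
    (("Rain",     "Mild", "High",   "Strong"), "No"),
]

attrs = ["Outlook", "Temperature", "Humidity", "Wind"]
label_counts = Counter(play for _, play in data)

# Learning phase: estimate each conditional probability by counting.
for j, name in enumerate(attrs):
    for v in sorted({x[j] for x, _ in data}):
        for c in ("Yes", "No"):
            n_vc = sum(1 for x, play in data if x[j] == v and play == c)
            print(f"P({name}={v} | Play={c}) = {Fraction(n_vc, label_counts[c])}")

for c in ("Yes", "No"):
    print(f"P(Play={c}) = {Fraction(label_counts[c], len(data))}")
```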
Example


Test Phase
Given a new instance,
x = (Outlook=Sunny, Temperature=Cool, Humidity=High, Wind=Strong)
Look up tables
P(Outlook=Sunny | Play=Yes) = 2/9        P(Outlook=Sunny | Play=No) = 3/5
P(Temperature=Cool | Play=Yes) = 3/9     P(Temperature=Cool | Play=No) = 1/5
P(Humidity=High | Play=Yes) = 3/9        P(Humidity=High | Play=No) = 4/5
P(Wind=Strong | Play=Yes) = 3/9          P(Wind=Strong | Play=No) = 3/5
P(Play=Yes) = 9/14                       P(Play=No) = 5/14

MAP rule
P(Yes|x) ∝ [P(Sunny|Yes) P(Cool|Yes) P(High|Yes) P(Strong|Yes)] P(Play=Yes) ≈ 0.0053
P(No|x) ∝ [P(Sunny|No) P(Cool|No) P(High|No) P(Strong|No)] P(Play=No) ≈ 0.0206

Given that P(Yes|x) < P(No|x), we label x as No.
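To confirm the arithmetic, the two unnormalised scores can be computed with exact fractions:

```python
from fractions import Fraction as F

# Unnormalised posterior scores for x = (Sunny, Cool, High, Strong),
# read directly off the lookup tables above.
score_yes = F(2, 9) * F(3, 9) * F(3, 9) * F(3, 9) * F(9, 14)
score_no  = F(3, 5) * F(1, 5) * F(4, 5) * F(3, 5) * F(5, 14)

print(float(score_yes))                         # ~0.0053
print(float(score_no))                          # ~0.0206
print("No" if score_no > score_yes else "Yes")  # No
```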

Relevant Issues


Violation of the Independence Assumption
For many real-world tasks, $P(X_1, \ldots, X_n \mid C) \neq P(X_1 \mid C) \cdots P(X_n \mid C)$
Nevertheless, naïve Bayes works surprisingly well anyway!

Zero Conditional Probability Problem
If no example contains the attribute value $X_j = a_{jk}$, then $\hat{P}(X_j = a_{jk} \mid C = c_i) = 0$
In this circumstance, during test, $\hat{P}(x_1 \mid c_i) \cdots \hat{P}(a_{jk} \mid c_i) \cdots \hat{P}(x_n \mid c_i) = 0$
For a remedy, conditional probabilities are estimated with the m-estimate:
$$\hat{P}(X_j = a_{jk} \mid C = c_i) = \frac{n_c + m p}{n + m}$$
where
$n$: number of training examples for which $C = c_i$
$n_c$: number of training examples for which $X_j = a_{jk}$ and $C = c_i$
$p$: prior estimate (usually $p = 1/t$ for $t$ possible values of $X_j$)
$m$: weight given to the prior (number of "virtual" examples, $m \geq 1$)
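As a quick illustration of the remedy (the function name is mine, not from the slides):

```python
from fractions import Fraction

def m_estimate(n_c, n, t, m=1):
    """m-estimate of P(X_j = a_jk | C = c_i).

    n_c: number of training examples with X_j = a_jk and C = c_i
    n:   number of training examples with C = c_i
    t:   number of possible values of X_j (prior estimate p = 1/t)
    m:   weight given to the prior ("virtual" examples)
    """
    p = Fraction(1, t)
    return (n_c + m * p) / (n + m)

# In the Play-Tennis example, Outlook=Overcast never occurs with Play=No
# (0 of 5 examples, 3 possible Outlook values): the raw estimate is 0,
# while the smoothed estimate stays positive.
print(m_estimate(0, 5, 3))  # 1/18
```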
Relevant Issues


Continuous-valued Input Attributes
Innumerable values for an attribute
Conditional probability modeled with the normal distribution
$$\hat{P}(X_j \mid C = c_i) = \frac{1}{\sqrt{2\pi}\,\sigma_{ji}} \exp\!\left(-\frac{(X_j - \mu_{ji})^2}{2\sigma_{ji}^2}\right)$$
$\mu_{ji}$: mean (average) of attribute values $X_j$ of examples for which $C = c_i$
$\sigma_{ji}$: standard deviation of attribute values $X_j$ of examples for which $C = c_i$

Learning Phase: for $\mathbf{X} = (X_1, \ldots, X_n)$, $C = c_1, \ldots, c_L$
Output: $n \times L$ normal distributions and $P(C = c_i)$, $i = 1, \ldots, L$
Test Phase: for $\mathbf{x}' = (x'_1, \ldots, x'_n)$
Calculate conditional probabilities with all the normal distributions
Apply the MAP rule to make a decision
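A minimal Gaussian naïve Bayes sketch along these lines (population standard deviation, no variance smoothing; all names are illustrative):

```python
import math

def fit_gaussian_nb(X, y):
    """Learning phase: one (mean, std) per (attribute, class), plus class priors."""
    stats, priors = {}, {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        priors[c] = len(rows) / len(y)
        for j in range(len(rows[0])):
            vals = [r[j] for r in rows]
            mu = sum(vals) / len(vals)
            sigma = math.sqrt(sum((v - mu) ** 2 for v in vals) / len(vals))
            stats[(j, c)] = (mu, sigma)
    return stats, priors

def normal_pdf(x, mu, sigma):
    """Normal density, matching the formula above."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def predict(x, stats, priors):
    """Test phase: MAP rule with Gaussian conditional probabilities."""
    def score(c):
        s = priors[c]
        for j, v in enumerate(x):
            s *= normal_pdf(v, *stats[(j, c)])
        return s
    return max(priors, key=score)
```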
Conclusions
Naïve Bayes is based on the conditional independence assumption
Training is very easy and fast: it just requires considering each
attribute in each class separately
Testing is straightforward: just look up tables or calculate
conditional probabilities with normal distributions
A popular generative model
Performance is competitive with most state-of-the-art classifiers,
even when the independence assumption is violated
Many successful applications, e.g., spam mail filtering
A good candidate for a base learner in ensemble learning
Apart from classification, naïve Bayes can do more
