Ibrahim Sabek
Computer and Systems Engineering Department, Faculty of Engineering, Alexandria University, Egypt
Agenda
1. Machine learning overview and applications
2. Supervised vs. Unsupervised learning
3. Generative vs. Discriminative models
4. Overview of Classification
5. The big picture
6. Bayesian inference
7. Summary
8. Feedback
ML applications
Spam detection
Handwriting detection
Speech recognition
Netflix recommendation system
Classes of ML models
Supervised vs. Unsupervised
Generative vs. Discriminative
Generative:
You want to estimate the joint distribution p(x, y).
Overview of Classification
Random forests
Given D = {(x_1, y_1), (x_2, y_2), ..., (x_n, y_n)} where x_i ∈ R^d, y_i ∈ R.
For i = 1, ..., B:
  Choose a bootstrap sample D_i from D.
  Construct tree T_i using D_i such that at each node a random subset of the features is chosen, and only splits on these features are considered.
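The procedure above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the "trees" are depth-1 stumps, all function names (bootstrap, best_stump, train_forest, predict) and the toy dataset are made up for this example.

```python
import random

def bootstrap(data):
    # Bootstrap sample D_i: draw |D| points from D with replacement
    return [random.choice(data) for _ in data]

def best_stump(data, features):
    # Depth-1 "tree": over the candidate feature subset only, pick the
    # (feature, threshold) split with the fewest misclassifications
    best = None
    for f in features:
        for x, _ in data:
            t = x[f]
            left = [y for xx, y in data if xx[f] <= t]
            right = [y for xx, y in data if xx[f] > t]
            if not left or not right:
                continue
            lmaj = max(set(left), key=left.count)
            rmaj = max(set(right), key=right.count)
            err = sum(y != lmaj for y in left) + sum(y != rmaj for y in right)
            if best is None or err < best[0]:
                best = (err, f, t, lmaj, rmaj)
    if best is None:
        # Degenerate bootstrap sample: fall back to a constant predictor
        labels = [y for _, y in data]
        m = max(set(labels), key=labels.count)
        return (features[0], float("inf"), m, m)
    return best[1:]

def train_forest(data, B=25, n_feats=1):
    d = len(data[0][0])
    forest = []
    for _ in range(B):
        Di = bootstrap(data)                     # step 1: bootstrap D_i
        feats = random.sample(range(d), n_feats)  # random feature subset
        forest.append(best_stump(Di, feats))      # step 2: build T_i
    return forest

def predict(forest, x):
    # Classify by majority vote over the B trees
    votes = [(l if x[f] <= t else r) for f, t, l, r in forest]
    return max(set(votes), key=votes.count)

random.seed(0)
data = [((0.0, 0.0), 0), ((0.1, 0.2), 0), ((1.0, 0.9), 1), ((0.9, 1.1), 1)]
forest = train_forest(data)
```

After training, points on the class-0 side of the data (e.g. (-0.5, -0.5)) are voted to class 0 and points on the class-1 side (e.g. (2.0, 2.0)) to class 1.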
We also have p(x, y) = p(x | y) p(y).
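This factorization is exactly how a simple generative classifier works: fit a class prior p(y) and a class-conditional p(x | y), then classify by maximizing the joint. A minimal sketch, with a made-up one-feature Bernoulli model and a hypothetical spam/ham dataset (none of this is from the slides):

```python
from collections import Counter

def fit(data):
    # data: list of (x, y) with x in {0, 1}, y a class label
    n = len(data)
    counts = Counter(y for _, y in data)
    prior = {y: counts[y] / n for y in counts}          # p(y)
    cond = {}                                           # p(x = 1 | y)
    for y in counts:
        xs = [x for x, yy in data if yy == y]
        cond[y] = sum(xs) / len(xs)
    return prior, cond

def joint(x, y, prior, cond):
    # p(x, y) = p(x | y) p(y)
    px_given_y = cond[y] if x == 1 else 1 - cond[y]
    return px_given_y * prior[y]

def classify(x, prior, cond):
    # argmax_y p(y | x) = argmax_y p(x, y), by Bayes' rule
    return max(prior, key=lambda y: joint(x, y, prior, cond))

data = ([(1, "spam")] * 3 + [(0, "spam")] +
        [(0, "ham")] * 3 + [(1, "ham")])
prior, cond = fit(data)
```

Here classify(1, prior, cond) returns "spam" because p(x=1, spam) = 0.75 * 0.5 beats p(x=1, ham) = 0.25 * 0.5.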
Point estimate of θ:
  Maximum Likelihood Estimation (MLE)
  Maximum A Posteriori (MAP): θ_MAP = argmax_θ p(θ | x, D)
Deterministic approximation:
  Laplace approximation
  Variational methods
Stochastic approximation:
  Importance sampling
  Gibbs sampling
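The two point estimates are easiest to see on a coin flip. A hypothetical worked example (not from the slides), using the standard closed forms for a Bernoulli likelihood with a Beta prior:

```python
def mle(heads, n):
    # theta_MLE = argmax_theta p(D | theta) = heads / n for a Bernoulli model
    return heads / n

def map_estimate(heads, n, alpha=2.0, beta=2.0):
    # With a Beta(alpha, beta) prior, theta_MAP = argmax_theta p(theta | D)
    # has the closed form (heads + alpha - 1) / (n + alpha + beta - 2)
    return (heads + alpha - 1) / (n + alpha + beta - 2)
```

For 7 heads in 10 flips, the MLE is 0.7 while the MAP estimate under a Beta(2, 2) prior is 8/12 ≈ 0.667: the prior pulls the estimate toward 0.5.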
Bayesian inference
Put distributions on everything, then use the rules of probability to infer values.
Aspects of Bayesian inference:
  Priors: assume a prior distribution p(θ)
  Procedures: minimize expected loss (averaging over θ)
Pros:
  Directly answers questions; avoids overfitting.
Cons:
  Must assume a prior; exact computation can be intractable.
Bayesian inference
Factorization of the probabilistic model
Notational device
Visualization for inference algorithms
Example of thinking graphically about p(a, b, c):
p(a, b, c) = p(c | a, b) p(a, b) = p(c | a, b) p(b | a) p(a)
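The chain-rule factorization can be checked numerically on any joint distribution. Below, an arbitrary made-up joint over three binary variables is factored into its conditionals and reassembled:

```python
import itertools

# An arbitrary joint p(a, b, c) over binary variables, as a lookup table
# (the eight probabilities are made up and sum to 1)
p = dict(zip(itertools.product([0, 1], repeat=3),
             [0.10, 0.05, 0.20, 0.05, 0.15, 0.10, 0.20, 0.15]))

def p_a(a):
    # Marginal p(a): sum out b and c
    return sum(p[(a, b, c)] for b in (0, 1) for c in (0, 1))

def p_b_given_a(b, a):
    # Conditional p(b | a) = p(a, b) / p(a)
    return sum(p[(a, b, c)] for c in (0, 1)) / p_a(a)

def p_c_given_ab(c, a, b):
    # Conditional p(c | a, b) = p(a, b, c) / p(a, b)
    return p[(a, b, c)] / sum(p[(a, b, cc)] for cc in (0, 1))

# Reassemble: p(a, b, c) = p(c | a, b) p(b | a) p(a) for every cell
ok = all(abs(p[(a, b, c)]
             - p_c_given_ab(c, a, b) * p_b_given_a(b, a) * p_a(a)) < 1e-12
         for (a, b, c) in p)
```

The check passes for any joint table, since the factorization is just repeated application of the definition of conditional probability.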
Summary
Machine learning is an essential field in our daily lives. Machine learning is a broad world; we have only gotten started in this session :D :D.
Feedback
Your feedback is welcome at alex.acm.org/feedback/machine/