CMU CS 10701 - Recitation

Review: logistic regression, Gaussian naïve Bayes, linear regression, and their connections.
New: bias-variance decomposition, the bias-variance tradeoff, overfitting, regularization, and feature selection.
Yi Zhang, 10-701 Machine Learning, Spring 2011, February 3rd, 2011. Parts of the slides are from previous 10-701 lectures.

Outline
- Logistic regression (model assumptions on P(Y|X), decision making, estimating the model parameters, multiclass logistic regression)
- Decision surface / boundary of classifiers (logistic regression, Gaussian naïve Bayes, decision trees)
- Generative vs. discriminative classifiers (definitions, how to compare them, GNB-1 vs. logistic regression, GNB-2 vs. logistic regression)
- Linear regression (regression problems, model assumptions on P(Y|X), estimating the model parameters)
- Bias-variance decomposition and tradeoff
- Overfitting and regularization
- Feature selection

Logistic regression: model assumptions
- Binary classification: f : X = (X1, X2, ..., Xn) -> Y, with Y in {0, 1}.
- Logistic regression's assumption on P(Y|X): P(Y=1|X) is the logistic (sigmoid) function applied to a linear function of X,
  P(Y=1|X) = 1 / (1 + exp(-(w0 + sum_i wi Xi))),
  and thus P(Y=0|X) = 1 - P(Y=1|X).

Decision making
- Given a logistic regression w and an input X, decide on Y by picking the more probable label: predict Y=1 iff P(Y=1|X) > 1/2, i.e. iff w0 + sum_i wi Xi > 0.
- This is a linear decision boundary. [Aarti, 10-701]

Estimating the parameters w
- Given training data {(X(l), Y(l))}, l = 1, ..., L, how do we estimate w = (w0, w1, ..., wn)?
- Logistic regression only models P(Y|X, w), so we use maximum conditional likelihood: maximize prod_l P(Y(l) | X(l), w) over the data, ignoring P(X).
- Equivalently, maximize the conditional log-likelihood sum_l ln P(Y(l) | X(l), w).
- This is a concave function (proof beyond the scope of the class), so there is no local optimum and gradient ascent/descent works. (A small numeric sketch of this appears after the generative vs. discriminative comparison below.)

Multiclass logistic regression
- Binary classification extends to K-class classification: for each class k < K, P(Y=k|X) is proportional to exp(w_k0 + sum_i w_ki Xi); the remaining class K serves as the reference class so that the K probabilities sum to one.

Decision surface / boundary of classifiers

Logistic regression
- Model assumptions on P(Y|X); deciding Y given X yields a linear decision boundary. [Aarti, 10-701]

Gaussian naïve Bayes
- Model assumptions on P(X, Y) = P(Y) P(X|Y): Bernoulli on Y, conditional independence of the Xi given Y, and a Gaussian for each Xi given Y.
- Deciding Y given X: pick the label maximizing P(Y) P(X|Y).
- [Figure: class-conditional densities P(X|Y=0) and P(X|Y=1) and the induced decision boundary.]
- Nonlinear case: again assume P(Y=1) = P(Y=0) = 0.5. [Figure: P(X|Y=0) and P(X|Y=1) giving a nonlinear decision boundary.]

Decision trees
- Decision making on Y: follow the tree structure down to a leaf.

Generative and discriminative classifiers
- Generative classifiers model the joint distribution P(X, Y), usually via P(X, Y) = P(Y) P(X|Y). Example: Gaussian naïve Bayes.
- Discriminative classifiers model P(Y|X), or simply a mapping f: X -> Y, and do not care about P(X). Examples: logistic regression, support vector machines (later in this course).
- How can we compare, say, Gaussian naïve Bayes and logistic regression, i.e. P(X, Y) = P(Y) P(X|Y) vs. P(Y|X)? Hint: decision making is based on P(Y|X), so compare the P(Y|X) each model can represent.
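Before moving on to the GNB-1/GNB-2 comparison, here is a minimal numeric sketch (not part of the original slides) of the logistic-regression pieces reviewed above: the sigmoid model for P(Y=1|X), the conditional log-likelihood, plain gradient ascent on it, and the resulting linear decision rule. The toy data, learning rate, and iteration count are made up purely for illustration.

    import numpy as np

    # Toy binary-classification data (made up for illustration): 2 features per example.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1.0, 1.0, size=(50, 2)),
                   rng.normal(+1.0, 1.0, size=(50, 2))])
    y = np.concatenate([np.zeros(50), np.ones(50)])
    Xb = np.hstack([np.ones((100, 1)), X])          # prepend a constant 1 for the intercept w0

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Conditional log-likelihood sum_l ln P(y_l | x_l, w); concave in w.
    def cond_log_lik(w):
        p = sigmoid(Xb @ w)
        return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

    # Gradient ascent: the gradient is X^T (y - P(Y=1|X, w)); no local optima since l(w) is concave.
    w = np.zeros(3)
    lr = 0.1
    for _ in range(500):
        w += lr * Xb.T @ (y - sigmoid(Xb @ w)) / len(y)

    # Decision rule: predict Y=1 iff w0 + sum_i wi Xi > 0 (a linear boundary).
    y_hat = (Xb @ w > 0).astype(int)
    print("log-likelihood:", cond_log_lik(w), "train accuracy:", np.mean(y_hat == y))

The same loop with the sign of the gradient step flipped would be gradient descent on the negative conditional log-likelihood, which is the usual way it is written in software.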
Two versions: GNB-1 and GNB-2
- Both share the model assumptions on P(X, Y) = P(Y) P(X|Y): Bernoulli on Y and conditional independence of the Xi.
- GNB-1: a Gaussian on each Xi given Y.
- GNB-2: additionally assumes class-independent variance for each Xi.
- [Figure: P(X|Y=0) and P(X|Y=1) with different variances — impossible for GNB-2.]

GNB-2 vs. logistic regression
- GNB-2: P(X, Y) = P(Y) P(X|Y) with Bernoulli on Y, conditional independence, a Gaussian on each Xi, and class-independent variance.
- It turns out that P(Y|X) under GNB-2 has exactly the logistic-regression form, a sigmoid of a linear function of X (see Mitchell, "Naive Bayes and Logistic Regression", section 3.1, pages 8-10). Recall that P(Y|X) of logistic regression has the same form.
- So the P(Y|X) that GNB-2 can represent are a subset of the P(Y|X) that LR can represent. Given infinite training data, we claim LR >= GNB-2. (A numeric check of this correspondence appears at the end of this preview.)

GNB-1 vs. logistic regression
- GNB-1: P(X, Y) = P(Y) P(X|Y) with Bernoulli on Y, conditional independence, and a Gaussian on each Xi. Logistic regression: P(Y|X).
- Neither encompasses the other.
- First, there is a P(Y|X) from GNB-1 that cannot be represented by LR: with class-dependent variances the induced decision surface is nonlinear, while LR only represents linear decision surfaces. [Figure: P(X|Y=0) and P(X|Y=1) with different variances.]
- Second, there is a P(Y|X) represented by LR that cannot be derived from GNB-1's assumptions: GNB-1 cannot represent any correlated Gaussian P(X|Y), but such a P(Y|X) can still possibly be represented by LR (HW2). [Figure: correlated Gaussians P(X|Y=0) and P(X|Y=1).]

Linear regression

Regression problems
- Predict Y given X, where Y is continuous.
- General assumption: Y is generated from some function f(X) plus noise. [Aarti, 10-701]

Linear regression: assumptions
- Y is generated from f(X) plus Gaussian noise, and f(X) is a linear function:
  Y = w0 + sum_i wi Xi + e, with e ~ N(0, sigma^2).
- Therefore the assumption on P(Y|X, w) is that Y given X is Gaussian with mean w0 + sum_i wi Xi and variance sigma^2.

Estimating the parameters w
- Given training data and the assumptions above, use maximum conditional likelihood: maximize the conditional log-likelihood sum_l ln P(Y(l) | X(l), w).
- Maximizing this conditional log-likelihood is the same as minimizing the sum of squared errors sum_l (Y(l) - w0 - sum_i wi Xi(l))^2.
- Gradient ascent/descent is easy; actually, a closed-form solution exists (both views are sketched below).
- Estimating ...
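A similarly minimal sketch, again with made-up toy data, of the linear-regression estimation just described: under Gaussian noise, maximizing the conditional log-likelihood is the same as minimizing the sum of squared errors, and the minimizer has the closed form w = (X^T X)^{-1} X^T y (the normal equations), which gradient descent on the squared error also reaches.

    import numpy as np

    # Toy regression data (made up): y = 2 + 3*x + Gaussian noise.
    rng = np.random.default_rng(1)
    x = rng.uniform(-2, 2, size=100)
    y = 2.0 + 3.0 * x + rng.normal(0.0, 0.5, size=100)
    Xb = np.column_stack([np.ones_like(x), x])       # column of 1s for the intercept w0

    # Closed-form minimizer of sum_l (y_l - w . x_l)^2: solve the normal equations X^T X w = X^T y.
    w_closed = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

    # The same objective minimized by gradient descent on the sum of squared errors.
    w_gd = np.zeros(2)
    lr = 0.05
    for _ in range(2000):
        w_gd -= lr * 2 * Xb.T @ (Xb @ w_gd - y) / len(y)

    print("closed form:", w_closed)                  # approximately [2, 3]
    print("gradient descent:", w_gd)                 # should agree closely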

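Finally, circling back to the GNB-2 vs. logistic regression slides: a small numeric check, with made-up GNB-2 parameters, that under class-independent variances the posterior P(Y=1|X) obtained from Bayes' rule is exactly a sigmoid of a linear function of X, using the weight correspondence given in Mitchell's section 3.1.

    import numpy as np

    # Made-up GNB-2 parameters: 2 features, shared (class-independent) variances.
    pi1 = 0.4                                  # P(Y=1)
    mu0 = np.array([0.0, 1.0])                 # means of Xi given Y=0
    mu1 = np.array([2.0, -1.0])                # means of Xi given Y=1
    var = np.array([1.5, 0.5])                 # variances, the same for both classes

    def gauss(x, mu, var):
        return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

    def gnb2_posterior(x):
        # P(Y=1|x) via Bayes' rule with the naive-Bayes factorization.
        p1 = pi1 * np.prod(gauss(x, mu1, var))
        p0 = (1 - pi1) * np.prod(gauss(x, mu0, var))
        return p1 / (p1 + p0)

    # The implied logistic-regression weights (Mitchell, section 3.1).
    w = (mu0 - mu1) / var
    w0 = np.log((1 - pi1) / pi1) + np.sum((mu1 ** 2 - mu0 ** 2) / (2 * var))

    def logistic_posterior(x):
        return 1.0 / (1.0 + np.exp(w0 + w @ x))

    rng = np.random.default_rng(2)
    for x in rng.normal(size=(5, 2)):
        print(gnb2_posterior(x), logistic_posterior(x))   # the two columns match

With class-dependent variances (GNB-1), the quadratic terms in the exponents no longer cancel, which is why that posterior generally falls outside the logistic-regression family.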
