# UB CSE 574 - Linear Classification: Probabilistic Generative Models (18 pages)


## Linear Classification: Probabilistic Generative Models


Lecture Notes

- Pages: 18
- School: University at Buffalo, The State University of New York
- Course: CSE 574 - Introduction to Machine Learning


Sargur N. Srihari, University at Buffalo, The State University of New York, USA

### Topics

1. Overview: generative vs. discriminative models
2. Bayes classifier using the logistic sigmoid and softmax
3. Continuous inputs: Gaussian-distributed class-conditionals, parameter estimation
4. Discrete features
5. The exponential family

### Overview of Methods for Classification

1. **Generative models** (two steps):
   1. Infer the class-conditional densities $p(\mathbf{x}|C_k)$ and the priors $p(C_k)$.
   2. Use Bayes' theorem to determine the posterior probabilities $p(C_k|\mathbf{x})$.
2. **Discriminative models** (one step): directly infer the posterior probabilities $p(C_k|\mathbf{x})$.

In both cases, decision theory is then used to assign each new $\mathbf{x}$ to a class.

### Generative Model: Two-Class Case

Model the class-conditionals $p(\mathbf{x}|C_k)$ and the priors $p(C_k)$, then compute the posteriors $p(C_k|\mathbf{x})$ from Bayes' theorem. The posterior for class $C_1$ is

$$
p(C_1|\mathbf{x}) = \frac{p(\mathbf{x}|C_1)\,p(C_1)}{p(\mathbf{x}|C_1)\,p(C_1) + p(\mathbf{x}|C_2)\,p(C_2)} = \frac{1}{1+\exp(-a)} = \sigma(a),
$$

where

$$
a = \ln \frac{p(\mathbf{x}|C_1)\,p(C_1)}{p(\mathbf{x}|C_2)\,p(C_2)}
$$

is the log-likelihood ratio combined with the Bayes (prior) odds.

### Logistic Sigmoid Function

$$
\sigma(a) = \frac{1}{1+\exp(-a)}
$$

- Symmetry property: $\sigma(-a) = 1 - \sigma(a)$.
- Inverse: $a = \ln\!\left(\frac{\sigma}{1-\sigma}\right)$. With $\sigma = p(C_1|\mathbf{x})$, the inverse represents $a = \ln\frac{p(C_1|\mathbf{x})}{p(C_2|\mathbf{x})}$, the log ratio of probabilities, called the *logit* or *log odds*.
- The sigmoid is an S-shaped "squashing" function: it maps the real line onto the finite interval $(0, 1)$.

[Figure: sigmoid curve. The dotted line is a scaled probit function, the CDF of a zero-mean, unit-variance Gaussian.]

### Generalizations and Special Cases

- More than two classes
- Gaussian distribution of $\mathbf{x}$
- Discrete features
- The exponential family

### Softmax: Generalization of the Logistic Sigmoid

For $K > 2$ classes,

$$
p(C_k|\mathbf{x}) = \frac{p(\mathbf{x}|C_k)\,p(C_k)}{\sum_j p(\mathbf{x}|C_j)\,p(C_j)} = \frac{\exp(a_k)}{\sum_j \exp(a_j)},
$$

where the quantities $a_k$ are defined by $a_k = \ln p(\mathbf{x}|C_k)\,p(C_k)$. This is known as the *softmax* function. If $K = 2$, it reduces to the earlier sigmoid form:

$$
p(C_1|\mathbf{x}) = \frac{\exp(a_1)}{\exp(a_1)+\exp(a_2)} = \frac{1}{1+\exp(a_2 - a_1)} = \frac{1}{1+\exp\!\left(-\ln\frac{p(\mathbf{x}|C_1)\,p(C_1)}{p(\mathbf{x}|C_2)\,p(C_2)}\right)} = \frac{1}{1+\exp(-a)},
$$

where $a = \ln\frac{p(\mathbf{x}|C_1)\,p(C_1)}{p(\mathbf{x}|C_2)\,p(C_2)}$, as in the two-class case above.
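The two-class identity above can be checked numerically: computing the posterior directly from Bayes' theorem gives the same number as pushing the log-odds $a$ through the sigmoid. A minimal sketch, with hypothetical likelihood and prior values chosen only for illustration:

```python
import math

def sigmoid(a: float) -> float:
    """Logistic sigmoid: maps the real line onto (0, 1)."""
    return 1.0 / (1.0 + math.exp(-a))

def posterior_c1(px_c1: float, p_c1: float, px_c2: float, p_c2: float) -> float:
    """Posterior p(C1|x) computed directly from Bayes' theorem."""
    return (px_c1 * p_c1) / (px_c1 * p_c1 + px_c2 * p_c2)

# Hypothetical class-conditional likelihoods p(x|Ck) and priors p(Ck).
px_c1, p_c1 = 0.30, 0.60
px_c2, p_c2 = 0.10, 0.40

# Log-likelihood ratio combined with the prior odds.
a = math.log((px_c1 * p_c1) / (px_c2 * p_c2))

# The two routes to the posterior agree: p(C1|x) = sigma(a).
assert abs(posterior_c1(px_c1, p_c1, px_c2, p_c2) - sigmoid(a)) < 1e-12
```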
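The two sigmoid properties listed above, the symmetry $\sigma(-a) = 1 - \sigma(a)$ and the logit as the inverse map, can likewise be verified at an arbitrary test point:

```python
import math

def sigmoid(a: float) -> float:
    """Logistic sigmoid sigma(a) = 1 / (1 + exp(-a))."""
    return 1.0 / (1.0 + math.exp(-a))

def logit(s: float) -> float:
    """Inverse of the sigmoid: the log odds ln(s / (1 - s))."""
    return math.log(s / (1.0 - s))

a = 1.7  # arbitrary test point
# Symmetry property: sigma(-a) = 1 - sigma(a).
assert abs(sigmoid(-a) - (1.0 - sigmoid(a))) < 1e-12
# The logit inverts the sigmoid.
assert abs(logit(sigmoid(a)) - a) < 1e-12
```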
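Finally, the reduction of the softmax to the sigmoid for $K = 2$ can be demonstrated directly. This sketch uses hypothetical activations $a_k$ and the standard max-subtraction trick for numerical stability (an implementation detail not discussed in the notes):

```python
import math

def softmax(a: list[float]) -> list[float]:
    """Softmax over activations a_k: exp(a_k) / sum_j exp(a_j)."""
    m = max(a)  # subtracting the max avoids overflow in exp without changing the result
    exps = [math.exp(x - m) for x in a]
    z = sum(exps)
    return [e / z for e in exps]

def sigmoid(a: float) -> float:
    return 1.0 / (1.0 + math.exp(-a))

a1, a2 = 2.0, 0.5  # hypothetical activations a_k = ln p(x|Ck) p(Ck)
p = softmax([a1, a2])

# For K = 2 the softmax reduces to the sigmoid of a = a1 - a2.
assert abs(p[0] - sigmoid(a1 - a2)) < 1e-12
# The outputs form a valid probability distribution.
assert abs(sum(p) - 1.0) < 1e-12
```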
