UCSD CSE 190 - Linear Discriminant Functions

Unformatted text preview:

CSE 190a, Fall 06

Announcements
• HW1 due today
• Most of last lecture was on the blackboard.
• Gather hand data set today.

Linear Discriminant Functions (Sections 5.1-5.2)
(Slides: Pattern Classification, Ch. 4, Part 1)

Linear discriminant functions and decision surfaces
• Definition: a linear discriminant function is a linear combination of the components of x,

      g(x) = w^t x + w0        (1)

  where w is the weight vector and w0 is the bias.
• A two-category classifier with a discriminant function of the form (1) uses the following rule:

      decide ω1 if g(x) > 0 and ω2 if g(x) < 0
      ⇔ decide ω1 if w^t x > -w0 and ω2 otherwise.

  If g(x) = 0, x can be assigned to either class.
• The equation g(x) = 0 defines the decision surface that separates points assigned to category ω1 from points assigned to category ω2.
• When g(x) is linear, the decision surface is a hyperplane.
• Algebraic measure of the distance from x to the hyperplane (interesting result!):
  write x = x_p + r (w / ||w||), where x_p is the projection of x onto the hyperplane H (w is colinear with x - x_p, and ||w / ||w|| || = 1). Since g(x_p) = 0 and w^t w = ||w||^2, it follows that

      r = g(x) / ||w||,    and in particular    d(0, H) = w0 / ||w||.

• In conclusion, a linear discriminant function divides the feature space by a hyperplane decision surface. The orientation of the surface is determined by the normal vector w, and the location of the surface is determined by the bias w0.
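The decision rule and the distance result above translate directly into a few lines of code. This is a minimal sketch, not from the slides; the function names and the example weights are illustrative:

```python
import numpy as np

def g(x, w, w0):
    """Linear discriminant g(x) = w^t x + w0."""
    return np.dot(w, x) + w0

def classify(x, w, w0):
    """Decide class 1 if g(x) > 0, class 2 if g(x) < 0 (ties arbitrary)."""
    return 1 if g(x, w, w0) > 0 else 2

def distance_to_hyperplane(x, w, w0):
    """Signed distance r = g(x) / ||w|| from x to the surface g(x) = 0."""
    return g(x, w, w0) / np.linalg.norm(w)

# Example: the line x1 + x2 - 1 = 0 in 2-D.
w, w0 = np.array([1.0, 1.0]), -1.0
x = np.array([2.0, 2.0])
print(classify(x, w, w0))                # 1, since g(x) = 3 > 0
print(distance_to_hyperplane(x, w, w0))  # 3 / sqrt(2), about 2.1213
```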
The multi-category case
• We define c linear discriminant functions

      gi(x) = wi^t x + wi0,    i = 1, ..., c

  and assign x to ωi if gi(x) > gj(x) ∀ j ≠ i; in case of ties, the classification is undefined.
• In this case, the classifier is a "linear machine" (a minimal sketch of this rule appears after the slide text below).
• A linear machine divides the feature space into c decision regions, with gi(x) being the largest discriminant if x is in region Ri.
• For two contiguous regions Ri and Rj, the boundary that separates them is a portion of the hyperplane Hij defined by:

      gi(x) = gj(x)  ⇔  (wi - wj)^t x + (wi0 - wj0) = 0

• wi - wj is normal to Hij, and

      d(x, Hij) = (gi(x) - gj(x)) / ||wi - wj||

• It is easy to show that the decision regions for a linear machine are convex; this restriction limits the flexibility and accuracy of the classifier.

Perceptron
(Slides: CS348, Fall 2001, © David Kriegman)
• Xi: inputs; Wi: weights; θ: threshold. A perceptron is a linear, threshold unit.
• The threshold can easily be forced to 0 by introducing an additional weight input W0 = θ (see the augmented-input sketch below).

How Powerful Is a Perceptron? (Threshold = 0)

Concept Space & Linear Separability

Training Perceptron
(A sketch of the standard update rule follows below.)

Increasing Expressiveness: Multi-Layer Neural Networks
2-layer Neural ...
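As referenced above, here is a minimal linear-machine sketch. The matrix layout and the example weights are illustrative assumptions, not from the slides:

```python
import numpy as np

def linear_machine(x, W, w0):
    """Evaluate c discriminants g_i(x) = w_i^t x + w_i0 and return the arg max.

    W is a (c, d) matrix whose rows are the weight vectors w_i, and
    w0 is a length-c vector of biases. Ties are undefined in theory;
    argmax simply returns the first maximizer here.
    """
    scores = W @ x + w0            # g_1(x), ..., g_c(x)
    return int(np.argmax(scores))  # index i of the winning class

# Example: three classes in 2-D.
W = np.array([[ 1.0,  0.0],
              [-1.0,  0.0],
              [ 0.0,  1.0]])
w0 = np.array([0.0, 0.0, 0.5])
print(linear_machine(np.array([2.0, 1.0]), W, w0))  # 0, scores = (2, -2, 1.5)
```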

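The threshold-to-bias trick from the perceptron slides can be checked in a few lines. A minimal sketch, assuming the common convention of a constant -1 input paired with weight W0 = θ; the slides state the trick, but the sign convention here is an assumption:

```python
import numpy as np

def perceptron_output(x, w, theta):
    """Linear threshold unit: fire (1) iff sum_i W_i X_i >= theta."""
    return int(np.dot(w, x) >= theta)

def augment(x):
    """Absorb the threshold into the weights: prepend a constant -1 input,
    so with w' = (theta, w) the test becomes w'^t x' >= 0."""
    return np.concatenate(([-1.0], x))

# Check the equivalence on one example (assumed values).
x, w, theta = np.array([1.0, 0.0]), np.array([0.5, 0.5]), 0.3
w_aug = np.concatenate(([theta], w))
assert perceptron_output(x, w, theta) == int(np.dot(w_aug, augment(x)) >= 0)
```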

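The "Training Perceptron" slides are image-only in this extract, so the update rule itself is not visible here. The sketch below implements the standard perceptron learning rule (w ← w + η·y_k·x_k on each misclassified sample), which is presumably what the slides cover:

```python
import numpy as np

def train_perceptron(X, y, epochs=100, lr=1.0):
    """Standard perceptron learning rule on augmented inputs.

    X: (n, d) array of samples; y: labels in {-1, +1}.
    On each misclassified sample, w <- w + lr * y_k * x_k.
    Converges (errors reach 0) iff the data are linearly separable.
    """
    Xa = np.hstack([np.ones((X.shape[0], 1)), X])  # bias via augmented input
    w = np.zeros(Xa.shape[1])
    for _ in range(epochs):
        errors = 0
        for xk, yk in zip(Xa, y):
            if yk * np.dot(w, xk) <= 0:            # misclassified or on boundary
                w += lr * yk * xk
                errors += 1
        if errors == 0:
            break
    return w

# Example: learn a separator for OR-like data (linearly separable).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, 1])
w = train_perceptron(X, y)
print(np.sign(np.hstack([np.ones((4, 1)), X]) @ w))  # [-1.  1.  1.  1.]
```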