8.8 Constrained LS

Why Constrain?

Because sometimes we know or believe certain values are not allowed for θ. For example, in emitter location you may know that the emitter's range can't exceed the radio horizon. You may also know that the emitter is on the left side of the aircraft, because you got a strong signal from the left-side antennas and a weak one from the right-side antennas. Thus, when finding θ̂_LS, you want to constrain it to satisfy these conditions.

Constrained LS Problem Statement

Let S_c be the set of allowable values due to the constraints. Then we seek θ̂_CLS ∈ S_c such that

  ||x − H θ̂_CLS||² = min_{θ ∈ S_c} ||x − H θ||²

Types of Constraints (in order of increasing difficulty):

1. Linear equality: Aθ = b (constrained to a line, plane, or hyperplane)
2. Nonlinear equality: f(θ) = b
3. Linear inequality: Aθ ≤ b or Aθ ≥ b (constrained to lie above/below a hyperplane)
4. Nonlinear inequality: f(θ) ≤ b or f(θ) ≥ b

We'll cover case 1; see books on optimization for the other cases.

LS Cost with a Linear Equality Constraint

Using Lagrange multipliers, we need to minimize

  J_c(θ) = (x − Hθ)^T (x − Hθ) + λ^T (Aθ − b)

with respect to θ and λ. [Figure: contours of (x − Hθ)^T (x − Hθ) in the (θ_1, θ_2) plane, centered on the unconstrained minimum; a 2-D linear equality constraint is a line, and the constrained minimum is the point where the constraint line touches the lowest-cost contour.]

Constrained Optimization via Lagrange Multiplier

To optimize f(x_1, x_2) subject to the constraint g(x_1, x_2) = C, define h(x_1, x_2) = g(x_1, x_2) − C = 0. Example of a linear constraint: a·x_1 + b·x_2 + c = 0, i.e. x_2 = −(a/b)x_1 − c/b, for which

  ∇h(x_1, x_2) = [a  b]^T

The gradient vector has slope b/a, orthogonal to the constraint line (whose slope is −a/b). The constrained extremum occurs where a contour of f is tangent to the constraint curve g(x_1, x_2) = C, i.e., where the gradients are parallel:

  ∇f(x_1, x_2) + λ ∇h(x_1, x_2) = 0

LS Solution with a Linear Equality Constraint

Follow the usual steps for a Lagrange-multiplier solution:

1. Set ∂J_c/∂θ = 0 and solve for θ̂_c as a function of λ:

  −2H^T x + 2H^T Hθ + A^T λ = 0
  ⇒ θ̂_c = (H^T H)^{−1} H^T x − ½(H^T H)^{−1} A^T λ = θ̂_uc − ½(H^T H)^{−1} A^T λ

where θ̂_uc = (H^T H)^{−1} H^T x is the unconstrained estimate.

2. Solve for λ so that θ̂_c satisfies the constraint A θ̂_c = b:

  A θ̂_uc − ½ A (H^T H)^{−1} A^T λ = b
  ⇒ λ = 2 [A (H^T H)^{−1} A^T]^{−1} (A θ̂_uc − b)

3. Plug λ back in to get the constrained solution:

  θ̂_c = θ̂_uc − (H^T H)^{−1} A^T [A (H^T H)^{−1} A^T]^{−1} (A θ̂_uc − b)

The second term is a correction term: a gain matrix times the amount of constraint deviation, A θ̂_uc − b.

Geometry of Constrained Linear LS

The above result can be interpreted geometrically: the constrained estimate of the signal, ŝ_c = H θ̂_c, is the projection of the unconstrained estimate ŝ_uc = H θ̂_uc onto the linear constraint subspace.
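The three-step solution above can be sketched numerically. A minimal sketch, assuming a small synthetic problem (the particular H, x, A, b values here are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear model x = H @ theta_true + noise (illustrative values)
N, p = 50, 3
H = rng.standard_normal((N, p))
theta_true = np.array([1.0, -2.0, 0.5])
x = H @ theta_true + 0.1 * rng.standard_normal(N)

# Linear equality constraint A @ theta = b (here: theta_1 + theta_2 = -1)
A = np.array([[1.0, 1.0, 0.0]])
b = np.array([-1.0])

# Step 1: unconstrained LS estimate
HtH_inv = np.linalg.inv(H.T @ H)
theta_uc = HtH_inv @ H.T @ x

# Steps 2-3: gain matrix times the constraint deviation (A @ theta_uc - b)
# pulls theta_uc onto the constraint set
K = HtH_inv @ A.T @ np.linalg.inv(A @ HtH_inv @ A.T)
theta_c = theta_uc - K @ (A @ theta_uc - b)

print(A @ theta_c)  # satisfies the constraint: equals b up to rounding
```

Note the correction is applied only to the extent that θ̂_uc violates the constraint: if A θ̂_uc = b already, then θ̂_c = θ̂_uc.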
8.9 Nonlinear LS

Everything we've done up to now has assumed a linear observation model, but we've already seen that many applications have nonlinear observation models: s = s(θ), nonlinear in θ. Recall that for the linear case there is a closed-form solution. Not so for the nonlinear case: we must use numerical, iterative methods to minimize the LS cost given by

  J(θ) = (x − s(θ))^T (x − s(θ))

But first, two tricks.

Two Tricks for Nonlinear LS

Sometimes it is possible to:

1. Transform into a linear problem.
2. Separate out any linear parameters.

(Sometimes it is possible to do both tricks together.)

Trick 1 (Linearization): Seek an invertible function α = g(θ), θ = g^{−1}(α), such that s(θ(α)) = Hα, which can be easily solved for α̂_LS; then find θ̂_LS = g^{−1}(α̂_LS).

Trick 2 (Separation): See if some of the parameters enter linearly. Try to decompose θ = [α^T  β^T]^T to get s(θ) = H(α)β — nonlinear in α, linear in β.

Example of the Linearization Trick

Consider estimation of a sinusoid's amplitude and phase, with a known frequency f_o:

  s[n] = A cos(2π f_o n + φ),  θ = [A  φ]^T

But we can rewrite this model as

  s[n] = A cos(φ) cos(2π f_o n) − A sin(φ) sin(2π f_o n) = α_1 cos(2π f_o n) + α_2 sin(2π f_o n)

with α_1 = A cos(φ) and α_2 = −A sin(φ), which is linear in α = [α_1  α_2]^T, so α̂ = (H^T H)^{−1} H^T x. Then map this estimate back using θ̂ = g^{−1}(α̂):

  Â = √(α̂_1² + α̂_2²),  φ̂ = tan^{−1}(−α̂_2 / α̂_1)

Note that for this example this is merely exploiting polar-to-rectangular ideas.

Example of the Separation Trick

Consider a signal model of three exponentials:

  s[n] = A_1 r^n + A_2 r^{2n} + A_3 r^{3n},  0 < r < 1,  θ = [A_1  A_2  A_3  r]^T = [A^T  r]^T

Then we can write s = H(r) A with

  H(r) = [ 1          1             1
           r          r²            r³
           ⋮          ⋮             ⋮
           r^{N−1}    r^{2(N−1)}    r^{3(N−1)} ]

For any given r, the linear part has the standard solution Â(r) = (H^T(r) H(r))^{−1} H^T(r) x. Then we need to minimize

  J(r) = [x − H(r)(H^T(r)H(r))^{−1}H^T(r)x]^T [x − H(r)(H^T(r)H(r))^{−1}H^T(r)x]

which depends on only one variable, so one might conceivably just compute J(r) on a grid and find the minimum.
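The separation trick's grid search can be sketched numerically. A minimal sketch, assuming illustrative values for r, the amplitudes, and the grid range:

```python
import numpy as np

rng = np.random.default_rng(3)

# s[n] = A1*r^n + A2*r^(2n) + A3*r^(3n), 0 < r < 1 (illustrative values)
N = 40
n = np.arange(N)
r_true, A_true = 0.9, np.array([1.0, -0.5, 2.0])

def H_of(r):
    # H(r): columns r^n, r^(2n), r^(3n) -- the part the model is linear in
    return np.column_stack([r**n, r**(2 * n), r**(3 * n)])

x = H_of(r_true) @ A_true + 0.005 * rng.standard_normal(N)

def J(r):
    # LS cost after substituting A_hat(r) = (H^T H)^{-1} H^T x
    Hr = H_of(r)
    resid = x - Hr @ np.linalg.lstsq(Hr, x, rcond=None)[0]
    return resid @ resid

# J depends on only one variable, so just compute it on a grid
grid = np.linspace(0.5, 0.99, 491)
r_hat = grid[np.argmin([J(r) for r in grid])]
A_hat = np.linalg.lstsq(H_of(r_hat), x, rcond=None)[0]
print(r_hat)  # close to 0.9
```

In practice a coarse grid followed by a fine local grid (or a 1-D minimizer) reduces the cost of the search.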
Iterative Methods for Solving Nonlinear LS

Goal: find the value θ̂ that minimizes J(θ) = (x − s(θ))^T (x − s(θ)) without computing it over a p-dimensional grid. The two most common approaches:

1. Newton-Raphson:
   (a) Analytically find ∂J/∂θ.
   (b) Apply Newton-Raphson to find a zero of ∂J/∂θ, i.e., linearize ∂J/∂θ about the current estimate.
   (c) Repeat iteratively.
2. Gauss-Newton:
   (a) Linearize the signal model s(θ) about the current estimate.
   (b) Solve the resulting linear problem.
   (c) Repeat iteratively.

Both involve linearization, but they each linearize something different; both then solve a linear problem and iteratively improve the result.

Newton-Raphson Solution to Nonlinear LS

To find the minimum of J, set ∂J/∂θ = 0. We need

  g(θ) = ∂J/∂θ = [∂J/∂θ_1  ⋯  ∂J/∂θ_p]^T

Taking these partials gives

  ∂J/∂θ_j = −2 Σ_{n=0}^{N−1} (x[n] − s[n]) ∂s[n]/∂θ_j

(the factor −2 can be ignored — why?). Now set to zero:

  Σ_{n=0}^{N−1} r[n] h_{nj} = 0  for j = 1, …, p

where r[n] = x[n] − s[n] is the residual and h_{nj} = ∂s[n]/∂θ_j. In matrix-vector form, define the N × p Jacobian

  H(θ) = [ ∂s[0]/∂θ_1     ∂s[0]/∂θ_2     ⋯  ∂s[0]/∂θ_p
           ∂s[1]/∂θ_1     ∂s[1]/∂θ_2     ⋯  ∂s[1]/∂θ_p
           ⋮                              ⋮
           ∂s[N−1]/∂θ_1   ∂s[N−1]/∂θ_2   ⋯  ∂s[N−1]/∂θ_p ]

whose nth row is h^T(n), and the residual vector r(θ) = x − s(θ); both depend nonlinearly on θ. Then the equation to solve is

  g(θ) = H^T(θ) r(θ) = Σ_{n=0}^{N−1} r[n] h(n) = 0

For Newton-Raphson we linearize g around our current estimate and iterate:

  θ_{k+1} = θ_k − [∂g(θ)/∂θ]^{−1} g(θ) |_{θ = θ_k}

so we need ∂g/∂θ = ∂[H^T(θ) r(θ)]/∂θ. By the derivative-of-product rule,

  ∂[H^T r]/∂θ = Σ_{n=0}^{N−1} [∂h(n)/∂θ] r[n] + Σ_{n=0}^{N−1} h(n) [∂r[n]/∂θ]^T = Σ_{n=0}^{N−1} G_n(θ)(x[n] − s[n]) − H^T(θ)H(θ)

where [G_n(θ)]_{ij} = ∂²s[n]/∂θ_i ∂θ_j, for i, j = 1, 2, …, p. …
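The Gauss-Newton approach listed above (linearize s(θ) about the current estimate, solve the linear problem, repeat) can be sketched numerically. A minimal sketch, assuming an illustrative two-parameter model s[n; θ] = A e^{−n/τ}; the Jacobian H(θ) and residual r are the quantities defined above:

```python
import numpy as np

rng = np.random.default_rng(2)

# Nonlinear model s[n; theta] = A * exp(-n / tau), theta = (A, tau)
# (true values and initial guess are illustrative)
N = 60
n = np.arange(N, dtype=float)
A_true, tau_true = 3.0, 15.0
x = A_true * np.exp(-n / tau_true) + 0.02 * rng.standard_normal(N)

theta = np.array([2.0, 10.0])  # initial guess for (A, tau)
for _ in range(30):
    A, tau = theta
    s = A * np.exp(-n / tau)
    r = x - s  # residual r[n] = x[n] - s[n; theta]
    # Jacobian H(theta): [H]_{nj} = ds[n]/d(theta_j)
    H = np.column_stack([np.exp(-n / tau),
                         A * n / tau**2 * np.exp(-n / tau)])
    # Gauss-Newton step: solve the linearized LS problem
    # min || H @ delta - r ||^2 and update the estimate
    theta = theta + np.linalg.lstsq(H, r, rcond=None)[0]

print(theta)  # close to (3.0, 15.0)
```

Newton-Raphson would additionally use the second-derivative Σ G_n(θ)(x[n] − s[n]) term in the matrix being inverted; Gauss-Newton drops it, which is often adequate when the residuals are small.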