MIT OpenCourseWare
http://ocw.mit.edu

6.854J / 18.415J Advanced Algorithms, Fall 2008

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.


18.415/6.854 Advanced Algorithms                                December 1, 2008

Lecture 22

Lecturer: Michel X. Goemans

In this lecture, we introduce Seidel's algorithm [3] to solve linear programs with n constraints in dimension d, when the dimension is small. The expected running time of Seidel's algorithm is O(d! n), i.e. it is strongly polynomial for fixed dimension d (strongly, since it does not depend on the size of the input coefficients). Then, we use Seidel's algorithm to develop a randomized convex-hull algorithm in an arbitrary dimension d which is the best possible when d ≥ 4.

1 Linear Programming in Fixed Dimension

In this section, we fix the dimension d. We wish to find a strongly polynomial time algorithm to solve linear programming.

1.1 Seidel's Algorithm

Let H be a set of n inequalities. Each inequality corresponds to a half-space h determined by a hyperplane. Let LP(H) be the linear program that minimizes c^T x subject to the constraints

    x ∈ ⋂_{h ∈ H} h,   x ∈ R^d.

To make the description of the algorithm simpler, we make the following two assumptions:

1. Bounded: the feasible region is bounded, i.e. there exists M such that, for any feasible x, −M ≤ x_i ≤ M for all i = 1, 2, . . . , d. This assumption can be enforced by fictitiously imposing a large bounding box; whenever one of the inequalities of this bounding box is tight at the optimum, we know that the original linear program is unbounded.

2. Non-degenerate: the intersection of any d + 1 of the hyperplanes is empty. In two dimensions, non-degeneracy means that there do not exist three lines meeting at the same point. If H does not meet this assumption, we can use standard tricks such as perturbation to make it non-degenerate; this can be handled by a so-called lexicographic perturbation.

These two assumptions imply that for any H′ ⊆ H, LP(H′) has a unique solution x(H′). Seidel's algorithm actually applies to a more general class of problems than linear programming, but here we focus on linear programming. What is actually needed in the generalization is that the unique solution x(H′) is defined by a basis:

Definition 1 A subset B ⊆ H′ is called a basis of the linear program LP(H′) if x(B) = x(H′) and B is minimal with this property.

Seidel's algorithm solves the linear program LP(H) incrementally as follows. Choose h ∈ H uniformly at random. Solve the linear program with h removed, and let x be its solution. If x satisfies h, then return x. If x does not satisfy h, we impose the condition that h is satisfied with equality, eliminate one variable, and solve the resulting linear program with d − 1 variables and n − 1 inequalities. The correctness of this algorithm was proved in the last lecture.

In Seidel's algorithm, we can stop the recursion when we have either n constraints in d = 1 variable (which takes O(n) time to solve), or 1 constraint in d variables (which takes O(d) time to optimize over our fictitious bounding box).
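To make the incremental step concrete, here is a minimal sketch in dimension d = 2, where "impose h with equality and eliminate one variable" amounts to solving a one-dimensional LP along the boundary line of the violated constraint. The code is purely illustrative and not part of the notes: the names seidel_lp_2d and solve_on_line are ours, the recursion is unrolled into a random insertion order, and the boundedness and non-degeneracy assumptions above are taken for granted.

import random

EPS = 1e-9

def seidel_lp_2d(c, constraints, M=1e6):
    """Minimize c.x subject to a.x <= b for every (a, b), inside the box [-M, M]^2."""
    # Optimum over the fictitious bounding box alone: push each coordinate to a box face.
    x = [-M if c[j] > 0 else M for j in range(2)]
    H = list(constraints)
    random.shuffle(H)          # random insertion order
    added = []                 # constraints already incorporated
    for (a, b) in H:
        if a[0] * x[0] + a[1] * x[1] <= b + EPS:
            # Current optimum already satisfies the new constraint: keep it.
            added.append((a, b))
            continue
        # Otherwise the new optimum lies on the line a.x = b; this is the
        # variable-elimination step, solved here as a 1-dimensional LP.
        x = solve_on_line(c, a, b, added, M)
        added.append((a, b))
    return x

def solve_on_line(c, a, b, constraints, M):
    """Minimize c.x on the line a.x = b, subject to the given constraints and the box."""
    # Parameterize the line as p + t * dvec, with dvec orthogonal to a.
    norm2 = a[0] ** 2 + a[1] ** 2
    p = [a[0] * b / norm2, a[1] * b / norm2]
    dvec = [-a[1], a[0]]
    lo, hi = float("-inf"), float("inf")
    box = [((1, 0), M), ((-1, 0), M), ((0, 1), M), ((0, -1), M)]
    for (ai, bi) in constraints + box:
        coef = ai[0] * dvec[0] + ai[1] * dvec[1]
        rhs = bi - (ai[0] * p[0] + ai[1] * p[1])
        if abs(coef) < EPS:
            if rhs < -EPS:
                raise ValueError("infeasible (excluded by our assumptions)")
        elif coef > 0:
            hi = min(hi, rhs / coef)
        else:
            lo = max(lo, rhs / coef)
    if lo > hi + EPS:
        raise ValueError("infeasible (excluded by our assumptions)")
    # The objective is linear in t along the line; take the minimizing endpoint
    # (non-degeneracy lets us break the tie arbitrarily when the slope is zero).
    slope = c[0] * dvec[0] + c[1] * dvec[1]
    t = lo if slope > 0 else hi
    return [p[0] + t * dvec[0], p[1] + t * dvec[1]]

# Example (hypothetical data): minimize x_1 + 2 x_2 subject to x_1 >= 0, x_2 >= 0,
# x_1 + x_2 >= 1, written in <= form; the optimum is (1, 0).
print(seidel_lp_2d([1, 2], [((-1, 0), 0), ((0, -1), 0), ((-1, -1), -1)]))

Each violated constraint triggers a pass over the constraints added so far, which is exactly where the d/n backward-analysis probability of the next subsection enters the running time.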
1.2 Analysis of Running Time

Let T(d, n) be the expected running time of Seidel's algorithm on an instance with n inequalities and d variables. To find a recursive relation for T(d, n), note that we first recursively solve an LP with n − 1 inequalities and d variables, which takes time T(d, n − 1). If the solution x satisfies the removed constraint h (which takes O(d) time to check), we are done and simply return the d coordinates of x.

If x does not satisfy h, we first reduce the LP to only d − 1 variables in O(dn) time (it takes O(d) time to eliminate one variable in each constraint) using the constraint h, and then solve the LP with n − 1 inequalities and d − 1 variables in T(d − 1, n − 1) time. The probability that x does not satisfy h is d/n, since the optimal solution is determined by exactly d inequalities and we have selected an inequality uniformly at random. This is the important step in the analysis and is known as backward analysis.

By the analysis above, we have

    T(d, n) = T(d, n − 1) + O(d) + (d/n) (O(dn) + T(d − 1, n − 1))
            = T(d, n − 1) + (d/n) T(d − 1, n − 1) + O(d²).

The base cases are T(1, n) = O(n) and T(d, 1) = O(d). Using this recursive relation, we can prove by induction on d + n that:

Claim 1

    T(d, n) = O( ( Σ_{1 ≤ i ≤ d} i²/i! ) d! n ) = O(d! n).

Proof: The base cases are satisfied. We need to check the induction step. Suppose that

    T(d, n − 1) = O( ( Σ_{1 ≤ i ≤ d} i²/i! ) d! (n − 1) ),
    T(d − 1, n − 1) = O( ( Σ_{1 ≤ i ≤ d−1} i²/i! ) (d − 1)! (n − 1) ).

Since

    ( Σ_{1 ≤ i ≤ d} i²/i! ) d! (n − 1) + (d/n) ( Σ_{1 ≤ i ≤ d−1} i²/i! ) (d − 1)! (n − 1) + d² ≤ ( Σ_{1 ≤ i ≤ d} i²/i! ) d! n,

the claim also holds for T(d, n). The second equality in the claim follows from the fact that Σ_{i=1}^{∞} i²/i! is finite. □

Thus, we have shown a strongly polynomial time algorithm to solve linear programs in a fixed small dimension d.

1.3 Improvement (Matoušek, Sharir, Welzl [2])

Although the expected running time of Seidel's algorithm is strongly polynomial in n, it increases exponentially as d increases (more precisely, the dependence on d is 2^{O(d log d)}). In this subsection, we briefly introduce an improvement to Seidel's algorithm which gives a subexponential bound in d.

We consider the linear program as follows. The algorithm LP(H, C) takes as input a candidate set C (that plays the role of a basis), and returns the solution x as well as a basis B. Initially, we call LP(H, C) with C = ∅.

The algorithm proceeds as follows. If H = C, then return C. If H ≠ C, choose h uniformly at random from H − C. We recursively call LP(H − {h}, C) and get a basis B. If h is satisfied by the solution defined by B, then we return B.
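As a complement to the analysis in Section 1.2, the short script below is a numerical sanity check of Claim 1 (again illustrative, not part of the notes): it tabulates the recurrence with every hidden constant set to 1, i.e. T(1, n) = n, T(d, 1) = d, and T(d, n) = T(d, n − 1) + (d/n) T(d − 1, n − 1) + d², and compares it with the claimed bound f(d) n, where f(d) = d! Σ_{1 ≤ i ≤ d} i²/i!.

from math import factorial

def seidel_time_table(dmax, nmax):
    """Tabulate the recurrence T(d, n), with all hidden constants set to 1."""
    T = [[0.0] * (nmax + 1) for _ in range(dmax + 1)]
    for n in range(1, nmax + 1):
        T[1][n] = n                      # base case: d = 1
    for d in range(1, dmax + 1):
        T[d][1] = d                      # base case: n = 1
    for d in range(2, dmax + 1):
        for n in range(2, nmax + 1):
            T[d][n] = T[d][n - 1] + (d / n) * T[d - 1][n - 1] + d * d
    return T

def claimed_bound(d, n):
    """The bound of Claim 1: f(d) * n with f(d) = d! * sum of i^2 / i! for 1 <= i <= d."""
    return factorial(d) * sum(i * i / factorial(i) for i in range(1, d + 1)) * n

dmax, nmax = 6, 1000
T = seidel_time_table(dmax, nmax)
for d in range(1, dmax + 1):
    # Each ratio should be at most 1, consistent with T(d, n) = O(d! n).
    print(d, T[d][nmax] / claimed_bound(d, nmax))

The check mirrors the induction step of the proof: the d/n factor contributes at most d · f(d − 1), and f(d) = d · f(d − 1) + d², so the bound absorbs both the recursive term and the d² overhead.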

