
Greedy Algorithms

Greedy Technique: What Is It? How Does It Work?
• A quick-and-dirty approach to solving optimization problems
• Does not necessarily end up with an optimal solution
• Problems explored:
  – Coin changing problem
  – Minimum spanning tree algorithms
  – Dijkstra's algorithm for single-source shortest paths
  – Knapsack problem

Optimization Problems
• An optimization problem has an objective function and a set of constraints
• Goal: find a feasible solution for the given instance for which the objective function has an optimal value (either maximum or minimum, depending on the problem being solved)
• A feasible solution satisfies the problem's constraints
• The constraints specify the limitations on the acceptable solutions
• An example follows

Coin Changing Problem
• Problem: return correct change using a minimum number of US coins
• Greedy choice: pick the coin with the highest value
• Example: the amount owed is 37 cents; the change returned is 1 quarter, 1 dime, and 2 pennies
• This solution is optimal when US coins are used. Why is it optimal?

A Greedy Solution (pseudocode)

    Input: a set of coin denominations and amount-owed
    change = {}
    while (coin denominations remain && valueof(change) < amount-owed) {
        // Selection: choose the largest remaining coin
        // Feasibility check
        if (adding the coin would make valueof(change) exceed amount-owed)
            reject the coin (drop this denomination from consideration)
        else
            add the coin to change
        // Check whether the problem is solved
        if (valueof(change) == amount-owed)
            return change
    }
    return "failed to compute change"

Elements of the Greedy Strategy
• Cast the problem as one in which you make a greedy choice and are left with a single sub-problem to solve
• Do a cost-benefit analysis for each greedy choice, e.g., the number of coins used vs. the remaining amount of change owed

A Greedy Solution Is Not Always Optimal!
• Reconsider the coin changing problem
• Suppose you live in Alice's Wonderland, where 12-cent coins exist in addition to US coins
• Suppose you owe 16 cents
• The greedy solution chooses one 12-cent coin and four 1-cent coins → 5 coins
• An optimal solution is one dime, one nickel, and one penny → 3 coins
• Greedy algorithms do not always find an optimal solution
  – A proof is needed to show that a greedy algorithm finds an optimal solution
  – A counterexample shows that a greedy algorithm does not provide an optimal solution
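The greedy procedure above can be made concrete with a short runnable sketch. The following Python version is only an illustration (the function name greedy_change and its interface are not from the slides); it reproduces both examples: 37 cents with US coins, where greedy is optimal, and 16 cents once a 12-cent coin is available, where it is not.

    def greedy_change(denominations, amount_owed):
        """Greedy coin changing: repeatedly take the largest coin that still fits.
        Returns the list of coins used, or None if the exact amount is unreachable."""
        change = []
        value = 0
        for coin in sorted(denominations, reverse=True):   # selection: largest coin first
            while value + coin <= amount_owed:             # feasibility check
                change.append(coin)
                value += coin
            if value == amount_owed:                       # check whether solved
                return change
        return None                                        # failed to compute change

    # US coins: greedy happens to be optimal for this denomination system.
    print(greedy_change([25, 10, 5, 1], 37))      # [25, 10, 1, 1] -> 4 coins
    # Wonderland coins (a 12-cent coin added): greedy returns 5 coins,
    # although 10 + 5 + 1 = 16 needs only 3 coins.
    print(greedy_change([25, 12, 10, 5, 1], 16))  # [12, 1, 1, 1, 1] -> 5 coins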
Greedy vs. Dynamic Programming
• Greedy algorithms make good local choices in the hope that they result in an optimal solution
  – Just make the choice that seems best at the moment and solve the remaining sub-problem in the next step
  – Iteratively make one greedy choice after another
  – This yields feasible solutions, but does not necessarily end with an optimal one
• A greedy algorithm never reconsiders its choices
  – This is the main difference from dynamic programming: dynamic programming is exhaustive and makes decisions based on all the previous decisions, potentially reconsidering earlier choices
• In an earlier lecture on dynamic programming, we saw both greedy and dynamic programming approaches for finding an optimal BST (binary search tree)
  – A greedy approach that places the highest-probability node at the root, or that tries to minimize the tree depth, does not necessarily give an optimal solution

Greedy Minimum Spanning Tree Algorithms
• Prim's algorithm
• Kruskal's algorithm

What Is a Spanning Tree?
[Figure: an example graph on the vertices u, v, a, b, c, d, e, f]
• A spanning tree of an undirected graph G = (V, E) is a subgraph of G that is a tree and contains all the vertices of G
• Can a graph have more than one spanning tree?
• Can an unconnected graph have a spanning tree?

Minimum Spanning Tree
[Figure: the same example graph with edge weights]
• The weight of a subgraph is the sum of the weights of its edges
• A minimum spanning tree (MST) for a weighted graph is a spanning tree with minimum weight, i.e., a spanning tree T for which w(T) = Σ_{(u,v) ∈ T} w(u,v) is minimized
• Can a graph have more than one minimum spanning tree?

Example of a Problem That Translates into an MST
• The problem: several pins of an electronic circuit must be connected using the least amount of wire
• Modeling the problem: build a complete, undirected graph G = (V, E, w), where V is the set of pins, E is the set of all possible interconnections between pairs of pins, and w(e) is the length of wire needed to connect that pair of pins
• Then find a minimum spanning tree of this graph

Greedy Choice
We will show two ways to build a minimum spanning tree:
• Prim's algorithm: an MST can be grown from the current tree by adding the nearest vertex and the edge connecting that vertex to the tree (example: Figure 4.4 on page 146)
• Kruskal's algorithm: an MST can be grown from a forest of trees by repeatedly adding the smallest edge that connects two of the trees (example: Figure 4.7 on page 153)

Notation
• Tree vertices: the vertices in the tree constructed so far
• Non-tree vertices: the remaining vertices

Prim's Selection Rule
• Select the minimum-weight edge between a tree vertex and a non-tree vertex and add it to the tree

Key Idea of Prim's Algorithm (pseudocode)

    select a vertex to be the first tree vertex
    while (there are non-tree vertices) {
        if (there is no edge connecting a tree vertex with a non-tree vertex)
            return "no spanning tree"
        select an edge of minimum weight between a tree vertex and a non-tree vertex
        add the selected edge and its new vertex to the tree
    }
    return tree

Prim's Algorithm
(Source: the free online algorithms book available at http://www.cs.berkeley.edu/~vazirani/algorithms.html)
• The weight matrix is defined by w[i][j] = 0 if i = j; the edge weight if there is an edge (i, j); and infinity if no edge (i, j) exists

Example
(Source: the same book)
• The book's example traces the cost and prev values of each vertex together with S, the set of tree vertices

Implementation
• The priority queue can be implemented as an array or as a heap
• The time complexity changes based on the implementation of the queue; more details follow after the sketch below
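Before the detailed analysis, here is a minimal runnable sketch of the heap-based version, assuming an adjacency-list representation (a dict mapping each vertex to its (neighbor, weight) pairs); the function name prim_mst and this interface are illustrative choices, not the slides' code. Instead of an explicit decreasekey, it uses Python's heapq with lazy deletion of stale queue entries, a common substitution when the priority queue does not support key updates.

    import heapq

    def prim_mst(graph, start):
        """Prim's algorithm on an undirected weighted graph.
        graph: dict mapping each vertex to a list of (neighbor, weight) pairs.
        Returns (total weight, list of tree edges), or None if the graph is not connected."""
        cost = {v: float('inf') for v in graph}   # cheapest known edge into the tree
        prev = {v: None for v in graph}           # tree neighbor realizing that cost
        cost[start] = 0
        in_tree = set()
        heap = [(0, start)]                       # priority queue keyed by cost
        while heap:
            c, v = heapq.heappop(heap)            # deletemin
            if v in in_tree:
                continue                          # stale entry (lazy deletion)
            in_tree.add(v)
            for z, w in graph[v]:
                if z not in in_tree and w < cost[z]:
                    cost[z] = w                   # plays the role of decreasekey:
                    prev[z] = v                   # push a better entry, skip old ones later
                    heapq.heappush(heap, (w, z))
        if len(in_tree) < len(graph):
            return None                           # no spanning tree
        edges = [(prev[v], v) for v in graph if prev[v] is not None]
        return sum(cost.values()), edges

    # A small made-up example (the weights are not the ones from the slides' figures):
    g = {'a': [('b', 2), ('c', 3)],
         'b': [('a', 2), ('c', 1), ('d', 4)],
         'c': [('a', 3), ('b', 1), ('d', 5)],
         'd': [('b', 4), ('c', 5)]}
    print(prim_mst(g, 'a'))   # (7, [('a', 'b'), ('b', 'c'), ('b', 'd')])

With a binary heap, each of the V deletemin operations and each of the at most E cost updates takes O(lg V) time, which matches the O(E lg V) bound derived next; with an array-based queue the same algorithm runs in O(V²) instead.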
Prim's Algorithm: Time Complexity
(Source: the same book; the slide marks individual steps of the pseudocode with costs Θ(V), Θ(1), and Θ(V).)
• deletemin is executed V times, and decreasekey is executed at most E times, while finding an MST
• So the total time complexity is O(V * deletemin) + O(E * decreasekey)
• Array implementation: deletemin is O(V) and decreasekey is O(1) → O(V²) overall
• Heap implementation: deletemin is O(lg V) and decreasekey is O(lg V) → O(E lg V) overall
• So using a heap is better when the graph is sparse (i.e., it has few edges) but worse when the graph has many edges: with E ≈ V the heap gives O(V lg V) against the array's O(V²), while with E ≈ V² it gives O(V² lg V), which is worse than O(V²)

Lemma 1
Let G = (V, E) be ...