Algorithm Analysis and Design Lecture Slides: Chapter 16 Greedy Algorithms
T&R Team of Algorithm Design, College of Computer Science and Engineering, CQU
Algorithm Analysis & Design: Greedy Algorithms - locally optimal choice

Overview
Like dynamic programming, greedy algorithms are used to solve optimization problems. Dynamic programming can be overkill; greedy algorithms tend to be easier to code.
- Problems exhibit optimal substructure (like DP).
- Problems also exhibit the greedy-choice property: when we have a choice to make, make the one that looks best right now. Make a locally optimal choice in the hope of getting a globally optimal solution.
"I make the shortest path to the target at each step. Sometimes I win, sometimes I lose."

Greedy Strategy
- The choice that seems best at the moment is the one we go with.
- Prove that when there is a choice to make, one of the optimal choices is the greedy choice. Therefore, it is always safe to make the greedy choice.
- Show that all but one of the subproblems resulting from the greedy choice are empty.

Greedy
"Greed is good. Greed is right. Greed works. Greed clarifies, cuts through, and captures the essence of the evolutionary spirit." - Gordon Gekko (Michael Douglas)

Coin Changing

The coin changing problem
Goal. Given currency denominations 1, 5, 10, 25, 100, devise a method to pay an amount to a customer using the fewest number of coins. Ex: 34 cents.

The coin changing problem: formal description
Let D = {d1, d2, ..., dk} be a finite set of distinct coin denominations, for example d1 = 25, d2 = 10, d3 = 5, and d4 = 1. We can assume each di is an integer and d1 > d2 > ... > dk. Each denomination is available in unlimited quantity.
The coin-changing problem:
- Make change for n cents, using a minimum total number of coins.
- Assume that dk = 1, so that there is always a solution.

A dynamic programming solution: Step 1
Step 1: Characterize the structure of a coin-change solution. Define C[j] to be the minimum number of coins we need to make change for j cents. If we knew that an optimal solution for the problem of making change for j cents used a coin of denomination di, we would have C[j] = 1 + C[j - di].

A dynamic programming solution: Step 2
Step 2: Recursively define the value of an optimal solution:
C[j] = 0 if j = 0, and C[j] = min over all di <= j of { 1 + C[j - di] } otherwise.
An example: coin set {50, 25, 10, 1}.

A DP solution: Step 3
Step 3: Compute values in a bottom-up fashion, in increasing order of j, so that C[j - di] is already available when C[j] is examined. Complexity: Θ(nk).

A DP solution: Step 4
Step 4: Construct an optimal solution. We use an additional array denom[1..n], where denom[j] is the denomination of a coin used in an optimal solution to the problem of making change for j cents.

A DP solution: Step 5
Step 5: Print the optimal solution.
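The five steps can be collected into a short program. The following Python sketch is illustrative only (the function name coin_change_dp and its interface are not from the slides); it computes C and denom bottom-up and then reconstructs one optimal solution.

    def coin_change_dp(denoms, n):
        # C[j] = minimum number of coins needed to make change for j cents.
        # denom[j] = denomination of one coin used in an optimal solution for j cents.
        INF = float("inf")
        C = [0] + [INF] * n
        denom = [0] * (n + 1)
        for j in range(1, n + 1):              # bottom-up: increasing j
            for d in denoms:
                if d <= j and 1 + C[j - d] < C[j]:
                    C[j] = 1 + C[j - d]
                    denom[j] = d
        coins = []                             # Step 5: construct/print the solution
        j = n
        while j > 0:
            coins.append(denom[j])
            j -= denom[j]
        return C[n], coins

    print(coin_change_dp([25, 10, 5, 1], 34))  # (6, [25, 5, 1, 1, 1, 1])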

Coin-changing: Greedy algorithm
Cashier's algorithm. At each iteration, add the coin of the largest value that does not take us past the amount to be paid.
Q. Is the cashier's algorithm optimal?

Coin-changing: Analysis of the Greedy algorithm
Theorem. The greedy algorithm is optimal for U.S. coinage: 1, 5, 10, 25, 100.
Pf. (by induction on x)
- Consider an optimal way to change x with ck <= x < ck+1: greedy takes coin k.
- We claim that any optimal solution must also take coin k: if not, it needs enough coins of types c1, ..., ck-1 to add up to x, but a case analysis of how many coins of each smaller denomination an optimal solution can contain shows that no optimal solution can do this.
- The problem reduces to coin-changing x - ck cents, which, by induction, is optimally solved by the greedy algorithm.

Coin-changing: Analysis of the Greedy algorithm
Observation. The greedy algorithm is sub-optimal for US postal denominations: 1, 10, 21, 34, 70, 100, 350, 1225, 1500.
Counterexample: 140. Greedy: 100, 34, 1, 1, 1, 1, 1, 1. Optimal: 70, 70.
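The cashier's rule is equally short in code. A minimal Python sketch (the function name is illustrative), run on the two denomination sets just discussed:

    def cashiers_algorithm(denoms, x):
        # Greedy: repeatedly take the largest coin that does not exceed the remaining amount.
        coins = []
        for d in sorted(denoms, reverse=True):
            while x >= d:
                coins.append(d)
                x -= d
        return coins

    print(cashiers_algorithm([1, 5, 10, 25, 100], 34))
    # [25, 5, 1, 1, 1, 1] - optimal for U.S. coinage
    print(cashiers_algorithm([1, 10, 21, 34, 70, 100, 350, 1225, 1500], 140))
    # [100, 34, 1, 1, 1, 1, 1, 1] - 8 stamps, while 70 + 70 uses only 2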

Activity Selection

Activity-Selection Problem
Problem: get your money's worth out of a festival.
- Buy a wristband that lets you onto any ride.
- Lots of rides, each starting and ending at different times.
- Your goal: ride as many rides as possible.
- Another, alternative goal that we don't solve here: maximize time spent on rides.
Welcome to the activity selection problem.

Activity-selection Problem
Input: Set S of n activities, a1, a2, ..., an.
- si = start time of activity i.
- fi = finish time of activity i.
Output: Subset A of maximum number of compatible activities. Two activities are compatible if their intervals don't overlap.
Example: activities in each line (of the schedule figure) are compatible.

Optimal Substructure
Assume activities are sorted by finishing times: f1 <= f2 <= ... <= fn.
Suppose an optimal solution includes activity ak. This generates two subproblems:
- Selecting from a1, ..., ak-1: activities compatible with one another that finish before ak starts (compatible with ak).
- Selecting from ak+1, ..., an: activities compatible with one another that start after ak finishes.
The solutions to the two subproblems must be optimal. Prove using the cut-and-paste approach.

Recursive Solution
Let Sij = subset of activities in S that start after ai finishes and finish before aj starts.
Subproblems: selecting the maximum number of mutually compatible activities from Sij.
Let c[i, j] = size of a maximum-size subset of mutually compatible activities in Sij:
c[i, j] = 0                                               if Sij = ∅
c[i, j] = max over ak in Sij of { c[i, k] + c[k, j] + 1 } if Sij ≠ ∅
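The recurrence can be evaluated directly with memoization before turning to the greedy shortcut below. A minimal Python sketch, assuming activities are sorted by finish time; the sentinel activities a0 and a(n+1), the function name, and the sample data are illustrative additions, not from the slides.

    from functools import lru_cache

    def max_compatible_dp(s, f):
        # s[i], f[i] are start/finish times, sorted by finish time.
        n = len(s)
        s = [float("-inf")] + list(s) + [float("inf")]   # a_0 "finishes" before everything
        f = [float("-inf")] + list(f) + [float("inf")]   # a_(n+1) "starts" after everything

        @lru_cache(maxsize=None)
        def c(i, j):
            # c(i, j) = max number of mutually compatible activities in S_ij
            best = 0
            for k in range(i + 1, j):
                if s[k] >= f[i] and f[k] <= s[j]:        # a_k belongs to S_ij
                    best = max(best, c(i, k) + c(k, j) + 1)
            return best

        return c(0, n + 1)

    print(max_compatible_dp([1, 3, 0, 5], [4, 5, 6, 7]))   # 2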

Activity Selection: Repeated Subproblems
Consider a recursive algorithm that tries all possible compatible subsets to find a maximal set, and notice the repeated subproblems: the recursion tree branches on whether each activity (a1 in A? a2 in A?) is included, and the same residual subsets of S reappear on different yes/no branches.

Greedy Choice Property
Dynamic programming? Memoize? Yes, but...
The activity selection problem also exhibits the greedy-choice property: a locally optimal choice leads to a globally optimal solution.
Theorem 16.1: if S is an activity-selection problem sorted by finish time, then there is an optimal solution A ⊆ S such that a1 ∈ A.
Sketch of proof: if there is an optimal solution B that does not contain a1, we can always replace the first activity in B with a1 (why?). Same number of activities, thus optimal.

Greedy-choice Property
The problem also exhibits the greedy-choice property: there is an optimal solution to the subproblem Sij that includes the activity with the smallest finish time in Sij. Can be proved easily.
Hence, there is an optimal solution to S that includes a1. Therefore, make this greedy choice without solving subproblems first and evaluating them. Solve the subproblem that ensues as a result of making this greedy choice. Combine the greedy choice and the solution to the subproblem.

Recursive Algorithm
Recursive-Activity-Selector(s, f, i, j)
1. m ← i + 1
2. while m < j and s[m] < f[i]      // find the first activity in Sij
3.     do m ← m + 1
4. if m < j
5.     then return {a[m]} ∪ Recursive-Activity-Selector(s, f, m, j)
6. else return ∅
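The same greedy selection is usually written iteratively. A minimal Python sketch (the function name and the sample data are illustrative; it assumes the activities are already sorted by finish time):

    def greedy_activity_selector(s, f):
        # s[i], f[i] are start/finish times with f[0] <= f[1] <= ... <= f[n-1].
        A = [0]                      # the first activity to finish is always a safe greedy choice
        last = 0                     # index of the most recently selected activity
        for m in range(1, len(s)):
            if s[m] >= f[last]:      # compatible with everything selected so far
                A.append(m)
                last = m
        return A

    # Illustrative data: returns the indices of one maximum-size compatible subset.
    s = [1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
    f = [4, 5, 6, 7, 9, 9, 10, 11, 12, 14, 16]
    print(greedy_activity_selector(s, f))   # [0, 3, 7, 10]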

The Knapsack Problem

The Knapsack Problem: Recurrence
Let B[k, w] = maximum total benefit obtainable from the first k items (item k has weight wk and benefit bk) when the weight limit is w.
First case: wk > w. Item k can't be part of the solution, since if it were, the total weight would be > w, which is unacceptable.
Second case: wk <= w. Then item k can be in the solution, and we choose the case with the greater value:
B[k, w] = B[k-1, w]                                  if wk > w
B[k, w] = max{ B[k-1, w], bk + B[k-1, w - wk] }      otherwise

The Knapsack Problem and Optimal Substructure
Both variations exhibit optimal substructure. To show this for the 0-1 problem, consider the most valuable load weighing at most W pounds. If we remove item j from the load, what do we know about the remaining load?
A: the remainder must be the most valuable load weighing at most W - wj that the thief could take from the museum, excluding item j.

Solving The Knapsack Problem
The optimal solution to the fractional knapsack problem can be found with a greedy algorithm. How?
The optimal solution to the 0-1 problem cannot be found with the same greedy strategy.
- Greedy strategy: take in order of dollars/pound.
- Example: 3 items weighing 10, 20, and 30 pounds; the knapsack can hold 50 pounds. Suppose item 2 is worth $100. Assign values to the other items so that the greedy strategy will fail.

The Knapsack Problem: Greedy vs. Dynamic
The fractional problem can be solved greedily. The 0-1 problem cannot be solved with a greedy approach; as you have seen, however, it can be solved with dynamic programming.

0-1 Knapsack Algorithm
for w = 0 to W
    B[0, w] = 0
for i = 1 to n
    B[i, 0] = 0
    for w = 1 to W
        if wi <= w                              // item i can be part of the solution
            if bi + B[i-1, w - wi] > B[i-1, w]
                B[i, w] = bi + B[i-1, w - wi]
            else
                B[i, w] = B[i-1, w]
        else
            B[i, w] = B[i-1, w]                 // wi > w
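A minimal Python sketch of this table computation (the function name knapsack_01 is illustrative); it returns the whole table so the example below can be checked against it:

    def knapsack_01(weights, benefits, W):
        # B[i][w] = max benefit using the first i items with weight limit w.
        n = len(weights)
        B = [[0] * (W + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            wi, bi = weights[i - 1], benefits[i - 1]
            for w in range(1, W + 1):
                if wi <= w:
                    B[i][w] = max(B[i - 1][w], bi + B[i - 1][w - wi])
                else:
                    B[i][w] = B[i - 1][w]       # item i does not fit
        return B

Each of the n*W cells is filled in constant time, which matches the running-time analysis that follows.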

Running time
for w = 0 to W                  // O(W)
    B[0, w] = 0
for i = 1 to n                  // repeated n times
    B[i, 0] = 0
    for w = 1 to W              // O(W)
        ...
What is the running time of this algorithm? O(n*W). Remember that the brute-force algorithm takes O(2^n).

Example
Let's run our algorithm on the following data:
n = 4 (# of elements), W = 5 (max weight)
Elements (weight, benefit): (2,3), (3,4), (4,5), (5,6)

The slides fill the table one cell at a time, for i = 1, ..., 4 and w = 1, ..., 5; the completed table is:

i \ w    0    1    2    3    4    5
  0      0    0    0    0    0    0
  1      0    0    3    3    3    3
  2      0    0    3    4    4    7
  3      0    0    3    4    5    7
  4      0    0    3    4    5    7

The maximum benefit is B[4, 5] = 7.

Comments
This algorithm only finds the maximum possible value that can be carried in the knapsack. To know the items that make up this maximum value, an addition to this algorithm is necessary. See the LCS algorithm from the previous lecture for an example of how to extract this data from the table we built; a sketch of such a traceback follows.
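A sketch of that addition, with illustrative names: rebuild the table for the example data and walk back from B[n][W], taking item i whenever B[i][w] differs from B[i-1][w].

    def knapsack_01_items(weights, benefits, W):
        # Build the table as before, then trace back which items were taken.
        n = len(weights)
        B = [[0] * (W + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            wi, bi = weights[i - 1], benefits[i - 1]
            for w in range(1, W + 1):
                B[i][w] = max(B[i - 1][w], bi + B[i - 1][w - wi]) if wi <= w else B[i - 1][w]
        items, w = [], W
        for i in range(n, 0, -1):
            if B[i][w] != B[i - 1][w]:          # item i must have been taken
                items.append(i)
                w -= weights[i - 1]
        return B[n][W], sorted(items)

    # Example data from above: n = 4, W = 5, items (2,3), (3,4), (4,5), (5,6).
    print(knapsack_01_items([2, 3, 4, 5], [3, 4, 5, 6], 5))   # (7, [1, 2])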

Huffman Codes

Data Compression
Q. Given a text that uses 32 symbols (26 different letters, space, and some punctuation characters), how can we encode this text in bits?
Q. Some symbols (e, t, a, o, i, n) are used far more often than others. How can we use this to reduce our encoding?
Q. How do we know when the next symbol begins?
Ex. c(a) = 01, c(b) = 010, c(e) = 1. What is 0101?

Data Compression
Q. Given a text that uses 32 symbols (26 different letters, space, and some punctuation characters), how can we encode this text in bits?
A. We can encode 2^5 = 32 different symbols using a fixed length of 5 bits per symbol. This is called fixed-length encoding.
Q. Some symbols (e, t, a, o, i, n) are used far more often than others. How can we use this to reduce our encoding?
A. Encode these characters with fewer bits, and the others with more bits.
Q. How do we know when the next symbol begins?
A. Use a separation symbol (like the pause in Morse code), or make sure that there is no ambiguity by ensuring that no code is a prefix of another one.
Ex. c(a) = 01, c(b) = 010, c(e) = 1. What is 0101? It is ambiguous: it could be "aa" (01|01) or "be" (010|1).

Prefix Codes
Definition. A prefix code for a set S is a function c that maps each x ∈ S to a string of 1s and 0s in such a way that for x, y ∈ S with x ≠ y, c(x) is not a prefix of c(y).
Ex. c(a) = 11, c(e) = 01, c(k) = 001, c(l) = 10, c(u) = 000.
Q. What is the meaning of 1001000001?
A. "leuk"
Suppose frequencies are known in a text of 1G characters: fa = 0.4, fe = 0.2, fk = 0.2, fl = 0.1, fu = 0.1.
Q. What is the size of the encoded text?
A. 2*fa + 2*fe + 3*fk + 2*fl + 3*fu = 2.3G bits.

Optimal Prefix Codes
Definition. The average bits per letter of a prefix code c is the sum over all symbols of its frequency times the number of bits of its encoding:
ABL(c) = sum over x in S of fx * |c(x)|
We would like to find a prefix code that has the lowest possible average bits per letter.
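A small, illustrative Python check of the two computations above: decoding relies on the prefix property (at each point at most one code word can match), and the last line evaluates the ABL sum for the code and frequencies given.

    def decode(bits, code):
        # Greedy decoding of a prefix code: extend the buffer until it matches a code word.
        inv = {word: sym for sym, word in code.items()}
        out, cur = [], ""
        for b in bits:
            cur += b
            if cur in inv:            # the prefix property makes this unambiguous
                out.append(inv[cur])
                cur = ""
        return "".join(out)

    code = {"a": "11", "e": "01", "k": "001", "l": "10", "u": "000"}
    freq = {"a": 0.4, "e": 0.2, "k": 0.2, "l": 0.1, "u": 0.1}
    print(decode("1001000001", code))                             # "leuk"
    print(round(sum(freq[x] * len(code[x]) for x in freq), 2))    # ABL = 2.3 bits per letter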

Suppose we model a code in a binary tree.

Representing Prefix Codes using Binary Trees
Q. How does the tree of a prefix code look?
A. Only the leaves have a label.
Pf. An encoding of x is a prefix of an encoding of y if and only if the path of x is a prefix of the path of y.

Representing Prefix Codes using Binary Trees
Q. What is the meaning of 111010001111101000?
A. "simpel"
Q. How can this prefix code be made more efficient?
A. Change the encoding of p and s to a shorter one. This tree is now full.

Representing Prefix Codes using Binary Trees
Definition. A tree is full if every node that is not a leaf has two children.
Claim. The binary tree corresponding to the optimal prefix code is full.
Pf. (by contradiction) Suppose T is the binary tree of an optimal prefix code and is not full. This means there is a node u with only one child v.
Case 1: u is the root; delete u and use v as the root.
Case 2: u is not the root; let w be the parent of u, delete u, and make v a child of w in place of u.
In both cases the number of bits needed to encode any leaf in the subtree of v is decreased. The rest of the tree is not affected. Clearly this new tree T' has a smaller ABL than T. Contradiction.

Optimal Prefix Codes: False Start
Q. Where in the tree of an optimal prefix code should letters with a high frequency be placed?
A. Near the top.
Greedy template (Shannon-Fano, 1949). Create the tree top-down, split S into two sets S1 and S2 with (almost) equal frequencies, and recursively build trees for S1 and S2.
Example frequencies: fa = 0.32, fe = 0.25, fk = 0.20, fl = 0.18, fu = 0.05.

Optimal Prefix Codes: Huffman Encoding
Observation. Lowest-frequency items should be at the lowest level in the tree of an optimal prefix code.
Observation. For n > 1, the lowest level always contains at least two leaves.
Observation. The order in which items appear in a level does not matter.
Claim. There is an optimal prefix code with tree T* where the two lowest-frequency letters are assigned to leaves that are siblings in T*.
Greedy template (Huffman, 1952). Create the tree bottom-up. Make two leaves for the two lowest-frequency letters y and z. Recursively build the tree for the remaining letters, treating y and z as a single combined letter.
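As a sketch only (not the slides' own pseudocode), the bottom-up greedy template can be implemented with a min-heap keyed on frequency, repeatedly merging the two lowest-frequency subtrees; the function name huffman_code is illustrative.

    import heapq
    from itertools import count

    def huffman_code(freq):
        # freq: dict symbol -> frequency. Returns dict symbol -> bit string.
        order = count()                       # tie-breaker so tuples never compare symbols/trees
        heap = [(f, next(order), sym) for sym, f in freq.items()]
        heapq.heapify(heap)
        if len(heap) == 1:                    # degenerate case: a single symbol
            return {heap[0][2]: "0"}
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)     # the two lowest-frequency subtrees...
            f2, _, right = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, next(order), (left, right)))  # ...become siblings
        _, _, root = heap[0]

        code = {}
        def assign(node, prefix):
            if isinstance(node, tuple):       # internal node: descend left with 0, right with 1
                assign(node[0], prefix + "0")
                assign(node[1], prefix + "1")
            else:
                code[node] = prefix           # leaf: record the code word
        assign(root, "")
        return code

    freq = {"a": 0.4, "e": 0.2, "k": 0.2, "l": 0.1, "u": 0.1}
    c = huffman_code(freq)
    print(c, round(sum(freq[x] * len(c[x]) for x in freq), 2))   # ABL = 2.2 bits per letter

On the frequencies from the prefix-code example, this yields an ABL of 2.2 bits per letter, improving on the 2.3 of the code given earlier.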
