The primary topics in this part of the specialization are: greedy algorithms (scheduling, minimum spanning trees, clustering, Huffman codes) and dynamic programming (knapsack, sequence alignment, optimal search trees).

From the course by Stanford University

Greedy Algorithms, Minimum Spanning Trees, and Dynamic Programming

From the lesson

Week 3

Huffman codes; introduction to dynamic programming.

- Tim Roughgarden, Professor of Computer Science

So the time has arrived to begin our study of dynamic programming. This is a general algorithm design paradigm. As I mentioned at the beginning of the course, it has a number of justly famous applications. However, I'm not going to tell you just yet what it is that makes an algorithm dynamic programming. Rather, our plan over the next few videos is to develop from scratch an algorithm for a nontrivial, concrete computational problem: finding the maximum-weight independent set in a path graph. This concrete problem is going to force us to develop a number of new ideas. Once we've finished solving the problem, we'll zoom out, and I'll point out the characteristics of our solution that make it a dynamic programming algorithm. Then, armed with both a sort of formula for developing dynamic programming algorithms and a concrete instantiation, we'll move on to a number of further, and in general harder, applications of the paradigm. Indeed, even more than usual, the dynamic programming paradigm takes practice to perfect.

In my experience, students find it counterintuitive at first, and they often struggle to apply the paradigm to problems they haven't seen before. But here's the good news: dynamic programming is relatively formulaic, certainly more so than our recent study of greedy algorithms, and it's something you can get the hang of. So with sufficient practice, and that's exactly what I'll be giving you over the next couple of weeks, you should find yourself with a powerful and quite widely applicable new tool in your programmer's toolbox.

So let me introduce you to the concrete problem we're going to study over the next few videos. It's a graph problem, but a very simple one. In fact, we're going to restrict our attention merely to path graphs, that is, graphs that consist solely of a path on some number n of vertices. The only other part of the input is a single non-negative number per vertex; we're going to call these weights. For example, here's a path graph on four vertices, and let's give the vertices the weights one, four, five, and four.

The responsibility of the algorithm is going to be to output an independent set. What that means is a subset of the graph's vertices such that no two of them are adjacent. In the context of a simple path graph, that just means you return some of the vertices while always avoiding consecutive pairs. So for a path of four vertices, examples of independent sets include the empty set, any single vertex, vertices one and three, vertices two and four, and vertices one and four. You could not, for example, return vertices two and three, because those are adjacent; that is forbidden. Now, to make this interesting, we're going to want not just any old independent set, but the one whose sum of vertex weights is as large as possible. That's the maximum-weight independent set problem. What I'm going to do next is use this concrete problem as an opportunity to review the various algorithm design paradigms we've seen so far. Along the way, we'll see that none of them actually works very well for this problem, and that's going to motivate us to devise a new approach, an approach that is going to turn out to be dynamic programming.

So there's always our standard punching bag, brute-force search. This would entail iterating through all of the independent sets and remembering the one with maximum total weight. Of course it's correct, no question about that, but as usual this would require exponential time. Even in just a path graph, the number of independent sets is exponential in the number of vertices, n.
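To make the brute-force idea concrete, here is a minimal sketch (the function name and representation are my own, not from the lecture): it tries every subset of vertex indices, keeping the best one with no two consecutive vertices. Since a path on n vertices has 2^n subsets to examine, the running time is exponential in n.

```python
from itertools import combinations

def max_weight_is_brute_force(weights):
    """Exhaustively search all subsets of a path graph's vertices,
    keeping the heaviest independent set (no two chosen vertices
    adjacent, i.e. no two consecutive indices). O(2^n) time."""
    n = len(weights)
    best_weight, best_set = 0, ()
    for k in range(n + 1):
        for subset in combinations(range(n), k):
            # In a path graph, "independent" = no consecutive indices.
            if all(b - a > 1 for a, b in zip(subset, subset[1:])):
                total = sum(weights[i] for i in subset)
                if total > best_weight:
                    best_weight, best_set = total, subset
    return best_weight, best_set
```

On the lecture's example with weights 1, 4, 5, 4, this returns weight 8 via the second and fourth vertices (indices 1 and 3).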

So what other algorithm design paradigms do we know? Well, we just finished a big segment on greedy algorithms, so we could certainly think about that. For pretty much any problem it's easy to propose greedy algorithms, and this one is no exception. I think the most natural greedy algorithm you might try for computing a maximum-weight independent set is the following. What's the myopic decision? Well, you want to get as much weight as possible overall, so in each step you pick the highest-weight vertex that you haven't already chosen. Now, of course, you have to worry about feasibility. Remember, we're not allowed to output adjacent, that is consecutive, vertices. So if a vertex is ruled out by adjacency, we ignore it, and among those that preserve feasibility, we add the highest-weight one to our set so far.

Well, let me redraw the four-node path graph we had on the last slide and ask you: what would this greedy algorithm compute on the four-node path, and how does that compare to the optimal solution, the independent set with the maximum total weight?

So the correct answer is the second one. Let's see why. Let's start with the optimal solution, the maximum-weight independent set. Remember, independent sets are forbidden from choosing adjacent, or consecutive, vertices. So in this case the only sensible solutions to consider are the first and third vertices, the second and fourth, or the first and fourth. Of these, the best is the second and fourth, for a total of eight. So what about the greedy algorithm? Well, we just had this period where we got really spoiled by the success of greedy algorithms, especially for the minimum spanning tree problem. But let me remind you: greedy algorithms are often good heuristics, but they're often not guaranteed to be correct. And so I'm happy to have this opportunity to quickly remind you again of that drawback of greedy algorithms: they're quite frequently not correct. This is another such case. So what will the greedy algorithm do? Well, it begins by picking the maximum-weight vertex overall, so that would be the vertex with weight five. That unfortunately blocks the algorithm from picking either of the two vertices that have weight four. The only remaining option that preserves feasibility is to pick the vertex of weight one. So that gives us an independent set of weight six.
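The greedy heuristic just described can be sketched as follows (the code and names are my own illustration of the lecture's rule, not a correct algorithm): repeatedly take the heaviest remaining vertex whose neighbors haven't already been chosen.

```python
def greedy_mwis(weights):
    """The natural greedy heuristic from the lecture: scan vertices in
    decreasing order of weight, adding each one whose neighbors on the
    path are not already chosen. NOT correct in general."""
    n = len(weights)
    chosen = set()
    for v in sorted(range(n), key=lambda i: weights[i], reverse=True):
        # Feasibility check: neither path-neighbor may already be chosen.
        if (v - 1) not in chosen and (v + 1) not in chosen:
            chosen.add(v)
    return sum(weights[v] for v in chosen), sorted(chosen)

# On the path with weights 1, 4, 5, 4 the greedy grabs the weight-5
# vertex first, blocking both weight-4 neighbors, and ends up with
# weight 1 + 5 = 6 -- while the optimal independent set has weight 8.
```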

So this greedy algorithm is not correct. You could of course try to devise other greedy rules, but I don't know of any greedy approach that will actually solve this problem optimally. So that's a bummer, but we still haven't exhausted our algorithm design paradigms. Remember, we learned the quite powerful divide-and-conquer approach early on in Part One of this class, and it seems like it could work here. We had all these successful applications where the input was an array: we broke it into two halves, we recursed on both sides, and we combined the results. And path graphs don't look so different from an array of numbers. So the obvious divide-and-conquer approach is to break the path into two paths, each half the length of the original, recursively compute a maximum-weight independent set of each, and then somehow combine the results. But the issues with the divide-and-conquer approach are already apparent in our simple four-vertex example.

So if we recurse on the left half, that is the first two vertices, the maximum-weight independent set we compute is just the second vertex by itself. And if we independently recurse on the right half, on vertices three and four, the maximum-weight independent set there is the vertex of weight five, the third vertex. Now, when the recursion completes and we get our sub-solutions back, we have the second vertex and the third vertex, but the union of those two solutions conflicts. Right? We cannot simultaneously output the second and third vertices: those are consecutive, those are adjacent, and that's not allowed.

Moreover, in a four-node graph it's sort of easy to see how to repair this conflict. But in a big graph with, say, thousands of nodes, if you have a conflict right where the two sub-problems meet, it is not at all obvious how you would quickly fix it and get a feasible and optimal solution to the original problem. Now, in some sense the divide-and-conquer paradigm is more powerful, or better suited, for this problem than the greedy approach, in that I do know of divide-and-conquer algorithms that solve this problem optimally in quadratic time. But doing better than that in a divide-and-conquer manner seems quite challenging. And the dynamic programming algorithm we'll develop will solve the problem in linear time. That's coming up next.
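As a preview of where this is headed, the linear-time solution the upcoming videos develop rests on a simple recurrence: if A[i] is the maximum weight of an independent set of the first i vertices, then the i-th vertex is either excluded (giving A[i-1]) or included (giving A[i-2] plus its weight). A minimal sketch, with names and structure of my own choosing:

```python
def max_weight_is_dp(weights):
    """Linear-time dynamic programming for max-weight independent set
    on a path graph. Maintains A[i-2] and A[i-1] in two variables:
    A[i] = max(A[i-1],            # vertex i excluded
               A[i-2] + w_i)      # vertex i included
    """
    prev2, prev1 = 0, 0  # A[i-2] and A[i-1], empty prefix = weight 0
    for w in weights:
        prev2, prev1 = prev1, max(prev1, prev2 + w)
    return prev1
```

On the running example with weights 1, 4, 5, 4 this computes 8, the optimal value that both the greedy and naive divide-and-conquer approaches missed.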
