The primary topics in this part of the specialization are: greedy algorithms (scheduling, minimum spanning trees, clustering, Huffman codes) and dynamic programming (knapsack, sequence alignment, optimal search trees).


From the course by Stanford University

Greedy Algorithms, Minimum Spanning Trees, and Dynamic Programming


From the lesson

Week 2

Kruskal's MST algorithm and applications to clustering; advanced union-find (optional).

- Tim Roughgarden, Professor, Computer Science

So let's start thinking about this alternative, lazy-union-based approach to the Union-Find data structure in some detail. To get this to work as quickly as we want, we're going to need two optimizations. This video will introduce the first of those optimizations, Union by Rank.

So let's have a quick slide making precise what we learned in the previous video. In our new implementation it's still the case that each object has one extra field. You can think of it logically as a pointer to some other object; really it's just recording the name of some other object, but the semantics have changed. In our first implementation of union-find, with eager unions, we called this field a leader pointer, and the invariant was that every object's pointer points directly to the leader of its group. In this lazy union implementation, we are relaxing that invariant. So I'm going to change the name: I'm not going to call it a leader pointer, I'm just going to call it a parent pointer.

And the new, relaxed invariant is that the collection of everybody's parent pointers induces a collection of directed trees. Each tree in this collection represents one of the groups in the partition that we're maintaining. Each of these trees has a root vertex, that is, a vertex which points back to itself. It is these root vertices, these root objects, that we think of as the leaders of these components. So, as before, we're going to refer to a group by the name of its leader vertex, and that leader is, by definition, the root of the corresponding directed tree.

One way to think about things is that, when we had eager unions, we again associated a directed tree with each of the groups, but we also insisted that it was a super shallow, bushy tree, that it had depth only one. That is, all you had was a root, the leader, and then everybody else pointed directly to the leader. In this lazy union implementation we're allowing the trees to grow deeper; we don't insist that they have only one level below the root.

So as far as initialization, at birth a union-find data structure just consists of n degenerate trees: each object initially points to itself and is a root at the beginning. In general, how do you implement find for some object x? Well, you just have to return the leader of its group, and we know you can get to the leader just by traversing parent pointers from x until you get to a root, that is, until you get to some object that points back to itself. That's what you return at the end of the find call.

How do you do a union? Well, given two objects x and y, you have to find their respective roots, so you just invoke the find operation twice: once from x to get its root s1, and once from y to get its root s2. Then you install either s1 or s2 as a child of the other. So you do one pointer update after the two finds.
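The find and union operations described so far can be sketched in Python as follows. This is only a sketch; the lecture gives no concrete code, so the class and field names here are my own.

```python
class LazyUnionFind:
    def __init__(self, n):
        # Initially, each object is the root of its own one-node tree.
        self.parent = list(range(n))

    def find(self, x):
        # Traverse parent pointers until reaching a root (a self-loop).
        while self.parent[x] != x:
            x = self.parent[x]
        return x

    def union(self, x, y):
        # Find the two roots, then install one as a child of the other.
        s1, s2 = self.find(x), self.find(y)
        if s1 != s2:
            self.parent[s1] = s2
```

Note that union makes an arbitrary choice of which root becomes the child of the other; that choice is exactly the point left open in what follows.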

So for now I'm going to leave union underdetermined: I'm not going to give you a rule to decide which of s1 and s2 becomes the child of the other. In the next quiz, I want you to think through what the worst-case running time of these operations would be if we make that choice arbitrarily, if we just pick one of s1 and s2 arbitrarily to become the child of the other.

All right, so the correct answer, unfortunately, is the last one: both find and union can actually blow up to linear-time operations if you're not careful about how you implement the union. So why is this true? Well, first let me just point out that B is definitely not the correct answer. Whatever find and union's worst-case running times are, they're definitely the same for both of them. Remember, union reduces to two calls to find plus constant work, so the two operations have the same running time in the worst case.

Now, why is it linear? The reason you're stuck with linear time is that the trees you build through a sequence of unions can, in the worst case, become very scraggly. To see what I mean, imagine you just do a sequence of unions, and each time you're unioning a group with just one object into the group that you've built up so far. So if you union together a group of size two and a group of size one, and we let you choose arbitrarily, maybe you make the group of size one the new root; now you've got a chain of three objects. Maybe you union in another singleton group, and arbitrarily you choose the singleton group to be the new root; that gives you a chain of four objects, and so on.

So if you do a linear number of unions, and you're not careful about which root you make a child of the other, you might well end up with a chain of linear length. And then finds for the objects toward the bottom of that chain are going to require linear work. And again, union calls find as a subroutine, so it's going to take linear work as well.
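Here's a small Python sketch of that bad sequence of unions (the helper names are my own), showing that a find from the bottom of the resulting chain takes a linear number of hops:

```python
def build_chain(n):
    """Union in singletons one at a time, always making the new
    singleton the root; the parent pointers then form a chain."""
    parent = list(range(n))
    for i in range(1, n):
        parent[i - 1] = i  # old root i-1 becomes a child of singleton i
    return parent

def find_depth(parent, x):
    """Count the parent-pointer hops from x up to its root."""
    hops = 0
    while parent[x] != x:
        x = parent[x]
        hops += 1
    return hops

parent = build_chain(1000)
print(find_depth(parent, 0))  # prints 999: linear in the number of objects
```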

So when you see this example, I hope your immediate reaction is: well, jeez, we can certainly do better than that. We can just be somewhat intelligent, when we do a union, about which of the old roots we install as a child of the other one. There's a rough analogy with the natural optimization we made in our eager union implementation of union-find. There we had to make some kind of choice between which of the two old leaders to retain, and we retained the one of the bigger group, to minimize the number of leader updates. So here, the natural thing to do is to look at which of the two trees is already deeper and keep the root of that deeper tree, to avoid it getting deeper still. So we install the root of the shallower tree as a child of the root of the deeper one. Let's make that precise on the next slide with the notion of the rank of an object.

So the rank is just going to be a second field, along with the parent pointer, that we maintain for each object; it's just going to be some integer. Let me tell you how I want you to think about ranks, at least for now. I'm going to give you some very natural semantics, but I want to warn you: they're only going to hold valid for the next couple of videos. When we introduce something else, called path compression, these semantics are going to break down. But initially, this is how I want you to think about ranks: the rank of x is the maximum number of hops required to get from a leaf of x's tree up to x itself. So, for example, if x is a root, this is going to be the worst-case length, the maximum length of any leaf-to-root path in the tree.

So just to make sure that this is clear, let me draw a quick example. Here in blue I've shown you one particular union-find data structure that has two groups. Now, it's very easy to compute the rank of the leaves: that's zero, because the only path from a leaf to that node is the empty path. There are two nodes that have rank 1, and the root of the bigger tree has rank 2. And if you think about it, in general, the rank of a node is going to be 1 plus the largest rank of any of its children.

For example, if you're at a node x with a child y, and the longest path from some leaf z up to y requires five pointer traversals, then going from z all the way up through y to x is going to require six pointer traversals. And of course, when you initialize the data structure, you want to set everybody's rank to zero, because everybody's just in a tree by themselves at the beginning.

This notion of rank is going to be a super crucial concept throughout this entire sequence of lectures on union-find, so make sure you understand this definition. I encourage you to draw out some of your own examples and do some computations to get a better feel for the concept.
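One way to do those computations is with a short Python sketch (my own helper, not from the lecture) that derives every node's rank from a parent-pointer array, using the rule that a leaf has rank 0 and any other node has rank 1 plus the largest rank of any of its children:

```python
def compute_ranks(parent):
    n = len(parent)
    # Invert the parent pointers to get each node's children.
    children = [[] for _ in range(n)]
    for v, p in enumerate(parent):
        if p != v:  # skip the self-loops at roots
            children[p].append(v)

    rank = [0] * n

    def visit(v):
        # Rank of a leaf is 0; otherwise 1 + the largest child rank.
        if not children[v]:
            return 0
        rank[v] = 1 + max(visit(c) for c in children[v])
        return rank[v]

    for v in range(n):
        if parent[v] == v:  # v is a root
            visit(v)
    return rank

# A chain 0 -> 1 -> 2 (with 2 the root) has ranks [0, 1, 2].
print(compute_ranks([1, 2, 2]))  # prints [0, 1, 2]
```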

For the moment, we're just going to use rank to avoid creating scraggly trees. So let's go back to the bad example on the quiz and remember what the intuition was. We wound up with this long chain, and therefore really bad worst-case running time for our operations, because we were stupidly taking a tree that was already pretty deep and installing it as a child under a single node of a super shallow tree, thereby producing a tree which is even deeper. So the obvious way to avoid that is, when faced with merging together a deep tree and a shallow tree, you install the shallow tree as a child under the deep one. That's going to slow down this deepening process as much as possible.

So we can make that precise just using this rank notion. Notice that the rank exactly quantifies the depth of a tree: the rank of the root of a tree, by this invariant, is equal to the depth of the tree. So if you want to make the shallower tree the child of the other one, that means you want to make the root of smaller rank the child of the root of bigger rank. This optimization is what is meant by Union by Rank.

There still remains the case where the two roots have equal rank, that is, where the two trees that we're fusing have exactly the same maximum path length. And in that case we do just arbitrarily choose one of them. So in the pseudocode I've written here, I've arbitrarily chosen y's root to be the one that remains the root: x's root s1 is going to be installed as a child of s2. But again, you can do it either way; it's not going to matter.
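Putting the pieces together, union by rank can be sketched in Python like this. Again a sketch only: the class name is my own, and the tie-breaking follows my reading of the pseudocode described above, where s2 stays the root on a tie.

```python
class UnionByRank:
    def __init__(self, n):
        # Every object starts as its own root, with rank zero.
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        while self.parent[x] != x:
            x = self.parent[x]
        return x

    def union(self, x, y):
        s1, s2 = self.find(x), self.find(y)
        if s1 == s2:
            return
        # The root of smaller rank becomes the child of the other.
        if self.rank[s1] > self.rank[s2]:
            self.parent[s2] = s1
        elif self.rank[s1] < self.rank[s2]:
            self.parent[s1] = s2
        else:
            # Equal ranks: s2 stays the root, and its rank goes up by one.
            self.parent[s1] = s2
            self.rank[s2] += 1
```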

Now, we are not off the hook yet. Remember, this is a data structure where we're intending for an invariant to hold, these semantics for the ranks, quantifying the worst-case path length to each node, and we've just made a modification to the data structure: we rewired somebody's parent pointer. So now it's our responsibility to check whether the invariant still holds, and if it doesn't, then we have to restore it, hopefully using minimal extra work. So in this quiz, we're going to think through how ranks change when we do a union.

Remember the notation: we're in the middle of a union of the objects x and y, and s1 and s2 refer to the roots, that is, the leaders, of the trees that contain x and y, respectively. The first thing I want you to think through and convince yourself of is that for objects other than s1 and s2, other than the pair of objects involved in the actual pointer rewiring, nobody's rank changes. So make sure you think that through and convince yourself: ranks are invariant, except possibly for the objects s1 and s2. In this quiz, we're going to drill down on how the ranks of s1 and s2 change, given that we took one of them and rewired its parent pointer to point to the other.

Okay, so the correct answer to this quiz is the fourth one. So, remarkably, in many cases there's no change at all to anybody's rank, including s1's or s2's: if the ranks were different before the union, both of the ranks stay the same. The one exception is when you take the union of two roots whose ranks are equal. Then whichever one is the new root (in the pseudocode that I showed you, s2 is chosen to be the new root) has its rank incremented; it goes up by one. I think the easiest way to see this is just with a couple of pictures, a couple of examples.

So let me just draw two directed trees and the corresponding ranks. In this one, s1 has strictly bigger rank: its rank is 2, so we install s2 as a child under s1. Recomputing the ranks in the fused-together tree, we see that the rank of the root s1 is again 2, the same as it was before; it didn't go up.

So, what's the general principle here? Well, in general, suppose you have these two trees, and suppose the first one's root s1 has the bigger rank, say rank r. What does that mean? It means that there exists a path from a leaf to the root s1 that requires a full r hops. And in the second tree, by virtue of its rank being less than r, so at most r - 1, every single path from a leaf to the root s2 of that tree uses at most r - 1 hops.

So when you install s2 as a child under s1, the length of every leaf-to-root path through s2 (where the root is now s1) has gone up by only 1: you just have to get to s2, and then one more hop takes you all the way to s1. So if all those paths took at most r - 1 hops before, all of those paths, now extended all the way to s1, take at most r hops. So the worst-case path length has not gone up.

That's exactly what happens in this example: it's true that when we install s2 under s1, we have yet another path that requires two hops, the one that goes through s2. But we had such a path previously, so the rank doesn't go up; the worst-case path length is exactly the same as before.

And this same kind of reasoning explains why the rank has to get bumped up by one when you fuse together two trees that had the same rank. So suppose s1 and s2 both have rank r; well, then both of them have a leaf-to-root path that requires a full r hops. And now, when you install s1 as a child under s2, that path where you needed r hops to get from the leaf to s1 needs one more hop to get all the way up to s2. So now suddenly there exists a path from a leaf to the new root s2 that requires a full r + 1 hops; that's why the rank has to go up.

Â Coursera provides universal access to the worldâ€™s best education,
partnering with top universities and organizations to offer courses online.