0:13

Let's look at a demo of insertion sort. For insertion sort, what we're going to do is we'll move an index i from left to right as before, but now, in the i'th iteration, we're going to move a[i] into position among the elements to its left. Let's look at how that works on our example with cards.

0:33

So, now we start by initializing i at the first card, and we take the idea that everything from i to its left is going to be sorted, and everything to its right we're not going to look at at all.

0:51

So everything to the left of i is in ascending order; everything to the right, we haven't seen at all yet. So now when we increment i, well, in this case it's already in order, so we don't have anything else to do.

1:03

In the third case now, when i is at the third entry in the array, we start an index j, and we move it from i toward the left. And what we need to do is just exchange the 5 with every element to its left that's greater. So first we exchange it with the 10; it's still not in place, so we exchange it with the 7. Now we've gotten to the beginning of the array, and once we've done that, or we've hit a smaller element, we have everybody to the left of i in order.

1:55

Now, in this case, we have the 8, and we only have to exchange once; now it's got the 7 to its left and everything is in order. So we've achieved putting it in order with less work in this case. We don't always have to go all the way back to the beginning.

2:10

The 4, we exchange it with every element to its left that's greater, until we find a smaller element; then it's in ascending order. The 2 has to go all the way back to the beginning.

2:42

Again, we can look at insertion sort in terms of invariants. Our pointer still scans from left to right, but now the elements to the left of the pointer, including it, are in order, while the elements to the right have not yet been seen at all. So we have to look at the code that's going to maintain that invariant as the pointer increments.

Â Move the pointer to the right, it's incremented again.

Â Now the invariant's broken because the element on

Â the pointer is not in sorted order.

Â To put it in sorted order, we have to move from right to left, exchanging it with

Â every larger elements to its left, and that's what the code at the bottom does.

Â It starts j at i, and decrements j,

Â exchanging j with the elements to its left,

Â a of j with the element to its left, a of j-1,

Â as long as a of j is less than a of j-1 or j is bigger than 0.

Â And that immediately gives this code for insertion sort,

Â which is similar to our code for selection sort and just as simple.

Â It's got two nested for loops, selection sort had two nested for loops,

Â a test, a comparison, and an exchange inside the for loop.

Â And that's a fine implementation of an elementary sorting method.

4:08

What about the analysis of insertion sort? It's more complicated. Our proposition says that insertion sort, to sort a randomly ordered array with distinct keys, will use about one quarter N squared compares, and about the same number, one quarter N squared exchanges, on the average. This is more complicated to prove. It depends on the array being randomly ordered. And again, you can get a feeling for where the proposition comes from by looking at this N by N trace. Again, the black elements are the ones that we compare, and actually, they're also the exchanges. The red one is the one that's finally put into place.

4:53

And you can see that for a large array that's randomly ordered, the element that we put into place is going to go about halfway back on the average. So that means about half the entries below the diagonal are going to be black on the average. There are about N squared over 2 entries below the diagonal, and half of that is N squared over 4.

5:20

This is a bigger trace that shows, again, about half the entries below the diagonal are involved in the sort. Let's look at an animation. Since it's N squared over 4 versus N squared over 2, insertion sort's going to be about twice as fast as selection sort, so we can do about twice as many items in the trace in the same amount of time.

6:44

So in the first case, it's much, much faster than selection sort, linear instead of quadratic. In the second case, it's slower than selection sort, because it uses about the same number of compares, but it uses many more exchanges. So let's see that in the animation.

7:15

Same kind of dynamic characteristic as selection sort, except, for every step, it's not just comparing, it's also exchanging, which makes it even slower in practice.

7:34

But there's also a good case that actually we take advantage of in plenty of practical applications. And that has to do with when the array is partially sorted. To talk about this in a quantitative way, we define what's called an inversion. An inversion is just a pair of keys that are out of order in the array.

7:56

So this array has six inversions. T and R are out of order, because R should go before T; T and P are out of order; and so forth.

8:18

And partially sorted arrays appear often in practice. For example, if you have a large array that's sorted except for just a few unsorted elements appended at the end, it's going to be partially sorted. Or in other cases, if you only have a few entries out of place, the array's going to be partially sorted. These types of things arise often in practical applications. And what's interesting about insertion sort is that it runs in linear time for partially sorted arrays. And the proof is, the number of exchanges is exactly equal to the number of inversions, and the number of compares is at most the number of inversions plus N minus 1, since there's an extra compare for every element except the first.

Â So let's look at how that looks in the animation.

Â Here's a partially sorted array, and

Â you can see that insertion sort quickly gets the job done.

Â We're going to take advantage of this a little bit later in this lecture.
