0:03

Today we're going to talk about reductions. This is something that we've done several times throughout the course, but we're going to take a slightly more formal look at it, because it plays a critical role in some of the advanced topics that we're going to consider next. Just a quick overview of the next few lectures: these are advanced topics related to algorithms, and you can take more advanced courses that go into all of these things in depth. But it's important to talk about them in the context of the algorithms we've seen, so that we can put everything into perspective.

What we're going to talk about in this lecture is called reduction, and it's a technique that we use to take the good algorithms that we know and use them effectively to build more complicated algorithms. As I said, we've done that before, but it brings up some interesting ideas that are bigger than any particular algorithm. One idea that we've talked about is the idea of a problem-solving model. We saw that we could use shortest paths and max flow to solve problems that didn't seem to be related at all, and there's a particularly important problem-solving model called linear programming that we'll talk about in the next lecture. But then there's a limit, and that brings us to the topic of intractability, which we'll talk about in the last lecture.

So we're shifting gears from individual problems to problem-solving models that are more general than any particular individual problem. Also, as part of this, we're moving into problems that are much more difficult to solve: not just problems with linear or quadratic fast algorithms, but problems on a different scale entirely. We'll talk about that in the next couple of lectures. And we're not going to be talking that much about code for a while; it's more about the conceptual framework and where the problems that you really want to work on fit into the big picture.

So our goals are these. We've looked at some terrific, very useful classical algorithms, but they fit into a larger context. We're going to encounter new problems, and you want to have all those algorithms in the toolbox, but you also want to understand where they fit into the big picture. There are some very interesting, important, and essential ideas that have been developed in great depth over the past several decades, and it's important for every practitioner to have a good feeling for those ideas and what they imply. Really, by thinking about these big ideas, in addition to particular algorithms for particular problems, most people find that they're inspired to learn much more about algorithms. So by way of introduction,

let's talk about what a reduction is. This is just a big bird's-eye view. The idea is that if we have a new problem that we have to solve, we'd like to be able to classify the problem according to the cost that it's going to take to solve it: what's the order of growth of a good algorithm, or of any algorithm, that solves the problem? We've already touched on this a little bit when we talked about sorting algorithms. Not only did we have several good algorithms for sorting, we also developed a lower bound that said that no algorithm can do better. So we can think of the sorting problem as classified as being linearithmic, or N log N, in order of growth. Then we saw plenty of linear-time algorithms, where we just examine all the data and we can get the problem solved. And we'd like to have more classifications like this: other problems that need quadratic time to solve, and so forth. So we're going to think about how difficult problems are to solve; that's the first idea.

Now, there's really frustrating news on this front that we'll be talking about over the next couple of lectures. The frustrating news is that there are a large number of practical problems that we'd like to solve where we don't really know how hard they are to solve. That's a profound idea that we'll get to soon. But what we want to talk about in this lecture is one of the most important tools that we use to try to classify problems, and it's actually been very successful in lots of instances. If we know that we can solve some problem efficiently, we want to use that knowledge to figure out what else we could or could not solve efficiently. That's what reduction is all about. It's just a single tool, but maybe it gets to the Archimedes quote: give me a lever long enough and a fulcrum on which to place it, and I will move the world. That's what reduction does for us.

So here's the basic idea. We say that a problem X reduces to a problem Y if you can use an algorithm that solves Y to help solve X. What does that mean? This schematic diagram gives an idea of what we're talking about. In the middle box, labeled in blue, we assume that we have an algorithm for solving problem Y. It's not really relevant what that algorithm is; we just know that if you have any instance of problem Y, you can solve it. The idea behind reduction is that we take an instance of problem X and transform it into an instance of problem Y, then use the algorithm for Y to solve that instance of Y, and then transform the solution back into a solution to the instance of problem X. So we have the transformation into an instance of problem Y, and the transformation of the solution back into a solution to X. That whole thing, enclosed in the box with the dotted line, is an algorithm for problem X. And what's the cost? The cost of solving problem X is the cost of solving Y plus the cost of the reduction. It's like pre-processing and post-processing for problem Y. There may be a number of calls to Y in a reduction, and the pre-processing and post-processing might be expensive. Again, we've seen some specific examples of this, and we'll go over them in a

minute. So here's a really simple example: finding the median reduces to sorting. In this case, problem Y is sorting, and we assume that we have a sorting algorithm. If you need to find a median, you take the items, which are the instance of X (in this case there's no transformation), and just sort them. Once they're sorted, you take that sorted result and just pick the middle item and return it, so the transformation of the solution is also easy. The cost of finding the median is the cost of sorting, which is N log N, plus the cost of the reduction, which is just constant time. If you can sort, you can find the median.
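The lecture describes this reduction only in words; as a minimal sketch in Python (the function name here is my own, not from the course), it might look like this:

```python
def median(items):
    """Find the median by reducing to sorting.

    The reduction: sort the items (solve problem Y),
    then pick the middle element (constant-time post-processing).
    """
    a = sorted(items)        # cost of solving Y: N log N
    return a[len(a) // 2]    # cost of the reduction: constant

print(median([7, 1, 5, 3, 9]))  # prints 5
```

Note that all the real work happens inside `sorted`; the pre- and post-processing around it cost almost nothing, which is what makes this reduction so cheap.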

So finding the median reduces to sorting. Here's a second simple example. Suppose problem X is the element distinctness problem: you have a bunch of elements, and you want to know if they're all different. An easy way to solve that, which we looked at back in the sorting lectures, is, again, just to sort. We assume that we have a sorting algorithm. We take that instance of the element distinctness problem, all the elements, and sort them. After you sort them, any items that are equal are going to be adjacent to each other. So you simply make a pass through, in this post-processing phase, and check adjacent pairs for equality, because anything equal is going to be right next to something equal to it. That gives a solution to the element distinctness problem. So again, element distinctness reduces to sorting, because you use sorting to solve it. In this case, the cost of solving element distinctness is the cost of sorting, N log N, plus the cost of the reduction; this time you have to make a pass through the data. So this means that we can solve element distinctness in time proportional to N log N. It could be that you could find an algorithm that doesn't use sorting to solve element distinctness, but that might be a bit of a challenge. By reduction, the hard work is done by the sorting, and we get an algorithm for this other problem. That's the basic idea: as long as the cost of these reductions is not too much, we can use reductions to design new algorithms.
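As with the median example, this reduction is described in words in the lecture; a minimal Python sketch (again with my own function name, not the course's code) could be:

```python
def all_distinct(items):
    """Element distinctness by reduction to sorting.

    Sort first; any duplicates end up adjacent, so one linear
    pass over neighboring pairs settles the question.
    """
    a = sorted(items)              # N log N: solve problem Y (sorting)
    for i in range(len(a) - 1):    # linear pass: the reduction's post-processing
        if a[i] == a[i + 1]:
            return False           # found an adjacent equal pair: a duplicate
    return True

print(all_distinct([3, 1, 4, 1, 5]))  # prints False (1 appears twice)
print(all_distinct([3, 1, 4, 2, 5]))  # prints True
```

Here the reduction costs a linear pass rather than constant time, but the total, N log N plus N, is still proportional to N log N, as the lecture says.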
