0:09

So let's recall the first example that we looked at here.

We wanted to maximize 200M + 100W subject to a bunch of constraints.

Now it turns out we had a very clever way of proving that the optimum was correct once we found it.

So the best you could do was 60,000, and there was a great way of proving it.

We took one constraint and multiplied it by a hundred; we took another constraint and multiplied it by a half.

We added those together and got a new constraint that had to hold whenever our original constraints were satisfied: that 200M plus 100W, the thing we were trying to optimize, was at most 60,000.

And this is a very interesting and general technique: if you want to bound your objective, you can try to do so by combining the constraints that you have together to prove a bound.
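As a quick sanity check of this combining trick, here's a tiny sketch in Python. The constraints and multipliers below are made up for illustration (the example's actual constraints aren't restated in this excerpt); the point is just that a non-negative combination of valid constraints is again a valid constraint, and if its coefficients match the objective's, its right-hand side bounds the objective.

```python
# Sketch: proving an upper bound on an objective by combining constraints.
# The constraints and multipliers below are made up for illustration; they
# are NOT the ones from the lecture's example.

objective = (200, 100)          # maximize 200*M + 100*W

# Each constraint is (coefficients, right-hand side): coeffs . (M, W) <= rhs
constraints = [((1, 1), 400),
               ((1, 0), 200)]
multipliers = [100, 100]        # non-negative multipliers c_j

# Combine: sum_j c_j * (constraint j). The result is again a valid inequality.
combined = [sum(c * a[i] for c, (a, _) in zip(multipliers, constraints))
            for i in range(2)]
bound = sum(c * b for c, (_, b) in zip(multipliers, constraints))

# If the combined coefficients match the objective's, the combined
# right-hand side is an upper bound on the objective over the feasible set.
assert tuple(combined) == objective
print(bound)  # 60000
```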

1:10

So suppose more generally that we want to minimize v1x1 + v2x2 + ..., subject to a bunch of linear inequality constraints: a11x1 + a12x2 + ... >= b1, and so on for each of the m constraints.

So how can we try and do this?

Well, if you give me any non-negative constants ci, you can take the first constraint and multiply it by c1, the second constraint and multiply it by c2, and so on and so forth, and add those all up.

And what you'll get is a new linear inequality: w1x1 + w2x2 + ... >= t.

Here the wi are a combination of the cj's: wi is the sum over j of cj·aji, and t is the sum over j of cj·bj. This is a new inequality.

Now, if it's the case that wi is equal to vi for all i, what we have is that v1x1 + v2x2 + ..., the thing we were trying to minimize, is at least t.

And so, if we can arrange for wi to equal vi for all i, we've proven a lower bound on the thing that we're trying to minimize.

So, we'd like to find a bunch of ci's that are all non-negative, such that vi is the sum over j = 1 to m of cj·aji for all i, and such that, subject to these constraints, t, the sum of cj·bj, is as large as possible. That is, we'd like the biggest lower bound we can get.

Now, the very interesting thing about this is that the system we just wrote down is actually just another linear program.

We want to find the c in Rm such that the sum of cj·bj is as large as possible, subject to a bunch of linear inequalities, ci >= 0, and a few linear equalities, vi = the sum over j of cj·aji.

3:03

And so, to put this formally: given any linear program, which we very often call the primal, say minimize v·x subject to Ax >= b,

there's a dual linear program, which is the linear program: maximize y·b subject to y^T A = v (that's just another way of rewriting our equality constraints vi = sum of cj·aji), and y >= 0.

And it should be noted that even if your linear program wasn't exactly in this form, you can still write a dual program: it's the linear program of trying to find a combination of the constraints that bounds the thing that you're trying to optimize.

And it's not hard to show that any solution to the dual program bounds the optimum for the primal.

4:14

Now, the surprising thing is that this is not just a way of getting lower bounds: these two linear programs actually have the same optimum. If you find the best solution to the dual program, it always gives you a tight lower bound for the primal.
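To see this numerically, here's a sketch using SciPy's general-purpose LP solver on a small made-up instance of the primal "minimize v·x subject to Ax >= b" and the dual "maximize y·b subject to y^T A = v, y >= 0"; the two optima come out equal, as strong duality promises.

```python
# Sketch: checking strong duality numerically on a made-up instance.
# Primal: minimize v.x subject to A x >= b (x unrestricted).
# Dual:   maximize y.b subject to y^T A = v, y >= 0.
from scipy.optimize import linprog

A = [[1, 2],
     [3, 1]]
b = [4, 6]
v = [1, 1]

# linprog minimizes c.x with A_ub x <= b_ub, so negate A x >= b.
primal = linprog(c=v,
                 A_ub=[[-a for a in row] for row in A],
                 b_ub=[-bi for bi in b],
                 bounds=[(None, None)] * 2)

# Dual: maximize y.b  <=>  minimize (-b).y, with A^T y = v and y >= 0.
At = [[A[j][i] for j in range(2)] for i in range(2)]
dual = linprog(c=[-bi for bi in b],
               A_eq=At, b_eq=v,
               bounds=[(0, None)] * 2)

print(primal.fun, -dual.fun)  # the two optima coincide (~2.8 here)
```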

4:38

And this is incredibly useful.

On the one hand, it says that if you have a linear program and want to prove that your answer is optimal, you can try to solve the dual to provide a matching upper bound or lower bound.

4:51

It also means that if all you care about is the numerical answer, you can try to solve the dual program rather than the primal. And often the dual program is easier to solve, and so this makes things more convenient.

And even if the dual program isn't easier, often looking at the dual gives you some insight into the solution to the primal.

5:13

OK, so that's linear programming duality.

Let's look at some examples.

For example, let's look at the max flow problem. The size of your flow is the total flow going out of the source minus the total flow going into the source.

5:26

Now, we have a bunch of these conservation-of-flow equations, and we can add any multiples of those that we like, and the objective stays the same.

So when we do that, the thing we're trying to maximize is the same as the sum over all vertices v of some constant C sub v times (the total flow out of vertex v minus the total flow into vertex v).

5:50

Here C sub s needs to be 1, since s is the source, and C sub t needs to be zero, since t is the sink; but for any other vertex v we can take C sub v to be anything we like.

[COUGH] Okay, so we have this expression; what do we get when we write this down?

6:18

We can now try to bound this from above using our capacity constraints. And the best we can do here, it's not hard to show, is the sum over edges (v, w) of the capacity of that edge times the maximum of C sub v minus C sub w and zero.

6:58

In particular, if we take C sub v to be 1 for vertices v in some set, script C, containing the source, and 0 otherwise, the bound that we prove reduces to the sum over edges (v, w), where v is in script C and w isn't, of the capacity of the edge. But you'll note, script C is just a cut, and this bound that we proved is just the size of that cut.

And so this dual program, in some sense, is just trying to find the minimum cut. Hence linear programming duality, in this special case, just gives us that max flow equals min cut.
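Here's a small self-contained sketch of this on a made-up network: an augmenting-path max-flow computation (Edmonds-Karp, one standard method, not necessarily the lecture's), where the vertices reachable from the source in the final residual graph form a cut whose capacity equals the flow value.

```python
# Sketch: max flow = min cut on a tiny made-up network (Edmonds-Karp).
from collections import deque

def max_flow_min_cut(cap, s, t):
    """cap[u][v] = capacity of edge u->v; returns (flow value, cut set S)."""
    # residual capacities, with zero-capacity reverse edges added
    res = {u: dict(vs) for u, vs in cap.items()}
    for u in cap:
        for v in cap[u]:
            res.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            # no augmenting path: vertices reachable from s form a min cut
            return flow, set(parent)
        # find the bottleneck along the path and push flow
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(res[u][v] for u, v in path)
        for u, v in path:
            res[u][v] -= push
            res[v][u] += push
        flow += push

cap = {'s': {'a': 3, 'b': 2}, 'a': {'b': 1, 't': 2}, 'b': {'t': 3}, 't': {}}
value, S = max_flow_min_cut(cap, 's', 't')
cut_capacity = sum(c for u in S for v, c in cap[u].items() if v not in S)
print(value, cut_capacity)  # equal, by max flow = min cut
```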

7:27

Okay, let's look at this other problem, for example: the diet problem. Here we want to minimize the total cost of the foods you need to buy, subject to constraints: the diet needs to meet your daily requirements for various nutrients, and you need to get a non-negative amount of each type of food.

8:02

Okay, so when we combine all of those together, we're supposed to get a lower bound on the cost of our diet. And so, if you compare coefficients, the coefficient we need to end up with for a food f is the cost of that food.

8:18

And this should be equal to the sum over nutrients N of C sub N times the amount of that nutrient in the food f, plus some non-negative amount that we got by adding whatever multiple we had of the constraint that we buy a non-negative amount of that food.

8:43

But there's a nice way now of interpreting this C sub N: we can interpret it as the cost of buying one unit of nutrient N.

And so if there were a market where you could just buy each nutrient directly, calories at one price, protein at another, and so on, at the cost C sub N per unit, what the above equations are saying is that for each food, you can't cheat the system by buying that food: you can't get nutrients more cheaply than you could by buying the nutrients individually.

9:14

And if the cheapest way to get nutrients is buying them individually, it's pretty clear that the total cost of a balanced diet is at least the sum over nutrients of the cost of that nutrient times the amount of that nutrient that you're required to have in your diet.

9:28

And so, what this dual linear program tries to do is find non-negative costs for the various nutrients that satisfy these no-food-lets-you-cheat inequalities, such that the total cost of the required nutrients is as large as possible.
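Here's a numerical sketch of the diet problem and its dual, with invented foods, costs, and requirements, again using SciPy's LP solver: the primal finds the cheapest diet meeting the requirements, the dual finds nutrient prices that no food undercuts, and the two optima agree.

```python
# Sketch: the diet problem and its dual, with made-up numbers.
from scipy.optimize import linprog

cost = [2, 3]            # cost per unit of each of two foods
N = [[1, 2],             # N[n][f] = amount of nutrient n in food f
     [2, 1]]
req = [4, 4]             # required amount of each nutrient

# Primal: minimize cost.x  s.t.  N x >= req, x >= 0.
primal = linprog(c=cost,
                 A_ub=[[-a for a in row] for row in N],
                 b_ub=[-r for r in req],
                 bounds=[(0, None)] * 2)

# Dual: maximize req.y  s.t.  N^T y <= cost, y >= 0
# ("no food undercuts the nutrient prices").
Nt = [[N[n][f] for n in range(2)] for f in range(2)]
dual = linprog(c=[-r for r in req],
               A_ub=Nt, b_ub=cost,
               bounds=[(0, None)] * 2)

print(primal.fun, -dual.fun)  # both ~6.667: cheapest diet = best lower bound
```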

Now, there's one interesting observation about the solution: suppose that we're actually trying to exactly achieve this lower bound.

9:53

That would mean that you could never afford to buy overpriced foods: you could never buy a food where the cost of that food was strictly bigger than the total cost of all the nutrients that make up that food. You could only buy foods where the cost of the food is exactly the cost of the nutrients in that food.

10:11

And this gives us an example of a general phenomenon called complementary slackness, where basically what this says is that if you look at the solution to the dual, you should look at which constraints you needed to use in the dual program; that tells you which constraints in the primal program need to be tight.

10:30

So in particular, complementary slackness is the following theorem: if you give me a primal linear program, minimize v·x subject to Ax >= b, and its dual, then take solutions to these two. If you use a positive multiple of a constraint in the dual, that is, if yi is strictly bigger than 0 in the dual, then the corresponding constraint in the primal must be tight: the i-th inequality of Ax >= b holds with equality.
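Here's a sketch checking this on a small made-up instance with SciPy: we solve the primal and the dual separately and verify that every constraint with a positive dual multiplier is tight at the primal optimum.

```python
# Sketch: checking complementary slackness on a made-up instance.
# Primal: minimize v.x subject to A x >= b (x unrestricted).
# Dual:   maximize y.b subject to y^T A = v, y >= 0.
from scipy.optimize import linprog

A = [[1, 2], [3, 1], [1, 1]]
b = [4, 6, 1]
v = [1, 1]

primal = linprog(c=v, A_ub=[[-a for a in row] for row in A],
                 b_ub=[-bi for bi in b], bounds=[(None, None)] * 2)
At = [[A[j][i] for j in range(3)] for i in range(2)]
dual = linprog(c=[-bi for bi in b], A_eq=At, b_eq=v,
               bounds=[(0, None)] * 3)

x, y = primal.x, dual.x
slacks = [sum(A[i][k] * x[k] for k in range(2)) - b[i] for i in range(3)]
for yi, slack in zip(y, slacks):
    # a positive dual multiplier forces the primal constraint to be tight
    if yi > 1e-9:
        assert abs(slack) < 1e-6
print(list(y), slacks)  # the constraint with positive slack gets multiplier 0
```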

11:04

So let's see what this means. Let's suppose that we have a linear program defined by these five linear inequalities, labelled 1, 2, 3, 4, 5 in the diagram below. The allowed region is this gray region, and the red point marked is the optimum.

11:20

Now, suppose that we're looking for solutions to the dual program. Which of these five inequalities might actually be used, as a positive multiple, in the solutions to the dual program?

11:34

Well, it turns out the only inequalities that you could actually use are two and four, because complementary slackness says that the only constraints that get used in the solution to the dual are ones that are tight in the primal. And in this case, two and four are the only lines that this solution to the primal actually lies on, and so those are the only constraints that could actually be used in the solution to the dual.

12:00

So, in summary: every linear program has a dual linear program.

The solution to the dual bounds the solutions to the primal. And, surprisingly, the LP and its dual have the same answer, which means that the solution to the dual actually gives a tight bound on the solution to the primal.

In addition, we have this complementary slackness, where knowing the solution to the dual tells you a lot about where the solution to the primal lies; in fact, it tells you which constraints the solution to the primal needs to make tight.

12:32

So that's basically everything we have for this lecture. Next lecture we're going to talk about proofs of these things; that material is not necessarily required, but if you'd like to see it, it's informative.
