0:02

Cao Cao led his army towards the south in anticipation of a war with the Sun-Liu alliance. Cao Cao had a much larger and stronger army than the alliance, but being residents of the north, Cao Cao's soldiers were not familiar with sailing on rivers. The soldiers became very nauseated during their sail to Eastern Wu. Zhuge Liang and Zhou Yu planned to send Pang Tong, a famous scholar at the time, to infiltrate Cao Cao's camp. They hoped Pang Tong would trick Cao Cao into believing that connecting his ships together with iron chains would stabilize them and prevent his army from becoming sick. In reality, this would provide an opportunity for the alliance to launch a widespread and destructive fire attack on the ships. Pang Tong went to Cao Cao's camp to plant this idea, and knowing him to be a notable scholar, Cao Cao trusted his advice.

0:55

He asked him to start by packing the square-shaped house ships together. [SOUND] Pang Tong came up with an arrangement, but Cao Cao thought it wasn't stable enough, as there were too many gaps and the span of the packing was too large. [SOUND] He asked Pang Tong to come up with a tighter and more stable arrangement. Pang Tong was unable to do so and traveled back to the allies to ask Zhuge Liang for help using the tablet.

[MUSIC]

[SOUND] >> [SOUND] So, Cao Cao has a very large army, and with that army come auxiliaries. These are officials, families of the soldiers, other kinds of camp followers. And in order to make them mobile, he's housing them on these ships, these square house ships.

1:37

But Pang Tong says, well, look, a lot of the people in your camp, the followers, are from the north, and they find these ships very, very unsettling, because they move around in the river, and it gives them nausea. It means that they're not going to be performing to the best of their ability. So, what you should be doing is tying them all together to make one big sort of house raft, with lots and lots of these house ships joined together. And that will make it much more stable, and then these people will be much more comfortable on the ships. And so, he suggests, here's a pattern where we can join all those ships together to make one big raft.

2:15

And so we can see that these are all squares of different sizes. So, Cao Cao looks at this thing and says, look, hang on, there are these gaps in here. I mean, surely we can make a better square. Obviously, the tighter the packing is, the more stable the whole raft will be, because there'll be more ships next to each other. So, he says, well, what I really want you to do is, yes, we should tie the ships up, but we should try to minimize the size of this rectangle to make it as stable as possible.

2:43

And so that gives us the multiple square packing problem. So, we're going to be given a number of squares of sizes 1x1 up to nxn, and various numbers of copies of each size of square. And then we're going to pack the squares into a rectangle of the smallest possible area in order to solve Cao Cao's problem. So, here's our model for the problem. It's not too difficult. Here, n is the number of square sizes, so SQUARE just ranges over the square sizes, and then for each square size we have a number of copies. And then we can do a bit of reasoning: the maximum length of rectangle we'd ever need is if we just take the lengths of all these squares and add them together. We'd certainly have a solution with that length, just by lining up all the squares one after the other. And we're also going to work out the minimum area: if I sum up the areas of all the squares, then the area of the whole thing, the rectangle that I contain them in, obviously has to be at least that area.
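The parameter declarations described here can be sketched in MiniZinc roughly as follows; the identifier names (n, copies, maxl, minarea) follow the lecture's narration, but the exact declarations on the slide may differ:

```minizinc
int: n;                        % number of square sizes
set of int: SQUARE = 1..n;     % square sizes 1x1 .. nxn
array[SQUARE] of int: copies;  % number of copies of each size

% upper bound on any side length: lay all squares end to end
int: maxl = sum(i in SQUARE)(i * copies[i]);
% lower bound on the area: the rectangle must hold every square
int: minarea = sum(i in SQUARE)(i * i * copies[i]);
```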

3:39

Now, the principal decisions are these two: the height and the width of this rectangle. And we can note that the height has to be at least n, otherwise the square of size n wouldn't fit in it. And it doesn't have to be any longer than maxl. And similarly, the width is between those bounds. It's also worthwhile introducing the area, which is what we're actually trying to minimize, and which is going to be height times width. We can also give it fairly tight bounds. We know that the area has to be at least this minimum area. And of course, it needs to be no bigger than n times maxl, which would be a rectangle big enough to just fit all the squares in a row.
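A sketch of these decision variables with the bounds just described, again assuming the lecture's names height, width, and area:

```minizinc
var n..maxl: height;  % at least n, or the n x n square won't fit
var n..maxl: width;
% between the total square area and one long row of all squares
var minarea..n*maxl: area;
constraint area = height * width;
```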

4:12

All right, and we can also calculate the total number of squares, which is just adding up the number of copies of each size. And that'll give us the array of all the squares that we actually have to pack. And basically our main decisions are then going to be, for each of the squares, what's its x position and what's its y position? And notice, as I've said many times, when we introduce these decision variables, we're trying to add tight bounds on the variables, to reduce the amount of search we have to do to find a solution.
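The position variables might be declared along these lines; the bound maxl - 1 here is an assumption standing in for whatever tight bounds the slide actually uses:

```minizinc
int: nsq = sum(i in SQUARE)(copies[i]);  % total number of squares
set of int: NSQ = 1..nsq;
% lower-left corner of each square, with tight bounds
array[NSQ] of var 0..maxl-1: x;
array[NSQ] of var 0..maxl-1: y;
```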

4:44

All right, so here's an example of an answer to a square packing problem. Here we're packing a 1x1, a 2x2, a 3x3, and a 4x4 square together. It uses a height of 5 and a width of 7, so we have a total area of 35. And it's hard to see how to do better for this particular packing problem.

5:04

So, it's going to be useful to have some auxiliary variables in our model. We're going to have an array which, for each square laid out, gives us the size of that square. So, we're trying to build this array here, right: we have three of size 1, two of size 2, five of size 3, four of size 4, and three of size 5. This is just some calculation to do that. We're actually going to use variables here, and we're just going to use a global cardinality constraint to pin them all down. The global cardinality constraint is going to make sure that we have the right number of copies of each square size in this array. And then we're just going to order all the values in this size array so it's increasing. And that'll force there to be only one solution, which is exactly this one.
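A sketch of this size array built with global_cardinality, as described; the cover argument [i | i in SQUARE] lists the sizes being counted, and the ordering is written with the increasing global:

```minizinc
include "globals.mzn";

array[NSQ] of var SQUARE: size;
% the right number of copies of each square size ...
constraint global_cardinality(size, [i | i in SQUARE], copies);
% ... ordered increasing, so this array has exactly one solution
constraint increasing(size);
```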

5:53

All right, now that we have the size of each square — that's what we were trying to get: for each of this number of squares, the size it has — we can easily add the rest of the constraints. We need each square to fit inside the rectangle. So, if we take its right-hand side, which is its x-coordinate plus its size, that has to be less than or equal to the width. And if we take the position of its top, which is its y-coordinate plus its size, that has to be less than or equal to the height. And then we need to make sure that for each pair of squares s1 and s2, they don't overlap.

6:31

This one says that s1 is to the left of s2, this one that s1 is to the right of s2, this one that s1 is below s2, and this one that s1 is above s2. One of those four things has to happen, and that's enough to make sure those squares don't overlap. And then we're going to minimize the area. So, that gives us our model.
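The fit constraints, the pairwise non-overlap disjunction, and the objective might look like this sketch:

```minizinc
% each square fits inside the rectangle
constraint forall(s in NSQ)
  (x[s] + size[s] <= width /\ y[s] + size[s] <= height);

% for each pair: left of, right of, below, or above
constraint forall(s1, s2 in NSQ where s1 < s2)
  ( x[s1] + size[s1] <= x[s2] \/ x[s2] + size[s2] <= x[s1]
 \/ y[s1] + size[s1] <= y[s2] \/ y[s2] + size[s2] <= y[s1] );

solve minimize area;
```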

6:55

So, that looks like this, right? You can see we've actually packed every square very, very neatly into this rectangle of size 35. But let's look at the real instance that Cao Cao needs to have solved. Here we have seven different sizes of square, and for some of them there are no copies — there are no 3x3 ships that we need to tie up. We can see we get a solution quickly, and if we keep going, we get better solutions. After seven minutes, we find this solution of 504, but even after one more hour, we don't improve on it. Now, possibly this is the optimal solution, but we don't know. This is very common for hard combinatorial optimization problems: we can find a good solution but may not be able to prove it optimal.

7:40

So, how can we improve our model, given that we would like to get the best possible solution? We're going to add global constraints, obviously; we can add redundant constraints; and we can use symmetry breaking. All of these things will help reduce the amount of search it takes to answer this problem. So, the first thing is the global constraint, and as you should be used to by now, if there's some combinatorial substructure which is common, then there'll be a global constraint which captures it. Two-dimensional non-overlap is a very common thing in any kind of 2D packing problem, so there is a global constraint to do that; it's called diffn. It should probably be called diff2, because the 2 here would be the 2 dimensions.

So, what does it take? It takes the x coordinates of the rectangles, the y coordinates of the rectangles, the size in the x direction of each of those rectangles, and the size in the y direction of each of those rectangles, and makes sure that none of those rectangles overlap. And so if we go back to our model, we can remove this non-overlap constraint and replace it with just a single global constraint, which is this one here, right? The x-coordinates are x, the y-coordinates are y, the sizes in the x-direction are size, and the sizes in the y-direction are also size. So, that global constraint will do all of that non-overlap for us, and much more efficiently.
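The replacement described here is a one-liner; diffn takes the x coordinates, the y coordinates, the x sizes, and the y sizes, and for squares both size arrays are the same:

```minizinc
% widths and heights of the rectangles are both the square sizes
constraint diffn(x, y, size, size);
```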

8:57

Now, another thing we can do is note that cumulative is very close to a packing constraint. If there is a packing, then a cumulative constraint, which just adds up the amount of each size that we use in one dimension, will also hold. Which means we can add redundant cumulative constraints to our packing problem. And that will improve the propagation, and hence the solving, because the cumulative constraint may be able to do things before the diffn constraint can kick in.

9:24

And so we can add a cumulative constraint, basically saying that the amount of height we use at any position in our packing can only be up to this height that we're choosing. So, how much height are we using? We can think of these boxes as tasks. They start at their x coordinate, because the x coordinate is going along this way; they have a duration, which is their size, the size of the square; and they have a height, the resource usage, which is also the size. So, that cumulative constraint must be true in any correct packing. And we can do similar reasoning about the width: the usage across the width can only be up to width. We look at the y-coordinate, the duration is now the size going upwards, and the resource usage is the size going across. And that total usage at any point has to be less than or equal to this width that we're trying to minimize.
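The two redundant cumulative constraints just described can be sketched as follows; cumulative takes start times, durations, resource usages, and the resource bound:

```minizinc
% along the x axis: start x, duration size, usage size, bound height
constraint cumulative(x, size, size, height);
% along the y axis: start y, duration size, usage size, bound width
constraint cumulative(y, size, size, width);
```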

All right, so in general we have to be careful: cumulative constraints do not enforce packing. Here's an example, even with the x positions fixed, which shows this. We have all of these rectangles here and we're trying to pack them in a box. Now if we look at it, it looks like it fits. If we think about the cumulative constraint, we'll look here and say, we're not using too much height at this point, because if we add them up, we get something less than this bound. But if we try to actually pack them, right — if we try to push that 6 down, then it would push this 3 at the bottom. So, even though this satisfies the cumulative — we're never using too much resource — it's not a real packing. It's because the cumulative really sees the problem like this, right? And you can see that this is a cumulative solution, but it's not a packing solution, because there's no way of keeping the rectangle 4 in the right shape.

11:03

All right, but we have some very important things that we can take advantage of here, which are symmetries. When we have squares of the same size, they're completely interchangeable. So, if I have a square of size 2x2 here and another one here, and I swap them, then I get exactly the same solution. And so I need to make sure that only one of those equivalent solutions is going to be explored. And the way to do that is to order the placement of those squares. We're going to order them on their x, y coordinates. So basically, we want to say that the first square of size 2x2 is sort of below and to the left of any other square of that size.

11:39

Well indeed, it's a lexicographic ordering — or actually, we'll make it above and to the right. So, we're going to enforce a lexicographic ordering on the squares, so that (x1, y1), say the position of the first square of size 2x2, is lexicographically greater than the position of the second square of the same size. And of course, this means that either x1 is greater than x2, or x1 equals x2 and y1 is greater than y2.

12:06

And of course, we have a global constraint to do that kind of thing. lex_greater will take arbitrary-length arrays and enforce that this array here is lexicographically greater than that one. All right, there is its MiniZinc signature there. It's a very useful global constraint for many things, but it's most commonly used in symmetry breaking, like the example we have here.

12:29

So, now we're going to have to order our squares, and to do this, we're going to have to do a little bit of work, because we need to find all the squares of the same size. First we build this base array, which just gives the position just before the first square of each size. So, base[i] is the position just before the first square of size i in our layout of all nsq squares. I can compute that by just summing up the number of copies of all the squares of size smaller than this one.

12:59

And once I've got that base array, then I can build up my lexicographic constraints that make sure that squares of the same size are lexicographically ordered in decreasing order. So, here's what we're doing: we're iterating over all possible square sizes, and then we're looking at all the copies of each square size except the last one, because we're going to compare each one with the one after it. And then we're basically adding this lex constraint saying that each square of this size has an (x, y) coordinate that is lexicographically greater than that of the next square of this size. And you can see we're using base[i] plus j here to give us the right index into the x array for the squares of this size i.
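Putting the base array and the lex constraints together, a sketch of the symmetry-breaking code as described (copy j of size i sits at index base[i] + j):

```minizinc
% base[i]: position just before the first square of size i
array[SQUARE] of int: base =
  [ sum(j in 1..i-1)(copies[j]) | i in SQUARE ];

% same-size squares are interchangeable: order their positions
constraint forall(i in SQUARE, j in 1..copies[i]-1)
  (lex_greater([x[base[i]+j],   y[base[i]+j]],
               [x[base[i]+j+1], y[base[i]+j+1]]));
```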

13:45

All right, now we solve the model, and indeed we find a better solution than we found after one hour before. And in fact, we've proved it optimal in just 32 seconds. So, obviously, adding all of this extra stuff to our model made a very significant difference to how well it could solve.

14:17

So, we can make another improvement. We used the constraint solver to build this size array, but really that size array was fixed: once we knew the number of copies of every ship, we could have built this array without using constraints at all. That's likely to be much more efficient. It's not that this part takes very long to solve, but if we can just build the array directly, then MiniZinc, when it's compiling the model, knows these fixed sizes and can make some more optimizations, which may not be possible with the version where only the solver works out the sizes.

14:56

So, how do we build this array? Well, it's a little bit complicated. We're going to iterate over all the i's in NSQ — those are the positions we're trying to give sizes to. And we're looking for all the j's where i is greater than base[j]. So, those are basically going to be 1, 2, 3, up to the size whose block position i falls in. And then we take the maximum of those, and that's actually going to give us, for each position i, which square size should be in that position. Takes a little bit of thought, but it works.
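The fixed version of the size array described here might be written as the following comprehension, where position i gets the largest size j whose block of copies starts before i:

```minizinc
% size as a parameter: the largest size j with base[j] < i
array[NSQ] of int: size =
  [ max([ j | j in SQUARE where i > base[j] ]) | i in NSQ ];
```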

15:31

All right, so what we've seen here is a packing problem, a 2D packing problem. It's a very common use of CP in the real world to solve these kinds of packing problems. There are lots of varieties, and they are very complex discrete optimization problems, and they also typically have things like symmetries. So, we've seen the use of diffn to encode 2D non-overlap. You can think of it as a direct extension of the disjunctive constraint, which encodes 1D non-overlap, if you think about it that way. And we've also seen that we can use cumulative constraints to help with our packing problems. They're redundant for packing problems, but they're very useful for improving our solving.
