
So that's a key aspect of Constraint Programming: most of the time, you try to capture the set of solutions as tightly as possible by expressing the constraints as precisely as you can, okay?

So I'm going to use a couple of simple examples to illustrate this. Remember the magic series, okay? We want to find numbers S0 to Sn such that Si represents the number of occurrences of i inside the series. Last time I asked you whether we could actually find such a series, and this is a beautiful one, okay?

So you see there are two occurrences of 0, one occurrence of 1, two occurrences of 2, and then zero occurrences of 3 and zero occurrences of 4, okay?
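The definition is easy to check programmatically. Here is a minimal Python sketch (my own illustration, not from the lecture) that simply restates the definition and tests the series above:

```python
def is_magic_series(s):
    """True iff s[i] equals the number of occurrences of i in s."""
    return all(s[i] == s.count(i) for i in range(len(s)))

print(is_magic_series([2, 1, 2, 0, 0]))  # True: the series from the lecture
```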

Now, this is the model that we saw last time. One of the things I'm asking you is: can you find properties that the solutions have to satisfy? When you look at the array of decision variables, series, can you find a constraint that will actually help, in addition to the constraints that we have stated? The constraints we have stated capture exactly what the solutions are.

But can I find another constraint which is always true for every solution, but which will give the solver much more information? That basically means the solver will be able to remove values from the domains of the variables, because that's the rule of the game essentially, okay?

So, typically, the way you find such a constraint is that you exaggerate, right? Look at this thing: this is my magic series, and I'm asking you, can I put 17 there? I'm exaggerating, right? Nobody would do that. I could put 2,000 there; it makes no sense. Why? Because what would 17 denote there? 17 would denote the number of occurrences of 4 in the series, but that series is of length 5. There is no way a variable can denote 17 occurrences of 4, okay?

So, in a sense, what do I know? Can I say something interesting about the sum of the numbers in this series? Yes: they are all occurrence counts. How many occurrences can I have in total? Well, the length of the series, okay? So I get this beautiful constraint which says that the sum of all the elements in the series has to be equal to the length of the series, which is n, okay? This constraint has to be true; it must be true, there is no other way. Since these decision variables denote numbers of occurrences, the sum of them cannot be more than n.

So this is a constraint which is redundant from a semantic standpoint. It will never remove any solutions; every solution will satisfy it. But it will let the solver prune the search space better, okay?
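You can convince yourself by brute force on a small instance that the constraint never removes a solution. A quick Python sketch (names are my own) that enumerates all length-5 magic series and checks the sum:

```python
from itertools import product

def is_magic_series(s):
    # s[i] must equal the number of occurrences of i in s
    return all(s[i] == s.count(i) for i in range(len(s)))

n = 5
solutions = [s for s in product(range(n + 1), repeat=n) if is_magic_series(s)]
print(solutions)                             # [(2, 1, 2, 0, 0)]
assert all(sum(s) == n for s in solutions)   # the redundant constraint holds
```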

So, can I do better than this, okay? Can I find something which is actually stronger than that? Look at this thing. What am I writing? Assume that series 2 is equal to 3. What does that really mean? It means I put a 3 over there, and that says there are three 2's in this array, okay? So far so good, that's what we know. But saying that there are three 2's in this array means what? How many occurrences does that account for? Well, each of those 2's stands for two occurrences, and there are three of them, so that means six occurrences, okay?

So essentially what we know is that there is another way of counting the number of occurrences. Instead of summing every element in the series, you look at every element of the series, say element i, and you multiply it by i. Because what I had before was: series 2 was 3, so I multiplied by 2, and that makes six occurrences, okay? So it's another way of computing the same information, okay?

So I have another redundant constraint, which is basically that the summation of i times series i has to be equal to the number of occurrences, which is n, okay? So that's another question: which of the two constraints that I've shown you is stronger? I will tell you in a little bit which one is stronger, okay?
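Both redundant constraints can be checked on the known small solutions. A small sketch (my own illustration; the length-4 series are standard examples not shown in this lecture):

```python
# Known magic series of lengths 4 and 5.
for s in ([1, 2, 1, 0], [2, 0, 2, 0], [2, 1, 2, 0, 0]):
    n = len(s)
    assert sum(s) == n                           # sum of the counts = length
    assert sum(i * s[i] for i in range(n)) == n  # weighted count = length
print("both redundant constraints hold")
```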

But it doesn't matter, right? We could put the first one and the second one inside the solver and let the solver propagate the two constraints at the same time. Okay, so let me show you the kind of deduction that we can do. This is essentially taking only the second one, which is stronger in almost all cases; there's one case where it is not, but in almost all cases it is stronger.

So you see the constraint propagation from before, but now you also see this redundant constraint over there. Okay, what is interesting? You see a 5 there, and you see that this guy is multiplied by 4, right? What does that mean? It means that the maximum value that series 4 can take is what? Well, it's 1, right? And the same thing for series 3; for series 2 the maximum value is going to be 2; otherwise I would exceed the 5, okay? And for series 1 it's going to be 5, okay? So immediately I can deduce very strong information on the values of these variables, and I can use that for pruning the search space. Very quickly, many of these reified constraints can disappear, because I know that they can't be true. So essentially, here, I have removed about a third of these constraints, okay?
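The bound reasoning here can be sketched in a line or two: since every term i * series[i] is nonnegative and the weighted sum equals 5, the upper bound of each variable tightens to 5 // i (my own illustration of the deduction):

```python
n = 5  # length of the series, and the value of the weighted sum
# From sum(i * series[i]) == n with nonnegative terms, the solver can
# tighten the upper bound of series[i] to n // i for every i >= 1.
upper = {i: n // i for i in range(1, n)}
print(upper)  # {1: 5, 2: 2, 3: 1, 4: 1}
```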

Then, assume that I make a very simple choice. I say that series 0 is equal to 2, okay? That basically means that I put a 2 over there, right? So you see the 2 over there, okay? It also means that I have a 2 in the series, and therefore series 2 has to be at least 1: I already have one occurrence over there, okay? And then you look at the redundant constraint that I have added. So, what do you see there? I know that series 2 is greater than 0 now, okay? And therefore, when you look at this constraint, essentially the 5 has become a 3, okay? And therefore I can immediately deduce that series 4 has to be equal to 0, because it is multiplied by a coefficient of 4 and its term can be at most 3; and that series 3, multiplied by 3, can be at most 1, okay? So once again, I can start propagating that information, and my search space becomes much smaller, okay?

So, if you use the simple model that I've just shown you, you can solve magic series up to really, really large sizes. And if you actually implement this beautiful model, you will see a very nice pattern in the way the solutions are constructed, and in the similarities that they have with each other. Okay, so the first role of redundant constraints is to express properties of the solutions.

And because the redundant constraint is looking at the problem from a different angle, it's going to boost the propagation. So in a sense, what you are adding is this new constraint inside the overall model. You had all these constraints which were already there; they were pruning the search space independently. You add the new one, and that one will also reduce the domains and therefore have an impact on the other constraints. That's the first role of redundant constraints, okay?
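The nice pattern the lecture alludes to can be made concrete: for lengths of at least 7, the solution follows a fixed shape. This is a well-known property of self-describing series, not something derived in the lecture, so take this sketch as a hedged illustration:

```python
def pattern_series(m):
    # Known shape for length m >= 7: (m-4) zeros counted by s[0],
    # two 1's, one 2, and a single occurrence of the value m-4.
    s = [0] * m
    s[0], s[1], s[2], s[m - 4] = m - 4, 2, 1, 1
    return s

def is_magic_series(s):
    return all(s[i] == s.count(i) for i in range(len(s)))

assert all(is_magic_series(pattern_series(m)) for m in range(7, 60))
print(pattern_series(7))  # [3, 2, 1, 1, 0, 0, 0]
```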

So, I'm going to show you another role, the second role, which is to provide a more global view, okay? What I'm going to show you is that you can actually combine existing constraints, and that combination will improve the communication between the various variables, okay? I'm going to illustrate this with the market split problem, okay?


Essentially, you can think of it as a multi-knapsack problem, except that the constraints, instead of being less-than-or-equal, are going to be equalities, okay? So what you have is a set of constraints indexed by C and a set of variables indexed by V, okay? For every c in C and v in V you have a weight w[c][v]; you can think of a knapsack in multiple dimensions. And then you have a right-hand side for every constraint; you can think of it as a capacity, okay? The decision variables are going to be these values x; there is one variable for every element of V, okay? And then the constraints are essentially knapsack constraints, except that each is an equality and not a less-than-or-equal: you multiply the weights by the variables, for every one of these constraints, and you want that to be equal to the right-hand side, okay?

Now, these constraints are actually pretty weak from a propagation standpoint. They don't communicate with each other except through the domains, and in this particular case these linear constraints are basically doing their work independently; they communicate very late, okay? So one of the things you can do is

use the concept of surrogate constraints. You take all these constraints and you add them up, each one with a different coefficient, okay? I'm basically using alpha to the power c here: I multiply the first constraint by alpha, the second constraint by alpha squared, the third constraint by alpha cubed, and so on, okay? That way I can combine all these constraints. I also multiply the right-hand sides in the same way. The result is a linear combination of all the existing constraints, and it's valid, obviously: it's not going to remove any solution, and I can add it to the solver.

So what I did was essentially take constraints that are completely independent and merge them together. Now they are working on the same set of variables, and they are going to prune the search space more, okay? Surrogate constraints are used in Constraint Programming, and they are used in Mathematical Programming as well; we might talk about that later on, okay. But essentially what I'm doing is a simple combination, in this particular case a linear combination, of existing constraints, okay?

So in a sense, a surrogate constraint is just a new constraint that I add, like the properties of the solutions that I showed you before. Except that this time I'm not expressing a property of the solutions per se, as something that I discovered; I'm simply taking existing constraints, combining them, and getting this new constraint that I put inside the constraint solver. Once again, by definition, if I add new constraints, the only thing that can happen is that I prune more; in the worst case I prune the same, okay? And in some applications, it can make a big difference. Thank you.