0:03

Now we're going to talk about saddle-point asymptotics, which is the process of refining saddle-point bounds to get the more accurate estimates that are available to us.

So again, in the previous segment I talked about the basic idea of using the Cauchy coefficient formula: just find the saddle point, use a circle of radius equal to the saddle point, and bound the integrand everywhere on the circle by its value at the highest point. But now, to get better asymptotics, what we're going to do is focus on the place near the saddle point.

As we saw early on, the integrand is really low for most of the circle, so what we want to do is find a good bound on the tail, the part that's really low, and focus on the part of the path that's near the saddle point. This is like Laplace's method, which we talked about very early on when introducing asymptotic analysis: you focus on the center, bound the tails, and then bring the tails back in. With that kind of calculation we can get very accurate asymptotic estimates. That is, we're going to look at the little part of the circle that is close to the saddle point, and the rest of it we're going to show to be negligible, with different estimates.

That's the saddle-point method, and it's easy to see why it's going to be effective: we saw that the bounds were off by about a factor of the square root of 2 pi n, which essentially comes from the part of the circle that's low but that we were assuming to be high. That's all it is, just that factor. So we're going to do a little more analysis to see that we don't need to make that assumption, and that we can get an accurate estimate with Laplace's method.

Now, as with many of the transfer theorems that we consider in analytic combinatorics, there are going to be technical conditions that need to be satisfied for us to apply this method. Just for completeness, in this lecture I'm going to call that susceptibility. That's just a name for the technical conditions that are going to be needed for saddle-point approximations to work. All it is, really, is some kind of formal statement that this process is going to work.

So: you can find the saddle point, which is the root of the saddle-point equation; the curve can be split into two parts such that the tail part is negligible; and on the part that's not negligible, near the saddle point, there's a quadratic approximation, just like we saw from an elementary analysis of what a saddle point is. It doesn't have to be precisely the saddle point: if you're near the saddle point, you have that quadratic approximation. That's about as general as we can make the conditions.

Â 3:34

And as with many of our transfer theorems that we considered earlier on, for singularity analysis and other approaches, these technical conditions have to be checked. So if there is a multiple saddle point, then maybe there will be some other part of the condition to worry about, but I'm not going to worry about that in this lecture. Also, the tails have to be negligible in the first place, and then, however this estimate works, it has to be negligible to bring the tails back in. We'll look at a detailed example, but this kind of technical detail is more appropriate to read about in the book.

So, what this is going to lead to is the so-called saddle-point transfer theorem. All it says is that if you have a contour integral, and we write the integrand as e to the little f of z, just to simplify the calculation a little bit (as long as there's no singularity, there's no problem with doing that), then, if it's susceptible to this sort of approximation, the integral is approximated by e to the f at the saddle point, over that square root of 2 pi times the second derivative factor. It tells us exactly how to compute the missing factor. This is a general technique for contour integration; it's not just for asymptotics of coefficients, it applies to any contour integral, and it gives a way to approximate it. And the proof is very similar to the proof that we did for the saddle-point bound.
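In symbols, the statement being described (with zeta the saddle point, and with the conventions of this lecture, where f''(zeta) is positive on the positive real axis) is:

```latex
% Saddle-point transfer theorem (sketch): if the integral is
% susceptible (tails negligible, quadratic behavior near the saddle),
\frac{1}{2\pi i}\oint e^{f(z)}\,dz \;\approx\;
  \frac{e^{f(\zeta)}}{\sqrt{2\pi f''(\zeta)}},
\qquad \text{where } f'(\zeta) = 0 .
```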

So that's the theorem, but what we want to use it for is a transfer method for generating functions. Translating that over to our use of Cauchy's theorem, where we're looking for the coefficient of z to the n in a generating function G(z), what we're going to do is just apply this theorem with e to the f of z equal to G(z) over z to the n plus 1. So our saddle-point equation is just modified slightly, and the main point is that we evaluate our function at the saddle point, and then we have this corrective factor, which is just the square root of 2 pi times the second derivative. So it's a plug-and-chug method for getting coefficients of z to the n out of entire functions, which have no singularities.
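As a side note, the plug-and-chug recipe just described is mechanical enough to run numerically. Here's a small sketch (the function name and the bisection bounds are my own choices, not from the lecture): solve the saddle-point equation by bisection, then plug into the transfer formula; it's checked against the coefficient of z to the n in e to the z, which is 1/n!.

```python
import math

def saddle_coefficient(logG, dlogG, d2logG, n, lo, hi):
    """Estimate [z^n] G(z) via the saddle-point transfer formula.

    With f(z) = log G(z) - (n+1) log z, solve f'(zeta) = 0 by
    bisection on [lo, hi], then return exp(f(zeta)) / sqrt(2 pi f''(zeta)).
    """
    fprime = lambda z: dlogG(z) - (n + 1) / z
    for _ in range(200):                      # bisection for the saddle point
        mid = (lo + hi) / 2
        if fprime(lo) * fprime(mid) <= 0:
            hi = mid
        else:
            lo = mid
    zeta = (lo + hi) / 2
    f2 = d2logG(zeta) + (n + 1) / zeta**2     # f''(zeta)
    f = logG(zeta) - (n + 1) * math.log(zeta)
    return math.exp(f) / math.sqrt(2 * math.pi * f2)

# G(z) = e^z, so [z^n] G(z) = 1/n! and the saddle point is zeta = n + 1.
n = 20
est = saddle_coefficient(lambda z: z, lambda z: 1.0, lambda z: 0.0,
                         n, 1.0, 100.0)
# est is within about half a percent of 1/20!
```

For G(z) = e^z the saddle point is zeta = n + 1, so the bracketing interval just has to contain it.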

Now, there certainly is technical detail in proving susceptibility, and I'll try to illustrate some of that in an example. But in this lecture, what I really want to focus on is the idea that this kind of approximation is available. I can't claim that this is as clean as many of the other transfer theorems that we've looked at, because checking those technical conditions can be more challenging.

So anyway, the summary is that for our generating function G(z), and this is just doing the math backwards, we're going to use the saddle-point equation G'(z)/G(z) = (n+1)/z. Zeta is the solution to that, and then that's the approximation, where little g is the log of big G minus (n+1) log z. That's a relatively simple calculation for getting asymptotic estimates of coefficients. So let's go back to our examples.

So, estimating 1 over n factorial via a saddle-point approximation: we want the coefficient of z to the n in e to the z, so G(z) = e^z. Taking logs, our saddle-point equation is 1 - (n+1)/z = 0, so zeta = n+1. Then the approximation is: just plug in zeta, and the second derivative is 1/(n+1). So the approximation that we get is e to the (n+1) over (n+1) to the (n+1) times the square root of 2 pi over (n+1). And again, the (n/(n+1)) to the (n+1) factor goes to 1 over e, and just simple asymptotics gives us exactly Stirling's approximation: e to the n over n to the n times the square root of 2 pi n.
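Written out, the chain just described is:

```latex
% G(z) = e^z gives f(z) = z - (n+1)\log z, so f'(z) = 1 - (n+1)/z,
% \zeta = n+1, and f''(\zeta) = 1/(n+1); the transfer theorem then gives
\frac{1}{n!} = [z^n]\,e^z \;\approx\;
  \frac{e^{\,n+1}}{(n+1)^{\,n+1}\sqrt{2\pi/(n+1)}}
  \;\sim\; \frac{e^{\,n}}{n^{\,n}\sqrt{2\pi n}} .
```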

So this extra factor in the saddle-point approximation gives us the extra factor that we were missing before. The choice that we always have when using saddle points is: you can do all the detailed calculations to check that the integral is susceptible and so forth, or you always have the option of just using the bound and sacrificing that factor, which in many applications is certainly what people will choose. Now, if it becomes important to get the factor, we can get it, but it takes some work. And again, in checking susceptibility, you have to check that the tails are negligible, that the central approximation is quadratic, and that we can complete the tails back in, in a reasonable way. That's really doing the calculation for the particular function that we have. At this level we don't necessarily have a general scheme, but this is an important starting point for limiting distributions, which is something advanced that we don't study in this course.

So, let's just look at an example. Again, this is the last lecture in a long course, so I'm not going to do these calculations in detail, other than to just exhibit them. And this is the kind of calculation that's needed to show that it's

susceptible for this problem. So, what we're doing is trying to do the contour integral of e to the z over z to the n plus 1, or, putting it all into one function, e to the z minus (n+1) log z, around this circle z = n e^(i theta). So the first thing is to just switch to polar coordinates, z = n e^(i theta), and plug that in, and most of it comes out: e to the n over n to the n comes out, and all that's left is the integral from 0 to 2 pi of e to the n(e^(i theta) - 1 - i theta). That's just simple substitution.

Now, what the saddle-point method requires is that we're able to split this into two contours: one for the little part close to the saddle point, and then the rest for the tail, which is all the rest of the circle. So those are the two contours, and the point is that we have to be able to prove that the tail gets exponentially small. You can look at the text for that proof, but when we're away from the saddle point, remember, in the 3D diagram that's where the curve drops all the way down, so it's exponentially small. So we're going to neglect the tails and work with the part close to the saddle point. Another thing to notice is that we can slightly shift the saddle point, if we want; it simplifies the calculation quite a bit, as long as we have that quadratic approximation near where we are.

Â 13:04

And there's a small calculation to check that. Then we can change variables and do the integral, and that's the integral that we have. Then we want to restrict theta the other way, to complete the tails, so that we get an integral that we can evaluate. That integral is a regular normal integral: the square root of 2 pi over n is the final result. So we had to bound alpha one way to complete the tails, and bound it the other way to drop the O-term, and we're left with finding an exponent in between minus one-half and minus one-third. So we pick minus two-fifths, and that's the complete derivation that shows that it's susceptible to a saddle-point approximation.

Again, these are intricate calculations, but it's kind of always the same; it's a matter of where that bound is, and that depends on the function. There's a place where the estimate near the saddle point is accurate, and there's a place where the tails are negligible, and you just have to find that place.
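For reference, the chain of estimates just sketched, in outline (the split point n to the minus two-fifths is the value chosen in the lecture):

```latex
% Polar coordinates z = n e^{i\theta} pull out e^n/n^n:
\frac{1}{2\pi i}\oint \frac{e^z}{z^{n+1}}\,dz
  = \frac{e^n}{n^n}\cdot\frac{1}{2\pi}\int_0^{2\pi}
    e^{\,n(e^{i\theta}-1-i\theta)}\,d\theta .
% Split at \theta_0 = n^{-2/5}. For \theta_0 \le |\theta| \le \pi the
% integrand is exponentially small (the tail); for |\theta| < \theta_0,
%   n(e^{i\theta}-1-i\theta) = -n\theta^2/2 + O(n\theta_0^3),
% and completing the Gaussian tails gives
\int_{-\infty}^{\infty} e^{-n\theta^2/2}\,d\theta = \sqrt{2\pi/n},
% so the total is e^n/(n^n\sqrt{2\pi n}), Stirling's form of 1/n!.
```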

And then, in the end, we're getting Stirling's approximation back from this problem. So that's an illustration of the difficulty of proving susceptibility, and unfortunately that's still there for many problems. Again, this transfer theorem is not at all as simple as others we've looked at.

And you might well ask: you're talking about the difference between n to the 2/5 and n to the 1/2, which is about n to the 1/10. Wasn't there something in early lectures about such differences not being relevant unless n is so huge that we wouldn't care about it? Is that a problem here? Well, actually it's not, because in these cases we are talking about estimates that are in the exponent, so n doesn't need to be huge. Even if n is like a billion, these estimates are still something we are going to care about, and they are going to matter, because it's in the exponent.

The other thing is that I'm doing a quick illustration here; we can get a full asymptotic series to any desired precision using these methods.

And also, when we get the answer out, we can validate it numerically, which we usually do for problems of this sort. And we're still pushing towards the goal of having a general schema that will cover a whole family of classes. It's hard to say that we're as far along for saddle points as with some other methods; actually, when you look at the extension to limiting distributions, we are, but that's an advanced topic beyond the scope of this course.

So, let's look again at the transfer for the central binomial case. Now we're estimating the coefficient of z to the n in (1 + z) to the 2n, so that's our G(z).

Â 16:28

So we take f(z) to be 2n log(1 + z) minus (n+1) log z, and define that to be our function for the saddle-point equation. The first derivative is 2n/(1 + z) minus (n+1)/z, and we compute the second derivative too. The saddle-point equation sets that first derivative to 0, so the saddle point is (n+1)/(n-1).
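In symbols, the computation just described:

```latex
f(z) = 2n\log(1+z) - (n+1)\log z, \qquad
f'(z) = \frac{2n}{1+z} - \frac{n+1}{z} = 0
\;\Longrightarrow\; \zeta = \frac{n+1}{n-1}.
```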

And that's what we saw before. So the approximation says: plug (n+1)/(n-1) into this formula, and we get a pretty ugly expression. We could go ahead and estimate, but remember, it's okay to shift the saddle point if you want to simplify the calculation, and this one is really close to 1. So rather than go through an estimate of those expressions, let's just use 1. If we say the saddle point is 1 and plug it into this equation, then we immediately get 4 to the n over the square root of pi n, as expected. Making that choice early on also simplifies the check of susceptibility and so forth, which is something that you want to do. You can slightly shift the saddle point, as long as that doesn't disturb the idea that you have a quadratic approximation and a negligible tail; it's not going to matter. And we can more easily complete a full derivation for a problem like this.

So again, another approach is to just use the saddle-point bound and sacrifice the square root of pi n factor. But if it's an application where it's important to see what that factor is, then you can go in and get it using standard techniques. I think that's the message.
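As a quick numerical sanity check of the 4 to the n over square root of pi n estimate against the exact central binomial coefficient (n = 50 is my own arbitrary choice here, not from the lecture):

```python
import math

n = 50
exact = math.comb(2 * n, n)            # central binomial coefficient
est = 4 ** n / math.sqrt(math.pi * n)  # saddle-point estimate with zeta = 1

# The relative error is about 1/(8n), well under one percent here.
rel_err = abs(est / exact - 1)
```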

And that's saddle-point asymptotics.
