
I was being super sloppy when I wrote this out.

I wrote 1 over 1 minus x equals this power series, but

that's not true unless I include this statement.

I should have written this down.

If the absolute value of x is less than 1.

And if that's true, then these are equal.
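A quick numerical sketch in Python makes the role of that condition concrete: the partial sums of the geometric series settle down to 1 over 1 minus x when the absolute value of x is less than 1, and blow up when it isn't.

```python
# Partial sums of the geometric series 1 + x + x^2 + ... compared to 1/(1-x).
def geometric_partial_sum(x, n_terms):
    """Sum of x^0 + x^1 + ... + x^(n_terms - 1)."""
    return sum(x**n for n in range(n_terms))

# Inside the radius of convergence (|x| < 1), the partial sums settle down:
x = 0.5
print(geometric_partial_sum(x, 50))  # very close to 1/(1 - 0.5) = 2.0
print(1 / (1 - x))

# Outside (|x| > 1), the partial sums just keep growing:
print(geometric_partial_sum(2.0, 50))  # astronomically large -- no convergence
```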

The radius of convergence of this series for

1 over 1 + x squared is also 1, and we can see that in the graph.

Here I've graphed the function 1 over 1 plus x squared, and

we can take a look at its Taylor series expansion around the point 0.

Here's just the first term, which is the constant term.

Here's through the quadratic term.

Here's through the x to the fourth term,

through the x to the sixth term and so on.

And you can see that it's doing an increasingly

good job of approximating the function between minus one and one.

But over here, it's actually doing an increasingly bad job.
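You can see the same thing numerically. Here's a small Python sketch of the partial sums of 1 minus x squared plus x to the fourth minus x to the sixth and so on, which is the Taylor series at 0 for 1 over 1 plus x squared:

```python
# Partial sums of 1 - x^2 + x^4 - x^6 + ..., the Taylor series at 0
# for 1/(1 + x^2).
def partial_sum(x, n_terms):
    return sum((-1)**n * x**(2 * n) for n in range(n_terms))

x = 0.5  # inside the interval (-1, 1): the approximation is good
print(partial_sum(x, 30), 1 / (1 + x**2))  # both near 0.8

x = 1.5  # outside: adding terms makes things worse, not better
print(partial_sum(x, 30), 1 / (1 + x**2))
```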

One over one plus x squared looks like a really nice function.

So if it's such a nice function, why isn't the Taylor series

centered around zero doing a better job of approximating the function far away?

Well here I've graphed y equals sine of x.

And honestly, I mean, qualitatively at least, this isn't so

different looking from the graph of 1 over 1 plus x squared.

The graph of 1 over 1 plus x squared had kind of a big bump in the middle, and

this thing's just got lots of bumps.

And yet the Taylor series is a totally different experience.

Let's start writing down the Taylor series around the origin.

So, here is the point around which I'm expanding.

Here is just the constant term in red.

Here's through the linear term.

Here's through the cubic term.

And I'm gonna keep on going and as I add more and more terms in my Taylor series

expansion, I'm getting increasingly good approximations to sine of x everywhere.
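A small Python sketch shows the same behavior numerically: the partial sums of the sine series are accurate even well away from the center, as long as you take enough terms.

```python
import math

# Partial sums of x - x^3/3! + x^5/5! - ..., the Taylor series for sin(x) at 0.
def sine_partial_sum(x, n_terms):
    return sum((-1)**n * x**(2 * n + 1) / math.factorial(2 * n + 1)
               for n in range(n_terms))

# Even far from the center, enough terms give an excellent approximation:
for x in (1.0, 5.0, 10.0):
    print(x, sine_partial_sum(x, 40), math.sin(x))
```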

The sine of x is equal to its Taylor series centered around 0 for

all values of x, whereas 1 over 1 plus x squared, its Taylor series

around 0 is only equal to 1 over 1 plus x squared if the absolute value of x is less than 1.

We know other examples like that.

Like this, this is the graph y equals 1 over 1 minus x.

And I can start writing out the Taylor series around this point.

Here's the constant term, the linear term.

I take more and more terms and yeah, I mean, the radius of convergence here is 1.

But maybe that isn't so surprising.

I mean, 1 over 1 minus x has a bad point at 1.

I don't want to plug in x equals 1 because then I'll be dividing by 0.

This function's not defined at 1.

So if I start a Taylor series around 0,

how big do I really expect that radius of convergence to be?

Well, an interval of radius 1 centered around 0 bumps into the problem point.

So maybe I don't really expect the radius of convergence to be more than 1 at 0.

So maybe 1 over 1 minus x has a problem when x equals 1.

But 1 over 1 + x squared doesn't have any problem at 1,

doesn't have any problem anywhere.

This function's defined for all x.

So 1 over 1 + x squared looks just as nice as sine of x, so why is the Taylor series for

this function centered around 0 so

much worse than the Taylor series for sine of x centered around zero?

The Taylor series for sine of x converges to sine of x everywhere, and

the Taylor series for 1 over 1 + x squared is only good in an interval of radius 1.

Let's see what happens if we write down a Taylor series expansion for 1 over 1 +

x squared, centered not around 0 but around some other point.

Let's write down the Taylor series expansion centered around x equals 1 and

see what happens.

So here we go.

There's the constant term.

There's the linear term.

And we keep on going.

We add more and more terms to the Taylor series expansion, centered around 1.

And we're doing an increasingly good job, but

again not over the whole real line, right?

But at least the radius of convergence is bigger now.

Well how much bigger?

The Taylor series for 1 over 1 + x squared centered at 0 has radius 1.

The Taylor series centered at 1 turns out to have radius the square root of 2.

And we had this idea that maybe places where the function's undefined,

these bad points where the function doesn't exist,

maybe those are somehow to blame for the radius of convergence.

And that's what happened in the case of 1 over 1 minus x.

All right, the radius of convergence around x equals 0 was 1,

because this function has this bad point at x equals 1.

The function is not defined if I plug in x equals 1.

Maybe I'm bumping into a bad point here too.

Well let's diagram the situation this way.

Here's the real line.

And around 0, I've got this.

Well it's an interval, but I've drawn it as a circle.

It's a circle of radius one.

So that's representing the radius of convergence of my Taylor series around

the point zero for the function 1 over 1 + x squared.

Now around the point 1, I've got a different radius of convergence;

it's bigger, it's the square root of 2.

So here is a circle of radius

the square root of 2, and I'm placing it so that its center is at 1.

And the idea here is to try to get a sense of whether or

not maybe I'm bumping into a bad point.

Is there some point on the real line which is distance 1 from 0 and

distance the square root of 2 from 1?

But there isn't such a point on the real line.

Okay. But when I draw them as circles,

the circles are touching at these two points.

So, where is that point?

Well those points are in the complex plane.

That point there is i, and that point there is -i, right?

Those are square roots of -1, those are imaginary numbers.
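You can check those distances directly with Python's complex numbers; this is just the Pythagorean computation, since i sits one unit above the origin.

```python
# The function 1/(1 + z^2) blows up at z = i and z = -i.
# The distance from each expansion center to the nearest of these points
# matches the radius of convergence we observed.
z = complex(0, 1)  # i

print(abs(z - 0))  # distance from the center 0: 1.0
print(abs(z - 1))  # distance from the center 1: the square root of 2
```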

Well what happens if I evaluate 1 over 1 + x squared when x equals i, or x equals -i?

So i squared is -1, so what's f(i)?

Well f is 1 over 1 + its input squared, so

that would be 1 over 1 plus, what's i squared, it's -1.


One over one plus minus one.

That is not defined.

That's dividing by 0.

So, there is a bad point.

There is a point where the function is undefined.

It's just that the bad point for this function isn't a real point.

It's an imaginary input.

All right, if I evaluate this function at i or at minus i, it's undefined.
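In Python, for instance, the same evaluation raises a division-by-zero error. This is just a sketch using Python's built-in complex type:

```python
# Evaluating 1/(1 + z^2) is fine at real inputs, but at z = i it divides by zero.
def f(z):
    return 1 / (1 + z**2)

print(f(2))  # 0.2 -- no problem at any real input
try:
    f(complex(0, 1))  # 1/(1 + i^2) = 1/0
except ZeroDivisionError:
    print("undefined at i")
```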

And yet, that bad point in the complex plane

is messing up my radius of convergence even along the real line.

We're beginning to get a glimpse of the important role that complex numbers play.

Even in the theory of just real-valued Taylor series.

Additional evidence comes, for

example, from this equation: e to the ix equals cosine x plus i sine x.

And again i is a square root of minus one.

You can interpret this really as a statement about power series.

So here I've written down the Taylor series for e to the x, but

with x replaced with ix.

Here I've just written down cosine of x, and here I've written down sine of x,

but I've multiplied by i.

And you can expand this out using the fact that i squares to -1,

to get this equality.

And you can do kind of fun things like substitute in say pi for x,

and conclude that e to the i pi is cosine pi plus i sine pi.

So cosine pi is -1, sine pi is 0, so e to the i pi is -1.
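You can verify the identity numerically with Python's cmath module; this is a quick sketch, not anything deep:

```python
import cmath
import math

# e^(ix) versus cos(x) + i*sin(x), at an arbitrary real x.
x = 0.7
lhs = cmath.exp(1j * x)
rhs = math.cos(x) + 1j * math.sin(x)
print(lhs, rhs)  # the same, up to floating-point error

# The special case x = pi:
print(cmath.exp(1j * math.pi))  # -1, up to floating-point error
```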

Taylor series aren't just a jumping off point for calculus and for

numerical approximations.

Taylor series are also our first step into the theory of complex analysis and

that theory is more natural than it might seem at first.

If the complex numbers are affecting the radius of convergence of my real power

series, then they must be really there.

They're not as imaginary as people might think.
