In this video, we'll look at orthogonal

projections of vectors onto one-dimensional subspaces.

Let's look at an illustration.

We are given vector x in two dimensions,

and x can be represented as a linear combination of the basis vectors of R2.

We also have one-dimensional subspace u with a basis vector b.

That means all vectors in u can be represented as lambda times b for some lambda.

Now we're interested in finding a vector in u that is closest to x.

Let's have a look at this.

When I compute the length of the difference between every vector in u and the vector x,

I get the graph on the right.

It turns out that the vector in u that is closest to x is

the orthogonal projection of x onto u.

That means the difference vector of x and its projection is orthogonal to u.
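This can be checked numerically. The following sketch (with example vectors x and b chosen here for illustration, not taken from the lecture) scans over many candidates lambda times b in u, finds the one closest to x, and confirms it agrees with the closed-form projection coordinate:

```python
import numpy as np

# Example vectors (illustrative values): x in R^2, basis vector b spanning u
x = np.array([1.0, 2.0])
b = np.array([2.0, 1.0])

# Scan many candidate vectors lambda*b in u and record their distance to x
lambdas = np.linspace(-5, 5, 100001)
distances = np.linalg.norm(lambdas[:, None] * b - x, axis=1)

# The lambda minimizing the distance ...
lam_best = lambdas[np.argmin(distances)]

# ... matches the closed-form coordinate <b, x> / ||b||^2 derived below
lam_closed = b.dot(x) / b.dot(b)
print(lam_best, lam_closed)  # both are 0.8 for these example vectors

# And the difference vector lambda*b - x is orthogonal to b
print(b.dot(lam_best * b - x))  # approximately 0
```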

Overall, we are looking for the orthogonal projection of x onto u,

and we will denote this projection by pi u of x.

The projection has two important properties.

First, since pi u of x is in u,

it follows that there exists a lambda in r such that

pi u of x can be written as lambda times b,

a multiple of the basis vector that spans u.

The lambda is the coordinate of

the projection with respect to the basis b of the subspace u.

The second property is that

the difference vector of x and its projection onto u is orthogonal to u.

That means it's orthogonal to the basis vector that spans u.

So, the second property is that the inner product between b and

the difference between pi u of x and x is zero.

So that's the orthogonality condition.

These properties hold in general for any x in R^D and any one-dimensional subspace u.

Now let's explore these two properties to find

pi u of x. I've drawn the setting once more over here.

We have a two-dimensional vector x.

We have one-dimensional subspace u which is spanned by the vector b.

And we're interested in finding

the orthogonal projection of x onto u which we call pi u of x.

And we have two conditions for pi u of x.

The first thing is that since pi u of x is an element of u,

we can write it as a scaled version of the vector b.

There must be a lambda in r such that pi u of x is lambda times b.

And the second condition is

the orthogonality condition that the difference vector between

x and pi u of x is orthogonal to u,

which means it's orthogonal to the spanning vector b.

And now let's explore these two properties to find pi u of x.

So, first, we start writing.

We use the condition that the inner product of b and pi u of x minus x is zero,

which is equivalent to the inner product of b and pi u of

x minus the inner product of b and x

being zero, where we now exploit the linearity of the inner product.

Now we are going to rewrite pi u of x as lambda times b.

So, this is equivalent to the inner product of b with lambda b

minus the inner product of b and x. That must be zero.

Now we can move the lambda out again because of the linearity of the inner product,

which is then lambda times the squared norm of b minus the inner product of b

and x. That must be zero.

And that is equivalent to lambda being

the inner product of b with x divided by the squared norm of b.
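Written out as a chain of equations, the derivation above reads:

```latex
\begin{align}
\langle b, \pi_U(x) - x \rangle &= 0 \\
\Leftrightarrow\ \langle b, \pi_U(x) \rangle - \langle b, x \rangle &= 0 \\
\Leftrightarrow\ \langle b, \lambda b \rangle - \langle b, x \rangle &= 0 \\
\Leftrightarrow\ \lambda \|b\|^2 - \langle b, x \rangle &= 0 \\
\Leftrightarrow\ \lambda &= \frac{\langle b, x \rangle}{\|b\|^2}
\end{align}
```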

So, now we found lambda which is

the coordinate of our projection with respect to the basis b.

And that means that

our projection using the first condition is lambda times b,

which is now the inner product of b with x divided by the squared norm of b times b.
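Putting the two conditions together, the projection can be computed in a couple of lines. A minimal NumPy sketch, using the dot product as the inner product (the vectors x and b are example values, not from the lecture):

```python
import numpy as np

x = np.array([1.0, 2.0])   # vector to project (example values)
b = np.array([2.0, 1.0])   # basis vector spanning u (example values)

lam = b.dot(x) / b.dot(b)  # coordinate: <b, x> / ||b||^2
proj = lam * b             # pi_u(x) = lambda * b

print(proj)                         # [1.6 0.8] for these example vectors
print(b.dot(proj - x))              # 0: the residual is orthogonal to b
```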

If we choose the dot product as the inner product,

we can rewrite this in a slightly different way.

So, we would get b transpose times x times b divided by the squared norm of b.

So now, given that b transpose x is a scalar,

we can just move it over here.

So, this is equivalent then to saying b times b transpose divided by

the squared norm of b times x is our projected point.

And if we look at this,

this is a matrix.

And this matrix is a projection matrix that projects

any point in two dimensions onto the one-dimensional subspace.
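As a sketch of that last observation: with the dot product, the projection matrix is b times b transpose divided by the squared norm of b, a scaled outer product. It maps any x to its projection, and it is idempotent, meaning projecting twice changes nothing. The vectors below are example values chosen for illustration:

```python
import numpy as np

b = np.array([2.0, 1.0])   # basis vector spanning u (example values)
x = np.array([1.0, 2.0])   # vector to project (example values)

# Projection matrix P = b b^T / ||b||^2 (a rank-1 matrix)
P = np.outer(b, b) / b.dot(b)

proj = P @ x
print(proj)                    # [1.6 0.8] for these example vectors

# Idempotence: applying P twice is the same as applying it once
print(np.allclose(P @ P, P))   # True

# The residual x - proj is orthogonal to b
print(np.isclose(b.dot(x - proj), 0.0))  # True
```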