0:00

In the following series of videos, we'll give a formal treatment of asymptotic notation, in particular big-Oh notation, as well as work through a number of examples. Big-Oh notation concerns functions defined on the positive integers; we'll call such a function T(n). We'll pretty much always have the same semantics for T(n): we're going to be concerned with the worst-case running time of an algorithm, as a function of the input size n. So the question I want to answer for you in the rest of this video is: what does it mean when we say a function T(n) is big-Oh of f(n)? Here f(n) is some basic function, like, for example, n log n. I'll give you a number of ways to think about what big-Oh notation really means. For starters, let's begin with an English definition. What does it mean for a function to be big-Oh of f(n)? It means that eventually, for all sufficiently large values of n, it's bounded above by a constant multiple of f(n). Let's think

about it in a couple of other ways. Next, I'm going to translate this English definition into a picture, and then I'll translate it into formal mathematics. Pictorially, you can imagine that perhaps we have T(n) denoted by this blue function here, and perhaps f(n) is denoted by this green function here, which lies below T(n). But when we double f(n), we get a function that eventually crosses T(n) and is forevermore larger than it. So in this event, we would say that T(n) indeed is big-Oh of f(n), the reason being that for all sufficiently large n, once we go far enough to the right on this graph, a constant multiple of f(n), namely twice f(n), is an upper bound on T(n).

So finally, let me give you an actual mathematical definition that you could use to do formal proofs. How do we say, in mathematics, that T(n) should eventually be bounded above by a constant multiple of f(n)? We say that there exist two constants, which I'll call c and n0, such that T(n) is no more than c times f(n) for all n that are at least n0. The role of these two constants is to quantify what we mean by a constant multiple and by sufficiently large in the English definition. c obviously quantifies the constant multiple of f(n), and n0 quantifies "sufficiently large": it's the threshold beyond which we insist that c times f(n) is an upper bound on T(n). So, going

back to the picture, what are c and n0? Well, c, of course, is just going to be two, and n0 is the crossing point: we go to where 2·f(n) and T(n) cross, and then we drop straight down to the horizontal axis. That gives us the relevant value of n0 in this picture. So that's the formal definition. The way to prove that something is big-Oh of f(n) is to exhibit these two constants, c and n0, and it had better be the case that for all n at least n0, c times f(n) upper-bounds T(n).
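To make the definition concrete, here's a small numerical sanity check in Python. The particular T(n), f(n), and the witnesses c and n0 are my own illustrative choices, not from the video, and checking finitely many values of n is of course not a proof; a real proof must cover all n at least n0 (here, simple algebra: 2n² + 5n ≤ 3n² exactly when n ≥ 5).

```python
def holds_big_oh(T, f, c, n0, n_max=10_000):
    """Check T(n) <= c * f(n) for every integer n with n0 <= n <= n_max.

    A finite check like this can only refute a candidate (c, n0),
    never prove big-Oh; that requires an argument for all n >= n0.
    """
    return all(T(n) <= c * f(n) for n in range(n0, n_max + 1))

# Hypothetical example: T(n) = 2n^2 + 5n is big-Oh of f(n) = n^2.
T = lambda n: 2 * n**2 + 5 * n
f = lambda n: n**2

print(holds_big_oh(T, f, c=3, n0=5))   # True: these witnesses work
print(holds_big_oh(T, f, c=2, n0=5))   # False: c = 2 is too small here
```

Note that the witnesses are not unique: any larger c (or larger n0) that makes the inequality hold everywhere past the threshold would do just as well.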

One way to think about it, if you're trying to establish that something is big-Oh of some function, is that you're playing a game against an opponent. You want to prove that this inequality holds, and your opponent wants to show that it doesn't hold for sufficiently large n. You have to go first: your job is to pick a strategy in the form of a constant c and a constant n0, and your opponent is then allowed to pick any number n larger than n0. So the function is big-Oh of f(n) if and only if you have a winning strategy in this game, that is, if you can commit up front to constants c and n0 so that no matter how big an n your opponent picks, the inequality holds. If you have no winning strategy, then it's not big-Oh of f(n): no matter what c and n0 you choose, your opponent can always falsify the inequality by choosing a suitably large value of n. I want to emphasize one last thing, which is

that these constants really are constants. What do I mean by a constant? I mean it is independent of n. So when you apply this definition and choose your constants c and n0, it had better be that n does not appear anywhere in them: c should just be something like a thousand or a million, some constant independent of n. So those are a bunch of ways to think about big-Oh notation: an English definition, its translation into formal mathematics, a pictorial representation, and also a sort of game-theoretic way of thinking about it. Now, let's move on to a video that explores a number of examples.
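Before moving on, the game-theoretic view can be made concrete in a short Python sketch. The example here is my own, not from the video: it shows that n² is not big-Oh of n, because whatever c and n0 you commit to, the opponent can pick n = max(n0, c + 1), and then n² = n·n > c·n.

```python
def opponent_counterexample(c, n0):
    """Given the prover's witnesses (c, n0) for the claim "n^2 is O(n)",
    return an n >= n0 that falsifies n^2 <= c * n, so the opponent wins."""
    n = max(n0, c + 1)        # any n exceeding both n0 and c works
    assert n * n > c * n      # the inequality the prover needed fails
    return n

# The opponent wins against every strategy, even very generous ones.
for c, n0 in [(10, 1), (1000, 50), (10**6, 10**6)]:
    n = opponent_counterexample(c, n0)
    print(f"c={c}, n0={n0}: opponent picks n={n}, and n^2={n*n} > c*n={c*n}")
```

Since the opponent has a winning response to every possible (c, n0), there is no winning strategy for the prover, which is exactly what it means for n² not to be big-Oh of n.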
