Now, to give you another feel for how these different types of

runtimes behave, let's look at some common runtimes that you might see.

There's log n, which is much smaller than root n,

which is much smaller than n, which is much smaller than n log n,

which is much smaller than n squared, which is much smaller than 2 to the n.
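
To make this ordering concrete, here's a minimal sketch (the input sizes chosen are just illustrative, and base-2 logs are assumed) that tabulates all six runtimes at a few values of n:

```python
import math

# The six common runtimes, listed slowest-growing first.
runtimes = [
    ("log n",   lambda n: math.log2(n)),
    ("root n",  lambda n: math.sqrt(n)),
    ("n",       lambda n: n),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: n ** 2),
    ("2^n",     lambda n: 2 ** n),
]

# At n = 4 the values still jostle around; by n = 64 the gaps are huge.
for n in (4, 16, 64):
    row = ", ".join(f"{name}={f(n):,.0f}" for name, f in runtimes)
    print(f"n={n}: {row}")
```

Even by n = 64, 2 to the n has reached roughly 1.8 times 10 to the 19th, while log n is still only 6.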

So, if we graph all of these,

you can see that these graphs separate out from each other.

If you just look at them at small inputs, it's maybe a little hard to tell which

ones are bigger; there's a bit of jostling around between them.

But if we extend the graph outwards a bit, it becomes much clearer.

2 to the n really starts taking off after n is about 4.

It just shoots up thereafter, and by the time n reaches 20 or 30,

it leaves everyone else in the dust.

N squared still keeps a pretty sizable lead over everyone else below it, though.

N log n and n are also pretty well separated from the others.

In this graph, root n and log n seem roughly equal to each other, but

if you kept extending, if you let n get larger and larger,

they'd very quickly differentiate themselves.

Square root of 1 million is about 1,000.

Log of 1 million is about 20.
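
Those two figures are easy to check (a quick sketch, assuming base-2 logs, the usual convention in algorithm analysis):

```python
import math

n = 1_000_000
print(math.sqrt(n))   # exactly 1000.0
print(math.log2(n))   # roughly 19.93, i.e. about 20
```

So at a million elements, a root-n algorithm does about fifty times as much work as a log-n one.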

And so as you keep going out, the further out you go,

the more separated these runtimes become from each other,

and that's really the key idea behind asymptotics.

We don't care so much about the constants;

we care about what happens as your inputs get very large, about how these runtimes scale.