Lecture 4

Asymptotic Notation continued

Here are two more forms of asymptotic notation:

Loose Upper Bounds: Little-o

Little-o is a "loose" upper bound. If f(n) = o(g(n)), then g grows strictly faster than f: no matter how small a positive constant c you multiply g by, cg(n) will still eventually exceed f(n). Here is the formal definition:
o(g(n)) = { the set of all f such that for any positive constant c there exists a positive constant n0 satisfying 0 <= f(n) < cg(n) for all n >= n0 }.
So, for example, n^2 is o(n^3) but not o(n^2).
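
To see how the definition works, here is a short worked check for this example (a sketch using the particular functions n^2 and n^3; the choice of n0 below is just one convenient choice):

Given any constant $c > 0$, choose $n_0 = \lceil 1/c \rceil + 1$, so that $cn > 1$ for every $n \ge n_0$. Then
$$ n^2 < (cn)\,n^2 = c\,n^3 \quad \text{for all } n \ge n_0, $$
which is exactly the condition in the definition, so n^2 = o(n^3). On the other hand, n^2 is not o(n^2): taking c = 1/2, the inequality n^2 < (1/2)n^2 fails for every n, so no n0 can work.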

Since g always grows strictly faster than f, we have

f(n) = o(g(n)) implies lim_{n->oo} f(n)/g(n) = 0.
If f is asymptotically non-negative and g is asymptotically positive (so the ratio is eventually defined), the implication will also run the other way, i.e., if the limit is 0, then f(n) = o(g(n)) is true.
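
Here is a minimal numeric sketch of this limit in Python, assuming the running example f(n) = n^2 and g(n) = n^3 (an illustration of the trend, not a proof):

# Ratio f(n)/g(n) = n^2/n^3 = 1/n, which shrinks toward 0 as n grows.
def f(n):
    return n ** 2   # running example: f(n) = n^2

def g(n):
    return n ** 3   # running example: g(n) = n^3

for n in [10, 100, 1000, 10000, 100000]:
    print(n, f(n) / g(n))   # 0.1, 0.01, 0.001, 0.0001, 1e-05

For little-o the ratio must go all the way to 0, not merely stay bounded as it would for big-O.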

Loose Lower Bounds: Little-omega

Little-omega (represented mercifully here by the letter 'w') is the "loose" analog to big-Omega. It is a loose lower bound in the same way little-o is a loose upper bound:
w(g(n)) = { the set of all f such that for any positive constant c there exists a positive constant n0 satisfying 0 <= cg(n) < f(n) for all n >= n0 }.
So, for example, n^3 is w(n^2) but not w(n^3).
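
A worked check analogous to the little-o example above (again a sketch for these specific functions, with one convenient choice of n0):

Given any constant $c > 0$, choose $n_0 = \lceil c \rceil + 1$, so that $n > c$ for every $n \ge n_0$. Then
$$ c\,n^2 < n \cdot n^2 = n^3 \quad \text{for all } n \ge n_0, $$
which matches the definition, so n^3 = w(n^2). But n^3 is not w(n^3): with c = 2, the inequality 2n^3 < n^3 fails for every n, so no n0 can work.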

Since here f always grows strictly faster than g, we have

f(n) = w(g(n)) implies lim_{n->oo} g(n)/f(n) = 0.
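
Checking this limit for the running example (with g(n) = n^2 and f(n) = n^3):

$$ \lim_{n\to\infty} \frac{g(n)}{f(n)} = \lim_{n\to\infty} \frac{n^2}{n^3} = \lim_{n\to\infty} \frac{1}{n} = 0, $$
consistent with n^3 = w(n^2) from the previous example.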

Common Mathematical Facts for Analysis of Algorithms

Certain mathematical concepts come up frequently in the analysis of algorithms. We will see more of them as they arise in particular algorithms, but for now let's look at an overview (your book goes into more detail here; you should read section 2.2):