Consider a function, T(n), that expresses the running time of an algorithm in terms of the problem
size, n. The units associated with T(n) could be a variety of things including seconds, nanoseconds,
or even a count of operations.
Definition: Big O
We say T(n) is O(f(n)) if there exist constants c > 0 and n0 ≥ 1 such that T(n) ≤ c·f(n) for all n ≥ n0. We also say T(n) is order f(n).
To prove T(n) is O(f(n)), find c and n0.
Big O is a way of saying that a function is asymptotically less than another (as n gets big) to
within a constant factor.
Examples:
If T(n) = 6n log n + n^2 - n + 7, prove T(n) is O(n^2).
Proof: T(n) = 6n log n + n^2 - n + 7
            ≤ 6n·n + n^2 - n + 7     (since log n ≤ n for n ≥ 1)
            = 6n^2 + n^2 - n + 7
            ≤ 6n^2 + n^2 + 7         (since n ≥ 0, so -n ≤ 0)
            ≤ 6n^2 + n^2 + 7n^2      (since n ≥ 1, so 7 ≤ 7n^2)
            = 14n^2
Therefore, with c = 14 and n0 = 1, we can say T(n) is O(n^2).
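The bound established in the proof can be sanity-checked numerically. A minimal sketch in Python (the function name T is ours, and log is taken base 2, for which log n ≤ n also holds when n ≥ 1):

```python
import math

def T(n):
    """The example running-time function from the proof above."""
    return 6 * n * math.log2(n) + n**2 - n + 7

# Verify T(n) <= c * n^2 with c = 14 for a range of n >= n0 = 1.
c, n0 = 14, 1
for n in range(n0, 1001):
    assert T(n) <= c * n**2, f"bound fails at n={n}"
print("bound holds for 1 <= n <= 1000")
```

A finite check like this is not a proof, of course, but it is a quick way to catch an arithmetic slip in the choice of c and n0.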
Notice that as you move further to the right in this table, the growth rates of the functions increase more dramatically as the problem size increases. In fact, with 2^n, only small problem sizes can be solved in a reasonable amount of time. This tells us that if we have an algorithm whose running time function has a 2^n term in it, we probably won't be able to solve the problem for sizes much greater than n = 100.
     n   log n   sqrt(n)      n   n log n      n^2       n^3       n^5       2^n
     1       0         1      1         0        1         1         1         2
     2       1         1      2         2        4         8        32         4
     4       2         2      4         8       16        64      1024        16
    10       3         3     10        33      100      1000    100000      1024
    20       4         4     20        86      400      8000   3200000   1048576
    40       5         6     40       213     1600     64000  1.02E+08   1.1E+12
   100       7        10    100       664    10000   1000000     1E+10   1.3E+30
  1000      10        32   1000      9966  1000000    1.0E+9     1E+15  1.1E+301
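The rows of the table can be reproduced with a short script. A sketch in Python (assuming, as the table does, that log is base 2 and the smaller entries are rounded to the nearest integer):

```python
import math

# Recompute rows of the growth-rate table (log base 2, rounded to integers
# as in the table; the two largest columns are shown in scientific notation).
print(f"{'n':>6}{'logn':>6}{'sqrtn':>7}{'nlogn':>7}{'n^2':>9}{'n^3':>11}{'n^5':>11}{'2^n':>11}")
for n in [1, 2, 4, 10, 20, 40, 100, 1000]:
    print(f"{n:>6}{round(math.log2(n)):>6}{round(math.sqrt(n)):>7}"
          f"{round(n * math.log2(n)):>7}{n**2:>9}{n**3:>11}"
          f"{n**5:>11.3g}{float(2**n):>11.3g}")
```

Running this makes the contrast concrete: every column is an exact computation from n, and the 2^n column overwhelms the rest as soon as n reaches a few dozen.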
Notes: The asymptotic growth rate (i.e. as n gets larger) of 2^n will eventually be greater than that of n^a for any constant value of a.
The asymptotic growth rate (i.e. as n gets larger) of two terms multiplied together (each dependent on n) will be greater than the individual growth rates of the two terms.
If T(n) is a sum of terms g1(n) + g2(n) + ... + gk(n), only one of the terms, gi(n), will be important to consider as n becomes large, because the others will eventually (as n becomes larger) contribute a negligible amount to the value of T(n). The definition of Big O and the proofs on the previous page show that T(n) will be O(gi(n)) if gi(n) is the dominating term in T(n).
The table below shows how T(n) and each of its three terms grows as n grows. We see that when n becomes large, the 2*n^2 term dominates, and we could show that T(n) is O(2*n^2). By convention, we express O(2*n^2) as O(n^2), since the notion of a constant factor is built into the definition of Big O.
      n    200      500*n       2*n^2   T(n) = 200 + 500*n + 2*n^2
      1    200        500           2          702
      5    200       2500          50         2750
     10    200       5000         200         5400
     50    200      25000        5000        30200
    100    200      50000       20000        70200
    500    200     250000      500000       750200
   1000    200     500000     2000000      2500200
   5000    200    2500000    50000000     52500200
  10000    200    5000000   200000000    205000200
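One way to see the domination numerically is to compute what fraction of T(n) comes from the 2*n^2 term. A quick sketch (the function name T is ours):

```python
# As n grows, the 2*n^2 term's share of T(n) = 200 + 500*n + 2*n^2
# approaches 100%, which is why the other terms can be ignored.
def T(n):
    return 200 + 500 * n + 2 * n**2

for n in [10, 100, 1000, 10000]:
    share = 2 * n**2 / T(n)
    print(f"n={n:>6}: 2*n^2 is {share:.1%} of T(n)")
```

At n = 10 the quadratic term is still a small share of the total, but by n = 10000 it accounts for nearly all of T(n), matching the last row of the table.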
As computer programmers, we can simply look at a running time function and determine its growth
rate, in terms of Big O, by identifying the largest additive term. Here are some examples: