
Asymptotic Notation (Big O)

Consider a function, T(n), that expresses the running time of an algorithm in terms of the problem
size, n. The units associated with T(n) could be a variety of things including seconds, nanoseconds,
or even a count of operations.

Definition: Big O
We say T(n) is O(f(n)) if there exist constants c > 0 and n0 ≥ 1 such that T(n) ≤ c*f(n) for all
n ≥ n0. We also say T(n) is order f(n).
To prove T(n) is O(f(n)), find c and n0.
Big O is a way of saying that one function is asymptotically less than another (as n gets big), to
within a constant factor.
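
Before the worked examples, here is a minimal numeric sanity check of the definition. It is only a
sketch, not a proof: it samples values of n, assumes base-2 logarithms, and the helper name
satisfies_big_o is just for illustration. It uses the constants from the first worked example below
(c = 14, n0 = 1).

    import math

    def satisfies_big_o(T, f, c, n0, sample):
        """Spot-check T(n) <= c*f(n) for sampled n >= n0 (evidence, not a proof)."""
        return all(T(n) <= c * f(n) for n in sample if n >= n0)

    # First example below: T(n) = 6n log n + n^2 - n + 7 against f(n) = n^2
    T = lambda n: 6 * n * math.log2(n) + n**2 - n + 7
    f = lambda n: n**2
    print(satisfies_big_o(T, f, c=14, n0=1, sample=range(1, 1001)))  # True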

Examples:
If T(n) = 6n log n + n² - n + 7, prove T(n) is O(n²).
Proof: T(n) = 6n log n + n² - n + 7
            ≤ 6n*n + n² - n + 7     (if n ≥ 1, since log n < n when n ≥ 1)
            = 6n² + n² - n + 7
            ≤ 6n² + n² + 7          (since n ≥ 0, so -n ≤ 0)
            ≤ 6n² + n² + 7n²        (since n² ≥ 1 when n ≥ 1, so 7 ≤ 7n²)
            = 14n²
Therefore, with c = 14 and n0 = 1, we can say T(n) is O(n²).

If T(n) = 6n²log n + 3n, prove T(n) is O(n²log n).

Proof: T(n) = 6n²log n + 3n
            ≤ 6n²log n + 3n*(n log n)   (if n ≥ 2, since n log n ≥ 1 when n ≥ 2, but log 1 = 0)
            = 6n²log n + 3n²log n
            = 9n²log n
Therefore, with c = 9 and n0 = 2, we can say T(n) is O(n²log n).

If T(n) = 500n⁷ + 12*2ⁿ - 10n³, prove T(n) is O(2ⁿ).

Proof: T(n) = 500n⁷ + 12*2ⁿ - 10n³
            ≤ 500n⁷ + 12*2ⁿ      (if n ≥ 0, since -10n³ ≤ 0)
            ≤ 500*2ⁿ + 12*2ⁿ     (if 2ⁿ ≥ n⁷, which holds when n ≥ 7 log n, i.e. for n ≥ 37)
            = 512*2ⁿ
Therefore, with c = 512 and n0 = 37, we can say T(n) is O(2ⁿ).
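
The same kind of numeric spot-check (again, evidence rather than proof, and assuming base-2
logarithms) supports the constants chosen in the last two proofs:

    import math

    # Second example: 6n^2 log n + 3n <= 9 * n^2 log n for n >= 2
    ok2 = all(6 * n**2 * math.log2(n) + 3 * n <= 9 * n**2 * math.log2(n)
              for n in range(2, 1001))

    # Third example: 500n^7 + 12*2^n - 10n^3 <= 512 * 2^n for n >= 37
    ok3 = all(500 * n**7 + 12 * 2**n - 10 * n**3 <= 512 * 2**n
              for n in range(37, 201))

    print(ok2, ok3)  # True True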

Common Function Growth (as n Grows)
The table below shows the values of a variety of common functions in terms of the problem size, n.
We might have several of these as terms in a running time function, T(n). For example, a running
time function might be determined to be T(n) = 2ⁿ + 32*n² + n log n, which would include 2ⁿ, n²,
and n log n terms.

Notice that as you move further to the right in this table, the growth rates of the functions
increase more dramatically as the problem size increases. In fact, with 2ⁿ, only small problem
sizes can be solved in a reasonable amount of time. This tells us that if we have an algorithm
whose running time function has a 2ⁿ term in it, we probably won't be able to solve the problem
for sizes much greater than n = 100.

The Values for Common Functions at Various n Values

   n       log n   √n    n      n log n   n²         n³          n⁵           2ⁿ
   1       0       1     1      0         1          1           1            2
   2       1       1     2      2         4          8           32           4
   4       2       2     4      8         16         64          1024         16
   10      3       3     10     33        100        1000        100000       1024
   20      4       4     20     86        400        8000        3200000      1048576
   40      5       6     40     213       1600       64000       1.02E+08     1.1E+12
   100     7       10    100    664       10000      1000000     1E+10        1.3E+30
   1000    10      32    1000   9966      1000000    1.0E+09     1E+15
Notes: The asymptotic growth rate (i.e., as n gets larger) of 2ⁿ will eventually be greater than nᵃ
for any constant value of a.
The asymptotic growth rate (i.e., as n gets larger) of a product of two terms (each dependent on n)
will be greater than the growth rate of either term on its own.
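
The table above can be reproduced with a short script. This is a minimal sketch: it assumes base-2
logarithms, rounds the log n, √n, and n log n columns to whole numbers as the table does, and
prints exact integers rather than the scientific notation used above for the largest entries.

    import math

    def common_function_values(n):
        """Return (log n, sqrt n, n, n log n, n^2, n^3, n^5, 2^n) for one n,
        rounding the logarithmic and square-root columns as in the table above."""
        return (round(math.log2(n)), round(math.sqrt(n)), n,
                round(n * math.log2(n)), n**2, n**3, n**5, 2**n)

    for n in (1, 2, 4, 10, 20, 40, 100, 1000):
        print(n, *common_function_values(n))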

How does this information help us assess algorithm efficiency?

When a function has this form:

T(n) = g₁(n) + g₂(n) + ... + gₖ(n)

only one of the terms, gᵢ(n), will be important to consider as n becomes large, because the others
will eventually (as n becomes larger) contribute a negligible amount to the value of T(n). The
definition of Big O and the proofs above show that T(n) will be O(gᵢ(n)) if gᵢ(n) is the dominating
term in T(n).

Example:
Consider T(n) = 200 + 500*n + 2*n²

The table below shows how T(n) and each of its three terms grows as n grows. We see that when n
becomes large, the 2*n² term dominates, and we could show that T(n) is O(2*n²). By convention,
we would express O(2*n²) as O(n²), since the notion of a constant factor is built into the
definition of Big O.

   n        200    500*n       2*n²          T(n) = 200 + 500*n + 2*n²
   1        200    500         2             702
   5        200    2500        50            2750
   10       200    5000        200           5400
   50       200    25000       5000          30200
   100      200    50000       20000         70200
   500      200    250000      500000        750200
   1000     200    500000      2000000       2500200
   5000     200    2500000     50000000      52500200
   10000    200    5000000     200000000     205000200
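
The dominance is easy to see numerically. The sketch below (the function name T is just for
illustration) prints what fraction of T(n) comes from the 2*n² term at each n in the table above;
by n = 10000 it is already about 97.6% of the total.

    def T(n):
        return 200 + 500 * n + 2 * n**2

    # Fraction of T(n) contributed by the dominant 2*n^2 term
    for n in (1, 5, 10, 50, 100, 500, 1000, 5000, 10000):
        share = 2 * n**2 / T(n)
        print(f"n={n:6d}  T(n)={T(n):11d}  2*n^2 share = {share:.1%}")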

The Bottom Line

As computer programmers, we can simply look at a running time function and determine its growth
rate, in terms of Big O, by identifying the largest additive term. Here are some examples (a small
numeric sketch of the same idea follows them):

T(n) = 500 + 32 log n + 13n is O(n)
T(n) = 50n + 32n² + 0.002n³ is O(n³)
T(n) = 5n + 32n⁵ + 7*2ⁿ is O(2ⁿ)
T(n) = 15n + 3n² + 7n*(log n)*n² is O(n³ log n), since 7n*(log n)*n² is equal to 7n³ log n
T(n) = 3n²(n-1) + 7n²*(log n) is O(n³), since 3n²(n-1) is equal to 3n³ - 3n² and 3n³ is the
largest term in terms of growth rate.
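
One way to mimic "identifying the largest additive term" numerically is to evaluate each term at a
large n and see which one dominates. This is only a heuristic sketch (the term list and labels
below are illustrative, and base-2 logarithms are assumed); terms that separate slowly, such as n²
versus n² log n, may need a very large n before the difference becomes obvious.

    import math

    # Additive terms of the first example above, as (label, function) pairs.
    terms = [
        ("500",      lambda n: 500),
        ("32 log n", lambda n: 32 * math.log2(n)),
        ("13n",      lambda n: 13 * n),
    ]

    n = 10**6  # a "large" problem size; larger values separate the terms even more
    values = {label: f(n) for label, f in terms}
    print(max(values, key=values.get))  # 13n dominates, consistent with O(n)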
