
Analysis of Algorithms

Asymptotic Analysis

What is the order of growth?


In the running-time expression, when n becomes large one term grows significantly faster than the others: this is the so-called dominant term.

T1(n) = a*n + b                    Dominant term: a*n
T2(n) = a*log n + b                Dominant term: a*log n
T3(n) = a*n^2 + b*n + c            Dominant term: a*n^2
T4(n) = a^n + b*n + c  (a > 1)     Dominant term: a^n
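To see the dominance numerically, here is a minimal C sketch; the coefficients a = 2, b = 100, c = 1000 are illustrative choices, not from the slides. The ratio of T3(n) to its dominant term approaches 1 as n grows.

#include <stdio.h>

int main(void) {
    double a = 2.0, b = 100.0, c = 1000.0;       /* illustrative coefficients */
    for (long n = 10; n <= 1000000; n *= 10) {
        double full = a * n * n + b * n + c;     /* T3(n) = a*n^2 + b*n + c   */
        double dominant = a * n * n;             /* dominant term             */
        printf("n = %8ld  T3(n)/dominant = %.4f\n", n, full / dominant);
    }
    return 0;
}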

What is the order of growth?


T1(kn) = a*kn = k*T1(n)                    Order of growth: linear
T2(kn) = a*log(kn) = T2(n) + a*log k       Order of growth: logarithmic
T3(kn) = a*(kn)^2 = k^2*T3(n)              Order of growth: quadratic
T4(kn) = a^(kn) = (a^n)^k                  Order of growth: exponential

How can the order of growth be interpreted?


Of two algorithms, the one with the smaller order of growth is considered more efficient. However, this holds only for large enough input sizes. Example: let T1(n) = 10n + 10 (linear order of growth) and T2(n) = n^2 (quadratic order of growth). If n <= 10 then T1(n) > T2(n), so the order of growth is relevant only for n > 10.
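A quick C sketch of this crossover, using the T1 and T2 above: for n <= 10 the quadratic algorithm is actually faster, and from n = 11 onward the linear one wins.

#include <stdio.h>

int main(void) {
    /* T1(n) = 10n + 10 (linear) vs. T2(n) = n^2 (quadratic) */
    for (int n = 1; n <= 15; n++) {
        long t1 = 10L * n + 10;
        long t2 = (long) n * n;
        printf("n = %2d   T1 = %4ld   T2 = %4ld   %s\n",
               n, t1, t2, t1 > t2 ? "T1 slower" : "T2 slower");
    }
    return 0;
}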

Growth Rate
[Figure: Growth Rate of Different Functions — function value plotted against data size for lg n, n lg n, n^2, n^3, and 2^n; the higher-order functions dominate rapidly.]

Growth Rates
n 0 1 2 4 8 16 32 64 128 256 512 1024 2048 lgn #NUM! 0 1 2 3 4 5 6 7 8 9 10 11 nlgn #NUM! 0 2 8 24 64 160 384 896 2048 4608 10240 22528 n2 0 1 4 16 64 256 1024 4096 16384 65536 262144 1048576 4194304 n3 0 1 8 64 512 4096 32768 262144 2097152 16777216 1.34E+08 1.07E+09 8.59E+09 2n 1 2 4 16 256 65536 4.29E+09 1.84E+19 3.4E+38 1.16E+77 1.3E+154
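The table can be regenerated with a short C program (a sketch; compile with -lm). The row n = 0 is skipped because lg 0 is undefined, and values of 2^n beyond the double range print as inf, mirroring the overflow entries above.

#include <stdio.h>
#include <math.h>

int main(void) {
    int sizes[] = {1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048};
    printf("%6s %6s %8s %10s %14s %14s\n",
           "n", "lg n", "n lg n", "n^2", "n^3", "2^n");
    for (int i = 0; i < 12; i++) {
        double n = sizes[i];
        printf("%6.0f %6.0f %8.0f %10.0f %14.0f %14.3e\n",
               n, log2(n), n * log2(n), n * n, n * n * n, pow(2.0, n));
    }
    return 0;
}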

Constant Factors
The growth rate is not affected by constant factors or lower-order terms.
Examples:
  10^2 * n + 10^5 is a linear function
  10^5 * n^2 + 10^8 * n is a quadratic function

Order Notation
There may be a situation where f(n) <= g(n) for all n >= n0, or, more generally, f(n) <= c*g(n) for all n >= n0 (the first case is simply c = 1). In such a situation, g(n) is an asymptotic upper bound on f(n).
f(n) = O(g(n)) iff there exist two positive constants c and n0 such that f(n) <= c*g(n) for all n >= n0.
[Figure: f(n) stays below g(n) to the right of n0.]

Order Notation
Asymptotic Lower Bound: f(n) = Ω(g(n)) iff there exist positive constants c and n0 such that f(n) >= c*g(n) for all n >= n0.
[Figure: f(n) stays above c*g(n) to the right of n0.]

Order Notation
Asymptotically Tight Bound: f(n) = Θ(g(n)) iff there exist positive constants c1, c2, and n0 such that c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
[Figure: f(n) is sandwiched between c1*g(n) and c2*g(n) to the right of n0.]

This means that the best and worst cases require the same amount of time, to within a constant factor.

Big-Oh Notation
Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n0 such that f(n) <= c*g(n) for n >= n0.
Example: 2n + 10 is O(n):
  2n + 10 <= c*n
  (c - 2)*n >= 10
  n >= 10/(c - 2)
  Pick c = 3 and n0 = 10.
[Figure: on a log-log plot, 2n + 10 stays below 3n from n = 10 onward.]

Big-Oh Example
Example: the function n^2 is not O(n):
  n^2 <= c*n
  n <= c
The above inequality cannot be satisfied, since c must be a constant while n grows without bound.
[Figure: n^2 eventually exceeds c*n for any constant c.]

More Big-Oh Examples

7n - 2 is O(n): need c > 0 and n0 >= 1 such that 7n - 2 <= c*n for n >= n0; this is true for c = 7 and n0 = 1.

3n^3 + 20n^2 + 5 is O(n^3): need c > 0 and n0 >= 1 such that 3n^3 + 20n^2 + 5 <= c*n^3 for n >= n0; this is true for c = 4 and n0 = 21.

3 log n + log log n is O(log n): need c > 0 and n0 >= 1 such that 3 log n + log log n <= c*log n for n >= n0; this is true for c = 4 and n0 = 2.
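A brute-force check of the middle claim, as a C sketch: the loop verifies 3n^3 + 20n^2 + 5 <= 4n^3 up to a test bound, and locates the last violation just below n0 = 21.

#include <stdio.h>

int main(void) {
    long long last_fail = 0;
    for (long long n = 1; n <= 1000; n++) {
        long long lhs = 3 * n * n * n + 20 * n * n + 5;  /* f(n)    */
        long long rhs = 4 * n * n * n;                   /* c * n^3 */
        if (lhs > rhs)
            last_fail = n;                /* inequality violated here */
    }
    printf("last violation at n = %lld; holds for all n >= %lld\n",
           last_fail, last_fail + 1);     /* expect 20 and 21 */
    return 0;
}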

Big-Oh and Growth Rate
The big-Oh notation gives an upper bound on the growth rate of a function. The statement "f(n) is O(g(n))" means that the growth rate of f(n) is no more than the growth rate of g(n). We can use big-Oh notation to rank functions according to their growth rate:

                      f(n) is O(g(n))    g(n) is O(f(n))
  g(n) grows more     Yes                No
  f(n) grows more     No                 Yes
  Same growth         Yes                Yes

Relatives of Big-Oh
big-Omega (Ω), big-Theta (Θ), little-oh (o), little-omega (ω)

Intuition for Asymptotic Notation

Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).

Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.:
1. Drop lower-order terms.
2. Drop constant factors.
Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Concept of Basic Operation
Time efficiency is analyzed by determining the number of repetitions of the basic operation as a function of input size. The basic operation is the operation that contributes most towards the running time of the algorithm. As a rule, the basic operation is located in the algorithm's innermost loop.

What is the basic operation in sorting? The key comparison, as the instrumented sketch below illustrates.
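A minimal sketch, with a counter added purely for illustration: in selection sort every key comparison happens in the inner loop, and the count is n(n-1)/2 regardless of input order.

#include <stdio.h>

/* Selection sort instrumented to count its basic operation,
   the key comparison, performed in the innermost loop. */
long selection_sort(int a[], int n) {
    long comparisons = 0;
    for (int i = 0; i < n - 1; i++) {
        int min = i;
        for (int j = i + 1; j < n; j++) {
            comparisons++;                 /* basic operation */
            if (a[j] < a[min]) min = j;
        }
        int tmp = a[i]; a[i] = a[min]; a[min] = tmp;
    }
    return comparisons;
}

int main(void) {
    int a[] = {5, 3, 8, 1, 9, 2, 7, 4, 6, 0};
    printf("comparisons = %ld\n", selection_sort(a, 10));  /* 10*9/2 = 45 */
    return 0;
}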

Input size and basic operation examples

  Problem                                          Input size measure           Basic operation
  Search for key in a list of n items              Number of items in list, n   Key comparison
  Multiply two matrices of floating-point numbers  Dimensions of matrices       Floating-point multiplication
  Compute a^n                                      n                            Floating-point multiplication
  Graph problem                                    #vertices and/or edges       Visiting a vertex or traversing an edge

Examples (1)

sum = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        sum++;              /* basic operation: executed n * n times */

Examples (2)

sum = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        sum++;              /* basic operation: executed n(n+1)/2 times */
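As a sanity check, a small C sketch (n = 10 chosen arbitrarily) counts the sum++ executions directly: n^2 for Example (1) and n(n+1)/2 for Example (2), both O(n^2).

#include <stdio.h>

int main(void) {
    int n = 10;
    long count1 = 0, count2 = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            count1++;                      /* full n x n grid      */
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= i; j++)
            count2++;                      /* triangular: n(n+1)/2 */
    printf("Example 1: %ld (= n^2 = %d)\n", count1, n * n);
    printf("Example 2: %ld (= n(n+1)/2 = %d)\n", count2, n * (n + 1) / 2);
    return 0;
}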

Examples (3)
Non-recursive:
  Matrix multiplication
  Selection sort
  Insertion sort
Recursive:
  Factorial (sketched below)
  Binary search
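A minimal sketch of the recursive factorial named above; its recurrence T(n) = T(n-1) + c unwinds to O(n) by the same substitution technique applied to binary search below.

/* Recursive factorial: one multiplication per level of recursion. */
long factorial(int n) {
    if (n <= 1)
        return 1;                   /* base case */
    return n * factorial(n - 1);    /* T(n) = T(n-1) + c */
}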

Recursive Binary Search

// Recursive binary search routine
int recSearch(int a[], int lb, int ub, int value)
{
    int half;
    if (lb > ub)
        return -1;                                  // value is not in the array
    half = (lb + ub) / 2;
    if (a[half] == value)
        return half;                                // value found: return its location
    else if (a[half] > value)
        return recSearch(a, lb, half - 1, value);   // search lower half of array
    else
        return recSearch(a, half + 1, ub, value);   // search upper half of array
}

Recursive Binary Search Analysis

int a[100], value = 20;        // assume the array has been initialized
recSearch(a, 0, 99, value);    // initial call from main()

Call tree (each call halves the range):

  initial call:  a[0..99]
  2nd call:      a[0..48]  or  a[50..99]
  3rd call:      a[0..23]  or  a[25..48]  /  a[50..73]  or  a[75..99]

Binary Search
Algorithm: check the middle element, then search the lower or upper half.
  T(n) = T(n/2) + c
where c is some constant, the cost of checking the middle.

Binary Search (cont.)

Let's do some quick substitutions:
  T(n) = T(n/2) + c                        (1)
but T(n/2) = T(n/4) + c, so
  T(n) = (T(n/4) + c) + c = T(n/4) + 2c    (2)
and T(n/4) = T(n/8) + c, so
  T(n) = (T(n/8) + c) + 2c = T(n/8) + 3c   (3)

Binary Search (cont.)

Result at the ith unwinding:

  i    result
  1    T(n) = T(n/2)  + c   = T(n/2^1) + 1c
  2    T(n) = T(n/4)  + 2c  = T(n/2^2) + 2c
  3    T(n) = T(n/8)  + 3c  = T(n/2^3) + 3c
  4    T(n) = T(n/16) + 4c  = T(n/2^4) + 4c

In general, after i unwindings: T(n) = T(n/2^i) + i*c.

Binary Search (cont.)

Let's assume T(1) = c0. The unwinding stops when the subproblem size reaches 1, so let:
  n/2^k = 1  =>  n = 2^k  =>  k = log2 n = lg n

Binary Search (cont.)

Substituting back in (getting rid of k):
  T(n) = T(1) + c*lg n = c*lg n + c0 = O(lg n)
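The bound can be checked empirically. This sketch wraps the recSearch routine from earlier with a call counter; the counter, the sorted test array, and the size 1024 are additions for this experiment. A worst-case search (an absent key) in a 1024-element array makes lg(1024) + 1 = 11 calls.

#include <stdio.h>

long calls = 0;   /* counter added for this experiment */

int recSearch(int a[], int lb, int ub, int value) {
    calls++;                                        /* count every call */
    if (lb > ub) return -1;
    int half = (lb + ub) / 2;
    if (a[half] == value) return half;
    else if (a[half] > value)
        return recSearch(a, lb, half - 1, value);
    else
        return recSearch(a, half + 1, ub, value);
}

int main(void) {
    int a[1024];
    for (int i = 0; i < 1024; i++) a[i] = i;   /* sorted input        */
    recSearch(a, 0, 1023, -5);                 /* worst case: absent key */
    printf("calls = %ld\n", calls);            /* expect 11 = lg(1024) + 1 */
    return 0;
}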
