
Asymptotic function

Time Complexity
Algorithm arrayMax(A, n)
{
    max ← A[0]
    for i ← 1 to n-1 do
        if max < A[i] then
            max ← A[i]
    return max
}
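The pseudocode above can be sketched as runnable Python (the name array_max is my own). It makes a single pass over the array, so it performs O(n) comparisons:

```python
def array_max(A):
    """Return the largest element of a non-empty list A."""
    max_val = A[0]
    for i in range(1, len(A)):
        if max_val < A[i]:
            max_val = A[i]
    return max_val

print(array_max([3, 7, 2, 9, 4]))  # → 9
```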

Time Complexity
Algorithm sum(A, n)
{
    s ← 0
    for i ← 0 to n-1 do
        s ← s + A[i]
    return s
}
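A runnable Python sketch of the summation pseudocode above (the name array_sum is my own); like arrayMax, it is a single O(n) pass:

```python
def array_sum(A):
    """Return the sum of the elements of list A."""
    s = 0
    for i in range(len(A)):
        s = s + A[i]
    return s

print(array_sum([1, 2, 3, 4]))  # → 10
```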

Time Complexity
Algorithm fibo(loopmax)
{
    f1 ← 1; f2 ← 1
    for loop ← 1 to loopmax do
        if loop < 3 then
            f ← 1
        else
        {
            f ← f1 + f2
            f1 ← f2
            f2 ← f
        }
}
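The Fibonacci pseudocode above can be sketched in Python as follows (returning the last value computed, which I assume is the intent; loopmax is taken to be at least 1). The loop runs loopmax times, so the running time is O(loopmax):

```python
def fibo(loopmax):
    """Return the loopmax-th Fibonacci number (1, 1, 2, 3, 5, ...)."""
    f1, f2 = 1, 1
    for loop in range(1, loopmax + 1):
        if loop < 3:
            f = 1
        else:
            f = f1 + f2
            f1, f2 = f2, f
    return f

print(fibo(7))  # → 13
```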

Time Complexity
Consider the two functions x^2/8 and 3(x-2), and compare their rates of growth. For small inputs, 3(x-2) is larger, but from x = 22 onwards x^2/8 grows faster than 3(x-2). Small input values can hide dramatic differences between functions. We cannot determine exactly how many operations are required to execute an algorithm; the count may vary depending on the input size.
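A small numeric check of the crossover claim above, comparing x^2/8 with 3(x-2) at a few sample points:

```python
def f(x):
    return x**2 / 8        # quadratic, small constant factor

def g(x):
    return 3 * (x - 2)     # linear, larger constant factor

# 3(x-2) dominates for moderate x, but x^2/8 overtakes it at x = 22.
for x in [5, 10, 21, 22, 30]:
    bigger = "x^2/8" if f(x) > g(x) else "3(x-2)"
    print(f"x={x}: x^2/8={f(x):.2f}, 3(x-2)={g(x)}, larger: {bigger}")
```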

Rate of growth
2^n, n^2, n log n, n, log n (from fastest-growing to slowest). Faster-growing functions increase at a significant rate and quickly dominate the slower-growing functions. The rate of growth is determined by the largest term rather than by smaller terms and constant factors.

Classification based on the largest term is called the asymptotic order, the order of the function, the asymptotic growth, or the growth classification.

Growth classification
Big-oh notation (O), Big-omega notation (Ω), Big-theta notation (Θ)

Big-oh
Let f(n) and g(n) be functions mapping nonnegative integers to real numbers. We say f(n) is O(g(n)) if there are a real constant c > 0 and an integer constant n0 >= 1 such that f(n) <= c·g(n) for every integer n >= n0. We can also say f(n) is of order g(n). Big-oh is commonly used to describe worst-case analysis.

Big-oh
Show that 7n - 2 is O(n). Find the order of the function 3n + 2. Find the order of the function 3n + 3. Find the order of the function 10n^2 + 4n + 2.
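The exercises above can be verified numerically (this is a sanity check over a range of n, not a proof) using standard witnesses c and n0 for the definition f(n) <= c·g(n) for all n >= n0:

```python
# Each tuple is (f, g, c, n0) witnessing that f(n) is O(g(n)).
cases = [
    (lambda n: 7*n - 2,          lambda n: n,     7, 1),   # 7n-2 is O(n)
    (lambda n: 3*n + 2,          lambda n: n,     4, 2),   # 3n+2 is O(n)
    (lambda n: 3*n + 3,          lambda n: n,     4, 3),   # 3n+3 is O(n)
    (lambda n: 10*n*n + 4*n + 2, lambda n: n*n,  11, 5),   # 10n^2+4n+2 is O(n^2)
]
for f, g, c, n0 in cases:
    assert all(f(n) <= c * g(n) for n in range(n0, 1000))
print("all Big-oh witnesses hold")
```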

Big-oh provides an asymptotic way of saying that one function is less than or equal to another function, up to a constant factor.

Big-omega
f(n) is Ω(g(n)) if g(n) is O(f(n)); that is, there are a real constant c > 0 and an integer constant n0 >= 1 such that f(n) >= c·g(n) for every n >= n0. This says that, asymptotically, one function is greater than or equal to another up to a constant factor. Big-omega describes the best-case running time of an algorithm.

Big-omega
Consider the function f(n) = 3n + 3. Find the order of the function.
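One valid choice of witnesses for this exercise is c = 3 and n0 = 1, which gives f(n) = 3n + 3 is Ω(n). A quick numeric check (not a proof):

```python
f = lambda n: 3 * n + 3

# f(n) >= 3*n for every n >= 1, so f(n) is Omega(n) with c=3, n0=1.
assert all(f(n) >= 3 * n for n in range(1, 1000))
print("3n+3 is Omega(n) with c=3, n0=1")
```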

Big-theta
f(n) is Θ(g(n)) if f(n) is O(g(n)) and f(n) is Ω(g(n)). That is, there are real constants c' > 0 and c'' > 0 and an integer constant n0 >= 1 such that c'·g(n) <= f(n) <= c''·g(n) for every n >= n0. Big-theta allows us to say that two functions are asymptotically equal, up to a constant factor.

Big-theta
Consider the function f(n) = 3n + 2. Find the order of the function. Big-theta describes average-case analysis.
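One valid choice of witnesses here is c' = 3, c'' = 4, and n0 = 2, giving f(n) = 3n + 2 is Θ(n). A quick numeric check (not a proof):

```python
f = lambda n: 3 * n + 2

# 3*n <= f(n) <= 4*n for every n >= 2, so f(n) is Theta(n).
assert all(3 * n <= f(n) <= 4 * n for n in range(2, 1000))
print("3n+2 is Theta(n) with c'=3, c''=4, n0=2")
```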

Growth of Functions

Best case
The best case is the input that lets the algorithm take the shortest time. The best-case cost is usually small and frequently a constant. Best-case analysis is not performed very often.

Worst case
The worst case for an algorithm is the input that requires the algorithm to do the most work.

Average case
Steps to calculate the average case:
1. Determine the number of different groups into which the inputs can be classified.
2. Determine the probability that the input comes from each group.
3. Determine how long the algorithm runs for each input group.
A(n) = Σ (i = 1 to m) pi · ti

Average case
n = size of the input
m = number of input groups
pi = probability that the input is from group i
ti = time the algorithm takes for inputs from group i

Problem
Write an algorithm that finds the middle (median) value of 3 distinct integers. Find the worst-case, best-case, and average-case complexity, and also the number of input groups.

Solution
Algorithm middle(a, b, c)
{
    if a > b then
    {
        if a < c then middle ← a
        else if b > c then middle ← b
        else middle ← c
    }
    else
    {
        if a > c then middle ← a
        else if b < c then middle ← b
        else middle ← c
    }
    return middle
}
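The median-of-three logic can be sketched as runnable Python and checked against all six orderings of three distinct values:

```python
from itertools import permutations

def middle(a, b, c):
    """Return the median of three distinct values using 2 or 3 comparisons."""
    if a > b:
        if a < c:
            return a      # b < a < c
        elif b > c:
            return b      # c < b < a
        else:
            return c      # b < c < a
    else:
        if a > c:
            return a      # c < a < b
        elif b < c:
            return b      # a < b < c
        else:
            return c      # a < c < b

# Every ordering of (1, 2, 3) should yield the median, 2.
print([middle(*p) for p in permutations((1, 2, 3))])  # → [2, 2, 2, 2, 2, 2]
```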

Input classes
Input class 1 (number of comparisons = 3):
1. 1 2 3
2. 1 3 2
3. 3 1 2
4. 3 2 1
Input class 2 (number of comparisons = 2):
1. 2 1 3
2. 2 3 1

Average case analysis


Best case: 2 operations
Worst case: 3 operations
Average case:

A(n) = p1·t1 + p2·t2; assuming each input class is equally likely, p1 = p2 = 1/2:
A(n) = 1/2·(3) + 1/2·(2) = 5/2 = 2.5 operations
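The formula A(n) = Σ pi·ti applied to the two input classes, using the slides' assumption that each class is equally likely:

```python
p = [1/2, 1/2]   # probability of each input class (assumed equally likely)
t = [3, 2]       # comparisons performed for each class

A = sum(pi * ti for pi, ti in zip(p, t))
print(A)  # → 2.5
```

Note that if instead each of the six orderings of the three inputs were taken as equally likely, the probabilities would be p = [4/6, 2/6] and the average would come out to 8/3 ≈ 2.67 comparisons.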

Some Asymptotic Rules


1. If d(n) is O(f(n)), then a·d(n) is O(f(n)), for any constant a > 0
2. If d(n) is O(f(n)) and e(n) is O(g(n)), then d(n) + e(n) is O(f(n) + g(n))
3. If d(n) is O(f(n)) and e(n) is O(g(n)), then d(n)·e(n) is O(f(n)·g(n))
4. If d(n) is O(f(n)) and f(n) is O(g(n)), then d(n) is O(g(n))
5. If d(n) is O(f(n)) and e(n) is O(g(n)), then d(n) + e(n) is O(max(f(n), g(n)))
6. If f(n) is a polynomial of degree d, i.e. f(n) = a0 + a1·n + ... + ad·n^d, then f(n) is O(n^d)
7. n^x is O(a^n) for any fixed constants a > 1 and x > 0
8. log(n^x) is O(log n) for any fixed constant x > 0
9. log^x(n) is O(n^y) for any fixed constants x > 0 and y > 0
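Rule 7 (any polynomial is eventually dominated by any exponential) can be illustrated numerically; with a = 2 and x = 3, one valid witness is n0 = 10:

```python
# Numeric illustration (not a proof) that n^3 is O(2^n):
# n^3 <= 2^n holds for every n >= 10 (it fails at n = 9: 729 > 512).
assert all(n**3 <= 2**n for n in range(10, 1000))
print("n^3 <= 2^n for all n >= 10 (checked up to 1000)")
```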

Importance of Asymptotics
Table of the maximum size of a problem that can be solved in one second, one minute, and one hour, for various running times measured in microseconds.

Advantages of growth rate


The growth rate of an algorithm helps to compare the relative efficiency of algorithms.
