
Analysis & Design Of

Algorithms (ADA)
Unit - 1
Introduction.

12/08/21 1
Introduction…

Algorithm – Design & Analysis.


• Algorithmics is the branch of computer science concerned with
designing and analyzing computer algorithms

– The “design” deals with


• The description of the algorithm by means of a pseudo-
language.
• A proof of correctness, that is, that the algorithm solves the
given problem in all cases.

– The “analysis” deals with performance evaluation.


• Two important ways to characterize the effectiveness
of an algorithm are its
space complexity and
time complexity
Intro…

Algorithm Design…
Approaches :
1. Brute Force - a straightforward approach to solving a problem,
usually based directly on the problem's statement and definitions
of the concepts involved.
E.g.: computing the sum of n numbers, finding the largest element in a
list, adding two matrices, etc.
2. Incremental and
3. Divide & Conquer.
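The brute-force idea from the first example above can be sketched directly from the problem's definition; the function name `find_largest` is an illustrative assumption, not from the slides:

```python
def find_largest(items):
    # brute force: examine every element of the list exactly once
    largest = items[0]
    for x in items[1:]:
        if x > largest:
            largest = x
    return largest

print(find_largest([3, 9, 4, 1]))  # → 9
```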

Techniques : are part of one of the above approaches.

- They give guidance and direction on how to create a new
algorithm. Though there are literally thousands of
algorithms, there are very few design techniques.

The common techniques we will look at are as
follows (in the next slides).
Intro…
Algorithm design techniques …

Decrease-and-conquer (under the Divide & Conquer
approach).
– Solving a problem by reducing its instance to a
smaller one and then using the incremental approach, i.e.,
solving the smaller instance (recursively or
otherwise) and then extending the obtained solution
to get a solution to the original instance. E.g.: insertion
sort
– Another special case of the decrease-and-conquer
technique covers size reduction by a
constant factor. E.g.: binary search
– Other special cases of the technique cover more
sophisticated situations of variable-size reduction.
E.g.: Euclid's algorithm
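Euclid's algorithm, the variable-size-reduction example just mentioned, can be sketched as follows (a minimal illustration, not taken from the slides):

```python
def gcd(a, b):
    # decrease-and-conquer with variable-size reduction:
    # each step replaces the instance (a, b) by the strictly
    # smaller instance (b, a mod b)
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # → 6
```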
Intro…

Algorithm design techniques …

Transform-and-conquer. Again, one can identify several
flavors of this technique (under the Divide & Conquer
approach).
1. Simplification --- solves a problem by first transforming its
instance into another instance of the same problem (and of
the same size) with some special property which makes
the problem easier to solve.
2. Representation change -- is based on a transformation of
a problem's input to a different representation, which is
more conducive to an efficient algorithmic solution.
3. Preprocessing -- can be considered yet another
variety of the transformation strategy. The idea is to
process a part of the input, or the entire input, to get some
auxiliary information which speeds up solving the problem.
E.g.: presorting
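The presorting idea can be sketched with a classic use: checking whether all elements of a list are distinct. Sorting first (the preprocessing step) makes any duplicates adjacent; the name `all_distinct` is an assumption for illustration:

```python
def all_distinct(items):
    s = sorted(items)                  # preprocessing: presort the input
    # after sorting, equal elements (if any) must sit next to each other
    return all(s[i] != s[i + 1] for i in range(len(s) - 1))

print(all_distinct([3, 1, 2]))  # → True
print(all_distinct([2, 1, 2]))  # → False
```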
Analysis of algorithms
• Issues:
– Correctness – Is the algorithm correct?
– Time efficiency – How much time does the algorithm
use?
– Space efficiency – How much extra space does the
algorithm use?
– Optimality – Is the algorithm the best compared to other
algorithms?
• Approaches:
– Theoretical analysis – proof of correctness, big-O and
other notations
– Empirical analysis – Testing and measurement over
instances
Theoretical analysis of time efficiency
• Time efficiency is analyzed by determining the
number of repetitions of the basic operation as a
function of input size

• Basic operation : the operation that contributes the
most towards the running time of the algorithm

T(n) ≈ c_op · C(n)

where T(n) is the running time, c_op is the execution time
(or cost) of the basic operation, C(n) is the number of times
the basic operation is executed, and n is the input size.
Input size and basic operation examples

Problem                        | Input size measure              | Basic operation
Searching for a key in a      | number of the list's items,     | key comparison
list of n items                | i.e. n                          |
Multiplication of two          | matrix dimensions or total      | multiplication of
matrices                       | number of elements              | two numbers
Checking primality of a        | n's size = number of digits     | division
given integer n                | (in binary representation)      |
Typical graph problem          | number of vertices and/or       | visiting a vertex or
                               | edges                           | traversing an edge
Empirical analysis of time efficiency

• Select a specific (typical) sample of inputs
• Use a physical unit of time (e.g., milliseconds), or
count the actual number of the basic operation's executions
• Analyze the empirical data
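A minimal sketch of the physical-time variant, using Python's standard `time.perf_counter`; the helper name `measure` and the sample input are assumptions for illustration:

```python
import time

def measure(f, arg):
    # time one run of f(arg) in physical units (seconds)
    start = time.perf_counter()
    f(arg)
    return time.perf_counter() - start

# time sorting one typical input: a reverse-ordered list
elapsed = measure(sorted, list(range(100_000, 0, -1)))
print(f"{elapsed:.6f} s")
```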

Introduction …

Algorithm – Design & Analysis…


• Time Complexity: It is the number of
elementary instructions that a program
executes. This number is computed with
respect to the size n of input data.

• Space Complexity: It is the number of


elementary objects that a program needs to
store during its execution. This number is
computed with respect to the size n of input
data.
Introduction…

Algorithm – Performance cases :


In computer science,
best,
worst and
average cases of a given algorithm
express
what the resource usage is
at least,
at most and
on average, respectively.
Introduction…

Performance cases… :

• The term best-case performance is used in


computer science to describe the way an
algorithm behaves under optimal conditions.

• Worst-case performance is used to analyze
an algorithm so that it may be possible to
find the longest possible path through the
algorithm.
Introduction…

Performance cases… :
• Average-case performance has to do with
the mathematical average of all cases.
Average-case and worst-case
performance are the most used in
algorithm analysis. Best-case performance
is less widely found, but it does have
uses: for example, knowing the best cases
of individual tasks can be used to improve
the accuracy of an overall worst-case
analysis.
Order of growth
• It is not necessary to compute the efficiency of an algorithm for small inputs.
• Differences in algorithms' efficiency become apparent and important only when
the input is large.
• For larger values of n, the count depends upon the function's order of growth.
• Order of growth can be expressed as
1. Logarithmic.
2. Linear.
3. Quadratic.
4. Cubic.
5. Exponential .
6. Factorial.
Among these, the logarithmic function grows slowest, while the
exponential and factorial functions grow very fast.

Basic asymptotic efficiency classes :

1         constant
log n     logarithmic
n         linear
n log n   n-log-n
n^2       quadratic
n^3       cubic
2^n       exponential
n!        factorial
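The ranking in the table can be illustrated numerically; at n = 1024 (a value chosen only for illustration) each class already dominates the one above it:

```python
import math

n = 1024
values = {
    "log n":   math.log2(n),        # 10
    "n":       n,                   # 1024
    "n log n": n * math.log2(n),    # 10240
    "n^2":     n ** 2,              # ~1e6
    "n^3":     n ** 3,              # ~1e9
}
ordered = list(values.values())
# at this n, each class is strictly larger than the previous one
assert ordered == sorted(ordered)
```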
Values of some important functions as n → ∞

Specific values of the count depend on the logarithm's
base. The formula used to switch from one base to another is
log_a n = log_a b · log_b n
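The base-switch identity can be checked numerically; the sample values a = 10, b = 2, n = 1000 are arbitrary choices for illustration:

```python
import math

a, b, n = 10, 2, 1000
lhs = math.log(n, a)                   # log_a n
rhs = math.log(b, a) * math.log(n, b)  # log_a b * log_b n
assert abs(lhs - rhs) < 1e-9           # the two sides agree
```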

Asymptotic analysis of algorithms
• We usually embark on an asymptotic worst case analysis
of the running time of the algorithm.
• Asymptotic:
– Formal, exact, depends only on the algorithm.
– Ignores constants.
– Applicable mostly for large input sizes.
• Worst Case:
– Bounds on running time must hold for all inputs.
– Thus the analysis considers the worst-case input.
– Sometimes the “average” performance can be much
better.
– Real-life inputs are rarely “average” in any formal sense.

Asymptotic order of growth
To compare and rank the order of growth of basic
operation counts, 3 notations are used: O (big oh),
Ω (big omega) and Θ (big theta).

• O(g(n)): class of functions t(n) that grow no faster
than g(n).

• Θ(g(n)): class of functions t(n) that grow at the same
rate as g(n).

• Ω(g(n)): class of functions t(n) that grow at least as
fast as g(n).

Big Oh Notation

• Let t(n) and g(n) be non-decreasing and non-negative
functions of n. We say t(n) = O(g(n)) if there
exist positive constants c and n0 such that
t(n) ≤ c·g(n) for all n ≥ n0.
t(n) grows no faster than g(n).

• f(n) = O(g(n)) means c·g(n) is an upper bound on
f(n). Thus there exists some constant c such that f(n)
is always ≤ c·g(n), for large enough n.
E.g.: f(n) = 4n + 2 and g(n) = n. Prove f(n) is O(g(n)).
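For this exercise, one valid choice of witnesses (an assumption; other choices work too) is c = 5 and n0 = 2, since 4n + 2 ≤ 4n + n = 5n whenever n ≥ 2. A quick numeric check of the definition:

```python
c, n0 = 5, 2   # assumed witnesses for t(n) <= c*g(n) when n >= n0
# verify 4n + 2 <= 5n over a large sample of n >= n0
assert all(4 * n + 2 <= c * n for n in range(n0, 10_000))
```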

Big-oh Notation

Ω-notation

• Let t(n) and g(n) be non-decreasing and
non-negative functions of n. We say
t(n) = Ω(g(n)) iff there exist positive
constants c and n0 such that
t(n) ≥ c·g(n) for all n ≥ n0.
t(n) grows at least as fast as g(n).
– f(n) = Ω(g(n)) means c·g(n) is a lower bound on
f(n). Thus there exists some constant c such
that f(n) is always ≥ c·g(n), for large enough n.
E.g.: f(n) = n³ and g(n) = n². Prove f(n) is Ω(g(n)).
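For this exercise, the witnesses c = 1 and n0 = 1 work (one choice among many), since n³ ≥ 1·n² for all n ≥ 1; a quick numeric check:

```python
c, n0 = 1, 1   # assumed witnesses for t(n) >= c*g(n) when n >= n0
# verify n^3 >= n^2 over a large sample of n >= n0
assert all(n**3 >= c * n**2 for n in range(n0, 10_000))
```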
Omega Notation

Θ-notation
• Formal definition
– A function t(n) is said to be in Θ(g(n)), denoted
t(n) ∈ Θ(g(n)), if t(n) is bounded both above and
below by some positive constant multiples of g(n) for
all large n, i.e., if there exist some positive constants c1
and c2 and some nonnegative integer n0 such that
c2·g(n) ≤ t(n) ≤ c1·g(n) for all n ≥ n0.
– f(n) = Θ(g(n)) means c1·g(n) is an upper bound on f(n)
and c2·g(n) is a lower bound on f(n), for large enough
n. Thus there exist constants c1 and c2 such that
f(n) ≤ c1·g(n) and f(n) ≥ c2·g(n).
E.g.: f(n) = ½n(n−1) and g(n) = n². Prove f(n) is Θ(g(n)).
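For this exercise, one valid choice of witnesses (assumed; not the only one) is c2 = 1/4, c1 = 1/2 and n0 = 2, giving (1/4)n² ≤ n(n−1)/2 ≤ (1/2)n² for all n ≥ 2. A numeric check:

```python
c1, c2, n0 = 0.5, 0.25, 2   # assumed witnesses for the two-sided bound
# verify c2*n^2 <= n(n-1)/2 <= c1*n^2 over a large sample of n >= n0
assert all(c2 * n**2 <= n * (n - 1) / 2 <= c1 * n**2
           for n in range(n0, 10_000))
```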

Theta Notation

Using Limits for comparing orders of growth

lim T(n)/g(n) as n → ∞ =
0      ⇒ order of growth of T(n) < order of growth of g(n)
c > 0  ⇒ order of growth of T(n) = order of growth of g(n)
∞      ⇒ order of growth of T(n) > order of growth of g(n)
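A numeric illustration (not a proof) of the first case, with T(n) = 100n and g(n) = n² chosen only as examples:

```python
# the ratio T(n)/g(n) = 100/n shrinks toward 0 as n grows,
# so T(n) has a lower order of growth than g(n)
ratios = [100 * n / n**2 for n in (10, 100, 1000, 10_000)]
assert ratios == sorted(ratios, reverse=True)  # strictly decreasing
assert ratios[-1] < 0.1                        # heading toward 0
```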

Asymptotic performance
When n gets large enough, a Θ(n²) algorithm always beats a Θ(n³)
algorithm.
• We shouldn’t ignore
asymptotically slower
algorithms, however.
• Real-world design
situations often call for
a careful balancing of
engineering objectives.
• Asymptotic analysis is a
useful tool to help to
structure our thinking.

(END –ada1bca)
