SUBMITTED TO: Praveen Kumar
SUBMITTED BY: Khushboo Bhagchandani (1647220, IV MCA)
Que1. Consider one real-time application and explain how algorithms
are useful.
Link analysis is arguably one of the algorithm families most surrounded by myths and confusion among the general public. The problem is that there are different ways to perform link analysis, and each algorithm has characteristics that make it slightly different (which allows the algorithms to be patented), but at their core they are similar.
The idea behind link analysis is simple: a graph can be represented in matrix form, which turns ranking its nodes into an eigenvalue problem. The eigenvalues give a good picture of the structure of the graph and of the relative importance of each node. The approach was developed in 1976 by Gabriel Pinski and Francis Narin.
Facebook uses link analysis when it builds your news feed (which is why the news feed is not an algorithm itself but the result of one), as do Google+ and Facebook for friend suggestions, LinkedIn for job and contact suggestions, Netflix and Hulu for movies, and YouTube for videos. Each one has a different objective and different parameters, but the math behind each remains the same.
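The matrix/eigenvalue idea above can be sketched with a small power iteration in the PageRank style. This is a minimal illustration, not any product's actual implementation; the example graph, the damping factor, and the iteration count are all illustrative choices.

```python
# Power-iteration sketch of link analysis (PageRank-style).
# The graph, damping factor, and iteration count are illustrative.

def link_scores(graph, damping=0.85, iterations=50):
    """Estimate node importance on a directed graph given as
    {node: [nodes it links to]} by repeatedly letting each node
    pass a share of its score along its outgoing links."""
    nodes = list(graph)
    n = len(nodes)
    scores = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new = {node: (1 - damping) / n for node in nodes}
        for node, targets in graph.items():
            if targets:
                share = damping * scores[node] / len(targets)
                for t in targets:
                    new[t] += share
            else:  # dangling node: spread its score evenly
                for t in nodes:
                    new[t] += damping * scores[node] / n
        scores = new
    return scores

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
scores = link_scores(graph)
# "C" receives the most incoming links, so it gets the highest score
```

Repeated multiplication by the link matrix like this converges to its dominant eigenvector, which is exactly the "relative importance of each node" the paragraph above refers to.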
Θ-notation bounds a function from above and below. Formally,
Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0 }.
This definition means that if f(n) is Θ(g(n)), then the value of f(n) is always between c1*g(n) and c2*g(n) for large values of n (n >= n0). The definition of Θ also requires that f(n) must be non-negative for values of n greater than n0.
Fig. 1: Graph of Θ(g(n))
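As a worked instance of the Θ definition (my own example, not from the text above): take f(n) = 3n^2 + 2n. With c1 = 3, c2 = 5 and n0 = 1 we have 3n^2 <= 3n^2 + 2n <= 5n^2 for all n >= 1, so f(n) = Θ(n^2). A quick numeric sanity check (not a proof):

```python
# Sanity-check the Theta bounds for f(n) = 3n^2 + 2n with
# c1 = 3, c2 = 5, n0 = 1. This only checks a finite range of n,
# so it illustrates the definition rather than proving it.
def f(n):
    return 3 * n * n + 2 * n

assert all(3 * n * n <= f(n) <= 5 * n * n for n in range(1, 1000))
```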
The Big O notation is useful when we only have an upper bound on the time complexity of an algorithm. Many times we can easily find an upper bound simply by looking at the algorithm.
Just as Big O notation provides an asymptotic upper bound, Ω-notation provides an asymptotic lower bound. Let us consider the same insertion sort example here. The time complexity of insertion sort can be written as Ω(n), but this is not very useful information about insertion sort, as we are generally interested in the worst case and sometimes in the average case.
4) o-notation (Little o):
The asymptotic upper bound provided by O-notation may or may not be asymptotically tight. The bound 2n^2 = O(n^2) is asymptotically tight, but the bound 2n = O(n^2) is not. We use o-notation to denote an upper bound that is not asymptotically tight. We formally define o(g(n)) ("little-oh of g of n") as the set
o(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 <= f(n) < c*g(n) for all n >= n0 }.
5) ω-notation (Little omega):
Definition: Let f(n) and g(n) be functions that map positive integers to positive real numbers. We say that f(n) is ω(g(n)) (or f(n) ∈ ω(g(n))) if for any real constant c > 0, there exists an integer constant n0 >= 1 such that f(n) > c*g(n) >= 0 for every integer n >= n0.
Here f(n) has a strictly higher growth rate than g(n). The main difference between Big Omega (Ω) and little omega (ω) lies in their definitions: in the case of Big Omega, f(n) = Ω(g(n)) requires the bound 0 <= c*g(n) <= f(n) to hold for some constant c > 0, but in the case of little omega, it must hold for every constant c > 0.
We use ω-notation to denote a lower bound that is not asymptotically tight.
f(n) ∈ ω(g(n)) if and only if g(n) ∈ o(f(n)).
In mathematical relation,
if f(n) ∈ ω(g(n)), then lim (n → ∞) f(n)/g(n) = ∞.
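The limit condition can be made concrete with an example of my own choosing: for f(n) = n^2 and g(n) = n we have f(n) = ω(g(n)), and the ratio f(n)/g(n) grows without bound. A numeric illustration (not a proof):

```python
# Numeric illustration of the little-omega limit for f(n) = n^2,
# g(n) = n: the ratio f(n)/g(n) = n grows without bound as n grows.
ratios = [(n * n) / n for n in (10, 100, 1000)]
assert ratios == [10.0, 100.0, 1000.0]
assert ratios == sorted(ratios)  # strictly increasing sample
```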
Ans.
Algorithm        Time Complexity                                    Space Complexity
                 Best          Average          Worst               Worst
Quick Sort       Ω(n log n)    Θ(n log n)       O(n^2)              O(log n)
Merge Sort       Ω(n log n)    Θ(n log n)       O(n log n)          O(n)
Heap Sort        Ω(n log n)    Θ(n log n)       O(n log n)          O(1)
Bubble Sort      Ω(n)          Θ(n^2)           O(n^2)              O(1)
Insertion Sort   Ω(n)          Θ(n^2)           O(n^2)              O(1)
Selection Sort   Ω(n^2)        Θ(n^2)           O(n^2)              O(1)
Shell Sort       Ω(n log n)    Θ(n (log n)^2)   O(n (log n)^2)      O(1)
Radix Sort       Ω(nk)         Θ(nk)            O(nk)               O(n+k)
a) Insertion Sort:
Insertion sort is a simple sorting algorithm that builds the final sorted
array (or list) one item at a time. It is much less efficient on large lists
than more advanced algorithms such as quicksort, heapsort, or merge
sort. However, insertion sort provides several advantages: it is simple to implement, efficient for small or nearly sorted inputs, stable (equal elements keep their relative order), and in-place (it needs only O(1) extra memory).
c) Merge Sort:
d) Bubble Sort:
f) Heap Sort:
1. Remove the topmost item (the largest) and replace it with the rightmost leaf. The topmost item is stored in an array.
2. Sift the new topmost item down until the heap property is restored.
3. Repeat steps 1 and 2 until there are no more items left in the heap.
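The extract-max steps above can be sketched as an in-place heap sort (building the max-heap first, then swapping the root to the end of the array instead of a separate output array, which is the usual in-place variant of the same idea):

```python
def heap_sort(arr):
    """Heap sort: build a max-heap, then repeatedly swap the root
    (the largest item) with the last unsorted slot and sift the
    new root down to restore the heap property."""
    n = len(arr)

    def sift_down(root, end):
        # Push arr[root] down until both children are smaller
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and arr[child] < arr[child + 1]:
                child += 1  # pick the larger child
            if arr[root] < arr[child]:
                arr[root], arr[child] = arr[child], arr[root]
                root = child
            else:
                return

    # Build the max-heap bottom-up
    for start in range(n // 2 - 1, -1, -1):
        sift_down(start, n - 1)
    # Repeatedly move the current maximum to the end
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end - 1)
    return arr

heap_sort([3, 1, 4, 1, 5, 9, 2, 6])  # -> [1, 1, 2, 3, 4, 5, 6, 9]
```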
g) Radix Sort:
h) Shell Sort:
Shell sort is mainly a variation of insertion sort. In insertion sort, we move elements only one position ahead, so when an element has to be moved far ahead, many movements are involved. The idea of shell sort is to allow the exchange of far-apart items. In shell sort, we make the array h-sorted for a large value of h and keep reducing h until it becomes 1. An array is said to be h-sorted if every sublist formed by taking every h-th element is sorted.
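The h-sorting idea above can be sketched as a gapped insertion sort; the gap sequence here (halving n repeatedly) is the simple original Shell sequence, one of several possible choices:

```python
def shell_sort(arr):
    """Shell sort: run insertion sort over h-spaced sublists,
    halving the gap h each pass until it reaches 1."""
    n = len(arr)
    h = n // 2
    while h >= 1:
        # Gapped insertion sort: each element is inserted into
        # its own h-spaced sublist
        for i in range(h, n):
            key = arr[i]
            j = i
            while j >= h and arr[j - h] > key:
                arr[j] = arr[j - h]
                j -= h
            arr[j] = key
        h //= 2
    return arr

shell_sort([23, 12, 1, 8, 34, 54, 2, 3])  # -> [1, 2, 3, 8, 12, 23, 34, 54]
```

The final pass with h = 1 is plain insertion sort, but by then far-apart elements have already been moved close to their final positions, which is what makes the exchanges of "far items" cheap.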