Definitions
A data structure is a way of organizing data items that considers not only the elements stored but also their relationships to each other.
In other words, a data structure is a logical and mathematical model of a particular organization of data items.
Primitive: Integer, Float, Character, Pointer
Non-primitive: Array, Structure/Union, List
Array
An array can be defined as a finite set of homogeneous elements or data items; that is, an array can contain only one type of data.
Array elements are stored at consecutive memory locations, and arrays are static by nature: when the user declares an array, it occupies that memory whether or not the user makes use of all of it.
Linked-List
A linked list can be defined as a collection of a variable number of data items. Each element of the list must contain at least two fields: one for storing the data or information, and the other for storing the address of the next element. Each such element is referred to as a node, so a list can also be defined as a collection of nodes.
Stack
A Stack is an ordered collection of objects that are
inserted and removed according to the last-in first-out
(LIFO) principle.
Inserting an element into the stack is called Push, and deleting an element from the stack is called Pop. Insertion and deletion of elements can be done only from one end, called the TOS (top of stack).
Example trace:
1. Push(2)     → stack: [2]
2. Push(10)    → stack: [2, 10]
3. Pop() → 10  → stack: [2]
4. Pop() → 2   → stack: []
Queue
A queue is an ordered collection of elements which works on the FIFO (first-in first-out) principle.
Elements are inserted into a queue at one end, called the REAR, and deleted from the other end, called the FRONT.
Tree
A tree can be defined as a finite set of data items. A tree is a non-linear data structure in which the data items are arranged, or stored, in a hierarchical relationship.
Graph
A graph G(V, E) is a set of vertices V and a set of edges E. An edge connects a pair of vertices and may have a weight, such as a length, cost, or some other measure associated with it.
Memory Allocation
There are two types of memory allocation:
1) Compile-time or static allocation, e.g.:
   int x, y;
2) Run-time or dynamic allocation, via the C library functions:
   1) malloc(no. of elements * size of element)
   2) calloc(no. of elements, size of element)
   3) realloc(ptr_var, new_size)
   4) free(ptr_var)
Algorithm
An algorithm is a well-defined computational procedure of finitely many steps which takes some values as input and produces some values as output.
In other words, an algorithm is a finite sequence of computational steps that transforms the input into the output.
Complexity of Algorithms
Algorithmic complexity is concerned with how fast or slow a particular algorithm performs. We define complexity as a numerical function T(n): time versus the input size n. We want to describe the time taken by an algorithm without depending on implementation details; the way around this is to estimate the efficiency of each algorithm asymptotically. We measure time T(n) as the number of elementary "steps" (defined in any way), provided each such step takes constant time.
Asymptotic Notation
When we look at input sizes, large enough to
make only the order of growth of the running
time relevant, we are studying the asymptotic
efficiency of algorithms.
Three notations:
Big-Oh (O) notation
Big-Omega (Ω) notation
Big-Theta (Θ) notation
Categories of algorithms
Seven functions that often appear in
algorithm analysis:
Constant       1
Logarithmic    log n
Linear         n
Log-linear     n log n
Quadratic      n²
Cubic          n³
Exponential    2ⁿ
Polynomials
f(n) = a0 + a1 n + a2 n² + … + ad nᵈ

d is the degree of the polynomial; a0, a1, …, ad are called coefficients.
Example

Algorithm Mystery(n)                  # operations
    sum ← 0                           1
    for i ← 0 to n − 1 do             n + 1
        for j ← 0 to n − 1 do         n(n + 1)
            sum ← sum + 1             n · n
A second example, which finds the maximum element of an array A of n elements:

                                      # operations
    currentMax ← A[0]                 1
    for i ← 1 to n − 1 do             n
        if A[i] > currentMax then     n − 1
            currentMax ← A[i]         n − 1
    return currentMax                 1
Space complexity
Space complexity is a function describing the amount of
memory (space) an algorithm takes in terms of the amount of
input to the algorithm. We often speak of "extra" memory
needed, not counting the memory needed to store the input
itself. Again, we use natural (but fixed-length) units to
measure this. We can use bytes, but it's easier to use, say,
number of integers used, number of fixed-sized structures,
etc. In the end, the function we come up with will be
independent of the actual number of bytes needed to
represent the unit. Space complexity is sometimes ignored
because the space used is minimal and/or obvious, but
sometimes it becomes as important an issue as time.
For example, we might say "this algorithm takes n² time,"
where n is the number of items in the input. Or we might say
"this algorithm takes constant extra space," because the
amount of extra memory needed doesn't vary with the
number of items processed. For both time and space, we are
interested in the asymptotic complexity of the algorithm:
When n (the number of items of input) goes to infinity, what
happens to the performance of the algorithm?