
2011

Quick Sort Algorithm

Submitted to: Prof. Muhammad Azam
Submitted by:

Asad Ali 9124


Aamir Shahzad 9118
Mudassar Iqbal 9131

-: Contents :-

Quicksort

Description of quicksort (explanation)

Algorithm

Implementation

Partitioning Analysis

QuickSort Analysis

Competitive Analysis

References


Quicksort:
Quicksort is a sorting algorithm developed by C. A. R. Hoare that, on average, makes
O(n log n) comparisons (in big-O notation) to sort n items. In the worst case it makes
O(n²) comparisons, though if implemented correctly this behavior is rare. In practice,
quicksort is typically significantly faster than other O(n log n) algorithms, because
its inner loop can be implemented efficiently on most architectures, and for most
real-world data it is possible to make design choices that minimize the probability
of requiring quadratic time. Additionally, quicksort tends to make excellent use of
the memory hierarchy, taking good advantage of virtual memory and available caches.

 Description of quicksort: (explanation)

Quicksort, like merge sort, is based on the divide-and-conquer paradigm.


Here is the three-step divide-and-conquer process for sorting a
typical subarray A[p . . r].

Divide: Partition (rearrange) the array A[p . . r] into two (possibly empty) subarrays
A[p . . q −1] and A[q +1 . . r] such that each element of A[p . . q −1] is
less than or equal to A[q], which is, in turn, less than or equal to each element
of A[q + 1 . . r]. Compute the index q as part of this partitioning procedure.

Conquer: Sort the two subarrays A[p . . q−1] and A[q +1 . . r] by recursive calls
to quicksort.

Combine: Since the subarrays are sorted in place, no work is needed to combine
them: the entire array A[p . . r] is now sorted.

 Algorithm:

The following procedure implements quicksort.

QUICKSORT(A, p, r)
    if p < r
        then q ← PARTITION(A, p, r)
             QUICKSORT(A, p, q − 1)
             QUICKSORT(A, q + 1, r)

To sort an entire array A, the initial call is QUICKSORT(A, 1, length[A]).


Partitioning the array


The key to the algorithm is the PARTITION procedure, which rearranges the subarray
A[p . . r] in place.

PARTITION(A, p, r)
    x ← A[r]
    i ← p − 1
    for j ← p to r − 1
        do if A[j] ≤ x
               then i ← i + 1
                    exchange A[i] ↔ A[j]
    exchange A[i + 1] ↔ A[r]
    return i + 1
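
The pseudocode above translates directly into runnable code. The following is a
minimal Python sketch of the same Lomuto-style partitioning scheme; the 0-based
indexing and the function names are adaptations for Python, not part of the
original pseudocode.

def partition(A, p, r):
    """Rearrange A[p..r] around the pivot A[r]; return the pivot's final index."""
    x = A[r]       # pivot: the last element of the subarray
    i = p - 1      # right boundary of the "<= pivot" region
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]    # grow the "<= pivot" region
    A[i + 1], A[r] = A[r], A[i + 1]    # put the pivot into its final position
    return i + 1

def quicksort(A, p=0, r=None):
    """Sort A[p..r] in place (0-based indices, unlike the 1-based pseudocode)."""
    if r is None:
        r = len(A) - 1
    if p < r:
        q = partition(A, p, r)
        quicksort(A, p, q - 1)
        quicksort(A, q + 1, r)

data = [2, 8, 7, 1, 3, 5, 6, 4]
quicksort(data)
print(data)   # prints [1, 2, 3, 4, 5, 6, 7, 8]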

Quicksort sorts by employing a divide-and-conquer strategy to divide a list into two
sub-lists. The steps are:

1. Pick an element, called a pivot, from the list.

2. Reorder the list so that all elements with values less than the pivot come before
the pivot, while all elements with values greater than the pivot come after it (equal
values can go either way). After this partitioning, the pivot is in its final position.
This is called the partition operation.

3. Recursively sort the sub-list of lesser elements and the sub-list of greater
elements. The base case of the recursion is a list of size zero or one, which never
needs to be sorted.

 Implementation:

Quicksort can be implemented in several ways, depending on how the partitioning
step is performed. The following partitioning choices illustrate the possibilities:

Partitioning - Choice 1:

 First n − 1 elements go into set A, the last element into set B

 Sort A using this partitioning scheme recursively

 B already sorted

 Combine A and B using method Insert() (= insertion into sorted array)

 Leads to recursive version of InsertionSort()

 Number of comparisons: O(n²); a sketch of this scheme follows the list
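
To make Choice 1 concrete, here is a minimal Python sketch (the function name and
its structure are illustrative assumptions, not from the original text). Sorting the
first n − 1 elements recursively and then inserting the last element into the sorted
prefix is exactly a recursive insertion sort:

def insertion_sort_recursive(A, n=None):
    """Choice 1: sort set A = A[0..n-2] recursively, then Insert() set B = A[n-1]."""
    if n is None:
        n = len(A)
    if n <= 1:
        return
    insertion_sort_recursive(A, n - 1)   # sort the first n-1 elements (set A)
    last = A[n - 1]                      # the single remaining element (set B)
    j = n - 2
    while j >= 0 and A[j] > last:        # Insert(): shift larger elements right
        A[j + 1] = A[j]
        j -= 1
    A[j + 1] = last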


Partitioning - Choice 2:

 Put element with largest key in B, remaining elements in A

 Sort A recursively

 To combine sorted A and B, append B to sorted A

 Using Max() to find the largest element gives a recursive SelectionSort()

 Using a bubbling process to find and move the largest element to the right-most
position gives a recursive BubbleSort()

 All are O(n²); a sketch of the SelectionSort() form follows the list
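
A minimal Python sketch of Choice 2 in its SelectionSort() form (again, the name
and structure are illustrative assumptions): the largest element is moved to the
right-most position and the remaining prefix is sorted recursively.

def selection_sort_recursive(A, n=None):
    """Choice 2: move the largest of A[0..n-1] to position n-1, then recurse."""
    if n is None:
        n = len(A)
    if n <= 1:
        return
    m = max(range(n), key=lambda i: A[i])   # Max(): index of largest element (set B)
    A[m], A[n - 1] = A[n - 1], A[m]         # append B after the remaining set A
    selection_sort_recursive(A, n - 1)      # sort A recursively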

Partitioning - Choice 3:

 Let’s try to achieve balanced partitioning

 A gets n/2 elements, B gets the other half

 Sort A and B recursively

 Combine sorted A and B using a process called merge, which combines two sorted
lists into one; a sketch follows this list
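
Choice 3 is merge sort. A minimal Python sketch under the same illustrative
assumptions (this version returns a new sorted list rather than sorting in place):

def merge_sort(A):
    """Choice 3: split in half, sort each half recursively, then merge."""
    if len(A) <= 1:
        return A
    mid = len(A) // 2
    left, right = merge_sort(A[:mid]), merge_sort(A[mid:])
    # merge: repeatedly take the smaller front element of the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged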

Example: [figure omitted in this copy: the original illustrates splitting the list and sorting the halves]


Example (merging after sorting): [figure omitted in this copy]

Explanation with a pivot-picking example:

We take an array, pick a pivot, and then apply the method to it.

Numbers bigger than the pivot go to one side and smaller numbers to the other:
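
Since the original figures are not reproduced in this copy, the following trace shows
the same idea in text form, using the example array on which the PARTITION
procedure above is classically illustrated (Cormen et al.). With A = [2, 8, 7, 1, 3, 5,
6, 4] and pivot x = 4 (the last element), the procedure sweeps left to right, swapping
each element ≤ 4 into the growing left region. The array becomes
[2, 1, 3, 4, 7, 5, 6, 8]: the pivot 4 sits in its final position, every element to its
left is ≤ 4, and every element to its right is > 4.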


Repeating this on each side, at the end we get the final sorted array. [figure omitted in this copy]

 Partitioning Analysis:

The running time of quicksort depends on whether the partitioning is balanced or
unbalanced, and this in turn depends on which elements are used for partitioning.
If the partitioning is balanced, the algorithm runs asymptotically as fast as merge
sort. If the partitioning is unbalanced, however, it can run asymptotically as slowly
as insertion sort. In this section, we shall informally investigate how quicksort
performs under the assumptions of balanced versus unbalanced partitioning.

Worst-case partitioning:
The worst-case behavior for quicksort occurs when the partitioning routine produces
one subproblem with n − 1 elements and one with 0 elements. Let us assume that
this unbalanced partitioning arises in each recursive call. The partitioning costs
Θ(n) time. On an array of size 0 the call just returns, so T(0) = Θ(1), and the
recurrence for the running time is

T(n) = T(n − 1) + T(0) + Θ(n)
     = T(n − 1) + Θ(n).

Intuitively, if we sum the costs incurred at each level of the recursion, we get an
arithmetic series, which evaluates to Θ(n²). Indeed, it is straightforward to use the
substitution method to prove that the recurrence T(n) = T(n − 1) + Θ(n) has the
solution T(n) = Θ(n²). Thus, if the partitioning is maximally unbalanced at every
recursive level of the algorithm, the running time is Θ(n²). Therefore the worst-case
running time of quicksort is no better than that of insertion sort. Moreover, the
Θ(n²) running time occurs when the input array is already completely sorted, a
common situation in which insertion sort runs in O(n) time.
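
As a quick check of the arithmetic-series claim, expanding the recurrence level by
level (with c the constant hidden in the Θ(n) term) gives

T(n) = cn + c(n − 1) + c(n − 2) + ⋯ + c·1 + T(0)
     = c · n(n + 1)/2 + Θ(1)
     = Θ(n²).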


Best-case partitioning:
In the most even possible split, PARTITION produces two subproblems, each of
size no more than n/2, since one is of size ⌊n/2⌋ and one of size ⌈n/2⌉ − 1. In this
case, quicksort runs much faster. The recurrence for the running time is then

T(n) ≤ 2T(n/2) + Θ(n),

which by case 2 of the master theorem has the solution T(n) = O(n lg n). Thus,
the equal balancing of the two sides of the partition at every level of the recursion
produces an asymptotically faster algorithm.
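
For reference, the master theorem step is a standard calculation (spelled out here
for clarity): with a = 2 subproblems of size n/b = n/2 and f(n) = Θ(n), we have
n^(log_b a) = n^(log_2 2) = n = Θ(f(n)), so case 2 applies and yields
T(n) = Θ(n lg n); the ≤ in the recurrence turns this into the O(n lg n) upper bound
stated above.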

Balanced partitioning:
The average-case running time of quicksort is much closer to the best case than to
the worst case. The key to understanding why is to understand how the balance of
the partitioning is reflected in the recurrence that describes the running time.
Suppose, for example, that the partitioning algorithm always produces a 9-to-1
proportional split, which at first blush seems quite unbalanced. We then obtain the
recurrence

T(n) ≤ T(9n/10) + T(n/10) + cn

on the running time of quicksort, where we have explicitly included the constant c
hidden in the Θ(n) term. In the recursion tree for this recurrence, every level has
cost cn, until a boundary condition is reached at depth log_10 n = Θ(lg n), after
which the levels have cost at most cn. The recursion terminates at depth
log_{10/9} n = Θ(lg n). The total cost of quicksort is


therefore O(n lg n). Thus, with a 9-to-1 proportional split at every level of recursion,
which intuitively seems quite unbalanced, quicksort runs in O(n lg n) time,
asymptotically the same as if the split were right down the middle. In fact, even a
99-to-1 split yields an O(n lg n) running time. The reason is that any split of
constant proportionality yields a recursion tree of depth Θ(lg n), where the cost at
each level is O(n). The running time is therefore O(n lg n) whenever the split has
constant proportionality.
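
The depth claim can be checked directly: following the larger side of a 9-to-1 split,
the subproblem size after d levels is n(9/10)^d, which reaches 1 when
d = log_{10/9} n = (lg n)/lg(10/9) ≈ 6.58 lg n, a constant multiple of lg n. The tree
therefore has Θ(lg n) levels of cost at most cn each, for a total of O(n lg n).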

 QuickSort Analysis:

The previous sections gave some intuition for the worst-case behavior of quicksort
and for why we expect it to run quickly. In this section, we analyze the behavior of
quicksort more rigorously. We begin with a worst-case analysis, which applies to
quicksort with any pivot choice, and conclude with the expected running time.

Worst-case analysis:
A worst-case split at every level of recursion in quicksort produces a Θ(n²) running
time, which, intuitively, is the worst-case running time of the algorithm. We now
prove this assertion. Using the substitution method, we can show that the running
time of quicksort is O(n²). Let T(n) be the worst-case time for the procedure
QUICKSORT on an input of size n. We have the recurrence

T(n) = max_{0 ≤ q ≤ n−1} (T(q) + T(n − q − 1)) + Θ(n),

where the parameter q ranges from 0 to n − 1 because the procedure PARTITION
produces two subproblems with total size n − 1. We guess that T(n) ≤ cn² for
some constant c. Substituting this guess into the recurrence, we obtain

T(n) ≤ max_{0 ≤ q ≤ n−1} (cq² + c(n − q − 1)²) + Θ(n)
     = c · max_{0 ≤ q ≤ n−1} (q² + (n − q − 1)²) + Θ(n).

The expression q² + (n − q − 1)² achieves its maximum over the parameter's range
0 ≤ q ≤ n − 1 at either endpoint, as can be seen since the second derivative of the
expression with respect to q is positive. This observation gives us the bound
max_{0 ≤ q ≤ n−1} (q² + (n − q − 1)²) ≤ (n − 1)² = n² − 2n + 1. Continuing with
our bounding of T(n), we obtain

T(n) ≤ cn² − c(2n − 1) + Θ(n)
     ≤ cn²,

since we can pick the constant c large enough so that the c(2n − 1) term dominates
the Θ(n) term. Thus, T(n) = O(n²).
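
The endpoint claim is easy to verify: writing f(q) = q² + (n − q − 1)², we get
f′(q) = 4q − 2(n − 1) and f″(q) = 4 > 0, so f is convex and attains its maximum over
0 ≤ q ≤ n − 1 at an endpoint; by symmetry, f(0) = f(n − 1) = (n − 1)².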


Expected running time:

We have already given an intuitive argument why the average-case running time of
quicksort is O(n lg n): if, in each level of recursion, the split puts any constant
fraction of the elements on one side of the partition, then the recursion tree has
depth Θ(lg n), and O(n) work is performed at each level. Even if we add new levels
with the most unbalanced split possible between these levels, the total time remains
O(n lg n). The expected running time of RANDOMIZED-QUICKSORT, the variant
that picks its pivot uniformly at random, can be analyzed precisely by first
understanding how the partitioning procedure operates and then using this
understanding to derive an O(n lg n) bound on the expected running time. This
upper bound, combined with the Θ(n lg n) best-case bound, yields a Θ(n lg n)
expected running time.
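
A minimal Python sketch of this randomized variant, reusing the partition() and
quicksort() functions defined earlier in this report (the function names are this
report's illustrative choices; the only change from plain quicksort is the random
pivot selection):

import random

def randomized_partition(A, p, r):
    """Swap a uniformly random element of A[p..r] into position r, then partition."""
    k = random.randint(p, r)   # pivot index chosen uniformly from [p, r], inclusive
    A[k], A[r] = A[r], A[k]
    return partition(A, p, r)  # partition() as defined in the Implementation section

def randomized_quicksort(A, p=0, r=None):
    """Quicksort with a randomly chosen pivot; expected running time O(n lg n)."""
    if r is None:
        r = len(A) - 1
    if p < r:
        q = randomized_partition(A, p, r)
        randomized_quicksort(A, p, q - 1)
        randomized_quicksort(A, q + 1, r)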

 Competitive Analysis:

Competitive analysis is a method invented for analyzing online algorithms, in which the
performance of an online algorithm (which must satisfy an unpredictable sequence of
requests, completing each request without being able to see the future) is compared to the
performance of an optimal offline algorithm that can view the sequence of requests in
advance. An algorithm is competitive if its competitive ratio—the ratio between its
performance and the offline algorithm's performance—is bounded. Unlike traditional
worst-case analysis, where the performance of an algorithm is measured only for "hard"
inputs, competitive analysis requires that an algorithm perform well both on hard and easy
inputs, where "hard" and "easy" are defined by the performance of the optimal offline
algorithm.
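
In symbols (a standard formulation, added here for clarity rather than taken from
the original text): an online algorithm ALG is called c-competitive if there is a
constant b such that, for every request sequence σ,

ALG(σ) ≤ c · OPT(σ) + b,

where OPT(σ) is the cost of the optimal offline algorithm on σ. The competitive
ratio of ALG is the smallest such c.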

For many algorithms, performance depends not only on the size of the inputs but
also on their values. One such example is the quicksort algorithm, which sorts an
array of elements. Such data-dependent algorithms are analysed for average-case
and worst-case data. Competitive analysis is a way of doing worst-case analysis for
online and randomized algorithms, which are typically data dependent.

Comparison with other sorting algorithms:

Quicksort is a space-optimized version of the binary tree sort. Instead of inserting
items sequentially into an explicit tree, quicksort organizes them concurrently into
a tree that is implied by the recursive calls. The algorithms make exactly the same
comparisons, but in a different order.

The most direct competitor of quicksort is heapsort. Heapsort's worst-case running
time is always O(n log n), but heapsort is assumed to be somewhat slower than
quicksort on average. This is still debated and under research, with some
publications indicating the


opposite. Quicksort retains the chance of worst-case performance except in the
introsort variant, which switches to heapsort when a bad case is detected. If it is
known in advance that heapsort is going to be necessary, using it directly will be
faster than waiting for introsort to switch to it.

Quicksort also competes with mergesort, another recursive sorting algorithm, but
one with the benefit of worst-case O(n log n) running time. Mergesort is a stable
sort, unlike quicksort and heapsort, and can easily be adapted to operate on linked
lists and very large lists stored on slow-to-access media such as disk storage or
network-attached storage. Although quicksort can be written to operate on linked
lists, it will often suffer from poor pivot choices without random access. The main
disadvantage of mergesort is that, when operating on arrays, it requires O(n)
auxiliary space in the best case, whereas the variant of quicksort with in-place
partitioning and tail recursion uses only O(log n) space. (Note that when operating
on linked lists, mergesort only requires a small, constant amount of auxiliary
storage.)

Bucket sort with two buckets is very similar to quicksort; the pivot in this case is effectively
the value in the middle of the value range, which does well on average for uniformly
distributed inputs.

Other types of sorting algorithms: [the original document's illustration listing other sorting algorithms is not reproduced in this copy]


References:

 http://en.wikipedia.org/wiki/Pivot_element
 http://en.wikipedia.org/wiki/Quicksort#External_links
 http://www.wepapers.com/Papers/79380/Quicksort_Algorithm

