
Data Structures and Algorithms (COMP232)

Project #4: Sorting Algorithms. Dr. Iyad Jaber.

Name: Majed Manasra. ID: 1100677.

1- Insertion Sort
In pass p, we move the element in position p left until its correct place is found among the first p + 1 elements. The element in position p is saved in temp, and all larger elements (prior to position p) are moved one spot to the right. Then temp is placed in the correct spot. Best case: O(n), when the data are already sorted, so the inner loop fails immediately. Worst case: O(n^2), when the data are in reverse order, so the inner loop is always entered. Average case: O(n^2).
----------------------------------------------------------------------------------
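The insertion-sort pass described above can be sketched in Python as follows (a minimal sketch; the function name and sample data are my own):

```python
def insertion_sort(a):
    """Sort list a in place; pass p inserts a[p] among the first p+1 elements."""
    for p in range(1, len(a)):
        temp = a[p]                    # save the element in position p
        j = p
        while j > 0 and a[j - 1] > temp:
            a[j] = a[j - 1]            # shift larger elements one spot right
            j -= 1
        a[j] = temp                    # place temp in its correct spot
    return a
```

On already-sorted input the while condition fails immediately on every pass, which is where the O(n) best case comes from.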

2- Shell Sort
It works by comparing elements that are distant; the distance between comparisons decreases as the algorithm runs, until the last phase compares adjacent elements. A well-chosen increment sequence can give a significant improvement in the algorithm's running time, depending on the given data. The running time of Shellsort therefore depends on the choice of increment sequence.
------------------------------------------------------------------------------------
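A minimal Python sketch of Shellsort, assuming Shell's original increment sequence n/2, n/4, ..., 1 (other sequences, such as Hibbard's or Sedgewick's, give better worst-case bounds):

```python
def shell_sort(a):
    """Shellsort with Shell's increments: gap-insertion sort for shrinking gaps."""
    gap = len(a) // 2
    while gap > 0:
        # Insertion sort over elements that are `gap` positions apart.
        for p in range(gap, len(a)):
            temp = a[p]
            j = p
            while j >= gap and a[j - gap] > temp:
                a[j] = a[j - gap]      # shift the distant larger element right
                j -= gap
            a[j] = temp
        gap //= 2                      # the distance between comparisons shrinks
    return a
```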

3- Heap Sort
The binary heap data structure is an array that can be viewed as a complete binary tree. Each node of the binary tree corresponds to an element of the array. The array is completely filled on all levels except possibly the lowest. With 1-based indexing: PARENT(i) = floor(i/2), LEFT(i) = 2*i, RIGHT(i) = 2*i + 1. In a max-heap, for every node i other than the root, the value of the node is at most the value of its parent, so the maximum value is at the root of the binary tree. The heapsort procedure takes time O(n log n), since building the heap takes O(n) and each of the n - 1 calls to Heapify takes O(log n).

-----------------------------------------------------------------------------------
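The heapsort procedure above can be sketched in Python (a sketch, not a definitive implementation; note the array here is 0-indexed, so the index formulas become LEFT(i) = 2i + 1 and RIGHT(i) = 2i + 2):

```python
def heap_sort(a):
    """Heapsort: build a max-heap in O(n), then do n-1 deleteMax steps."""
    n = len(a)

    def percolate_down(i, size):
        # Sift a[i] down until the max-heap property holds below index `size`.
        while True:
            left, largest = 2 * i + 1, i
            if left < size and a[left] > a[largest]:
                largest = left
            if left + 1 < size and a[left + 1] > a[largest]:
                largest = left + 1
            if largest == i:
                return
            a[i], a[largest] = a[largest], a[i]
            i = largest

    for i in range(n // 2 - 1, -1, -1):   # build the heap: O(n) total
        percolate_down(i, n)
    for end in range(n - 1, 0, -1):       # n-1 calls, O(log n) each
        a[0], a[end] = a[end], a[0]       # root (the maximum) goes to the end
        percolate_down(0, end)
    return a
```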

4- Merge Sort
The fundamental operation in this algorithm is merging two sorted lists. Because the lists are sorted, this can be done in one pass through the input, if the output is put in a third list. The basic merging algorithm takes two input arrays A and B, an output array C, and three counters, Actr, Bctr, and Cctr, which are initially set to the beginning of their respective arrays. The smaller of A[Actr] and B[Bctr] is copied to the next entry in C, and the appropriate counters are advanced. When either input list is exhausted, the remainder of the other list is copied to C. Merge sort runs in O(n log n).
------------------------------------------------------------------------------------
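The merging routine and the recursive sort built on it can be sketched in Python as follows (a sketch; for clarity the counters keep the names Actr, Bctr, Cctr from the text):

```python
def merge(A, B):
    """Merge two sorted lists A and B into output list C in one pass."""
    C = [0] * (len(A) + len(B))
    Actr = Bctr = Cctr = 0
    while Actr < len(A) and Bctr < len(B):
        if A[Actr] <= B[Bctr]:
            C[Cctr] = A[Actr]          # copy the smaller entry to C
            Actr += 1
        else:
            C[Cctr] = B[Bctr]
            Bctr += 1
        Cctr += 1
    # One input list is exhausted; copy the remainder of the other.
    C[Cctr:] = A[Actr:] + B[Bctr:]
    return C

def merge_sort(a):
    """Split in half, sort each half recursively, then merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))
```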

5-Quick Sort
Choose any item, and then form three groups: those smaller than the chosen item, those equal to the chosen item, and those larger than the chosen item. Recursively sort the first and third groups, and then concatenate the three groups. The result is guaranteed by the basic principles of recursion to be a sorted arrangement of the original list, and its performance is, generally speaking, quite respectable on most inputs. In fact, if the list contains large numbers of duplicates with relatively few distinct items, as is sometimes the case, then the performance is extremely good. Best and average case: O(n log n). Worst case: O(n^2).
------------------------------------------------------------------------------------
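The three-group formulation above translates almost directly into Python (a sketch; choosing the middle element as the pivot is my own assumption, and this copying version trades the usual in-place partition for clarity):

```python
def quick_sort(a):
    """Three-way quicksort: smaller / equal / larger groups, then concatenate."""
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]             # any item can serve as the chosen item
    smaller = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    larger  = [x for x in a if x > pivot]
    # Recursively sort the first and third groups, then concatenate all three.
    return quick_sort(smaller) + equal + quick_sort(larger)
```

The `equal` group is never recursed on, which is exactly why inputs with many duplicates and few distinct items sort extremely quickly.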

6- Radix Sort .
This assumes that the least significant character is in the first position. The strings are sorted first by the least significant character, then by the next significant character, and so forth, until we reach the most significant character. It is important that we use a stable sort so that the order of the first sort is preserved when there are duplicate characters in the second sort, and so forth. If we can count on our strings to be of small constant size, this is just O(n). The asymptotic analysis is valid, but for strings of, say, length 80, with 256 possible characters in each position, the value of n0 at which the actual running time of an O(n log n) sort exceeds that of radix sort may be very large. In practice, radix sort is often much faster than quicksort for specialized data like short strings.
-------------------------------------------------------------------------------------
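A least-significant-digit radix sort over equal-length strings can be sketched in Python like this (a sketch under the assumption that all strings have the same length `width` and use 8-bit characters; appending to per-character buckets in input order is what makes each pass stable):

```python
def radix_sort(strings, width):
    """LSD radix sort: stable bucket pass on each character position,
    least significant (rightmost) position first."""
    for pos in range(width - 1, -1, -1):
        buckets = [[] for _ in range(256)]     # one bucket per possible character
        for s in strings:
            buckets[ord(s[pos])].append(s)     # stable: preserves earlier order
        strings = [s for bucket in buckets for s in bucket]
    return strings
```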

7- Bubble Sort .

It works like this: we compare every element with the element next to it; if it is bigger than the next element, we swap the two, and either way we continue to the next pair. The number of comparisons bubble sort makes is not affected by the values in the data, so the analyses for worst case, best case and average case are all the same and depend only on the for loops in the pseudocode. Clearly, the outer loop runs n times. The only complexity in this analysis is the inner loop. If we think about a single time the inner loop runs, we can get a simple bound by noting that it can never loop more than n times. Since the outer loop makes the inner loop complete n times, the comparison can't happen more than O(n^2) times.
--------------------------------------------------------------------------------------
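The two nested for loops the analysis refers to look like this in Python (a minimal sketch; shrinking the inner loop by `i` skips the elements already bubbled into place, but the bound stays O(n^2)):

```python
def bubble_sort(a):
    """Bubble sort: repeatedly swap adjacent out-of-order pairs."""
    n = len(a)
    for i in range(n):                 # outer loop runs n times
        for j in range(n - 1 - i):     # inner loop never runs more than n times
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]   # swap and continue
    return a
```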

8- Counting Sort.
The algorithm proceeds by defining an ordering relation between the items from which the set to be sorted is derived (for a set of integers, this relation is trivial). Let the set to be sorted be called A. Then an auxiliary array B is defined, with size equal to the number of possible key values. For each element e in A, the algorithm stores in B(e) the number of items in A smaller than or equal to e. If the sorted set is to be stored in an array C, then for each e in A, taken in reverse order, C[B(e)] = e; after each such step, the value of B(e) is decremented. The algorithm makes two passes over A and one pass over B. If the size of the range k is at most the size of the input n, the time complexity is O(n + k) = O(n).
--------------------------------------------------------------------------------------
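A Python sketch of this B(e)-counting formulation, assuming integer keys in the range 0..k-1 (the decrement happens before the store here, which is the 0-indexed equivalent of "place, then decrement"):

```python
def counting_sort(A, k):
    """Counting sort for integers in 0..k-1 using prefix counts B(e)."""
    B = [0] * k
    for e in A:                        # first pass over A: tally each value
        B[e] += 1
    for v in range(1, k):              # pass over B: B[v] = items <= v
        B[v] += B[v - 1]
    C = [0] * len(A)
    for e in reversed(A):              # reverse order keeps the sort stable
        B[e] -= 1                      # decrement, then place e at C[B(e)]
        C[B[e]] = e
    return C
```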

9- Bucket Sort .
First, the algorithm divides the input array into buckets; each bucket covers some range of input values (the elements should be uniformly distributed to ensure an even division among buckets). In the second phase, bucket sort orders each bucket using some other sorting algorithm, or by calling itself recursively; if the bucket count equals the range of values, bucket sort degenerates to counting sort. Finally, the algorithm merges all the ordered buckets: because every bucket covers a different range of element values, it simply copies the elements of each bucket into the output array (concatenates the buckets). The asymptotic complexity of bucket sort is O(m * C(n/m)), where n is the size of the input array, m is the number of buckets, and C(x) is the complexity of the inner sorting algorithm.
--------------------------------------------------------------------------------------
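The scatter / sort-each-bucket / concatenate phases can be sketched in Python like this (a sketch assuming the inputs are uniformly distributed in [0, 1); using the built-in `sorted` as the inner sorting algorithm is my own choice):

```python
def bucket_sort(a, num_buckets=10):
    """Bucket sort for values in [0, 1): scatter, sort each bucket, concatenate."""
    buckets = [[] for _ in range(num_buckets)]
    for x in a:
        buckets[int(x * num_buckets)].append(x)  # each bucket covers one range
    result = []
    for bucket in buckets:
        result.extend(sorted(bucket))            # inner sort, then concatenate
    return result
```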

10- Shaker Sort


Shaker sort is a bidirectional version of bubble sort, so every iteration of the algorithm consists of two phases: in the first, the heaviest bubble (the largest remaining element) sinks to the end of the array; in the second, the lightest bubble (the smallest remaining element) rises to the beginning of the array. The time complexity of shaker sort is O(n^2).

Finally, we cannot consider any one sort perfect, fast and effective all the time; we must pick our sorting algorithm depending on the data we are about to sort. Quicksort is a suitable algorithm for most cases, because with a good choice of pivot it runs in O(n log n), but under some conditions, such as sorting short strings, radix sort is a lot faster than quicksort.

Reference: Data Structures and Algorithm Analysis in Java by Mark Allen Weiss.
Websites:
http://cs.utsa.edu/~dj/cs3343/lecture10.html
http://www.sorting-algorithms.com/bubble-sort
http://www.cse.iitk.ac.in/users/dsrkg/cs210/applets/sortingII/countingSort/count.html
http://en.algoritmy.net/article/41160/Bucket-sort
