Quick Sort By Prof Lili Saghafi

Sorting is a classic subject in computer science.
There are three reasons for studying sorting algorithms.
First, sorting algorithms illustrate many creative approaches to problem solving.
Second, sorting algorithms are good for practicing fundamental programming techniques using selection statements, loops, methods, and arrays.
Third, sorting algorithms are excellent examples to demonstrate algorithm performance.

  1. Sorting Algorithms – Algorithms and Data Structures: Quick Sort. By: Prof. Lili Saghafi, proflilisaghafi@gmail.com, @Lili_PLS
  2. Objectives – To study and analyze the time complexity of various sorting algorithms – To design, implement, and analyze selection sort and insertion sort – To design, implement, and analyze bubble sort – To design, implement, and analyze merge sort – To design, implement, and analyze quick sort
  3. Why study sorting? Sorting is a classic subject in computer science. There are three reasons for studying sorting algorithms. – First, sorting algorithms illustrate many creative approaches to problem solving. – Second, sorting algorithms are good for practicing fundamental programming techniques using selection statements, loops, methods, and arrays. – Third, sorting algorithms are excellent examples to demonstrate algorithm performance.
  4. What data to sort? The data to be sorted might be integers, doubles, characters, or objects. For “Sorting Arrays,” for simplicity let's assume that: • the data to be sorted are integers, • the data are sorted in ascending order, and • the data are stored in an array. • The programs can be easily modified to sort other types of data, to sort in descending order, or to sort data in an ArrayList or a LinkedList.
  5. Sorting Arrays • Sorting, like searching, is a common task in computer programming. • Sorting would be used, for instance, if you wanted to display the grades in alphabetical order. • Many different algorithms have been developed for sorting: Selection Sort | Insertion Sort | Bubble Sort | Radix Sort | Merge Sort | Merge two sorted lists | Quick Sort | Partition in quick sort. • Quicksort is a little more complex than selection sort and bubble sort. • It is the best choice for general-purpose, in-memory sorting. • It used to be the standard algorithm for sorting arrays of primitive types in Java.
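
As a point of reference (not taken from the slides), the short demo below sorts a primitive array with the JDK's built-in Arrays.sort, whose implementation for primitives is essentially a tuned quicksort (a dual-pivot quicksort in modern JDKs); the class name and sample values are made up for illustration.

import java.util.Arrays;

public class BuiltInSortDemo {
    public static void main(String[] args) {
        int[] grades = {88, 52, 97, 61, 75};
        // For primitive arrays, Arrays.sort is quicksort-based
        // (a dual-pivot quicksort in current JDKs).
        Arrays.sort(grades);
        System.out.println(Arrays.toString(grades)); // [52, 61, 75, 88, 97]
    }
}
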
  6. Quick Sort • Quick sort, developed by C. A. R. Hoare (1962), works as follows: – The algorithm selects an element, called the pivot, in the array. – It divides the array into two parts such that all the elements in the first part are less than or equal to the pivot and all the elements in the second part are greater than the pivot. – It recursively applies the quick sort algorithm to the first part and then to the second part. (Sir Charles Antony Richard Hoare, 1980 Turing Award.)
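
A minimal Java sketch of this description (illustrative code, not from the original slides; the class and method names are assumptions, and a simple last-element pivot with a Lomuto-style partition stands in for whatever scheme the slides animate):

public class QuickSortSketch {

    /** Sorts a[first..last] in place by the recursive scheme described above. */
    public static void quickSort(int[] a, int first, int last) {
        if (first < last) {
            int p = partition(a, first, last); // pivot lands at index p
            quickSort(a, first, p - 1);        // sort the part <= pivot
            quickSort(a, p + 1, last);         // sort the part > pivot
        }
    }

    // Lomuto-style partition: uses a[last] as the pivot and returns its final
    // index; elements <= pivot end up to its left, larger elements to its right.
    static int partition(int[] a, int first, int last) {
        int pivot = a[last];
        int p = first;
        for (int i = first; i < last; i++) {
            if (a[i] <= pivot) {
                int t = a[i]; a[i] = a[p]; a[p] = t;
                p++;
            }
        }
        int t = a[p]; a[p] = a[last]; a[last] = t;
        return p;
    }
}
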
  7. Quick Sort • Quicksort was honoured as one of the top 10 algorithms of the 20th century in science and engineering. Basic plan: • Shuffle the array. • Partition so that, for some j, – entry a[j] is in place, – no larger entry is to the left of j, – no smaller entry is to the right of j. • Sort each piece recursively.
  8. An important factor in quick sort is the pivot. After partitioning, the pivot needs to satisfy the following condition: every element to its left is less than or equal to it, and every element to its right is greater than it.
  9. In the next step we choose an item from the left and an item from the right, and we need to swap these two. We then repeat: choose an item from the left and an item from the right and, if needed, swap the two. Quicksort is a recursive sort.
  10. We repeat until the item from the left has a greater index than the item from the right, so we know we're done with this pass (scanning LOW toward HIGH; we stop once HIGH is less than LOW). We then stop and swap the element at HIGH with our pivot 3; our pivot is now in its correct spot.
  11. Characteristics of the pivot: quicksort is recursive, and we continue with the larger partition. We choose 7 as our pivot and move it to the end. We stop when the item from the left has a greater index than the item from the right, so we swap 8 and 6, and then we swap 8 and the pivot. Now we have our pivot 7 in the right place, and we leave the recursion to handle the rest.
  12. How do we choose the pivot? Ideally we would choose the median of the array. • One popular method is called median-of-three. • In this method we look at the first, middle, and last elements of the array, sort those three properly, and choose the middle item as our pivot. • We're making the guess that the middle of these three items will be close to the median of the entire array, and as you can see it's usually not too far off.
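
A small sketch of median-of-three pivot selection (illustrative helper methods that could sit inside a quicksort class such as the QuickSortSketch above; the names are assumptions):

// Sort a[first], a[mid], a[last] among themselves, then report the middle
// index; the value there is the median of the three sampled elements.
static int medianOfThree(int[] a, int first, int last) {
    int mid = first + (last - first) / 2;
    if (a[mid] < a[first]) swap(a, mid, first);
    if (a[last] < a[first]) swap(a, last, first);
    if (a[last] < a[mid]) swap(a, last, mid);
    return mid;
}

static void swap(int[] a, int i, int j) {
    int t = a[i];
    a[i] = a[j];
    a[j] = t;
}

A quicksort using this would partition around a[medianOfThree(a, first, last)] instead of a fixed end element.
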
  13. Why does it work? • On the partition step, the algorithm divides the array into two parts, and every element a from the left part is less than or equal to every element b from the right part. • Also, a and b satisfy the inequality a ≤ pivot ≤ b. • After the recursive calls complete, both of the parts become sorted and, taking into account the arguments stated above, the whole array is sorted.
  14. Quick Sort Partition – How it works: • In the beginning, the first element is the PIVOT, the element after the pivot is LOW, and the last element is HIGH. • Is list[LOW] less than or equal to the PIVOT? If yes, move LOW forward to the next element. – Is list[HIGH] greater than the PIVOT? If yes, move HIGH backward one element. – When list[LOW] is greater than the PIVOT and list[HIGH] is less than or equal to the PIVOT, we swap the low and high elements. – Repeat until LOW and HIGH meet. – Finally, if list[HIGH] is less than the PIVOT, we swap the high element and the pivot. – Now the partition is complete, and the PIVOT splits the array into two subarrays for sorting. • The only thing we care about for now is that all the elements to the right of the pivot are bigger than the pivot. • No actual order is assumed yet.
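
One way to turn this description into code, as a sketch under the scheme stated on the slide (first element as pivot, LOW/HIGH scanning toward each other); the exact boundary handling is an assumption, and the method is written as a drop-in replacement for the partition method in the QuickSortSketch class sketched earlier:

/** Partition list[first..last] around list[first] and return the pivot's final index. */
static int partition(int[] list, int first, int last) {
    int pivot = list[first];  // first element is the pivot
    int low = first + 1;      // index just after the pivot
    int high = last;          // last index

    while (high > low) {
        // Move LOW forward while list[low] <= pivot.
        while (low <= high && list[low] <= pivot) low++;
        // Move HIGH backward while list[high] > pivot.
        while (low <= high && list[high] > pivot) high--;
        // Swap the out-of-place pair.
        if (high > low) {
            int temp = list[high];
            list[high] = list[low];
            list[low] = temp;
        }
    }

    // Skip over elements that are >= pivot.
    while (high > first && list[high] >= pivot) high--;

    // Place the pivot at its final position if a smaller element was found.
    if (pivot > list[high]) {
        list[first] = list[high];
        list[high] = pivot;
        return high;
    } else {
        return first;
    }
}
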
  15. How Quick Sort works • The pivot is determined. • All elements less than the PIVOT go into one subarray. • All elements greater than the PIVOT go into the other subarray. • Quicksort is recursively applied to the first subarray until all its elements are sorted. • Quicksort is recursively applied to the second subarray until all its elements are sorted. • Now the entire array/list is sorted.
  16. Analysis • Each time, we repeat the process on the left and right sides of the array: we choose a pivot, do the comparisons, and create another level of left and right subarrays. • This recursion continues until we have divided the overall array into subarrays of length 1. • At that point we know the array is sorted, because every element has, at some point, been a pivot. • In other words, for every element, all the numbers to its left have lesser values and all the numbers to its right have greater values.
  17. Analysis continued… • This method works very well if the value of the pivot chosen is approximately in the middle of the range of the list values. • This would mean that, after we move elements around, there are about as many elements to the left of the pivot as there are to the right. • The divide-and-conquer nature of the quicksort algorithm is then taken full advantage of. RUNTIME COMPLEXITY • Best case: split in the middle, Θ(n log n). • Worst case: a sorted array! Θ(n²). • Average case: random arrays, Θ(n log n). Considered the method of choice for internal sorting of large files (n ≥ 10000).
  18. Runtime • This gives a runtime of O(n log n): the n because we have to do n − 1 comparisons on each generation, and the log n because we have to divide the list log n times. • However, in the worst case, this algorithm can actually be O(n²). • Suppose that on each generation the pivot just so happens to be the smallest or the largest of the numbers we're sorting. • This would mean breaking down the list n times and making n − 1 comparisons every single time. Thus, the runtime is Big O of n squared.
  19. The importance of good algorithms
  20. Hoare's Partitioning Algorithm
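
This slide's figure is not reproduced in the transcript. For orientation only, here is a minimal sketch of Hoare's partitioning scheme (illustrative code and name, with the middle element used as the pivot value):

// Hoare partition: two indices scan toward each other and swap out-of-place
// pairs. Returns an index j such that every element of a[first..j] is <= every
// element of a[j+1..last]; the pivot value itself need not land exactly at j.
static int hoarePartition(int[] a, int first, int last) {
    int pivot = a[first + (last - first) / 2];
    int i = first - 1;
    int j = last + 1;
    while (true) {
        do { i++; } while (a[i] < pivot);
        do { j--; } while (a[j] > pivot);
        if (i >= j) return j;
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }
}

With this scheme the recursive calls are quickSort(a, first, j) and quickSort(a, j + 1, last), rather than excluding the pivot index as in the earlier sketch.
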
  21. Pseudocode for Quicksort
  22. Quick Sort – RunQuickSort
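
The run itself is not shown in the transcript; below is a hypothetical driver in the same spirit, calling the QuickSortSketch class sketched earlier (the class name and sample data are assumptions):

public class RunQuickSort {
    public static void main(String[] args) {
        int[] list = {2, 3, 2, 5, 6, 1, -2, 3, 14, 12};
        QuickSortSketch.quickSort(list, 0, list.length - 1);
        for (int value : list) {
            System.out.print(value + " ");
        }
        // Prints: -2 1 2 2 3 3 5 6 12 14
    }
}
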
  23. References • Algorithms and Data Structures with Implementations in Java and C++. • Y. Daniel Liang (Armstrong State University). Introduction to Programming with C++, 3/E. • C++ Without Fear: A Beginner's Guide That Makes You Feel Smart, 3/E. • Maureen Sprankle, Jim Hubbard. Problem Solving and Programming Concepts, 9/E. • C++ Language tutorial: http://www.cplusplus.com/doc/tutorial/ • Cormen, Leiserson, Rivest. Introduction to Algorithms. (Theory) • Aho, Hopcroft, Ullman. Data Structures and Algorithms. (Theory) • Robert Lafore. Data Structures and Algorithms in Java. (Practice) • Mark Allen Weiss. Data Structures and Problem Solving Using C++. (Practice)
  24. Sorting Algorithms – Any questions? Algorithms and Data Structures: Quick Sort. By: Prof. Lili Saghafi, proflilisaghafi@gmail.com, @Lili_PLS
  25. Quicksort • Quicksort has a worst-case time complexity of Big O of n squared. • If the pivot is chosen properly, it can be shown to have an average case of Big O of n log n. Improvements: 1. better pivot selection: median-of-three partitioning; 2. switching to insertion sort on small subfiles; 3. elimination of recursion. These combine to give a 20-25% improvement.
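
A sketch of improvement 2, switching to insertion sort on small subfiles (the cutoff value and method names are illustrative assumptions; these methods would sit alongside a partition method such as the one sketched after slide 14):

// Subarrays at or below this length are handed to insertion sort.
static final int CUTOFF = 10;

static void quickSortWithCutoff(int[] a, int first, int last) {
    if (last - first + 1 <= CUTOFF) {
        insertionSort(a, first, last);   // cheap on short, nearly sorted runs
        return;
    }
    int p = partition(a, first, last);   // any correct partition scheme
    quickSortWithCutoff(a, first, p - 1);
    quickSortWithCutoff(a, p + 1, last);
}

// Straight insertion sort on a[first..last].
static void insertionSort(int[] a, int first, int last) {
    for (int i = first + 1; i <= last; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= first && a[j] > key) {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}
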
  26. How can we improve the method? • One good way to improve the method is to cut down on the probability that the runtime is ever actually O of n squared. • Remember, this worst-case scenario can only happen when the pivot chosen is always the highest or lowest value in the array. • To make this less likely to happen, we can find the pivot by choosing multiple elements and taking the median value, as described earlier.
  27. Quicksort • Quicksort is a fast sorting algorithm which is not only used for educational purposes but is also widely applied in practice. • On average, it has O(n log n) complexity, making quicksort suitable for sorting big data volumes.
  28. Algorithm • The idea of the algorithm is quite simple: quicksort uses the divide-and-conquer strategy. The recursion step is described below. • Choose a pivot value. We take the value of the middle element as the pivot value, but it can be any value that lies in the range of the sorted values, even if it is not present in the array. • Partition. Rearrange the elements in such a way that all elements less than the pivot go to the left part of the array and all elements greater than the pivot go to the right part. Values equal to the pivot can stay in either part. Notice that the array may be divided into unequal parts. • Sort both parts. Apply the quicksort algorithm recursively to the left and right parts.
  29. Complexity analysis • On average, quicksort has O(n log n) complexity. • In the worst case, quicksort runs in O(n²) time. • On most “practical” data it works just fine and outperforms other O(n log n) sorting algorithms.
  30. Worst-Case Time 1. To partition an array of n elements takes n comparisons and n moves in the worst case. 2. In the worst case, each time the pivot divides the array into one big subarray, with the other empty. 3. The size of the big subarray is one less than that of the array just divided. 4. The algorithm therefore requires time (n − 1) + (n − 2) + ... + 2 + 1 = O(n²). (Recall: if on each generation the pivot just so happens to be the smallest or the largest of the numbers we're sorting, we break down the list n times and make n − 1 comparisons every single time, so the runtime is Big O of n squared.)
  31. Best-Case Time 1. In the best case, each time the pivot divides the array into two parts of about the same size. 2. Let T(n) denote the time required for sorting an array of n elements using quick sort. 3. So T(n) = T(n/2) + T(n/2) + n.
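
Unrolling this recurrence shows where the n log n comes from (a standard expansion, assuming for simplicity that n is a power of 2):

\begin{aligned}
T(n) &= 2\,T(n/2) + n \\
     &= 4\,T(n/4) + 2n \\
     &= 8\,T(n/8) + 3n \\
     &\;\;\vdots \\
     &= 2^k\,T(n/2^k) + k\,n \\
     &= n\,T(1) + n\log_2 n \qquad (\text{taking } 2^k = n) \\
     &= O(n \log n).
\end{aligned}
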
  32. Average-Case Time 1. On average, each time the pivot will divide the array into two parts that are neither exactly the same size nor one of them empty. 2. Statistically, the sizes of the two parts are very close. 3. So the average time is O(n log n). 4. Be reminded that the exact average-case analysis is beyond the scope of this course. (Recall: we have to do about n − 1 comparisons on each generation, and there are log n generations because we have to divide the list log n times.)
