Quick Sort

Quick sort partitions the array around one element and then sorts each part recursively.

  • Pick one element as the pivot
  • Move all elements less than the pivot to the left, and all elements greater than the pivot to the right
  • Apply the above steps to both parts recursively

The following code implements quick sort. A recursive method sorts subarrays, and a separate method partitions a subarray into two parts.

public static void quickSort(int[] data, int first, int last)
{
    if (first >= last)
        return;

    int pivot = partition(data, first, last);
    quickSort(data, first, pivot - 1); // sort the left part
    quickSort(data, pivot + 1, last); // sort the right part
}

The partitioning process involves picking a pivot and moving elements around it. A simple procedure is as follows:

  • Allocate a new temporary array to hold the partitioned result
  • Pick the first element as the pivot
  • Scan the array from the second element, compare each element with the pivot, and put it at the left end of the temporary array if it is less than or equal to the pivot, otherwise at the right end
  • Finally, copy the result from the temporary array back to the original array

public static int partition(int[] data, int first, int last)
{
    int[] temp = new int[last - first + 1];
    int pivot = data[first];
    int i = 0, j = last - first, k;

    for (k = first + 1; k <= last; k++)
    {
        if (data[k] <= pivot)
            temp[i++] = data[k];
        else
            temp[j--] = data[k];
    }
    temp[i] = data[first];

    // Copy data back into original array
    for (k = first; k <= last; k++)
        data[k] = temp[k - first];
    return first + i;
}
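Putting quickSort together with this partition method gives a complete program. The following self-contained sketch (the class name QuickSortDemo and the sample data are illustrative, not from the text) sorts a small array and prints the result:

```java
import java.util.Arrays;

public class QuickSortDemo
{
    public static void quickSort(int[] data, int first, int last)
    {
        if (first >= last)
            return;

        int pivot = partition(data, first, last);
        quickSort(data, first, pivot - 1); // sort the left part
        quickSort(data, pivot + 1, last);  // sort the right part
    }

    // Temp-array partitioning, as described above
    public static int partition(int[] data, int first, int last)
    {
        int[] temp = new int[last - first + 1];
        int pivot = data[first];
        int i = 0, j = last - first;

        for (int k = first + 1; k <= last; k++)
        {
            if (data[k] <= pivot)
                temp[i++] = data[k]; // left end of temp
            else
                temp[j--] = data[k]; // right end of temp
        }
        temp[i] = data[first]; // pivot goes into the gap

        // Copy data back into the original array
        for (int k = first; k <= last; k++)
            data[k] = temp[k - first];
        return first + i;
    }

    public static void main(String[] args)
    {
        int[] data = {5, 2, 9, 1, 5, 6};
        quickSort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data)); // [1, 2, 5, 5, 6, 9]
    }
}
```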

The method above requires extra storage (linear space) to hold the intermediate result. The following is an in-place version of partitioning that does not require additional storage:

  • Pick the first element in the array as the pivot
  • Scan the array from both ends toward the middle
  • Whenever two elements are found on the wrong sides, swap them
  • When the scans from both ends meet in the middle, swap the pivot with this middle element

public static int partition(int[] data, int first, int last)
{
    int pivot = data[first];
    int left = first, right = last;

    while (left < right)
    {
        // Find an element bigger than the pivot from the left
        while (left < right && data[left] <= pivot)
            left++;
        // Find an element smaller than or equal to the pivot from the right
        while (data[right] > pivot)
            right--;
        // Swap the two elements found
        if (left < right)
            swap(data, left, right);
    }

    // Move the pivot element to the middle
    swap(data, first, right);
    return right;
}

// Exchange the elements at positions a and b
public static void swap(int[] data, int a, int b)
{
    int t = data[a];
    data[a] = data[b];
    data[b] = t;
}
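To see what one in-place pass does, the following self-contained sketch (the class name PartitionDemo and the sample data are illustrative) partitions a small array once and prints the pivot's final index and the rearranged array:

```java
import java.util.Arrays;

public class PartitionDemo
{
    // Exchange the elements at positions a and b
    private static void swap(int[] data, int a, int b)
    {
        int t = data[a];
        data[a] = data[b];
        data[b] = t;
    }

    // In-place partitioning around data[first]
    public static int partition(int[] data, int first, int last)
    {
        int pivot = data[first];
        int left = first, right = last;

        while (left < right)
        {
            while (left < right && data[left] <= pivot)
                left++;
            while (data[right] > pivot)
                right--;
            if (left < right)
                swap(data, left, right);
        }

        swap(data, first, right);
        return right;
    }

    public static void main(String[] args)
    {
        int[] data = {4, 7, 1, 9, 3, 2};
        int p = partition(data, 0, data.length - 1);
        // The pivot 4 ends at index 3; everything smaller is to its
        // left and everything larger is to its right
        System.out.println("pivot index: " + p);          // pivot index: 3
        System.out.println(Arrays.toString(data));        // [3, 2, 1, 4, 9, 7]
    }
}
```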

The pivot in the partition() method does not guarantee an even partitioning. This is different from merge sort, where we divide the array exactly in the middle.

  • If the pivot chosen each time is the median element, then the partitioning is even, and there are O(log n) levels of partitioning
  • If the pivot chosen each time is either the smallest or the largest element (bad luck every time), then one part is empty and the other holds all elements except the pivot itself. This generates n levels of partitioning, which is essentially selection sort.
  • If the pivot is chosen randomly each time, then on average the partitioning is even, and the number of levels is close to O(log n).
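The random-pivot strategy in the last bullet can be grafted onto either partition method by swapping a randomly chosen element into the first position before partitioning. A sketch, reusing the in-place partitioning (the helper name randomizedPartition and the class name are assumptions, not from the text):

```java
import java.util.Arrays;
import java.util.Random;

public class RandomPivotQuickSort
{
    private static final Random RAND = new Random();

    private static void swap(int[] data, int a, int b)
    {
        int t = data[a];
        data[a] = data[b];
        data[b] = t;
    }

    // In-place partitioning around data[first], as in the text
    private static int partition(int[] data, int first, int last)
    {
        int pivot = data[first];
        int left = first, right = last;

        while (left < right)
        {
            while (left < right && data[left] <= pivot)
                left++;
            while (data[right] > pivot)
                right--;
            if (left < right)
                swap(data, left, right);
        }

        swap(data, first, right);
        return right;
    }

    // Move a random element into the pivot position, then partition as usual
    private static int randomizedPartition(int[] data, int first, int last)
    {
        int r = first + RAND.nextInt(last - first + 1);
        swap(data, first, r);
        return partition(data, first, last);
    }

    public static void quickSort(int[] data, int first, int last)
    {
        if (first >= last)
            return;

        int pivot = randomizedPartition(data, first, last);
        quickSort(data, first, pivot - 1);
        quickSort(data, pivot + 1, last);
    }

    public static void main(String[] args)
    {
        int[] data = {8, 3, 5, 1, 9, 2, 7};
        quickSort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data)); // [1, 2, 3, 5, 7, 8, 9]
    }
}
```

Whatever pivots the random choices produce, the result is the same sorted array; only the expected running time is affected.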

At each level of the partitioning, moving elements around the pivots takes linear time in total. Thus

  • In the average case, quick sort takes \(O(n \log n)\) time.
  • In the worst case, quick sort takes \(O(n^2)\) time.

Summary of the running time of different sorting algorithms:

  • Selection, insertion and bubble sort run in \(O(n^2)\) time, in both the worst and average case.
  • Merge sort runs in \(O(n \log n)\) time, both worst and average case.
  • Quick sort runs in \(O(n \log n)\) time for the average case, and \(O(n^2)\) for the worst case.

In-Place Sorting

A sorting algorithm is called in-place if it uses at most O(1) extra storage. Insertion, selection and bubble sort are in-place; they require only constant extra storage. Merge sort requires additional storage (linear space) for the temporary result, although this storage can be reused at each merge.

The storage used by quick sort is more subtle. The in-place version of partitioning requires only constant space. However, quick sort uses recursion, and on average there are \(O(\log n)\) levels of recursive calls, so the call stack uses logarithmic space, even though this space is not explicitly allocated.
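A common refinement, not required by the text but relevant to the stack-space point, bounds the call stack at O(log n) even for worst-case pivots: recurse only into the smaller part and loop on the larger one. A sketch under that assumption (the class name is illustrative):

```java
import java.util.Arrays;

public class QuickSortSmallStack
{
    private static void swap(int[] data, int a, int b)
    {
        int t = data[a];
        data[a] = data[b];
        data[b] = t;
    }

    // In-place partitioning around data[first], as in the text
    private static int partition(int[] data, int first, int last)
    {
        int pivot = data[first];
        int left = first, right = last;

        while (left < right)
        {
            while (left < right && data[left] <= pivot)
                left++;
            while (data[right] > pivot)
                right--;
            if (left < right)
                swap(data, left, right);
        }

        swap(data, first, right);
        return right;
    }

    // Recurse into the smaller part and iterate on the larger part.
    // Each recursive call covers at most half the current range, so
    // the stack depth is O(log n) even when the pivots are bad.
    public static void quickSort(int[] data, int first, int last)
    {
        while (first < last)
        {
            int pivot = partition(data, first, last);
            if (pivot - first < last - pivot)
            {
                quickSort(data, first, pivot - 1); // smaller left part
                first = pivot + 1;                 // loop on the right part
            }
            else
            {
                quickSort(data, pivot + 1, last);  // smaller right part
                last = pivot - 1;                  // loop on the left part
            }
        }
    }

    public static void main(String[] args)
    {
        int[] data = {6, 4, 8, 2, 9, 1, 5};
        quickSort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data)); // [1, 2, 4, 5, 6, 8, 9]
    }
}
```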
