Quick Sort in C


Similar to merge sort in C, quick sort in C follows the principle of divide and conquer. The quicksort algorithm is a sorting algorithm that works by selecting a pivot element and then partitioning the number set, or array, around that pivot.

Also known as partition-exchange sort, quicksort was developed by Tony Hoare, a British computer scientist, in 1959. Since its publication in 1961, quicksort has become one of the most widely used sorting algorithms.

The Quicksort Process

Once the partitioning is done, the group of elements smaller than the pivot element is placed before it, and the remaining, greater ones, after it. There are several variations of the quick sort algorithm, depending on what kind of element (or number) is selected as the pivot:

  • The first element as the pivot
  • The last element as the pivot
  • A random element as the pivot
  • Median as the pivot

The main process in a quicksort algorithm is partitioning. If x is the pivot in an array, then the main intent of the sorting process is to put x at the right position in a sorted array, such that smaller elements precede x and greater elements follow it.

Quick Sort in C Program

The following program demonstrates quick sort in C. It asks the user to enter up to 25 elements and then prints them in sorted order:

#include <stdio.h>

void quicksort(int number[25], int first, int last) {
    int i, j, pivot, temp;
    if (first < last) {
        pivot = first;
        i = first;
        j = last;
        while (i < j) {
            while (number[i] <= number[pivot] && i < last)
                i++;
            while (number[j] > number[pivot])
                j--;
            if (i < j) {
                temp = number[i];
                number[i] = number[j];
                number[j] = temp;
            }
        }
        temp = number[pivot];
        number[pivot] = number[j];
        number[j] = temp;
        quicksort(number, first, j - 1);
        quicksort(number, j + 1, last);
    }
}

int main() {
    int i, count, number[25];
    printf("Enter some elements (Max. - 25): ");
    scanf("%d", &count);
    printf("Enter %d elements: ", count);
    for (i = 0; i < count; i++)
        scanf("%d", &number[i]);
    quicksort(number, 0, count - 1);
    printf("The Sorted Order is: ");
    for (i = 0; i < count; i++)
        printf(" %d", number[i]);
    return 0;
}

Sample Output:

Enter some elements (Max. - 25): 5

Enter 5 elements: 5 22 -19 63 1

The Sorted Order is: -19 1 5 22 63

Another Quick Sort in C Program

Here is another C program that demonstrates quick sorting. It takes the last element of the number set as the pivot and doesn’t ask for input:

#include <stdio.h>

void swap(int* a, int* b)
{
    int t = *a;
    *a = *b;
    *b = t;
}

int partition(int arr[], int low, int high)
{
    int pivot = arr[high];
    int i = (low - 1);
    for (int j = low; j <= high - 1; j++) {
        if (arr[j] <= pivot) {
            i++;
            swap(&arr[i], &arr[j]);
        }
    }
    swap(&arr[i + 1], &arr[high]);
    return (i + 1);
}

void quickSort(int arr[], int low, int high)
{
    if (low < high) {
        int pi = partition(arr, low, high);
        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}

void printArray(int arr[], int size)
{
    int i;
    for (i = 0; i < size; i++)
        printf("%d ", arr[i]);
    printf("\n");
}

int main()
{
    int arr[] = {22, 17, -8, 9, 11, 5};
    int n = sizeof(arr)/sizeof(arr[0]);
    quickSort(arr, 0, n - 1);
    printf("The sorted array is:\n");
    printArray(arr, n);
    return 0;
}


Sample Output:

The sorted array is:

-8 5 9 11 17 22

Analysis of Quicksort Algorithm (Time Complexity)

The time required by the quicksort algorithm for sorting a total of n numbers is represented by the following equation:

T(n) = T(k) + T(n-k-1) + Θ(n) → (i)

T(k) and T(n-k-1) represent the two recursive calls in the quicksort algorithm. The last term, Θ(n), represents the partition process, while k is the number of elements in the set that are smaller than the pivot.

Note that the total time taken by a quicksort algorithm to complete is dependent on the input array as well as the partition strategy deployed.

We’ll calculate the efficiency of the quicksort algorithm for 3 different cases:

  • Worst Case – When the partition process always picks either the smallest or the greatest element as the pivot, it is considered to be the worst case of a quicksort algorithm.
    In the aforementioned quick sort in C program, for instance, where the last element is selected as the pivot point, the worst case occurs if the array is already sorted.
    Equation (i) then becomes:
    T(n) = T(0) + T(n-1) + Θ(n)
    which can be written as:
    T(n) = T(n-1) + Θ(n)
    Hence, the worst case time complexity is O(n²).
  • Average Case – Any case of quicksort that doesn’t belong to either the worst case or the best case is an average case.
    To carry out an average case analysis of quicksort, we would need to consider every possible permutation of the given array and calculate the time taken by each one, which is obviously a very complex process.
    A way around this problem is to consider the case when the partitioning process puts n/10 elements in one set and 9n/10 elements in the other.
    Equation (i) then becomes:
    T(n) = T(n/10) + T(9n/10) + Θ(n)
    The solution to this recurrence relation is T(n) = Θ(n log n).
    Hence, the average case time complexity is O(n log n).
  • Best Case – The best case of quicksort occurs when the partitioning process always picks the middle element as the pivot. Equation (i) then becomes:
    T(n) = 2T(n/2) + Θ(n)
    Using case 2 of the Master Theorem, T(n) = Θ(n log n).
    Hence, the best case time complexity is O(n log n).

Quicksort vs. The World (A Comparison with other sorting algorithms)

The worst case time complexity of quicksort is O(n²). Clearly, this is worse than the O(n log n) worst case offered by other popular sorting algorithms, mainly merge sort and heap sort. Yet, in practice, quicksort is usually faster than its competitors.

When carefully implemented, quicksort can be as much as two to three times faster than merge sort and heap sort. This is because the inner loop of the quicksort algorithm can be implemented very efficiently on most system architectures for most kinds of real-world data.

Moreover, quicksort can be implemented in different ways simply by changing the choice of the pivot, so the worst case rarely occurs for any given data. Nonetheless, merge sort is the better option when the data set is very large and stored in external storage.

Although bubble sort is not a direct competitor to quicksort, it can be considered in scenarios where simplicity is the main priority, since bubble sort is far simpler to implement than the comparatively complex quicksort.

Compared to selection sort, quicksort is similar in some respects: both sort in place, and efficient implementations of both are unstable. The key difference is that quicksort does not always fall into its worst-case partition, whereas selection sort always performs O(n²) comparisons.

Program Completion!

So, that was all about quick sort in C. Sorting is one of the fundamental concepts of programming, so it is very important to know it well.

The programs detailed in this article are two of the several possible implementations of the quicksort algorithm. Learning works best when you experiment on your own.

So, go and try it for yourself! You can consult other resources and some of the best C (and C++) books to understand the concept even better and further polish your programming skills as well.

If you have some interesting way of implementing the quicksort algorithm, please brag about it in the comments. The community would love to hear it from you.


20 Comments

  1. Marlin Mccallum February 15, 2019 @ 11:52 am

    How does quick sort work?

    • Jamar Hyland February 20, 2019 @ 2:46 pm

      Quick sort works on the divide and conquer technique. It picks an element from the given list or array as the pivot and partitions the array around that pivot. Each element is then compared with the pivot: if it is smaller, it is placed before the pivot, and if it is greater, it is placed after the pivot.

  2. Lara Mccreary February 15, 2019 @ 12:07 pm

    Is quick sort an in-place algorithm?

    • Willian Crider February 20, 2019 @ 2:47 pm

      The in-place idea is about space efficiency. Yes, quick sort is an in-place algorithm: apart from the recursion stack, it does not need to allocate any additional memory.

  3. Sharda Halcomb February 15, 2019 @ 12:08 pm

    What is the program of quick sort which has its median as its pivot?

    • Alphonso Longoria February 20, 2019 @ 2:48 pm

      That would be the best case of quick sort, in which the partition process always picks the middle element as the pivot. The recurrence for the best case solves to O(n log n).

  4. Shantell Donaldson February 15, 2019 @ 12:08 pm

    How does randomized quick sort perform better than the standard quick sort algorithm? Why should anybody care about a sorting method that is not optimal?

  5. Matthew Phelan February 15, 2019 @ 12:09 pm

    What will be the complexity of quick sort if array is already sorted?

    • Hong Loveless February 20, 2019 @ 2:49 pm

      Regarding randomized quick sort: the randomized version makes an expected O(n log n) total number of comparisons, a property the deterministic version doesn’t enjoy, as is widely known. With randomization, the distribution of the input order no longer matters, which makes it a great idea to use, especially when the data is very large. And quick sort is still worth caring about: its average case is O(n log n), which no comparison-based sort can beat asymptotically.

    • Gail Quintero February 20, 2019 @ 2:50 pm

      This is the worst case of quicksort, O(n²). It happens when the input array is already sorted or reverse sorted and either the first or the last element is picked as the pivot.

  6. Donita Gamble February 15, 2019 @ 12:09 pm

    How can I find the median of two sorted arrays?

    • Dillon Bolen February 20, 2019 @ 2:51 pm

      If an array has N elements and N is odd, the median is at index (N-1)/2; when the total count is even, the median is the average of the two middle elements.
      Let’s understand this with an example. Take two sorted arrays:
      ar1[] = {1,12,15,26,38}
      ar2[] = {2,13,17,30,45}
      After merging the two arrays we get:
      {1,2,12,13,15,17,26,30,38,45}
      The middle elements are 15 and 17, so the median is (15+17)/2, which is equal to 16.

  7. Kristle Rosado February 15, 2019 @ 12:11 pm

    Which is faster: quick sort or bubble sort, and why?

    • Carmine Garris February 20, 2019 @ 2:51 pm

      Quick sort is faster than bubble sort: bubble sort has an average case complexity of O(n²), while quick sort averages O(n log n). That said, bubble sort can still be fast for small data sets of around 10 or even 100 elements.

  8. Lu Wentworth February 15, 2019 @ 12:11 pm

    What are the advantages and disadvantages of quicksort?

    • Dominique Somers February 20, 2019 @ 2:52 pm

      Advantages:
      • High performance and efficiency
      • In-place sorting algorithm, so no extra memory is required
      • In practice, often faster than other O(n log n) algorithms such as merge sort

      Disadvantages:
      • Unstable
      • O(n²) performance in the worst case

  9. Jennefer Lombardi February 15, 2019 @ 12:12 pm

    What are the real time applications of quick sort?

    • Darrick Emanuel February 20, 2019 @ 2:53 pm

      • Defense
      • Medical monitoring
      • Life support in aircraft and space crafts
      • Monitoring and control in industrial and research plants
      • Handling dangerous materials
      • Control for aircraft etc

  10. Nieves Stackhouse February 15, 2019 @ 12:13 pm

    What is the best way to choose a pivot element in Quicksort?

    • Thurman Alicea February 20, 2019 @ 2:53 pm

      For the fastest quicksort, a good way to choose pivots is to use 2 pivot elements (dual-pivot quicksort) or 3 pivot elements (3-pivot quicksort).