Analyzing the Quicksort Algorithm

I haven’t done a math post in a long while, and a probability post ever, so why not have a twofer?

First a little probability intro:

Sometimes we want to find the average value, usually called the “expected value”, of some random process. Be careful not to confuse this with a different, perhaps more intuitive question: which outcome is most likely? The expected value is the average of the outcomes weighted by their probabilities, and it need not be the most probable outcome, or even a possible outcome at all. For instance, you might want to compute the average value of a poker hand.

In the single random variable case, the expected value looks like this:

E(X) = \sum\limits_i {x_i  \cdot P(X_i)}

…where x_i is the numerical value of the random outcome X_i, and P(X_i ) is the probability that that outcome will occur.
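As a concrete example (a hypothetical one of mine, not part of the derivation below), here is that formula applied to one roll of a fair six-sided die:

```python
# Expected value of a fair six-sided die:
# E(X) = sum of x_i * P(X_i) over all outcomes.
outcomes = [1, 2, 3, 4, 5, 6]
prob = 1 / 6  # every face is equally likely

expected = sum(x * prob for x in outcomes)
print(expected)  # → 3.5
```

The answer, 3.5, is not itself a possible roll, which underlines that the expected value is a probability-weighted average rather than the most likely outcome.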

Now let’s turn to the Quicksort algorithm. You can find more here, but I’ll explain the basic idea.

Quicksort, as its name implies, is an algorithm that will sort a list of things. In reality, it can sort a variety of different objects according to a variety of rules, but for simplicity we’ll do one that puts a list of numbers in increasing order.

Here’s how a random numerical Quicksort works. Start with a list of distinct numbers in some arbitrary order (the order matters; it’s what the algorithm will fix). So let’s make this our list:

\{ 7,2,6,8,1,9,12,3\}

Take one of these values at random, and fix its position. This will separate our list into two sublists. Say we pick 8. Then, put all of the values less than 8 into the list to the left of 8 (by convention) and all of those greater than 8 in the list to the right. In other words, compare the chosen value with all of the unchosen values. That gives:

\{ 7,2,6,1,3\},8,\{9,12\}

Now, take the leftmost list, and if it contains more than one value, choose a value at random from it and repeat this process. Say we choose 2. That gives:

\{1\},2,\{7,6,3\},8,\{9,12\}
We can start to see that in a few more iterations, this process will numerically sort the list from smallest to largest.
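The procedure above can be sketched in a few lines of Python (my own minimal illustration, assuming distinct values as in the example list):

```python
import random

def quicksort(values):
    """Randomized Quicksort: pick a random pivot, partition, recurse.

    Assumes the values are distinct, as in the example above.
    """
    if len(values) <= 1:
        return values
    pivot = random.choice(values)
    left = [v for v in values if v < pivot]   # sublist left of the pivot
    right = [v for v in values if v > pivot]  # sublist right of the pivot
    return quicksort(left) + [pivot] + quicksort(right)

print(quicksort([7, 2, 6, 8, 1, 9, 12, 3]))
# → [1, 2, 3, 6, 7, 8, 9, 12]
```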

Something that really bugs computer scientists about algorithms is how fast they run. Speed is an important practical factor (and there are a few others, computational power for instance) when deciding to use one algorithm over another. An algorithm that runs too slowly wastes the time of both the computer and its user. So we’re going to discuss how quick Quicksort actually is, using expected value. In other words, we’re going to derive how many comparisons, on average, Quicksort makes on an arbitrarily large list of numbers.

How do we know when two numbers i and j have been compared, and how can we track it? As usual, we set up an indicator function, which records, for each pair of numbers, whether they get compared. For convenience, we’ll number the numbers (so to speak) by their size: “1” will stand for the smallest number, “2” for the next smallest, and so on. Here’s the indicator function:

F(i,j) = \begin{cases} 1 & \text{if } i \text{ and } j \text{ are compared} \\ 0 & \text{otherwise} \end{cases}

Since our list has n numbers in it, our random variable X will be:

X=\sum\limits_{i = 1}^{n - 1} {\sum\limits_{j = i + 1}^n {F(i,j)} }, where 1 \leq i < j \leq n

It’s easier to see why this summation works if you write it out. For n = 3, for example, X = F(1,2) + F(1,3) + F(2,3): one term for every pair. Each pair that actually gets compared contributes a 1, and each pair that never does contributes a 0, so the sum is the total number of comparisons.

So then it follows that the expected value E(X) is:

E(X)=E(\sum\limits_{i = 1}^{n - 1} {\sum\limits_{j = i + 1}^n {F(i,j)} })=\sum\limits_{i = 1}^{n - 1} {\sum\limits_{j = i + 1}^n {E(F(i,j))} }

(The mapping E can go inside the summation because expectation is linear: the expected value of a sum is the sum of the expected values.)

But, from our definition of expected value and the indicator function, this resolves to:

\sum\limits_{i = 1}^{n - 1} {\sum\limits_{j = i + 1}^n {P(\text{i and j are compared})} }, where again, P is the probability mapping.

Well, the probability that i and j are compared is just \frac{2}{{j - i + 1}}. To see why: i and j get compared exactly when one of them is the first pivot chosen from among the j - i + 1 values i, i+1, \ldots, j, since any value strictly between them chosen first would send i and j to different sublists. Each of those values is equally likely to be chosen first, giving 2 favorable choices out of j - i + 1. Thus we get:

\sum\limits_{i = 1}^{n - 1} {\sum\limits_{j = i + 1}^n {\frac{2}{{j - i + 1}}} }
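Before approximating anything, we can sanity-check this double sum numerically. Here’s a quick experiment of my own in Python: evaluate the sum directly for a modest n, and compare it with comparison counts averaged over actual randomized runs.

```python
import random

def expected_comparisons(n):
    """Evaluate sum over 1 <= i < j <= n of 2 / (j - i + 1) exactly."""
    return sum(2 / (j - i + 1)
               for i in range(1, n)
               for j in range(i + 1, n + 1))

def count_comparisons(values):
    """Run randomized Quicksort on distinct values, counting comparisons."""
    if len(values) <= 1:
        return 0
    pivot = random.choice(values)
    left = [v for v in values if v < pivot]
    right = [v for v in values if v > pivot]
    # the pivot is compared against every other element exactly once
    comparisons = len(values) - 1
    return comparisons + count_comparisons(left) + count_comparisons(right)

n = 200
trials = 500
avg = sum(count_comparisons(random.sample(range(10 * n), n))
          for _ in range(trials)) / trials
print(expected_comparisons(n), avg)  # the two values should be close
```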

Let’s assume, now, that our list contains a lot of numbers. Otherwise, it wouldn’t really matter how fast it went (a list of five numbers is trivial to sort, for instance). Then we can approximate these summations with integrals, and we get:

\int\limits_1^{n - 1} {\int\limits_{i + 1}^n {\frac{2}{{j - i + 1}}}\, dj}\, di \approx 2 \cdot \int\limits_1^{n - 1} {\ln (n - i + 1)\, di}  =

2(n\ln n - n - 2\ln 2 + 2) \approx 2n\ln (n), which is usually written 2n\log(n).

So Quicksort requires on average 2n\log(n) comparisons in order to sort a list n numbers long. Factoring in how much time it takes to make each comparison (which isn’t much), you can easily estimate the average amount of time it takes to sort a list with Quicksort!

So, whipping out my calculator, a list of 5000 numbers would take, on average, about 85172 comparisons. Remember, the pivots are chosen at random, so the number of comparisons will vary from run to run. According to Wikipedia, when Quicksort does its worst, it takes O(n^2) comparisons (more on Big-O notation here), whereas the average is O(n\log{n}). (Remember, \log{n} increases MUCH more slowly than n, which means O(n^2) is horrible in comparison!)
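That calculator step is easy to reproduce (a back-of-the-envelope check of mine, with the worst case alongside for contrast):

```python
import math

n = 5000
average = 2 * n * math.log(n)  # expected comparisons, 2n ln n
worst = n ** 2                 # worst-case order of growth, n^2

print(round(average))  # → 85172
print(worst)           # → 25000000
```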

