
AVERAGE TIME COMPLEXITY OF QUICKSORT

CLAIM :- The average-case time complexity of quicksort is O(n·log(n))

PROOF:-

The recurrence relation for the above code is, in general, given by:

T(n) = O(n) + (1/n) · Σ_{i=0}^{n−1} [T(i) + T(n−i−1)]    {n is the size of the list}……………..(1)

The O(n) term accounts for moving the two pivots and making comparisons and swaps, which takes about O(n) time per call.

#There is an equal chance of pivot i being at any index from 0 to n−1, because the data in the list is randomly distributed; hence the probability of pivot i landing at any given index is 1/n. Once the index for i is decided, pivot j can occupy only one possible place, i.e. i−1. Hence, by the fundamental principle of counting, the total probability for this arrangement is 1/n.

#Corresponding to this pivot arrangement, two sublists of sizes i and n−i−1 are generated; sorting them contributes T(i) and T(n−i−1) to the running time.

#By probability theory, the mean or average over this random distribution can be written as above.
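The partitioning described above can be sketched in Python. This is a hypothetical reconstruction, since the original code is not shown here; the function name `quicksort` and the three-way partition are assumptions:

```python
import random

def quicksort(lst):
    """Randomized quicksort sketch: the pivot is equally likely to be
    any element, matching the 1/n probability used in the proof."""
    if len(lst) <= 1:           # base case: nothing to sort, constant cost
        return lst
    pivot = random.choice(lst)  # uniform pivot choice -> any index with probability 1/n
    # The partition pass scans the whole list: O(n) comparisons per call.
    left = [x for x in lst if x < pivot]
    mid = [x for x in lst if x == pivot]
    right = [x for x in lst if x > pivot]
    # The two recursive calls on sublists of sizes i and n-i-1
    # give the T(i) + T(n-i-1) terms of recurrence (1).
    return quicksort(left) + mid + quicksort(right)
```

Despite the random pivot, the output is deterministic: any pivot sequence yields the sorted list.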

On simplifying (1) we get (each T(i) appears twice in the sum, once as T(i) and once as T(n−i−1)):

T(n) = O(n) + (2/n) · Σ_{i=0}^{n−1} T(i) ……………………………..(2)
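As a sanity check, recurrence (2) can be evaluated numerically. This is a sketch under assumptions not in the original: the O(n) term is taken to be exactly n, and T(0) = 0.

```python
import math

def recurrence_T(n_max):
    """Evaluate T(n) = n + (2/n) * sum_{i=0}^{n-1} T(i) from (2),
    taking the O(n) term as exactly n and T(0) = 0."""
    T = [0.0] * (n_max + 1)
    prefix = 0.0                       # running sum of T(0), ..., T(n-1)
    for n in range(1, n_max + 1):
        T[n] = n + 2.0 * prefix / n
        prefix += T[n]
    return T

T = recurrence_T(10000)
# The ratio T(n) / (n * log(n)) should level off at a constant as n grows,
# which is exactly the O(n*log(n)) claim.
for n in (100, 1000, 10000):
    print(n, T[n] / (n * math.log(n)))
```

The ratios grow slowly toward a constant, consistent with T(n) = O(n·log(n)) plus lower-order terms.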

Claim : T(n) = O(nlog(n))

Proof:
Induction on n:

Base case:
For n = 1 the function directly returns the list, so no pivot or comparison is needed and T(1) is a constant. (Since 1·log(1) = 0, the bound c·n·log(n) is taken to hold from n = 2 onward, where T(2) ≤ c·2·log(2) for a large enough constant c.)
Inductive hypothesis:
Let T(i) = O(i·log(i)) for all 2 ≤ i < n.

This implies T(i) ≤ c·i·log(i) for 2 ≤ i < n, for some constant c.

Inductive step:

From (2),

T(n) = O(n) + (2/n) · Σ_{i=0}^{n−1} T(i)

T(n) ≤ O(n) + (2/n) · ∫_1^n T(x) dx    (the discrete sum is bounded by the continuous area, since T is increasing)

T(n) ≤ O(n) + (2c/n) · ∫_1^n x·log(x) dx    …………. (by our induction hypothesis)

= O(n) + (2c/n) · { (n²/2)·log(n) − n²/4 + 1/4 }

= O(n + c·n·log(n))

= O(n·log(n))
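The integral bound used above is a left Riemann sum of the increasing function x·log(x), which is at most the corresponding integral; it can be checked numerically. The closed form below includes the +1/4 constant from evaluating the antiderivative at 1, which the O-notation absorbs:

```python
import math

def discrete_sum(n):
    # sum_{i=0}^{n-1} i*log(i); the i = 0 and i = 1 terms contribute 0
    return sum(i * math.log(i) for i in range(2, n))

def integral(n):
    # closed form of the integral of x*log(x) from 1 to n:
    # (n^2/2)*log(n) - n^2/4 + 1/4
    return (n * n / 2) * math.log(n) - n * n / 4 + 0.25

# The discrete sum never exceeds the continuous area.
for n in (10, 100, 1000):
    print(n, discrete_sum(n), integral(n))
```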

Hence proved
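The claim can also be checked empirically by counting comparisons made by a randomized quicksort, averaged over random inputs. This is a sketch: `qsort_count` is a hypothetical helper, and the seed, list size, and trial count are arbitrary choices.

```python
import math
import random

def qsort_count(lst):
    """Return the number of element-vs-pivot comparisons randomized
    quicksort makes on lst (counting n-1 per partition pass)."""
    if len(lst) <= 1:
        return 0
    pivot = random.choice(lst)
    left = [x for x in lst if x < pivot]
    right = [x for x in lst if x > pivot]
    # n - 1 comparisons against the pivot in this partition pass
    return len(lst) - 1 + qsort_count(left) + qsort_count(right)

random.seed(0)                      # fixed seed for reproducibility
n, trials = 1000, 50
avg = sum(qsort_count(random.sample(range(10**6), n))
          for _ in range(trials)) / trials
# The ratio of average comparisons to n*log(n) should be a small constant.
print(avg / (n * math.log(n)))
```

The measured ratio stays near a small constant as n grows, matching the O(n·log(n)) average-case bound.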
