This is quicksort, and I know it has O(n^2) time complexity in the worst case (when the data is already sorted). But how do I count the operations that produce this time complexity? That is, which part of the code executes most often when the elements are already sorted, so that when I count I end up with a function in O(n^2)?
Help!

void quickSort( int A[], int left, int right )
{
    if( left >= right )
        return;
    int P_ind = ( left + right ) / 2;       // middle element as pivot
    int pivot = A[P_ind];
    swap( A[P_ind], A[right] );             // move the pivot out of the way, to the end
    int i = left;
    int j = right - 1;
    for(;;)
    {
        while( A[i] <= pivot && i < right ) // scan right for an element > pivot
            i++;
        while( A[j] >  pivot && j > left )  // scan left for an element <= pivot
            j--;
        if( i < j )
            swap( A[i], A[j] );
        else
            break;
    }
    swap( A[i], A[right] );                 // put the pivot into its final position
    quickSort( A, left, i - 1 );
    quickSort( A, i + 1, right );
}


Hey,

This is how you compute the upper bound (that's what the big-O expression is called) for quicksort's worst case:

First off, like you said, suppose you have an already sorted array with n elements. The array has to be sorted so that you always choose the minimum element as the pivot, and then scan the rest of the array looking for a smaller element that doesn't exist each time.

And second, like I said, you're also supposed to choose the minimum (or maximum) element as the pivot for each subarray (that is, the first element of each subarray). Note that your posted code actually picks the middle element, for which an already sorted array is not the worst case; the counting below assumes a first-element pivot.

So in the first pass, you take the first element of your array as the pivot, and then scan the rest of the array to its end, looking for an even smaller element that doesn't exist. The number of comparisons you do in the first pass is therefore n-1.

That first pass leaves you with 2 subarrays: the left one, containing just the pivot (the minimum element), and the right one, with n-1 elements.

You now repeat the process on each part. The first one, with only one element, is trivial: you just leave it as it is.

On the other hand, in the second subarray you once again choose its minimum element as the pivot (the first element, as always). Since it is the minimum, you have to compare your way through the whole subarray to its end, looking for an even smaller element that doesn't exist. The result is n-2 comparisons.

The process repeats for the third pass (n-3 comparisons), the fourth pass (n-4 comparisons), and so on until you reach a subarray of size 1.

So, the total computation turns out to be:

(n-1) + (n-2) + (n-3) + ... + 3 + 2 + 1

Which is equal to:

Sum(from i=1 to n-1 of (n-i)) = n(n-1)/2 = (n^2)/2 - n/2, which belongs to O(n^2).

Ok, that's it. Hope it helped you a little bit!
