Hi, I'm getting segmentation fault errors when I run quicksort on ~350,000 or more numbers.

This is the quicksort algorithm I'm using:

void swap(int* v, int i, int j)
{
    int tmp = v[i];
    v[i] = v[j];
    v[j] = tmp;
}

int partition(int *v, int s, int e)
{
    int p = v[e], i = s - 1;

    for (int j = s; j < e; j++)
    {
        if (v[j] <= p)
        {
            i++;
            swap(v, i, j);
        }
    }    
    i++;
    swap(v, i, e);
    return i;
}

void quicksort(int* v, int s, int e)
{
    if (s < e)
    {
        int p = partition(v, s, e);
        quicksort(v, s, p - 1);
        quicksort(v, p + 1, e);
    }
}

void quicksort(int* v, int n)
{
    quicksort(v, 0, n-1);
}

and this is how I am testing it:

#include <iostream>

void generate_vector(int* &vec, int n)
{
    vec = new int[n];
    int i = 0;
    while (vec[i++] = --n);  // fills vec with n-1, n-2, ..., 0 (descending)
}

int main()
{
    const int n = 350 * 1000;

    int *vec = 0;

    generate_vector(vec, n);
    quicksort(vec, n);
    
    int i = 0;
    while (i < n)  // check that the result is sorted
    {
        if (vec[i] != i)
        {
            std::cout << "error" << std::endl;
            break;
        }
        i++;
    }
    delete[] vec;
    return 0;
}

Why is this happening?
It works fine when I try fewer than 340,000 numbers. Is the quicksort algorithm wrong, or is it something else?

Sorry for the bad English.

Lol, if it works for ~340,000 values, then it's probably not the algorithm! Likely the problem is that the allocation is failing because you're somehow out of system resources, or (since the algorithm is recursive) your program stack and allocation heap are meeting in the middle.

Things to check:
+ Did your array allocation succeed? (After the new in generate_vector(), check for vec == NULL.) How big a vec can you allocate if you comment out the call to quicksort()?
+ How far into the recursion can you get? At the beginning of the 3-parameter quicksort() function, add a debug line like:

std::cout << "sorting elements " << s << " through " << e << std::endl;

and see what you get....
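One caveat on the NULL check: a plain new throws std::bad_alloc on failure rather than returning NULL, so to make that check meaningful you'd want the nothrow form. A minimal sketch (try_allocate and largest_allocatable are hypothetical helpers for probing, not from the original code):

```cpp
#include <new>

// Hypothetical helper, not from the original code. The nothrow form
// of new returns nullptr on failure instead of throwing std::bad_alloc,
// so a NULL check actually works.
int* try_allocate(int n)
{
    return new (std::nothrow) int[n];
}

// Probes how large an allocation currently succeeds, halving the
// request on each failure and freeing each successful probe.
int largest_allocatable(int max_n)
{
    int n = max_n;
    while (n > 0)
    {
        int* p = try_allocate(n);
        if (p != nullptr)
        {
            delete[] p;
            return n;
        }
        n /= 2;
    }
    return 0;
}
```

If try_allocate(350 * 1000) comes back non-null, the allocation itself isn't the problem.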

You have a stack overflow, because your algorithm is hitting an O(n^2) case.

You can solve this by recursing into the smaller subproblem and then tail-recursing (i.e. looping, updating variables and jumping to the top of the function) for the larger subproblem.
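A minimal sketch of that approach, reusing the partition() scheme from the question: recurse only into the smaller half, then loop on the larger one, which caps the recursion depth at O(log n) even when the running time degrades to O(n^2):

```cpp
void swap(int* v, int i, int j)
{
    int tmp = v[i];
    v[i] = v[j];
    v[j] = tmp;
}

// Same Lomuto partition as in the question: last element is the pivot.
int partition(int* v, int s, int e)
{
    int p = v[e], i = s - 1;
    for (int j = s; j < e; j++)
        if (v[j] <= p)
            swap(v, ++i, j);
    swap(v, ++i, e);
    return i;
}

// Recurse only into the smaller partition; handle the larger one by
// updating s/e and looping. Each recursive call covers at most half
// the current range, so the stack depth stays O(log n).
void quicksort(int* v, int s, int e)
{
    while (s < e)
    {
        int p = partition(v, s, e);
        if (p - s < e - p)        // left side is smaller
        {
            quicksort(v, s, p - 1);
            s = p + 1;            // loop on the right side
        }
        else                      // right side is smaller (or equal)
        {
            quicksort(v, p + 1, e);
            e = p - 1;            // loop on the left side
        }
    }
}
```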

You have a stack overflow, because your algorithm is hitting an O(n^2) case.

How so? There is no copy of data anywhere in the code, so it's O(n) in space.

You can solve this by recursing into the smaller subproblem and then tail-recursing (i.e. looping, updating variables and jumping to the top of the function) for the larger subproblem.

What? The algorithm is already divide-and-conquer recursive, so O(n log n) in execution time. (And technically yes, O(log n) in execution stack size to handle the recursion, but that's still fairly trivial.) Your "i.e." note isn't tail recursion; it's an approach to removing the stack overhead of function-call recursion by maintaining local variables and a loop. But that's really not the issue here, and it won't increase the solvable problem size by enough to matter.

@raptr_dflo: I have no problem creating the vector; I tried creating a vector of 10 million integers.
The last line printed by the debug line was "sorting elements 174666 through 175333".

@Rashakil Fol: Could you give me an example of how to recurse into the smaller subproblem?


Thanks for your reply.

I just tried it with 10 million random numbers and it worked fine and quickly.

I think the problem is that the height of the tree is not log(n) but n.

The algorithm is already divide-and-conquer recursive, so O(n log n) in execution time. (And technically yes, O(log n) in execution stack size to handle the recursion, but that's still fairly trivial.)

Never mind, I took another look and realized what you're talking about. The OP picked a worst-case reverse-ordering, so it keeps partitioning all N-1 elements besides the partition-element into a single partition, and that does end up being O(n^2). Sorry!
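To see the depth concretely, here's a sketch of the same last-element-pivot scheme instrumented with a depth counter (max_depth and the extra depth parameter are additions for illustration, not part of the original code). On descending input each partition peels off essentially one element, so the maximum depth grows linearly with n instead of logarithmically:

```cpp
int max_depth = 0;  // deepest recursion level observed (instrumentation only)

// Same Lomuto partition as in the question: last element is the pivot.
int partition(int* v, int s, int e)
{
    int p = v[e], i = s - 1;
    for (int j = s; j < e; j++)
    {
        if (v[j] <= p)
        {
            i++;
            int t = v[i]; v[i] = v[j]; v[j] = t;
        }
    }
    i++;
    int t = v[i]; v[i] = v[e]; v[e] = t;
    return i;
}

void quicksort(int* v, int s, int e, int depth)
{
    if (depth > max_depth) max_depth = depth;
    if (s < e)
    {
        int p = partition(v, s, e);
        quicksort(v, s, p - 1, depth + 1);
        quicksort(v, p + 1, e, depth + 1);
    }
}
```

On a descending array of 350,000 elements the depth reaches roughly 350,000 stack frames, which is what overflows the stack; on random input it stays near log2(n).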
