Ok, I think I am doing this right: I am finding the big theta for some algorithms.

I have two:
#1

int selectKth( int a[], int k, int n )
{
    int minI;
    int tmp;
    for ( int i = 0; i < k; i++ )
    {
        minI = i;
        for ( int j = i + 1; j < n; j++ )
        {
            if ( a[j] < a[minI] )
            {
                minI = j;
            }
        }
        // swap after the inner loop has found the minimum
        tmp = a[i];
        a[i] = a[minI];
        a[minI] = tmp;
    }
    return a[k - 1];
}

#2

for ( int i = 0; i < n - 1; i++ )
{
    for ( int j = i + 1; j < n; j++ )
    {
        tmp = a[i][j];
        a[i][j] = a[j][i];
        a[j][i] = tmp;
    }
}

Do these both have the same big theta?
I see both of them as Theta(n^2).
Am I doing this right?


mrclean, this question is complicated by the fact that your first function takes multiple parameters, while your second snippet effectively has only one. Your second snippet runs in Theta(n*n) time, yes. But take a look at your first function. Suppose you hold the value of k fixed. Say you hold k fixed at 3, and then consider the running time as you vary n from 0 to an arbitrarily large number. You'll find that you run through the interior for loop approximately 3n times. If you fix k at any value, you'll find the same thing: the interior loop runs approximately k*n times when n is really large.

The interesting thing is, for values of n that are less than k, the growth pattern looks like Theta(n^2). But once you increase n past the value of k, you'll find that the interior for loop runs k extra times each time you increase the value of n by 1.

This leads to the interesting topic of what big O and big Theta notation really means. When you have multiple variables, it is not necessarily as straightforward an idea as when you have one. You could say that your first function has a running time Theta(k*n). What does that mean? You could consider the two directions in which you can grow the parameters of the running time function f(k,n) = k*n: increasing k while holding n constant and increasing n while holding k constant. You could say the running time function grows linearly with respect to n. So it's Theta(n), in that respect. You could say that it grows linearly with respect to k. So it's Theta(k), in that respect.

But that's not complete information, since what if you grow the function in both k and n at the same time? Then the running time quadruples every time you double both parameters! That's quadratic growth, not linear! But then consider a function like

void foobar( int n, int k )
{
    for ( int i = 1; i <= k; i++ )
        cout << i << endl;

    for ( int i = 1; i <= n; i++ )
        cout << i << endl;
}

This function grows linearly with respect to n, and linearly with respect to k, just like the function you gave above. But if you increase n and k at the same time, it still grows linearly, not quadratically! Its growth would be described as Theta(n + k).

But then there's _another_ way of looking at big O notation.

Another way of looking at big O notation is to consider the worst case running time, given the value of one particular parameter. Take a look at your first function. What is the biggest number of operations that function can have, for any particular value of n? We know that the function will be called in such a way that k <= n, don't we? So the worst possible case is where k is equal to n. Then that means the worst possible running time, given any value of n, grows proportionally to n*n. So the function's worst case running time grows at a rate of Theta(n*n).

So remember that big theta and big O notation don't refer to a piece of computer source code itself; they refer to a particular piece of information about that source code. They can refer to the way the running time changes, given a large group of varying parameters (where "large" means "more than one"). They can also refer to the way the running time changes when some parameters are held at constant values while one other parameter moves between values. They can also refer to the value called "the worst possible running time, considering the value of the parameter Foo", as the parameter Foo changes value.

So, there is no such notion as "the big theta" for a particular algorithm. You need to ask yourself, precisely, what information you are trying to get.
