
Calculate time complexity of the algorithm

 
0
 

Please tell me how I would calculate the time complexity of this program:
int i = N;
while (i > 0)
{
    int Sum = 0;
    int j;
    for (j = 0; j < i; j++)
        Sum++;
    cout << Sum << endl;
    i--;
}
Thanks in advance.

 
1
 

Count the total number of basic operations, those which take a constant amount of time. That's all there is to it.

For example, the code int Sum = 0; is 1 basic operation. Then j = 0; is another basic operation. Then j < i forms yet another basic operation. Count these, and you get your time complexity.

For example, take the loop,

for (j = 0; j < i; j++)
  Sum++;

In this loop, you end up evaluating j = 0 once, j < i a total of i + 1 times, j++ i times, and Sum++; i times. So the total number of basic operations is: 1 + (i + 1) + i + i = 3i + 2.
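
To make that concrete: if i = 4, the loop costs 1 + (4 + 1) + 4 + 4 = 14 = 3(4) + 2 basic operations.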

Then take the block of code,

int Sum = 0;
int j;
for (j = 0; j < i; j++)
    Sum++;
cout << Sum << endl;
i--;

Here, we have int Sum = 0; taking one operation, for (j = 0; j < i; j++) Sum++; taking 3i + 2 operations, cout << Sum << endl; taking 1 operation (or 2 operations, depending on how you look at it, but the whole thing takes a constant amount of time anyway). Then i--; takes one operation. So that's a total of 1 + (3i + 2) + 1 + 1 = 3i + 5.

Then take the block of code,

int i = N;
while (i > 0)
{
    [b]we calculated this to take 3i + 5 operations[/b]
}

Every time through the loop, 1 + (3i + 5) operations are performed (one is added for the comparison i > 0).

Of course, the value of i changes every time through the loop. So calculating the number of operations here takes a little bit of math. The first time through, we take 3N + 6 operations. The second time, we take 3(N - 1) + 6 operations. The third time, 3(N - 2) + 6 operations. And so on and so on, until we take 3(1) + 6 operations. So it's time for some math.

[tex](3N + 6) + (3(N-1) + 6) + \cdots + (3(2) + 6) + (3(1) + 6) \\ = 3\left[N + (N - 1) + (N - 2) + \cdots + 2 + 1\right] + 6N \\ = \frac{3N(N + 1)}{2} + 6N \\ = \frac{3}{2} N^2 + \frac{15}{2} N[/tex]

This number of operations is equivalent to O(N^2). (It's good to know the formula N + (N - 1) + ... + 2 + 1 = N(N+1)/2.)
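
If you want to sanity-check that arithmetic, here is a minimal sketch (the ops counter and the choice of N = 10 are mine, not part of the original program) that tallies basic operations exactly the way they were counted above, ignoring the initial int i = N; and the final failed i > 0 test, just as the derivation did:

#include <iostream>
using namespace std;

int main()
{
    const int N = 10;       // any small value will do; assumed for the demo
    long long ops = 0;      // hypothetical counter, not in the original program

    int i = N;
    while (i > 0)
    {
        ops++;                            // the comparison i > 0
        int Sum = 0;          ops++;      // int Sum = 0;
        int j;
        ops++;                            // j = 0 (done once as the for loop starts)
        for (j = 0; j < i; j++)
        {
            ops++;                        // a true evaluation of j < i (happens i times)
            Sum++;            ops++;      // Sum++
            ops++;                        // j++
        }
        ops++;                            // the final, false evaluation of j < i
        cout << Sum << endl;  ops++;      // the cout, counted as one operation
        i--;                  ops++;      // i--
    }

    // Both numbers should match: 3/2 * N^2 + 15/2 * N  (225 for N = 10)
    cout << ops << " vs " << (3 * N * N + 15 * N) / 2 << endl;
}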

Now, what I just described is a very long and drawn out way of doing things. Nobody actually works problems out like this, but it's something you can fall back on. There are many shortcuts you can take. For example, it doesn't change anything if you mush all the constant values together (the way I treated the cout statement as just one operation). And whenever you see a loop that runs N times, just multiply N by the number of operations done on the inside.

Here's http://eternallyconfuzzled.com/articles/bigo.html, which explains things a bit differently.

 
0
 

Very good work! Really well explained.

 
1
 

Hi

Could you please explain how you came to the result of 3/2 N^2 + 15/2 N? I cannot figure that out, and I really want to know how to calculate the time complexity of a given algorithm. I understand that you have to count the steps involved, but how would I get to that result? Please explain it more mathematically.

Thanks

 
1
 

Which step don't you get? Do you understand how I got to [tex](3N + 6) + (3(N-1) + 6) + \cdots + (3(2) + 6) + (3(1) + 6)[/tex]?

 
0
 

Well, I understand 3N, 3(N - 1), 3(N - 2), and so on. But I did not really understand 3(2) + 6, 3(1) + 6, etc.; I think those come from substituting values for N (I am not sure, though).
However, the real difficulty is getting the actual result, 3/2 N^2 + 15/2 N, from this equation.
Thanks in advance

 
0
 

The first time through the loop, the value of i is N, so we know it takes 3N+6 operations in the loop. The second time through the loop, the value of i is N-1, so it takes 3(N-1)+6 operations. The third time through, the value of i is N-2, so it takes 3(N-2)+6 operations. And so on, until the last time through the loop, the value of i is 1, so it takes 3*1+6 operations. (The second-to-last time through the loop, the value of i is 2.)

That gives [tex](3N+6)+(3(N-1)+6)+\cdots + (3(2)+6) + (3(1) + 6)[/tex] operations.

Then I rearranged terms and pulled out the 3 to get the second line.

Then, using the formula N+(N-1)+(N-2)+...+3+2+1 = N(N+1)/2, I substituted.
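
Written out in full, that substitution goes:

[tex]3\left[N + (N-1) + \cdots + 2 + 1\right] + 6N = 3\cdot\frac{N(N+1)}{2} + 6N = \frac{3}{2}N^2 + \frac{3}{2}N + 6N = \frac{3}{2}N^2 + \frac{15}{2}N[/tex]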

 
0
 

How To Calculate Time Complexity Of An Algo

 
0
 

How To Calculate Time Complexity Of An Algo

By reading the responses posted in this and the other link you hijacked.

 
0
 

Hello friends,
It's really very well explained, but it would be clearer if someone calculated the complexity for this one, because I was not able to compute it myself. Please guide me.
Thanks in advance.

int flag, temp;

do {
    flag = 0;

    for (int j = 0; j < n; j++) {
        if ((j + 1) == n)
            continue;

        if (a[j] > a[j + 1]) {
            flag = 1;
            temp = a[j];
            a[j] = a[j + 1];
            a[j + 1] = temp;
        }
    }
} while (flag == 1);
 
-1
 

There you might want to consider the best and worst case complexities. Ask yourself what shape of array will cause really poor behavior and what shape of array will cause the fastest behavior.
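
To get a concrete feel for that, here is a rough sketch (the countPasses wrapper, the passes counter, and the two test arrays are mine, added only for illustration) that runs the posted loop and counts how many passes the do-while makes. Each pass costs about n comparisons, so one pass over an already sorted array is the O(n) best case, while roughly n passes over a reverse-sorted array gives the O(n^2) worst case.

#include <iostream>
using namespace std;

// Runs the posted flag-based bubble sort and returns how many passes it took.
int countPasses(int a[], int n)
{
    int flag, temp;
    int passes = 0;          // hypothetical counter, not in the original code
    do {
        passes++;
        flag = 0;
        for (int j = 0; j < n; j++) {
            if ((j + 1) == n)
                continue;
            if (a[j] > a[j + 1]) {
                flag = 1;
                temp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = temp;
            }
        }
    } while (flag == 1);
    return passes;
}

int main()
{
    int best[5]  = { 1, 2, 3, 4, 5 };   // already sorted: one pass, so O(n) work
    int worst[5] = { 5, 4, 3, 2, 1 };   // reverse sorted: about n passes, so O(n^2) work
    cout << countPasses(best, 5)  << endl;   // prints 1
    cout << countPasses(worst, 5) << endl;   // prints 5
}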

 
0
 

Please, someone clarify this calculation, because it's really hard to understand. I can't really get it! Please help!
(3n+6) + (3(n-1)+6) + ...

 
0
 

Can you help me, please?
I am confused about for and while loops when calculating the number of program steps for the purpose of finding time complexity. Is only the control part of the loops considered? I have seen somewhere that the step count of a for loop is n+1 in general (not considering the statements inside it), the extra 1 being for the last execution of the loop condition, but for a while loop it is n. Is that true? How is that possible, given that the condition is checked n+1 times?

 
0
 

A loop which loops N times is considered to be O(N), even though, as you know, it takes N+1 comparisons to run the loop.

 
0
 

Thanks for your answer. My confusion is about the difference between the for loop and the while loop, i.e. their program steps. I read in a text by Horowitz that a for loop has n+1 program steps and a while loop has n (in general), even though in both cases the control part is executed n+1 times. Without considering asymptotic notation, what would the complexity be here? Please help me.

 
0
 
for (int i = 0; i < n; i++)
    ;               // empty loop body

int i = 0;          // initialisation happens outside the while loop
while (i < n)
    i++;

These two pieces of code are equivalent loops. But the while loop needs n steps to execute, because the initialisation of i occurs outside the loop. The for loop requires n steps plus 1, because i is initialised inside the for statement itself.

However, in Big O notation both are referred to as O(n), because constants have no effect on the trend as n approaches infinity. While Horowitz is technically correct, it is not algorithmic complexity that is being counted there; it is the number of program steps (i.e. the number of operations the program executes). The raw number of operations can be a slightly misleading representation of program complexity, as the code above shows.
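
If it helps to see where the extra constant comes from, here is a minimal sketch (the forSteps and whileSteps counters, and the decision about which control-part operations to count, are mine rather than Horowitz's exact convention) that tallies the loop bookkeeping for both forms:

#include <iostream>
using namespace std;

int main()
{
    const int n = 10;

    long long forSteps = 0;
    forSteps++;                       // the initialisation i = 0 is part of the for statement
    for (int i = 0; i < n; i++)
    {
        forSteps++;                   // each true evaluation of i < n (n times)
        forSteps++;                   // each i++ (n times)
    }
    forSteps++;                       // the final, false evaluation of i < n

    long long whileSteps = 0;
    int i = 0;                        // the initialisation happens outside the while loop
    while (i < n)
    {
        whileSteps++;                 // each true evaluation of i < n (n times)
        i++;
        whileSteps++;                 // each i++ (n times)
    }
    whileSteps++;                     // the final, false evaluation of i < n

    // The two totals differ only by the constant for the initialisation,
    // which is exactly why both loops end up as O(n).
    cout << forSteps << " " << whileSteps << endl;   // prints 22 21 for n = 10
}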

 
0
 

Thanks for your reply.

 
0
 

When will the complexity of an algorithm be o(log n) or o(n log n)?

 
0
 

Do you mean o(log n) or O(log n)? They mean different things.

An algorithm will be O(log n) if there's some value C and some value m such that given an input of size n, where n>m, the running time will be less than C*(log n). And similarly for O(n log n).
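
Two standard examples (not from this thread, just the usual illustrations): binary search is O(log n), because each comparison halves the remaining range, so roughly log2(n) iterations suffice; and sorting with something like merge sort is the classic O(n log n) case. A minimal binary search sketch (the function name and test data are mine):

#include <iostream>
using namespace std;

// Returns the index of key in the sorted array a of length n, or -1 if absent.
int binarySearch(const int a[], int n, int key)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi)
    {
        int mid = lo + (hi - lo) / 2;   // halve the remaining range each iteration
        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return -1;                          // not found
}

int main()
{
    int a[8] = { 1, 3, 4, 7, 9, 11, 15, 20 };   // must already be sorted
    cout << binarySearch(a, 8, 9) << endl;      // prints 4
}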

 
0
 

Thanks very much, it's clear now.
