Find the running time complexity of the following piece of code and show your working step by step.

``````
y = 0;
x = 0;
for (i = n; i > 0; i = i - 1)
{
    y = y + 1;
}
for (i = 1; i <= n; i = i * 3)
{
    for (j = 1; j <= 3n; ++j)
    {
        for (k = 0; k < n; k = k + 5)
        {
            x = x + 5;
        }
    }
}
``````

Please help me out with solving this question.


Let's analyze the code step by step. At the beginning you have one loop:

``````
for (i = n; i > 0; i = i - 1)
{
    y = y + 1;
}
``````

The loop runs from n down to 1, executing n times, which means O(n) complexity. Next you have a nested loop:

``````
for (i = 1; i <= n; i = i * 3)
{
    for (j = 1; j <= 3n; ++j)
    {
        for (k = 0; k < n; k = k + 5)
        {
            x = x + 5; // this is O(1)
        }
    }
}
``````

Three loops are nested. The outermost goes up to n, multiplying i by 3 each iteration, which makes its complexity O(log3(n)) = O(log(n)). The middle one goes up to 3n with an increment of 1, which makes it O(3n) = O(n), and the innermost goes up to n in jumps of 5, making it O((1/5)n) = O(n). Multiplying the three nested loops gives O(n * n * log(n)), so together with the first loop the complexity is O(n + (n^2)log(n)).
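As a sanity check, here is a small Python sketch (my addition, not from the original post; the pseudocode's 3n bound is written out as 3 * n) that counts how many times the innermost statement executes and compares it against the closed form (log3(n) + 1) * 3n * ceil(n/5):

```python
import math

def count_inner_iterations(n):
    """Simulate the nested loops and count executions of x = x + 5."""
    count = 0
    i = 1
    while i <= n:                        # i = 1, 3, 9, ... -> about log3(n) + 1 iterations
        for j in range(1, 3 * n + 1):    # 3n iterations
            k = 0
            while k < n:                 # ceil(n / 5) iterations
                count += 1
                k += 5
        i *= 3
    return count

n = 81  # 81 = 3**4, so the outer loop runs 5 times (i = 1, 3, 9, 27, 81)
expected = 5 * (3 * n) * math.ceil(n / 5)  # (log3(n) + 1) * 3n * ceil(n/5)
print(count_inner_iterations(n), expected)  # both are 20655
```

The product of the three loop counts grows like (n^2)log(n), matching the analysis above.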

If you don't understand any of the steps that I have done, assuming that you don't just want me to do your homework, you are welcome to ask.

Thank you apines for your guidance.

My guess is O(x + x^3), because if n tends to a huge number, the three nested for loops will likely take x^3 steps for the operation x = x + 5, and the single for loop doing y = y + 1 adds the linear term.

EDIT: My bad, I didn't see the *3 in the first loop. Yeah, I agree with apines; it's O(n + (n^2)log n).


O(f(x)+g(x))=O(g(x)) whenever f(x) grows more slowly than g(x) as x increases without bound. Therefore, O(x+x^3)=O(x^3), and O(n+(n^2)log n)=O((n^2)log n).

If you don't understand this, it means you need to learn about big-O notation in more detail.


Don't throw O(n+(n^2)log n)=O((n^2)log n) in the air, you need to prove it, which is the reason why I left it as it is.


Sorry; you're right. Log(n) grows more slowly than sqrt(n), so (n^2)log n grows more slowly than n. So I think I should have said that O(n+(n^2)log n)=O(n).


Geez, today isn't my day.

(n^2)log n grows more quickly than n^2, of course. What was I thinking when I implied that n^2 times sqrt(n) is n?

Nonetheless, my original comment stands: if you really want to analyze the asymptotic behavior of an algorithm, expressing it in big-O terms that can be further simplified suggests that you didn't finish the job.

So let me finish it. Log(n) increases monotonically as n increases, which means that (n^2)log n grows more quickly than n^2. So the "n+" term in O(n+(n^2)log n) is definitely unnecessary.

So is O((n^2)log n) equal to O(n^2)? No, because if you pick any positive constant k,
(n^2)log n > k*n^2 whenever n > exp(k), since log n > k exactly when n > exp(k) (taking log as the natural logarithm).

So this informal proof shows that O(n+(n^2)log n)=O((n^2)log n), unless I've screwed it up again :-)
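That inequality is easy to spot-check numerically; here is a minimal Python sketch (my addition, not part of the thread), using the natural log and an arbitrary choice of k:

```python
import math

k = 5.0
# For any n > exp(k) we have ln(n) > k, hence n^2 * ln(n) > k * n^2.
for n in (2 * math.exp(k), 10 * math.exp(k)):
    assert n**2 * math.log(n) > k * n**2
print("n^2 * log(n) exceeds k * n^2 once n > exp(k)")
```

Since this works for every constant k, no constant multiple of n^2 can bound (n^2)log n, which is the informal proof restated.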

OK, I think you are correct. Log(n) increases monotonically, so eventually (n^2)log(n) will indeed surpass n^2, as you can see here (arkoenig, look what you have done to my confidence in big-O notation :)). So in the end I believe that the final answer is indeed O((n^2)log n).

Well, the fact of the matter is that both O((n^2)log n) and O(n + (n^2)log n) express the same thing.

Yes, but when asked to show the big-O complexity you need to narrow it down to the dominant term. Writing O(log(n) + n + n^2 + n^3) is not a proper answer; O(n^3) is.

Agreed.

Please, can anyone solve this big-O complexity problem using the summation method?


I'm not familiar with a "summation method" in regard to complexity analysis, at least not by that name.

Hi all, thank you for explaining it in so much detail. I too was asked this question and tried my best to solve it, or rather to understand Big-O.

I converted some of the basic for loops into summation notation.

But I am stuck on the following for loops and can't convert them into summation notation.

``````
for (i = 0; i < 6; i = i + 2) {}
``````

and

``````
for (i = m; i > -2; i = i - 2) {}
``````

Any help would be very much appreciated. How do I convert the above for loops into summation notation? I think then I can understand Big-O. Thanks!
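Not from the thread, but one way to find the summation bounds is simply to enumerate the index values each loop visits; a quick Python sketch (names here are my own invention) shows that the number of terms in the summation equals the number of iterations:

```python
def loop_indices_up():
    # for (i = 0; i < 6; i = i + 2): i takes 0, 2, 4 -> 3 iterations,
    # i.e. a sum of the body cost over i = 0, 2, 4 (equivalently i = 2t, t = 0..2).
    return list(range(0, 6, 2))

def loop_indices_down(m):
    # for (i = m; i > -2; i = i - 2): i takes m, m-2, ... while i > -2.
    indices = []
    i = m
    while i > -2:
        indices.append(i)
        i -= 2
    return indices

print(loop_indices_up())           # [0, 2, 4]
print(loop_indices_down(5))        # [5, 3, 1, -1] -> 4 iterations
print(len(loop_indices_down(4)))   # 3 iterations: i = 4, 2, 0
```

Once the index set is written down, the summation is just a sum of the body's cost over that set, and its size gives the iteration count.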

Hello, I am unsure what textbook you are using. A good textbook that I know of is "Introduction to Algorithms" by Cormen, Leiserson, Rivest, and Stein. It can be challenging to understand textbooks alone, and what makes that book nice is that you can watch the accompanying lectures (by Charles Leiserson and Erik Demaine) online for free; MIT's OCW offers them. You may find lectures 1 and 2 helpful for understanding how to write the notation and analyze the complexity. I believe (if my memory serves me right) the example you showed was from the first or second lecture.

