Actually, I don't have a complete grasp of Big-O notation, which is usually used to describe time complexity. My question is: in Big-O notation, why do we ignore constants? For example, if f = 100n, why do we say the algorithm has time complexity of order n rather than order 100n?

10 Years
Discussion Span
Last Post by Grunt


> why we ignore constant for example f=100n
Because it's meaningless when you're considering the asymptotic behaviour. 100*infinity is no different from 50*infinity, or from plain infinity.

Such constant factors and minor-order terms only come into play for small n, and they vary widely depending on such things as
- the processor
- the compiler
- the language
- the skill of the programmer
- the actual size of n
- the actual data being manipulated
- etc.
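To see why constants drop out but growth rates don't, here's a small sketch (the function names `f1`, `f2`, `f3` are just for illustration). Two functions that differ only by a constant factor keep a fixed ratio forever, while a higher-order function pulls away without bound:

```python
# Illustrative only: comparing a constant factor against a genuine
# difference in growth rate.

def f1(n):
    return n          # O(n)

def f2(n):
    return 100 * n    # also O(n): same growth rate, larger constant

def f3(n):
    return n * n      # O(n^2): genuinely faster growth

for n in (10, 1000, 100000):
    # f2/f1 stays fixed at 100 no matter how large n gets, while
    # f3/f1 keeps growing. That unbounded growth is what Big-O
    # distinguishes; constant factors vanish in the comparison.
    print(n, f2(n) / f1(n), f3(n) / f1(n))
```

So doubling your processor speed (halving the constant) never changes the order of an algorithm, but switching from an O(n^2) algorithm to an O(n) one does.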



The Θ-notation asymptotically bounds a function from above and below. When we have only an asymptotic upper bound, we use O-notation. For a given function g(n), we denote by O(g(n)) (pronounced "big-oh of g of n" or sometimes just "oh of g of n") the set of functions

O(g(n)) = {f(n): there exist positive constants c and n0 such that
0 ≤ f(n) ≤ cg(n) for all n ≥ n0}.

We use O-notation to give an upper bound on a function, to within a constant factor. This is the formal definition, and of course what Mr. Salem said applies as well.
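That definition can be spot-checked numerically. Here's a small sketch (my own helper, `in_big_o`, with c and n0 as my chosen witnesses, which are not unique): f(n) = 100n is in O(n) because c = 100 and n0 = 1 satisfy 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0, whereas no finite check will make n^2 fit under c·n:

```python
# Sketch of the formal definition: f is in O(g) if there exist
# positive constants c and n0 with 0 <= f(n) <= c*g(n) for all n >= n0.
# We can only spot-check a finite range, but it illustrates the idea.

def in_big_o(f, g, c, n0, upto=10_000):
    """Check 0 <= f(n) <= c*g(n) for n0 <= n < upto."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, upto))

# f(n) = 100n is O(n): witnesses c = 100, n0 = 1 work.
print(in_big_o(lambda n: 100 * n, lambda n: n, c=100, n0=1))  # True

# f(n) = n^2 is NOT O(n): it overtakes 100*n once n > 100.
print(in_big_o(lambda n: n * n, lambda n: n, c=100, n0=1))    # False
```

Note that a passing finite check is only evidence, not a proof; the real definition quantifies over all n ≥ n0.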

Hope it helped, bye.
