1. An algorithm can't have both constant time and O(n^2) complexity.
2. There is no such thing as O(n^2 + <something>)

The time the algorithm takes is a function of more than one variable, so O(n^2 + <something>) is perfectly reasonable notation. Graph traversal, for instance, runs in O(V + E), a sum over two independent input sizes.

3. The time complexity of an algorithm is independent of the programming language, the OS, or anything else. Even if you have hardware quicksort on your machine, it is still O(n log n) on average (O(n^2) in the worst case).

You're missing the point. I'm talking about development time; every serious language gives you the same time complexities. For any given task, some programming languages take fewer economic resources than others. You wouldn't use Delphi to code a supercomputer, would you? If you're writing a Mac OS X application, Objective-C would be the language of choice, no? If you're a scientist doing some numerical calculation, you'd use Matlab or something like it, no? They're the cheapest tools for their respective jobs, and you'd be wasting money/time otherwise.

4. I still have no idea which algorithm's time complexity we are debating.

You can derive a closed form of the summation by simplifying the relation

sum_{x=1..K} x^n - sum_{x=1..K} (x-1)^n = K^n

iteratively or recursively. The algorithm implements this and then evaluates the closed form of the expression.
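The original algorithm isn't posted in this thread, so here's a minimal sketch (names and structure are my own) of how that telescoping identity can be turned into a recurrence: expanding (x-1)^m with the binomial theorem and summing over x gives m * S_{m-1}(K) = K^m + sum_{j=0}^{m-2} C(m,j) * (-1)^(m-j) * S_j(K), which lets you build each power sum from the lower-order ones.

```python
from math import comb

def power_sum(n, K):
    """Compute S_n(K) = 1^n + 2^n + ... + K^n via the telescoping identity
    sum_{x=1..K} (x^m - (x-1)^m) = K^m, expanded with the binomial theorem:
        m * S_{m-1}(K) = K^m + sum_{j=0}^{m-2} C(m,j) * (-1)^(m-j) * S_j(K)
    Each S_m is derived from the lower-order sums, so there is no loop over K.
    """
    S = [K]  # S_0(K) = K  (K terms, each equal to 1)
    for m in range(2, n + 2):        # each pass solves for S_{m-1}
        total = K ** m
        for j in range(m - 1):       # lower-order sums S_0 .. S_{m-2}
            total += comb(m, j) * (-1) ** (m - j) * S[j]
        S.append(total // m)         # the recurrence divides evenly
    return S[n]
```

Note that the work depends only on n (O(n^2) arithmetic operations), not on K, which is exactly the point of evaluating a closed form instead of looping K times.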

To quickly answer the original query (though this is an interesting discussion)

There are many languages because, well, people needed ways to express the processes they wanted to automate. They were abstracted up from machine language, and at the lowest levels they mostly do the same things, but:

Some languages lend themselves to better optimizations and cleaner implementations of a given program.

I'd say you should pick up the fundamentals of programming first. Figure out what you want a program to do, and learn the abstractions behind data structures and automation; then find a language and figure out how to do things in that language. Once you know how to think, picking up any language is pretty simple. But to learn a language, you've got to work at it and figure things out by doing increasingly complex tasks.

There are still a fair number of old mainframes and systems out there running code written in the '70s and '80s, but you should probably go with a more recent language. The level of abstraction will be higher, and you'll be able to find more help... from, for example, people on DaniWeb.

@Narue No one has a bigger ego than the computer nerd. You're a perfect example of that.

I'm using Visual Basic for an Introduction to Programming module at technical college. I have to say that just because it was widespread in industry 10 years ago doesn't mean it's a great choice for learning to code. I can't explain it, but it just seems sort of, well, weird.

I started learning Python, as well as a bit of Ruby and Lua. The assignments here are more in-depth than what I did in those languages, but I found them way less annoying than Visual Basic.
