Let a be any positive number. Show that a^n = o(n!).

What's the definition of `o` that you learned? Do you understand that definition? What are your thoughts on how that definition may apply to `a^n = o(n!)`?


I just understood that big O is an asymptotic notation used to classify algorithms according to their response time and the memory they take, and I have no idea how to apply that definition to this equation.

That's not a definition; it's a description and not a very descriptive one at that. That description doesn't even explain the difference between `O` and `o` nor does it make any specific statements about when a function is in the `o` of another function. It is plainly impossible to show (or even intuit) whether `a^n = o(n!)` based on that description alone.

So your first step should be to review your lecture notes to find out whether a real definition of `o` was given in class. If you can't find it in your notes (or you don't have access to any notes), you can try to use the one from Wikipedia and hope that that's the one your professor wants you to use.

PS: To reply to a post, please use the text box at the bottom of the thread with the "Submit your Reply" button. Writing replies in upvote comments makes them easy to miss and the thread hard to follow.

I don't think that a^n = O(n!).

Supposing that O(x) means the order of magnitude of the number of computer operations you need to compute something, this means that O(2) needs 2 computer operations and O(4) needs 4.

Now suppose that a^n = O(n!) is true; then this means that a^2 needs O(2!) = 2 operations.

That is, 2^2 needs 2 operations and 4^2 also needs 2 operations.

Well, accepting that a^n = O(n!), 2^4 should need O(4!) = 24 operations; but 2^4 = (2^2)^2.

2^2 needs two operations to be computed, giving 4; then 4^2 needs another two operations to give 16.

That is, you computed 2^4 = (2^2)^2 in four operations, not 24.

This means that a^n = O(n!) is false.

If O(n) has another meaning (like order of precision, and so on), the same demonstration shows that a^n = O(n!) is false.
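(Aside: the repeated-squaring idea used in this post is a real technique, exponentiation by squaring, even though, as the next reply explains, counting operations is not what the question's O-notation is about. A minimal sketch that counts multiplications; the function name is just for illustration:)

```python
def pow_by_squaring(base, exp):
    """Exponentiation by squaring; returns (result, multiplication count)."""
    result, mults = 1, 0
    while exp > 0:
        if exp & 1:          # current low bit set: fold base into the result
            result *= base
            mults += 1
        exp >>= 1
        if exp:              # square the base for the next bit
            base *= base
            mults += 1
    return result, mults

print(pow_by_squaring(2, 4))   # result 16; multiplication count grows ~log2(exp)
```

The multiplication count grows roughly like log2(exp) rather than exp, which is why the post's count for 2^4 is so far below 24.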

O (and o, which the OP asked about) describes the growth of a function. That function often describes the running time of an algorithm, but that's not important here.

A statement like "2^n = O(n!)" (or "2^n = o(n!)") does not mean that the running time of an algorithm to compute 2^n is in O(n!)¹; it means that the asymptotic growth of the function f(n) = 2^n is less than or equal to that of g(n) = n!, ignoring constant factors (and "2^n = o(n!)" means that the asymptotic growth is strictly less). For an exact definition you can look at the Wikipedia article on the subject.
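For reference, the standard formulations (the ones you'll find in that Wikipedia article) can be stated as:

```latex
% Big O: f is eventually bounded above by a constant multiple of g
f(n) = O(g(n)) \iff \exists\, C > 0,\ \exists\, n_0 : |f(n)| \le C\,|g(n)| \ \text{for all } n \ge n_0
% Little o: the ratio vanishes (assuming g(n) \ne 0 for all large n)
f(n) = o(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0
```

With the limit form, showing a^n = o(n!) reduces to showing a^n / n! → 0, e.g. by noting that once n > 2a, each step from n to n+1 multiplies the ratio by a/(n+1) < 1/2.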

Both the statements "2^n = O(n!)" and "2^n = o(n!)" are true because 2^n does indeed grow more slowly than n!.
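That claim is easy to sanity-check numerically (a check, not a proof). A minimal Python sketch, using a = 2 as in the posts above:

```python
from math import factorial

# The little-o claim says 2^n / n! tends to 0 as n grows.
a = 2
ratios = [a**n / factorial(n) for n in (1, 5, 10, 20, 30)]
print(ratios)  # each entry is far smaller than the one before
```

The shrinkage is no accident: going from n to n+1 multiplies the ratio by a/(n+1), which is below 1 once n ≥ a, so from that point on the ratio decreases monotonically toward 0.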

¹ Note that that wouldn't even make sense as there are infinitely many algorithms to compute 2^n and they don't all have the same running time.