Hi, can anyone give an answer?
Found this in a cryptology book:
ii. Assuming that your computing platform can perform 1 billion divisions per second
(and assuming that the rest of the program takes negligible time), what is the largest
number of digits an inputted integer n could have so that the program could be
guaranteed to execute in less than one minute?
iii. Under the assumption of part (ii), how long could it take for the program to check
whether a 100-digit integer is prime?

"Introduction to Cryptography with Mathematical Foundations and Computer Implementations" by Alexander Stanoyevitch
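For context: the book's program isn't quoted above, but the division count the question relies on implies plain trial division up to the square root of n. A minimal sketch of such a check (my own illustration, not the book's code):

```python
import math

def is_prime(n: int) -> bool:
    """Trial-division primality test: divide n by 2, 3, ..., isqrt(n)."""
    if n < 2:
        return False
    # Worst case (n prime) performs roughly sqrt(n) trial divisions,
    # which is where the question's running-time estimates come from.
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return False
    return True
```

Since a d-digit n can be nearly 10^d, the worst case needs about 10^(d/2) divisions; comparing that with 10^9 divisions per second gives the bounds asked for in parts (ii) and (iii).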

Looks like a school/homework/test question.

I doubt you will find a viable response here unless you show some effort of your own...