Hi, can anyone give an answer? I found this in a cryptology book:
ii. Assuming that your computing platform can perform 1 billion divisions per second
(and assuming that the rest of the program takes negligible time), what is the largest
number of digits an inputted integer n could have so that the program could be
guaranteed to execute in less than one minute?
iii. Under the assumption of part (ii), how long could it take for the program to check
whether a 100-digit integer is prime?