This is probably more of a math problem than a programming one.

I hope I won't have to go into too much (for you, boring) detail about the code, as the problem resides in these loops.

Please let me know if I've provided too little detail :)

Now for the problem:

I would have thought that `&& decimals < 10` would mean only 10 decimals are produced, since the integer `decimals` increases every time the incremental factor is divided by 10, which should correspond to adding one decimal place.

Why am I wrong? The following code outputs 16 decimals when the integer `n` is set to e.g. 3 or 2, but 15 decimals when `n` is set to 257.

Changing 10 to 100 makes the code take hours to execute, so I'm guessing quite a few decimals are in play :)

```java
int n; // input number
double itn = 0; // incremental test number
double i = 1; // incremental factor
n = Integer.valueOf(args[0]);
int j = 0;
int decimals = 0;
while ((itn * itn) != n && decimals < 10) {
    if (j > 0) {
        // overshot on the previous pass: step back and refine by one decimal
        itn = itn - i;
        i = i / 10;
        decimals++;
    }
    if (j == 0) {
        j++;
    }
    // advance until itn * itn reaches or passes n
    while ((itn * itn) < n) {
        itn = itn + i;
    }
}
```
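For reference, here is a minimal, self-contained sketch (the class name `DoubleEqualityDemo` is just for the demo) of the `double` behaviour the outer loop's exact-equality test `(itn * itn) != n` depends on: a computed square almost never lands exactly on the target integer, and decimal steps like 0.1 are not exactly representable in binary floating point.

```java
public class DoubleEqualityDemo {
    public static void main(String[] args) {
        // Even the "true" square root, squared, misses the target slightly:
        double r = Math.sqrt(2);
        System.out.println(r * r);        // prints 2.0000000000000004, not 2.0
        System.out.println(r * r == 2.0); // prints false

        // Decimal fractions like 0.1 have no exact binary representation,
        // so arithmetic on them accumulates tiny errors:
        System.out.println(0.1 + 0.2);    // prints 0.30000000000000004
    }
}
```

This suggests the equality test alone can never terminate the loop for most values of `n`, so the `decimals` bound is what actually stops it.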