Hi,

I have written a small program that uses a while loop to calculate how many years it takes for an investment to double at a given interest rate.

However, I think something minor is wrong with the calculation of the years: when I test with an interest rate of 100%, it should report 1 year, but it says it takes 2 years. The code is below:

```python
def main():
    # Introduction
    print __doc__
    presVal = 1  # Set present value to $1
    intRate = input("Enter the Interest Rate >> ")  # Ask for the interest rate
    years = 0  # Start the years at zero
    futVal = presVal  # Set the variable used to accumulate
    # Loop until the investment has doubled
    while futVal <= presVal * 2:  # Continue 'while' futVal has not doubled
        futVal = futVal + futVal * (float(intRate) / 100)  # Calc the next loop
        years = years + 1  # Add 1 to the years
    # Calculate the amount left over of the part year
    fut = futVal % 2
    # Print the answer
    print
    print "It takes approximately %0.2f years to double the present value" % (years - fut)

main()
```
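To show what I am seeing, here is a stripped-down trace of the same loop with the 100% rate hard-coded instead of prompted for (everything else is unchanged from my program):

```python
# Trace of the loop at 100% interest, starting from $1
presVal = 1
intRate = 100
futVal = presVal
years = 0
while futVal <= presVal * 2:  # same condition as in my program
    futVal = futVal + futVal * (float(intRate) / 100)
    years = years + 1
    print("year %d: futVal = %.2f" % (years, futVal))
# year 1: futVal = 2.00
# year 2: futVal = 4.00
```

So futVal reaches exactly double after the first pass, but the loop still runs a second time and years ends up as 2.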

Any help would be most welcome.