Does anyone have an idea about this program?

Assume that you are paid on the basis of one cent the first day, two cents the second day, four cents the third day, with the daily amount continuing to double in this way. Design an algorithm and use it to write a Python program that uses loops to calculate the amount of money a person would earn over a period of time (maximum period one year) if paid in this way.

The program should be implemented with a 'pre-tested loop' terminated by a sentinel such as 0 or [Enter] in response to a "How many days?" type prompt (i.e. do NOT control the loop by asking something like "Do you want to do another calculation?" or "Do you want to quit?"). The user's response should be validated to ensure that the value entered falls between 1 and 366 days inclusive; if it doesn't, the user should be asked to enter the value again.

Output (for each pass through the main loop) should consist of a two-column table with 'day' in the first column and 'today's salary' in the second column. At the bottom of the table the total salary should be displayed. All monetary values should be shown as dollar amounts, not cents (for example $17.50, not 1750 cents).

Enalicho:

If you want help, attempt the problem first, then post your code along with any problems you encounter. It's the only way you'll learn. If we just give you the answer, what's the point?

I agree with Enalicho. However, I can suggest that you write out in pseudo-code what you want the program to do. Pay close attention to where you use input and a while loop. Also, think about which number(s) need to increase each day and how they might do that. Post some code and we may help more; a bare skeleton of the loop structure is sketched below.
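
To make that concrete, here is a bare-bones sketch of the kind of sentinel-controlled, pre-tested loop the assignment describes. Treat it as a starting point, not a finished solution: it assumes the user types only digits (there is no handling of non-numeric input), and the prompt wording, variable names, and column widths are placeholders rather than anything the assignment prescribes.

    # Minimal sketch, not a complete solution. Assumes numeric input;
    # prompt text and variable names are illustrative only.

    PROMPT = "How many days? (press Enter or type 0 to quit) "

    reply = input(PROMPT)
    while reply not in ("", "0"):              # pre-tested, sentinel-terminated
        days = int(reply)
        while not 1 <= days <= 366:            # validate: 1 to 366 inclusive
            days = int(input("Please enter a value between 1 and 366: "))

        cents = 1                              # day 1 pays one cent
        total_cents = 0
        print(f"{'Day':<6}{'Salary':>20}")
        for day in range(1, days + 1):
            print(f"{day:<6}{'$' + format(cents / 100, ',.2f'):>20}")
            total_cents += cents
            cents *= 2                         # pay doubles each day

        print(f"Total salary: ${total_cents / 100:,.2f}")
        reply = input(PROMPT)                  # re-test the sentinel

Keeping the running amounts in whole cents and converting to dollars only for display sidesteps floating-point rounding while the pay doubles; Python's integers can hold even the day-366 amount exactly.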
