I'm working on a small software project and there is one tricky problem to solve.

Here is an overview of the program:
It is business software that keeps track of expenses and earnings and shows the current balance (negative or positive). Whenever someone has to pay the business, the operator creates a receipt in the system and the money is added to the balance. On the other hand, whenever there is an expense, the operator creates an expense receipt and the money is subtracted from the balance. The program should store all receipts.

The tricky part is how to compute the balance. The easiest way is to sum the earnings, sum the expenses, and subtract the expenses from the earnings, but there is a problem with this approach: if the firm generates a lot of expenses and/or earnings, at some point the sum of either the earnings or the expenses will go out of range of the data type I might be using. So the question is: how do I solve this problem without the possibility of overflow?
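One observation worth noting: if each receipt is stored as a signed amount (earnings positive, expenses negative), a single running total stays close to the actual balance, whereas two separate sums grow without bound even when they mostly cancel. A minimal Python sketch, with a hypothetical receipt representation:

```python
# Hypothetical receipts: (amount, kind) tuples.
receipts = [(1000, "earning"), (950, "expense")] * 5

def balance_signed(receipts):
    # One running total; stays near the net balance the whole time.
    total = 0
    for amount, kind in receipts:
        total += amount if kind == "earning" else -amount
    return total

def balance_separate_sums(receipts):
    # Two totals that grow without bound; in a fixed-width integer
    # type either sum could overflow long before the balance does.
    earnings = sum(a for a, k in receipts if k == "earning")
    expenses = sum(a for a, k in receipts if k == "expense")
    return earnings - expenses

print(balance_signed(receipts))         # 250
print(balance_separate_sums(receipts))  # 250, via partial sums of 5000 and 4750
```

Both give the same answer here, but the intermediate values differ: the signed running total never exceeds 1000 in this data, while the separate sums reach 5000.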

One approach might be to group the receipts into periods - for example, all receipts created on the same day would be in one period, and every period would have its own balance - earnings minus expenses for that period. This way, whenever we want the current balance, the program just adds up the balances of all the periods, and overflow is very unlikely to occur. Is this a reasonable solution?
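The per-period idea above can be sketched in a few lines. This is a Python sketch under assumptions not in the original post (receipts as (date, signed amount) pairs, one period per calendar day):

```python
from collections import defaultdict
from datetime import date

# Hypothetical receipts: (date, signed amount); earnings positive, expenses negative.
receipts = [
    (date(2024, 1, 1),  500),
    (date(2024, 1, 1), -200),
    (date(2024, 1, 2),  300),
    (date(2024, 1, 2), -450),
]

def period_balances(receipts):
    # One net balance per day; within a period the running value
    # stays small as earnings and expenses offset each other.
    periods = defaultdict(int)
    for day, amount in receipts:
        periods[day] += amount
    return periods

def current_balance(receipts):
    # Sum the per-period balances rather than the raw receipt sums.
    return sum(period_balances(receipts).values())

print(current_balance(receipts))  # 150
```

Note the caveat: this helps only if each period's balance is small, i.e. earnings and expenses roughly offset within a period; the sum of period balances can still grow if the firm runs a large surplus or deficit over time.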

I would appreciate any comment, suggestion, ideas on this problem.
Thanks in advance and I wish a happy New Year to all of you!

So the question is: how do I solve this problem without the possibility of overflow?

Use a library like GMP for arbitrary precision math. That's really the only way to do it without heaping annoying restrictions on the end user.
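To illustrate what arbitrary-precision integers buy you: some languages have them built in, so Python's plain int can stand in here for what GMP provides to C/C++ programs. The values below are arbitrary examples:

```python
# 2**64 is already beyond the range of an unsigned 64-bit counter,
# but Python ints (like GMP's mpz type) never overflow.
big = 2**64
earnings = big * 3
expenses = big * 3 - 7

print(earnings - expenses)  # 7
```

With fixed-width 64-bit arithmetic both totals would have wrapped around long before the subtraction; with arbitrary precision the only cost is memory and speed, which grow slowly with the number of digits.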
