I seem to disappear from this forum for a while and then reappear... I've been so busy with my first-born baby girl that I haven't had time to check my email, much less work on any of my projects, but something has come up and I need some input.
I have an application that uses the built-in random function to generate a random number. If I need the result to conform to a standard (namely being an integer, not a decimal), is it permissible to multiply the result by, say, 10,000,000 and then "round" it to drop the decimal, and still have it maintain its randomness?
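For what it's worth, here is a minimal sketch (in Python, as a stand-in for whatever language the app uses) of scaling a random float in [0, 1) up to an integer. One detail worth noting: flooring keeps every integer bucket the same size, whereas round() gives the two endpoint values half-sized buckets, introducing a slight bias. The scale factor 10,000,000 is just the one mentioned above.

```python
import random

def scaled_random(scale=10_000_000):
    """Return a uniform integer in [0, scale) by scaling a float in [0, 1).

    int() truncates toward zero (a floor for non-negative values), so every
    integer 0..scale-1 gets an equal-width slice of [0, 1).  round() would
    instead give 0 and scale only half-width slices, skewing the distribution.
    """
    return int(random.random() * scale)

values = [scaled_random() for _ in range(1000)]
# Every value is an integer in the expected range.
assert all(isinstance(v, int) and 0 <= v < 10_000_000 for v in values)
```

If the language's library has something like Python's `random.randrange(10_000_000)`, that does the scaling internally and is usually the cleaner choice.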
I'm asking because something has happened with my application that should not happen in trillions of years (literally): the random numbers it generates repeat.
Imagine, for example, a list of fruit: apple, banana, cherry, and so on.
Now imagine you were going to eat one of these each day, and you used a random algorithm to assign each fruit a random number, and then you ate the fruit with the lowest number. Each day you repeat the process. That's essentially what I'm doing, but on a scale of thousands of fruit. And in my case, the numbers repeated... not just a little, but EXACTLY the same numbers.
Could my multiplying or rounding be causing this?
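In case it helps anyone diagnose this: scaling and flooring can't make two independent draws come out identical, but seeding can. A common cause of "EXACTLY the same numbers" is a generator that is re-seeded with the same value (or never seeded, in languages with a fixed default seed) at the start of each run. A hypothetical sketch of the fruit example, showing how a repeated seed reproduces the whole sequence:

```python
import random

def assign_numbers(fruits, seed):
    """Assign each fruit a random number using a generator seeded with `seed`."""
    rng = random.Random(seed)  # same seed -> same stream of numbers
    return {fruit: rng.random() for fruit in fruits}

fruits = ["apple", "banana", "cherry"]

day1 = assign_numbers(fruits, seed=42)
day2 = assign_numbers(fruits, seed=42)   # re-seeding identically...
assert day1 == day2                      # ...reproduces EXACTLY the same numbers

day3 = assign_numbers(fruits, seed=43)   # a different seed gives a different sequence
assert day1 != day3
```

The `assign_numbers` helper and the seed values here are made up for illustration; the point is only that identical seeds, not the multiply-and-round step, produce identical sequences.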