Hey everyone,

I was wondering if someone could give me a push in the right direction. I'm trying to do Monte Carlo simulations on a certain statistical distribution.

Monte Carlo simulations require hundreds of thousands of iterations to be run. The problem I have is that I need to save all the results in arrays so that I can do further processing afterwards.

If I declare a two dimensional double array I get the following error:

"Exception of type 'System.OutOfMemoryException' was thrown."

So I assume I'm trying to create arrays that are too big.

Is there some other method or programming technique in C# that I can use to store huge amounts of data for later processing?

I'm a C# learner so I don't know too much.

Thanks!

One useful feature of a pseudo-random number generator is that its output is repeatable. All you need to do is store the starting seed that you provide the generator and you'll be able to reproduce exactly the same sequence of numbers, so you may not need to store every generated value at all.
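As a minimal sketch of that idea: two `System.Random` instances constructed with the same seed produce identical sequences, so re-running a simulation pass only requires remembering the seed, not the data.

```csharp
using System;

class SeedDemo
{
    static void Main()
    {
        // Store only this seed; any Random built from it replays the same sequence.
        int seed = 12345;

        var first = new Random(seed);
        var second = new Random(seed);

        for (int i = 0; i < 5; i++)
        {
            // Both generators emit exactly the same numbers in the same order.
            Console.WriteLine($"{first.NextDouble()}  {second.NextDouble()}");
        }
    }
}
```

Note this only helps if each processing pass can regenerate the values it needs; if the later processing genuinely needs all values at once, you still need external storage.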

What about moving your data into a database? That way the data is stored on disk instead of in RAM (or the bulk of it, at least), and you can grab smaller chunks to process at any time. You could even add functionality so that when you call process(myObjectType[455]), myObjectType takes the index specified (455) and uses it as an item ID in a SELECT statement. That way your actual processing code uses the same syntax as accessing an array.

Doing that one row at a time for hundreds of thousands of items would, I imagine, add unreasonable overhead, so you could have myObjectType cache items in chunks of, say, 100. Then when you request myObjectType[455], it checks whether item 455 is in its current cache, and if not, selects the chunk containing it (say items 400 through 499) and caches that.
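A rough sketch of that chunk-caching wrapper, using a C# indexer so it reads like array access. Everything here is hypothetical (the class name, the chunk size, and LoadChunkFromDatabase, which stands in for whatever data-access code you'd actually write):

```csharp
using System;

public class SimulationResults
{
    private const int ChunkSize = 100;
    private double[] _cache;       // the currently cached chunk
    private int _cacheStart = -1;  // index of the first cached item (-1 = nothing cached)

    // Indexer: lets callers write results[455] as if this were an in-memory array.
    public double this[int index]
    {
        get
        {
            bool miss = _cacheStart < 0
                     || index < _cacheStart
                     || index >= _cacheStart + ChunkSize;
            if (miss)
            {
                // Load the chunk that contains the requested index.
                _cacheStart = (index / ChunkSize) * ChunkSize;
                _cache = LoadChunkFromDatabase(_cacheStart, ChunkSize);
            }
            return _cache[index - _cacheStart];
        }
    }

    private double[] LoadChunkFromDatabase(int start, int count)
    {
        // Placeholder -- replace with your own query, e.g.:
        // SELECT Value FROM Results WHERE Id >= @start AND Id < @start + @count
        throw new NotImplementedException();
    }
}
```

The chunk size is a tuning knob: larger chunks mean fewer round trips to the database but more memory held at once.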

I haven't tried any of this myself, I'm just brainstorming.
