
Checking memory bounds

Labdabeta
Master Poster
769 posts since Feb 2011
Reputation Points: 161
Q&As Helped to Solve: 42
Skill Endorsements: 6

I have a program that dynamically allocates an array and uses it. The thing is that under certain situations this array will get infinitely long. As such, I would like to know if/when the array has gotten too long, and then start saving it to a file (where length will not be an issue). Here is an example:

unsigned char *myArray = new unsigned char[len];
//if len > max allowed memory:
FILE *myFile = fopen("myFile.dataFile", "wb+");
for (int i = 0; i < len; ++i)
{
    fprintf(myFile, "%c", myOtherData[i]);
}
fclose(myFile);
//else
for (int i = 0; i < len; ++i)
{
    myArray[i] = myOtherData[i];
}

I was thinking that maybe new[] returns NULL if there is not enough space or something?

L7Sqr
Veteran Poster
1,006 posts since Feb 2011
Reputation Points: 179
Q&As Helped to Solve: 169
Skill Endorsements: 11

Are you sure your implementation of new doesn't already provide virtual memory backed by disk space? Have you run into the issue of new returning NULL (as opposed to throwing bad_alloc)?

Labdabeta
Master Poster
769 posts since Feb 2011
Reputation Points: 161
Q&As Helped to Solve: 42
Skill Endorsements: 6

How can I check this, and how can I be sure that it will be platform-independent?

L7Sqr
Veteran Poster
1,006 posts since Feb 2011
Reputation Points: 179
Q&As Helped to Solve: 169
Skill Endorsements: 11

I'm pretty sure new throwing an exception is platform-independent. Following that, you can install an exception handler with the effect of doing what you are suggesting (writing to file). Although, I think you would be hard pressed to find a system today that doesn't already transparently support this with virtual memory.

Labdabeta
Master Poster
769 posts since Feb 2011
Reputation Points: 161
Q&As Helped to Solve: 42
Skill Endorsements: 6

Wow, I just tested it with this code:

#include <iostream>
using namespace std;

int main()
{
    long long int *test = new long long int[100000000000];
    test[99999999999] = 10;
    cout << test[99999999999];
    return 0;
}

And it performed without a problem. Considering that I only have 4 gigs of RAM on my machine, can I assume that my compiler is transparently supporting the whole file-writing thing? (If this works, then my mass1venum dll (I wrote it as an exercise) can be a lot simpler.)

mike_2000_17
21st Century Viking
4,088 posts since Jul 2010
Reputation Points: 2,271
Q&As Helped to Solve: 800
Skill Endorsements: 73
Moderator

Indeed, pretty much all modern operating systems have good mechanisms to deal with memory and will use the hard drive if necessary. Of course, there is always a chance that you run out of memory (whether because the OS refuses to take up more disk space for swapping or because you actually run out of disk space). So there is really no need for you to write code that does this type of thing; the OS will do it much better than you could ever hope to. For instance, the OS will swap memory between RAM and the hard drive such that your program is always working on memory that is in RAM (the chunks of memory not currently in use are basically sleeping on the disk). This is the kind of mechanism you would have a really hard time implementing yourself.

As for the new operator, the C++ standard prescribes that it must throw a bad_alloc exception if you run out of memory, so that is entirely platform independent. If you want the "return NULL" behaviour, you must use the no-throw new operator, as in new(nothrow) int[1024];

If you are going to be working with large chunks of data, it might be a good idea to consider a container like std::deque which stores the complete array as a number of big chunks of data, as opposed to one contiguous array. This will generally be less demanding for the OS because the OS won't have to make space for one huge chunk of memory. But, of course, the memory won't be contiguous, so it might not be appropriate for your application.

Labdabeta
Master Poster
769 posts since Feb 2011
Reputation Points: 161
Q&As Helped to Solve: 42
Skill Endorsements: 6

Thank you, that is exactly what I was hoping to hear!

Question Answered as of 2 Years Ago by L7Sqr and mike_2000_17