Hi,
I was wondering:
whether the memory allocation in the following code snippets is static or dynamic,
why the upper limits on their sizes differ,
why some of them cause a segmentation fault, and
what you would suggest for allocating arrays for different needs?

1.

double data[1048000];
for(int i=0; i<1048000; i++){
   data[i] = i;
}

This gives a segfault at the first iteration, i.e. i=0. If I change the size to 1047000, it is fine. Is the size too big? What is the maximum size for it?

2.

int size=1048000;
double data[size];
for(int i=0; i<size; i++){
   data[i] = i;
}

Same error as the code before. Is the allocation for this array also static?

3.

int size=atoi(argv[1]);
double data[size];
for(int i=0; i<size; i++){
   data[i] = i;
}

Same error as the code before. Is this static allocation for the array? The size is dynamically determined by a command-line argument. This is actually a simplified case of what I ran into in real code today, where a double array of size 3241728 caused a segfault partway through the run. I changed the code to the one below and all is fine now.

4.

int size=atoi(argv[1]);
double *data = new double[size];
for(int i=0; i<size; i++){
   data[i] = i;
}
delete[] data;

This one I know is dynamic allocation. It seems that there is virtually no limit on the size of the array.

Thanks in advance for looking at these somewhat tedious things!

When you use new, the argument it takes is a size_t.
Normally everyone simply uses an int, since that is enough for most purposes.
The type of size_t is implementation-specific; on my computer it's an
unsigned long int.
This means that on my system, if I had enough memory, I should be able to allocate
an array of up to 2^64-1 elements (the maximum of an unsigned 64-bit type), though physical memory runs out long before that.

I'm using 64 bit


1 and 2 are the same because both arrays end up on the stack; in 2 the compiler can also see that, while size is a variable, it will always be 1048000 when the memory is allocated.

I'm not sure about 3. That may vary from compiler to compiler: an array whose size is only known at runtime is a variable-length array (VLA), which is legal in C99 but not in standard C++; GCC and Clang accept it as an extension. I used to do it all the time and didn't get any errors (that's why I kept on doing it). People kept telling me it shouldn't compile, but it did. You aren't supposed to do it, even if the compiler lets you get away with it.

4 is the heap versus stack. You can store a larger array on the heap. I won't try to explain why because I'll inevitably say something wrong. :)

http://www.codersource.net/c++_dynamic_memory_allocation.aspx
