In the following program, even though most of the working part is commented out, it gives a segfault. Reducing the value of MAX to, say, 10000 removes the segfault. Why is this happening? It should only be about 1 MB. Is 1 MB too big for an array?

#include <stdio.h>

#define MAX 1000010

int main()
{
	int m;
	int n;
	int carry;
	int sum;
	int i, j;
	int n1[MAX];
	int n2[MAX];
	char r[MAX];
	scanf("%d", &n);
	//for (i = 0;i < n;i++)
	//{
		//scanf("%d", &m);
		//for (j = 0;j < m;j++)
		//{
	//		scanf("%d %d\n", &n1[j], &n2[j]);			
		//}
		carry = 0;
		/*for (j = m-1;j >= 0;j--)
		{
			sum = n1[j]+n2[j]+carry;
			if (sum >= 10) {
				sum -= 10;
				carry = 1;
			}
			else
			{
				carry = 0;
			}
			r[j] = sum+'0';
		}
		r[m] = '\0';
		if (carry == 1)
		{
			printf("1%s\n\n", r);
		}
		else
		{
			printf("%s\n\n", r);
		}*/
	//}
	return 0;
}

Empirically, you have proven that it is indeed too large. Remember, not only must that quantity of RAM be present, it must also be contiguous. Also, you are asking for memory only from the stack.

Try re-booting your system (to free up the most RAM possible), and if that should fail, you'll want to use malloc() or calloc() for your memory request. It's more complicated, but it draws memory from a much larger pool (the heap) than the stack can offer.


On the stack, 9 MB might be asking a bit much. Either try a variable with static duration (a global) or use dynamic allocation (prefer the latter).

Comments
"A bit much" pun intended? Found it humorous.

Everything on your machine is FINITE

$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 20
file size               (blocks, -f) unlimited
pending signals                 (-i) 16382
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) unlimited
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

Even "unlimited" doesn't mean infinite; it just means the resource isn't constrained to some fixed value, only by whatever is actually available.
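If the stack limit itself is the obstacle, the soft limit can often be raised for the current shell. A sketch (a soft limit can only be raised up to the hard limit, which ulimit -Hs shows; ./a.out stands in for your compiled program):

```shell
# Show the current soft stack limit, in kbytes
ulimit -s

# Raise it to 16 MB for this shell; programs launched from
# here inherit the larger stack.
ulimit -s 16384
./a.out
```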

I just happen to have 8 MB stacks.
Another popular choice on 32-bit machines is 1 MB (which seems to be your case).

If you were on a 16-bit x86, you'd be stuck with a 64K maximum. The default stack size for 16-bit compilers was something like 4K, IIRC.

Embedded processors can have even narrower limits on the amount of stack space.

Declaring static int n1[MAX]; (and likewise for the other two arrays) would fix the immediate problem, though you should implement a more dynamic approach based on the input size.

But start with MAX = 10 while you get the rest of the code working, THEN change to dynamic allocation (it's very easy when everything else works).
