Hi,

I am very new to C and was wondering if anybody can help me. I want to create an array of 1 million bits. I heard you can use malloc or calloc. Does anybody know how to do this?

Thanks in advance,

sc

Arrays, and even calloc and malloc, allocate contiguous memory, and the amount of contiguous memory you can allocate at one time may be limited. If you need to work with a large data set, try linked lists. If you still want to stick to calloc, malloc, etc., then try farmalloc or halloc, i.e., you may have to change the memory model.
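To make the linked-list suggestion concrete, here is a minimal sketch in C of grabbing the storage as a chain of small, fixed-size chunks instead of one big contiguous block. The names chunk and CHUNK_BYTES, and the 4096-byte chunk size, are just made up for illustration:

#include <stdio.h>
#include <stdlib.h>

#define CHUNK_BYTES 4096              /* arbitrary chunk size for this sketch */

struct chunk {
    unsigned char data[CHUNK_BYTES];
    struct chunk *next;
};

int main(void)
{
    size_t total = 125000;            /* 1,000,000 bits = 125,000 bytes */
    size_t allocated = 0;
    struct chunk *head = NULL;

    /* Grab the storage one small piece at a time instead of one big block. */
    while (allocated < total) {
        struct chunk *c = calloc(1, sizeof *c);
        if (c == NULL) {
            fprintf(stderr, "ran out of memory after %lu bytes\n",
                    (unsigned long)allocated);
            break;
        }
        c->next = head;
        head = c;
        allocated += CHUNK_BYTES;
    }
    printf("allocated %lu bytes in chunks\n", (unsigned long)allocated);

    /* Walk the list and free every chunk. */
    while (head != NULL) {
        struct chunk *next = head->next;
        free(head);
        head = next;
    }
    return 0;
}

Indexing into a chunked structure takes a little more work than plain array indexing, but no single allocation has to be large.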

If you still want to stick to calloc, malloc, etc., then try farmalloc or halloc, i.e., you may have to change the memory model.

Those two functions are only available on ancient 16-bit compilers such as Turbo C. Modern 32-bit compilers have no such functions. OP didn't say what compiler or operating system he is using.

I want to create an array of 1 million bits.

Arrays are allocated in bytes, not bits, so 1 million bits on most computers will be 1,000,000 / 8 = 125,000 bytes. :)
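As a concrete sketch of that arithmetic, one common approach in C is to calloc (1,000,000 + 7) / 8 bytes and treat each byte as 8 flags; calloc also zeroes the whole thing for you. The names bits and NBITS, and the example bit index, are just for illustration:

#include <stdio.h>
#include <stdlib.h>

#define NBITS 1000000                      /* one million flags */

int main(void)
{
    size_t nbytes = (NBITS + 7) / 8;       /* 8 bits per byte, rounded up */
    unsigned char *bits = calloc(nbytes, 1);   /* calloc zeroes every bit */

    if (bits == NULL) {
        fprintf(stderr, "out of memory\n");
        return 1;
    }

    /* set bit 12345, then test it */
    bits[12345 / 8] |= (unsigned char)(1u << (12345 % 8));
    if (bits[12345 / 8] & (1u << (12345 % 8)))
        printf("bit 12345 is set\n");

    free(bits);
    return 0;
}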

Those two functions are only available on ancient 16-bit compilers such as Turbo C. Modern 32-bit compilers have no such functions. OP didn't say what compiler or operating system he is using.

Hmmm, I think you are right. I used farmalloc only in Turbo C. I guess on 32-bit addressing systems there won't be any data that is "far"; it should always be "near", hence there's no need for a different addressing scheme.
Here I have a question. As far as I knew, on 16-bit systems (or at least in DOS) you cannot declare an array that contiguously takes more than 64K of memory. I was wondering what the case is with 32-bit systems. Are there any such limitations on 32-bit systems?

Are there any such limitations on 32-bit systems?

I think there is a 2 GB limit per process on a 32-bit OS, because the 4 GB address space is split between user space and the kernel.

Each process is limited to 2 GB, or 3 GB with the /3GB switch on Windows.
Moreover, an array is contiguous, so it may not be able to use all of that memory because of heap fragmentation.

Why do you want to allocate that much memory?

I have to do a search for a key and store the result so that I can compare it with another search result.
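If the keys can be mapped to indexes 0..999,999, one bit per key works nicely for that: set a bit for every hit of the first search, do the same in a second bit array for the second search, then walk both and compare. A minimal sketch of that idea; set_bit, first, second, and the pretend hit indexes are all made up for illustration:

#include <stdio.h>
#include <stdlib.h>

#define NBITS  1000000
#define NBYTES ((NBITS + 7) / 8)

/* Mark index i as "found" in the given bit array. */
void set_bit(unsigned char *bits, long i)
{
    bits[i / 8] |= (unsigned char)(1u << (i % 8));
}

int main(void)
{
    unsigned char *first  = calloc(NBYTES, 1);   /* results of the first search */
    unsigned char *second = calloc(NBYTES, 1);   /* results of the second search */
    long i, common = 0;

    if (first == NULL || second == NULL)
        return 1;

    /* Pretend the two searches each hit a few indexes. */
    set_bit(first, 10);  set_bit(first, 500000);
    set_bit(second, 10); set_bit(second, 999999);

    /* Count indexes hit by both searches. */
    for (i = 0; i < NBITS; i++)
        if (first[i / 8] & second[i / 8] & (1u << (i % 8)))
            common++;

    printf("%ld keys found by both searches\n", common);
    free(first);
    free(second);
    return 0;
}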

Ok here is my input to the post. You can just type:

int bigthing[1000000];

and you should be fine. I just tried it. Now, if for whatever reason your program does not have that much contiguous memory, some funny things may happen. When dealing with big things like that, the best thing to do is to use a linked list or some other data structure that is not contiguous in memory like arrays are. Take care and good luck.

-r

On my PC it crashed on both VC6.0 and Dev-C++ 4.9.9.2... and the reason is that such a large array, declared as a local variable, exceeds the stack size (a stack overflow). You need to allocate the memory on the heap instead.

int *bigthing=new int[1000000];

Don't forget to release the memory with delete[].
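Since the original question was about C, the C equivalent of that heap allocation would use malloc and free rather than new and delete[]. A minimal sketch, using the same 1,000,000-int size as the reply above and the usual check that the allocation succeeded:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int *bigthing = malloc(1000000 * sizeof *bigthing);  /* lives on the heap */

    if (bigthing == NULL) {
        fprintf(stderr, "malloc failed\n");
        return 1;
    }

    bigthing[999999] = 42;          /* use it like an ordinary array */
    printf("%d\n", bigthing[999999]);

    free(bigthing);                 /* don't forget to free it */
    return 0;
}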

Well, like I said, if for whatever reason the program cannot get that amount of contiguous memory, some funny things may happen. Like a crash. :) I was just trying to state the point that you can do it that way, but there was no way to be sure that it would work. A safer way would be what you wrote. ;) Take care,

-r
