I want to write a C program that reads billions of integers from a text file. Each integer is
10 digits long (e.g. -2311872000). Then I build a linked list to store these integers. How could I do that?

Well, for starters, the example value exceeds the signed 32-bit range, so you have no choice but to bump up to a 64-bit type. Let's assume that you're reading exactly 1 billion of these values. That means you need to store 1 billion 8-byte entities, which is 7GB and change — before you even count the linked list's per-node pointer overhead.

I strongly recommend that you consider options that don't require all of the values in memory at one time, because that much memory usage is excessive and probably unnecessary.

What exactly are you going to do with this linked list?

To search for a number in the linked list, and to find how much time it takes to traverse the linked list.

Is that the ultimate goal? Because if it is you can take a much smaller sample of the numbers and then extrapolate an estimated search time as if the list contained billions of numbers.

I'd suggest that you ask your teacher if he really wants you to store in excess of 8GB in memory at one time. Not only is that generally a bad idea from a design standpoint, it's not safe to assume that a computer has that much RAM to begin with. The test results will be greatly skewed when paging parts of the list in and out of virtual memory.

If all you want to do is find out how long it takes to traverse a linked list of 1 billion nodes, then you can do it with a sample of, say, 1 million nodes and then, in a loop, traverse it 1,000 times. Start the timer before beginning the loop and stop the timer after the loop finishes.

Consider the memory requirements (have a lot, do you?).

1B nodes (1024^3) with a 64-bit value field (8 bytes) and a 64-bit link to the next node (another 8 bytes) = 16+GB... This is NOT a sustainable model! :-) I.e., as others have indicated, this is not feasible on a 32-bit system, and unless you have a 64-bit one with tonnes of RAM (or swap space), it won't work there either. Also, if you have less than 32GB of RAM (space for a 1B-node list plus system memory), your application will be HORRIBLY time-constrained by memory swapping - the heat death of the universe comes to mind...

So, rework your algorithm. Dealing with this amount of data is why we have on-disk database systems.
