An exercise in Stroustrup's book is to write a program that allocates memory in an infinite loop and to let it run until it crashes. He says the crash will be due to memory exhaustion, and he suggests researching the topic.

I did, and I am frightened by what I found. The vast majority of the pages I read were related to hacker attacks. Apparently, memory exhaustion/stack overflow can be used as a form of denial-of-service attack, and many of these pages (from 2008/2009) said that Chrome is vulnerable to it. Only a handful dealt with non-attack memory exhaustion and discussed a solution, such as PHP's ini_set('memory_limit', -1).

What is Stroustrup trying to get at with this exercise? What will happen if I run a program with an infinite loop? I am very reluctant to run such a program because I am fearful of doing something I can't undo. How does a programmer estimate the amount of memory that a program might need in order to avoid this situation? For example, the number of int variables times 4 bytes per int?


Why? Memory exhaustion is a very real possibility in a system with finite resources.

What is Stroustrup trying to get at with this exercise?

The point is that you have to be careful. If you keep allocating and never free anything, you will run out of memory.
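A minimal sketch of the contrast, assuming any standard C++ compiler (the loop bound and buffer size are arbitrary, just for illustration):

#include <cstddef>

int main()
{
    // Each pass frees what it allocated, so total memory use stays bounded.
    for (std::size_t i = 0; i < 1000000; ++i) {
        char* buffer = new char[1000];
        buffer[0] = 'x';      // use the memory
        delete[] buffer;      // without this line, the loop leaks 1000 bytes per pass
    }
    return 0;
}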

What will happen if I run a program with an infinite loop? I am very reluctant to run such a program because I am fearful of doing something I can't undo.

That depends on the program. If you don't do anything in the loop, you won't run out of memory--it will just run forever. Normally you won't write infinite loops; there should always be something that will break out of the loop. The loop/allocate exercise is just a quick way to demonstrate what happens if you don't free memory you've allocated.

How does a programmer estimate the amount of memory that a program might need in order to avoid this situation? For example, the number of int variables times 4 bytes per int?

The exercise is looking specifically at dynamic memory allocation, probably using malloc. IMO, this is mostly taken care of by careful program design, because memory management in C++ is tricky.
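As for estimating: for fixed-size data you can get a rough figure by multiplying element counts by sizeof, much like the "ints times 4 bytes" idea in the question. A hedged sketch (the counts here are hypothetical, and sizeof values vary by platform):

#include <cstddef>
#include <iostream>

int main()
{
    // Hypothetical counts for a program's major data structures.
    const std::size_t num_ints    = 150000;
    const std::size_t num_doubles = 2000;

    // sizeof reports bytes per element on *this* platform; int is often
    // 4 bytes and double often 8, but neither is guaranteed.
    std::size_t estimate = num_ints * sizeof(int)
                         + num_doubles * sizeof(double);

    std::cout << "Rough data-size estimate: " << estimate << " bytes\n";
    return 0;
}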

>> What will happen if I run a program with an infinite loop? I am very reluctant to run such a program because I am fearful of doing something I can't undo.


I doubt there's anything to "undo". If you run a program with code like this:

#include <cstdlib>

int main()
{
    while (1) {
        char* memory_leak = (char*) malloc(1000);  // never freed: a deliberate leak
    }
}

Obviously something's not going to work fairly soon, and the program will have to be aborted. Depending on how your OS handles things, it'll either let your program keep gobbling as much memory as there is, so all of your other programs freeze, or it won't let the program take the entire computer's memory and things will be a little better. Obviously this code can't recover, so it needs to be killed, and when your system gets overwhelmed, killing it may take a while.

But there's nothing to undo. Turn the computer off and on again, and it's no longer running. Yeah, some things may not have shut down correctly, so it may take an extra restart, but that's the end of the problem. If you set it up as a service that runs EVERY time you boot, well then you have problems. But if you just run it once, since it only deals with volatile memory, it can't hurt you anymore once you turn the computer off (until you run it again). The same goes for a fork bomb.

So I tried to exhaust the memory on my computer. I use a laptop running XP that has 1 GB of RAM. I used CTRL+ALT+DEL / Performance to see how much available memory there was. Before running the program there was 467K of available memory. I eventually ran the following program.

#include "std_lib_facilities.h"  // Stroustrup's support header: provides cout (via using namespace std) and keep_window_open()

int main()
{
    for (int i = 0; i < 150000; ++i) {
        double* pi = new double[2000];   // allocated every pass, never deleted
        *pi = i * 2.5;
        cout << i << "  " << *pi << endl;
    }

    keep_window_open();
    return 0;
}

It stopped at iteration 133,033. My computer was almost non-responsive. After a few minutes, VC++ gave the following error message:
"Unhandled exception. std::bad_alloc at memory location (hex address)."
Is that what is supposed to happen? Is this what usually happens?
Thanks for your help.

Yep, that's what's supposed to happen. When "new" can't allocate the memory, it throws a "bad_alloc" exception. If you don't catch it, the program aborts. A double is usually 8 bytes, so 2000 of them would be 16,000 bytes. 1 gigabyte of RAM divided by 16,000 bytes is 62,500, give or take, so you should have gotten at most 62,500 iterations, but you got 133,000, so I'm puzzled there. It seems like it should have run out of memory way earlier. Are you sure you only have a gigabyte of RAM? Maybe it started allocating memory on the hard drive.

But anyway, bad_alloc is exactly what one would expect for that program. Once it crashed, the memory should be freed and you'll be back to normal soon, though it may take a while to free all the memory.
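For what it's worth, here's a hedged sketch of the same program with the leak fixed, assuming the same std_lib_facilities.h header as above. Freeing each block means the loop should run all 150,000 iterations without exhausting memory:

#include "std_lib_facilities.h"  // assumed, as in the original program

int main()
{
    for (int i = 0; i < 150000; ++i) {
        double* pi = new double[2000];
        *pi = i * 2.5;
        cout << i << "  " << *pi << endl;
        delete[] pi;   // free the block before the next iteration
    }

    keep_window_open();
    return 0;
}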

Yeah, definitely, this is what was expected to happen. The reason you get so many iterations is virtual memory. You said you had 467K available... was that 467M? Because if it is only 467K out of 1G, you should think about fixing something, because I imagine your computer will be extremely slow at running anything (you essentially have no more RAM to execute anything but the OS... classic Windows problem!). Anyhow, on WinXP, I believe the default size of virtual memory is 2GB, which would explain the number of iterations almost exactly: 2GB divided by the 16,000 bytes per allocation is roughly 134,000, close to the 133,033 you observed. The OS essentially just kept the program running off the hard drive, expecting way more memory demand than it could provide in RAM, leading to slow execution, freezing of the other programs, and ultimately running out of the 2GB of virtual memory.

Now, there is nothing "dangerous" about running a program that runs out of memory. The OS loads programs in a sort-of controlled environment; a program cannot affect the other programs by any means other than making the whole computer extremely slow while it is running. When the uncaught "bad_alloc" exception is thrown, the OS kills the process and frees all memory associated with it, and you are back to normal.

The reason some ill-intentioned people can use memory exhaustion as an attack is basically that it is an easy way to make a computer / system / server unresponsive to anything else: it spends all its time running the infinite loop and allocating all this memory. That is the basic idea of a "denial of service" attack: you bloat the system or server so much that it can no longer provide its service (and thus denies it to all users).

The way to deal with this memory exhaustion problem is to actually catch the "bad_alloc" exception and deal with it in the program rather than letting the OS kill the process. Maybe 99% of the time, you don't need to care about memory exhaustion at all. But some programs that crunch a lot of data, or heavy computer games, might either check the available memory (via some OS-specific calls) beforehand and tell the user that "the computer does not meet minimum memory requirements", or check every memory allocation for the "bad_alloc" exception and deal with it by either resorting to slower but less memory-hungry algorithms or exchanging large chunks of data between temporary files (when not used) and RAM (when used).
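A minimal sketch of that catch-and-recover idea. The halving fallback is just one illustrative strategy, and whether the first oversized request actually fails depends on the platform:

#include <cstddef>
#include <iostream>
#include <new>        // std::bad_alloc

int main()
{
    std::size_t count = 1000000000;  // deliberately huge request
    double* data = nullptr;

    // Keep halving the request until an allocation succeeds.
    while (data == nullptr && count > 0) {
        try {
            data = new double[count];
        } catch (const std::bad_alloc&) {
            std::cout << "Allocation of " << count
                      << " doubles failed; trying half that.\n";
            count /= 2;
        }
    }

    if (data) {
        std::cout << "Got a buffer of " << count << " doubles.\n";
        delete[] data;
    }
    return 0;
}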

Correction! I misread the available memory on my computer. Mike 2000 is correct that it should have been 467M. The display read 466,800, but I forgot that the display was in K, so I should have added another three zeros to that number.

This has been a very informative exercise for me. Many thanks to those who contributed. I feel now I can almost call myself a geek! I am declaring the thread solved.
