Some time ago, I wrote a program that used the write() function of an fstream object to write a large file to a 100 MB Zip disk (it was an old computer). When I "End Now"ed the program, I found that the Zip disk was corrupted: the file didn't show up, but a lot of the free space was gone, and I had to format the disk.

I'm guessing it was because the close() function was never called, and so the result would have been the same if I had run the program on my hard drive (which I'm glad I didn't). Is there any way of getting around this problem?

I know there's an atexit() function, but I tried it with cout and it crashed.

Thanks for your time

Last Post by cog_bn

Even if your program doesn't "close" the file, the run-time library will, and failing that the OS will do it when the process terminates.

On a hard disk, the worst that could ever happen would be lost sectors. NTFS is pretty robust against such things (extreme case is power loss, and disks routinely pass this test without severe damage).

Zip drives were not exactly the most reliable technology, as I recall, so it could just as well have been a buggy driver as your code.

If you're just using the standard library I/O mechanism, there are a lot of steps between your code and the surface of the disk, any of which could go wrong in spectacular fashion.


I'm really sorry about this: I have no idea what's going on now. I've run the same program and it's produced a different result.

Last time, the file that was created didn't even show up until the program exited (just after the close() function was called), and that's why, when lots of disk space was taken up but no file was produced anywhere, I blamed the close() function not being called. And I did search for that file, pressed F5, everything.

This time, the file was produced right at the start and its size just kept increasing. Plus, when I tried to close the program it just closed, without Windows saying "this program is not responding, do you want to End Now". I've been trying the whole morning and I can't get it to crash like last time.

If anyone knows what's going on, that'll be brilliant, but I think it was probably just the computer misbehaving. Or maybe Windows updated itself at some point.


The code was just a normal

fstream file("rubbish.bin",ios::out|ios::binary|ios::app);
for(/*some random condition*/){
   file.write(/*some random data*/);
}

And I wondered whether it was the "ios::app" that did something funny, but it's not that either.


Well yes, if for(/*some random condition*/) never terminates, then your disk will fill up rapidly.

Post actual code, not a summary (or even an abridged or abbreviated version).


Fine, if you insist:

#include <fstream>
#include <iostream>
#include <iomanip>
using namespace std;

int main(){
	char* data=new char[10<<20]();	//10MB buffer, zero-initialized
	unsigned int size;
	cout<<"Size (in MB, to the nearest 10):\n\t";
	cin>>size;
	cout<<"\n\nData written (MBs):\t    0...";	//4 spaces
	fstream file("RandomRubbish.bin",ios::out|ios::binary);

	for(unsigned int MB=10;MB<=size;MB+=10){
		for(int i=0;i<(10<<20)-8;i+=8)
			(*reinterpret_cast<unsigned long long int*>(data+i))++;
		file.write(data,10<<20);
		if(!file){
			cout<<"\a\a\a\a\a\n\nFile bad!!\n\n";
			for(int i=0;i<20;i++)cout<<'\a';	//To get my attention
			break;
		}
		cout<<"\b\b\b\b\b\b\b"<<setw(4)<<MB<<"...";
	}
	file.close();
	delete[] data;
	return 0;
}

As I say though, it's working fine now. And I've tried "End Process Tree"ing it; it was still working fine. It was probably just the computer misbehaving back then. Maybe it was some other program (I have changed the anti-virus from F-Secure to Sophos, so I don't know if that could have affected it), or maybe it was just an out-of-date Zip drive driver.
