In a function of my program, I open and close the file once per recursion, and the recursion goes very deep. I've made sure I'm not writing when it's closed, not opening when it's already open, or any of those trivial issues. Will this be a problem with the operating system? I'm getting a segmentation fault whose position in the recursion varies each time I run it.

Recursion in combination with file access is asking for trouble... I'll bet that somewhere the file is closed or opened without your knowledge :)

Could you post the code?

I have a humble suggestion, since I had the same problem before. Could you read the whole file into memory once? The data should be logically organized, like a table or something similar. Then run your program to process everything in memory, and once it's done, write the result out to a new file. This approach really accelerated my program and made me happy.

Best regards!

In my opinion without some careful management it's asking for trouble. Why do you have to open it? Couldn't you do something like...

std::ifstream fstr;  // or std::ofstream, as appropriate

// ...

fstr.open( "file_name" );
do_recurs_function( fstr, /* other params */ );

I had a similar situation where I was opening and closing streams (to different files, though) in a large loop. I decided to stop the repeated opening and closing, opened once and closed once, and it sped the program up hugely. I honestly can't remember the exact improvement, but I noted it as significant.

Edit - Note: when passing file streams into functions you have to pass by reference (or by pointer); streams can't be passed by value.

#include <fstream>

void fstream_function( std::ifstream &in ) {}  // takes the stream by reference

int main() {
  std::ifstream in;
  fstream_function( in );  // fine: the stream object is never copied
  return 0;
}

Passed the file stream by reference... the seg faults stopped :)