Dear all,
I wrote a program in C++ for data processing.
The code reads, processes, and writes output several times.
The process aborts when the output file reaches 2 GB in size.
I know that it is not a disk space limit.
Could this be a wrong or missing flag in the makefile?
Thanks

All 16 Replies

What compiler and operating system? It could be a limit of your OS and/or compiler. Try your program on a 64-bit OS, such as 64-bit XP or Vista, and see if that fixes it.

I am using CentOS 4 and the GNU compiler.

I don't use those, but I think I've read that g++ and gcc have an option to produce 64-bit programs.
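
I'm not certain of the exact setup on CentOS 4, but on x86_64 the usual gcc/g++ flag for that is -m64. A quick sanity check (just a sketch, not your actual code) to see what your current makefile really builds:

#include <stdio.h>
#include <sys/types.h>

/* Sanity check: print the sizes of a few types.  In a 64-bit build on
   Linux, pointers and long are 8 bytes; sizeof(off_t) shows how large a
   file offset the C library is using in this particular build. */
int main(void) {
    printf("sizeof(void*) = %u\n", (unsigned)sizeof(void*));
    printf("sizeof(long)  = %u\n", (unsigned)sizeof(long));
    printf("sizeof(off_t) = %u\n", (unsigned)sizeof(off_t));
    return 0;
}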

Um - it may not be a "disk limit" as in a full disk, but your file system type (ext3 at a guess) probably has a maximum file size.

Hope this helps.
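
While you're ruling things out: besides the file system, there is also a per-process file size limit (the one getrlimit/setrlimit and the shell's ulimit control). Something like this (a minimal sketch) prints what your process is allowed to write:

#include <stdio.h>
#include <sys/resource.h>

/* Reports the per-process maximum file size.  If a finite limit is set
   and exceeded, the kernel stops the write (and can kill the process
   with SIGXFSZ), which would also look like an abort at a fixed size. */
int main(void) {
    struct rlimit rl;
    if (getrlimit(RLIMIT_FSIZE, &rl) != 0) {
        perror("getrlimit");
        return 1;
    }
    if (rl.rlim_cur == RLIM_INFINITY)
        printf("file size limit: unlimited\n");
    else
        printf("file size limit: %llu bytes\n", (unsigned long long)rl.rlim_cur);
    return 0;
}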

I have checked, and that is not the case.

You say you process it several times. Do you attempt to read in the full file, process it, add to it, then write a new, larger file? Perhaps showing us pertinent portions of the code would help.

No.
The data is a large 3D matrix.
I read one line, process that line, and write the output line.
Each line is about 1 megabyte of the file.

Is it aborting because the OS can't handle the file, or is there a file size counter you're maintaining in the program?

Remember that a 32-bit signed int has an upper limit of 2,147,483,647, i.e. just under 2 GB.
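
For illustration (assuming the usual 32-bit int), a byte count anywhere near your file size simply does not fit in an int:

#include <stdio.h>
#include <limits.h>

/* Illustration: about 2100 one-megabyte lines add up to more bytes than
   a 32-bit signed int can represent. */
int main(void) {
    long long bytes = 2100LL * 1024 * 1024;

    printf("INT_MAX = %d\n", INT_MAX);
    printf("2100 MB = %lld bytes\n", bytes);
    printf("fits in an int? %s\n", bytes <= INT_MAX ? "yes" : "no");
    return 0;
}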

There is no counter inside the code.
The memory is cleared every time the program finishes processing one line and writes its output.

I can't see it being a limit of the OS file system on any current version - they all allow files of at least 4GB.

Without some pertinent code, I have no more guesses as to what's going on.

Idea:
What about looking at the exit code?
See if it's equal to the return value that you return inside your code.

If not, something abnormal has happened. Then you can use a debugger
to watch what's going on.
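
If it's awkward to capture the exit code from wherever you launch the program, a small wrapper like this (just a sketch; pass your program as the argument) reports whether the process exited normally with a code or was killed by a signal - exceeding a file size limit, for instance, kills a process with SIGXFSZ rather than letting it return:

#include <stdio.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/wait.h>

/* Sketch of a wrapper: run the command given on the command line and
   report how it ended - a normal exit code, or the signal that killed it. */
int main(int argc, char *argv[]) {
    pid_t pid;
    int status;

    if (argc < 2) {
        fprintf(stderr, "usage: %s program [args...]\n", argv[0]);
        return 1;
    }

    pid = fork();
    if (pid == 0) {                   /* child: run the program */
        execvp(argv[1], &argv[1]);
        perror("execvp");
        _exit(127);
    }

    waitpid(pid, &status, 0);         /* parent: wait and report */
    if (WIFEXITED(status))
        printf("exited with code %d\n", WEXITSTATUS(status));
    else if (WIFSIGNALED(status))
        printf("killed by signal %d\n", WTERMSIG(status));
    return 0;
}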

Is it possible that this comes from a wrong flag in the makefile, or the absence of a flag?

The output is OK.
The code does the processing fine. I checked the output, and I also ran the code with a smaller data file, which runs OK.

By "exit code" NicAx64 meant the code that the program returns to the OS.

Try writing a simple program that produces a file over 2GB and see if that works. Write OVER 2,147,483,648 (2GB) bytes. Compile it identically to your problem program. Does it work?

OK,
I will try it and send you the feedback soon.
Thanks

By "exit code" NicAx64 meant the code that the program returns to the OS.

Try writing a simple program that produces a file over 2GB and see if that works. Write OVER 2,147,483,648 (2GB) bytes. Compile it identically to your problem program. Does it work?

Try something like this to see if it works:

#include <stdio.h>

char buf[1024];

int main() {
    unsigned i;
    FILE* f = fopen ("data.bin", "wb");

    if (f == NULL) {
        perror ("fopen");
        return 1;
    }

    /* Write exactly 2GB + 1MB, one 1KB block at a time */
    for (i = 0; i < 2 * 1024 * 1024 + 1024; ++i) {
        if (fwrite (buf, 1024, 1, f) != 1) {
            perror ("fwrite");                 /* report why writing stopped */
            printf ("failed after %u KB\n", i);
            break;
        }
    }

    fclose (f);
    return 0;
}