I am trying to read and write large files of about 600 MB.
The thing is, when I use the normal FileOutputStream methods, the JVM throws an OutOfMemoryError.
This is how I am trying to read the stream and write out the file:
FileOutputStream out = new FileOutputStream("C:/my_life_story.zip");
InputStream in = ...; // an input stream from a socket, etc.
byte[] buffer1 = new byte[8192]; // copy in fixed-size chunks; 8192 is an arbitrary size
int len = 0;
while ((len = in.read(buffer1)) > 0)
    out.write(buffer1, 0, len);
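For completeness, here is a self-contained sketch of the same chunked copy that also closes the stream when it finishes (try-with-resources needs Java 7 or later; the method name and the 8 KB buffer size are placeholders, not from the original post):

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class FileCopy {
    // Streams the input to disk chunk by chunk, so memory use is
    // bounded by the buffer size no matter how large the file is.
    static void copyToFile(InputStream in, String path) throws IOException {
        try (OutputStream out = new FileOutputStream(path)) {
            byte[] buffer = new byte[8192]; // placeholder chunk size
            int len;
            while ((len = in.read(buffer)) != -1) {
                out.write(buffer, 0, len);
            }
        } // the stream is flushed and closed here, even if an exception occurs
    }
}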
My computer does have sufficient memory.
I tried adding this in the while loop, but it does not work, and the same exception is thrown by the JVM again.
My question is basically: how do I increase the memory programmatically, or is there a way in which I can read the file without the above exception being thrown by the JVM?
Try flushing the buffer after every write.
The only thing I can think of that's eating up memory here is the output buffer. It may not be flushing automatically, causing the JVM to try to hold the entire output in RAM instead of writing it directly to disk (which is what you seem to be attempting).
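A minimal sketch of that suggestion applied to the loop from the question (worth noting that flush() on a raw FileOutputStream is essentially a no-op, since it keeps no internal buffer, so this is just the literal change being proposed):

while ((len = in.read(buffer1)) > 0) {
    out.write(buffer1, 0, len);
    out.flush(); // push the chunk out after every write
}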
Unfortunately I can't use the above approach, because the application is in a jar file on Windows and users usually run it by double-clicking the jar. Is there a way to do what you suggested programmatically, maybe by using properties?
I think jwenting may be right that I have to flush, and I will try it out.
On another question: do any of you know how to increase the heap size programmatically?
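As far as I know you can't: the maximum heap (-Xmx) is fixed when the JVM starts and cannot be raised from inside the running process. A common workaround is a small launcher that relaunches the application in a fresh JVM with a larger limit. A rough sketch, in which the 512 MB threshold, the 1024m limit, and myapp.jar are all placeholders:

import java.io.File;

public class Launcher {
    public static void main(String[] args) throws Exception {
        // Relaunch with a larger heap if the current limit looks too small.
        if (Runtime.getRuntime().maxMemory() < 512L * 1024 * 1024) {
            String java = System.getProperty("java.home")
                    + File.separator + "bin" + File.separator + "java";
            new ProcessBuilder(java, "-Xmx1024m", "-jar", "myapp.jar")
                    .inheritIO()   // reuse this process's console (Java 7+)
                    .start()
                    .waitFor();
        } else {
            // heap is big enough: run the application directly
        }
    }
}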