Hello everyone,

I currently have an application that reads in an entire Excel file, then iterates through the records from the file and queries our database for a match. If a match is found, the record is ignored; if a match is not found, the record gets added to the database.

Now there are ~2500 records per Excel file, and each of those records has a few attributes linked to it. My problem is that when I try to read in certain files I get a GC overhead error or an out-of-memory error. To resolve this I know I could probably read in a chunk of records instead of the entire file, but I also get the error when I try to read in another file right afterwards.
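The chunked approach might look something like the sketch below. All names here are made up for illustration; the real code would use a spreadsheet library (e.g. Apache POI) to read the rows and JDBC for the database calls. The key point is that the chunk list is cleared after each batch, so no more than one chunk's worth of rows is reachable at a time:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of chunked processing with hypothetical helper names.
public class ChunkedImport {
    static final int CHUNK_SIZE = 500;
    static int inserted = 0; // counts inserts, just to make the sketch observable

    static boolean existsInDatabase(String record) {
        return false; // placeholder for the real DB lookup
    }

    static void insertIntoDatabase(String record) {
        inserted++;   // placeholder for the real INSERT
    }

    static void processChunk(List<String> chunk) {
        for (String record : chunk) {
            if (!existsInDatabase(record)) {
                insertIntoDatabase(record);
            }
        }
    }

    public static void importFile(int totalRows) {
        List<String> chunk = new ArrayList<>(CHUNK_SIZE);
        for (int i = 0; i < totalRows; i++) {   // stands in for reading one row at a time
            chunk.add("row" + i);
            if (chunk.size() == CHUNK_SIZE) {
                processChunk(chunk);
                chunk.clear();                  // drop references so the rows can be GCed
            }
        }
        processChunk(chunk);                    // handle any leftover partial chunk
    }
}
```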

Is there any way to release the old data from the JVM's heap after I am through uploading it? Or is there some obvious way I may be missing to do this more efficiently?

Thanks for your help!!

If you ensure that you have no remaining references to the "old" data, then it will be garbage collected before the JVM runs out of memory and throws an OutOfMemoryError. (In a complex program it's all too easy to leave an object in a List somewhere, which prevents it from being GCed.)
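Here's an illustration of that kind of leak (names made up): a long-lived list that silently keeps every processed record reachable until it is explicitly cleared.

```java
import java.util.ArrayList;
import java.util.List;

// A long-lived collection that accidentally pins every record in memory.
public class UploadBuffer {
    private static final List<byte[]> processed = new ArrayList<>();

    static void upload(byte[] record) {
        // ... write the record to the database ...
        processed.add(record); // bug: the record stays reachable forever
    }

    static void finishBatch() {
        // Clearing the list drops the references, so the records become
        // eligible for garbage collection.
        processed.clear();
    }

    static int bufferedCount() {
        return processed.size();
    }
}
```

As long as `upload` keeps adding to `processed` without anything ever clearing it, the GC can never reclaim those records no matter how often it runs.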
You may also increase the JVM's memory allocation; the default isn't very big.
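For example, with a HotSpot JVM you can set the initial and maximum heap sizes on the command line (the values and jar name here are illustrative only):

```
java -Xms256m -Xmx1024m -jar excel-importer.jar
```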

Thanks for the quick response!

I went over my code and sure enough I had a list that wasn't getting cleared (a sizeable list at that). And I increased the heap size. No more errors!
