I have a Swing application that works on CSV files. It reads the full file line by line, computes some required statistics, and shows the output.
The upper part of the output screen shows each record from the file, in file order, in a JTable, while the lower part shows statistics computed from that data. The problem is that the JVM takes about 4 times as much memory as the file size: while processing an 86 MB file, the heap uses 377 MB (memory utilization checked with jVisualVM).
1. I have used `LineNumberReader` for reading the file (because of a specific requirement; I can change it if that helps memory usage).
2. Every line is read with `readLine()`, and then `split(",")` is called on that line (a `String`) to get the individual fields of the record.
3. Each record is stored in a `Vector` for display in the JTable, other statistics are stored in a `TreeMap`, and summary data in a JavaBean class. Also, one graph is plotted using
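For reference, the loading loop looks roughly like this. This is a minimal sketch of the approach described above, not my actual code; the class and method names are placeholders:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.LineNumberReader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.Vector;

public class CsvLoader {

    // Reads every line, splits it on commas, and keeps every field
    // as a separate String inside a Vector (one Vector per record).
    static List<Vector<String>> load(BufferedReader reader) throws IOException {
        LineNumberReader in = new LineNumberReader(reader);
        List<Vector<String>> rows = new ArrayList<>();
        String line;
        while ((line = in.readLine()) != null) {
            Vector<String> row = new Vector<>();
            // split(",") allocates a new String per field, each with its
            // own object header and backing char array
            for (String field : line.split(",")) {
                row.add(field);
            }
            rows.add(row);
        }
        return rows;
    }

    public static void main(String[] args) throws IOException {
        String csv = "a,b,c\n1,2,3\n";
        List<Vector<String>> rows = load(new BufferedReader(new StringReader(csv)));
        System.out.println(rows.size());        // 2
        System.out.println(rows.get(1).get(2)); // 3
    }
}
```

Note that because Java `String`s store text as UTF-16 `char`s, each ASCII byte from the file occupies 2 bytes on the heap, and every small field `String` plus its `Vector` entry adds per-object overhead, which may be where much of the 4x blowup comes from.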
Please suggest how to reduce memory utilization, as I need to process a 2 GB file.