Hi,

I have a problem reading a JSP file called scan.jsp. It is a very big file, about 9000 lines, containing JavaScript code, some imported files, etc. I am unable to read all lines of data when I try the following code.



public static final void startReadProcess(String sourcFile) {
    LineNumberReader lnr = null;
    try {
        lnr = new LineNumberReader(new FileReader(sourcFile));
        while ((lnr.readLine()) != null) {
            System.out.println(lnr.readLine());
        }
    } catch (Exception e) {
        e.printStackTrace();
        PrintErrorLog.printError("Error while readingSoureFile");
    } finally {
        closeReaderObj(lnr);
    }
}



Can anyone please let me know why data loss is happening while reading big files?

Perhaps you should use BufferedReader.

    Path sourceFile = Paths.get("scan.jsp");
    Charset charset = StandardCharsets.UTF_8;
    try (BufferedReader reader = Files.newBufferedReader(sourceFile, charset)) {
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
    } catch (IOException x) {
        System.err.format("IOException: %s%n", x);
    }


Can anyone please let me know why data loss is happening while reading big files?

Because each readLine() call reads a line and advances the reader's position. Your loop calls readLine() twice per iteration: once in the while condition, where the result is discarded, and once inside System.out.println, so you print only every other line. Assign the result of a single readLine() call to a variable (as shown above) and print that variable to get the expected behaviour.
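If you want to stay with LineNumberReader, here is a minimal, self-contained sketch of the fix: it calls readLine() exactly once per iteration and keeps the result in a variable. It reads a temporary sample file rather than your actual scan.jsp, since I don't have that file; the file name and contents here are just placeholders.

```java
import java.io.IOException;
import java.io.LineNumberReader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class ReadAllLines {
    public static void main(String[] args) throws IOException {
        // Placeholder sample file standing in for scan.jsp
        Path sample = Files.createTempFile("scan", ".jsp");
        Files.write(sample, List.of("line 1", "line 2", "line 3", "line 4"));

        // Call readLine() once per loop iteration and print that same result,
        // so no line is silently consumed and discarded
        try (LineNumberReader lnr = new LineNumberReader(Files.newBufferedReader(sample))) {
            String line;
            while ((line = lnr.readLine()) != null) {
                System.out.println(lnr.getLineNumber() + ": " + line);
            }
        }
        Files.deleteIfExists(sample);
    }
}
```

The try-with-resources block also closes the reader for you, so the separate closeReaderObj helper in the original code is no longer needed.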

Comments
Excellent focused and correct answer.