Runtime r = Runtime.getRuntime();
Process pr = r.exec(cmdString);

I can get the prompt result when I run "cmd /c type f1.properties". However, when the file is larger, the command hangs: I get no output and no exception occurs.

Is there some limit on how much command-prompt output Java can buffer?

What code are you using to process the output stream from your process?

You may also want to post a bit more of the code you're running there. A lot of people think that if they do:

try {
    Runtime r = Runtime.getRuntime();
    Process pr = r.exec(cmdString);
}
catch (Exception e) {
}

and no stack trace gets printed, that means no exception was thrown. But with an empty catch block like that, who knows ...
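At a minimum, if you're going to catch like that, print the exception so you can actually tell whether exec threw something, for example:

    try {
        Runtime r = Runtime.getRuntime();
        Process pr = r.exec(cmdString);
    } catch (Exception e) {
        // without this, a thrown exception vanishes silently
        e.printStackTrace();
    }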

I have realized that it was the InputStreamReader buffer size limitation; with a larger file the default buffer size of 8192 is too small. How can I increase it so that a larger file can be read?
My code:

public static String executeCmdAndReturnPromptResult(String cmdString)
        throws Exception {

    LOGGER.entering(CLASSNAME,
            "Entering executeCmdAndReturnPromptResult()", cmdString);

    String cmd = cmdString;
    LOGGER.info("The system command: " + cmd);
    Runtime r = Runtime.getRuntime();
    Process pr = r.exec(cmd);
    LOGGER.info("Error: " + getStringFromInputStream(pr.getErrorStream()));
    int exitVal = pr.waitFor();
    // System.out.println("ExitCode cmd return = " + exitVal);
    LOGGER.info("ExitCode cmd return = " + exitVal);

    String promptResultString = getStringFromInputStream(pr.getInputStream());
    LOGGER.exiting(CLASSNAME, "Exiting executeCmdAndReturnPromptResult()",
            promptResultString);

    return promptResultString;
}

=========================================================================

private static String getStringFromInputStream(InputStream is) {

    BufferedReader br = null;
    StringBuilder sb = new StringBuilder();

    String line;
    try {
        br = new BufferedReader(new InputStreamReader(is, "UTF-8"));
        line = br.readLine();
        while (line != null) {
            sb.append(line + "\n");
            line = br.readLine();
        }
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (br != null) {
            try {
                br.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    return sb.toString().trim();
}

That code should be able to read the process output up to the size limit of a single String. Maybe it's the logger that's the problem? Try printing the string to System.out just before returning it.

The problem is not the logger, I am sure... I debugged it inside the code, but I could not step any further into r.exec(cmdString).
1. I tested with a short file and it works fine, but not with a longer file of around 3K.
2. The problem seemed to be the default BufferedReader size of 8192, but it would not work even when I tried to increase it to 500000.
3. Now I do not know how to read a larger file using this method.
If the file is big, the method just hangs without any error or exception.

Java Stream buffer size may affect the efficiency of your execution, but it won't change whether it works or not.
Maybe the problem is that you waitFor the process to terminate before you start to read your input stream. The Process API doc says

Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, or even deadlock.

... in which case you need to swap the code around so that you read all the input from your process, before you waitFor to get the return code, like this example
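In outline, that reordering looks something like this (just a minimal sketch using your own method and variable names, not the linked example):

    Runtime r = Runtime.getRuntime();
    Process pr = r.exec(cmdString);

    // read the subprocess output to end-of-file first, so the pipe is
    // drained and the subprocess can't block on a full output buffer
    String promptResultString = getStringFromInputStream(pr.getInputStream());

    // only then wait for the process and collect the exit code
    int exitVal = pr.waitFor();
    LOGGER.info("ExitCode cmd return = " + exitVal);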

I also saw this one after doing a quick search on Google. Hope this helps.

Thanks, I have also checked that same resource on Google, but it did not help much... :)

Hi James, I have tried swapping the code but the same problem still exists...

It's not working for big files; it only worked on a small file (1-12 lines).

As you said, a deadlock or block occurs if the size of the file exceeds some limit, but why? Maybe my Windows 7 only provides a limited buffer, so the output has a bottleneck. :(

I am lucky that I only have short output for the moment, but this will be a problem in the future if I need to get large output from the command-line "type"... The solution will be complicated if the problem lies in the platform itself.

Yes, it certainly looks like an OS limit on the sysout buffer. Can you post your latest code? What you are trying to do is a perfectly normal thing that people do (successfully!) all the time.

Just for reference...
I ran this code (with try/catch, imports etc.), using your getStringFromInputStream method unchanged, against a 210k text file with perfect results...

         Runtime r = Runtime.getRuntime();
         Process pr = r.exec("cmd /c type c:\\testdata\\long.txt");
         System.out.println(getStringFromInputStream(pr.getInputStream()));
         System.out.println("result " + pr.waitFor());  

... so the question is "what are you doing that's different?"

... and a small ps - a tidied-up version of your getStringFromInputStream

    private static String getStringFromInputStream(InputStream is) {
      StringBuilder sb = new StringBuilder();
      try (BufferedReader br = new BufferedReader(new InputStreamReader(is, "UTF-8"))) {
         // try-with-resources guarantees BufferedReader will be closed properly
         String line;
         // always declare variables in the smallest possible scope
         while ((line = br.readLine()) != null) {
            // saves repeating the read
            sb.append(line);
            sb.append("\n");
            // why do a string concatenation when you've got a StringBuilder?
         }
      } catch (IOException e) {
         e.printStackTrace();
      }
      return sb.toString().trim();
   }

I will test it, thanks!

Hi James,
My Java version is 1.6.45, so the code you modified cannot run in my environment. (I must use 1.6.45 because the requirement is to support it.)

Please run your method against this file attached to see any difference.

==============================================================================
Re: "string concatenation when you've got a StringBuffer"
I must sb.append("\n"); if not the string I get will be only on one long line which looks ugly. Any suggestion not using concatenation?

OK, sorry to hear about the 1.6 requirement. I guess your client/boss knows that it's out of support and has security problems that will never be fixed?
I ran my code initially with your unchanged method, and it works 100%. The subsequent changes to the method were just because I had a spare moment and was a bit bored! I just ran it with your data file, no problem.

The important point is to call getStringFromInputStream before you call pr.waitFor

ps:
Because StringBuilder is optimised for appending, it's considered better practice to code

sb.append(line);
sb.append("\n");

rather than do string concatenation of line and \n, which requires allocation of a new String object. Having said that, I know (have seen) cases where the compiler replaces concatenated strings in a StringBuilder append by multiple calls with one string each in the generated code.

Still not working for the long file attached, which is only 2.4KB, on my PC.
Is it a Windows 7 SP1 (platform) problem or a Java 1.6 problem?

I will try on a Windows 8 PC with a newer Java version, 1.7.

The conclusion:
1. Use Java's FileReader to read the content of a file (better than the "cmd /c type ..." way) - see the sketch below.
2. r.exec(cmdString) will have problems when the InputStream is large (platform dependent!).
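For point 1, a minimal sketch of reading the file directly (Java 1.6 compatible, assuming the usual java.io imports; "f1.properties" is just the example file name from earlier in the thread):

    // read the file with a BufferedReader over a FileReader,
    // instead of shelling out to "cmd /c type ..."
    StringBuilder sb = new StringBuilder();
    BufferedReader br = new BufferedReader(new FileReader("f1.properties"));
    try {
        String line;
        while ((line = br.readLine()) != null) {
            sb.append(line);
            sb.append("\n");
        }
    } finally {
        br.close();
    }
    String content = sb.toString().trim();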

Re: "The important point is to call getStringFromInputStream before you call pr.waitFor()"

Q:James: If you do not wait the command to finish, how to guarantee the output is a complete result? ErrorStream is another story...
Have you tested to put "getStringFromInputStream" after pr.waitFor and result?

Hi James,

My test with Java 1.7.25 is working for me now on the same platform, Windows 7 SP1.

Calling "getStringFromInputStream" either before or after pr.waitFor() worked fine!

Thanks for your help!

  1. With the wait before getting the input stream, the process hangs with a 210k file on my machine (Win7 64-bit, Java 8). No surprise there.

  2. No, there really isn't a problem with large output from exec, provided you code it right (see 1).

  3. When you read the input stream you are reading a pipe from the process's sysout. The pipe remains open until the process terminates, at which point it is closed. Your read loop keeps looping and waiting until the pipe is closed (end of file). Unless you want the return code, the waitFor is redundant after reading the input to EOF.

  4. The error stream is a quandary because you don't know which to read first. By far the easiest thing is to use ProcessBuilder (nowadays always preferred over Runtime.getRuntime) and call redirectErrorStream(true) to merge the error stream into the standard output - see the sketch below.
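For point 4, a minimal sketch of that ProcessBuilder approach (using the same "cmd /c type" example file as before):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    public class RunCommand {
        public static void main(String[] args) throws Exception {
            // build the process and merge stderr into stdout,
            // so there is only one stream to read
            ProcessBuilder pb = new ProcessBuilder("cmd", "/c", "type", "c:\\testdata\\long.txt");
            pb.redirectErrorStream(true);
            Process pr = pb.start();

            // read the merged output to end-of-file before waiting,
            // so the subprocess can never block on a full pipe
            StringBuilder sb = new StringBuilder();
            BufferedReader br = new BufferedReader(new InputStreamReader(pr.getInputStream(), "UTF-8"));
            try {
                String line;
                while ((line = br.readLine()) != null) {
                    sb.append(line);
                    sb.append("\n");
                }
            } finally {
                br.close();
            }

            System.out.println(sb.toString().trim());
            System.out.println("result " + pr.waitFor());
        }
    }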
