I am trying to put together a web interface for a process which downloads a number of XML files and then processes them. I am using JavaScript / jQuery / AJAX to initiate the process on the server, one file at a time, and to display progress information after each step.

My problem is that although everything seems to work, the progress messages only appear on the browser in one go after the entire process is completed, which is kinda pointless.

Here is my code - any suggestions will be welcome:

jQuery(document).ready(function($) {
  // '#update-form' is a placeholder - the actual selector was lost when I pasted the code
  $('#update-form').ajaxForm({
    async: false,
    beforeSubmit: function(formData, jqForm, options) {
      document.getElementById('submit_button').disabled = true;
      document.getElementById('progress-messages').innerHTML = '';
    },
    success: function(responseText, statusText, xhr, $form) {
      if (responseText == '1') {
        document.getElementById('progress-messages').innerHTML += '<h3>Retrieving XML Data from Amrod</h3>';
        var files = ['products', 'categories', 'images', 'branding'];
        var i = 0;
        while (files[i]) {
          $.ajax({
            type: "POST",
            url: ajaxurl,
            async: false, // Hopefully wait for a response from the server?
            data: {
              action: 'update_products',
              file: files[i]
            },
            success: function(response) {
              document.getElementById('progress-messages').innerHTML += '<p>Retrieved ' + response + ' file</p>';
              setTimeout(function() {
                // Do nothing for a sec to see if the innerHTML will display - it doesn't.
              }, 1000);
              alert(); // This causes the innerHTML to display in the correct sequence, but is a pain - not a solution.
            }
          });
          i++;
        }
      }
      document.getElementById('progress-messages').innerHTML += '<h3>Processing XML Data</h3>';
      // Do other stuff
      document.getElementById('submit_button').disabled = false;
      return false;
    }
  });
});


Just curious as to why you are using jQuery and then raw JavaScript for the DOM?

that's most probably because you are using a "while" loop...

paulkd: Probably because I don't know any better! I use jQuery for the heavy lifting, e.g. AJAX, and plain JavaScript for the easy bits. I doubt it has any bearing on my problem, though.

Troy III: The problem persists even when there is no loop, and in any event, if there is code in a loop that says "do stuff", the stuff should be done every time the loop iterates, right? It seems like some of the "stuff" is being saved up until the loop finishes iterating, and only then gets done, i.e. changing the innerHTML of the "progress" div.

The setTimeout function does not wait for a second, if that's what you're after. So the changes in the loop (without the alert) happen so quickly that the DOM gets no time to refresh, and therefore only updates after the loop is done.


It's common to try to create a sleep or delay, but in JavaScript the setTimeout function is not used in the manner you are attempting. With setTimeout you can create a delay, but that does not stop the rest of the code from executing. The setTimeout() method waits the specified number of milliseconds and then executes the specified function.

In your case, you did not specify anything to execute after 1000 milliseconds. So there was a delay to execute nothing, but again, in the meantime the rest of the code was being executed.

The following would execute an alert after 3 seconds:

    setTimeout(function() {
        alert('3 seconds have passed');
    }, 3000);

If you had additional JavaScript code after that call, it would execute immediately, without waiting for the alert function to fire.
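To make that concrete, here is a minimal sketch (not from the original post) you can run anywhere JavaScript runs; it records the order in which the statements actually execute:

```javascript
// setTimeout schedules work for later; it never pauses the code that follows it.
var order = [];

order.push('before timer');

setTimeout(function () {
    order.push('timer fired'); // runs only after the current code has finished
}, 1000);

order.push('after timer'); // executes immediately, without waiting

// At this point order is ['before timer', 'after timer'];
// 'timer fired' is appended only once the 1000 ms delay elapses.
```

The same thing happens in the loop above: every iteration runs to completion before any scheduled callback (or browser repaint) gets a turn.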

Thanks for the setTimeout tip. I did try placing the actions currently within the loop inside the setTimeout function, and that made no difference either, even when I set the timeout value to something like 10 seconds.

The activity in the loop actually downloads an XML file to the server, and the files are up to 35 MB in size, so there is plenty of time to refresh the DOM.

I am trying to get something on the page to keep the user amused, so that he knows there is something going on. I tried a jQuery UI progress bar, but couldn't get that to work either.

After downloading the files, I will be reading each file and writing each record into a database table - that's the "do other stuff" bit in the code. That will also take a while, and a progress bar would be very nice, because I will know how many records there are to process and how far along the process is.

Life was so much simpler in COBOL - and users less demanding!

..., so there is plenty of time to refresh the DOM.

You are using async: false so the DOM is waiting for the Javascript function to finish. Why are you doing this synchronously?

I am trying the synchronous method so that each file is downloaded by the server before the next one is started, so that I can have all the files accounted for before starting to process them. I also rather hoped that the DOM would refresh while waiting for AJAX to come back.

I am trying the synchronous method so that each file is downloaded by the server before the next one is started so that I can have all the files accounted for before starting to process them.

You can do that with chaining, start the next download after the first one finishes. The success event can be used to trigger the next one.

Instead of "alert();" on line 27 in your original posting, try "abort();"

Some browsers require this "reset" between the retrieval of multiple XMLHTTPRequest data from the server.

Regards, Shawn

NetSite, I think you'll need to recheck your algorithm, and probably rethink your strategy completely.

I'm not a fan of synchronous requests in JS. Every time I see this in someone else's code, I think of the browser's code execution limit, especially when the request is inside a loop (which really is not good practice).

Hopefully these references might help you:
Browser Execution Limit
Recursive SetTimeout
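The recursive-setTimeout idea from that second link can be sketched like this (the `step` function and the counts are illustrative, not from the original code): each run does one unit of work and only then schedules the next run, so the page gets a chance to repaint in between.

```javascript
var completed = 0;
var total = 5; // e.g. the number of files or records to process

function step() {
    completed++; // do one unit of work here
    // in the real page: update the progress display, e.g. a percentage

    if (completed < total) {
        setTimeout(step, 200); // schedule the NEXT run; returns immediately
    }
}

step(); // only the first unit runs synchronously; the rest follow one by one
```

Unlike setInterval, this never stacks up runs: the next one isn't scheduled until the current one has finished, however long it took.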