Hi all

Is it possible to continue running a script after encountering a maximum_execution_time error? If so, how would I go about doing this?

EDIT: My script is processing an XML feed that contains approximately 9000 images. I'm resizing the images and saving them to the server, and the max_execution_time error always occurs on the following PHP function:

imagecopyresampled()
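For reference, a typical use of imagecopyresampled() in a resize step looks roughly like this (a minimal sketch, not the OP's actual code; the source image, target width, and output path are placeholders):

```php
<?php
// Stand-in for loading one image from the feed; the real script would use
// something like imagecreatefromjpeg($downloadedFile).
$src = imagecreatetruecolor(800, 600);

$w = imagesx($src);
$h = imagesy($src);
$newW = 200;                              // placeholder target width
$newH = (int) round($h * $newW / $w);     // keep the aspect ratio

$dst = imagecreatetruecolor($newW, $newH);

// This is the call the timeout hits: it does the (slow) resampled copy.
imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);

// Save back to the server, then free the GD resources.
imagejpeg($dst, 'output.jpg', 85);
imagedestroy($src);
imagedestroy($dst);
```

With ~9000 images, the cost of that one call multiplied across the whole feed is what pushes the script past the limit.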

In the beginning of the script, write these two lines:

<?php
set_time_limit(1200);
ignore_user_abort(true);
?>

This script will run for 20 minutes (1200 seconds); the default max time is 30 seconds. Because of the second line, the script will continue running even if the user closes or stops the browser.

There are two sides to this. One is the PHP limit, which you can override at runtime:

set_time_limit($seconds);

That will override the max_execution_time value from php.ini. The other side is your server's own script timeout, such as the CGI script timeout on IIS; you have to allow more time there as well, but I'm not sure how to set that one, as it depends entirely on which server you are running.

Well, we can rule out ignore_user_abort() because my script will be running on a cron. I've already changed the max_execution_time in my php.ini to 90 seconds.

Won't setting the time limit to 20 minutes have a big effect on the server? I've edited my initial post to include extra details.

The server is an Apache server running PHP 5.3.1.

Personally, in such a case, I'd queue the information and have a cronjob running that processes the queue in small batches. Do you really have to wait for all of them to complete? If not, you can defer loading (e.g. two at a time) using dummy image processing: output an image link pointing to a PHP script that handles one image and returns it when done.
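The deferred-loading idea could be sketched like this (resize.php and its id parameter are hypothetical names, and the code is kept PHP 5.3-compatible):

```php
<?php
// Emit one <img> tag per image ID. Each tag points at a (hypothetical)
// resize.php, so the server only resizes an image when the browser
// actually requests it, instead of doing all 9000 in one script run.
function buildImageTags(array $ids)
{
    $html = '';
    foreach ($ids as $id) {
        // resize.php would resize image $id on demand, then send
        // header('Content-Type: image/jpeg') followed by the resized bytes.
        $html .= '<img src="resize.php?id=' . urlencode($id) . '">' . "\n";
    }
    return $html;
}

echo buildImageTags(array(1, 2));
```

The browser's own connection limit then naturally throttles how many resizes run at once.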

Well, it seems increasing my max_execution_time to 90 seconds has sorted the issue out.

@pritaeas
How would I go about queueing the information?

Store the information you need (from the XML) in a database for example. Your cronjob will get one record (or more) from the database and process it. When it succeeds, delete the record.
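A minimal sketch of that queue, using SQLite through PDO (the table and column names are made up; the real script would fill the table while parsing the XML feed):

```php
<?php
// Queue sketch: the feed parser fills the table, the cronjob drains it.
$db = new PDO('sqlite::memory:');   // in-memory stand-in for the real database
$db->exec('CREATE TABLE queue (id INTEGER PRIMARY KEY, url TEXT)');

// 1. When the feed is read, store each image URL instead of processing it.
$ins = $db->prepare('INSERT INTO queue (url) VALUES (?)');
$ins->execute(array('http://example.com/a.jpg'));
$ins->execute(array('http://example.com/b.jpg'));

// 2. Each cron run takes one record, processes it, and deletes it on success.
$row = $db->query('SELECT id, url FROM queue ORDER BY id LIMIT 1')
          ->fetch(PDO::FETCH_ASSOC);
if ($row) {
    // ... download and resize $row['url'] here ...
    $del = $db->prepare('DELETE FROM queue WHERE id = ?');
    $del->execute(array($row['id']));
}

// One record left for the next cron run.
echo $db->query('SELECT COUNT(*) FROM queue')->fetchColumn(); // prints 1
```

Because each cron run only handles one record (or a small batch), no single run comes anywhere near the execution-time limit, and a failed image simply stays in the queue for the next run.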