I can't use a cron job for this task because the script needs to be running at all times.

I have a loop that does the following:
1) Gets a SQL row
2) Runs the values through some functions
3) Stores the new values
4) Gets the next row and repeats

This script has to run 24/7 to update profiles on my website.

Now I've considered something similar to:

<?php
ignore_user_abort(1); // keep running even if the client disconnects
set_time_limit(0);    // remove the execution time limit
// add the loop functions here
// ...

but I'm looking for any advice on how I should keep this loop running in the background. This is Ubuntu 10.04 LTS and I'm very new to Linux.

The process will eventually handle 500-1000 rows, and each row takes between 0.001 and 10.0 seconds to process (90% of the rows take less than 0.05 seconds).

Perhaps there is a way to process 25-50 rows per batch and loop through all the SQL records that way? Or should I stick to processing one row at a time?
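The batching idea can be sketched like this. This is a minimal illustration, not a full implementation: a plain array stands in for the SQL result set, and `process_batch()` (a made-up name) just squares numbers as a placeholder for running your real functions over each row; the chunk size of 25 is an example value.

```php
<?php
# Placeholder for steps 2-3: run each row's values through some
# functions and collect the new values to store.
function process_batch(array $rows) {
    $out = [];
    foreach ($rows as $row) {
        $out[] = $row * $row; // stand-in for the real per-row work
    }
    return $out;
}

$all_rows = range(1, 100); // stand-in for the SQL result set

# Process the rows 25 at a time instead of one by one.
foreach (array_chunk($all_rows, 25) as $batch) {
    $results = process_batch($batch); // then store the new values here
}
```

Fetching and storing in batches mainly saves round trips to the database; whether it helps depends on where your 0.05s per row is actually spent.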

Thanks in advance!



I think you can use Gearman; it allows you to do parallel processing. You set up a client to retrieve data and send it to a job server, which then calls the workers registered for that particular task.

Here's an example, in slow motion, of the client script:


<?php
$a = range('a', 'z');

$gmclient = new GearmanClient();
$gmclient->addServer(); # defaults to 127.0.0.1:4730

$i = 1;
foreach ($a as $letter)
{
    # serialize $letter first if it is an array
    $gmclient->addTaskHighBackground("echo", $letter);

    if ($i++ % 2 == 0) sleep(2); # slow motion: pause after every second task
}

$gmclient->runTasks(); # submit the queued background tasks

and this is the worker:


<?php
$worker = new GearmanWorker();
$worker->addServer(); # defaults to 127.0.0.1:4730
$worker->addFunction("echo", "task");

while ($worker->work())
{
    if ($worker->returnCode() != GEARMAN_SUCCESS)
        echo "return_code: " . $worker->returnCode() . "\n";
}

function task($job)
{
    $data = $job->workload(); # unserialize if you are receiving an array
    echo $data . ' ' . date('Y-m-d G:i:s') . "\n";
}


To start, open a few terminals; run the client in the first and a worker in each of the others:

php client.php
php worker.php

In the worker terminals you will see the output of the example above: a simple listing of letters and timestamps. A worker can call other workers to perform secondary tasks, and these can be executed in the background with different priorities. Check the PHP manual for install instructions.


To start the Gearman job server you can use this command:

gearmand -j 5 -v -l usage.log

With the -d argument you can daemonize it. Gearman also has persistent queues, and if necessary you can use Supervisor to make sure client.php and worker.php are still running, or to restart them at fixed intervals.
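A Supervisor program entry for the worker might look like the sketch below. The program name and script path are placeholders for your own setup:

```ini
[program:gearman-worker]
command=php /path/to/worker.php
process_name=%(program_name)s_%(process_num)02d
numprocs=2          ; run two worker processes
autostart=true
autorestart=true    ; restart a worker if it dies
```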

Once you have this kind of setup you can run multiple instances of worker.php and decide how much data to send to each; this lets you speed up execution, and you can also distribute the jobs across remote servers. Just remember to serialize the data when sending arrays. Hope it's useful, bye!
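The serialization point can be sketched like this; here `$payload` is an illustrative local variable, standing in for what the client would send and what `$job->workload()` would return in the worker:

```php
<?php
# Sending an array as a Gearman workload: workloads are strings, so
# serialize on the client side and unserialize in the worker's task.
$row = ['id' => 42, 'name' => 'alice'];

# client side: pass a string workload
$payload = serialize($row);
# $gmclient->addTaskHighBackground("echo", $payload);

# worker side, inside task($job): really unserialize($job->workload())
$data = unserialize($payload);
var_dump($data == $row); # the round trip preserves the array
```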
