
I fetch some data from a remote server and feed it into my site database. There are about 9 lakh (approx. 900,000) records, but the insertion stops at only 60,180 records. We use the mail() function and an exception handler to find the bugs, but get no response. Can anyone please advise whether the cron job is timing out or whether there is an error in our code? Our code is:

<?php
set_time_limit(0);        // remove PHP's execution time limit for this script
ignore_user_abort(true);  // keep running even if the client disconnects
try
{
    $ch = curl_init();
    // set URL and other appropriate options
    curl_setopt($ch, CURLOPT_URL, "http://ks329xxx.com/cronRU/update");
    curl_setopt($ch, CURLOPT_HEADER, 0);
    // grab URL and pass it to the browser
    curl_exec($ch);

    // close cURL resource, and free up system resources
    curl_close($ch);
}
catch (Exception $e)
{
    echo 'Caught exception: ', $e->getMessage(), "\n";
    mail('varunxxxx@xxxxx.com', 'update', $message);
}
?>

<off topic>

I'm assuming 9 lakh = 900,000.

Stick to international amounts; lakh is a bit obscure.


One thing I notice is that $message is undefined in the mail() call inside the catch block. I'm not so sure, though, that cURL raises exceptions on errors.
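To illustrate that second point: curl_exec() signals failure by returning false rather than by throwing, so a try/catch around it never fires for cURL-level problems. Here is a minimal sketch of explicit error checking, reusing the URL and mail address from the original post (the subject lines are made up):

<?php
set_time_limit(0);
ignore_user_abort(true);

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://ks329xxx.com/cronRU/update");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // capture the response instead of echoing it
curl_setopt($ch, CURLOPT_TIMEOUT, 0);        // 0 = no cURL-side time limit

$result = curl_exec($ch);

if ($result === false) {
    // cURL failed (DNS, connection, timeout, ...); no exception is thrown here.
    mail('varunxxxx@xxxxx.com', 'update failed',
         'cURL error ' . curl_errno($ch) . ': ' . curl_error($ch));
} else {
    // The request completed; still check the HTTP status returned by the remote script.
    $httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    if ($httpCode >= 400) {
        mail('varunxxxx@xxxxx.com', 'update failed', 'HTTP status ' . $httpCode);
    }
}

curl_close($ch);
?>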


I have now fixed that: $message = $e->getMessage(); There is no problem with the code; it works when I fetch a small amount of data. When I go for mass data, it gets terminated. I set infinite execution time on the web server, but it seems to be the same issue. Kindly advise and share your ideas.

Edited by pritaeas: Added markdown.
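If the remote cronRU/update script is what actually inserts the 900,000 rows, the limit being hit is most likely on that server (or its web server), not in this trigger script. One workaround is to process the rows in batches so each HTTP request finishes quickly. A sketch only, assuming the remote script could be extended to accept hypothetical offset/limit parameters and to print "done" when no rows are left:

<?php
set_time_limit(0);
ignore_user_abort(true);

$offset    = 0;
$batchSize = 10000; // rows per request; tune so the remote side finishes well within its limit

do {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL,
        "http://ks329xxx.com/cronRU/update?offset=$offset&limit=$batchSize");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $response = curl_exec($ch);
    $failed   = ($response === false);
    if ($failed) {
        mail('varunxxxx@xxxxx.com', 'update failed',
             'Batch at offset ' . $offset . ': ' . curl_error($ch));
    }
    curl_close($ch);

    $offset += $batchSize;
    // "done" is a made-up convention for this sketch; use whatever signal the
    // remote script can realistically report when all rows are processed.
} while (!$failed && trim($response) !== 'done');
?>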


I set infinite time for execution in web server

Are you sure your host actually supports this? For security/performance reasons this is usually disabled.
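One quick way to check is to compare max_execution_time before and after the call; if the value does not change, the limit is being enforced elsewhere (php.ini, the web server, or a proxy in front of it) and the long-running request will still be killed. A small sketch:

<?php
echo 'Before: max_execution_time = ', ini_get('max_execution_time'), "\n";

set_time_limit(0); // adjusts max_execution_time for the current request, if allowed

echo 'After:  max_execution_time = ', ini_get('max_execution_time'), "\n";
?>

Also note that this only affects the local trigger script; whatever limit applies to the remote cronRU/update request is configured on that server.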
