I have an API parsing script that inserts more than 40,000 rows into the database. Maybe due to the large number of queries being fired continuously, after some time the script throws a "MySQL server has gone away" error.
I know this error can be caused by a closed connection or by some other limit or privilege setting on the MySQL server.
How should I deal with this error, so that the script can run to completion?
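One common way to survive a dropped connection is to catch the "gone away" error, reopen the connection, and retry the statement. Below is a minimal sketch of that idea in Python (the thread's script is PHP, so treat this as pseudocode for the pattern); `GoneAwayError` and the `connect` factory are placeholders for whatever your real driver raises and provides.

```python
# Hedged sketch: retry a statement after reconnecting when the server
# drops the connection ("MySQL server has gone away"). GoneAwayError
# stands in for the driver's real exception; connect() is assumed to
# return a fresh connection object with an execute() method.

class GoneAwayError(Exception):
    """Placeholder for the driver's 'server has gone away' error."""

def execute_with_reconnect(connect, sql, params=(), max_retries=1):
    conn = connect()
    for attempt in range(max_retries + 1):
        try:
            return conn.execute(sql, params)
        except GoneAwayError:
            if attempt == max_retries:
                raise                 # give up after the last retry
            conn = connect()          # reopen the dropped connection
```

Keeping `max_retries` low matters: if the server keeps dropping the connection (for example because a single packet exceeds `max_allowed_packet`), retrying forever would just loop.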
Thank you. I am not sure that I need to do it that way? I am really a beginner with this stuff and am just playing around, creating a blog for myself. I have got it all working in a really simple way for the time being.
Thanks, but mysqli_multi_query won't be suitable in this case, because the code checks the db as it goes to avoid inserting any duplicates, so I will need to insert one by one. Can we do anything else than this?
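One alternative worth mentioning: instead of checking for duplicates in application code before every insert, you can put a UNIQUE key on the column and let the database skip duplicates itself (MySQL supports `INSERT IGNORE` and `INSERT ... ON DUPLICATE KEY UPDATE`), which makes batching possible again. Here is a small sketch of the idea using SQLite's equivalent `INSERT OR IGNORE` as a stand-in; the table and column names are made up for illustration.

```python
import sqlite3

# Sketch: deduplicate in the database, not in the script. A PRIMARY KEY
# (or UNIQUE index) on api_id makes the second insert of id 1 a no-op.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (api_id INTEGER PRIMARY KEY, name TEXT)")

rows = [(1, "a"), (2, "b"), (1, "a-duplicate")]   # note the repeated id 1
conn.executemany(
    "INSERT OR IGNORE INTO items (api_id, name) VALUES (?, ?)", rows
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
# count == 2: the duplicate row was silently skipped by the database
```

With this approach the rows can be inserted in large batches, which also reduces the number of round trips that seem to be triggering the timeout.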
Hey pritaeas, I solved the problem on my own, doing it another way.
Now I am storing all the IDs in a text file, and the other script will fetch the IDs starting from the line number it is told in the URL, like script_name.php?limit=100, and so on.
So I need to run the second script 10 times if there are 1000 IDs in the file created by the first cron, since the limit in the second script is set to 100.
Using the same logic in that script, I hope it will work long term if set up as a cron running at some interval.
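The chunking scheme described above can be sketched as follows. This is only an illustration of the offset/limit idea in Python (the actual script is PHP and takes the offset from the URL); the function name and the window size of 100 are assumptions based on the post.

```python
# Sketch of the offset/limit idea: the first cron writes one id per
# line to a text file; each run of the second script processes one
# window of ids, chosen by an offset (script_name.php?limit=100 in
# the post passes this via the URL).
def read_id_chunk(lines, offset, limit=100):
    """Return the ids on lines offset .. offset+limit-1 (0-based)."""
    return [line.strip() for line in lines[offset:offset + limit]]

ids = [str(i) for i in range(1000)]        # pretend file of 1000 ids
chunk = read_id_chunk(ids, offset=200)     # third run of the cron
# this run handles ids 200..299; the next run steps the offset by 100
```

Each cron invocation then does a bounded amount of work, so no single run stays connected long enough to hit the "gone away" timeout.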
But now a new problem arises: how to tackle this situation as a cron.
That is a totally different topic from the subject of this thread, so I have created another thread, dynamic cron; please have a look at it.
Thanks for your help :)