Hi all,

I'm still learning MySQL, so please excuse anything I may not understand.

I need to update 12k records in a MySQL table (about 5 fields per record) twice a day, and it needs to happen automatically (probably triggered with curl).

My data source is a comma-delimited text file. I would prefer to use PHP; I am working in a LAMP environment.

If anybody has some pointers it would be a great help.

Hey,

Personally, I think this question would be better in the PHP forum... but seeing as I'm here, I would approach it from the following direction.

Read the file into an array using PHP's file() function. This puts each line of the file into its own array index.

Then loop through that array and split each line with the explode() function, using a comma (,) as the delimiter; this gives you each field in a separate index of a second array.

You can then run an SQL query for each row, passing in the array values, to update your DB.
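
Something like this minimal sketch would tie those steps together (the credentials, table name, and column layout here are made up for illustration; I'm using mysqli, but the old mysql_* functions would work the same way):

<?php
// Minimal sketch of the approach above. The DB credentials, the table
// name (products), and the CSV column layout are assumptions only.
$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');

// file() puts each line of the file into its own array index.
$lines = file('data.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

// Prepare the UPDATE once, execute it once per row.
$stmt = $mysqli->prepare(
    'UPDATE products SET name = ?, price = ?, qty = ?, notes = ? WHERE id = ?'
);

foreach ($lines as $line) {
    // explode() splits the line on commas, one field per array index.
    list($id, $name, $price, $qty, $notes) = explode(',', $line);
    $stmt->bind_param('sdisi', $name, $price, $qty, $notes, $id);
    $stmt->execute();
}
?>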

Whack the script in a cron job, run it twice a day, and the job's a good-un.
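
For example, a crontab entry along these lines (the paths are placeholders) would run it at 6am and 6pm:

0 6,18 * * * /usr/bin/php /path/to/update_script.php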

R.

Thanks for the response, robothy.

I should have stated my problem a little more clearly.

The problem is that the script times out before the whole data set is finished. Also, if I try entering the data in one long query, MySQL times out.

The most efficient way I have found is to do it in chunks, but it's certainly not optimal.
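
To clarify what I mean by chunks: I split the input into batches and issue one query per batch, roughly like this sketch (credentials, table name, column layout, and batch size are placeholders; INSERT ... ON DUPLICATE KEY UPDATE is just one way to write the batch, and it assumes id is a key):

<?php
$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');
$lines  = file('data.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

// Each batch of 500 rows becomes one multi-row statement.
foreach (array_chunk($lines, 500) as $chunk) {
    $values = array();
    foreach ($chunk as $line) {
        $f = array_map(array($mysqli, 'real_escape_string'), explode(',', $line));
        $values[] = "('{$f[0]}', '{$f[1]}', '{$f[2]}', '{$f[3]}', '{$f[4]}')";
    }
    $mysqli->query(
        'INSERT INTO products (id, name, price, qty, notes) VALUES '
        . implode(', ', $values)
        . ' ON DUPLICATE KEY UPDATE name = VALUES(name), price = VALUES(price),'
        . ' qty = VALUES(qty), notes = VALUES(notes)'
    );
}
?>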

I was curious whether there is some feature of MySQL (or PHP, I guess) that I am overlooking for quickly updating a record set with over 10k rows.

The maximum length of time a script is allowed to run is determined by the php.ini file, more specifically by the value of the max_execution_time directive.

In your script you could try the following:

// Start script...
ini_set('max_execution_time', 300);  // allow up to five minutes for this run

// Run your update queries

// End script...
ini_set('max_execution_time', 30);  // or whatever it was before

If you're on a shared server, your contract may limit how long a script is allowed to run, and exceeding that may get you into trouble with the administrators, so check that first.

Other than this, I don't know of any pre-existing MySQL feature that will help you with updating 10k or more records.

R.

Since the script times out, I assume you are attempting to do this via a web browser/server. If you don't have shell access to the server, you may be stuck doing it in chunks. However, there are almost always ways around limitations when Linux is involved. :)

One idea:

  1. Have your PHP program look for a flag file, say, UPDATING.
  2. If it's not updating and did receive the form data, it grabs the uploaded file and forks off a shell script that (1) creates the flag file, (2) runs the mysql command-line client to execute the SQL script, and (3) deletes the flag file. The PHP then falls through to step 3.
  3. If it is updating, it sends HTML to your browser that (1) says "UPDATING", (2) shows the ps status of the mysql process doing the updating, and (3) reloads the PHP program after 20 seconds.
  4. If it's not updating and didn't receive the form data, it presents a form for you to upload a MySQL script containing all your updates (see the output of mysqldump for an example of such a script).
  5. You select your SQL script and click the button.

Mind you, this is an ugly, bare-metal solution, but it should work through a browser/server without timing out. I'm sure others (even you) can pretty it up. And you might want to include a way to kill the script if it runs away.
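
Here's a rough PHP sketch of that flow; the paths, DB credentials, and the grep-based status check are all made-up placeholders:

<?php
$flag = '/tmp/UPDATING';

if (file_exists($flag)) {
    // Step 3: an update is running -- show status and re-check in 20 seconds.
    echo '<html><head><meta http-equiv="refresh" content="20"></head><body>';
    echo '<h1>UPDATING</h1><pre>';
    system('ps aux | grep "[m]ysql"');  // crude status of the mysql process
    echo '</pre></body></html>';
} elseif (isset($_FILES['sqlfile'])) {
    // Step 2: form data received -- save the upload and fork off the work.
    $sql = '/tmp/updates.sql';
    move_uploaded_file($_FILES['sqlfile']['tmp_name'], $sql);
    // Background the shell script so this request returns immediately:
    // create the flag, run the SQL through the mysql client, remove the flag.
    $cmd = "touch $flag; mysql -u user -ppassword mydb < $sql; rm $flag";
    exec("nohup sh -c '$cmd' > /dev/null 2>&1 &");
    header('Refresh: 1; url=' . $_SERVER['PHP_SELF']);  // on to the status page
    echo 'Update started...';
} else {
    // Step 4: nothing running, nothing uploaded -- present the upload form.
    echo '<form method="post" enctype="multipart/form-data">';
    echo '<input type="file" name="sqlfile"> <input type="submit" value="Run update">';
    echo '</form>';
}
?>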

I haven't tried this myself, but I probably will at some point, since I'm stuck working on a web site that has no shell access. Slow FTP access sure does make for inflated billing hours and infuriating waits for FTP to connect, transfer, disconnect. Give me secure-shell access!
