Hi everyone

I know there is already a thread on this, but it wasn't enough for me to get it working.

So I know I can do :

$command = "mysqldump -u [username] -p [password] [databasename] > [backupfile.sql]";
then
system($command);

The thing is, I'm on a web server (hosting company), so "> backupfile.sql" means nothing to me; I have no idea where that file would end up.

I'm trying to send it to another server, so I tried putting this instead:

"> ftp://user:pass@www.domain.com/pathtofile/backupfile.sql"

But nothing happens; the file is not created on my other server, and if I try to print $command I just get a blank page.

Can anyone help me here?

Thanks

You are trying to run system commands through PHP; most web hosts are going to prevent this sort of thing for obvious security reasons. And even if you can create a file, you probably can't write directly to an FTP URL like that. You'd have to open a connection and send the file with other code.
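
Something along these lines might work, assuming the host does allow mysqldump through system() and outbound FTP (a rough sketch only; the host name, credentials and paths are placeholders):

<?php
// Sketch: dump to a local file first, then open an FTP connection
// to the other server and push the file across.
// Host, credentials and paths below are placeholders, not real values.
$dumpFile = '/tmp/backupfile.sql';
system('mysqldump -u username -ppassword databasename > ' . escapeshellarg($dumpFile));

$conn = ftp_connect('www.domain.com');
ftp_login($conn, 'user', 'pass');
ftp_pasv($conn, true); // passive mode is often needed on shared hosts
ftp_put($conn, '/pathtofile/backupfile.sql', $dumpFile, FTP_BINARY);
ftp_close($conn);

FTP_BINARY keeps the dump byte-for-byte identical, which is what you want if you ever need to restore from it.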

If your database is small, create the MySQL dump through phpMyAdmin, your cPanel, or your web host's control panel.

Thanks guyinpv

In fact I have two databases I want to back up, and I'd like them to be backed up automatically each day. I don't want to have to go into phpMyAdmin for each of the two and back them up manually.

The two are pretty huge (as a text file one is 3 MB, the other is 6 MB and growing), so even through phpMyAdmin I have to "transfer" the backup; it can't be shown on screen.

I tried a script which goes through each table, "creates" the dump in a variable, and then writes it to a file with fopen/fwrite, but there is too much data for a single variable, so that doesn't work either.
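
Roughly what I mean is something like this (a simplified sketch; credentials and the database name are placeholders, and the CREATE TABLE part is left out). Would writing each row straight to the file instead of into a variable get around the problem?

<?php
// Sketch: write INSERT lines table by table with fwrite, so the whole
// dump never has to sit in one PHP variable.
// Credentials and the database name are placeholders.
$db = new mysqli('localhost', 'username', 'password', 'databasename');
$fp = fopen('/tmp/backupfile.sql', 'w');

$tables = $db->query('SHOW TABLES');
while ($t = $tables->fetch_row()) {
    $table = $t[0];
    $rows = $db->query("SELECT * FROM `$table`", MYSQLI_USE_RESULT); // unbuffered read
    while ($row = $rows->fetch_assoc()) {
        $values = array();
        foreach ($row as $v) {
            $values[] = ($v === null) ? 'NULL' : "'" . $db->real_escape_string($v) . "'";
        }
        fwrite($fp, "INSERT INTO `$table` VALUES (" . implode(',', $values) . ");\n");
    }
    $rows->free();
}
fclose($fp);
$db->close();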

I'm really stuck here, I need a solution!!!

Does it have to be done with PHP? Depending on the web host, you might be able to set up a cron job to do that backup...

Create a file called backup

#!/bin/sh

mysqldump -uusername -ppassword --opt databasename > /path/tofile/location/youwant/mysqldumpfile.sql

And then schedule cron to run and execute the script

0       6       *       *       *       /path/to/backup/script/backup

This would run the script at 6:00am every day. You would still have to manually download the file, or you could use something like rsync to sync the file automatically.
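
If the host's cron can call the PHP interpreter directly, the same kind of schedule could also run a PHP script that does the dump and the transfer in one go (the php binary path and script name here are only assumptions):

0       6       *       *       *       /usr/bin/php /path/to/backup/script/backup_and_send.php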

sops21

Thanks for this idea, I will explore it; there is probably something I can do with this!!

Is it really 3 MB and 6 MB? That's actually pretty small.
I use Navicat to manage databases, and it can be set up to do data sync/copy/dump and all that on a schedule. But I'm finding it too slow on a larger 3 GB database.

If you want to back up offsite, you'll have to dump the DB and then transfer it. If all you want is a local backup, then create another database on the server and set up a job that just copies the one database into the backup database on the same server.
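
A copy job like that could be as simple as the sketch below (database names and credentials are placeholders, and it assumes both databases live on the same MySQL server and the MySQL user has rights on both):

<?php
// Sketch: copy every table from one database into a backup database
// on the same MySQL server. Names and credentials are placeholders.
$src = 'mydb';
$dst = 'mydb_backup';
$db  = new mysqli('localhost', 'username', 'password');

$tables = $db->query("SHOW TABLES FROM `$src`");
while ($row = $tables->fetch_row()) {
    $t = $row[0];
    $db->query("DROP TABLE IF EXISTS `$dst`.`$t`");
    $db->query("CREATE TABLE `$dst`.`$t` LIKE `$src`.`$t`");         // copy structure
    $db->query("INSERT INTO `$dst`.`$t` SELECT * FROM `$src`.`$t`"); // copy data
}
$db->close();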
