Hi friends,

I have a PHP file that is a remote upload script, but when I run it, it doesn't download the file specified in the URL. The script is below; if anyone can solve my problem, please help me.

define('BUFSIZ', 4095);
$url = $_GET["t1"]; // $a = $_GET["t1"];
$rfile = fopen($url, 'r');
$lfile = fopen(basename($url), 'w');
fwrite($lfile, fread($rfile, BUFSIZ), BUFSIZ);
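For what it's worth, a single fread() call returns at most BUFSIZ bytes, and the third argument to fwrite() caps the write at BUFSIZ too, so the script above can never store more than about 4 KB. A version that copies the whole file loops until end of stream (a sketch; copy_url and its parameters are illustrative names, and remote URLs require allow_url_fopen to be On):

```php
<?php
// Copy a remote (or local) file to a destination path, chunk by chunk.
// Returns the number of bytes written, or false if either stream fails.
function copy_url(string $url, string $dest, int $bufsiz = 4096)
{
    $rfile = fopen($url, 'rb');   // needs allow_url_fopen=On for http:// URLs
    $lfile = $rfile ? fopen($dest, 'wb') : false;
    if (!$rfile || !$lfile) {
        return false;
    }
    $total = 0;
    while (!feof($rfile)) {       // keep reading until end of stream
        $chunk = fread($rfile, $bufsiz);
        $total += fwrite($lfile, $chunk);
    }
    fclose($rfile);
    fclose($lfile);
    return $total;
}
```

Calling it would then look like copy_url($_GET['t1'], basename($_GET['t1'])).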


So you're copying the contents of a remote file to an identically named file on your own server? Check that you have rights to open remote files this way (look for allow_url_fopen in phpinfo()). Also, what type of file is it? If the remote file is a PHP file, you'll only get the remote server's output from it (e.g. HTML), not the underlying PHP code.
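A quicker check than scanning the whole phpinfo() page is to ask for the relevant settings directly (a sketch):

```php
<?php
// The two settings most relevant to fopen('http://...', 'r'):
// allow_url_fopen gates remote streams, and max_execution_time
// kills long-running scripts.
$urlFopen = ini_get('allow_url_fopen');    // "1" when enabled
$maxExec  = ini_get('max_execution_time'); // seconds; 0 means unlimited

echo 'allow_url_fopen: ' . ($urlFopen ? 'On' : 'Off') . "\n";
echo "max_execution_time: {$maxExec}s\n";
```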

Am I on the right track?

No, I am not copying a PHP file. I just want to copy some important files, so that if a file goes missing I can get it back from my site.

When I run this file it gets the URL from 't1' (passed from another PHP file), then it downloads that file but never completes the download. For example, if the file size is 5MB it only downloads *** bytes, and a new file named 'error_log' appears.

That's the problem. And where do I check this in phpinfo?

Thanks for the quick reply.

If you are experiencing a problem with a php.ini limit, you can try contacting your server host and see about getting it bumped. I would suggest requesting a bump only to the largest size you'd conceivably need. Some servers may give you access to php.ini; many do not.

This link shows two possible ways to override the settings in php.ini: with an .htaccess file, or from within your PHP code itself. http://www.sitepoint.com/how-to-override-php-configuration-settings/

But servers, especially shared hosting servers, may not allow this either.
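The two approaches from that article look roughly like this in practice (a sketch; the directive and value are illustrative, and shared hosts can disable ini_set() or ignore php_value lines in .htaccess entirely):

```php
<?php
// Runtime override: only works for directives changeable at runtime
// (PHP_INI_ALL), such as max_execution_time. Returns the old value
// as a string, or false if the setting could not be changed.
$old = ini_set('max_execution_time', '300');
if ($old === false) {
    echo "Host does not allow overriding max_execution_time\n";
}

// The .htaccess equivalent (Apache with mod_php only) would be:
//   php_value max_execution_time 300
```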

The max file size setting generally concerns uploads, not downloads. However, max_execution_time may be of help to you. Often, with the method you are employing to force a download, the PHP script keeps running while the transfer takes place, and if it does not finish in time the server kills the script for exceeding the maximum time allowed for a script to complete.
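If the timeout theory is right, the script can also try to reset its own clock just before the download loop (a sketch; hosts that have disabled the function will simply ignore it):

```php
<?php
// set_time_limit() restarts the execution timer from zero;
// an argument of 0 means "no time limit at all".
// It returns false if the limit could not be changed.
$ok = set_time_limit(0);

// ...the chunked download loop would go here...
```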

It is understandable that shared hosts especially put such limits in place, because the more resources one site uses, the more load the server has to bear while serving other sites simultaneously. Contact your host about this and see what they say; ask whether it is related to time-out settings or something similar, and if so, how it can be rectified. Unfortunately, many shared hosting providers would require a higher-level (and therefore higher-cost) plan to either make the changes for you or allow you to make them yourself.

Or you could just upload/download the files via FTP. I've sometimes used this method of 'backing up' files in case of catastrophic loss; at least I had them stored elsewhere.
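If you want to script the FTP route instead of doing it by hand, PHP's ftp extension can handle it (a sketch; the host, credentials, and file names below are placeholders, and the ftp extension must be enabled on your server):

```php
<?php
// Push a local file to a remote FTP server as a simple off-site backup.
// Returns true on success, false on any connection or transfer failure.
function ftp_backup(string $host, string $user, string $pass,
                    string $local, string $remote): bool
{
    $conn = @ftp_connect($host);
    if (!$conn || !@ftp_login($conn, $user, $pass)) {
        return false;
    }
    ftp_pasv($conn, true);   // passive mode plays nicer with firewalls
    $ok = ftp_put($conn, $remote, $local, FTP_BINARY);
    ftp_close($conn);
    return $ok;
}

// Example call -- every argument here is a placeholder:
// ftp_backup('ftp.example.com', 'user', 'secret', 'site.zip', 'backups/site.zip');
```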

Good luck on this one.