PHP script doesn't finish after large file download
I'm working on a PHP page for a website that zips files together and lets the user download the resulting archive. The zip can range anywhere from a couple of MB to 100 MB. My PHP script creates the zip file in a temporary directory and then writes the file contents out to the browser. Once this is finished, the script updates a download counter in a MySQL database and deletes the zip file from the temp directory.
This all works fine until I come across a large zip download that takes longer than 30 seconds. max_execution_time in the php.ini file is set to 30, so this makes sense, but if I try using set_time_limit(0) or changing max_execution_time, the same result occurs. The zip file successfully downloads from the browser with all the correct files inside, but the script seems to stop afterwards because the database isn't updated and the temporary zip file on the server isn't deleted.
It's a Linux environment with Apache and PHP 5.2.
The website is hosted on GoDaddy, so I'm not sure whether they restrict changing the time limit a script can run for, but basically I'd like to let this particular script run until it completes.
Any thoughts on why I'm not able to set the time limit or any workarounds?
Here's my code:
<?php
// Don't stop the script if the user
// closes the browser
ignore_user_abort(true);
set_time_limit(0);

// Generate random name for ZIP file
$zip_file = "";
$characters = "0123456789abcdefghijklmnopqrstuvwxyz";
do
{
    $zip_file = "tmp/";
    for ($p = 0; $p < 10; $p++)
        $zip_file .= $characters[mt_rand(0, strlen($characters) - 1)];
    $zip_file .= ".zip";
} while (file_exists($zip_file));

// Prepare ZIP file
$zip = new ZipArchive();
/* Open and add files to ZIP (this part works fine)
.
.
.
*/

// Close and save ZIP
$zip->close();

// Check browser connection
if (connection_status() == 0)
{
    // Send ZIP
    header("Content-Type: application/zip");
    header("Content-Length: " . filesize($zip_file));
    header("Content-Disposition: attachment; filename=Download.zip");
    header("Cache-Control: no-store, no-cache, must-revalidate");
    header("Cache-Control: post-check=0, pre-check=0", false);
    header("Pragma: no-cache");
    header("Expires: Sat, 26 Jul 1997 05:00:00 GMT");
    header("Last-Modified: " . gmdate("D, d M Y H:i:s") . " GMT");

    // Stream the file to the browser (it gets deleted below)
    //readfile($zip_file);
    if ($file = fopen($zip_file, "r"))
    {
        // Set buffer size
        $buffer_size = 1024 * 8;

        // While we're still transmitting
        // send over bytes!
        while (!feof($file) && (connection_status() == 0))
        {
            print(fread($file, $buffer_size));
            flush();
            usleep(10000); // Download speed cap
        }

        // Close file descriptor
        fclose($file);
    }
}

/* Update database download counter if connection_status() == 0
.
.
.
*/

// Delete the file
unlink($zip_file);
?>
UPDATE: I just tried doing another download from my local web server and bumped the usleep value up to 10000 to slow the download down. The total download took a little over a minute, and the database was updated and the file deleted out of /tmp. My local environment runs EasyPHP with Apache and PHP 5.3 on a Windows 7 box, so it seems like this could have something to do with GoDaddy.
Also, on both the GoDaddy and local sites, I printed out max_execution_time before and after set_time_limit is called from my script, and the results were 30 and 0 respectively, so I'm not sure what's happening on the GoDaddy side of things.
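For reference, the check amounted to roughly this (the placement of the echoes is approximate):

echo ini_get("max_execution_time"); // prints 30 before the call
set_time_limit(0);
echo ini_get("max_execution_time"); // prints 0 afterwards, on both servers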
Answer
Solution:
Still not sure why the code I'm using above doesn't work on the hosting site, but I found a workaround. If I use the register_shutdown_function call in PHP on the live site, the database is updated correctly and the file is deleted. Here's the shutdown function code:
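A minimal sketch of that handler (cleanup_after_download is an illustrative name; the counter update and $zip_file path are assumed from the question's code):

// Register cleanup work that should run even if PHP terminates the
// request right after the download finishes.
function cleanup_after_download($zip_file)
{
    // Update the download counter in MySQL here
    // ...

    // Remove the temporary ZIP file
    if (file_exists($zip_file))
        unlink($zip_file);
}
register_shutdown_function('cleanup_after_download', $zip_file);

// ... then send the headers and stream the ZIP as before ...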
Answer
Solution:
You should be able to set max_execution_time in your php5.ini and check it with a phpinfo() page. Can you provide the relevant section of your php5.ini and the output from phpinfo()?
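For example, a throwaway test script like this (any filename will do) shows the limit PHP is actually applying on the host:

<?php
// Quick check of the effective execution limit
echo "max_execution_time = " . ini_get("max_execution_time");
phpinfo(INFO_CONFIGURATION);
?>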