PHP fopen and file_get_contents limited download speed, why?

Asked
Active 3 hr before
Viewed 126 times

7 Answers

90%

I'm trying to retrieve a remote file (a 6 MB text file) with PHP, and I noticed that with fopen the speed is limited to 100 KB/s, while with file_get_contents it drops to 15 KB/s.

The cause: file_get_contents doesn't send a "Connection" HTTP header, so the remote web server considers the connection keep-alive by default and doesn't close the TCP stream for up to 15 seconds (that may not be a standard value; it depends on the server configuration). A normal browser would consider the page fully loaded once the HTTP payload length reaches the length specified in the response Content-Length HTTP header. file_get_contents doesn't do this, and that's a shame.

The fix is simply to tell the remote web server to close the connection when the download is complete, since file_get_contents isn't intelligent enough to do it by itself using the response Content-Length HTTP header.

So, if you want to know the solution, here it is:

$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
file_get_contents("http://www.something.com/somepage.html", false, $context);

Note that the header string must be double-quoted so that \r\n is sent as an actual CRLF rather than as a literal backslash sequence.
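For completeness, here is a minimal sketch (not part of the original answer) of how you could compare the Content-Length the server reports with what was actually received, using the $http_response_header variable PHP populates after an http-wrapper request; the URL is the same placeholder as above:

// Fetch the page with "Connection: close" and inspect the response headers
// that PHP exposes in $http_response_header after the call.
$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
$body = file_get_contents("http://www.something.com/somepage.html", false, $context);

foreach ($http_response_header as $header) {
    // e.g. "Content-Length: 6291456" -- compare it with the bytes received
    if (stripos($header, 'Content-Length:') === 0) {
        echo $header, " (received ", strlen($body), " bytes)\n";
    }
}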
88%

From the PHP manual for file_get_contents: a URL can be used as a filename with this function if the fopen wrappers have been enabled. See fopen() for more details on how to specify the filename, and see Supported Protocols and Wrappers for links to information about what abilities the various wrappers have, notes on their usage, and information on any predefined variables they may provide. The first parameter is the filename being read; you can use the optional second parameter and set it to true if you want to search for the file in the include_path, too. See also fopen(), which opens a file or URL.
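A short sketch illustrating those two points; the remote URL and the local file name ("config.ini") are placeholders, not from the manual:

// Reading a remote URL through the http wrapper (requires allow_url_fopen = On).
$handle = fopen("http://www.example.com/somefile.txt", "r");
if ($handle !== false) {
    while (!feof($handle)) {
        echo fread($handle, 8192);
    }
    fclose($handle);
}

// Passing true as the second argument makes file_get_contents also search
// the include_path for the file.
$local = file_get_contents("config.ini", true);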

72%

It's slow because file_get_contents() reads the entire file into $page: PHP waits for the whole file to be received before outputting the content. So what you're doing is downloading the entire file on the server side, then outputting it as a single huge string. Some variation of the below is what I would use; YMMV depending on what you're doing (a streaming sketch follows the code below). To help determine whether it's PHP or your environment, you should also try interacting with curl via the command line; at least then you'll be able to rule out the PHP code being the problem if it's still 5 seconds. If you post your code we can address your specific implementation instead of just providing alternate solutions :-)

Here's my code:

// Pick the remote audio URL based on the requested language.
$language = $_GET['soundtype'];
$word = $_GET['sound'];
$word = urlencode($word);
if ($language == 'english') {
    $url = "<the first url>";
} else if ($language == 'chinese') {
    $url = "<the second url>";
}

// Fetch the remote file with a custom User-Agent.
$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "User-Agent: <my user agent>"
    )
);
$context = stream_context_create($opts);

// Download the whole file into memory, then relay it to the client.
$page = file_get_contents($url, false, $context);
header('Content-Type: audio/mpeg');
echo $page;
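A minimal sketch of the streaming idea, assuming the same $url and context placeholders as above (this is an illustration, not the answerer's actual code): hand the stream straight to the output buffer instead of building one huge string.

$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "User-Agent: <my user agent>"
    )
);
$context = stream_context_create($opts);

header('Content-Type: audio/mpeg');
// readfile() accepts a stream context and echoes the data as it is read,
// so the client starts receiving bytes before the download finishes.
readfile($url, false, $context);

An equivalent approach is fopen() on the URL followed by fpassthru() on the handle.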
65%

In both scenarios we need to read large files: in the first we need to know what the data is, and in the second we don't care what the data is. Let's explore these options. For the second scenario, imagine we want to compress the contents of a particularly large API response; we don't care what it says, but we need to make sure it's backed up in a compressed form. Generators have other uses, but this one is demonstrably good for performant reading of large files, and if we need to work on the data, generators are probably the best way (a sketch follows the formatBytes helper below). fopen and file_get_contents have their own set of default options, but these are completely customizable; to define them, we need to create a new stream context.

The methods we’ll use to see how much memory is used are:

// formatBytes is taken from the php.net documentation
function formatBytes($bytes, $precision = 2) {
    $units = array("b", "kb", "mb", "gb", "tb");

    $bytes = max($bytes, 0);
    $pow = floor(($bytes ? log($bytes) : 0) / log(1024));
    $pow = min($pow, count($units) - 1);

    $bytes /= (1 << (10 * $pow));

    return round($bytes, $precision) . " " . $units[$pow];
}

// Reports the peak amount of memory the script has allocated so far.
memory_get_peak_usage();
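As a sketch of the generator approach mentioned above (the function name readTheFile and the file "shakespeare.txt" are illustrative, not from the original article):

// Read a large file line by line without loading it all into memory.
function readTheFile($path) {
    $handle = fopen($path, "r");
    while (($line = fgets($handle)) !== false) {
        yield trim($line);
    }
    fclose($handle);
}

foreach (readTheFile("shakespeare.txt") as $line) {
    // process one line at a time here
}

// Peak memory stays small because only one line is buffered at a time.
echo formatBytes(memory_get_peak_usage());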

40%

$ctx = stream_context_create(array('http' =>
   array(
      'timeout' => 1200, //1200 Seconds is 20 Minutes
   )
));

echo file_get_contents('http://example.com/', false, $ctx);
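If both issues apply, the two context options shown on this page can be combined in a single context; a minimal sketch, with http://example.com/ as a placeholder URL:

$ctx = stream_context_create(array(
    'http' => array(
        'header'  => "Connection: close\r\n", // ask the server to end the keep-alive
        'timeout' => 1200, // 1200 seconds is 20 minutes
    )
));

echo file_get_contents('http://example.com/', false, $ctx);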
22%

This article describes how to enable and disable the allow_url_fopen directive in a custom php.ini file. When the allow_url_fopen directive is enabled, you can write scripts that open remote files as if they are local files; for example, you can use the file_get_contents function to retrieve the contents of a web page. For more information about the file_get_contents function, please visit http://www.php.net/file_get_contents.

To enable this functionality, use a text editor to modify the allow_url_fopen directive in the php.ini file as follows:

allow_url_fopen = on
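As a sketch, you can check the directive at runtime with ini_get before attempting a remote fetch (the URL is a placeholder):

// Verify allow_url_fopen before trying to read a remote URL.
if (ini_get('allow_url_fopen')) {
    echo file_get_contents('http://example.com/');
} else {
    echo "allow_url_fopen is disabled in php.ini";
}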
