How to Avoid Memory Exhaustion When Processing Large Files in PHP Using File_get_contents?
Why file_get_contents Causes Memory Exhaustion on Large Files
When working with large files in PHP, using file_get_contents to read the entire file into a variable can trigger memory exhaustion errors. The function loads the whole file contents into a single in-memory string, so for large files the allocation can exceed PHP's memory_limit setting and abort the script with a fatal "Allowed memory size exhausted" error.
To overcome this issue, a more efficient approach is to use file pointers and process the file in chunks. This way, only the current portion of the file is held in memory at any given time.
Here's a custom function that implements this chunked file processing:
<code class="php">function file_get_contents_chunked($file, $chunk_size, $callback) {
    $handle = fopen($file, "r");
    if ($handle === false) {
        trigger_error("file_get_contents_chunked::could not open $file", E_USER_NOTICE);
        return false;
    }

    $i = 0;
    while (!feof($handle)) {
        // Only $chunk_size bytes are held in memory per iteration.
        $callback(fread($handle, $chunk_size), $handle, $i);
        $i++;
    }

    fclose($handle);
    return true;
}</code>
Note that fopen does not throw an exception on failure; it returns false, so the function checks the handle explicitly rather than relying on a try/catch block.
To use this function, define a callback function to handle each chunk of data:
<code class="php">$success = file_get_contents_chunked("my/large/file", 4096, function ($chunk, &$handle, $iteration) {
    // Process the current 4 KB chunk here.
});</code>
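As a concrete illustration, here is a hypothetical callback that tallies bytes and line counts while streaming through a file. The helper function is repeated so the sketch is self-contained, and a temporary file stands in for the article's "my/large/file" placeholder:

```php
<?php
// Sketch of a concrete callback for the chunked reader described above.
// The temp file and the byte/line tallies are illustrative assumptions.
function file_get_contents_chunked($file, $chunk_size, $callback) {
    $handle = fopen($file, "r");
    if ($handle === false) {
        return false;
    }
    $i = 0;
    while (!feof($handle)) {
        $callback(fread($handle, $chunk_size), $handle, $i);
        $i++;
    }
    fclose($handle);
    return true;
}

// Create a throwaway file: 1,000 lines of 11 bytes each (11,000 bytes total).
$tmp = tempnam(sys_get_temp_dir(), "chunked");
file_put_contents($tmp, str_repeat("0123456789\n", 1000));

$bytes = 0;
$lines = 0;
$success = file_get_contents_chunked($tmp, 4096, function ($chunk, $handle, $i) use (&$bytes, &$lines) {
    // Each invocation sees at most 4 KB, regardless of total file size.
    $bytes += strlen($chunk);
    $lines += substr_count($chunk, "\n");
});
unlink($tmp);
```

Because the closure receives one chunk at a time, peak memory stays near the chunk size rather than the file size; any running totals are carried across iterations via `use (&...)` references.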
Additionally, consider refactoring your regex operations to use native string functions like strpos, substr, trim, and explode. This can significantly improve performance when working with large files.
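As a hedged sketch of that suggestion, the helpers below (hypothetical names) replace two common regex patterns with native string functions: splitting a chunk into lines without preg_split, and extracting a key before a delimiter without preg_match:

```php
<?php
// Split a chunk into lines using str_replace/trim/explode instead of
// preg_split('/\r?\n/', $chunk) — no regex engine involved.
function split_chunk_lines($chunk) {
    $normalized = str_replace("\r\n", "\n", $chunk); // normalize CRLF to LF
    return explode("\n", trim($normalized));         // drop trailing newline, split
}

// Extract the text before the first ":" using strpos/substr instead of
// preg_match('/^([^:]*):/', $line, $m).
function field_before_colon($line) {
    $pos = strpos($line, ":");
    return $pos === false ? $line : substr($line, 0, $pos);
}

$parts = split_chunk_lines("alpha\r\nbeta\ngamma\n");
$key = field_before_colon("status: ok");
```

These byte-level functions avoid compiling and running a pattern on every chunk, which adds up quickly when a multi-gigabyte file is processed 4 KB at a time.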