
I want to read a file line by line, but without completely loading it in memory.

My file is too large to open in memory, and if I try to do so I always get out of memory errors.

The file size is 1 GB.

You can use the fgets() function to read the file line by line:

$handle = fopen("inputfile.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        // process the line read.
    fclose($handle);
You are not reading the entire file into memory. The max memory needed to run this depends on the longest line in the input. – codaddict, Nov 6, 2012 at 7:54

@Brandin - Moot - In those situations, the asked question, which is to read a file LINE BY LINE, does not have a well-defined result. – ToolmakerSteve, Jun 30, 2016 at 18:57

@ToolmakerSteve Then define what should happen. If you want, you can just print the message "Line too long; giving up." and that is a well-defined result too. – Brandin, Jun 30, 2016 at 19:52

Can a line contain a boolean false? If so, this method would stop without reaching the end of the file. Example #1 at php.net/manual/en/function.fgets.php suggests that fgets() can sometimes return boolean false even though the end of the file has not yet been reached. In the comment section on that page, people report that fgets() doesn't always return correct values, so it's safer to use feof() as the loop conditional. – cjohansson, Dec 7, 2016 at 11:21

BTW: "If there is no more data to read in the file pointer, then FALSE is returned." php.net/manual/en/function.fgets.php ... Just in case. – everyman, Jan 27, 2016 at 10:33
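
To see the point about memory in practice, here is a minimal sketch that counts lines and reports peak memory afterwards (the file name is an assumption, and memory_get_peak_usage() is only there for illustration):

$handle = fopen("inputfile.txt", "r"); // assumed input path
if ($handle) {
    $count = 0;
    while (($line = fgets($handle)) !== false) {
        $count++; // process each line here instead of storing it
    }
    fclose($handle);
    // Peak memory tracks the longest single line, not the file size.
    echo "Read $count lines, peak memory: "
        . round(memory_get_peak_usage(true) / 1048576, 2) . " MB\n";
}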

You can use SplFileObject, an object-oriented interface class for a file: http://php.net/manual/en/splfileobject.fgets.php (PHP 5 >= 5.1.0)

$file = new SplFileObject("file.txt");

// Loop until we reach the end of the file.
while (!$file->eof()) {
    // Echo one line from the file.
    echo $file->fgets();
}

// Unset the file to call __destruct(), closing the file handle.
$file = null;

Much cleaner solution, thanks ;) I haven't used this class yet; there are more interesting functions to explore here: php.net/manual/en/class.splfileobject.php – Lukas Liesis, May 10, 2015 at 19:29

Thanks. Yes, for example you can add the line $file->setFlags(SplFileObject::DROP_NEW_LINE); before the while loop in order to drop newlines at the end of each line. – elshnkhll, Nov 9, 2015 at 18:22

Thanks! Also, use rtrim($file->fgets()) to strip trailing newlines from each line string that is read, if you don't want them. – racl101, Nov 22, 2017 at 23:32
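
Building on the comment about DROP_NEW_LINE, here is a minimal sketch that sets the flags up front and iterates with foreach; the particular flag combination is an assumption about the desired output, not part of the original answer:

$file = new SplFileObject("file.txt");
// READ_AHEAD is required for SKIP_EMPTY; DROP_NEW_LINE strips the trailing newline.
$file->setFlags(
    SplFileObject::READ_AHEAD | SplFileObject::SKIP_EMPTY | SplFileObject::DROP_NEW_LINE
);
foreach ($file as $line) {
    echo $line, PHP_EOL;
}
$file = null; // release the file handle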

If you want to use foreach instead of while when opening a big file, you probably want to encapsulate the while loop inside a Generator to avoid loading the whole file into memory:

/**
 * @return Generator
 */
$fileData = function () {
    $file = fopen(__DIR__ . '/file.txt', 'r');

    if (!$file) {
        return; // die() is a bad practice, better to use return
    }

    while (($line = fgets($file)) !== false) {
        yield $line;
    }

    fclose($file);
};

Use it like this:

foreach ($fileData() as $line) {
    // $line contains current line
}

This way you can process individual file lines inside the foreach().

Note: Generators require PHP >= 5.5

@NinoŠkopac: Can you explain why this solution is more memory-efficient? For instance, in comparison to the SplFileObject approach. – k00ni, Apr 24, 2020 at 11:12

Not sure what Tachi and The Onin's comments are comparing against, but I ran this against a 90 MB text file, compared it with codaddict's method, and found this to be 44% slower and use the same amount of memory. (Ran on PHP 7.3.) – artfulrobot, Apr 14, 2021 at 16:50

@Tachi you are probably confusing something. This solution is neither "faster" nor "slower" than the code in the accepted answer, let alone "hundred times". It's just a solution that allows one to use foreach instead of while, which looks nicer, but inside it is using exactly the same while loop used in other answers. – Your Common Sense, May 10, 2022 at 13:07
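
If you prefer a reusable helper over an inline closure, the same idea can be wrapped in a named generator function; the name readLines() and the fallback behaviour on fopen() failure are my own choices here (hypothetical), not part of the answer above:

/**
 * Yields one line at a time, so only a single line is held in memory.
 * @return Generator
 */
function readLines($path)
{
    $file = fopen($path, 'r');

    if (!$file) {
        return; // yields nothing if the file cannot be opened
    }

    while (($line = fgets($file)) !== false) {
        yield $line;
    }

    fclose($file);
}

foreach (readLines(__DIR__ . '/file.txt') as $line) {
    // process $line
}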

There is a file() function that returns an array of the lines contained in the file.

foreach (file('myfile.txt') as $line) {
    echo $line . "\n";
}
The one GB file would be all read into memory and converted to a more than one GB array... good luck. – FrancescoMM, Apr 15, 2015 at 9:07

This was not the answer to the question asked, but it does answer the more common question many people have when looking here, so it was still useful, thanks. – pilavdzice, Apr 26, 2016 at 14:17

file() is very convenient for working with small files. Especially when you want an array() as the end result. – ellipse-of-uncertainty, Jun 16, 2017 at 0:06
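
For the small-file case described in the comments, file() also accepts flags; a minimal sketch (the flag combination is an assumption about the output you want):

// Only sensible for files that comfortably fit in memory.
$lines = file('myfile.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($lines as $number => $line) {
    echo ($number + 1) . ': ' . $line . "\n";
}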

The obvious answer isn't in any of the other responses.
PHP has a neat streaming delimiter parser made for exactly this purpose.

$fp = fopen("/path/to/the/file", "r");
while (($line = stream_get_line($fp, 1024 * 1024, "\n")) !== false) {
  echo $line;
fclose($fp);
@ValterEkholm Yes, the newline from the end of each line becomes another normal character, as the delimiter is not the newline anymore. – Félix Adriyel Gagnon-Grenier, Sep 29, 2021 at 20:58
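
Since stream_get_line() takes the delimiter as an argument, the same loop can split on something other than "\n"; here is a sketch for records separated by a custom delimiter (the file path and the ';' delimiter are assumptions for illustration):

$fp = fopen("/path/to/records.txt", "r"); // assumed file of ';'-separated records
while (($record = stream_get_line($fp, 1024 * 1024, ";")) !== false) {
    // $record is returned without the ';' delimiter itself
    echo trim($record), PHP_EOL;
}
fclose($fp);
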
$filename = "test.txt";
$source_file = fopen( $filename, "r" ) or die("Couldn't open $filename");
while (!feof($source_file)) {
    $buffer = fread($source_file, 4096);  // use a buffer of 4KB
    $buffer = str_replace($old,$new,$buffer);
This deserves more love, as it will work with huge files, even files that have no carriage returns or exceedingly long lines... – Jimmery, Jun 11, 2015 at 13:50

I wouldn't be surprised if the OP didn't really care about actual lines and just wanted to e.g. serve a download. In that case, this answer is just fine (and what most PHP coders would do anyway). – Álvaro González, Jan 19, 2016 at 10:46

Sir, how will you locate the file inside the fopen(), by the way? Assuming we need to specify the URL for opening! – Deepak Keynes, Feb 25, 2021 at 7:48
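
If, as the second comment suggests, the goal is just to serve the file as a download rather than to process lines, a chunked fread() loop keeps memory flat; a minimal sketch (the file name and headers are assumptions about the use case):

$filename = "bigfile.zip"; // assumed file to serve
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($filename));
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');

$source_file = fopen($filename, 'rb') or die("Couldn't open $filename");
while (!feof($source_file)) {
    echo fread($source_file, 8192); // send the file in 8 KB chunks
    flush();                        // push each chunk to the client immediately
}
fclose($source_file);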

One of the popular solutions to this question will have issues with the newline character. It can be fixed pretty easily with a simple str_replace().

$handle = fopen("some_file.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        $line = str_replace("\n", "", $line);
    fclose($handle);
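
An alternative sketch: rtrim() with a character list also covers Windows-style \r\n endings, which the plain str_replace("\n", "", $line) above does not fully handle (it would leave a stray \r):

$handle = fopen("some_file.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        $line = rtrim($line, "\r\n"); // strips \n, \r\n and bare \r line endings
        // process $line
    }
    fclose($handle);
}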

This is how I manage very big files (tested with up to 100 GB), and it's faster than fgets():

$block = 1024 * 1024; // 1 MB, or could be anything higher than HDD block_size * 2
if ($fh = fopen("file.txt", "r")) {
    $left = '';
    while (!feof($fh)) { // read the file block by block
        $temp = fread($fh, $block);
        $lines = explode("\n", $temp);
        // prepend the leftover from the previous block to the first line
        $lines[0] = $left . $lines[0];
        // keep the last, possibly incomplete, line for the next iteration
        if (!feof($fh)) $left = array_pop($lines);
        foreach ($lines as $k => $line) {
            // do smth with $line
        }
    }
    fclose($fh);
}
@user151496 I think the $left variable is there for that purpose. Buffering is usually faster, so this is probably a better solution if you don't mind the added complexity. – Alexis Wilke, Jun 28, 2021 at 22:08
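
The same carry-over idea can be packaged as a generator so the caller still gets whole lines through foreach; a sketch combining the two approaches (the function name readLinesChunked() and the default block size are my own assumptions):

function readLinesChunked($path, $block = 1048576) // 1 MB blocks by default
{
    $fh = fopen($path, 'r');
    if (!$fh) {
        return; // yields nothing if the file cannot be opened
    }
    $left = '';
    while (!feof($fh)) {
        $lines = explode("\n", $left . fread($fh, $block));
        // keep the last, possibly incomplete, line for the next iteration
        $left = array_pop($lines);
        foreach ($lines as $line) {
            yield $line;
        }
    }
    if ($left !== '') {
        yield $left; // final line without a trailing newline
    }
    fclose($fh);
}

foreach (readLinesChunked("file.txt") as $line) {
    // do smth with $line
}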

SplFileObject is useful when it comes to dealing with large files.

function parse_file($filename)
{
    try {
        $file = new SplFileObject($filename);
    } catch (LogicException $exception) {
        die('SplFileObject : ' . $exception->getMessage());
    }

    while ($file->valid()) {
        $line = $file->fgets();
        // do something with $line
    }

    // don't forget to free the file handle.
    $file = null;
}

You can use the fgets() function in combination with a loop:

$filename = 'path/to/your/file.txt';
$handle = fopen($filename, 'r');
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        // Process the current line
        echo $line;
        // You can perform any desired operations on the line here
        // For example, you can parse and extract data from each line
        // Proceed to the next line
    }

    fclose($handle);
}

function read_file($filename = '')
{
    $buffer = array();
    $source_file = fopen($filename, "r") or die("Couldn't open $filename");
    while (!feof($source_file)) {
        $buffer[] = fread($source_file, 4096);  // use a buffer of 4KB
    }
    return $buffer;
}
This would create a single array of more than one GB in memory (good luck with it), divided not even into lines but into arbitrary 4096-character chunks. Why on earth would you want to do that? – FrancescoMM, Apr 15, 2015 at 9:09
