
Allowed memory size of 33554432 bytes exhausted (tried to allocate 43148176 bytes) in php [duplicate]

This question already has answers here: Fatal Error: Allowed Memory Size of 134217728 Bytes Exhausted (CodeIgniter + XML-RPC) (36 answers). Closed last year.
Increase your maximum memory limit to 64MB in your php.ini file (Google search). But could I ask why you are trying to allocate that much memory? What line of code does it fail at? – user19302 Jan 6, 2009 at 8:21

PHP can be very inefficient with memory usage; I have often seen simple datagrids blow well past 80 MB with a mere couple hundred records. This seems to happen especially when you go the OOP route. – TravisO Jan 6, 2009 at 17:16

Conventionally, you read files of potentially large or arbitrary size one line at a time, overwriting the previous line's memory with each read. Or you may just want to tail or head the file to get the latest entries. Upping your memory allocation as the file grows is not the answer. – ekerner May 21, 2011 at 1:12

If your script is genuinely expected to allocate that much memory, then you can increase the memory limit by adding this line to your PHP file:

ini_set('memory_limit', '44M');

where 44M is the amount of memory you expect the script to consume.

However, most of the time this error message means that the script is doing something wrong, and increasing the memory limit will just produce the same error message with different numbers.

Therefore, instead of increasing the memory limit, you should rewrite the code so it doesn't allocate that much memory: process large amounts of data in smaller chunks, unset variables that hold large values once they are no longer needed, and so on (a sketch follows).
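As an illustration, here is a minimal sketch of both techniques, assuming a line-oriented input file; the file name and the processRow() helper are hypothetical placeholders:

    $handle = fopen('large_data.csv', 'r'); // hypothetical input file
    while (($line = fgets($handle)) !== false) {
        $row = str_getcsv($line); // only the current line is held in memory
        processRow($row);         // hypothetical per-row processing
        unset($row);              // release the row before reading the next one
    }
    fclose($handle);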

You should still check why the memory is exhausted. Maybe you don't need to read the whole file; maybe read it sequentially. – macbirdie Jan 22, 2009 at 10:12

@panidarapu and @Don Jones: Depending on the amount of memory, and how this script is used, it could be dangerous to allow the change in memory usage in this way. Don, in your case, you can likely break the feed down into smaller chunks and parse what you need. Glad it works, but be careful. – anonymous coward Jun 11, 2010 at 20:51

This suggestion worked for me. Dynamically increasing the memory limit in the script is done via the function ini_set(): ini_set('memory_limit', '128M'); – mente Feb 12, 2011 at 4:16

Guys, please don't go for this quick fix. It can hurt you in the long run. As a good programmer, you should figure out the reason behind this memory consumption and increase the limit only as required, instead of keeping it UNLIMITED. – Sumoanand Jun 11, 2013 at 21:17
  • If you have access to your php.ini file, change the memory_limit line there. If it currently shows 32M, try 64M:

        memory_limit = 64M ; Maximum amount of memory a script may consume (64MB)

  • If you don't have access to php.ini, try adding this to an .htaccess file:

        php_value memory_limit 64M

  • Your script is using too much memory. This can often happen in PHP if you have a loop that has run out of control and you are creating objects or adding to arrays on each pass of the loop.

    Check for infinite loops.

    If that isn't the problem, try to help PHP out by destroying objects that you are finished with by setting them to null, e.g. $OldVar = null;

    Check the code where the error actually happens as well. Would you expect that line to be allocating a massive amount of memory? If not, try to figure out what has gone wrong; instrumenting the code can help, as in the sketch below.
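    To narrow down where memory climbs, here is a minimal sketch using PHP's built-in memory_get_usage() and memory_get_peak_usage(); the $items array and the doWork() helper are hypothetical placeholders:

        foreach ($items as $i => $item) {
            doWork($item); // hypothetical per-item processing
            if ($i % 1000 === 0) {
                // log current and peak usage to see where memory starts to climb
                printf("after %d items: %d bytes (peak %d)\n",
                    $i, memory_get_usage(), memory_get_peak_usage());
            }
        }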

    I had this exact problem: it turned out I had inadvertently created a recursive function, and thus it ran out of memory at a seemingly random point during execution. This had the upside of me now having the world's most memory-efficient code, created in the hunt for a memory leak. – Kris Selbekk Oct 15, 2013 at 18:13

    For the sake of others who will chase a rabbit down a hole: Doctrine in Symfony, I think, has an issue with Monolog, and when there is a PDO exception it will create an infinite loop of exceptions, as it throws an exception for the exception, thus hiding the real issue (a corrupted db file in my case). – Desislav Kamenov Feb 7, 2019 at 19:50

    Reading a large file into memory in one go is never good. If you want to read a very large file, it is best practice to copy it bit by bit. Try the following code:

    $path = 'path_to_file_.txt';
    $file = fopen($path, 'r');
    $len = 1024 * 1024; // 1 MiB per chunk is reasonable; you can choose anything, but do not make it too big
    $output = fread($file, $len);
    while (!feof($file)) {
        $output .= fread($file, $len);
    }
    fclose($file);
    echo 'Output is: ' . $output;
    Can't believe all these people recommending to set memory_limit to -1... Crazy thing to do on a production server. Thanks for a much cleaner solution. – JohnWolf May 16, 2015 at 15:50

    While we are at "best practice", it is good to close the file handle after the while loop: fclose($file). – kodeart Sep 9, 2015 at 9:39

    @assetCorp How does this help, provided the file has for example 100 MiB and the PHP memory limit is still set to 32 MiB? You read it in safe chunks of 1 MiB, but then append each chunk to a variable that will have used up all the available memory by the 31st iteration of the loop. How is it any better? Only outputting the chunks similarly, so as not to store them all in one variable, would help to solve the problem. – helvete Apr 4, 2018 at 9:37

    This can work as a short-term solution, but unless you're expecting this kind of memory usage, it's likely a sign that something is wrong. – qozle Oct 20, 2022 at 15:15

    It is unfortunately easy to program in PHP in a way that consumes memory faster than you realise. Copying strings, arrays and objects instead of using references will do it, though PHP 5 handles this more automatically than PHP 4 did. But dealing with your data set in its entirety over several steps is also wasteful compared to processing the smallest logical unit at a time. The classic example is working with large result sets from a database: most programmers fetch the entire result set into an array and then loop over it one or more times with foreach(). It is much more memory-efficient to use a while() loop to fetch and process one row at a time. The same applies to processing a file.
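    To illustrate the row-at-a-time pattern, here is a minimal sketch using PDO; the DSN, credentials, table name and processRow() helper are placeholders:

        $pdo  = new PDO('mysql:host=localhost;dbname=example', 'user', 'pass'); // placeholder DSN
        $stmt = $pdo->query('SELECT id, payload FROM big_table'); // hypothetical large table

        // fetch() returns one row at a time, unlike fetchAll(), which would
        // load the entire result set into a PHP array at once
        while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
            processRow($row); // hypothetical per-row handler
        }

    Note that with the MySQL driver you may also need to disable buffered queries ($pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);) so rows actually stream from the server instead of being buffered client-side.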

    If you want to read large files, you should read them bit by bit instead of all at once.

    It's simple math: if you read a 1 MB file at once, then at least 1 MB of memory is needed at the same time to hold the data.

    So you should read them bit by bit using fopen() and fread().

    Solved this by using: $fh = fopen($folder.'/'.$filename, "rb") or die(); $buffer = 1024*1024; while (!feof($fh)) { print(fread($fh, $buffer)); flush(); } fclose($fh); – Avatar Apr 28, 2014 at 11:09

    I was also having the same problem. I looked for phpinfo.ini, php.ini or .htaccess files to no avail. Finally I looked at some PHP files, opened them, and checked the code inside for memory settings. This is the solution I came up with, and it worked for me. I was using WordPress, so it may only apply to WordPress memory-limit problems.

    My solution: open the default-constants.php file in the /public_html/wp-includes folder with a code editor and find the memory settings under the wp_initial_constants scope, or just Ctrl+F for the word "memory". There you will come across WP_MEMORY_LIMIT and WP_MAX_MEMORY_LIMIT. Just increase them; in my case they were 64 MB, and I increased them to 128 MB and then to 200 MB.

    // Define memory limits.
    if ( ! defined( 'WP_MEMORY_LIMIT' ) ) {
        if ( false === wp_is_ini_value_changeable( 'memory_limit' ) ) {
            define( 'WP_MEMORY_LIMIT', $current_limit );
        } elseif ( is_multisite() ) {
            define( 'WP_MEMORY_LIMIT', '200M' );
        } else {
            define( 'WP_MEMORY_LIMIT', '128M' );
        }
    }
    if ( ! defined( 'WP_MAX_MEMORY_LIMIT' ) ) {
        if ( false === wp_is_ini_value_changeable( 'memory_limit' ) ) {
            define( 'WP_MAX_MEMORY_LIMIT', $current_limit );
        } elseif ( -1 === $current_limit_int || $current_limit_int > 268435456 /* = 256M */ ) {
            define( 'WP_MAX_MEMORY_LIMIT', $current_limit );
        } else {
            define( 'WP_MAX_MEMORY_LIMIT', '256M' );
        }
    }

    Btw, please don't use the following, because it's bad practice:

    ini_set('memory_limit', '-1');
    Works for WordPress websites where access is not available to the .htaccess and php.ini files. +1 – Mustafa sabir May 4, 2017 at 7:23

    I'd say that changing those limits in a 'core' WordPress file is not really a good idea; you can very easily add those limits in wp-config.php instead, where they will not get overwritten by future WordPress updates. Also, a few security plugins (such as WordFence, for instance) will complain if 'core' WordPress files are changed... – Gwyneth Llewelyn Jan 3, 2020 at 19:12

    Oh... just edit wp-config.php and add the two lines there, i.e. define( 'WP_MEMORY_LIMIT', '200M' ); and define( 'WP_MAX_MEMORY_LIMIT', '256M' );. Unlike the 'core' WP files (namely, everything under wp-includes), which will be overwritten by WP upgrades, wp-config.php will not; it exists precisely for the purpose of overriding WP constants! – Gwyneth Llewelyn Jan 7, 2020 at 20:38

    "Bad practice" is situational. -1 is fine for short-running processes, for example a PHP builder container that's used for running unit tests or composer installs, etc. Just don't run your production site with it set like that. – emmdee Jan 28, 2020 at 20:32

    I notice many answers just try to increase the amount of memory given to the script, which has its place, but more often than not it means that something is being too liberal with memory due to an unforeseen amount of volume or size. Obviously, if you're not the author of a script you're at the mercy of the author, unless you're feeling ambitious :) The PHP docs even say memory issues are due to "poorly written scripts".

    It should be mentioned that ini_set('memory_limit', '-1'); (no limit) can cause server instability, since 0 bytes free = bad things. Instead, find a reasonable balance based on what your script is trying to do and the amount of memory available on the machine (a bounded-raise sketch follows).
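    For example, here is a minimal sketch that raises the limit only up to a bounded ceiling; the 256M figure is an arbitrary placeholder, and the parsing assumes the limit is configured with the M suffix:

        $current = ini_get('memory_limit'); // e.g. "128M", or "-1" for unlimited
        // raise the limit only if it is lower than our ceiling; never remove it
        if ($current !== '-1' && (int) $current < 256) {
            ini_set('memory_limit', '256M'); // bounded ceiling, chosen per machine
        }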

    A better approach: If you are the author of the script (or ambitious) you can debug such memory issues with xdebug. The latest version (2.6.0 - released 2018-01-29) brought back memory profiling that shows you what function calls are consuming large amounts of memory. It exposes issues in the script that are otherwise hard to find. Usually, the inefficiencies are in a loop that isn't expecting the volume it's receiving, but each case will be left as an exercise to the reader :)

    The xdebug documentation is helpful, but it boils down to 3 steps:

  • Install it (available through apt-get, yum, etc.)
  • Configure it in xdebug.ini: xdebug.profiler_enable = 1, xdebug.profiler_output_dir = /where/ever/
  • View the resulting profiles in a tool like QCacheGrind or KCacheGrind (a sample xdebug.ini follows)
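For reference, here is a minimal sketch of those ini settings for the Xdebug 2.6 profiler; the extension path and output directory are placeholders you would adjust for your system:

    ; xdebug.ini (Xdebug 2.x) - load the extension; the path varies by install
    zend_extension = xdebug.so
    ; write a cachegrind profile file for every request
    xdebug.profiler_enable = 1
    xdebug.profiler_output_dir = /tmp/xdebug-profiles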
  • You can increase the memory allowed to a PHP script by executing the following line above all other code in the script:

        ini_set('memory_limit', '-1'); // removes the memory limit entirely

    And also deallocate any unwanted variables in the script.

    This isn't a great solution. This means (I believe) that you're letting the script run wild and use as much memory as it wants. Errors about using too much memory are likely a sign that something is wrong, so this gets rid of the symptom but kind of lets the problem get bigger (literally). – qozle Oct 20, 2022 at 15:17

    If you are trying to read a file, it will take up memory in PHP. For instance, if you try to open and read an MP3 file (like, say, $data = file("http://mydomain.com/path/sample.mp3")), it is going to pull it all into memory.

    As Nelson suggests, you can work to increase your maximum memory limit if you actually need to be using this much memory.

    We had a similar situation, and we tried the ini_set('memory_limit', '-1'); suggestion given at the top of the answers. Everything worked fine, and it compressed image files greater than 1 MB down to KBs.

    I want to share my experience on this issue!

    Suppose you have a class A and class B.

    class A {
        protected $userB;

        public function __construct() {
            $this->userB = new B();
        }
    }

    class B {
        protected $userA;

        public function __construct() {
            $this->userA = new A();
        }
    }

    This will initiate an infinite chain of object construction (each new A creates a B, which in turn creates another A, and so on), which can create exactly this kind of memory issue!
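    One way to break the chain, sketched under the assumption that B can accept the existing A instance instead of constructing a fresh one:

        class A {
            protected $userB;

            public function __construct() {
                // pass the existing instance instead of letting B build a new A
                $this->userB = new B($this);
            }
        }

        class B {
            protected $userA;

            public function __construct(A $userA) {
                $this->userA = $userA; // reuse, don't re-create
            }
        }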

    If you are using shared hosting, you will not be able to increase the PHP memory limit yourself.

    Just go to your cPanel, upgrade your PHP version to 7.1 or above, and you should be good to go.

    I had the same issue when running PHP on the command line. I had recently changed the php.ini file and made a mistake while editing it.

    This was for PHP 7.0.

    Path to the php.ini where I made the mistake: /etc/php/7.0/cli/php.ini

    I had set memory_limit = 256 (which means 256 bytes)
    instead of memory_limit = 256M (which means 256 megabytes):

    ; Maximum amount of memory a script may consume (128MB)
    ; http://php.net/memory-limit
    memory_limit = 128M

    Once I corrected it, my process started running fine.

    If you are using Laravel, print the request data rather than the whole Request object:

    public function getClientsListApi(Request $request) {
        print_r($request->all()); // the full request payload
        print_r($request->name);  // a single field
    }

    instead of

    public function getClientsListApi(Request $request) {
        print_r($request); // dumping the entire Request object triggers the error above
    }

    (print_r() on the whole Request object recursively dumps a very large object graph, which can exhaust memory.)