php - MySQL INSERT fails but no error thrown: max_allowed_packet settings show conflicting results


I successfully receive data on the server via $_POST, but I can't insert it into a table that holds longblob columns when the row data is above ~25MB. No errors are thrown by the PHP prepared statements, and the PHP script finishes gracefully. I'm checking whether my max_allowed_packet is correctly configured.

I don't have root access to the server and can't see my.cnf, so I'm talking to my host to ensure both the client and server max_allowed_packet are set to 256M.

PHP file insert section:


$itemName = $_POST['itemName'];
$itemAttributes = $_POST['itemAttributes'];

if (!($stmt = $conn->prepare("INSERT INTO $itemTable (`itemName`, `itemAttributes`) VALUES (?, ?)"))) {
    echo "<br/>Failed to Prepare";
} else {
    echo "<br/>Prepare success.";
    if ($stmt->execute([$itemName, $itemAttributes]))
        echo " Success executing";
}

$conn = null;
echo "<br />Connection Closed.......";

These are the checks I am doing. Am I missing any, to make sure max_allowed_packet isn't overridden elsewhere and is set up correctly everywhere?

Client setting check: all three of these checks report a 256MB max_allowed_packet:

mysql SHOW VARIABLES LIKE 'max_allowed_packet' 
mysql SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet'
mysql SHOW SESSION VARIABLES LIKE 'max_allowed_packet'

However, when I successfully log in to the mysql command prompt (mysql --user userName --password databaseName) and check max_allowed_packet, it shows NULL. What does that mean?

mysql> select @max_allowed_packet;
+---------------------+
| @max_allowed_packet |
+---------------------+
| NULL                |
+---------------------+
1 row in set (0.00 sec)

Server setting check: how can I check the server's max_allowed_packet for GLOBAL and SESSION? I tried the commands above, replacing "mysql" with "mysqld", but I get no result, only warnings that I can't relate to the size of max_allowed_packet:

[[email protected] ~]$ mysqld SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet'
2020-02-12T07:47:33.832292Z 0 [Warning] Could not increase number of max_open_files to more than 1024 (request: 65536)
2020-02-12T07:47:33.832480Z 0 [Warning] Changed limits: max_connections: 214 (requested 256)
2020-02-12T07:47:33.832487Z 0 [Warning] Changed limits: table_open_cache: 400 (requested 2000)
2020-02-12T07:47:33.945335Z 0 [Warning] TIMESTAMP with implicit DEFAULT value is deprecated. Please use --explicit_defaults_for_timestamp server option (see documentation for more details).
2020-02-12T07:47:33.945476Z 0 [Warning] Can't create test file /var/lib/mysql/cloud.lower-test
2020-02-12T07:47:33.945569Z 0 [Note] mysqld (mysqld 5.7.29) starting as process 75 ...
2020-02-12T07:47:33.947615Z 0 [Warning] Can't create test file /var/lib/mysql/cloud.lower-test
2020-02-12T07:47:33.947631Z 0 [Warning] Can't create test file /var/lib/mysql/cloud.lower-test
2020-02-12T07:47:33.948025Z 0 [ERROR] Could not open file '/var/log/mysqld.log' for error logging: Permission denied
2020-02-12T07:47:33.948066Z 0 [ERROR] Aborting
2020-02-12T07:47:33.948138Z 0 [Note] Binlog end
2020-02-12T07:47:33.948287Z 0 [Note] mysqld: Shutdown complete

mysqldump check: finally, I tried to read max_allowed_packet for mysqldump with the three commands above (e.g. mysqldump SHOW VARIABLES LIKE 'max_allowed_packet'), substituting "mysqldump" for "mysql", but I get access denied. As an alternative I then successfully ran mysqldump itself (mysqldump --user userName --password databaseName), but I just get a lot of output scrolling through the screen, so I can't read the value that way either.



As it turns out, this has nothing to do with MySQL directly.

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32000179 bytes) in /home/client/public_html/phpfile.php on line 175

This is a PHP error message, which tells you that your script has exceeded the 128M memory limit.

You can increase the limit either in your php.ini config file or in your script with:

ini_set('memory_limit', '256M');

You can use -1 as the value to disable the limit completely.
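Because a php.ini, .htaccess, or host-level setting may override what you expect, it can help to confirm the effective limit at runtime. A minimal sketch; '256M' is just an example value:

```php
<?php
// Read the current memory limit, attempt to raise it, and read it back.
echo "before: ", ini_get('memory_limit'), "\n";

if (ini_set('memory_limit', '256M') === false) {
    // Some hosts lock this setting; ini_set() then returns false.
    echo "could not change memory_limit at runtime\n";
}

echo "after: ", ini_get('memory_limit'), "\n";
```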

However, you should avoid copying large amounts of data in the first place.

The following case is clear:

$itemAttributes = $_POST['itemAttributes'];

You copy 32M of data into a new variable, and your script is at least at 64M now.

The execute() method with parameters is more tricky, and I'm not sure if the following is exactly true: you pass a new array as parameter:

$stmt->execute([$itemName, $itemAttributes]);

This array first needs to be created in memory before it is passed to execute(). That again consumes at least 32M more. Then, due to the internal implementation (which I don't know), every array element is passed to something like bindValue(), which will again copy all the data. At this point your script is already at the 128M limit (32M × 4).
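You can watch such allocations yourself with memory_get_usage(). A small self-contained sketch; note that PHP strings are copy-on-write, so a plain assignment only duplicates the bytes once one side is actually modified (or when an internal operation forces a copy):

```php
<?php
// Measure how string allocation and copy-on-write behave.
$base = memory_get_usage();

$data = str_repeat('x', 8 * 1024 * 1024);   // allocate an ~8 MB string
$afterBuild = memory_get_usage() - $base;

$copy = $data;                              // copy-on-write: no duplication yet
$afterAssign = memory_get_usage() - $base;

$copy[0] = 'y';                             // writing forces the real 8 MB copy
$afterWrite = memory_get_usage() - $base;

printf("build: +%.0f MB, assign: +%.0f MB, write: +%.0f MB\n",
       $afterBuild / 1048576, $afterAssign / 1048576, $afterWrite / 1048576);
```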

So you should do something like the following:

Remove these lines:

$itemName = $_POST['itemName'];
$itemAttributes = $_POST['itemAttributes'];
Prepare the statement:

$stmt = $conn->prepare("INSERT INTO $itemTable (`itemName`, `itemAttributes`) VALUES (?, ?)");

Bind the parameters with bindParam:

$stmt->bindParam(1, $_POST['itemName'], PDO::PARAM_STR);
$stmt->bindParam(2, $_POST['itemAttributes'], PDO::PARAM_LOB);

bindParam() is using call by reference, which you can see in the description:

public PDOStatement::bindParam ( mixed $parameter , mixed &$variable [, int $data_type = PDO::PARAM_STR [, int $length [, mixed $driver_options ]]] ) : bool

The & in &$variable indicates that no copy will be made; instead, a reference to the value is passed.

Note that I am usually not a fan of call by reference, and avoid it when not dealing with resource-critical code. execute([..]) is fine in most cases.
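Putting the pieces together, here is a self-contained sketch of the full bindParam flow. For illustration it uses an in-memory SQLite database so it runs anywhere; with MySQL only the DSN and credentials differ. The table name, column names, and stand-in payload are placeholders following the question:

```php
<?php
// Sketch: prepare, bind by reference, and execute a LOB insert via PDO.
$conn = new PDO('sqlite::memory:');
$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$itemTable = 'items'; // placeholder table
$conn->exec("CREATE TABLE $itemTable (itemName TEXT, itemAttributes BLOB)");

// Stand-ins for the request data; the real script would use $_POST directly.
$itemName       = 'example';
$itemAttributes = str_repeat('x', 1024 * 1024); // 1 MB payload

$stmt = $conn->prepare("INSERT INTO $itemTable (itemName, itemAttributes) VALUES (?, ?)");
$stmt->bindParam(1, $itemName, PDO::PARAM_STR);
$stmt->bindParam(2, $itemAttributes, PDO::PARAM_LOB); // bound by reference, not copied
$stmt->execute();

$stored = (int) $conn->query("SELECT length(itemAttributes) FROM $itemTable")->fetchColumn();
echo "stored $stored bytes\n";
```

Binding by reference means the 1 MB payload is never duplicated into an argument array before the driver sees it, which is exactly the saving described above.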

If you want to see how much memory has been allocated, you can call memory_get_peak_usage() somewhere at the end of your script.
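A short sketch: memory_get_peak_usage() reports the high-water mark, so it still reveals a temporary spike even after the data has been freed:

```php
<?php
// Allocate and immediately free ~8 MB; the peak still records it.
$blob = str_repeat('x', 8 * 1024 * 1024);
unset($blob);

printf("current: %d MB, peak: %d MB\n",
       memory_get_usage() >> 20, memory_get_peak_usage() >> 20);
```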
