PHP running out of memory in all my scripts - apache

EDIT:
My php.ini has 256MB memory set:
;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
max_execution_time = 250 ; Maximum execution time of each script, in seconds
max_input_time = 120 ; Maximum amount of time each script may spend parsing request data
;max_input_nesting_level = 64 ; Maximum input variable nesting level
memory_limit = 256MB ; Maximum amount of memory a script may consume (256MB)
So I had a certain PHP script which was not very well written, and when I executed it, PHP ran out of memory and my PC froze. Before running the script I increased the memory limit in php.ini, and I changed it back to the default value afterwards.
The problem is that this seems to have done something to my PHP installation. Every PHP script I execute now tells me it has run out of memory, including scripts that worked without issue before.
It is as if the one bad script I mentioned earlier were still running in the background somehow.
I have restarted PHP and Apache, restarted my PC, and even slept on it for 8 hours. First thing in the morning, every PHP script still runs out of memory. What the hell?
I am now getting errors like this everywhere (with the file in the error changing, of course), with every single PHP script, even the simplest:
Fatal error: Allowed memory size of 262144 bytes exhausted (tried to allocate 6144 bytes) in D:\data\o\WebLib\src\Db\Db.php on line 241
Fatal error (shutdown): Allowed memory size of 262144 bytes exhausted (tried to allocate 6144 bytes) in D:\data\o\WebLib\src\Db\Db.php on line 241
Ok here is the script (I have commented out the bad parts):
<?php
error_reporting(E_ALL);
define('BASE_PATH', dirname(__FILE__));
require_once(BASE_PATH.'/../WebLib/config/paths.php');
require_once(PATH_TO_LIB3D_SRC.'/PHPExcel/Classes/PHPExcel.php');
require_once(PATH_TO_LIB3D_SRC.'/PHPExcel/Classes/PHPExcel/Reader/IReadFilter.php');
///** Define a Read Filter class implementing PHPExcel_Reader_IReadFilter */
//class chunkReadFilter implements PHPExcel_Reader_IReadFilter {
// private $_startRow = 0;
// private $_endRow = 0;
// /** Set the list of rows that we want to read */
// public function setRows($startRow, $chunkSize)
// {
// $this->_startRow = $startRow;
// $this->_endRow = $startRow + $chunkSize;
// }
// public function readCell($column, $row, $worksheetName = '')
// {
// // Only read the heading row, and the rows that are configured in $this->_startRow and $this->_endRow
// if (($row == 1) || ($row >= $this->_startRow && $row < $this->_endRow)) {
// return true;
// }
// return false;
// }
//}
//
//function ReadXlsxTableIntoArray($theFilePath)
//{
// $arrayData =
// $arrayOriginalColumnNames =
// $arrayColumnNames = array();
//
// $inputFileType = 'Excel2007';
// /** Create a new Reader of the type defined in $inputFileType **/
// $objReader = PHPExcel_IOFactory::createReader($inputFileType);
// /** Define how many rows we want to read for each "chunk" **/
// $chunkSize = 10;
// /** Create a new Instance of our Read Filter **/
// $chunkFilter = new chunkReadFilter();
// /** Tell the Reader that we want to use the Read Filter that we've Instantiated **/
// $objReader->setReadFilter($chunkFilter);
// $objReader->setReadDataOnly(true);
// /** Loop to read our worksheet in "chunk size" blocks **/
// /** $startRow is set to 2 initially because we always read the headings in row #1 **/
// for ($startRow = 1; $startRow <= 65536; $startRow += $chunkSize) {
// /** Tell the Read Filter, the limits on which rows we want to read this iteration **/
// $chunkFilter->setRows($startRow,$chunkSize);
// /** Load only the rows that match our filter from $inputFileName to a PHPExcel Object **/
// $objPHPExcel = $objReader->load($theFilePath);
// // Do some processing here
//
// $rowIterator = $objPHPExcel->getActiveSheet()->getRowIterator();
// foreach($rowIterator as $row){
//
// $cellIterator = $row->getCellIterator();
// //$cellIterator->setIterateOnlyExistingCells(false); // Loop all cells, even if it is not set
// if(1 == $row->getRowIndex ()) {
// foreach ($cellIterator as $cell) {
// $value = $cell->getCalculatedValue();
// $arrayOriginalColumnNames[] = $value;
// // let's remove the diacritique
// $value = iconv('UTF-8', 'ISO-8859-1//TRANSLIT', $value);
// // and white spaces
// $valueExploded = explode(' ', $value);
// $value = '';
// // capitalize the first letter of each word
// foreach ($valueExploded as $word) {
// $value .= ucfirst($word);
// }
// $arrayColumnNames[] = $value;
// }
// continue;
// } else {
// $rowIndex = $row->getRowIndex();
// reset($arrayColumnNames);
// foreach ($cellIterator as $cell) {
// $arrayData[$rowIndex][current($arrayColumnNames)] = $cell->getCalculatedValue();
// next($arrayColumnNames);
// }
// }
//
// unset($cellIterator);
// }
//
// unset($rowIterator);
// }
//
// // Free up some of the memory
// $objPHPExcel->disconnectWorksheets();
// unset($objPHPExcel);
//
// return array($arrayOriginalColumnNames, $arrayColumnNames, $arrayData);
//}
//
//if (isset($_POST['uploadFile'])) {
// //list($tableOriginalColumnNames, $tableColumnNames, $tableData) = ReadXlsxTableIntoArray($_FILES['uploadedFile']['tmp_name']);
// //CreateXMLSchema($tableOriginalColumnNames, 'schema.xml');
// //echo GetReplaceDatabaseTableSQL('posta_prehlad_hp', $tableColumnNames, $tableData);
//}

Change this in your php.ini:
memory_limit = 256M
Note that you are not supposed to use MB or KB, but M or K.

PHP expects the unit megabyte to be denoted by the single letter M. You specified 256MB. Notice the extra B.
Since PHP does not understand the unit MB, it falls back to the lowest named unit it knows: kilobyte (K). That is why the error reports an allowed memory size of 262144 bytes, which is exactly 256 KB.
Simply remove the extra B from your setting and PHP will read the value as 256 megabytes (256M).
Please see the FAQ entry on data size units in the PHP manual ("Using PHP").
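If it helps to see why 256MB lands at 262144 bytes, here is a rough Python model of the documented ini shorthand rules. This is illustrative only: the function name is mine, and real PHP does not raise an error on an unknown suffix, it silently misreads the value (in this build, as the kilobyte fallback quoted in the fatal error above).

```python
def php_shorthand_to_bytes(value: str) -> int:
    """Rough model of PHP's documented ini shorthand: only the LAST
    character is examined, and only K, M and G are recognized."""
    multipliers = {"K": 1024, "M": 1024 ** 2, "G": 1024 ** 3}
    suffix = value[-1].upper()
    if suffix in multipliers:
        return int(value[:-1]) * multipliers[suffix]
    # "256MB" ends in 'B', which is not a recognized suffix; real PHP
    # silently misparses such values, this model flags them instead
    raise ValueError(f"{value!r} is not valid PHP shorthand notation")

print(php_shorthand_to_bytes("256M"))  # 268435456 bytes: what was intended
print(php_shorthand_to_bytes("256K"))  # 262144 bytes: the limit in the error
```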

You have a memory_limit of 256K. That is far too little in nearly all cases. The default value is 16M (since PHP 5.2).

Are you sure you set your memory size back correctly? The error shows that your maximum memory is 262144 bytes, which is a quarter of a megabyte. That is really low!
Regarding your php settings: shouldn't that syntax be
memory_limit = 256M
I don't know whether it accepts both M and MB, but it might not.

Hey Richard. The changes cannot have taken effect, since PHP clearly states that you only have 256K set as the limit. Look through php.ini and all the other places the limit could be set; it could also be located in a .htaccess file on the vhost/host.

Have you restarted Apache after you edited php.ini and increased the memory_limit?


What is the best method to determine file size resource usage in SenseNet?

In order to charge appropriately for resource usage (e.g. database storage), we need to know the size of our clients' files. Is there a simple way to calculate the resource usage of a client's Workspace?
If you just want to know the size of the files in the workspace, you can use the function below, although total resource usage is likely much higher.
Calculate file size: useful, but not close to total storage.
public static int DocumentFileSizeMB(string path)
{
    var size = 0;
    var results = ContentQuery.Query(SafeQueries.TypeInTree, null, "File", path);
    if (results != null && results.Count > 0)
    {
        var longsize = results.Nodes.Sum(n => n.GetFullSize());
        size = (int)(longsize / 1000000);
    }
    return size;
}
To get a better idea of storage space, call the SenseNet function GetTreeSize() on a node. However, this still does not give the full resource usage, because some content that contributes to the node's footprint, such as index tables and log entries, is not stored beneath the node.
A better method, but still not the full resource usage.
public static int NodeStorageSizeMB(string path)
{
    var size = 0;
    var node = Node.LoadNode(path);
    if (node != null)
    {
        size = (int)(node.GetTreeSize() / 1000000); // Use 10^6 as mega, not 1024*1024 (that would be mebibytes).
    }
    return size;
}

rapidJson: crashed in release mode

I use rapidJson to read JSON data. I can build my application in both Debug and Release mode, but it crashes in Release mode.
using namespace rapidjson;
...
char *buffer;
long fileSize;
size_t fileReadingResult;
//obtain file size
fseek(pFile, 0, SEEK_END);
fileSize = ftell(pFile);
if (fileSize <= 0) return false;
rewind(pFile);
//allocate memory to contain the whole file
buffer = (char *)malloc(sizeof(char)*fileSize);
if (buffer == NULL) return false;
//copy the file into the buffer
fileReadingResult = fread(buffer, 1, fileSize, pFile);
if (fileReadingResult != fileSize) return false;
buffer[fileSize] = 0;
Document document;
document.Parse(buffer);
When I run it in Release mode, I encounter an unhandled exception: "A heap has been corrupted."
The application breaks at "res = _heap_alloc(size)" in malloc.c:
void * __cdecl _malloc_base (size_t size)
{
void *res = NULL;
// validate size
if (size <= _HEAP_MAXREQ) {
for (;;) {
// allocate memory block
res = _heap_alloc(size);
// if successful allocation, return pointer to memory
// if new handling turned off altogether, return NULL
if (res != NULL)
{
break;
}
if (_newmode == 0)
{
errno = ENOMEM;
break;
}
// call installed new handler
if (!_callnewh(size))
break;
// new handler was successful -- try to allocate again
}
It runs fine in Debug mode.
Maybe it is a memory leak issue around your malloc: it runs fine once in Debug, but crashes once the application has been running longer.
Do you free your buffer after using it?
The reason is simple: you allocate a buffer of fileSize bytes, but after reading the file you write to the (fileSize+1)-th position with buffer[fileSize] = 0;. That write lands one byte past the end of the allocation.
Fix: make the allocation one byte larger.
buffer = (char *)malloc(fileSize + 1);
Debug builds pad memory allocations with additional bytes, which is why the overflow does not crash there.
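Putting the fix together, here is a sketch of the corrected read-whole-file pattern (the helper name is mine, not from the question). The key change is allocating fileSize + 1 bytes so that the NUL terminator stays inside the allocation:

```c
#include <stdio.h>
#include <stdlib.h>

/* Read a whole file into a NUL-terminated buffer.
   Returns NULL on any failure; the caller must free() the result. */
static char *read_whole_file(const char *path, long *outSize)
{
    FILE *f = fopen(path, "rb");
    if (f == NULL) return NULL;

    if (fseek(f, 0, SEEK_END) != 0) { fclose(f); return NULL; }
    long size = ftell(f);
    if (size < 0) { fclose(f); return NULL; }
    rewind(f);

    char *buf = malloc((size_t)size + 1);      /* +1 for the terminator */
    if (buf == NULL) { fclose(f); return NULL; }

    if (fread(buf, 1, (size_t)size, f) != (size_t)size) {
        free(buf);
        fclose(f);
        return NULL;
    }
    buf[size] = '\0';                          /* now a valid write */
    fclose(f);

    if (outSize != NULL) *outSize = size;
    return buf;
}
```

The returned buffer can then be handed to document.Parse(buffer) as in the question; since plain Parse() (unlike ParseInsitu()) copies what it needs into the Document, the buffer can be freed right after parsing.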

redis bitset -- how to upload an existing bitset array

I have a huge bitset stored in a db. I want to upload it to a Redis bitset so I can perform bit operations on it. Is there a way to upload this data from either redis-cli or JavaScript code? I am using the bitset.js npm module to load the bitset into my program from the db.
One obvious way is to iterate over my bitset array in my JavaScript code and keep calling redis.setbit(...) repeatedly. Is there a way to upload all of it at once? If so, how?
A bitset in Redis is actually just a string, so you can assign to it directly, all at once. The bits of the string are the bits of the bitfield, in left-to-right order: setting bit number 0 to 1 yields the binary number 10000000, a single byte with the value 128. This looks like "\x80" when Redis prints it, which you can see for yourself by running setbit foo 0 1 and then get foo in redis-cli.
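This bit-to-byte mapping can be checked without Redis at all. A tiny sketch of the same MSB-first packing (the helper name is mine):

```javascript
// MSB-first packing: bit 0 of the bitfield becomes the HIGH bit of byte 0,
// which is why SETBIT key 0 1 produces the single byte 0x80 ("\x80").
function bitsToByte(bits) {
  return bits.reduce((byte, bit) => (byte << 1) | bit, 0);
}

console.log(bitsToByte([1, 0, 0, 0, 0, 0, 0, 0])); // 128 (0x80)
console.log(bitsToByte([0, 0, 0, 0, 0, 0, 0, 1])); // 1
```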
So to construct the right string to send to Redis, we just need to read the bits out of your BitSet and construct a buffer, one byte at a time, with the appropriate bits set.
Below is code that uses bitset.js and the redis npm module to transfer a BitSet in JavaScript into a Redis key. Note that this code assumes that the bitfield fits comfortably in memory.
let redis = require('redis'),
    BitSet = require('./bitset');

let client = redis.createClient();

// create some data
let bs = new BitSet;
bs.set(0, 1);
bs.set(31, 1);

// calculate how many bytes we'll need; msb() is the index of the highest
// set bit, so bits 0..msb() need msb()+1 bits in total
let numBytes = Math.ceil((bs.msb() + 1) / 8);

// construct a zero-filled buffer with that much space
// (Buffer.alloc, unlike the deprecated `new Buffer`, does not hand back
// uninitialized memory)
let buffer = Buffer.alloc(numBytes);

// for each byte
for (let i = 0; i < numBytes; i++) {
    let byte = 0;
    // iterate over each bit
    for (let j = 0; j < 8; j++) {
        // slide previous bits to the left
        byte <<= 1;
        // and set the rightmost bit
        byte |= bs.get(i * 8 + j);
    }
    // put this byte in the buffer
    buffer[i] = byte;
}

// now we have a complete buffer to use as our value in Redis
client.set('bitset', buffer, function (err, result) {
    client.getbit('bitset', 31, function (err, result) {
        console.log('Bit 31 = ' + result);
        client.del('bitset', function () {
            client.quit();
        });
    });
});

Cancelling dispatch_io_read

I'm parsing a very large CSV file using GCD functions (please see code below).
If I encounter an error I'd like to cancel dispatch_io_read. Is there a way to do that?
dispatch_io_read(channel,
0,
Int.max,
dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0))
{ (done, data, error) in
guard error == 0 else {
print("Read Error: \(error)")
return
}
if done {
lineBuffer.dealloc(bufferSize)
}
dispatch_data_apply(data)
{ (region, offset, buffer, size) -> Bool in
print(size)
let bytes = UnsafePointer<UInt8>(buffer)
for var i = 0; i < size; i++ {
switch bytes[i] {
case self.cr: // ignore \r
break
case self.lf: // newline
lineBuffer[bufferLength] = 0x00 // Null terminated
line(line: String(UTF8String: lineBuffer)!)
bufferLength = 0
case _ where bufferLength < (bufferSize - 1): // Leave space for null termination
lineBuffer[bufferLength++] = CChar(bytes[i])
default:
return false // Overflow! I would like to stop reading the file here.
}
}
return true
}
}
Calling dispatch_io_close(DISPATCH_IO_STOP) will cause running dispatch_io_read operations to be interrupted and their handlers to be passed the ECANCELED error (along with any partial results); see the dispatch_io_close(3) manpage.
Note that this does not interrupt the actual I/O system calls; it only prevents additional I/O system calls from being entered, so you may have to set an I/O channel high-water mark to get the appropriate level of I/O granularity for your application.

Embedded: SDHC SPI write issue

I am currently working on a logger that uses an MSP430F2618 MCU and a SanDisk 4 GB SDHC card.
Card initialization works as expected; I can also read the MBR and the FAT table.
The problem is that I can't write any data to it. I have checked whether it is write-protected by the notch, but it is not. Windows 7 has no problem reading from or writing to it.
However, I used a tool called "HxD" and tried to alter some sectors (under Windows). When I try to save the content to the SD card, the tool pops up a window telling me "Access denied!".
Then I came back to my code for writing to the SD card:
uint8_t SdWriteBlock(uchar_t *blockData, const uint32_t address)
{
    uint8_t result = OP_ERROR;
    uint16_t count;
    uchar_t dataResp;
    uint8_t idx;

    for (idx = RWTIMEOUT; idx > 0; idx--)
    {
        CS_LOW();
        SdCommand(CMD24, address, 0xFF);
        dataResp = SdResponse();
        if (dataResp == 0x00)
        {
            break;
        }
        else
        {
            CS_HIGH();
            SdWrite(0xFF);
        }
    }
    if (0x00 == dataResp)
    {
        // command accepted; now send the data, starting with DATA TOKEN = 0xFE
        SdWrite(0xFE);
        // send 512 bytes of data
        for (count = 0; count < 512; count++)
        {
            SdWrite(*blockData++);
        }
        // now send two CRC bytes; though the CRC is not used in SPI mode,
        // it is still required by the transfer format
        SdWrite(0xFF);
        SdWrite(0xFF);
        // now read in the DATA RESPONSE TOKEN
        do
        {
            SdWrite(0xFF);
            dataResp = SdRead();
        }
        while (dataResp == 0x00);
        // following the DATA RESPONSE TOKEN come a number of BUSY bytes:
        // a zero byte indicates the SD/MMC is busy programming,
        // a non-zero byte indicates the SD/MMC is no longer busy
        dataResp = dataResp & 0x0F;
        if (0x05 == dataResp)
        {
            idx = RWTIMEOUT;
            do
            {
                SdWrite(0xFF);
                dataResp = SdRead();
                if (0x0 == dataResp)
                {
                    result = OP_OK;
                    break;
                }
                idx--;
            }
            while (idx != 0);
            CS_HIGH();
            SdWrite(0xFF);
        }
        else
        {
            CS_HIGH();
            SdWrite(0xFF);
        }
    }
    return result;
}
The problem seems to be where I wait for the card status:
do
{
    SdWrite(0xFF);
    dataResp = SdRead();
}
while (dataResp == 0x00);
Here I am waiting for a response of the form "X5" (hex), where X is undefined.
But in most cases the response is 0x00 and I never get out of the loop. In a few cases the response is 0xFF.
I can't figure out what the problem is.
Can anyone help me? Thanks!
We need to see much more of your code. Many µC SPI codebases only support SD cards of 2 GB or less, so using a smaller card might work.
You can check this yourself: SDHC requires a CMD8 and an ACMD41 after the CMD0 (GO_IDLE_STATE) command; otherwise you cannot read or write any data.
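One concrete thing to verify in the init path: CMD0 and CMD8 are the commands that must still carry a valid CRC7 byte (CMD0 is sent before the card has switched to SPI mode, and CMD8's CRC is always checked), while CRC checking is otherwise disabled in SPI mode. Here is a minimal, host-testable sketch of the CRC7 framing (the function name is mine, not from the poster's code):

```c
#include <stdint.h>
#include <stddef.h>

/* CRC7 over an SD command frame (command byte + 4 argument bytes),
   polynomial x^7 + x^3 + 1. The result is returned already shifted
   left with the mandatory end bit set, ready to send as the 6th byte
   of the command frame. */
static uint8_t sd_crc7(const uint8_t *frame, size_t len)
{
    uint8_t crc = 0;
    for (size_t i = 0; i < len; i++) {
        uint8_t d = frame[i];
        for (int bit = 0; bit < 8; bit++) {
            crc <<= 1;
            if ((d & 0x80) ^ (crc & 0x80))
                crc ^= 0x09;
            d <<= 1;
        }
    }
    return (uint8_t)((crc << 1) | 1);
}
```

With this, the frame for CMD0 ({0x40, 0x00, 0x00, 0x00, 0x00}) yields the well-known CRC byte 0x95, and CMD8 with argument 0x000001AA yields 0x87. If your SdCommand() does not send exactly those bytes during initialization, an SDHC card will never leave the idle state.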
Thank you for your answers, but I solved the problem myself. It was a timing issue: I had to insert a delay at specific points.