SPIFFS upload size explanation

I have an ESP8266, specifically the ESP-12-E.
I am using this Arduino tool to upload data to the 4 MB of SPIFFS storage available to me: https://github.com/esp8266/arduino-esp8266fs-plugin
I am uploading a couple of files, which by my calculation add up to 148,269 bytes (~145 KB).
When I upload the files, I see this output:
[SPIFFS] data : /Users/andy/Documents/Personal/proj/p/relay_controller/data
[SPIFFS] size : 1004
[SPIFFS] page : 256
[SPIFFS] block : 8192
/css/bootstrap.min.css
/pretty.html
/wifi.htm
skipping .DS_Store
[SPIFFS] upload : /var/folders/kq/jq1t3j257tn7n_dqy0m9_zl80000gn/T/arduino_build_804509/relay_controller.spiffs.bin
[SPIFFS] address: 0x300000
[SPIFFS] reset : nodemcu
[SPIFFS] port : /dev/cu.SLAB_USBtoUART
[SPIFFS] speed : 115200
Uploading 1028096 bytes from /var/folders/kq/jq1t3j257tn7n_dqy0m9_zl80000gn/T/arduino_build_804509/relay_controller.spiffs.bin
The bits I have questions about are the size line and the final upload size.
It says size : 1004. What is this 1004?!
Most importantly, at the end it says it is uploading 1,028,096 bytes, which is 1004 KB and must explain the 1004. But why is it uploading 1004 KB when my data folder is just ~145 KB?
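A quick check of the arithmetic (a minimal sketch in Python; the partition interpretation is my own reading of the log, not something the tool prints):

image_bytes = 1028096
print(image_bytes / 1024)   # 1004.0 -> matches the "[SPIFFS] size : 1004" line
print(hex(image_bytes))     # 0xfb000 -> the size of a whole SPIFFS area, not of the files
files_bytes = 148269
print(files_bytes / 1024)   # ~144.8 -> the ~145 KB of actual data

This is consistent with the plugin building an image of the entire SPIFFS partition (everything from the 0x300000 address to the end of the flash, minus the system area) and uploading it whole, however little data is in it.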

Related

df.to_parquet giving error - realloc of size 1073741824 failed

When I try to export a df with 240,000,000 rows to parquet, I get an error:
realloc of size 1073741824 failed
I don't want to split the file. Can I use another solution?
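One approach that keeps everything in a single file is to write the DataFrame in slices, each slice becoming a row group in the same parquet file, using pyarrow's ParquetWriter. A minimal sketch, assuming the frame is called df and that one slice fits in memory (the chunk size and output name are illustrative, not prescriptive); note that the failing realloc is exactly 1 GiB (1024**3 bytes), i.e. one huge buffer for the whole frame:

import pyarrow as pa
import pyarrow.parquet as pq

def write_in_chunks(df, path, chunk_rows=5_000_000):
    # Each slice becomes one row group inside the same file, so peak
    # memory is bounded by one chunk instead of the whole DataFrame.
    writer = None
    try:
        for start in range(0, len(df), chunk_rows):
            table = pa.Table.from_pandas(df.iloc[start:start + chunk_rows])
            if writer is None:
                writer = pq.ParquetWriter(path, table.schema)
            writer.write_table(table)
    finally:
        if writer is not None:
            writer.close()

write_in_chunks(df, "out.parquet")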

How to increase the flashing speed using SAM-BA?

I am trying to flash my Yocto-generated image to a SAM9X60-EK board, but it's insanely slow. I think it's because of the write buffer size, which is only 8 KB (while the image is around 150 MB), as shown below:
$ sam-ba -p serial -b sam9x60-ek -a nandflash -c write:microchip-headless-image-sam9x60ek.ubi:0x800000
Output:
Opening serial port 'ttyACM0'
Connection opened.
Detected memory size is 536870912 bytes.
Page size is 4096 bytes.
Buffer is 8192 bytes (2 pages) at address 0x0030aca0.
NAND header value is 0xc1e04e07.
Supported erase block size: 256KB
Wrote 8192 bytes at address 0x00800000 (0.01%)
Wrote 8192 bytes at address 0x00802000 (0.01%)
…
…
Wrote 8192 bytes at address 0x0015c000 (99.83%)
Wrote 8192 bytes at address 0x0015e000 (100%)
Is there any way to make this process faster?
You can try running this command before burning; as far as I know, the lowlevel applet performs the low-level clock initialization, so the subsequent writes run with the clocks set up properly:
sam-ba -p serial -b sam9x60-ek -a lowlevel
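For scale, a rough count of the write transactions implied by the 8 KB buffer (illustrative arithmetic only):

image_bytes = 150 * 1024 * 1024
buffer_bytes = 8192
print(image_bytes // buffer_bytes)  # 19200 separate 8 KB writes for a ~150 MB image

Each of those writes carries its own round-trip overhead over the serial link, which is why a small buffer hurts so much.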

Spectrum table Manifest file when S3 file size is in decimal

I am reading an S3 file by creating a Spectrum external table and pointing it to a manifest file which contains the information about the source S3 file.
The problem is when my S3 file size is in decimal, e.g. 37.5 MB or 100.2 KB.
As per the documentation, we need to provide the file size in bytes. When I use a multiplier of 1000 to convert to bytes, I lose some records or some data at the end of the file in the external table.
But when I use a multiplier of 1024 to convert to bytes, my converted file size would be in decimal.
Consider I have a file size 100.2 KB, so in bytes it would be 102604.8 Bytes.
When I provide the file size as 102604.8 in the manifest file, I get the error "File entry does not have content length set".
When I round up to the next integer, 102605, I get a "Spectrum Error".
When I round down to the previous integer, 102604, I again get the same "Spectrum Error".
My manifest looks like:
{
  "entries": [
    {"url": "s3://path/filename1.csv", "meta": {"content_length": 102605}},
    {"url": "s3://path/filename2.csv", "meta": {"content_length": 102605}}
  ]
}
Has anybody here faced such a scenario, and can you share your input?
And what is the actual size of the file?
Consider I have a file size 100.2 KB, so in bytes it would be 102604.8 Bytes.
The value of 100.2 KB is not the exact file size in bytes. A file's size is always a whole number of bytes, so 102604.8 cannot be the real size.
You can check the size of the file by copying it to your local computer and calling
stat -f%z my_file.csv
(that is the BSD/macOS form; on Linux it is stat -c%s my_file.csv).
You can also check the metadata of the S3 object directly, e.g. with the AWS CLI:
aws s3api head-object --bucket my_bucket --key my_objects_key --query 'ContentLength'
In our system we use the latter (via the boto3 Python library) to assemble the manifest file, and it works without any problems.
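A minimal sketch of that approach (the bucket and key names are made up; head_object returns the exact object size in bytes as ContentLength):

import json
import boto3

s3 = boto3.client("s3")

def manifest_entry(bucket, key):
    # ContentLength is the exact size in bytes -- no KB/MB rounding involved.
    size = s3.head_object(Bucket=bucket, Key=key)["ContentLength"]
    return {"url": "s3://{}/{}".format(bucket, key), "meta": {"content_length": size}}

manifest = {"entries": [manifest_entry("my_bucket", key)
                        for key in ("filename1.csv", "filename2.csv")]}
print(json.dumps(manifest, indent=2))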
For debugging you could also have a look at some internal Redshift tables such as STL_ERROR or SVL_S3LOG.

A list of errors on my site

I don't know why my site gives me these errors. This is the list of errors.
Please help me! What shall I do?
Fatal error: Out of memory (allocated 6029312) (tried to allocate 8192 bytes) in /home/lifegat/domains/life-gate.ir/public_html/includes/functions.php on line 7216
Fatal error: Out of memory (allocated 7602176) (tried to allocate 1245184 bytes) in /home/lifegat/domains/life-gate.ir/public_html/misc.php(89) : eval()'d code on line 1534
Fatal error: Out of memory (allocated 786432) (tried to allocate 1245184 bytes) in /home/lifegat/domains/life-gate.ir/public_html/showthread.php on line 1789
Fatal error: Out of memory (allocated 7340032) (tried to allocate 30201 bytes) in /home/lifegat/domains/life-gate.ir/public_html/includes/class_core.php(4633) : eval()'d code on line 627
Fatal error: Out of memory (allocated 2097152) (tried to allocate 77824 bytes) in /home/lifegat/domains/life-gate.ir/public_html/includes/functions.php on line 2550
Warning: mysql_query() [function.mysql-query]: Unable to save result set in [path]/includes/class_core.php on line 417
Warning: Cannot modify header information - headers already sent by (output started at [path]/includes/class_core.php:5615) in [path]/includes/functions.php on line 4513
Database error
Fatal error: Out of memory (allocated 786432) (tried to allocate 311296 bytes) in /home/lifegat/domains/life-gate.ir/public_html/includes/init.php on line 552
Fatal error: Out of memory (allocated 3145728) (tried to allocate 19456 bytes) in /home/lifegat/domains/life-gate.ir/public_html/includes/functions.php on line 8989
Fatal error: Out of memory (allocated 262144) (tried to allocate 311296 bytes) in /home/lifegat/domains/life-gate.ir/public_html/forum.php on line 475
Warning: mysql_query() [function.mysql-query]: Unable to save result set in [path]/includes/class_core.php on line 417
Warning: Cannot modify header information - headers already sent by (output started at [path]/includes/class_core.php:5615) in [path]/includes/functions.php on line 4513
Fatal error: Out of memory means that the script has used up the memory reserved for it. This usually happens when you are working with big data, such as images.
One way to save memory is to avoid copying big values. Plain assignment of an array or string copies the value; the & operator instead makes a variable point at the same data. (Note that since PHP 5, assigning an object copies only a handle; a real object copy requires clone.) Example:
$object = new BigObject();
$copy = clone $object;    // clone creates a real copy, so more memory is required
$pointer = &$object;      // & makes $pointer an alias of $object, no copy is made
Because both variables refer to the same object, if you change one, the other changes as well.
$object = new BigObject();
$pointer = &$object;
$object->x = 12345;
echo $object->x;
echo $pointer->x; // will have the same output as $object->x
References are also often used for function parameters, like this:
$object = new BigObject();
x( $object );
function x( &$object ) {
    // do stuff with $object; changes are visible to the caller
}
The Warning: Cannot modify header information warning usually appears when you try to change the header data after output has already been sent. You probably have a header() call after you have echoed something, or there is whitespace before the PHP open tag <?php.
Finally, the Warning: mysql_query() [function.mysql-query]: Unable to save result set error is usually a MySQL issue, but since you are out of memory, fix the other errors first.
Increase memory_limit in php.ini or slim down your code.
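For example, in php.ini (128M is only an illustrative value; pick a limit that fits your server):

memory_limit = 128M

If you cannot edit php.ini, calling ini_set('memory_limit', '128M') at the top of the entry script may work, depending on how the host is configured.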

Allowed memory size of 33554432 bytes exhausted (tried to allocate 11264 bytes)

I'm trying to upload a file using a simple form. The size of the file is 1.37 MB.
After the page reloads, I get this error:
Allowed memory size of 33554432 bytes exhausted (tried to allocate 11264 bytes)
This is from phpinfo():
Directive Local Value Master Value
memory_limit 32M 32M
The maximum upload file size, which I set some time ago, is 25 MB.
This is my php.ini:
[PHP]
; Maximum allowed size for uploaded files.
upload_max_filesize = 25M
; Maximum size of POST data that PHP will accept.
post_max_size = 25M
;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
; Maximum execution time of each script, in seconds
max_execution_time = 600
; Maximum amount of time each script may spend parsing request data
max_input_time = 600
; Maximum amount of memory a script may consume
memory_limit = 32M
Try using another script to do the file resize; if that doesn't help, raise the memory limit; and if you can't do that, upload smaller images. There is no other option.
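Why a 1.37 MB upload can exhaust a 32 MB limit: if the script decodes the image (e.g. to resize it), memory is needed for the uncompressed pixels, not for the compressed file. A rough, illustrative calculation (the 4000x3000 resolution is a made-up example, not taken from the question):

limit_bytes = 33554432
print(limit_bytes // (1024 * 1024))  # 32 -> the error matches memory_limit = 32M exactly
width, height, bytes_per_pixel = 4000, 3000, 4
print(width * height * bytes_per_pixel / (1024 * 1024))  # ~45.8 MB of raw pixel data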