I'm having an issue with UploadHandler.php: it does not resize images larger than about 1.1 MB. No errors come up, and everything else is working fine.
Server memory_limit = 120M
post_max_size = 40M
upload_max_filesize = 40M
I also tried: 'convert_params' => '-limit memory 120MiB -limit map 120MiB',
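For context, a minimal sketch of the resize-related UploadHandler options (the max_width/max_height values are illustrative placeholders, not my real config; 'convert_params' is the line I quoted above):

$options = array(
    'convert_params' => '-limit memory 120MiB -limit map 120MiB',
    'image_versions' => array(
        // The empty key resizes the uploaded original in place
        '' => array(
            'auto_orient' => true,
            'max_width'  => 1920, // illustrative limit
            'max_height' => 1200, // illustrative limit
        ),
    ),
);
$upload_handler = new UploadHandler($options);

One thing I wonder about: resize memory scales with pixel dimensions rather than file size, so a 1.1 MB JPEG can decode to well over 100 MB of raw pixel data, which could quietly blow through a 120M memory_limit.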
Thanks
I uploaded the same log 200 million times, and each log is 474 bytes. So I simply assumed the total disk usage would be about 100 GB (200,000,000 × 474 bytes), but the actual result is 60 GB (Solr UI → Cloud → Nodes → Disk Usage).
How is this possible?
Does Solr compress the data if all the logs are the same?
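To put numbers on the gap (assuming the UI reports decimal GB):
200,000,000 × 474 B = 94,800,000,000 B ≈ 94.8 GB, so the observed 60 GB is roughly 35% smaller than the raw payload.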
I collect statistics, and all the information I need is in the <head> (script tag) of the site. The pages have a massive <body> (about 5-10 kB per page), so can I avoid downloading/parsing it to reduce server load?
I would be glad if you could recommend alternative optimizations to reduce the server load.
settings.py:
CONCURRENT_REQUESTS = 32
DOWNLOAD_DELAY = 0.33
Current speed is about 180 pages per minute (sometimes 200).
Scrapy operates on the entire response body only; this behaviour is coded into the Scrapy core.
CONCURRENCY_REQUEST = 32
Scrapy doesn't have a CONCURRENCY_REQUEST setting. Did you mean CONCURRENT_REQUESTS?
DOWNLOAD_DELAY = 0.33, current speed 180 per minute (sometimes 200)
If you didn't set RANDOMIZE_DOWNLOAD_DELAY to False (the default value is True), the download delay will be a random number between 0.5× and 1.5× of the DOWNLOAD_DELAY setting.
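For reference, a minimal settings.py sketch that makes the delay deterministic (same values as in your question):

# settings.py
CONCURRENT_REQUESTS = 32           # total concurrent requests handled by the downloader
DOWNLOAD_DELAY = 0.33              # base delay (seconds) between requests
RANDOMIZE_DOWNLOAD_DELAY = False   # otherwise the real delay is 0.5x-1.5x of DOWNLOAD_DELAY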
I am trying to upload large files (less than 5 GB, hence normal upload rather than multipart) using the Java SDK. Smaller files upload in no time, but files above 1 MB don't. My code gets stuck on the line where the actual upload happens. I tried the transfer manager (TransferManager.upload) function; when I check the number of bytes transferred, it keeps transferring more than 1 MB and keeps running until I force-stop my Java application. What could be the reason, and where am I going wrong? The same code works for smaller files; the issue is only with larger ones.
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;
import com.amazonaws.services.s3.model.PutObjectRequest;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.Upload;

DefaultAWSCredentialsProviderChain credentialProviderChain = new DefaultAWSCredentialsProviderChain();
TransferManager tx = new TransferManager(credentialProviderChain.getCredentials());
Upload myUpload = tx.upload(S3bucket, fileKey, file);
// Poll and print progress until the transfer reports completion
while (!myUpload.isDone()) {
    System.out.println("Transfer: " + myUpload.getDescription());
    System.out.println(" - State: " + myUpload.getState());
    System.out.println(" - Progress: " + myUpload.getProgress().getBytesTransferred());
}
s3Client.putObject(new PutObjectRequest(S3bucket, fileKey, file));
I tried both the TransferManager upload and the putObject methods; same issue with both.
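For completeness, a blocking variant (AWS SDK for Java v1, same objects as above) that would rethrow the failure instead of spinning in a polling loop:

try {
    // Blocks until the transfer completes; rethrows the cause if it failed
    myUpload.waitForCompletion();
    System.out.println("Upload finished with state: " + myUpload.getState());
} catch (com.amazonaws.AmazonClientException e) {
    System.err.println("Upload failed: " + e.getMessage());
} catch (InterruptedException e) {
    Thread.currentThread().interrupt(); // restore interrupt status
}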
TIA.
I created a long form with multiple fields that post the typed data to a MariaDB database.
The column types are all set to TEXT, as the fields will contain long text.
max_allowed_packet = 1073741824 (1 GB)
net_buffer_length = 1048576 (1 MB)
PHP post_max_size = 500M
PHP memory_limit = 550M
All this seems sufficient for posting very long text to the database.
But apparently not: it gives me a 500 error when I exceed 99,100 characters, and works fine when I keep the posted text under that amount.
What am I doing wrong?
The error log shows: Code:500 Message: POST /login/resident/updateresident/7 HTTP/1.1
Thank you in advance!
Thanks for the reply!
Actually, none of that worked. I went through all the PHP and Apache conf files that control the upload and post max sizes.
A brilliant guy sent me this link:
https://dev.mysql.com/doc/relnotes/mysql/5.6/en/news-5-6-20.html
It was a MySQL 5.6 bug!
I just updated to 5.7 and poof! It worked!
God! I lost 3 days of my life because of this bug!
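For anyone who hits this later and can't upgrade right away: as I read the linked release notes, 5.6.20 started limiting redo log writes for large BLOB/TEXT values to 10% of the redo log size, so raising the redo log size should also work around it. A sketch (the 512M value is just an example; the change requires a MySQL restart):

SHOW VARIABLES LIKE 'innodb_log_file%';  -- check the current redo log size and file count

# my.cnf
[mysqld]
innodb_log_file_size = 512M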
redux-persist has been working perfectly for me with smaller state trees, but when trying to use it on bigger ones I'm running into these errors when relaunching the app:
redux-persist/getStoredState: Error restoring data for key: pos
Error: Couldn't read row 0, col 0 from CursorWindow. Make sure the Cursor is initialized correctly before accessing data from it.
I've tried things like this in MainApplication.java's onCreate method:
long size = 50L * 1024L * 1024L; // 50 MB
com.facebook.react.modules.storage.ReactDatabaseSupplier.getInstance(getApplicationContext()).setMaximumSize(size);
But it doesn't seem to work.
Thanks in advance
Looks like you have hit the maximum size limit for storing data in Android AsyncStorage, which is currently 6 MB by default.
I can see that you increased the AsyncStorage size to 50 MB; however, it is still possible that your data is larger than 50 MB. You can use the following plugin, which uses the file system instead of AsyncStorage to persist your Redux state:
https://www.npmjs.com/package/redux-persist-filesystem-storage
This way you wouldn't have to worry about the AsyncStorage size limit.
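For example, a minimal wiring sketch with the redux-persist v5-style API (rootReducer stands in for your own reducer; if you're on v4 the setup differs, and the plugin relies on react-native-fs being installed):

import { createStore } from 'redux';
import { persistStore, persistReducer } from 'redux-persist';
import FilesystemStorage from 'redux-persist-filesystem-storage';

// Persist the tree to files instead of AsyncStorage's SQLite-backed store
const persistConfig = { key: 'root', storage: FilesystemStorage };
const persistedReducer = persistReducer(persistConfig, rootReducer);

export const store = createStore(persistedReducer);
export const persistor = persistStore(store);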