CKAN file upload 413 Request Entity Too Large error in Nginx - file-upload

I need to be able to upload files of around 500 MB in CKAN. I installed CKAN from the package installer on Ubuntu 16.04. It works well: I can set up organizations and create new datasets. However, I am not able to upload files larger than 100 MB. I get the error:
413 Request Entity Too Large - nginx/1.4.6 (Ubuntu)
Based on various forums and suggestions, I changed
client_max_body_size to 1g in /etc/nginx/nginx.conf. I have tried setting this parameter to 1000M, 1g, and 1G, one at a time, and nothing seems to work. All my uploads beyond 100 MB keep failing.
I also learned that changing ckan.max_resource_size in production.ini or development.ini would help, and I tried that too, but it doesn't work. Please suggest what could be done. Nginx is the proxy server and Apache is the web server that come with the default CKAN package.

At the end of /etc/nginx/nginx.conf there is this include directive:
include /etc/nginx/sites-enabled/*;
which pulls in /etc/nginx/sites-enabled/ckan. That file contains the directive:
client_max_body_size 100M;
Change it there, don't forget to also change ckan.max_resource_size in /etc/ckan/default/production.ini, then restart Nginx and Apache and uploads will work normally.
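For example, to allow uploads up to 1 GB, the relevant lines would end up looking something like this (the values are just an illustration; if I remember correctly, ckan.max_resource_size is given in megabytes):

# /etc/nginx/sites-enabled/ckan
client_max_body_size 1g;

# /etc/ckan/default/production.ini
ckan.max_resource_size = 1000

Then restart both services, e.g. sudo service nginx restart and sudo service apache2 restart.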

Related

Serving directory on samba share with Apache 2.4.41 truncates header

I have a strange problem when downloading files from an Apache 2.4.41 web server serving files from a Samba share on Ubuntu 20.04.
A Python requests call ends in BadStatusLine('pache/2.4.41 (Ubuntu)\r\n')) and Chrome sometimes shows partial HTTP headers at the beginning of the file.
Usually the response would start with "HTTP/1.1", while with this server it starts directly with "pache 2.4.41", which looks like a truncated "Apache 2.4.41".
After digging a lot I found one helpful post: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=900821
If I set "EnableSendfile On" in Apache for this directory, it now seems to work. But this is scary as hell. Will it always work? Also for file uploads, etc.? Why is this still an issue in 2021 when the original error was reported in 2018?
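For reference, the per-directory setting looks roughly like this (the mount point path is only an illustration, not the actual one):

<Directory "/mnt/smbshare">
    # Work around truncated responses when serving from a CIFS mount
    EnableSendfile On
</Directory>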

Apache2 will only serve small files

I just know this is one of those questions where the solution should have been obvious and I'm going to feel like an idiot, but I'm stuck.
I've set up a basic Apache2 web server under openSUSE Leap 15.1 on my LAN with a single virtual host (for the moment).
If I create a simple HTML file of 255 bytes or less, a browser on another workstation picks it up without a problem. If the file gets to anything larger than 255 bytes, Apache doesn't serve it. The GET request shows up on the server, but nothing shows in Firefox; Konqueror at least gives me a 1b Object not found error.
I should say that running a browser on the server itself shows all these files perfectly well, whether I use 127.0.0.1, localhost, or the server name in the URL.
Is this an Apache2 directive I've missed or something else entirely?
After setting LogLevel to trace1 and then hunting around on the web, I came across what appears to be the solution: set the Apache directive
EnableSendfile Off
Apparently Apache is compiled with this set to On, and the kernel sendfile doesn't do the job. Not a detailed explanation, I know, but I haven't followed this all the way through; I just needed to get Apache working!
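In case it helps anyone else, the directive can go in the global config or inside the vhost; a minimal sketch with illustrative names and paths:

<VirtualHost *:80>
    ServerName www.example.lan
    DocumentRoot /srv/www/htdocs
    # Disable kernel sendfile, which misbehaves on this setup
    EnableSendfile Off
</VirtualHost>

Remember to reload Apache afterwards (e.g. systemctl reload apache2).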

Gzip files are corrupted when downloaded off an Apache 2.4.28 server. Fine via FTP/SZ

I am not sure when this started happening, but I have noticed that when I download .gz (gzip) files through httpd (Apache 2.4.28), they are corrupted when I try to open them on the client.
If I download them via sz, ftp, or another method they open fine on the client. If I transfer them via scp to another server and download them they work fine.
At first I thought it could be mod_deflate compressing it more than it should and corrupting it, but I disabled mod_deflate and the behavior still occurs.
I then downloaded nginx-1.12.2.tar.gz from nginx.org using wget on the server. When I downloaded that through Apache, it opened fine on the client.
As another test, I created a .gz file on another server and transferred it over to the problematic one. Downloading that one, it was corrupted too.
So I'm not really sure what's going on here; I can't find any rhyme or reason to this bug.
Any thoughts?
Same problem here with Apache 2.2 and 2.4, but removing
SetOutputFilter DEFLATE
from the vhost configuration file (and doing an Apache reload) fixed it for me on both machines.
Make sure to restart, or at least reload, Apache after changing the vhost config.
Turns out it was an incorrect MIME type in the httpd.conf file.
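If removing the filter outright feels too drastic, one common alternative (a sketch, not something from the original setup) is to keep DEFLATE but exclude files that are already compressed, and to make sure .gz has a sensible type mapping:

<IfModule mod_deflate.c>
    SetOutputFilter DEFLATE
    # Don't recompress files that are already compressed
    SetEnvIfNoCase Request_URI \.(?:gz|tgz|zip|bz2)$ no-gzip
</IfModule>
AddType application/gzip .gz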

How to configure server to allow large file downloads?

I can download 1+ GB files with no problem from my server. Today I uploaded a 2.3 GB file and discovered that I can't download it: downloading gets interrupted after 1 GB, and it happens for every user. I tried Firefox 15, the latest Chrome, and IE9.
Server:
Apache 2.2.22
fcgi
eAccelerator
CPnginx
How can I resolve this?
I think I just faced the same problem, hopefully this might help others.
My server is set up to proxy requests to Apache through Nginx using the ngx_http_proxy_module. The proxy_max_temp_file_size directive of this module defaults to 1024m and is what was causing my problem.
Try adding the line proxy_max_temp_file_size 5120m; within an http, server, or location block in your nginx.conf, then reload Nginx's config with /etc/init.d/nginx reload.
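For context, an nginx.conf fragment along these lines (the backend address is just a placeholder):

http {
    server {
        location / {
            proxy_pass http://127.0.0.1:8080;
            # Allow up to 5 GB of a proxied response to be buffered to a temp file
            proxy_max_temp_file_size 5120m;
        }
    }
}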
Looks like the default max download size Apache sets is around 2 GB, and you can override it by tweaking LimitRequestBody in your httpd.conf. Not sure why it would stop at 1 GB though; that makes me think the problem is something else. Hope this helps.

Apache proxy server file upload limit is 128k?

I am running an Apache 2.2.3 proxy server to hide my backend machines from users. I added a file upload service to my web services; however, files larger than 128 KB return an HTTP status code of 413. I know this means Request Entity Too Large, and I have scoured the internet looking for a solution.
I have changed my php.ini file to have max_execution_time = 3000, max_input_time = 6000, memory_limit = 128M, post_max_size = 20M, upload_max_filesize = 20M, default_socket_timeout = 6000. This didn't help, as I suspected it wouldn't: the web service is called via REST from Java, it is not PHP.
I have changed the maxHttpHeaderSize in server.xml to 20000000 on the proxy connector to try to allow more information to flow through. Again, this did nothing and my limit is still 128 KB.
I have also added the LimitRequestBody 20000000 directive to the Location block for the web service that files are uploaded through. This again didn't work.
Currently all three are in place without any improvement. I am still only able to send files of at most 128 KB through the proxy.
When I send a file directly to the backend machine without going through the proxy, it works perfectly fine regardless of size.
Any suggestions on how to fix this will be very much appreciated.
Thank you.
I have figured out what the problem was, and where the 128k limit occurs.
mod_ssl uses a default SSL renegotiation buffer size of 128 KB, and when doing an upload it automatically renegotiates for security purposes.
I had to add the SSLRenegBufferSize directive to the Location and Directory blocks that needed a larger-than-128 KB buffer on renegotiation. This has worked like a charm for me.
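For anyone wanting a concrete starting point, the directive goes inside the relevant block; the path and size here are only examples:

<Location /upload-service>
    # Buffer up to 20 MB (in bytes) of request body during SSL renegotiation
    SSLRenegBufferSize 20971520
</Location>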
Hope it helps anyone else that experiences this limit, or had this question.