How to configure server to allow large file downloads? - apache

I can download 1+ GB files from my server with no problem. Today I uploaded a 2.3 GB file and discovered that I can't download it: the download gets interrupted after 1 GB, and it happens for every user. I tried Firefox 15, the latest Chrome, and IE9.
Server:
Apache 2.2.22
fcgi
eAccelerator
CPnginx
How can I resolve this?

I think I just faced the same problem, hopefully this might help others.
My server is set up to proxy requests to Apache through Nginx using the ngx_http_proxy_module. The proxy_max_temp_file_size directive of this module defaults to 1024m and is what was causing my problem.
Try adding the line proxy_max_temp_file_size 5120m; within an http, server or location block in your nginx.conf, then reload nginx's configuration with /etc/init.d/nginx reload.
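For reference, a minimal sketch of where the directive goes (the backend address and the 5120m value are illustrative; adjust to your setup):

```nginx
http {
    server {
        listen 80;

        location / {
            # Backend Apache; address is an assumption for this sketch.
            proxy_pass http://127.0.0.1:8080;

            # Default is 1024m: nginx stops buffering the proxied response
            # to disk after 1 GB, which can cut off large downloads.
            proxy_max_temp_file_size 5120m;
        }
    }
}
```

Setting the value to 0 disables temp-file buffering entirely, so nginx passes the response to the client synchronously instead.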

Looks like the default maximum download size Apache sets is around 2 GB, and you can override it by tweaking LimitRequestBody in your httpd.conf. Not sure why it would stop at 1 GB though; that makes me think the problem is something else. Hope this helps.
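One caveat on this answer: LimitRequestBody actually caps the size of request bodies clients send (uploads), not responses, and its default is 0 (unlimited). If you do need to raise an upload cap, a sketch for httpd.conf (the directory path is illustrative):

```apache
# LimitRequestBody limits request bodies (e.g. uploads), in bytes.
# 0 means unlimited; 5368709120 bytes = 5 GB.
<Directory "/var/www/html/uploads">
    LimitRequestBody 5368709120
</Directory>
```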

Related

Serving directory on samba share with Apache 2.4.41 truncates header

I have a strange problem when downloading files from an Apache 2.4.41 webserver serving files from a samba share on Ubuntu 20.04.
A python requests call fails with BadStatusLine('pache/2.4.41 (Ubuntu)\r\n') and Chrome sometimes shows partial HTTP headers at the beginning of the file.
Usually a response would start with "HTTP/1.1", while with this server it starts directly with "pache 2.4.41", which looks like a truncated "Apache 2.4.41".
After digging a lot I found one helpful post: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=900821
If I set "EnableSendfile On" in Apache for this directory, it now seems to work. But this is scary as hell. Will it always work? For file uploads too? Why is this still an issue in 2021 when the original error was reported in 2018?
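For what it's worth, the workaround can be scoped to just the affected mount rather than enabled globally (the path below is illustrative):

```apache
# Workaround for truncated/garbled responses when serving files from a
# CIFS/SMB mount: let the kernel's sendfile() stream the file instead of
# Apache's own read/write path (see the Debian bug linked above).
<Directory "/srv/samba-share">
    EnableSendfile On
</Directory>
```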

Apache2 will only serve small files

I just know this is one of those questions where the solution should have been obvious and I'm going to feel like an idiot but I'm stuck.
I've set up a basic Apache2 web server under openSUSE Leap 15.1 on my LAN, with a single virtual host (for the moment).
If I create a simple HTML file of 255 bytes or less, a browser on another workstation picks it up without problem. If the file is anything larger than 255 bytes, Apache doesn't serve it. The GET request shows up on the server, but nothing appears in Firefox; Konqueror at least gives me an "Object not found" error.
I should say that a browser running on the server itself shows all these files perfectly well, whether I use 127.0.0.1, localhost or the server name in the URL.
Is this an Apache2 directive I've missed or something else entirely?
After setting LogLevel to trace1 and then hunting around on the web, I came across what appears to be the solution. Set the Apache directive:
EnableSendfile Off
Apparently Apache is compiled with this set to On, and the kernel's sendfile doesn't do the job here. Not a detailed explanation, I know, but I haven't followed this all the way through; I just needed to get Apache working!
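Assuming a stock openSUSE layout, the directive can go in the global config or in the vhost that serves the files; a sketch (ServerName and DocumentRoot are illustrative):

```apache
<VirtualHost *:80>
    ServerName example.lan
    DocumentRoot /srv/www/htdocs

    # Disable the kernel sendfile() path; on some filesystems or network
    # drivers it silently truncates responses sent to remote clients.
    EnableSendfile Off
</VirtualHost>
```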

CKAN file upload 413 Request Entity Too Large error in Nginx

I need to be able to upload files around 500 MB in size to CKAN. I installed CKAN from the packager on Ubuntu 16.x. It works well: I can set up organizations and create new datasets. However, I am not able to upload files more than 100 MB in size. I get the error
413 Request Entity Too Large - nginx/1.4.6 (Ubuntu)
Based on various forums and suggestions, I changed client_max_body_size to 1g in /etc/nginx/nginx.conf. I have tried setting this parameter to 1000M, 1g and 1G, one at a time, and nothing seems to work. All my uploads beyond 100 MB keep failing.
I also learnt that changing ckan.max_resource_size in production.ini or development.ini would help, and I tried that too, but it doesn't work. Please suggest what could be done. Nginx is the proxy server and Apache is the web server that come with the default CKAN packager.
At the end of /etc/nginx/nginx.conf, you have this include directive:
include /etc/nginx/sites-enabled/*;
which will include /etc/nginx/sites-enabled/ckan. That file contains the directive:
client_max_body_size 100M;
Change it there, and don't forget to also change ckan.max_resource_size in /etc/ckan/default/production.ini; then restart nginx and Apache and it will work normally.
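Putting the two changes together, a sketch sized for ~500 MB uploads (values are examples; client_max_body_size must exceed the largest upload you expect):

```nginx
# /etc/nginx/sites-enabled/ckan
server {
    # ... existing CKAN proxy configuration ...
    client_max_body_size 1G;   # was 100M
}
```

```ini
# /etc/ckan/default/production.ini
# CKAN's own per-resource cap, in megabytes.
ckan.max_resource_size = 1000
```

After editing both files, restart nginx and Apache (e.g. service nginx restart; service apache2 restart) so both limits take effect.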

Gzip files are corrupted when downloaded from an Apache 2.4.28 server, fine via FTP/sz

I am not sure when this started happening, but I have noticed that when I download .gz (gzip) files through httpd (Apache 2.4.28) and try to open them on the client, they are corrupted.
If I download them via sz, ftp, or another method they open fine on the client. If I transfer them via scp to another server and download them they work fine.
At first I thought it could be mod_deflate compressing it more than it should and corrupting it, but I disabled mod_deflate and the behavior still occurs.
I then downloaded nginx-1.12.2.tar.gz from nginx.org using wget on the server. When I downloaded that through Apache, it opened fine on the client.
As another test, I created a .gz file on another server and transferred it over to the problematic one. I tried downloading that, and it was corrupted too.
So I'm not really sure what's going on here; I can't find a rhyme or reason to this bug.
Any thoughts?
I had the same problem with Apache 2.2 and 2.4, but removing
SetOutputFilter DEFLATE
from the vhost configuration file (and doing an Apache reload) fixed it for me on both machines.
Make sure to restart or at least reload apache after changing the vhosts config.
Turns out it was an incorrect MIME type in the httpd.conf file.
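One common form of this misconfiguration: if .gz is mapped to a content *encoding* rather than a content *type*, Apache sends Content-Encoding: gzip, the browser transparently decompresses the stream as it downloads, and the saved file is no longer a valid gzip archive. A hedged httpd.conf sketch of that fix (your actual bad MIME mapping may differ):

```apache
# Wrong: the browser sees Content-Encoding: gzip and decompresses
# on the fly, so the saved .gz file appears "corrupted".
# AddEncoding x-gzip .gz

# Right: serve .gz files as an opaque binary type, no content encoding.
AddType application/gzip .gz
```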

How do I get Apache mod_cache to cache?

I've gotten Apache mod_cache to work as intended on a Windows server running Apache 2.2, but I'm having trouble getting it running on a Linux cpanel server to which I have root access.
Here's what I know:
1) mod_cache and mod_disk_cache are both compiled into Apache (confirmed with "httpd -l")
2) My httpd.conf is configured like this
CacheRoot /home/accountname/apache-cacheroot
CacheEnable disk /
3) I've restarted Apache after all configuration changes
4) I know that section of the httpd.conf is being processed (I put some unrelated directives in there to debug.)
5) The file I request displays the current time via php, and it does not change on subsequent requests.
...it does not change on subsequent requests
It sounds like your caching is working. If the time did change on every request, the page would be getting served by PHP instead of from Apache's cache.
Did you try enabling the modules with a2enmod cache and a2enmod disk_cache?
Do not forget to restart the server after doing this.
Depending on what you have your CacheRoot set to, you may need to change its permissions so the Apache user can write to it (e.g. chmod 777 as a quick test).
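Pulling the pieces above together, a minimal Apache 2.2 sketch (paths and ownership are illustrative; skip the LoadModule lines if "httpd -l" shows the modules are compiled in):

```apache
# Only needed when the modules are built as shared objects.
LoadModule cache_module modules/mod_cache.so
LoadModule disk_cache_module modules/mod_disk_cache.so   # Apache 2.2 name

# Directory must exist and be writable by the Apache user.
CacheRoot /home/accountname/apache-cacheroot
CacheEnable disk /

# How the on-disk cache tree is laid out (subdirectory depth and
# name length); these are the usual starting values.
CacheDirLevels 2
CacheDirLength 1
```

Note also that in Apache 2.2, mod_cache will normally not store a response that has no Expires or Last-Modified header, which is typical of PHP output; you may need the page to emit cache headers (or set CacheIgnoreNoLastMod On) before it is cached at all.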