In my cPanel there is a 30GB project file that I am trying to compress and download, but unfortunately the compression is taking too much time.
Can you suggest a more efficient way to compress it so that it can be downloaded easily?
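If you have SSH access, running tar yourself is usually much quicker than cPanel's built-in compressor, because you can pick a fast compression level or skip compression entirely. A rough sketch (the paths are placeholders for your own account layout):

    # Fast: bundle the files without compressing them at all
    tar -cf project.tar public_html/project

    # Or use gzip's fastest level for a smaller, still quick archive
    tar -c public_html/project | gzip -1 > project.tar.gz

Either archive can then be downloaded in one piece through the cPanel File Manager or over SFTP.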
I'm using the latest version of rclone to copy lots of files to Backblaze B2 storage. What I've noticed recently is that a lot more download bandwidth is being used than upload bandwidth. For example, running iftop on the network interface being used, these are typical figures:
Upload 77.6KB/s
Download 1.32MB/s
Why is so much being downloaded when all I am doing is backing up files to the server?
What can I do to lessen the download bandwidth?
Many thanks
Have you tried the --bwlimit option to limit the bandwidth rclone uses?
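A minimal example, assuming a B2 remote named b2 and a local folder /data (both placeholders):

    # Cap the total bandwidth rclone may use
    rclone copy /data b2:my-bucket --bwlimit 1M

    # Newer rclone releases also accept separate upload:download limits
    rclone copy /data b2:my-bucket --bwlimit 10M:512k

Note that much of the download traffic is usually rclone listing and checking existing objects, so flags such as --fast-list may also reduce it.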
I'm hosting an HLS stream with XAMPP / Apache, which basically means I have a folder in my document root that contains a couple of incrementally numbered 10-second video files.
Every 10 seconds, a new video file is saved into the folder and the oldest video file in the folder is deleted.
Apart from these video files, the document root also contains some other files, such as PHP scripts and playlist files.
My server has plenty of RAM and a pretty fast CPU, but is using a comparatively slow hard disk.
Given that the constant downloading of these video files is likely what will make or break the server's performance, it seems like a good idea to cache these files in memory.
If Apache were to keep all video files (with a .ts extension) that are downloaded by a user's video player in its memory for about 60 seconds, the next user would be able to download the file much faster. Apache could rely on the files not changing after the first open and on the fact that the files won't be requested anymore after those 60 seconds.
All other files do not (necessarily) have to be cached, since they're rather small and are regularly modified.
Is anyone able to give me directions on how to get started?
Modern operating systems already cache accessed files in memory; this page cache is managed by the kernel automatically.
Apache's in-memory caching won't help you here, since it needs the full list of files at start-up.
If you want some level of control over the caching you could use vmtouch. Check the manual.
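As a concrete sketch, a small loop can pre-warm each new segment into the page cache as it appears (/srv/hls is a placeholder for the folder holding your .ts files):

    # Pull the current HLS segments into memory every few seconds
    while true; do
        vmtouch -t /srv/hls/*.ts    # -t touches the pages, loading them into RAM
        sleep 5
    done

Since the segments are only ten seconds long, re-touching them every few seconds keeps the live window cached without locking anything in memory permanently.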
We have upgraded to ColdFusion 10 and I am testing large upload capability.
Using both an HTML form and the Flash multi-file upload (cffileupload), I can upload files of up to 2GB.
With files over 2GB the upload does not even start: it stays at 0%, both with the Flash upload and in what the Chrome browser reports with the HTML form.
Technical services suggest it does not even get as far as Apache, so Apache is not restricting the upload. ColdFusion is also set up to allow 4000MB of post data, even with the throttle.
The upload happens across the network, so even testing a 1.7GB file doesn't take long, but a 2.5GB file does not even begin.
Any suggestions to help diagnose the cause?
Thanks
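One way to narrow down where the request is being dropped (purely a diagnostic sketch; the URL and file names are placeholders) is to bypass the browser and POST test files just under and just over the 2GB mark with curl, then compare the verbose output:

    # Create test files on either side of the 2GB boundary
    dd if=/dev/zero of=under2gb.bin bs=1M count=2000
    dd if=/dev/zero of=over2gb.bin  bs=1M count=2600

    # POST them directly to the upload handler and watch where the larger one stalls
    curl -v -F "file=@under2gb.bin" http://yourserver/upload.cfm
    curl -v -F "file=@over2gb.bin"  http://yourserver/upload.cfm

If the larger request never shows up in the access logs, the limit sits in front of ColdFusion; if it arrives but is rejected, look at the connector and ColdFusion post-size settings, since 2GB is the classic 32-bit signed-integer boundary.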
I want to upload big files to a server. What is the best way to do this:
1) using a node.js library, such as formidable
2) using the nginx upload module
or maybe some other faster and better solution?
If you just want to upload big files nginx would be the better solution.
However, if you want to stream files (download as you upload), then node.js would be the right tool.
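For the nginx route, the main thing to get right is the request-body handling; a minimal sketch (paths, port and the 5g limit are placeholders to adjust):

    # nginx: accept large bodies and buffer them to disk before passing them on
    server {
        listen 80;

        location /upload {
            client_max_body_size 5g;            # default is only 1m
            client_body_temp_path /var/nginx/uploads;
            proxy_pass http://127.0.0.1:3000;   # your app receives the file afterwards
        }
    }

With node.js and formidable, the upload is streamed to disk as it arrives, which is what makes the process-while-uploading approach possible.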
When using CSS3 and custom fonts, the client needs to download .otf or .ttf files. These files can be >50K. Can these files be compressed? How? Assume an Apache web server.
I am looking for a compression technique or an Apache configuration. Any ideas will help, because downloads of 50K+ files should be avoided.
.ttf files compress quite well (analysis with a hex editor shows a lot of 0x00 bytes, and simply zipping them reduces their size to 50% or less of the original, at least for the fonts I checked), so I think any compression method would do the trick, and the same should apply to .otf files as well.
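For example, Apache's mod_deflate can gzip font files on the fly; a minimal .htaccess sketch (the MIME types are assumptions and may differ on your server):

    # Map the font extensions to explicit MIME types, then compress those types
    <IfModule mod_deflate.c>
        AddType application/x-font-ttf .ttf
        AddType font/opentype          .otf
        AddOutputFilterByType DEFLATE application/x-font-ttf font/opentype
    </IfModule>

Note that WOFF files are already compressed internally, so serving WOFF instead of raw TTF/OTF gives a similar saving without any server configuration.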