I'm hosting an HLS stream with XAMPP / Apache, which basically means I have a folder in my document root that contains a couple of incrementally numbered 10-second video files.
Every 10 seconds, a new video file is saved into the folder and the oldest video file in the folder is deleted.
Apart from these video files, the document root also contains some other files, such as PHP scripts and playlist files.
My server has plenty of RAM and a pretty fast CPU, but is using a comparatively slow hard disk.
Since the constant downloading of these video files is likely what will make or break server performance, it seems like a good idea to cache them in memory.
If Apache were to keep all video files (with a .ts extension) that are downloaded by a user's video player in its memory for about 60 seconds, the next user would be able to download the file much faster. Apache could rely on the files not changing after they are first opened, and on the fact that they won't be requested anymore after those 60 seconds.
All other files do not (necessarily) have to be cached, since they're rather small and are regularly modified.
Is anyone able to give me directions on how to get started?
Modern operating systems already cache recently accessed files in memory (the page cache); the kernel manages this automatically.
Apache's built-in in-memory file caching won't help you here, since it needs the full list of files at start-up and assumes they don't change while the server is running.
If you want some level of control over the caching, you could use vmtouch; check its manual.
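If it helps, here is a rough sketch of that approach in Python; the segment directory, the number of segments to keep warm and the use of vmtouch's -t option are all assumptions on my part, not something from your setup:

    #!/usr/bin/env python3
    """Sketch: keep the newest HLS segments warm in the OS page cache."""
    import glob
    import os
    import subprocess
    import time

    SEGMENT_DIR = "/opt/lampp/htdocs/stream"   # hypothetical document-root subfolder
    KEEP_WARM = 6                              # roughly 60 s worth of 10-second segments

    while True:
        segments = sorted(glob.glob(os.path.join(SEGMENT_DIR, "*.ts")),
                          key=os.path.getmtime, reverse=True)[:KEEP_WARM]
        if segments:
            # "vmtouch -t" reads the files so their pages end up in the page cache;
            # a plain open()/read() loop would have the same effect.
            subprocess.run(["vmtouch", "-t", *segments], check=False)
        time.sleep(10)

Note that simply reading the files already pulls them into the page cache; vmtouch mainly adds reporting (-v) and locking (-l) on top of that.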
When we browse to a website that we have visited before, the browser loads it immediately from its cache. Similarly, when we access files from our local system's drive, the system shows the recent files (see the attached image), so I assume these files are served from the system cache. Please clarify: are the browser cache and the system cache actually different caches, or does only one cache exist in the whole system?
The browser has its own logical "cache": every cached file is stored on disk and read from there.
The operating system handles all "file caches", so some of that data may (still) be in memory when the browser tries to access the file.
So these are two different types of "caching" that you're trying to understand.
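A quick way to see the second layer on its own is to time two consecutive reads of the same file; this is only an illustrative sketch, and the file path is a placeholder:

    #!/usr/bin/env python3
    """Sketch: observe the OS page cache, independent of any browser cache."""
    import time

    SOME_FILE = "/var/log/syslog"  # placeholder: any large-ish file works

    def timed_read(path):
        start = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(1 << 20):   # read in 1 MiB chunks
                pass
        return time.perf_counter() - start

    # The first read may hit the disk; the second usually comes from the
    # kernel's page cache and is noticeably faster.
    print("first read :", timed_read(SOME_FILE), "s")
    print("second read:", timed_read(SOME_FILE), "s")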
I have configured my Drupal site so that all images/files/media etc. are handled by S3, using the S3 File System module.
Everything works fine and the image/file field uploader works, but there is a huge performance issue when using the IMCE file browser from the WYSIWYG editor. It takes at least a minute for the browser to display its contents, even though there are only 290 images (78 MB) in that initial folder, which should not cause such delays. This has a huge impact on our editors, who lose several minutes just to upload a couple of images.
I tried various pagination patches, and they made no difference at all to the performance.
What are my options now?
After digging through many forums and discussions, it turns out that IMCE was not meant for the S3 file system, and I found this patch in PDF form (warning: it downloads rather than opens).
I followed the steps in that patch, which significantly improved performance.
I have a log server where users upload archives and view their contents online when needed. Currently the server unzips files right after receiving them. Unfortunately, my peers have consumed all the drive space I had. I could free up a lot of space if there were a way to store the ZIP archives but serve their contents to users as an HTML page (like Apache's default file index).
I know there are solutions relying on JS, like:
http://gildas-lormeau.github.io/zip.js/demos/demo2.html
https://stuk.github.io/jszip/
or I can unzip them on demand on the server side and provide a link to a temporary folder. However, some time ago I heard that a browser can view an archive's contents if the proper headers are sent from Apache/nginx. Apache's mod_deflate doesn't help much here, and I can't find other documentation; perhaps it's not possible after all?
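For reference, the on-demand variant I have in mind looks roughly like this; it is only a sketch (the archive path is passed on the command line, and serving the output over HTTP via CGI/WSGI is left out), and it lists the archive directly instead of extracting to a temporary folder:

    #!/usr/bin/env python3
    """Sketch: render a ZIP archive's file listing as a simple HTML index."""
    import html
    import sys
    import zipfile

    def zip_index_html(archive_path):
        rows = []
        with zipfile.ZipFile(archive_path) as zf:
            for info in zf.infolist():
                rows.append("<tr><td>{}</td><td>{}</td></tr>".format(
                    html.escape(info.filename), info.file_size))
        return ("<html><body><table>\n"
                "<tr><th>Name</th><th>Size (bytes)</th></tr>\n"
                + "\n".join(rows)
                + "\n</table></body></html>")

    if __name__ == "__main__":
        print(zip_index_html(sys.argv[1]))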
Cheers.
We have upgraded to ColdFusion 10 and I am testing large upload capability.
Using both an HTML form and the Flash multi-file upload (CFFILEUPLOAD) I can upload files of up to 2 GB.
With files over 2 GB the upload does not even start: 0%, both in the Flash upload and in what the Chrome browser reports for the HTML form.
Technical services suggest the request does not even get as far as Apache, so Apache is not restricting the upload. ColdFusion is also set up to allow 4000 MB of POST data, even with the throttle.
The upload happens across the network, so even a 1.7 GB test file doesn't take long, but a 2.5 GB file does not even begin.
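For what it's worth, the cut-off sitting right around 2 GB makes me suspect a signed 32-bit size limit somewhere in the chain; that is only a guess, but here is a rough sketch for generating test files just under and just over that boundary to pin it down:

    #!/usr/bin/env python3
    """Sketch: create sparse test files around the signed 32-bit boundary."""
    import os

    BOUNDARY = 2**31  # 2 GiB; the largest signed 32-bit value is BOUNDARY - 1

    for name, size in [("just_under.bin", BOUNDARY - 1024),
                       ("just_over.bin", BOUNDARY + 1024)]:
        with open(name, "wb") as f:
            f.seek(size - 1)   # create a sparse file of the requested size
            f.write(b"\0")
        print(name, os.path.getsize(name), "bytes")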
Any suggestions to help diagnose the cause?
Thanks
Someone told me that some servers are configured in a way that allows streaming of a file.
OR
Is it a file-encoding problem rather than a server-configuration problem?
Given a link to a video file, how do I check whether that server allows streaming (or only playback once the file is downloaded)? By looking at the headers?
In order for a file to be streamable, all of the information necessary to initialize the decode and playback engines must be at the beginning of the file. Not all file formats are designed that way (for instance, with AVI files the index is usually at the end).
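As a concrete illustration (assuming an MP4/QuickTime-style container, which is just one common case and my own choice of example), you can check whether the 'moov' box that holds the playback metadata comes before the 'mdat' box that holds the media data. A rough sketch:

    #!/usr/bin/env python3
    """Sketch: check whether an MP4's 'moov' box precedes its 'mdat' box."""
    import struct
    import sys

    def top_level_boxes(path):
        boxes = []
        with open(path, "rb") as f:
            while True:
                header = f.read(8)
                if len(header) < 8:
                    break
                size, box_type = struct.unpack(">I4s", header)
                boxes.append(box_type.decode("latin-1"))
                if size == 1:                       # 64-bit "largesize" follows
                    size = struct.unpack(">Q", f.read(8))[0]
                    f.seek(size - 16, 1)
                elif size == 0:                     # box runs to end of file
                    break
                else:
                    f.seek(size - 8, 1)
        return boxes

    if __name__ == "__main__":
        boxes = top_level_boxes(sys.argv[1])
        print("top-level boxes:", boxes)
        if "moov" in boxes and "mdat" in boxes:
            print("moov before mdat:", boxes.index("moov") < boxes.index("mdat"))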
But the server must also be configured to stream: transferring a file over HTTP or FTP is a different protocol from streaming the file.
So it's both: for streaming to work, everything has to be set up correctly, meaning the server and the file must both support it. If either one is not set up correctly, then plain file transfer usually still works; transferring the file is the conservative, fallback solution.
As long as the encoding format is such that the information in the file is chronological with respect to the video frames, there is no theoretically possible way for a server to allow downloading but not playing. Think about it: if you have the data and it's playable after downloading, then any portion you already have is playable before the download is complete.
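As for checking a given link, one rough first pass is to look at the response headers; this only tests whether the server honours byte-range requests (which is what players use to seek while a file is still downloading), and the URL below is just a placeholder:

    #!/usr/bin/env python3
    """Sketch: inspect response headers for a video URL."""
    import urllib.request

    URL = "http://example.com/video.mp4"  # placeholder URL

    req = urllib.request.Request(URL, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        print("Content-Type  :", resp.headers.get("Content-Type"))
        print("Content-Length:", resp.headers.get("Content-Length"))
        # "Accept-Ranges: bytes" means the server accepts byte-range requests.
        print("Accept-Ranges :", resp.headers.get("Accept-Ranges"))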