I am working on file upload and really wondering how chunked file upload actually works.
I understand that the client sends data to the server in small chunks instead of the complete file at once, but I have a few questions about this:
For the browser to divide the whole file into chunks and send them, will it read the complete file into memory? If yes, then there is again a risk of running out of memory and crashing the browser for big files (say > 10 GB).
How do cloud applications like Google Drive or Dropbox handle uploads of such big files?
If multiple files are selected for upload and all are larger than 5-10 GB, does the browser keep all the files in memory and then send them chunk by chunk?
Not sure if you're still looking for an answer. I've been in your position recently, and here's what I've come up with; hope it helps: Deal chunk uploaded files in php
During uploading, if you print out the request on the backend, you will see three parameters: _chunkNumber, _totalSize and _chunkSize. With these parameters it's easy to decide whether this chunk is the last piece; if it is, assembling all of the pieces into a whole shouldn't be hard.
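The linked answer is for PHP; purely as an illustration, here is a rough Node.js/Express sketch of the same idea. The route, the 'file' field name and the file paths are assumptions, and it assumes chunks arrive in order.

```javascript
const express = require('express');
const multer = require('multer');
const fs = require('fs');

const app = express();
const upload = multer({ dest: 'tmp/' }); // each incoming chunk lands in tmp/ first

app.post('/upload', upload.single('file'), (req, res) => {
  const chunkNumber = Number(req.body._chunkNumber); // 0-based chunk index
  const chunkSize = Number(req.body._chunkSize);
  const totalSize = Number(req.body._totalSize);
  const target = 'uploads/assembled.bin'; // assumed final path

  // Append this chunk to the target file, then drop the temporary chunk file.
  fs.appendFileSync(target, fs.readFileSync(req.file.path));
  fs.unlinkSync(req.file.path);

  // Is this the last piece? If so, the assembled file is complete.
  const isLast = (chunkNumber + 1) * chunkSize >= totalSize;
  res.json({ done: isLast });
});

app.listen(3000);
```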
As for the JavaScript side, ng-file-upload has a setting named "resumeChunkSize" that lets you enable chunked mode and set the chunk size.
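For example, something along these lines (a minimal sketch; the URL is a placeholder and the Upload service is assumed to be injected into your controller):

```javascript
// Chunked upload with ng-file-upload: setting resumeChunkSize makes the
// library slice the file and send it piece by piece, so the browser never
// has to hold the whole file in memory at once.
Upload.upload({
  url: '/upload',                  // placeholder endpoint
  data: { file: file },
  resumeChunkSize: 1024 * 1024     // 1 MB chunks
}).then(function (resp) {
  console.log('upload finished with status', resp.status);
});
```

Under the hood this relies on File.prototype.slice, which only creates a reference to a byte range; the bytes are read as each chunk is actually sent, so the whole file does not have to be loaded into memory.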
I'm trying to use GUN to create a file-sharing platform. I read the tutorial and API but I couldn't find a general way to upload/download a file.
I hear that there is a 5 MB localStorage limitation in GUN, so if I want to upload a large file, I have to slice it and then store it in GUN. But right now I can't find a way to store a file in GUN.
I read the question from Retric and I know how to store an image in GUN, but can I store other types of files such as .zip or .doc files? Is there a general API for file storage?
I wrote a quick little app in 35 lines of HTML that demonstrates file sharing for images, videos, sound, etc.
https://github.com/amark/gun/blob/master/examples/basic/upload.html
I've sent 20 MB files through it, though yeah, I'm sure there is a better way of splitting files up into 2 MB chunks - that is currently not automatic, you'd have to code it.
We'll have a feature in the future that will automatically split up video files. Do you want to help with this?
I think on the download side, all you have to do is make sure you have the whole file (stitch it back together if you do write a splitter), and add it to some <a href=""> target. Actually, I'm not sure exactly how, but I know browsers have supported the download attribute on links for a few years now, so you can create a download link even for an in-memory file... but you'll have to search online for how. Then please write a tutorial and share it with the community!!
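If it helps, here is a rough sketch of doing that 2 MB splitting by hand and stitching the file back together for download. It stores each chunk as a data-URL string in GUN; the key names ('files', 'chunks') and the chunk size are just illustrative choices, not an official API.

```javascript
const gun = Gun();
const CHUNK = 2 * 1024 * 1024; // 2 MB per piece

function uploadFile(file, id) {
  const total = Math.ceil(file.size / CHUNK);
  gun.get('files').get(id).put({ name: file.name, type: file.type, total: total });
  for (let i = 0; i < total; i++) {
    const slice = file.slice(i * CHUNK, (i + 1) * CHUNK); // only a reference to a byte range
    const reader = new FileReader();
    reader.onload = () =>
      gun.get('files').get(id).get('chunks').get(String(i)).put(reader.result);
    reader.readAsDataURL(slice); // reads just this 2 MB slice into memory
  }
}

function downloadFile(id) {
  gun.get('files').get(id).once(async meta => {
    const parts = [];
    for (let i = 0; i < meta.total; i++) {
      const dataUrl = await new Promise(resolve =>
        gun.get('files').get(id).get('chunks').get(String(i)).once(resolve));
      parts.push(await (await fetch(dataUrl)).blob()); // data URL back to binary
    }
    // Stitch the chunks together and hand them to a download link
    // (the "download" attribute mentioned above).
    const url = URL.createObjectURL(new Blob(parts, { type: meta.type }));
    const a = Object.assign(document.createElement('a'), { href: url, download: meta.name });
    a.click();
  });
}
```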
I would recommend using IPFS for file storage and GUN to store the links to those files. GUN isn't meant for file storage, I believe; it's primarily for user/graph data, hence the 5 MB limitation.
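A hedged sketch of that split, assuming ipfs-http-client talking to a local IPFS daemon (the key names are again just illustrative): pin the bytes in IPFS and keep only the content address (CID) in GUN.

```javascript
import { create } from 'ipfs-http-client';

const ipfs = create(); // defaults to a local daemon on port 5001
const gun = Gun();

async function shareFile(file) {
  const { cid } = await ipfs.add(file);  // the file bytes live in IPFS
  gun.get('files').get(file.name).put({  // GUN only stores the link
    cid: cid.toString()
  });
}
```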
I have a program that generates information from the contents of files; however, I believe it would be more efficient if I were able to do this as the files are being written, rather than having to read the contents back after some delay, since I could simply generate the data as the file is being written to disk.
What method(s) are available for an application to hook into the file-write process, i.e. to process the data stream as it's being written to disk? Also, which of these (if any) are allowed for App Store apps?
I've been considering using a Spotlight Importer; however, this still involves reading the contents of a file after they've been written, in which case I'm relying on the file still being in the RAM cache to reduce disk access.
I have a large video to upload, around 3-4 GB in size, and I need to use NSURLSessionUploadTask to upload it in the background. If I try uploading this single file, it uploads fine, but pause/resume does not work properly in this case: I pause somewhere and it resumes from somewhere else or even from the start.
So to achieve pause/resume, I moved to chunked uploading. Now I create about 3 chunks at the start, write their bytes to separate files and start uploading them. It works fine. The issue comes when the app goes to the background and the existing chunks have been uploaded completely, and I need to add new chunks for uploading.
There is enough time to write files for another 3 chunks and start them, but those chunks never continue uploading unless the user opens the app. Once the app comes to the foreground, those chunks start uploading, but the same thing repeats when the app goes to the background and I need to add more chunks.
So chunks added to the NSURLSession while the app is in the background never start uploading. Please provide help with this.
In my portal-ext.properties file, I found these parameters. I don't remember why I put them into the config file; I think I simply copied them from some other web page where someone said it would help.
There are comments explaining what the parameters do, but I still don't understand the underlying issues.
How can uploaded data be serialized extraneously?
Why are files > 10 MB considered excessively large, and why do they have to be cached?
#Set the threshold size to prevent extraneous serialization of uploaded data.
com.liferay.portal.upload.LiferayFileItem.threshold.size=262144
#Set the threshold size to prevent out of memory exceptions caused by caching excessively
#large uploaded data. Default is 1024 * 1024 * 10.
com.liferay.portal.upload.LiferayInputStream.threshold.size=10485760
These properties take effect when your portal has file upload functionality.
When you upload a larger file, it needs to be written to a temporary file on disk.
Since part of the file upload process holds the file in memory before writing it to the disk/database, holding excessively large files in memory must be avoided; the threshold determines when the data is spilled to disk instead, which prevents out-of-memory exceptions.
If you want to know more details on this, please go through this link.
Liferay's Document Library uses other properties to restrict the file size, such as:
dl.file.max.size=3072000
These properties are connected with the maximum file size for upload (e.g. for the Document Library). However, these seem to be the default values.
XMLHttpRequest fails when I send a large file (> 700 MB) via .send(). Even worse, BlobBuilder fails with append() for large files as well. Is there a way to send a file in multiple chunks using XMLHttpRequest? How do I tell the server to "append" the following stream of data?
If you have control of the server as well as the client, I'd suggest the following workaround:
break the file up into chunks (.slice())
upload the multiple file chunks
reassemble the file chunks on the server
I don't know that this problem can be solved strictly within the browser.
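Something like the following sketch does the client side of those three steps; the endpoint and the header names are made up, and the server has to honour them by appending each chunk at the given offset.

```javascript
function uploadInChunks(file, url, chunkSize = 5 * 1024 * 1024) {
  let offset = 0;

  function sendNext() {
    const chunk = file.slice(offset, offset + chunkSize); // a Blob reference, not a copy in memory
    const xhr = new XMLHttpRequest();
    xhr.open('POST', url);
    xhr.setRequestHeader('X-File-Name', encodeURIComponent(file.name));
    xhr.setRequestHeader('X-Chunk-Offset', String(offset)); // tells the server where to append
    xhr.onload = () => {
      offset += chunkSize;
      if (offset < file.size) sendNext(); // keep going until the whole file has been sent
    };
    xhr.send(chunk); // only this chunk is read and transmitted
  }

  sendNext();
}
```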