NSURLSession uploading chunks in background - objective-c

I have a large video to upload, around 3-4 GB in size, and I need to use NSURLSessionUploadTask to upload it in the background. If I upload it as a single file it uploads fine, but pause/resume does not work properly: I pause at one point and it resumes from somewhere else, or even from the start.
To get reliable pause/resume I switched to chunked uploading. At the start I create about 3 chunks, write their bytes to separate files, and start uploading them. That works fine. The problem appears when the app goes to the background and the existing chunks finish uploading, because at that point I need to enqueue new chunks.
The system gives me enough time to write the files for the next 3 chunks and start their tasks, but those chunks never upload unless the user opens the app. Once the app comes to the foreground they start uploading, and the same thing repeats the next time the app goes to the background and I need to add more chunks.
In short, upload tasks added to the NSURLSession while the app is in the background never start. Any help would be appreciated.
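
For concreteness, here is a minimal sketch of the kind of setup described above. The session identifier, HTTP method, and property/method names are placeholders, not the actual project code:

// Sketch only: a background session created once and kept in a property.
- (void)setUpSession {
    NSURLSessionConfiguration *config = [NSURLSessionConfiguration
        backgroundSessionConfigurationWithIdentifier:@"com.example.chunkupload"];
    self.session = [NSURLSession sessionWithConfiguration:config
                                                 delegate:self
                                            delegateQueue:nil];
}

// One upload task per chunk file that has already been written to disk.
// The same method is called again from URLSession:task:didCompleteWithError:
// while the app is in the background, and those later tasks are the ones that
// never start until the app is foregrounded.
- (void)enqueueChunkFile:(NSURL *)chunkFileURL toURL:(NSURL *)uploadURL {
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadURL];
    request.HTTPMethod = @"PUT";
    [[self.session uploadTaskWithRequest:request fromFile:chunkFileURL] resume];
}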

Related

Download large file from FTP server in chunks

I need to download a large file from an FTP server. A new file is uploaded once a week, and I need to be the first to download it. I have a check that detects when the file has been uploaded and, if it is there, starts the download. The problem is that this is a big file (3 GB). I can download about 10% of the file within the first few minutes, but as more and more people discover the file is available, the average download speed drops and drops, to the point where it takes about 3-4 hours to download the remaining 80-90%.
The time isn't a huge problem, but it sure would be nice if I could finish the download quicker. The real problem is that my download never finishes, and I think it's because the connection times out.
One solution would be to extend the download timeout, but I have another suggestion: download the file in chunks. Right now I'm downloading from beginning to end in one go. It starts off at a good download speed, but as more and more people begin their downloads, it slows all of us down. I would like to split the download into smaller chunks and have all the separate downloads start at the same time. I've made an illustration:
In the illustration there are 8 starting points, which means I'll end up with 8 parts of the zip file that I then need to recombine into one file once the download has finished. Is this even possible, and how would I approach it? If I could do this, I would be able to complete the entire download in about 10-15 minutes, instead of waiting 3-4 hours for the download to fail and then having to restart it.
Currently I use a WebClient to download the FTP file, since the other approaches I tried couldn't finish the download because the file is larger than 2.4 GB.
' Requires Imports System.Net. Note that the user info in an FTP URI is
' separated from the host with "@", not "#".
Private wc As New WebClient()
wc.DownloadFileAsync(New Uri("ftp://user:password@ip/FOLDER/" & FILENAME), downloadPath & FILENAME)

Realm objective c - really huge db file size - 64GB

We recently planned to switch from SQLite to Realm in our macOS and iOS apps because of a database file corruption issue with SQLite, so we started with the macOS app. All code changes went smoothly and the app started working fine.
Background on the app and DB usage: the app uses the database very heavily, performing a large number of reads and writes every minute and saving big XML blobs. Each minute it writes/updates around 10-12 records (at most) with XML and reads 25-30 records as well. After each read it deletes the data, along with the XML, from the database. My expectation was that once data is deleted the space would be freed and the file size reduced, but the file appears to grow continuously.
To test the new DB changes we kept the app running for 3-4 days; the DB file size grew to 64.42 GB and the app started slowing down. Please refer to the attached screenshot.
To debug further, I started the app with a new DB file whose size was 4 KB, but within 5 minutes it grew to 295 KB and never shrank, even though records were continuously added and deleted.
To clarify further, the app uses NSThreads to perform various operations, and those threads read and write data to the DB with proper begin/commit transactions. I also read the 'Large File Size' entry at https://realm.io/docs/java/latest/#faq and tried to find compactRealm, but I can't find it in Objective-C.
Can anybody please advise?
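
For reference, Realm's Objective-C API exposes -[RLMRealm writeCopyToURL:encryptionKey:error:], which writes a compacted copy of the Realm to another file, and newer Realm versions also offer a shouldCompactOnLaunch block on RLMRealmConfiguration. A rough sketch of the copy-based approach (the paths are placeholders, and the original file can only be replaced once every RLMRealm instance for it has been released):

// Sketch: write a compacted copy of the Realm to a placeholder path.
RLMRealm *realm = [RLMRealm defaultRealm];
NSURL *compactedURL = [NSURL fileURLWithPath:@"/tmp/Compacted.realm"];

NSError *error = nil;
if (![realm writeCopyToURL:compactedURL encryptionKey:nil error:&error]) {
    NSLog(@"Compacting copy failed: %@", error);
}
// After all RLMRealm instances for the original file are released (e.g. on the
// next launch, before opening Realm), swap the compacted copy in with NSFileManager.
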
Update: I give up on Realm
After 15 days of effort I have finally stopped using Realm and started fixing/working around the DB file corruption issue with SQLite instead. The huge Realm DB file issue was fixed by changing how the threads use Realm, but then I started getting a 'Too many open files' error after running the app for 7-8 hours.
I debugged for a whole week and made every change I could; at some point everything looked good, as Xcode was not showing any open files. But then the 'Too many open files' crash came back, and when I debugged with Instruments I found a large number of open files for the Realm database, lock, commit and cv files.
I am sure there are no leaks in the app, and Xcode does not show those open files under Disk usage either. I decided to invoke the lsof command in code before and after Realm calls; most of the time the open file count does not increase, but sometimes in between it does. In my app it went from 120 open files to 550 in around 6 hours. Xcode looks fine via Disk usage, but Instruments shows the open files.
There was no good support from the Realm team; I emailed them and got just one response. I made many code changes following their suggestions and it still doesn't work, so I gave up on it. I think it's good for small apps only.

Intercepting File Writes on OS X

I have a program that generates information from the contents of files. However, I believe it would be more efficient if I could do this as the files are being written, rather than having to read the contents back after some delay, since I could simply generate the data as each file is written to disk.
What method(s) are available for an application to hook into the file-write process, i.e. to process the data stream as it's being written to disk? Also, which of these (if any) are allowed for App Store apps?
I've been considering a Spotlight importer; however, this still involves reading the contents of a file after they've been written, in which case I'm relying on the file still being in the RAM cache to reduce disk access.
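
One option to note (not a true interception of the write stream, only a notification that the file was written to or extended, after which the newly appended bytes can be read) is a GCD vnode dispatch source; it works in sandboxed apps as long as the app has access to the file. A minimal sketch, with the function name and path handling as placeholders:

// Minimal sketch: watch a file for writes with a GCD vnode source and read
// only the newly appended bytes each time an event fires.
#import <Foundation/Foundation.h>
#include <fcntl.h>

static dispatch_source_t StartWatchingFile(NSString *path) {
    int fd = open(path.fileSystemRepresentation, O_EVTONLY);
    if (fd < 0) return nil;

    dispatch_queue_t queue = dispatch_get_global_queue(QOS_CLASS_UTILITY, 0);
    dispatch_source_t source =
        dispatch_source_create(DISPATCH_SOURCE_TYPE_VNODE, (uintptr_t)fd,
                               DISPATCH_VNODE_WRITE | DISPATCH_VNODE_EXTEND, queue);

    __block unsigned long long processedOffset = 0;
    dispatch_source_set_event_handler(source, ^{
        NSFileHandle *handle = [[NSFileHandle alloc] initWithFileDescriptor:fd
                                                             closeOnDealloc:NO];
        [handle seekToFileOffset:processedOffset];
        NSData *newBytes = [handle readDataToEndOfFile];
        processedOffset += newBytes.length;
        // ... generate information from newBytes here ...
    });
    dispatch_source_set_cancel_handler(source, ^{ close(fd); });
    dispatch_resume(source);
    return source; // keep a strong reference for as long as events are needed
}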

How chunk file upload works

I am working on file upload and wondering how chunked file upload actually works.
I understand that the client sends data to the server in small chunks instead of the whole file at once, but I have a few questions about this:
For the browser to divide the whole file into chunks and send them, will it read the complete file into memory? If so, there would again be a risk of excessive memory use and browser crashes for big files (say > 10 GB).
How do cloud applications like Google Drive and Dropbox handle uploads of such big files?
If multiple files are selected for upload and each one is larger than 5-10 GB, does the browser keep all the files in memory and then send them chunk by chunk?
Not sure if you're still looking for an answer; I was in your position recently, and here's what I came up with, hope it helps: Deal chunk uploaded files in php
During uploading, if you print out the request on the backend, you will see three parameters: _chunkNumber, _totalSize and _chunkSize. With these parameters it's easy to decide whether the current chunk is the last piece, and if it is, assembling all of the pieces into the whole file shouldn't be hard.
On the JavaScript side, ng-file-upload has a setting named "resumeChunkSize" that lets you enable chunked mode and set the chunk size.
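
On the memory question: the file does not have to be read into memory as a whole; in the browser, File.slice produces a reference to a byte range, so only the chunk currently being sent needs to be loaded. The same idea applies outside the browser; tying back to the NSURLSession question above, a chunk can be read on demand with NSFileHandle. This is only a sketch, and the function name, chunk-size arithmetic and path handling are arbitrary example choices:

// Sketch: read one chunk of a large file without loading the whole file.
#import <Foundation/Foundation.h>

static NSData *ReadChunk(NSString *path, NSUInteger chunkIndex, NSUInteger chunkSize) {
    NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:path];
    if (!handle) return nil;
    [handle seekToFileOffset:(unsigned long long)chunkIndex * chunkSize];
    NSData *chunk = [handle readDataOfLength:chunkSize]; // at most one chunk in memory
    [handle closeFile];
    return chunk;
}

Each chunk read this way can then be written to its own temporary file and handed to an upload task, which is essentially what the question at the top of this page describes.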

Multiple background downloads in ios7 using NSURLSessionConfigurations

I want to know how to download a bunch of files one after the other. For example, if I have 5 files to download, all 5 should be downloaded automatically in sequence: when file 1 completes, file 2 starts, then file 3, and so on until all five are finished.
It should all happen while my app is in the background. Thanks in advance.
You should use NSURLSession. You can create a background session which will even continue to work after your app goes to the background and/or is terminated.
You ask the associated NSURLSessionTasks to download all the files, and the framework will take care of downloading as many of the files concurrently as makes sense (given the bandwidth etc.).
You'll be able to get download status if you need it, and you'll be notified on completion of the downloads, even if your app is no longer running. There's a lot to love about NSURLSession; you should consider it for all long-running download/upload tasks.
I think you'll have to chain them manually. In other words, keep your own queue of tasks: when one finishes, remove it from the queue and start the next one.
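
Putting the two answers together, a rough sketch of the manual chaining over a background session could look like the following. The class name, session identifier and property names are placeholders (on iOS 7 the equivalent of backgroundSessionConfigurationWithIdentifier: was backgroundSessionConfiguration:):

// Sketch: download a list of files one at a time over a background session.
@interface SequentialDownloader : NSObject <NSURLSessionDownloadDelegate>
@property (nonatomic, strong) NSURLSession *session;
@property (nonatomic, strong) NSMutableArray *pendingURLs;
@end

@implementation SequentialDownloader

- (void)startWithURLs:(NSArray *)urls {
    NSURLSessionConfiguration *config = [NSURLSessionConfiguration
        backgroundSessionConfigurationWithIdentifier:@"com.example.downloadqueue"];
    self.session = [NSURLSession sessionWithConfiguration:config
                                                 delegate:self
                                            delegateQueue:nil];
    self.pendingURLs = [urls mutableCopy];
    [self startNext];
}

- (void)startNext {
    if (self.pendingURLs.count == 0) return; // queue finished
    NSURL *next = self.pendingURLs.firstObject;
    [self.pendingURLs removeObjectAtIndex:0];
    [[self.session downloadTaskWithURL:next] resume]; // one task at a time
}

// Move the downloaded file to a permanent location, then start the next one.
- (void)URLSession:(NSURLSession *)session
      downloadTask:(NSURLSessionDownloadTask *)downloadTask
didFinishDownloadingToURL:(NSURL *)location {
    // ... copy `location` to its final destination here ...
    [self startNext];
}

@end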