Send File in Chunks with Metadata - objective-c

I am writing a web app and an OS X app that will upload files to a server.
The web app uses FlowJS to handle sending the files to the server. Along with every request, it sends the chunk number, chunk size, total file size, file name, etc. This is great because behind the scenes I'm uploading the data to S3, and I use this information to determine when the file has finished uploading. Additionally, since the data comes in chunks, I don't have to store the entire file in memory on my server.
For the OS X app, written in Objective-C, I'm planning on using AFNetworking. I tried a multipart upload:
- (void)uploadFileWithUrl:(NSURL *)filePath {
    AFHTTPRequestOperationManager *manager = [AFHTTPRequestOperationManager manager];
    [manager POST:@"http://www.example.com/files/upload" parameters:nil constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
        [formData appendPartWithFileURL:filePath name:@"blob" error:nil];
    } success:^(AFHTTPRequestOperation *operation, id responseObject) {
        NSLog(@"Success: %@", responseObject);
    } failure:^(AFHTTPRequestOperation *operation, NSError *error) {
        NSLog(@"Error: %@", error);
    }];
}
but this simply attempts to send the entire file in one request, which is not acceptable because the server will run out of memory on large requests.
What I would like to know is whether there is anything like FlowJS for Objective-C, or whether there is some way to include that per-chunk information with AFNetworking on each call.

I do not think you can do this automatically very easily. (If anyone knows how, I would like to know the answer too.)
Multipart is no help, because it structures the content, not the transmission. Think of it as paragraphs in a text document: they give the text a structure, but it is still one file.
That leads to the real problem: you want chunks of a fixed size, and you want to send a chunk only after the previous one has been processed (uploaded to S3), then the next chunk, and so on. How does the client know that the first chunk has been processed? HTTP does not work in a way that lets you hold back new data until the server confirms it received the previous piece; you send the next chunk as soon as the previous one has been sent (is on the way). I'm also not sure whether AFNetworking guarantees to send the next part only after the first part has gone out; it may want to collect small parts.
Why don't you do what S3 itself does:
Initiate an upload request that returns an ID. Then send a chunk using that upload ID. When it is done, the server responds and the next part is sent. That way you have a handshake.
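A minimal sketch of that handshake, using AFNetworking 2.x. The endpoint (/files/chunk), the parameter names (uploadId, chunkNumber, totalSize, fileName), and the 1 MB chunk size are all assumptions made for illustration; adapt them to whatever your server actually expects.
- (void)uploadChunkAtOffset:(unsigned long long)offset
                     ofFile:(NSString *)path
                   uploadId:(NSString *)uploadId {
    // Read the next chunk from disk so the whole file is never held in memory
    unsigned long long totalSize = [[[NSFileManager defaultManager] attributesOfItemAtPath:path error:nil] fileSize];
    NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:path];
    [handle seekToFileOffset:offset];
    NSData *chunk = [handle readDataOfLength:1024 * 1024]; // 1 MB per request
    [handle closeFile];
    if (chunk.length == 0) {
        return; // the whole file has been sent
    }

    NSDictionary *params = @{@"uploadId":    uploadId,
                             @"chunkNumber": @(offset / (1024 * 1024) + 1),
                             @"totalSize":   @(totalSize),
                             @"fileName":    [path lastPathComponent]};

    AFHTTPRequestOperationManager *manager = [AFHTTPRequestOperationManager manager];
    [manager POST:@"http://www.example.com/files/chunk" parameters:params constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
        [formData appendPartWithFileData:chunk name:@"blob" fileName:[path lastPathComponent] mimeType:@"application/octet-stream"];
    } success:^(AFHTTPRequestOperation *operation, id responseObject) {
        // The server has acknowledged this chunk; only now send the next one
        [self uploadChunkAtOffset:offset + chunk.length ofFile:path uploadId:uploadId];
    } failure:^(AFHTTPRequestOperation *operation, NSError *error) {
        NSLog(@"Chunk upload failed: %@", error);
    }];
}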

Related

Issue with Response time from Parse

Currently I am working on an iOS app that has some features built around the File data type. I need to upload images to Parse and also download images from Parse. I am facing issues with the response time that Parse gives me.
When I upload a single image chosen from the gallery on an iOS device, it takes too long to respond. Uploading a single image takes more than a minute to complete and respond, and the same thing happens when downloading images from the Parse server.
Can anyone suggest what I should do to improve these response times? Do I need to pay something to get a faster response, or is this the normal response time Parse provides for all accounts, free and paid?
I am using the code below for uploading an image:
//Upload a new picture
NSData *pictureData = UIImagePNGRepresentation(self.selectedImage);
PFFile *file = [PFFile fileWithName:@"img" data:pictureData];
[registrationData setObject:file forKey:@"image"];
[registrationData saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
    if (succeeded) {
    } else {
    }
}];
Can anyone suggest a way to get a faster response?
Thanks in advance!
It's because Parse makes multiple nested calls to upload image(s), so it's better to:
use a PNG image
scale the image before uploading (see the sketch below)
note that the type of account (free / paid) may also be a factor
Hope it helps.
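For reference, a rough sketch of scaling the image down before handing it to PFFile; the 800-point target width is an arbitrary value chosen for illustration.
// Scale the selected image down before uploading (target width is arbitrary)
CGFloat targetWidth = 800.0;
CGFloat scale = targetWidth / self.selectedImage.size.width;
CGSize targetSize = CGSizeMake(targetWidth, self.selectedImage.size.height * scale);

UIGraphicsBeginImageContextWithOptions(targetSize, NO, 1.0);
[self.selectedImage drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

NSData *pictureData = UIImagePNGRepresentation(scaledImage);
PFFile *file = [PFFile fileWithName:@"img" data:pictureData];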

AFNetworking won't allow large file upload

With AFNetworking I'm trying to upload an image (1280x990) with size: 33695. The code below works perfectly with smaller images (e.g. 390x390), but the larger image throws an error:
[client POST:@"/upload_image" parameters:nil constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
    [formData appendPartWithFileData:imageData name:@"image" fileName:@"image.jpg" mimeType:@"image/jpeg"];
} success:^(NSURLSessionDataTask *task, id responderData) {
} failure:^(NSURLSessionDataTask *task, NSError *error) {
}];
ERROR thrown:
NSDebugDescription = "JSON text did not start with array or object and option to allow fragments not set.";
I've searched many other posts and there doesn't seem to be anything referring to issues with a larger image size. Any suggestions?
According to the author of AFNetworking, use appendPartWithFileURL:name:fileName:mimeType:error: instead, because the data will then be streamed from disk rather than loaded into memory.
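A sketch of what that might look like, assuming the image already exists on disk at a known path (the imagePath variable here is hypothetical); if you only have NSData, write it to a temporary file first.
NSURL *fileURL = [NSURL fileURLWithPath:imagePath]; // hypothetical path to the image on disk
[client POST:@"/upload_image" parameters:nil constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
    NSError *appendError = nil;
    // Streams the file from disk instead of holding it all in memory
    [formData appendPartWithFileURL:fileURL name:@"image" fileName:@"image.jpg" mimeType:@"image/jpeg" error:&appendError];
} success:^(NSURLSessionDataTask *task, id responderData) {
} failure:^(NSURLSessionDataTask *task, NSError *error) {
}];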
I was having similar issues when I tried using AFNetworking. I have since switched to RestKit and SDWebImage to handle the asynchronous loading and caching of images, and it works like a charm. You may want to take a look at this Quora post to compare the two. The main con of RestKit was async loading and caching, but with little effort SDWebImage takes care of that with one line of code.
http://www.quora.com/iOS-Development/RestKit-vs-AFNetworking-What-are-the-pros-and-cons

Right way to implement HTTP post / Get in iOS

I am quite new to iOS and am trying to find the best way to implement HTTP POST/GET communication.
Problem:
I want to make multiple API calls, and each call will have its own response. I am trying to write common network utils; ideally they would take an API URL, make the call, and return the data to the caller. What is the right way to achieve this? I found a moderate amount of debate, with fans of each approach.
Option 1:
dispatch_async(aQueue, ^{
    // ... make a synchronous network request and get the data back
    // ... perform operations on the data
    // ... then pass the processed data to the UI or set it on the model
    dispatch_async(dispatch_get_main_queue(), ^{ /* update UI */ });
});
Option 2:
- (NSString *)postData:(NSDictionary *)data serverUrl:(NSString *)targetUrl
-- call the post-data method with a separate delegate for each caller
-- start an async request
-- in the did-finish or on-error delegate callback, check the delegate and then
   return the response to the caller's callback
Thanks for your inputs.
I think your first option is not good. It blocks a pooled thread for a long period of time, which is not advisable; when implementing multithreading in any environment, pooled threads provided by the system should not be used for long-running work. Also, synchronous network calls are not really advised and have their own pitfalls.
Your second option is more viable. One improvement you could make is to move the work that currently happens in the did-finish callback onto a GCD queue, and once the processing is done, send the data back on the main thread.
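A rough sketch of that improvement; responseData, MyModel, and updateUIWithModel: are placeholders, not real API.
// Inside the did-finish callback: process off the main thread, then update the UI
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    MyModel *model = [MyModel modelFromResponseData:responseData]; // hypothetical parsing step
    dispatch_async(dispatch_get_main_queue(), ^{
        [self updateUIWithModel:model]; // hypothetical UI update
    });
});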
There is a wonderful library available, called AFNetworking, which is very easy to use.
It uses blocks, which greatly simplify passing data between classes (doing away with delegates), and it is asynchronous.
Example usage is below:
AFHTTPClient *client = [[AFHTTPClient alloc] initWithBaseURL:[NSURL URLWithString:@"http://www.yourwebsite.com/api"]];
NSDictionary *params = @{
    @"position": [NSString stringWithFormat:@"%g", position]
};
[client postPath:@"/api" parameters:params success:^(AFHTTPRequestOperation *operation, id responseObject) {
} failure:^(AFHTTPRequestOperation *operation, NSError *error) {
}];
As simple as that! The result is available directly within the class that calls the HTTP POST or GET method.
It even includes image and JSON requests, JSON deserialization, file downloads with progress callbacks, and much more.
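For completeness, a GET with the same AFHTTPClient looks like this; the path and parameters are placeholders.
[client getPath:@"/api/items" parameters:@{@"page": @"1"} success:^(AFHTTPRequestOperation *operation, id responseObject) {
    NSLog(@"GET response: %@", responseObject);
} failure:^(AFHTTPRequestOperation *operation, NSError *error) {
    NSLog(@"GET failed: %@", error);
}];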

What is the correct way of sending large file through HTTP POST, without loading the whole file into ram?

I'm currently working on an application for uploading large video files from the iPhone to a web service through a simple HTTP POST. Right now, I build an NSURLRequest and preload all of the video file data before loading the request. This naturally eats a ton of RAM if the file is considerably big; in some cases it's not even possible.
So basically my question is: is there a correct way of streaming the data or loading it in chunks without applying any modifications to the web server?
Thanks.
EDIT for clarification: I am searching for a way to stream large multipart/form-data FROM the iPhone TO a web server, not the other way around.
EDIT after accepting answer: I just found out that Apple has some nifty source code written for this exact purpose, and it shows appending additional data to the POST, not just the big file itself. In case anyone ever needs it: SimpleURLConnections - PostController.m
Yet another EDIT: While using that piece of source code from Apple, I ran into a very stupid and ugly problem that even Wireshark couldn't help me debug. Some web servers don't understand the boundary string when it's declared between quotes (as in Apple's example). I had problems with it on Apache Tomcat, and removing the quotes worked just fine.
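For reference, this is roughly what the working (unquoted) boundary declaration looks like; the boundary value itself is made up for this example.
// Note: no quotes around the boundary value -- some servers (e.g. Tomcat) choke on them
NSString *boundary = @"0xKhTmLbOuNdArY";
NSString *contentType = [NSString stringWithFormat:@"multipart/form-data; boundary=%@", boundary];
[request setValue:contentType forHTTPHeaderField:@"Content-Type"];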
You can use NSInputStream on NSMutableURLRequest. For example:
NSMutableURLRequest *request = [[NSMutableURLRequest alloc] initWithURL:uploadURL];
NSInputStream *stream = [[NSInputStream alloc] initWithFileAtPath:filePath];
[request setHTTPBodyStream:stream];
[request setHTTPMethod:@"POST"];
[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    NSLog(@"Finished with status code: %ld", (long)[(NSHTTPURLResponse *)response statusCode]);
}];
You can use a NSInputStream to provide the data to post via -[NSMutableURLRequest setHTTPBodyStream:]. This could be an input stream that reads from a file. You might need to implement the connection:needNewBodyStream: method in your URL connection delegate to provide a new, unopened stream in case the system needs to retransmit the data.
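A sketch of that delegate method; self.filePath is assumed to hold the path of the file being posted.
// Return a fresh, unopened stream so the system can retransmit the body if needed
- (NSInputStream *)connection:(NSURLConnection *)connection needNewBodyStream:(NSURLRequest *)request {
    return [NSInputStream inputStreamWithFileAtPath:self.filePath];
}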
One way to do this is to use an asynchronous NSInputStream in concert with a file. When the asynchronous connection asks you to provide more data, you read it in from the file. You have a few ways to do this:
use the UNIX/BSD interface: open (or fopen), malloc, and read, then create an NSData object from the malloc'd buffer
use the above with mmap() if you know it
use the Foundation NSFileHandle APIs to do more or less the same thing in Objective-C (see the sketch below)
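A sketch of the NSFileHandle option; the offset bookkeeping and the 64 KB chunk size are placeholders.
// Read the next chunk of the file when the connection asks for more data
NSFileHandle *fileHandle = [NSFileHandle fileHandleForReadingAtPath:filePath];
[fileHandle seekToFileOffset:currentOffset];              // currentOffset is tracked elsewhere
NSData *chunk = [fileHandle readDataOfLength:64 * 1024];  // 64 KB is an arbitrary chunk size
[fileHandle closeFile];
// hand `chunk` to the stream/connection and advance currentOffset by chunk.length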
You can read up on streams in the 'Stream Programming Guide'. If this doesn't work for you, there are lots of open source projects that can upload files, for instance MKNetworkKit.

Uploading From App to Server in IOS

I know that, conventionally, for an app to interact with the internet it must use a web service to exchange information. However, how would one upload data (photos, text, audio recordings, etc.) from an app to a server (which holds data for all user accounts)? I know from research that some people use an email-to-server tactic, but even that sounds ineffective and slow. How do apps such as Instagram upload so fast? I am trying to replicate that sort of uploading. Please guide me in the right direction.
Thanks for the help!
You should definitely look into AFNetworking. Here is an example of uploading an image to a PHP web service:
NSData *imageData = UIImagePNGRepresentation(pageImage);
AFHTTPClient *client = [AFHTTPClient clientWithBaseURL:[NSURL URLWithString:@"http://www.SERVER.com"]];

//You can add POST parameters here
NSDictionary *params = [NSDictionary dictionaryWithObjectsAndKeys:
                        author, @"author",
                        title, @"title",
                        nil];

NSMutableURLRequest *request = [client multipartFormRequestWithMethod:@"POST" path:@"/PATH/TO/WEBSERVICE.php" parameters:params constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
    //This is the image
    [formData appendPartWithFileData:imageData name:@"cover_image" fileName:@"temp.png" mimeType:@"image/png"];
}];

AFHTTPRequestOperation *operation = [[AFHTTPRequestOperation alloc] initWithRequest:request];

//Set up the upload block to report the progress of the file upload
[operation setUploadProgressBlock:^(NSUInteger bytesWritten, long long totalBytesWritten, long long totalBytesExpectedToWrite) {
    float progress = totalBytesWritten / (float)totalBytesExpectedToWrite;
    NSLog(@"Upload Percentage: %f %%", progress * 100);
}];

//Set up the completion block to handle success or failure
[operation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {
    NSString *response = [operation responseString];
    NSLog(@"response: [%@]", response);
    //Code to run after the web service returns a success response code
} failure:^(AFHTTPRequestOperation *operation, NSError *error) {
    NSLog(@"error: %@", [operation error]);
    //Code to run if it failed
}];

[operation start];
Edit: I also use MBProgressHUD to show the user the upload progress on longer uploads.
As you might know, upload speed is always bound by the speed of the connection type you're using. Even the best upload technique will be slow when the connection is slow (GPRS, for example, or EDGE; even 3G can be slow if network coverage is poor).
To upload large sets of data faster, one thing you could do is compress the data you're sending with ZIP or any other compression format, or even develop your own compression algorithm (you might not want to do that ;-)).
If you want to reduce the overhead of HTTP/HTTPS connections, you can write your very own protocol for data exchange, implement it on the client and server side, and upload faster. This is a lot of work, since you have to implement everything yourself, not only the protocol but also security and so on. And even then, as said at the beginning, it will be slow if the connection is slow.
Update: A presentation by Mike Krieger (co-founder of Instagram) in which he covers your question just crossed my path: https://speakerdeck.com/u/mikeyk/p/secrets-to-lightning-fast-mobile-design?slide=1
The reason you think it's so fast is that they update the UI before the request (the upload, in this case) even completes. This is what Mike describes as "being optimistic". If the request fails you can still notify the user, but in the meantime you make them feel productive and act as if the request completed successfully.
This is a pretty open-ended question, but here are a few things to look at:
"Uploading fast" depends on the user's connection and server bandwidth, so I won't get into that.
You can upload photos (and other files) by creating NSData objects and attaching them to a POST request. There is already a ton of sample code for uploading NSData, but to convert a UIImage you do the following:
NSData *imageData = UIImagePNGRepresentation(image);
You can do this using the built-in Cocoa classes (NSMutableURLRequest) or with third-party networking classes (such as AFNetworking; just scroll down to file uploads).
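A bare-bones sketch with only Cocoa classes; the URL is a placeholder, and most real servers will expect a multipart/form-data body rather than a raw payload.
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://www.example.com/upload"]];
[request setHTTPMethod:@"POST"];
[request setValue:@"image/png" forHTTPHeaderField:@"Content-Type"];
[request setHTTPBody:imageData]; // the NSData from UIImagePNGRepresentation above

[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    NSLog(@"Upload finished: %@", error ? error : @"OK");
}];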
When I send simple data to my web server, I use the following approach: use the ASIHTTPRequest framework to connect to your server and send the data in the HTTP POST body, which is easy to do with ASIHTTPRequest. You will want to convert your data to either XML or JSON (use the SBJson framework for this) before sending it. I then write PHP scripts that parse the JSON or XML and insert the data into my database with custom SQL. I can give you code snippets for any of these steps if you need them...
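A loose sketch of that flow (ASIHTTPRequest plus SBJson); the URL and payload are placeholders, and you should double-check the category and method names against the versions of the libraries you are using.
#import "ASIFormDataRequest.h"
#import "SBJson.h"

NSDictionary *payload = @{@"title": @"My Post", @"author": @"Me"};
NSString *json = [payload JSONRepresentation]; // SBJson category on NSDictionary

ASIFormDataRequest *request = [ASIFormDataRequest requestWithURL:[NSURL URLWithString:@"http://www.example.com/api/save.php"]];
[request setPostValue:json forKey:@"data"]; // the PHP side reads and parses the "data" field
__weak ASIFormDataRequest *weakRequest = request;
[request setCompletionBlock:^{
    NSLog(@"Server replied: %@", [weakRequest responseString]);
}];
[request setFailedBlock:^{
    NSLog(@"Request failed: %@", [weakRequest error]);
}];
[request startAsynchronous];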
It seems to me that, with your first sentence, you've basically answered your own question.
You need something on your server to receive the files, and then you write client code to match. It could be as simple as FTP or as complex as a custom protocol, depending on the security and control that you need.