iOS - Using try/catch to load JSON - Objective-C

I'm loading data from a server, but sometimes the server is not reachable (error).
I want to use try/catch or something similar so the app doesn't crash:
1: try/catch when loading the data
2: try/catch when loading each image
I don't know how to use it. The code I wrote is:
@try
{
    dispatch_async(htvque, ^{
        NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:listChannel]];
        NSError *error;
        jsonTable = [NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:&error];
        if (error) {
            NSLog(@"%@", error);
        }
        else
        {
            NSMutableArray *arrImage = [jsonTable objectForKey:@"ListImage"];
            for (int i = 0; i < arrImage.count; i++) {
                UIImage *result;
                UIImageView *imgView = [[UIImageView alloc] init];
                imgView.frame = CGRectMake(0, 30 * i, 20, 20);
                imgView.image = result;
                @try
                {
                    NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:[arrImage objectAtIndex:i]]];
                    result = [UIImage imageWithData:data];
                    [self.view addSubview:imgView];
                }
                @catch (NSException *exception)
                {}
                @finally
                {}
            }
        }
    });
}
@catch (NSException *exp)
{
    NSLog(@"abc");
}
@finally
{
}

Don't use exceptions for this purpose in your Objective-C programs.
In ObjC, reserve exceptions for cases where you do not intend to recover -- if an exception is even appropriate there (I just don't use them, but that's a little too hardcore for a lot of people).
Anyway, an exception in ObjC indicates a programmer error and is logically something you cannot expect to recover from (unlike in other languages). Figure out what your error is instead of falling back on "try and swallow" error handling.
The remedy I suggest is to create a new question which shows the appropriate code and details the exception, how to reproduce it, and so on.
Note: There is actually a handful of oddball Cocoa APIs which will throw in less-than-exceptional cases, when they should have used another approach to error handling, such as NSError. The vast majority of Cocoa exceptions you will see in development are problems which you can and should correct (range error, does not respond to selector, reference counting).
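As a minimal sketch of the NSError pattern this answer recommends (assuming the response data has already been downloaded into `data`; the variable names are mine, not the question's):
NSError *jsonError = nil;
id json = [NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:&jsonError];
if (json == nil) {
    // Recoverable failure: report it and bail out instead of catching an exception.
    NSLog(@"Could not parse JSON: %@", jsonError);
    return;
}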

I think this will be a reasonably robust way to do what you want. I wanted the example to show lots of error checking without relying on try/catch:
dispatch_async(htvque, ^{
    NSError *error = nil;
    NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:listChannel] options:0 error:&error];
    if (error) { NSLog(@"%@", error); return; }
    NSDictionary *jsonTable = [NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:&error];
    if (error) { NSLog(@"%@", error); return; }
    if (![jsonTable isKindOfClass:[NSDictionary class]]) { NSLog(@"jsonTable is not an NSDictionary: %@", jsonTable); return; }
    NSArray *images = [jsonTable objectForKey:@"ListImage"];
    if (![images isKindOfClass:[NSArray class]]) { NSLog(@"images is not an NSArray: %@", images); return; }
    NSMutableArray *dataForImages = [NSMutableArray arrayWithCapacity:images.count];
    // Build up an array with all the image data. For simplicity's sake, I'll just skip the ones that fail to load.
    for (NSString *URLString in images) {
        if (![URLString isKindOfClass:[NSString class]]) { NSLog(@"URLString is not an NSString: %@", URLString); continue; }
        NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:URLString] options:0 error:&error];
        if (error) { NSLog(@"%@", error); continue; }
        [dataForImages addObject:data];
    }
    // MUST SWITCH TO MAIN QUEUE BEFORE UPDATING UI!!!!!!!
    dispatch_sync(dispatch_get_main_queue(), ^{
        // This is just a different way of iterating the array.
        [dataForImages enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
            UIImage *image = [UIImage imageWithData:obj];
            if (!image) { NSLog(@"Could not create image from data at index %lu", (unsigned long)idx); return; }
            UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
            imageView.frame = CGRectMake(0, 30 * idx, 20, 20);
            [self.view addSubview:imageView];
        }];
    });
});
Don't treat this as a finished working solution; it is meant as a rough outline.

Related

Which dispatch_async method is advisable for loading JSON data into a table view controller?

I have JSON data and I want to push it into a table view. But first I have to get the JSON data, so which dispatch method or background thread is recommended so that I can load the data in the background and then push it?
Best practice is to load the data on a background thread using dispatch_async, passing a queue other than the main queue so it doesn't block the UI; when the data is ready, just reload the table view on the main thread.
Below is an example from a real project.
Class-level method to load transactions:
+ (void)getAllTransactionsWithHandler:(void (^)(NSArray *transactions, NSError *error))handler {
    dispatch_queue_t q = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_async(q, ^{
        NSURL *url = [[NSBundle mainBundle] URLForResource:trnasactionFileName withExtension:@"json"]; //[NSData dataWithContentsOfFile:trnasactionFileName];
        NSData *data = [NSData dataWithContentsOfURL:url];
        if (!data) {
            if (handler) {
                handler(nil, [NSError errorWithDomain:@"bank" code:900 userInfo:nil]);
            }
            return;
        }
        NSError *error = nil;
        NSArray *jsonArray = [NSJSONSerialization JSONObjectWithData:data options:0 error:&error];
        if (error) {
            if (handler) {
                handler(nil, error);
            }
        }
        else {
            Transaction *transaction;
            NSString *dateString;
            NSMutableArray *objects = [NSMutableArray new];
            for (NSDictionary *dic in jsonArray) {
                transaction = [Transaction new];
                dateString = dic[kOccured];
                NSDateFormatter *dateFormater = [NSDateFormatter new];
                [dateFormater setDateFormat:@"yyyy-MM-dd"];
                transaction.occured = [dateFormater dateFromString:dateString];
                transaction.proecessed = [dateFormater dateFromString:dic[kProcessed]];
                transaction.desc = dic[kDescription];
                transaction.amount = dic[kAmount];
                [objects addObject:transaction];
            }
            if (handler) {
                handler([NSArray arrayWithArray:objects], nil);
            }
        }
    });
}
You can use it like this:
[Transaction getAllTransactionsWithHandler:^(NSArray *transactions, NSError *error) {
    if (error) {
    }
    else {
        if ([transactions count] > 0) {
            weakSelf.objects = transactions;
            runOnMainThread(^{
                [weakSelf.tableView reloadData];
            });
        }
    }
}];
where runOnMainThread is a utility method that runs the provided block of code on the main thread:
void runOnMainThread(void (^block)(void)) {
    if ([[NSThread currentThread] isMainThread])
        block();
    else {
        dispatch_sync(dispatch_get_main_queue(), ^{
            block();
        });
    }
}
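To complete the table view side of the question, here is a minimal sketch of the data source methods. This is my illustration rather than part of the original answer; it assumes a cell registered under the hypothetical identifier "TransactionCell" and reuses the desc property of the Transaction model shown above.
- (NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger)section {
    return self.objects.count;
}

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    // Assumes the identifier was registered in code or comes from a storyboard prototype.
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"TransactionCell"
                                                            forIndexPath:indexPath];
    Transaction *transaction = self.objects[indexPath.row];
    cell.textLabel.text = transaction.desc; // one row per loaded transaction
    return cell;
}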

GCDAsyncSocket not receiving all transmitted data, missing last "Chunk"

I am trying to send some strings and image data from a Python script to an Objective-C application running on OS X.
I am collecting the transmitted data using GCDAsyncSocket and appending it to an NSMutableData until the server disconnects. I am then processing that NSData and splitting it into its original parts.
The transmitted data consists of the following:
ID string, padded out to 16 bytes.
Image number string, padded out to 16 bytes.
Raw image data.
Termination string, padded out to 16 bytes.
The problem is that I am not receiving the last chunk of data; I end up missing the end of the JPEG image, resulting in a corrupt (though mostly displayed) image and a missing termination string.
Here is the code I am using with GCDAsyncSocket to receive the data and process it:
Socket connection:
- (void)socket:(GCDAsyncSocket *)sock didAcceptNewSocket:(GCDAsyncSocket *)newSocket
{
    // This method is executed on the socketQueue (not the main thread)
    @synchronized(connectedSockets)
    {
        [connectedSockets addObject:newSocket];
    }
    NSString *host = [newSocket connectedHost];
    UInt16 port = [newSocket connectedPort];
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {
            [self logInfo:FORMAT(@"Accepted client %@:%hu", host, port)];
        }
    });
    [newSocket readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];
}
Socket Data Received
- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{
    // This method is executed on the socketQueue (not the main thread)
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {
            NSLog(@"Thread Data Length is %lu", (unsigned long)[data length]);
            if (!imageBuffer) {
                imageBuffer = [[NSMutableData alloc] init];
            }
            [imageBuffer appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
            NSLog(@"Total Data Length is %lu", (unsigned long)[imageBuffer length]);
        }
    });
    // Echo message back to client
    [sock writeData:data withTimeout:-1 tag:ECHO_MSG];
    [sock readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];
}
Socket Disconnected
- (void)socketDidDisconnect:(GCDAsyncSocket *)sock withError:(NSError *)err
{
    if (sock != listenSocket)
    {
        dispatch_async(dispatch_get_main_queue(), ^{
            @autoreleasepool {
                [self logInfo:FORMAT(@"Client Disconnected")];
                NSData *cameraNumberData;
                NSData *imageNumberData;
                NSData *imageData;
                NSData *endCommandData;
                //if ([data length] > 40){
                cameraNumberData = [imageBuffer subdataWithRange:NSMakeRange(0, 16)];
                imageNumberData = [imageBuffer subdataWithRange:NSMakeRange(16, 16)];
                imageData = [imageBuffer subdataWithRange:NSMakeRange(32, [imageBuffer length]-34)];
                endCommandData = [imageBuffer subdataWithRange:NSMakeRange([imageBuffer length]-16, 16)];
                //}
                NSString *cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
                NSString *imageNumberString = [[NSString alloc] initWithData:imageNumberData encoding:NSUTF8StringEncoding];
                NSString *endCommandString = [[NSString alloc] initWithData:endCommandData encoding:NSUTF8StringEncoding];
                NSImage *image = [[NSImage alloc] initWithData:imageData];
                if (cameraNumberString)
                {
                    NSLog(@"Image received from Camera no %@", cameraNumberString);
                    [self logMessage:cameraNumberString];
                }
                else
                {
                    [self logError:@"Error converting received data into UTF-8 String"];
                }
                if (imageNumberString)
                {
                    NSLog(@"Image is number %@", imageNumberString);
                    [self logMessage:imageNumberString];
                }
                else
                {
                    [self logError:@"Error converting received data into UTF-8 String"];
                }
                if (image)
                {
                    NSLog(@"We have an image");
                    [self.imageView setImage:image];
                }
                else
                {
                    [self logError:@"Error converting received data into image"];
                }
                if (endCommandString)
                {
                    NSLog(@"Command String is %@", endCommandString);
                    [self logMessage:endCommandString];
                }
                else
                {
                    [self logError:@"No command string"];
                }
                //self.imageBuffer = nil;
            }
        });
        @synchronized(connectedSockets)
        {
            [connectedSockets removeObject:sock];
        }
    }
}
I have used Wireshark, and the data is being transmitted; it's just not getting through GCDAsyncSocket.
So I'm obviously missing something. Socket programming and encoding/decoding data like this is relatively new to me, so I am probably being an idiot.
Help greatly appreciated!
Thanks
Gareth
OK, so I finally got this working. It involved modifying the transmitting code in Python to send a completion string at the end of the data and watching for that. The biggest takeaway was that I needed to re-call the readDataToData: method each time the socket read some data, otherwise it would just sit there and wait, and the transmitting socket would also just sit there.
I also had to re-call the second receive with a tag so I could store the received data in the correct NSMutableData object in an NSMutableArray; otherwise I had no way of knowing, after the first receive, which transmitting socket the data was coming from, as the ID was only at the beginning of the first message.
Here is the didReadData code:
- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {
            NSInteger cameraNumberNumber = 0;
            NSString *cameraNumberString = [[NSString alloc] init];
            if (tag > 10) {
                cameraNumberNumber = tag - 11;
                DDLogVerbose(@"Second data loop, tag is %ld", tag);
            } else {
                NSData *cameraNumberData;
                //if ([data length] > 40){
                cameraNumberData = [data subdataWithRange:NSMakeRange(0, 16)];
                NSString *cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
                cameraNumberString = [cameraNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                cameraNumberNumber = [cameraNumberString intValue] - 1;
            }
            if (cameraNumberNumber + 1 <= self.images.count) {
                if ([self.images objectAtIndex:cameraNumberNumber] == [NSNull null]) {
                    image *cameraImage = [[image alloc] init];
                    [self.images replaceObjectAtIndex:cameraNumberNumber withObject:cameraImage];
                }
                image *cameraImage = [self.images objectAtIndex:cameraNumberNumber];
                [cameraImage.imageData appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
                cameraImage.cameraNumber = cameraNumberString;
                if (!imageBuffer) {
                    imageBuffer = [[NSMutableData alloc] init];
                }
                [imageBuffer appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
                DDLogVerbose(@"Total Data Length is %lu", (unsigned long)[imageBuffer length]);
            } else {
                DDLogInfo(@"Wrong camera quantity!");
                NSAlert *testAlert = [NSAlert alertWithMessageText:@"Wrong camera quantity!"
                                                     defaultButton:@"Ok"
                                                   alternateButton:nil
                                                       otherButton:nil
                                         informativeTextWithFormat:@"We have received more images than cameras, please set No.Cameras correctly!"];
                [testAlert beginSheetModalForWindow:[self window]
                                      modalDelegate:self
                                     didEndSelector:@selector(stop)
                                        contextInfo:nil];
            }
            [sock readDataToData:[@"end" dataUsingEncoding:NSUTF8StringEncoding] withTimeout:-1 tag:cameraNumberNumber + 11];
        }
    });
}
And here is the socketDidDisconnect code; a lot of things in here don't make sense out of context, but it shows how I handled the received data.
- (void)socketDidDisconnect:(GCDAsyncSocket *)sock withError:(NSError *)err
{
    if (sock != listenSocket)
    {
        dispatch_async(dispatch_get_main_queue(), ^{
            @autoreleasepool {
                totalCamerasFetched = [NSNumber numberWithInt:1 + [totalCamerasFetched intValue]];
                if ([totalCamerasFetched integerValue] >= [numberOfCameras integerValue]) {
                    for (image *cameraImage in self.images) {
                        NSData *cameraNumberData;
                        NSData *imageNumberData;
                        NSData *imageData;
                        NSData *endCommandData;
                        NSInteger cameraNumberNumber = 0;
                        cameraNumberData = [cameraImage.imageData subdataWithRange:NSMakeRange(0, 16)];
                        imageNumberData = [cameraImage.imageData subdataWithRange:NSMakeRange(16, 16)];
                        imageData = [cameraImage.imageData subdataWithRange:NSMakeRange(32, [cameraImage.imageData length]-32)];
                        endCommandData = [cameraImage.imageData subdataWithRange:NSMakeRange([cameraImage.imageData length]-16, 16)];
                        NSString *cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
                        cameraNumberString = [cameraNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                        NSString *imageNumberString = [[NSString alloc] initWithData:imageNumberData encoding:NSUTF8StringEncoding];
                        imageNumberString = [imageNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                        NSString *endCommandString = [[NSString alloc] initWithData:endCommandData encoding:NSUTF8StringEncoding];
                        NSImage *image = [[NSImage alloc] initWithData:imageData];
                        cameraNumberNumber = [cameraNumberString intValue] - 1;
                        if (cameraNumberString)
                        {
                            DDLogInfo(@"Image received from Camera no %@", cameraNumberString);
                        }
                        else
                        {
                            DDLogError(@"No Camera number in data");
                        }
                        if (imageNumberString)
                        {
                            DDLogInfo(@"Image is number %@", imageNumberString);
                        }
                        else
                        {
                            DDLogError(@"No Image number in data");
                        }
                        if (image)
                        {
                            DDLogVerbose(@"We have an image");
                            NSString *dataPath = [[NSString alloc] initWithFormat:@"%@/image%@/", self.exportLocation, imageNumberString];
                            if (![[NSFileManager defaultManager] fileExistsAtPath:dataPath]) {
                                NSError *error;
                                [[NSFileManager defaultManager] createDirectoryAtPath:dataPath withIntermediateDirectories:NO attributes:nil error:&error];
                                if (error)
                                {
                                    DDLogError(@"[%@] ERROR: attempting to write directory for images", [self class]);
                                    NSAssert(FALSE, @"Failed to create directory maybe out of disk space?");
                                }
                            }
                            NSString *dataPathVideo = [[NSString alloc] initWithFormat:@"%@/video%@/", self.exportLocation, imageNumberString];
                            if (![[NSFileManager defaultManager] fileExistsAtPath:dataPathVideo]) {
                                NSError *error;
                                [[NSFileManager defaultManager] createDirectoryAtPath:dataPathVideo withIntermediateDirectories:NO attributes:nil error:&error];
                                if (error)
                                {
                                    DDLogError(@"[%@] ERROR: attempting to write directory for images", [self class]);
                                    NSAssert(FALSE, @"Failed to create directory maybe out of disk space?");
                                }
                            }
                            NSString *exportLocationFull = [[NSString alloc] initWithFormat:@"%@/image%@/camera_%@.jpg", self.exportLocation, imageNumberString, cameraNumberString];
                            DDLogInfo(@"Full export URL = %@", exportLocationFull);
                            [imageData writeToFile:exportLocationFull atomically:YES];
                            self.currentSet = [NSNumber numberWithInt:[imageNumberString intValue]];
                            NSImage *imageToStore = [[NSImage alloc] initWithData:imageData];
                            [self.imagesToMakeVideo replaceObjectAtIndex:cameraNumberNumber withObject:imageToStore];
                        } else {
                            DDLogError(@"No image located in data");
                        }
                        if (endCommandString)
                        {
                            DDLogVerbose(@"Command String is %@", endCommandString);
                            //[self logMessage:endCommandString];
                        }
                        else
                        {
                            //[self logError:@"No command string"];
                        }
                        self.imageBuffer = nil;
                    } // end of for loop
                    self.totalCamerasFetched = [NSNumber numberWithInt:0];
                    [self loadandDisplayLatestImages];
                    [self createVideowithImages:imagesToMakeVideo toLocation:[[NSString alloc] initWithFormat:@"%@/video%@/image_sequence_%@.mov", self.exportLocation, self.currentSet, self.currentSet]];
                    processing = false;
                }
            }
        });
        @synchronized(connectedSockets)
        {
            [connectedSockets removeObject:sock];
        }
    }
}
Also, here is how I modified the Python code to add the extra "end" tag.
def send_media_to(self, ip, port, media_name, media_number, media_dir):
    camera_number = self.camera.current_mode['option'].number
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((ip, port))
    try:
        sock.send(bytes(str(camera_number).ljust(16), 'utf-8'))
        sock.send(bytes(str(media_number).ljust(16), 'utf-8'))
        with open(media_dir + media_name, 'rb') as media:
            sock.sendall(media.read())
    finally:
        sock.send(bytes(str("end").ljust(16), 'utf-8'))
        sock.close()
Hopefully this helps someone else stuck in the same situation!

How to get an array of images from the network faster? (iOS)

Basically, I have an array of URLs as strings, and as I loop through this array, if an element is a URL for an image, I want to turn that URL into a UIImage object and add it to another array. This is very slow, though, since I have to request the data for each URL. I've tried using dispatch_async as I show below, but it doesn't seem to make any difference at all.
The key is that when I add these objects to my other array, whether they are images or something else, they have to stay in order. Can anyone offer any guidance?
dispatch_group_t group = dispatch_group_create();
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
for (int i = 0; i < [slides count]; i++) {
    __block NSString *mediaURLString = [primaryPhoto objectForKey:@"url"];
    if ([self mediaIsVideo:mediaURLString]) {
        ***some code***
    }
    else { //if media is an image
        dispatch_group_async(group, queue, ^{
            mediaURLString = [mediaURLString stringByAppendingString:@"?w=1285&h=750&q=150"];
            NSURL *url = [NSURL URLWithString:mediaURLString];
            [mutableMedia addObject:url];
            NSURL *url = ((NSURL *)self.mediaItem);
            NSURLRequest *request = [NSURLRequest requestWithURL:url];
            NSURLResponse *response;
            NSError *error;
            NSData *urlData = [NSURLConnection sendSynchronousRequest:request returningResponse:&response error:&error];
            UIImage *image = [[UIImage alloc] initWithData:urlData];
            [mutableMedia replaceObjectAtIndex:i withObject:image];
        });
    }
}
dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
Try this:
[self performSelectorInBackground:@selector(WebServiceCallMethod) withObject:nil];
and create a method like this:
-(void)WebServiceCallMethod
{
    mediaURLString = [mediaURLString stringByAppendingString:@"?w=1285&h=750&q=150"];
    NSURL *url = [NSURL URLWithString:mediaURLString];
    [mutableMedia addObject:url];
    NSURL *url = ((NSURL *)self.mediaItem);
    NSURLRequest *request = [NSURLRequest requestWithURL:url];
    NSURLResponse *response;
    NSError *error;
    NSData *urlData = [NSURLConnection sendSynchronousRequest:request returningResponse:&response error:&error];
    UIImage *image = [[UIImage alloc] initWithData:urlData];
    [mutableMedia replaceObjectAtIndex:i withObject:image];
}
Hope it Helps!!
Do yourself a favor and don't use +sendSynchronousRequest:... Try something like this instead:
dispatch_group_t group = dispatch_group_create();
for (int i = 0; i < [slides count]; i++)
{
    __block NSString *mediaURLString = [primaryPhoto objectForKey:@"url"];
    if ([self mediaIsVideo:mediaURLString]) {
        ***some code***
    }
    else
    {
        //if media is an image
        mediaURLString = [mediaURLString stringByAppendingString:@"?w=1285&h=750&q=150"];
        NSURL *url = [NSURL URLWithString:mediaURLString];
        [mutableMedia addObject:url];
        NSURL *url = ((NSURL *)self.mediaItem);
        NSURLRequest *request = [NSURLRequest requestWithURL:url];
        dispatch_group_enter(group);
        [NSURLConnection sendAsynchronousRequest:request queue:[NSOperationQueue mainQueue] completionHandler:
            ^(NSURLResponse *response, NSData *data, NSError *connectionError)
            {
                if (data.length && nil == connectionError)
                {
                    UIImage *image = [[UIImage alloc] initWithData:data];
                    [mutableMedia replaceObjectAtIndex:i withObject:image];
                }
                dispatch_group_leave(group);
            }];
    }
}
dispatch_group_notify(group, dispatch_get_main_queue(), ^{
    // Do stuff here that you want to have happen after all the images are loaded.
});
This will start asynchronous requests for all your URLs. When each request finishes, it will run its completion handler which will update your array, and when all requests have finished, the block in the dispatch_group_notify call will be executed.
This approach has the advantage that you can call it from the main thread; all the individual completion blocks will run on the main thread (thus ensuring thread safety for the mutableMedia array, at least as far as this code goes), and the final completion block will also run on the main thread, so you can do whatever you need to update the UI directly.
There is a nifty solution using the dispatch library. The code below should stand on its own.
The basic idea is that an array contains "input" objects which each will be "transformed" via an asynchronous unary task - one after the other. The final result of the whole operation is an array of the transformed objects.
Everything here is asynchronous. Every eventual result will be passed in a completion handler which is a block where the result is passed as a parameter to the call-site:
typedef void (^completion_t)(id result);
The asynchronous transform function is a block which takes the input as a parameter and returns a new object - via a completion handler:
typedef void (^unary_async_t)(id input, completion_t completion);
Now, the function transformEach takes the input values as an NSArray parameter inArray, the transform block as parameter transform and the completion handler block as parameter completion:
static void transformEach(NSArray* inArray, unary_async_t transform, completion_t completion);
The implementation is as follows:
static void do_each(NSEnumerator* iter, unary_async_t transform,
                    NSMutableArray* outArray, completion_t completion)
{
    id obj = [iter nextObject];
    if (obj == nil) {
        if (completion)
            completion([outArray copy]);
        return;
    }
    transform(obj, ^(id result){
        [outArray addObject:result];
        do_each(iter, transform, outArray, completion);
    });
}

static void transformEach(NSArray* inArray, unary_async_t transform,
                          completion_t completion) {
    NSMutableArray* outArray = [[NSMutableArray alloc] initWithCapacity:[inArray count]];
    NSEnumerator* iter = [inArray objectEnumerator];
    do_each(iter, transform, outArray, completion);
}
Building and running the following example
int main(int argc, const char * argv[])
{
    @autoreleasepool {
        // Example transform:
        unary_async_t capitalize = ^(id input, completion_t completion) {
            dispatch_async(dispatch_get_global_queue(0, 0), ^{
                sleep(1);
                if ([input respondsToSelector:@selector(capitalizedString)]) {
                    NSLog(@"processing: %@", input);
                    NSString* result = [input capitalizedString];
                    if (completion)
                        completion(result);
                }
            });
        };
        transformEach(@[@"a", @"b", @"c"], capitalize, ^(id result){
            NSLog(@"Result: %@", result);
        });
        sleep(10);
    }
    return 0;
}
will print this to the console:
2013-07-31 15:52:49.786 Sample2[1651:1603] processing: a
2013-07-31 15:52:50.789 Sample2[1651:1603] processing: b
2013-07-31 15:52:51.792 Sample2[1651:1603] processing: c
2013-07-31 15:52:51.793 Sample2[1651:1603] Result: (
A,
B,
C
)
You can easily create a category for NSArray which implements, say a
-(void) asyncTransformEachWithTransform:(unary_async_t)transform
completion:(completion_t)completionHandler;
method.
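A minimal sketch of such a category, assuming it lives in the same file as the static transformEach function above (the category name AsyncTransform is mine):
@interface NSArray (AsyncTransform)
- (void)asyncTransformEachWithTransform:(unary_async_t)transform
                             completion:(completion_t)completionHandler;
@end

@implementation NSArray (AsyncTransform)
- (void)asyncTransformEachWithTransform:(unary_async_t)transform
                             completion:(completion_t)completionHandler
{
    // self is the input array; delegate to the free function defined above.
    transformEach(self, transform, completionHandler);
}
@end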
Have fun! ;)
Edit:
If you ask how this applies to your problem:
The array of URLs is the input array. In order to create the transform block, simply wrap your asynchronous network request in an asynchronous method, say:
`-(void) fetchImageWithURL:(NSURL*)url completion:(completion_t)completionHandler;`
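A possible body for that wrapper is sketched below; this is an assumption on my part (the answer leaves the implementation open), reusing the NSURLConnection API from the earlier answer. Note that the collector above adds whatever the completion hands it, so a failed fetch gets an NSNull placeholder rather than nil.
- (void)fetchImageWithURL:(NSURL *)url completion:(completion_t)completionHandler
{
    NSURLRequest *request = [NSURLRequest requestWithURL:url];
    [NSURLConnection sendAsynchronousRequest:request
                                       queue:[NSOperationQueue mainQueue]
                           completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
        // Hand back a UIImage on success, or NSNull so the result array keeps its shape.
        UIImage *image = (data.length && error == nil) ? [UIImage imageWithData:data] : nil;
        id result = image ? (id)image : (id)[NSNull null];
        if (completionHandler)
            completionHandler(result);
    }];
}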
Then wrap the method fetchImageWithURL:completion: into an appropriate transform block:
unary_async_t fetchImage = ^(id url, completion_t completion) {
    [self fetchImageWithURL:url completion:^(id image){
        if (completion)
            completion(image); // return result of fetch request
    }];
};
Then, somewhere in your code (possibly a view controller), assuming you implemented the category for NSArray and your array of URLs is the property urls:
// get the images
[self.urls asyncTransformEachWithTransform:fetchImage completion:^(id arrayOfImages) {
    // do something with the array of images
}];

How to load several photos with assetForURL with a list of URLs

For a list of URLs I need to load the photos with ALAssetsLibrary's assetForURL:, all within one method.
This method works asynchronously and does not iterate over the passed list of URLs, as we all know.
I found this snippet (which should work):
- (void)loadImages:(NSArray *)imageUrls loadedImages:(NSArray *)loadedImages callback:(void (^)(NSArray *))callback
{
    if (imageUrls == nil || [imageUrls count] == 0) {
        callback(loadedImages);
    }
    else {
        NSURL *head = [imageUrls head];
        __unsafe_unretained id unretained_self = self;
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:head resultBlock:^(ALAsset *asset) {
            ALAssetRepresentation *assetRepresentation = asset.defaultRepresentation;
            UIImage *image = [UIImage imageWithCGImage:assetRepresentation.fullResolutionImage scale:assetRepresentation.scale orientation:(UIImageOrientation)assetRepresentation.orientation];
            [unretained_self loadImages:[imageUrls tail] loadedImages:[loadedImages arrayByAddingObject:image] callback:callback];
        } failureBlock:^(NSError *error) {
            [unretained_self loadImages:[imageUrls tail] loadedImages:loadedImages callback:callback];
        }];
    }
}
How do I write the method definition in the form (above all, the callback part)
void loadImages(NSArray *imageUrls, NSArray *loadedImages, ...)?
How do I call this method from another method (again, mainly the callback part)?
Can the callback live in the calling method, or is a third method needed for this? And how does that method need to be written?
I have found the snippet here: http://www.calebmadrigal.com/functional-programming-deal-asynchronicity-objective-c/
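For reference, this is how the snippet above would be declared and called (a sketch: the property name imageUrls, the empty-array seed, and the block body are mine for illustration). The callback is simply the last block parameter of the method, and it can live inline at the call site; no third method is needed.
// Declaration, e.g. in the class extension:
- (void)loadImages:(NSArray *)imageUrls
      loadedImages:(NSArray *)loadedImages
          callback:(void (^)(NSArray *images))callback;

// Call from another method, seeding loadedImages with an empty array:
[self loadImages:self.imageUrls loadedImages:@[] callback:^(NSArray *images) {
    // Every URL has been resolved (or skipped on failure) by the time this runs.
    NSLog(@"Loaded %lu images", (unsigned long)images.count);
}];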
Use NSThread to call the loadImages method.
[NSThread detachNewThreadSelector:@selector(loadImages:)
                         toTarget:self
                       withObject:imageUrlsCollection];
- (NSMutableArray *)loadImages:(NSArray *)imageUrls
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    NSMutableArray *loadedImages = [[NSMutableArray alloc] init];
    @try
    {
        for (int index = 0; index < [imageUrls count]; index++)
        {
            NSURL *url = [imageUrls objectAtIndex:index];
            [library assetForURL:url resultBlock:^(ALAsset *asset) {
                ALAssetRepresentation *assetRepresentation = asset.defaultRepresentation;
                dispatch_async(dispatch_get_main_queue(), ^{
                    UIImage *image = [UIImage imageWithCGImage:assetRepresentation.fullResolutionImage scale:assetRepresentation.scale orientation:(UIImageOrientation)assetRepresentation.orientation];
                    [loadedImages addObject:image];
                });
            } failureBlock:^(NSError *error) {
                NSLog(@"Failed to get Image");
            }];
        }
    }
    @catch (NSException *exception)
    {
        NSLog(@"%s\n exception: Name- %@ Reason->%@", __PRETTY_FUNCTION__, [exception name], [exception reason]);
    }
    @finally
    {
        return loadedImages;
    }
}
Note: With ARC, take care about the "invalid attempt to access ALAssetPrivate past the lifetime of its owning ALAssetsLibrary" issue.
Here is the fix :)
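The usual remedy for that error (stated here as general guidance, since the linked fix isn't reproduced above) is to keep the ALAssetsLibrary alive for as long as you use the ALAsset objects it returned, for example through one long-lived instance instead of a fresh library per call:
// ALAssets are only valid while their owning ALAssetsLibrary is still around,
// so hand out a single shared instance instead of creating one per request.
+ (ALAssetsLibrary *)sharedAssetsLibrary
{
    static ALAssetsLibrary *library = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        library = [[ALAssetsLibrary alloc] init];
    });
    return library;
}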

Caching UIImage as NSData and getting it back

I'm trying to cache images I load from Flickr. If I load the same image again, I'm hoping to use the cached version instead. For some reason, when I download an image from the internet it works, but if I load it from my cache, it displays a blank image. I checked cacheData, and it has the same number of bits as the image I put in, so it appears loading the file is working.
Here is how I cache images:
+ (void)cachePhoto:(NSData *)photo withKey:(NSString *)key {
    if (photo) {
        NSArray *urlArray = [fileManager URLsForDirectory:NSCachesDirectory inDomains:NSUserDomainMask];
        NSURL *targetDirectory = (NSURL *)[urlArray objectAtIndex:0];
        targetDirectory = [targetDirectory URLByAppendingPathComponent:key];
        [photo writeToURL:targetDirectory atomically:YES];
        [cachedPhotos addObject:key];
        NSLog(@"target url %@", targetDirectory);
    }
}

+ (NSData *)photoInCache:(NSString *)key {
    if ([cachedPhotos containsObject:key]) {
        NSString *path = [[cacheDirectory URLByAppendingPathComponent:key] path];
        NSLog(@"path: %@", path);
        return [fileManager contentsAtPath:path];
    } else {
        return nil;
    }
}
And my code to get it back:
NSData *cacheData = [PhotoCache photoInCache:key];
if (cacheData) {
    self.imageView.image = [UIImage imageWithData:cacheData];
    [spinner stopAnimating];
    NSLog(@"used cached image");
} else {
    dispatch_queue_t downloadQueue = dispatch_queue_create("get photo from flickr", NULL);
    dispatch_async(downloadQueue, ^{
        NSData *imageData = [[NSData alloc] initWithContentsOfURL:url];
        dispatch_async(dispatch_get_main_queue(), ^{
            [spinner stopAnimating];
            self.imageView.image = [UIImage imageWithData:imageData];
            [PhotoCache cachePhoto:imageData withKey:key];
        });
    });
}
I figured it out - I was loading the image into the image view in my prepareForSegue, where the image view was not yet loaded. I loaded the image in viewDidLoad instead and it worked. I also had to use UIImagePNGRepresentation, as suggested in the comments.
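For reference, a minimal sketch of the caching-side change described above (my illustration, not the asker's final code): convert the UIImage to PNG data before handing it to the cache, and set the image view in viewDidLoad.
// In viewDidLoad (not prepareForSegue), once the image view actually exists:
UIImage *downloadedImage = [UIImage imageWithData:imageData];
self.imageView.image = downloadedImage;
// Cache a well-formed PNG payload rather than the raw download.
NSData *pngData = UIImagePNGRepresentation(downloadedImage);
[PhotoCache cachePhoto:pngData withKey:key];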