iOS: how can I refresh the UI after an async TBXML call - objective-c

I'm using TBXML to parse an XML file fetched over HTTP and display the contents in a UILabel and a UIImageView.
The XML is loaded with an async request.
When I look at the logs, the last log statement in the success block is printed immediately, but the changes to the UILabel and UIImageView only become visible a few seconds later.
How can I make iOS refresh the UI right after the XML has finished processing?
// Create a success block to be called when the async request completes
TBXMLSuccessBlock successBlock = ^(TBXML *tbxmlDocument) {
// If TBXML found a root node, process element and iterate all children
NSLog(#"PROCESSING ASYNC CALLBACK");
if (tbxmlDocument.rootXMLElement)
[self traverseElement:tbxmlDocument.rootXMLElement];
myArticle.Body = [[StringCleaner sharedInstance] cleanedString:myArticle.Body];
// myArticle.Body = [myArticle.Body stringByConvertingHTMLToPlainText];
self.articleBody.text = myArticle.Body;
self.articleBody.numberOfLines= 0;
self.articleBody.lineBreakMode = UILineBreakModeWordWrap;
[self.articleBody sizeToFit];
// set scroll view size
self.articleBodyScrollView.contentSize = CGSizeMake(self.articleBodyScrollView.contentSize.width, self.articleBody.frame.size.height);
NSURL *url = [NSURL URLWithString:myArticle.Photo];
NSData *data = [NSData dataWithContentsOfURL:url];
if (data != NULL)
{
UIImage *image = [UIImage imageWithData:data];
// articlePhoto = [[UIImageView alloc] initWithImage:image];
[self.articlePhoto setImage:image];
}else {
NSLog(#"no data");
}
NSLog(#"FINISHED PROCESSING ASYNC");
// [self printArticles];
};
// Create a failure block that gets called if something goes wrong
TBXMLFailureBlock failureBlock = ^(TBXML *tbxmlDocument, NSError * error) {
NSLog(#"Error! %# %#", [error localizedDescription], [error userInfo]);
};
// tbxml = [[TBXML alloc] initWithURL:[NSURL URLWithString:someXML]];
tbxml = [[TBXML alloc] initWithURL:[NSURL URLWithString:records]
success:successBlock
failure:failureBlock];

Sounds like you're trying to update the UI, but not on the main (UI) thread. Wrap your UILabel and UIImageView updates in a dispatch_async onto the main queue, e.g.:
dispatch_async(dispatch_get_main_queue(), ^
{
[self.articlePhoto setImage:image];
});
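As a fuller sketch (assuming the TBXML success block can be invoked on a background thread, and keeping the slow synchronous image download off the main thread), the success block could look roughly like this:
TBXMLSuccessBlock successBlock = ^(TBXML *tbxmlDocument) {
    if (tbxmlDocument.rootXMLElement)
        [self traverseElement:tbxmlDocument.rootXMLElement];
    // Keep the synchronous download on the background thread
    NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:myArticle.Photo]];
    UIImage *image = data ? [UIImage imageWithData:data] : nil;
    // Touch UIKit only on the main queue
    dispatch_async(dispatch_get_main_queue(), ^{
        self.articleBody.text = myArticle.Body;
        [self.articleBody sizeToFit];
        if (image) {
            [self.articlePhoto setImage:image];
        }
    });
};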

Related

How to open a PowerPoint file with PSPDFKit?

NSURL *documentURL = [[[NSBundle mainBundle] resourceURL] URLByAppendingPathComponent:@"item_70_1_3.ppt"];
PSPDFDocument *document = [PSPDFDocument documentWithURL:documentURL];
NSURL *tempURL = PSPDFTempFileURLWithPathExtension(@"flattened_signaturetest", @"pdf");
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
[[PSPDFProcessor defaultProcessor] generatePDFFromDocument:document pageRange:[NSIndexSet indexSetWithIndexesInRange:NSMakeRange(0, document.pageCount)] outputFileURL:tempURL options:@{kPSPDFProcessorAnnotationTypes : @(PSPDFAnnotationTypeAll)} progressBlock:^(NSUInteger currentPage, NSUInteger numberOfProcessedPages, NSUInteger totalPages) {
// Access UI only from main thread.
dispatch_async(dispatch_get_main_queue(), ^{
[PSPDFProgressHUD showProgress:(numberOfProcessedPages+1)/(float)totalPages status:PSPDFLocalize(@"Preparing...")];
});
} error:NULL];
// completion
dispatch_async(dispatch_get_main_queue(), ^{
[PSPDFProgressHUD dismiss];
PSPDFDocument *flattenedDocument = [PSPDFDocument documentWithURL:tempURL];
PSPDFViewController *pdfController = [[PSPDFViewController alloc] initWithDocument:flattenedDocument];
UINavigationController *navController = [[UINavigationController alloc] initWithRootViewController:pdfController];
[self presentViewController:navController animated:YES completion:NULL];
});
});
I get empty content using the above code :(
The PSPDFProcessor feature is experimental. Please check whether the PPT file works when you open it with Safari; the processor mostly uses Apple's libraries, and they can fail for certain files. The code here looks like it's simply copied from my examples in PSPDFCatalog and is thus fine.
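One quick way to check Apple's rendering path (a hypothetical check, not part of PSPDFKit) is to load the same file into a UIWebView, which uses the same system converters:
// Sanity check: can Apple's own renderer display this PPT at all?
NSURL *documentURL = [[[NSBundle mainBundle] resourceURL] URLByAppendingPathComponent:@"item_70_1_3.ppt"];
UIWebView *webView = [[UIWebView alloc] initWithFrame:self.view.bounds];
[webView loadRequest:[NSURLRequest requestWithURL:documentURL]];
[self.view addSubview:webView];
// If the web view is also blank, the source file (not PSPDFProcessor) is the likely problem.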

Loading an image from a URL but displaying it progressively

I have a screen that will load around 5 images, but they are huge images. Right now I use an NSURLRequest and a connectionDidFinishLoading callback to tell me when each image has loaded.
The problem is that the images pop in one by one. Is there a way to display each image progressively while it is loading?
Thanks
The guts of what you need are available as CGImageSource functions.
First, you use an asynchronous NSURLConnection to get the data. You append the received data to an NSMutableData object as it arrives, so the data object grows until the download is finished.
You also create an incremental image source:
CGImageSourceRef imageSourceRef = CGImageSourceCreateIncremental(dict);
You will find lots of examples here and on Google of how to set up the required options dictionary.
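For instance, a minimal sketch of creating the incremental source (the options dictionary shown here is just one plausible configuration; passing NULL also works):
// Hint the expected image type so decoding can start as early as possible
NSDictionary *options = @{(__bridge id)kCGImageSourceTypeIdentifierHint : @"public.jpeg"};
CGImageSourceRef imageSourceRef = CGImageSourceCreateIncremental((__bridge CFDictionaryRef)options);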
Then as the data arrives, you pass the TOTAL data object into this method:
CGImageSourceUpdateData(imageSourceRef, (__bridge CFDataRef)data, NO); // NO means not finished
You can then ask the image source for an image, which will be partial as the image is downloading. With a CGImage you can create a UIImage.
When you get the final data, you update the image source one last time:
CGImageSourceUpdateData(imageSourceRef, (__bridge CFDataRef)data, YES);
You then use the image source to get a final image and you're done.
As for displaying it while loading: I don't think UIImageView can render a UIImage from incomplete data. I would go for AsyncImageView instead; it takes care of all the burden of loading the image asynchronously, and a UIActivityIndicatorView is already built in, so it is more user friendly.
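For reference, a rough sketch of how AsyncImageView is typically used (assuming the library's UIImageView category, which adds an imageURL property; the URL is illustrative):
#import "AsyncImageView.h"
// Setting imageURL starts the asynchronous download and shows the built-in spinner
imageView.imageURL = [NSURL URLWithString:@"http://example.com/large-photo.jpg"];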
Use blocks and GCD's dispatch_async method.
Look at this example:
//communityDetailViewController.h
@interface communityDetailViewController : UIViewController {
UIImageView *imgDisplay;
UIActivityIndicatorView *activity;
// the dispatch queue to load images
dispatch_queue_t queue;
}
@end
//communityDetailViewController.m
- (void)loadImage
{
[activity startAnimating];
NSString *url = #"URL the image";
if (!queue) {
queue = dispatch_queue_create("image_queue", NULL);
}
dispatch_async(queue, ^{
NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:url]];
UIImage *anImage = [UIImage imageWithData:data];
dispatch_async(dispatch_get_main_queue(), ^{
[activity stopAnimating];
activity.hidden = YES;
if (anImage != nil) {
[imgDisplay setImage:anImage];
}else{
[imgDisplay setImage:[UIImage imageNamed:@"no_image_available.png"]];
}
});
});
}
You can subclass UIImageView and use this.
-(void)connection:(NSURLConnection*)connection didReceiveResponse:(NSURLResponse*)response
{
imageData = [NSMutableData data];
imageSize = [response expectedContentLength];
imageSource = CGImageSourceCreateIncremental(NULL);
}
-(void)connection:(NSURLConnection*)connection didReceiveData:(NSData*)data
{
[imageData appendData:data];
CGImageSourceUpdateData(imageSource, (__bridge CFDataRef)imageData, ([imageData length] == imageSize) ? true : false);
CGImageRef cgImage = CGImageSourceCreateImageAtIndex(imageSource, 0, NULL);
if (cgImage){
UIImage* img = [[UIImage alloc] initWithCGImage:cgImage scale:1.0f orientation:UIImageOrientationUp];
dispatch_async( dispatch_get_main_queue(), ^{
self.image = img;
});
CGImageRelease(cgImage);
}
}
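To round out the subclass (a hedged sketch of the remaining pieces, not part of the original answer; it assumes imageData, imageSize and imageSource are ivars of the UIImageView subclass):
// Kick off the download (hypothetical helper name)
- (void)startLoadingImageFromURL:(NSURL *)url
{
    [NSURLConnection connectionWithRequest:[NSURLRequest requestWithURL:url] delegate:self];
}
- (void)connectionDidFinishLoading:(NSURLConnection *)connection
{
    // Mark the image source as complete and extract the final image
    CGImageSourceUpdateData(imageSource, (__bridge CFDataRef)imageData, true);
    CGImageRef cgImage = CGImageSourceCreateImageAtIndex(imageSource, 0, NULL);
    if (cgImage) {
        UIImage *img = [[UIImage alloc] initWithCGImage:cgImage scale:1.0f orientation:UIImageOrientationUp];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.image = img;
        });
        CGImageRelease(cgImage);
    }
    CFRelease(imageSource);
}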

Caching UIImage as NSData and getting it back

I'm trying to cache images I load from Flickr. If I load the same image again, I'm hoping to use the cached version instead. For some reason, when I download an image from the internet it works, but if I load it from my cache it displays a blank image. I checked cacheData, and it has the same number of bytes as the image I put in, so loading the file appears to be working.
Here is how I cache images:
+ (void)cachePhoto:(NSData *)photo withKey:(NSString *)key {
if (photo) {
NSArray * urlArray = [fileManager URLsForDirectory:NSCachesDirectory inDomains:NSUserDomainMask];
NSURL * targetDirectory = (NSURL *)[urlArray objectAtIndex:0];
targetDirectory = [targetDirectory URLByAppendingPathComponent:key];
[photo writeToURL:targetDirectory atomically:YES];
[cachedPhotos addObject:key];
NSLog(#"target url %#", targetDirectory);
}
}
+ (NSData *)photoInCache:(NSString *)key {
if ([cachedPhotos containsObject:key]) {
NSString * path = [[cacheDirectory URLByAppendingPathComponent:key] path];
NSLog(#"path: %#", path);
return [fileManager contentsAtPath:path];
} else {
return nil;
}
}
And my code to get it back:
NSData * cacheData = [PhotoCache photoInCache:key];
if (cacheData) {
self.imageView.image = [UIImage imageWithData:cacheData];
[spinner stopAnimating];
NSLog(#"used cached image");
} else {
dispatch_queue_t downloadQueue = dispatch_queue_create("get photo from flickr", NULL);
dispatch_async(downloadQueue, ^{
NSData * imageData = [[NSData alloc] initWithContentsOfURL:url];
dispatch_async(dispatch_get_main_queue(), ^{
[spinner stopAnimating];
self.imageView.image = [UIImage imageWithData:imageData];
[PhotoCache cachePhoto:imageData
withKey:key];
});
});
}
I figured it out: I was setting the image on the image view in prepareForSegue:, where the image view outlet was not yet loaded. Setting the image in viewDidLoad instead made it work. I also had to use UIImagePNGRepresentation, as suggested in the comments.
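A minimal sketch of the two fixes combined (the downloadedImage and key names below are illustrative, not from the original code):
// Cache the PNG bytes rather than whatever encoding came over the wire
NSData *pngData = UIImagePNGRepresentation(downloadedImage);
[PhotoCache cachePhoto:pngData withKey:key];
// ...and only touch the image view once its outlet exists
- (void)viewDidLoad {
    [super viewDidLoad];
    NSData *cacheData = [PhotoCache photoInCache:self.key];
    if (cacheData) {
        self.imageView.image = [UIImage imageWithData:cacheData];
    }
}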

Array with UIImageView elements - setImageWithURL

I have this code:
arrayWithImages = [[NSMutableArray alloc] init];
NSEnumerator *enumForNames = [arrayWithNames objectEnumerator];
NSEnumerator *enumForURLs = [arrayWithURLs objectEnumerator];
id objName, objURL;
while(objName = [enumForNames nextObject]) {
objURL = [enumForURLs nextObject];
UIImageView *anImage = nil;
[anImage setImageWithURL:[NSURL URLWithString:objURL]];
(...)
[arrayWithImages addObject:anImage];
}
And each time I get a SIGABRT on the line [arrayWithImages addObject:anImage];
What's wrong here?
I don't see a setImageWithURL method on UIImageView. Where is this from?
Is there any output from the SIGABRT crash?
Try this code:
// Download the image
NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:objURL]];
// Make an UIImage out of it
UIImage *anImage = [UIImage imageWithData:imageData];
// Create an image view with the image
UIImageView *imageView = [[UIImageView alloc] initWithImage:anImage];
// Check to make sure it exists
if (imageView != nil) {
// Add it to your array
[arrayWithImages addObject:imageView];
} else {
NSLog(@"Image view is nil");
}
Note that you should download the images asynchronously to avoid blocking the thread while the loop runs. This blog post discusses asynchronous image downloading.
Also, if you know that [enumForURLs nextObject] will return an NSString (or, even better, an NSURL), you should declare objURL as NSString (or NSURL) rather than id.
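A rough sketch of moving the downloads onto a background GCD queue (the queue label and completion handling here are illustrative):
dispatch_queue_t downloadQueue = dispatch_queue_create("image_download_queue", NULL);
for (NSString *objURL in arrayWithURLs) {
    dispatch_async(downloadQueue, ^{
        // Download off the main thread
        NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:objURL]];
        UIImage *anImage = [UIImage imageWithData:imageData];
        dispatch_async(dispatch_get_main_queue(), ^{
            // Create and store the image view back on the main thread
            if (anImage != nil) {
                [arrayWithImages addObject:[[UIImageView alloc] initWithImage:anImage]];
            }
        });
    });
}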

Why aren't variables retained outside Asset-Library block?

I've come across a weird issue that I've spent ages working on.
I'm basically trying to get the file path of a photo out of the asset library and draw the photo onto a PDF page with the code below:
In the .h file:
@property (strong, nonatomic) UIImage *pdfSnagImage;
In the .m file:
NSURL *url = [[NSURL alloc] initWithString:pdf.photo];
ALAssetsLibrary* library = [[ALAssetsLibrary alloc] init];
[library assetForURL:url resultBlock:^(ALAsset *asset) {
ALAssetRepresentation *rep = [asset defaultRepresentation];
self.filename = [rep filename];
NSLog(#"filename for image is: %#", self.filename);
CGImageRef iref = [rep fullResolutionImage];
if (iref) {
self.pdfImage = [UIImage imageWithCGImage:iref];
NSLog(#"image height %f", self.pdfImage.size.height);
}
} failureBlock:^(NSError *error) {
NSLog(#"Couldn't load asset %# => %#", error, [error localizedDescription]);
}];
UIImage *testImage = self.pdfImage;
[testImage drawInRect:CGRectMake( (pageSize.width - testImage.size.width/2)/2, 350, testImage.size.width/2, testImage.size.height/2)];
What's happening is that the line after the block, where testImage is assigned, actually runs before the block does, so testImage is nil, because self.pdfImage is only set inside the block.
If I put those lines of code inside the block, I get an "Invalid Context 0x0" error from the drawing call.
How on earth can I assign self.pdfImage first and then draw it?
The block is executed asynchronously, on a separate thread; that is how the Assets Library API is designed. Anything that depends on the asset (including the PDF drawing) therefore has to run after the result block fires, not on the lines that follow the assetForURL: call.
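One way to restructure this (a sketch with a hypothetical helper name; it assumes the PDF page/context is opened inside the completion, and it reuses pdf.photo and pageSize from the question):
// Hypothetical helper: load the asset image, then call back on the main queue
- (void)loadPDFImageWithCompletion:(void (^)(UIImage *image))completion
{
    NSURL *url = [[NSURL alloc] initWithString:pdf.photo];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:url resultBlock:^(ALAsset *asset) {
        CGImageRef iref = [[asset defaultRepresentation] fullResolutionImage];
        UIImage *image = iref ? [UIImage imageWithCGImage:iref] : nil;
        dispatch_async(dispatch_get_main_queue(), ^{ completion(image); });
    } failureBlock:^(NSError *error) {
        dispatch_async(dispatch_get_main_queue(), ^{ completion(nil); });
    }];
}
// Caller: begin the PDF page/context here, and draw only once the image exists
[self loadPDFImageWithCompletion:^(UIImage *testImage) {
    if (testImage) {
        [testImage drawInRect:CGRectMake((pageSize.width - testImage.size.width/2)/2, 350,
                                         testImage.size.width/2, testImage.size.height/2)];
    }
}];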