UIImageView Takes Time To Load Image - objective-c

I want to download an image from the internet from a particular link.
The download happens in another class, a singleton named moreAppInterNet.
My moreAppInterNet.m file is as follows:
// finishedImgDling is a BOOL
// mutableData is an NSMutableData
// urlConnection is an NSURLConnection
// appimage is a UIImage
// imageUrl is an NSString containing the link
+(id)sharedManager
{
static moreAppInterNet *sharedMyManager = nil;
@synchronized(self) {
if (sharedMyManager == nil)
sharedMyManager = [[self alloc] init];
}
return sharedMyManager;
}
-(void)downloadImageFromUrl
{
finishedImgDling = NO;
[self.mutableData setLength:0];
self.urlConnection = [NSURLConnection connectionWithRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:self.imageUrl]] delegate:self];
while(!finishedImgDling) {
[[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]];
}
}
-(void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response
{
if (connection == self.urlConnection)
{
NSLog(#"imageDling");
}
}
-(void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data
{
if (connection == self.urlConnection)
{
if (self.mutableData == nil)
{
self.mutableData = [NSMutableData data];
}
[self.mutableData appendData:data];
}
}
-(void)connectionDidFinishLoading:(NSURLConnection *)connection
{
if (connection == self.urlConnection)
{
NSLog(#"image Downloaded");
finishedImgDling = YES;
[self setUrlConnection:nil];
[self storeImage];
}
}
-(void)storeImage
{
[self setAppimage:nil];
self.appimage = [UIImage imageWithData:self.mutableData];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSFileManager *fileManager = [NSFileManager defaultManager];
NSString *path = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"appImage.png"]];
[fileManager createFileAtPath:path contents:UIImagePNGRepresentation(self.appimage) attributes:nil];
NSLog(#"downloading and storing complete");
}
And here is the other view controller, from which I call that method:
-(void)viewDidLoad
{
[moreAppInterNet sharedManager];
[self downloadImage];
}
-(void)downloadImage
{
NSInvocationOperation *operation = [[NSInvocationOperation alloc] initWithTarget:self selector:@selector(download_Image_from_other_class) object:nil];
NSOperationQueue *que = [[NSOperationQueue alloc] init];
[que addObserver:self forKeyPath:@"operations" options:0 context:NULL];
[que addOperation:operation];
[operation release];
self.dlingQue = que;
[que release];
}
-(void)download_Image_from_other_class
{
[[moreAppInterNet sharedManager] downloadImageFromUrl];
}
-(void) observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
change:(NSDictionary *)change context:(void *)context
{
if ([keyPath isEqualToString:@"operations"] && object == self.dlingQue)
{
if ([self.dlingQue operationCount] == 0)
{
self.imgView = [[moreAppInterNet sharedManager] appimage];
//it takes a long time for the image to appear in the UIImageView
//even though this line has already executed
}
}
else {
[super observeValueForKeyPath:keyPath ofObject:object
change:change context:context];
}
}
The problem is that self.imgView takes a long time to display the image, roughly 5 seconds, even though the download has already completed.

The library I use for networking is AFNetworking.

I think your code downloads the image again when you request it.
You need to check if you have the file in downloadImageFromUrl.
Probably you need a more robust library to handle file downloading.
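A minimal sketch of that check (it assumes the same appImage.png file name and Documents path that storeImage uses above): look for the previously saved file before starting the connection, and only download when it is missing.

-(void)downloadImageFromUrl
{
    // Hypothetical cache check: reuse the file that storeImage already wrote.
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *path = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"appImage.png"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:path])
    {
        self.appimage = [UIImage imageWithContentsOfFile:path];
        finishedImgDling = YES;
        return;
    }
    // Otherwise fall back to the NSURLConnection code shown in the question.
    finishedImgDling = NO;
    [self.mutableData setLength:0];
    self.urlConnection = [NSURLConnection connectionWithRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:self.imageUrl]] delegate:self];
    while (!finishedImgDling) {
        [[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]];
    }
}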

Related

NSURLConnection works only one time

I'm trying to make a request, but it only works the first time...
Here is my code:
NSArray *infos = [rows objectAtIndex:indexPath.row];
NSString *thumbPath = thePath;
NSURLRequest *thumbRequest = [NSURLRequest requestWithURL:[NSURL URLWithString:thumbPath] cachePolicy:NSURLRequestUseProtocolCachePolicy timeoutInterval:10];
self->thumbConnection = [[NSURLConnection alloc] initWithRequest:thumbRequest
delegate:self
startImmediately:YES];
self->thumbData = [[NSMutableData alloc]init];
When a response is received:
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response {
if (self->thumbData == nil) {
self->thumbData = [[NSMutableData alloc]init];
}
}
When the download finishes:
- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
self.thumbImage.image = [UIImage imageWithData:self->thumbData];
self.thumbActivityView.hidden = YES;
[self->thumbData release];
self->thumbData = nil;
}
When a timeout or another error occurs:
- (void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error {
[self->thumbData release];
self->thumbData = nil;
self.thumbActivityView.hidden = YES;
[thumbConnection release];
self->thumbConnection = nil;
}
Don't allocate it again if it's non-nil. Better still, allocate it in your object's init method and release it in your object's dealloc method.
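A minimal sketch of that suggestion (assuming manual reference counting, as in the question): allocate thumbData once and reuse it for every request.

- (id)init
{
    if ((self = [super init])) {
        thumbData = [[NSMutableData alloc] init];   // allocated exactly once
    }
    return self;
}

- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response
{
    [thumbData setLength:0];   // reset the buffer instead of allocating a new one
}

- (void)dealloc
{
    [thumbData release];       // released exactly once
    [super dealloc];
}

connectionDidFinishLoading: can then build the UIImage from thumbData without releasing it, so the next request starts with a valid, empty buffer.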

MBProgressHud and SDWebImagePrefetcher

I'm trying to show a custom MBProgressHUD while downloading a list of URLs with SDWebImagePrefetcher using NSURLConnection methods.
SDWebImagePrefetcher has a method that, when called, shows in console the progress of the images download.
Now, I would like to show that NSLog progress in the custom MBProgressHUD, and I would like the HUD to stay on screen until the process is done, but I don't know how to do it. On top of that, when my NSURLConnection methods are called, the HUD shows the initial state ("Connecting"), then quickly jumps to "Complete", even though the images still need to be downloaded.
Here's my code:
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response {
HUD.mode = MBProgressHUDModeDeterminate;
}
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
HUD.labelText = #"Loading";
HUD.detailsLabelText = #"Downloading contents..."; //here, i would like to show the progress of the download, but it seems to jump this part
HUD.dimBackground = YES;
}
- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
//arr = array which holds a plist
NSMutableArray *array = [[[NSMutableArray alloc]init]autorelease];
for (NSDictionary *dict in arr) {
for (NSDictionary *val in [dict valueForKey:STR_ROWS]) {
[array addObject:[val objectForKey:@"image"]];
}
}
[prefetcher prefetchURLs:array];
HUD.customView = [[[UIImageView alloc] initWithImage:[UIImage imageNamed:@"checkmark.png"]] autorelease];
HUD.mode = MBProgressHUDModeCustomView;
HUD.labelText = NSLocalizedString(@"Completed", @"Completed!");
HUD.detailsLabelText = nil;
HUD.dimBackground = YES;
[HUD hide:YES afterDelay:2];
}
- (void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error {
[HUD hide:YES];
UIAlertView *alertView = [[[UIAlertView alloc] initWithTitle:@"Connection Failed" message:[NSString stringWithFormat:@"Connection to the remote server failed with error:\n %@\n Try again in a while", [error localizedDescription]] delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil] autorelease];
[alertView show];
}
I tried looking into the examples, but didn't find out how to do what I want to do.
EDIT
HUD Setup:
-(void)alertView:(UIAlertView *)alertView clickedButtonAtIndex:(NSInteger)buttonIndex{
if (buttonIndex == 0){
[alertView dismissWithClickedButtonIndex:0 animated:YES];
}
else{
NetworkStatus internetStatus = [internetReachable currentReachabilityStatus];
switch (internetStatus) {
case NotReachable:
{
//not reachable
break;
}
case (ReachableViaWWAN):
{
//reachable but not with the needed mode
break;
}
case (ReachableViaWiFi):{
HUD = [[MBProgressHUD showHUDAddedTo:self.navigationController.view animated:YES]retain];
HUD.delegate = self;
HUD.dimBackground = YES;
HUD.labelText = #"Connecting...";
NSURL *URL = [NSURL URLWithString:#"http://mywebsite.com/myPlist.plist"];
NSURLRequest *request = [NSURLRequest requestWithURL:URL];
NSURLConnection *connection = [[NSURLConnection alloc] initWithRequest:request delegate:self];
[connection start];
[connection release];
break;
}
default:
break;
}
}
}
Any ideas?
In connection:didReceiveResponse: you must record how large the download is,
for example in self.responseSize. Then, in connection:didReceiveData: you
must append the data you just got to the data you previously got, and update the progress:
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response {
HUD.mode = MBProgressHUDModeDeterminate;
HUD.labelText = #"Loading";
HUD.detailsLabelText = #"Downloading contents...";
HUD.dimBackground = YES;
// Define responseSize somewhere...
responseSize = [response expectedContentLength];
myData = [NSMutableData data];
}
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
[myData appendData:data];
HUD.progress = (float)myData.length / responseSize;
}

Can't play system sounds after capturing audio / video

I'm recording audio/video using AVFoundation, and I need to play a sound, using system sounds, before I start capturing video/audio. This works correctly the first time, but when I try to do it a second time, the system audio doesn't play. My guess is that something in AVFoundation is not being released correctly.
In my application delegate, I have this code in the applicationDidFinishLaunching method:
VKRSAppSoundPlayer *aPlayer = [[VKRSAppSoundPlayer alloc] init];
[aPlayer addSoundWithFilename:@"sound1" andExtension:@"caf"];
self.appSoundPlayer = aPlayer;
[aPlayer release];
and also this method
- (void)playSound:(NSString *)sound
{
[appSoundPlayer playSound:sound];
}
As you can see I'm using VKRSAppSoundPlayer, which works great!
In a view, I have this code:
- (void) startSession
{
self.session = [[AVCaptureSession alloc] init];
[session beginConfiguration];
if([session canSetSessionPreset:AVCaptureSessionPreset640x480])
session.sessionPreset = AVCaptureSessionPresetMedium;
[session commitConfiguration];
CALayer *viewLayer = [videoPreviewView layer];
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = viewLayer.bounds;
[viewLayer addSublayer:captureVideoPreviewLayer];
self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:[self frontFacingCameraIfAvailable] error:nil];
self.audioInput = [AVCaptureDeviceInput deviceInputWithDevice:[self audioDevice] error:nil];
if(videoInput){
self.videoOutput = [[AVCaptureMovieFileOutput alloc] init];
[session addOutput:videoOutput];
//[videoOutput release];
if([session canAddInput:videoInput]){
//[session beginConfiguration];
[session addInput:videoInput];
}
//[videoInput release];
[session removeInput:[self audioInput]];
if([session canAddInput:audioInput]){
[session addInput:audioInput];
}
//[audioInput release];
if([session canAddInput:audioInput])
[session addInput:audioInput];
NSLog(#"startRunning!");
[session startRunning];
[self startRecording];
if(![self recordsVideo])
[self showAlertWithTitle:@"Video Recording Unavailable" msg:@"This device can't record video."];
}
}
- (void) stopSession
{
[session stopRunning];
[session release];
}
- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
Boolean cameraFound = false;
for (AVCaptureDevice *device in videoDevices)
{
NSLog(#"1 frontFacingCameraIfAvailable %d", device.position);
if (device.position == AVCaptureDevicePositionBack){
NSLog(#"1 frontFacingCameraIfAvailable FOUND");
captureDevice = device;
cameraFound = true;
break;
}
}
if(cameraFound == false){
for (AVCaptureDevice *device in videoDevices)
{
NSLog(#"2 frontFacingCameraIfAvailable %d", device.position);
if (device.position == AVCaptureDevicePositionFront){
NSLog(#"2 frontFacingCameraIfAvailable FOUND");
captureDevice = device;
break;
}
}
}
return captureDevice;
}
- (AVCaptureDevice *) audioDevice
{
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio];
if ([devices count] > 0) {
return [devices objectAtIndex:0];
}
return nil;
}
- (void) startRecording
{
#if _Multitasking_
if ([[UIDevice currentDevice] isMultitaskingSupported]) {
[self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{}]];
}
#endif
[videoOutput startRecordingToOutputFileURL:[self generatenewVideoPath]
recordingDelegate:self];
}
- (void) stopRecording
{
[videoOutput stopRecording];
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray *)connections error:(NSError *)error
{
NSFileManager *man = [[NSFileManager alloc] init];
NSDictionary *attrs = [man attributesOfItemAtPath: [outputFileURL path] error: NULL];
NSString *fileSize = [NSString stringWithFormat:@"%llu", [attrs fileSize]];
// close this screen
[self exitScreen];
}
-(BOOL)recordsVideo
{
AVCaptureConnection *videoConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo
fromConnections:[videoOutput connections]];
return [videoConnection isActive];
}
-(BOOL)recordsAudio
{
AVCaptureConnection *audioConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeAudio
fromConnections:[videoOutput connections]];
return [audioConnection isActive];
}
If I do [videoInput release]; and [audioInput release]; I get a bad-access error; that's why they are commented out. This may be part of the issue.
If I try to play the system sound n times, it works, but once I go through the recording code first, it won't work after that.
Any ideas?
The proper way to release AVCaptureSession is the following:
- (void) destroySession {
// Notify the view that the session will end
if ([delegate respondsToSelector:@selector(captureManagerSessionWillEnd:)]) {
[delegate captureManagerSessionWillEnd:self];
}
// remove the device inputs
[session removeInput:[self videoInput]];
[session removeInput:[self audioInput]];
// release
[session release];
// remove AVCamRecorder
[recorder release];
// Notify the view that the session has ended
if ([delegate respondsToSelector:@selector(captureManagerSessionEnded:)]) {
[delegate captureManagerSessionEnded:self];
}
}
If you're having some sort of release problem (bad access), I recommend moving your code out of your current "messy" project into a fresh project and debugging the problem there.
When I had a similar problem, that's exactly what I did. I shared it on GitHub; you might find this project useful: AVCam-CameraReleaseTest
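Applied to the question's code, the stopSession method could follow the same pattern; a rough sketch under the same manual-reference-counting assumptions:

- (void) stopSession
{
    [session stopRunning];
    // Remove the inputs before letting go of the session, as destroySession does above.
    [session removeInput:[self videoInput]];
    [session removeInput:[self audioInput]];
    [session release];
    session = nil;
}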

Loading and changing images asynchronously

I'm loading 4 images from a website asynchronously, and after loading I cycle through them with a timer. But the array seems to hold only the first image: when I move to another position in the array, it contains the same image, so the displayed picture doesn't change at all...
- (void)viewDidLoad
{
[super viewDidLoad];
[self getImages];
}
-(void)getImages
{
receivedData = [[NSMutableData alloc]init];
myImages = [[NSMutableArray alloc]init];
myImagesAdresses = [[NSMutableArray alloc]initWithCapacity:5];
[myImagesAdresses addObject:#"adress/Images/3.png"];
[myImagesAdresses addObject:#"adress/Images/2.png"];
[myImagesAdresses addObject:#"adress/Images/1.png"];
[myImagesAdresses addObject:#"adress/Images/0.png"];
[self loadNextImage];
}
-(void)loadNextImage
{
if([myImagesAdresses count])
{
NSURL * imageURL = [NSURL URLWithString:[myImagesAdresses lastObject]];
NSURLRequest * myRequest = [NSURLRequest requestWithURL:imageURL];
[[NSURLConnection alloc]initWithRequest:myRequest delegate:self];
NSLog(#"Start load URL %#",imageURL);
}
else {
[self.theImage setImage:[myImages objectAtIndex:2]];
[self changeImage];
NSLog(#"No More Images to Load");
}
}
-(void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data
{
[receivedData appendData:data];
}
-(void)connectionDidFinishLoading:(NSURLConnection *)connection
{
[myImages addObject:[UIImage imageWithData:[NSData dataWithData:receivedData]]];
[connection release];
connection = nil;
NSLog(#"Image from %# loaded",[myImagesAdresses lastObject]);
[myImagesAdresses removeLastObject];
[self loadNextImage];
}
-(void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error
{
[connection release];
connection =nil;
NSLog(#"Image from %# not loaded", [myImagesAdresses lastObject]);
[myImagesAdresses removeLastObject];
[self loadNextImage];
}
and the code to change the image:
-(void)changeImage
{
SEL selectorToCall = @selector(change:);
NSMethodSignature *methodSignature = [[self class] instanceMethodSignatureForSelector:selectorToCall];
NSInvocation * invocation = [NSInvocation invocationWithMethodSignature:methodSignature];
[invocation setTarget:self];
[invocation setSelector:selectorToCall];
NSTimer * newTimer = [NSTimer scheduledTimerWithTimeInterval:5.0 invocation:invocation repeats:YES];
self.paintingTimer = newTimer;
}
- (void) change:(NSTimer *)paramTimer
{
static NSInteger currentElement = 0;
if(++currentElement == [myImages count]) currentElement = 0;
[self.theImage setImage:[myImages objectAtIndex:currentElement]];
}
- (void) stopPainting
{
if (self.paintingTimer != nil){
[self.paintingTimer invalidate];
}
}
I don't get any error, but the image simply doesn't change...
It looks like receivedData is an ivar that you repeatedly append to, but it never gets cleared. So after the first image is received, the second image is appended to it, etc.
After this line, you should empty receivedData:
[myImages addObject:[UIImage imageWithData:[NSData dataWithData:receivedData]]];
You should add this line of code after:
[myImages addObject:[UIImage imageWithData:[NSData dataWithData:receivedData]]];
like this:
[myImages addObject:[UIImage imageWithData:[NSData dataWithData:receivedData]]];
[receivedData setData:nil];
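An equivalent, slightly more conventional way to reset the buffer is to clear its length rather than replacing its contents, for example:

[myImages addObject:[UIImage imageWithData:[NSData dataWithData:receivedData]]];
[receivedData setLength:0];   // empty the shared buffer before the next image arrives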

Async image download: UITableView's cellForRowAtIndexPath not always called

I'm using Apple's LazyTableImages sample to asynchronously download images for UITableViewCells. The main difference is that I save a copy of each file to the Documents folder so that it doesn't need to be downloaded every time. The method below is called by the class responsible for downloading the image:
- (void)imageDidLoad:(NSIndexPath *)indexPath
{
AsyncImage *aImage = [imageDownloadsInProgress objectForKey:indexPath];
if (aImage != nil)
{
UITableViewCell *cell = [self.tableView cellForRowAtIndexPath:aImage.indexPathInTableView];
cell.imageView.image = aImage.image;
}
}
When the image is not on the device, the download goes fine and the image is displayed. However, when the image is already on the device, the image is not displayed and the cell variable is nil.
The code for the AsyncImage is:
//
// AsyncImgView.m
// BelfastChamberOfCommerce
//
// Created by markpirvine on 23/06/2011.
// Copyright 2011 __MyCompanyName__. All rights reserved.
//
#import "AsyncImage.h"
#import "AppManager.h"
#define kAppIconHeight 48
@interface AsyncImage ()
@property (nonatomic, retain) NSString *fileUrl;
@end
@implementation AsyncImage
@synthesize imageUrl, fileUrl;
@synthesize image;
@synthesize indexPathInTableView;
@synthesize delegate;
- (void)dealloc
{
[imageUrl release];
[fileUrl release];
[image release];
[indexPathInTableView release];
[connection cancel];
[connection release];
[data release];
[super dealloc];
}
- (void)loadImage
{
if (connection != nil)
{
[connection release];
}
if (data != nil)
{
[data release];
}
if([self.imageUrl length] > 0)
{
self.fileUrl = [[[AppManager sharedInstance] getCacheLocation] stringByAppendingPathComponent:[[self.imageUrl componentsSeparatedByString:@"/"] lastObject]];
if([[NSFileManager defaultManager] fileExistsAtPath:self.fileUrl] == YES)
{
self.image = [UIImage imageWithContentsOfFile:self.fileUrl];
[self deliverImage];
}
else
{
NSURLRequest* request = [NSURLRequest requestWithURL:[NSURL URLWithString:self.imageUrl] cachePolicy:NSURLRequestUseProtocolCachePolicy timeoutInterval:60.0];
connection = [[NSURLConnection alloc] initWithRequest:request delegate:self];
}
}
}
- (void)connection:(NSURLConnection *)theConnection didReceiveData:(NSData *)incrementalData
{
if (data == nil)
{
data = [[NSMutableData alloc] initWithCapacity:2048];
}
[data appendData:incrementalData];
}
- (void)connectionDidFinishLoading:(NSURLConnection*)theConnection
{
[connection release];
connection = nil;
UIImage *iTmp = [[UIImage alloc] initWithData:data];
if (iTmp.size.width != kAppIconHeight && iTmp.size.height != kAppIconHeight)
{
CGSize itemSize = CGSizeMake(kAppIconHeight, kAppIconHeight);
UIGraphicsBeginImageContext(itemSize);
CGRect imageRect = CGRectMake(0.0, 0.0, itemSize.width, itemSize.height);
[iTmp drawInRect:imageRect];
self.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
}
else
{
self.image = iTmp;
}
[UIImageJPEGRepresentation(self.image, 1.0) writeToFile:self.fileUrl atomically:YES];
[iTmp release];
[data release];
data = nil;
[self deliverImage];
}
- (void)deliverImage
{
[delegate imageDidLoad:self.indexPathInTableView];
}
- (void)cancelDownload
{
[connection cancel];
connection = nil;
data = nil;
}
@end
Any help would be greatly appreciated.
I suggest using one of the open source classes for async download of images that has built-in support for caching. The one that I'm using is cached image for iOS.
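If pulling in a library isn't an option, a rough GCD-based sketch of the same idea (hypothetical method and path names; note the cell must be updated on the main thread) could look like this:

// Hypothetical helper: load from the Documents cache if the file exists,
// otherwise download it, then hand the image back on the main thread.
- (void)loadImageForIndexPath:(NSIndexPath *)indexPath url:(NSString *)urlString
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSString *fileName = [[urlString componentsSeparatedByString:@"/"] lastObject];
        NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
        NSString *cachePath = [documents stringByAppendingPathComponent:fileName];
        UIImage *image = [UIImage imageWithContentsOfFile:cachePath];
        if (image == nil) {
            NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:urlString]];
            image = [UIImage imageWithData:imageData];
            [imageData writeToFile:cachePath atomically:YES];
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            // cellForRowAtIndexPath: returns nil when the row is off screen, which is harmless here.
            UITableViewCell *cell = [self.tableView cellForRowAtIndexPath:indexPath];
            cell.imageView.image = image;
        });
    });
}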