Could you please provide sample code for creating a UIImage object with a specific width and height programmatically, and then loading into it an image that is fetched from JSON (it will be an image URL)?
If I use a synchronous network request to load the picture, does it block my UI until loading completes? If so, what's the solution?
Thanks in advance.
Pull the image URL out of your JSON data, download the image, and then use it as needed.
NSString *imageURL = ...; // the URL string parsed from your JSON
NSURL *url = [NSURL URLWithString:imageURL];

UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 50, 75)];
[yourParentView addSubview:imageView];

dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
dispatch_async(queue, ^{
    // Download and decode on a background queue so the UI is not blocked
    UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:url]];
    dispatch_sync(dispatch_get_main_queue(), ^{
        // Back on the main queue: set the image and resize the view to fit it
        imageView.image = image;
        CGRect frame = CGRectMake(imageView.frame.origin.x, imageView.frame.origin.y,
                                  image.size.width, image.size.height);
        imageView.frame = frame;
    });
});
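To answer the second part of the question: yes, a synchronous call such as dataWithContentsOfURL: blocks the thread it runs on, which is why the download above is pushed onto a background queue. As an alternative, here is a minimal sketch using NSURLSession (assuming iOS 7 or later), which performs the request asynchronously for you; it reuses the url and imageView variables from the snippet above:
// Minimal NSURLSession sketch; the completion handler runs on a background queue,
// so hop back to the main queue before touching the UI.
NSURLSessionDataTask *task = [[NSURLSession sharedSession]
        dataTaskWithURL:url
      completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
          if (error != nil || data == nil) {
              return; // handle the error as appropriate
          }
          UIImage *image = [UIImage imageWithData:data];
          dispatch_async(dispatch_get_main_queue(), ^{
              imageView.image = image;
          });
      }];
[task resume];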
I am using the code below to get images from the server. I want to get the height of each image dynamically and add the image to a scroll view.
With the code below, when I read the height outside the dispatch_async block it shows zero.
How can I get the image height dynamically with an asynchronous image load?
- (void)viewDidLoad {
    [self LoadViewPublicEvents];
}

- (void)LoadViewPublicEvents
{
    for (int i = 0; i < arrayPublicEvents.count; i++)
    {
        UIImageView *img_vw1 = [[UIImageView alloc] init];
        dispatch_async(dispatch_get_global_queue(0, 0), ^{
            NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:[NSString stringWithFormat:@"http://abc.us/uploads/event/%@", [[arrayPublicEvents objectAtIndex:i] valueForKey:@"image"]]]];
            UIImage *images = [[UIImage alloc] initWithData:imageData];
            dispatch_async(dispatch_get_main_queue(), ^{
                img_vw1.image = images;
                scaledHeight = images.size.height;
            });
        });
        NSLog(@"%f", scaledHeight); // prints zero
        img_vw1.backgroundColor = [UIColor clearColor];
        img_vw1.frame = CGRectMake(0, y + 5, screen_width, 197);
        [img_vw1 setContentMode:UIViewContentModeScaleAspectFit];
        img_vw1.backgroundColor = [UIColor clearColor];
        [self.scrll_vw addSubview:img_vw1];
    }
}
Thanks in advance
Your code:
NSLog(#"%f",scaledHeight); // it print zero
img_vw1.backgroundColor=[UIColor clearColor];
img_vw1.frame=CGRectMake(0,y+5,screen_width,197);
[img_vw1 setContentMode:UIViewContentModeScaleAspectFit];
img_vw1.backgroundColor=[UIColor clearColor];
[self.scrll_vw addSubview:img_vw1];
is executed before the image has finished loading.
You therefore have to either wait (for example with a semaphore, until the background work has finished; see the sketch at the end of this answer) or place that code inside your block.
Since you want to modify the UI, it makes sense to place it inside the main-queue block:
UIImageView *img_vw1 = [[UIImageView alloc] init];
dispatch_async(dispatch_get_global_queue(0, 0), ^{
    NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:[NSString stringWithFormat:@"http://abc.us/uploads/event/%@", [[arrayPublicEvents objectAtIndex:i] valueForKey:@"image"]]]];
    UIImage *images = [[UIImage alloc] initWithData:imageData];
    dispatch_async(dispatch_get_main_queue(), ^{
        img_vw1.image = images;
        scaledHeight = images.size.height;
        NSLog(@"%f", scaledHeight); // now prints the loaded image's height
        img_vw1.backgroundColor = [UIColor clearColor];
        img_vw1.frame = CGRectMake(0, y + 5, screen_width, 197);
        [img_vw1 setContentMode:UIViewContentModeScaleAspectFit];
        img_vw1.backgroundColor = [UIColor clearColor];
        [self.scrll_vw addSubview:img_vw1];
    });
});
For more information, here is a link to Apple's documentation: https://developer.apple.com/library/content/documentation/General/Conceptual/ConcurrencyProgrammingGuide/OperationQueues/OperationQueues.html
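If you really do need the height outside the block (for example to lay out the scroll view immediately), the semaphore approach mentioned above would look roughly like the sketch below, where imageURL stands in for the NSURL built as in the code above. Note that dispatch_semaphore_wait blocks the calling thread, so this must not be done on the main queue; the block-based version above is usually the better choice.
__block CGFloat loadedHeight = 0;
dispatch_semaphore_t semaphore = dispatch_semaphore_create(0);
dispatch_async(dispatch_get_global_queue(0, 0), ^{
    NSData *imageData = [NSData dataWithContentsOfURL:imageURL];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    loadedHeight = image.size.height;
    dispatch_semaphore_signal(semaphore); // signal that the download has finished
});
// Blocks here until the background block signals
dispatch_semaphore_wait(semaphore, DISPATCH_TIME_FOREVER);
NSLog(@"%f", loadedHeight); // now holds the real height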
I have an app which loads an image from the local disk and then displays it on a UIImageView. I want to set the image to aspect scale to fit.
The problem I'm having is that the imageOrientation comes back as UIImageOrientationRight even though it's a portrait image, and that's messing with how the aspect calculations are done.
I've tried a couple of ways of changing the metadata, but both rotate the image when it is displayed.
UIImageView *iv = [[UIImageView alloc] initWithFrame:self.view.frame];

NSMutableString *path = [[NSMutableString alloc] initWithString:[[NSBundle mainBundle] resourcePath]];
[path appendString:@"/pic2.jpg"];

NSData *data = [NSData dataWithContentsOfFile:path];
UIImage *img = [UIImage imageWithData:data];

// Attempt 1: re-wrap the CGImage with an "up" orientation
UIImage *fixed1 = [UIImage imageWithCGImage:[img CGImage]
                                      scale:1.0
                                orientation:UIImageOrientationUp];

// Attempt 2: rotate the bitmap, then re-wrap it
UIImage *sourceImage = img;
UIImage *fixed2 = [UIImage imageWithCGImage:[img imageRotatedByDegrees:90].CGImage
                                      scale:sourceImage.scale
                                orientation:UIImageOrientationUp];

iv.image = fixed1; // or fixed2;
[iv setContentMode:UIViewContentModeScaleAspectFill];
[self.view addSubview:iv];
In the end I took a different approach: I made the UIImageView 360 points wide, placed it in the centre, and that achieved the desired effect.
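For reference, a common way to bake the EXIF orientation into the pixel data, so that imageOrientation comes back as UIImageOrientationUp and the aspect calculations use the width and height as displayed, is to redraw the image in a graphics context. A minimal sketch, reusing the img variable from the code above:
UIImage *normalized = img;
if (img.imageOrientation != UIImageOrientationUp) {
    // Drawing applies the stored orientation, so the result is an "up" image
    UIGraphicsBeginImageContextWithOptions(img.size, NO, img.scale);
    [img drawInRect:CGRectMake(0, 0, img.size.width, img.size.height)];
    normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
// normalized.imageOrientation is now UIImageOrientationUp and it displays unrotated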
I'm using an AVCaptureVideoDataOutput along with its delegate method to manipulate video frames. In the delegate method, I use the sampleBuffer to create a CIImage, then crop the CIImage, convert it to a UIImage, and display it. Unfortunately, I need to determine the file size of this new UIImage, but it's returning 0. The code works and the image is cropped beautifully; I just don't see why it has no data!
Why might this be? Relevant code follows:
// In the delegate method, given sampleBuffer...
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault,
                                                            sampleBuffer, kCMAttachmentMode_ShouldPropagate);
CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer
                                                  options:(NSDictionary *)attachments];

...

dispatch_async(dispatch_get_main_queue(), ^(void) {
    CGRect rect = [self drawFaceBoxesForFeatures:features forVideoBox:clap
                                     orientation:curDeviceOrientation];
    CIImage *cropped = [ciImage imageByCroppingToRect:rect];
    UIImage *image = [[UIImage alloc] initWithCIImage:cropped];
    NSData *data = UIImageJPEGRepresentation(image, 1);
    NSLog(@"Image size is %d", data.length); // returns 0???
    [imageView setImage:image];
    [image release];
});
I had the same problem, but with simple filtered images.
A UIImage created directly from a CIImage has no CGImage backing it, so UIImageJPEGRepresentation has nothing to encode and returns nil (hence the length of 0). I stumbled upon the following, and it solved the issue; after this I was able to save my image.
CGSize size = self.originalImage.size;
CGRect rect;
rect.origin = CGPointZero;
rect.size = size;
UIGraphicsBeginImageContext(size);
[[UIImage imageWithCIImage:self.filteredImage] drawInRect:rect];
UIImage * image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData * jpegData = UIImageJPEGRepresentation(image, 1.0);
But I only needed the two lines inside the image context (the drawInRect: call and UIGraphicsGetImageFromCurrentImageContext).
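Alternatively, a CIContext can render the CIImage straight to a CGImage, which also gives UIImageJPEGRepresentation real pixel data to work with. A minimal sketch, assuming self.filteredImage is the CIImage being saved (as in the snippet above):
CIContext *ciContext = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [ciContext createCGImage:self.filteredImage
                                     fromRect:self.filteredImage.extent];
UIImage *renderedImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage); // createCGImage: returns a +1 reference

NSData *jpegData = UIImageJPEGRepresentation(renderedImage, 1.0); // now non-empty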
I'm not able to show an image that I load from a JSON file.
I'm parsing my JSON with JSONKit and everything works fine, but I can't load the image into the UIImageView. I hope some of you can help me out here.
Below is the code I thought would be correct, but it isn't.
UIImageView *image = [[UIImageView alloc] initWithFrame:CGRectMake(20, 20, 100, 115)];
image.image = [UIImage imageNamed:[detailView objectForKey:@"thumbnail"]];
[self.view addSubview:image];
[image release];
I think the JSON is giving you the URL of the image. Display the image from that URL using the code below:
// NSURL *url = [NSURL URLWithString:@"http://192.168.1.2x0/pic/LC.jpg"];
NSURL *url = [NSURL URLWithString:[detailView objectForKey:@"thumbnail"]];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImageView *subview = [[UIImageView alloc] initWithFrame:CGRectMake(0.0f, 0.0f, 320.0f, 460.0f)];
[subview setImage:[UIImage imageWithData:data]];
[self.view addSubview:subview];
[subview release];
The code you posted,
image.image = [UIImage imageNamed:[detailView objectForKey:@"thumbnail"]];
will try to find an image in your bundle whose name is the string stored in [detailView objectForKey:@"thumbnail"].
As you mentioned, your images come from a remote server, so you have to download each image from that server:
UIImageView *image = [[UIImageView alloc] initWithFrame:CGRectMake(20, 20, 100, 115)];
image.image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:[detailView objectForKey:@"thumbnail"]]]];
[self.view addSubview:image];
[image release];
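Note that dataWithContentsOfURL: is synchronous, so on the main thread it will freeze the UI until the download finishes; for anything beyond a quick test, it is worth moving the download onto a background queue as shown in the answers above.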
I'm pretty new to iPhone development. I'm trying to implement a scroll view that loads images dynamically from the web. I'm using the following code.
for (counter = 0; counter < 2; counter++) {
    RSSEntry *entry = [_allEntries objectAtIndex:counter];
    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:entry.articleUrl]];
    UIImage *img = [UIImage imageWithData:imageData];
    NSLog(@"%@", entry.articleUrl);
    UIImageView *imageView = [[UIImageView alloc] initWithImage:img];
    [imageData release];

    CGRect rect = scrollViewController.frame;
    imageView.frame = rect;
    imageView.tag = (counter + 1); // tag our images for later use when we place them in serial fashion

    [scrollViewController addSubview:imageView];
    [imageView release];
}
These are the two images I load in viewDidLoad. I want to load the next image when the user scrolls onto the second image, and while they scroll further I want to show that the image is loading. Any suggestions?
You must download the images asynchronously, for example with NSURLConnection. You can use TCImageView:
https://github.com/totocaster/TCImageView
It downloads images asynchronously.
I would also recommend Three20's TTImageView.
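For the lazy-loading part of the question (fetching the next image only when the user scrolls to it), one approach is to compute the page index in the scroll view delegate and kick off a background download for the upcoming page. The following is only a rough sketch: it assumes the controller is the scroll view's delegate, that scrollViewController and _allEntries are the scroll view and entries from the code above, that image views for later pages are added up front as empty, tagged placeholders, and that loadImageForPage: is a hypothetical helper.
- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView {
    NSInteger page = (NSInteger)(scrollView.contentOffset.x / scrollView.frame.size.width);
    [self loadImageForPage:page + 1]; // prefetch the page the user is about to reach
}

- (void)loadImageForPage:(NSInteger)page {
    if (page < 0 || page >= (NSInteger)_allEntries.count) return;

    UIImageView *imageView = (UIImageView *)[scrollViewController viewWithTag:page + 1];
    if (imageView == nil || imageView.image != nil) return; // missing or already loaded

    // While this runs, the placeholder (e.g. a spinner image) shows that loading is in progress
    RSSEntry *entry = [_allEntries objectAtIndex:page];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:entry.articleUrl]];
        UIImage *img = [UIImage imageWithData:imageData];
        dispatch_async(dispatch_get_main_queue(), ^{
            imageView.image = img;
        });
    });
}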