Loading only the UIImage data that is used - Objective-C

I have 400 pattern images at 400x300 bundled within my app. I would like to make some kind of factory method that takes a portion of each image and loads it into a UIImageView. I've had some success using content mode and clipping to bounds, but when I load a ton of these into a view it can take upwards of 5 seconds for the view to load. Here is an example of my current method:
UIImageView *tinyImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"400x300testImage.png"]];
[tinyImageView setFrame:CGRectMake(0, 0, 10, 200)];
[tinyImageView setContentMode:UIViewContentModeTopLeft];
[tinyImageView setClipsToBounds:YES];
[self.tinyImagesView addSubview:tinyImageView];
I've been reading the ImageIO headers and I think my answer is in there, but I'm having a hard time putting together workable code. In another Stack Overflow question I came across this code:
CFDictionaryRef options = (__bridge CFDictionaryRef)[NSDictionary dictionaryWithObjectsAndKeys:
                          (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
                          (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageIfAbsent,
                          (id)[NSNumber numberWithFloat:200.0f], (id)kCGImageSourceThumbnailMaxPixelSize,
                          nil];
CGImageRef imgRef = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, options);
UIImage *scaled = [UIImage imageWithCGImage:imgRef];
CGImageRelease(imgRef);
CFRelease(imageSource);
return scaled;
This has a similar load time to loading the full images and clipping.
Is it possible to read in only a 10x200 strip of an image file and load it into a UIImageView as quickly as creating that 10x200 PNG ahead of time and loading it with imageNamed:?

I'm pretty sure what you really want is a CATiledLayer, which you can point at your set of images and have it automatically pull in what it needs.
You can just add a CATiledLayer to any UIView.
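For reference, here is a minimal sketch of a view backed by a CATiledLayer; the tile-to-image mapping in drawRect: is an assumption for illustration, since it depends on how your 400 bundled images are named and arranged:
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

// Hypothetical view whose backing layer is a CATiledLayer.
@interface TiledPatternView : UIView
@end

@implementation TiledPatternView

// Overriding layerClass makes UIKit back this view with a CATiledLayer.
+ (Class)layerClass {
    return [CATiledLayer class];
}

- (instancetype)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        CATiledLayer *tiledLayer = (CATiledLayer *)self.layer;
        tiledLayer.tileSize = CGSizeMake(400.0, 300.0); // match the bundled image size
    }
    return self;
}

// CATiledLayer calls drawRect: one tile at a time, on background threads,
// so only the tiles that are actually on screen get decoded and drawn.
- (void)drawRect:(CGRect)rect {
    // How a tile rect maps to an image name is made up here for illustration.
    NSUInteger column = (NSUInteger)(CGRectGetMinX(rect) / 400.0);
    UIImage *tile = [UIImage imageNamed:[NSString stringWithFormat:@"pattern_%lu.png", (unsigned long)column]];
    [tile drawInRect:rect];
}

@end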

Related

iOS 7 blur effect on videoPlayer

I want to apply a blur effect to a video player.
I play video using AVPlayer, and whenever I want to share the video to a social network, a share window is displayed over the video player. I just want to apply a blur effect to the share window's background.
renderInContext: doesn't render AVPlayer's layer, but I saw that Apple's newer API drawViewHierarchyInRect:afterScreenUpdates: will render special layers such as a video player or OpenGL layer.
So I used drawViewHierarchyInRect: and it works on the simulator, but not on a device.
Any idea?
- (UIImage *)snapshotOfVideoPlayer
{
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 1.0);
[self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:NO];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
I believe the only way is using the AVAssetImageGenerator.
Assuming you have a reference to your AVPlayerItem:
AVURLAsset *asset = (AVURLAsset *)self.playerItem.asset;
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
CGImageRef thumb = [imageGenerator copyCGImageAtTime:self.playerItem.currentTime
                                          actualTime:NULL
                                               error:NULL];
self.videoScreenshotIV.image = [UIImage imageWithCGImage:thumb];
CGImageRelease(thumb); // copyCGImageAtTime: returns a retained CGImageRef, so release it
Notice the self.playerItem.currentTime. This ensures the image matches exactly the moment of the screenshot.
The videoScreenshotIV is a UIImageView (contentMode UIViewContentModeScaleAspectFit) positioned directly over the AVPlayer view with exactly the same bounds. I keep this UIImageView hidden until I need to take a screenshot; then I unhide it and set the image, take the screenshot, and hide it again. It works perfectly! :)
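Put together, the flow might look roughly like this. It is a sketch built from the description above: the blurredShareBackground name is made up, while playerItem, videoScreenshotIV, and snapshotOfVideoPlayer are carried over from the snippets in this thread:
// Sketch: grab the current frame with AVAssetImageGenerator, overlay it, then snapshot the hierarchy.
- (UIImage *)blurredShareBackground
{
    AVURLAsset *asset = (AVURLAsset *)self.playerItem.asset;
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    CGImageRef thumb = [imageGenerator copyCGImageAtTime:self.playerItem.currentTime
                                              actualTime:NULL
                                                   error:NULL];

    // Unhide the overlay image view and give it the current video frame.
    self.videoScreenshotIV.hidden = NO;
    self.videoScreenshotIV.image = [UIImage imageWithCGImage:thumb];
    CGImageRelease(thumb);

    // The snapshot now sees the still frame instead of the AVPlayer layer, which doesn't render.
    UIImage *snapshot = [self snapshotOfVideoPlayer];

    // Hide the overlay again so playback looks untouched.
    self.videoScreenshotIV.hidden = YES;
    return snapshot;
}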

Problems creating a thumbnail image from a Word document in Objective-C

I'm trying to create a thumbnail image for a Word document. I use a UIWebView to load the document and then render it into a PNG image. I've checked a thousand times and the file path is correct, but the final image is blank. Can someone look at my code and see if there is something wrong? Thank you very much in advance.
UIWebView *myWebView22 = [[UIWebView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
path44 = [caminoINICIAL stringByAppendingPathComponent:@"documentoInicial.doc"];
NSURL *fileURL22 = [[NSURL alloc] initFileURLWithPath:path44];
NSURLRequest *req22 = [NSURLRequest requestWithURL:fileURL22];
[myWebView22 setScalesPageToFit:YES];
[myWebView22 loadRequest:req22];
UIGraphicsBeginImageContext(myWebView22.bounds.size);
CGContextRef c = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(c, 0, 0);
[myWebView22.layer renderInContext:c];
UIImage* viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
pngPath = [caminoINICIAL stringByAppendingPathComponent:[NSString stringWithFormat:@"imagenFinal.png"]];
[UIImagePNGRepresentation(viewImage) writeToFile:pngPath atomically:YES];
I'm pretty sure that's a timing problem. UIWebView has its own threads and loads the document asynchronously; you need to wait until it's loaded before you use renderInContext:.
Figuring out the right moment might be tricky, either with a fixed delay or some deep subview inspection.
You might want to use QuickLook instead of UIWebView to get a little more control over the rendering.
The problem is that you fire the request and then immediately capture the image. loadRequest: is asynchronous: it sends the request and returns immediately, without waiting for the document to finish loading. So when you capture the image, the device is still busy loading the document.
You can provide a delegate to the web view that receives a message once the document has finished loading. Capture the image from within that delegate method.
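A minimal sketch of that delegate approach might look like this; it assumes the controller from the question is the web view's delegate and still has pngPath in scope:
// After creating the web view, set its delegate (the controller must adopt UIWebViewDelegate):
//     myWebView22.delegate = self;

// UIKit calls this once the document has actually finished loading.
- (void)webViewDidFinishLoad:(UIWebView *)webView
{
    UIGraphicsBeginImageContext(webView.bounds.size);
    [webView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // pngPath is the same variable used in the question's code.
    [UIImagePNGRepresentation(viewImage) writeToFile:pngPath atomically:YES];
}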

How to convert an HTML doc to an image in Cocoa

Is it possible to convert an HTML page to an image in Cocoa?
I have created the complete view in HTML, and now I want to convert the whole HTML preview to an image (JPEG, PNG, etc.).
I couldn't find any resource or sample on the web that provides some sort of help with the above. It would be highly appreciated if someone could share his wisdom on how I can achieve this.
Thanks in advance.
First off, I'd like to thank sergio... his answer got me started but I thought I'd share some of the code that I didn't find obvious that I had to write to make it work:
Here's how to make a thumbnail for a page without ever having it displayed:
// Your width and height can be whatever you like, but if you want this to render
// off screen, you need an x and y bigger than the superview's width and height
UIWebView* webView = [[UIWebView alloc] initWithFrame:CGRectMake(largerScreenDimension, largerScreenDimension, largerScreenDimension, largerScreenDimension)];
[self.view addSubview:webView]; // UIWebViews without an assigned superview don't load ever.
webView.delegate = self; // or whoever you have implement UIWebViewDelegate
webView.scalesPageToFit = YES; // This zooms the page appropriately to fill the entire thumbnail.
[webView loadRequest:[NSURLRequest requestWithURL:url]];
Then implement this in your delegate:
- (void)webViewDidFinishLoad:(UIWebView *)webView {
UIGraphicsBeginImageContext(webView.bounds.size);
[webView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *webViewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *thumbnailData = UIImagePNGRepresentation(webViewImage);
[webView removeFromSuperview];
}
Finally, to display this thumbnail you'll need something like:
thumbnailImageView.image = [UIImage imageWithData:thumbnailData];
As a bonus thing I'll mention, I wanted multiple thumbnails to be generated at once. I found using objc_setAssociatedObject() and objc_getAssociatedObject() to be very helpful with keeping track of which webView was loading which thumbnail. Going into detail on how that worked is beyond the scope of this question, though.
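Just to make that bookkeeping concrete, a rough sketch using associated objects might look like the following; the key, method name, and identifier parameter are made up for illustration and are not the answerer's actual code:
#import <objc/runtime.h>

static char kAssociatedThumbnailKey; // its address serves as a unique association key

// When kicking off a load, remember which thumbnail this web view belongs to.
- (void)startThumbnailLoadForURL:(NSURL *)url identifier:(NSString *)identifier
{
    UIWebView *webView = [[UIWebView alloc] initWithFrame:CGRectMake(largerScreenDimension, largerScreenDimension, largerScreenDimension, largerScreenDimension)];
    [self.view addSubview:webView];
    webView.delegate = self;
    objc_setAssociatedObject(webView, &kAssociatedThumbnailKey, identifier, OBJC_ASSOCIATION_RETAIN_NONATOMIC);
    [webView loadRequest:[NSURLRequest requestWithURL:url]];
}

- (void)webViewDidFinishLoad:(UIWebView *)webView
{
    NSString *identifier = objc_getAssociatedObject(webView, &kAssociatedThumbnailKey);
    // ...render webView into an image as shown above, then store it under `identifier`...
}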
You can draw your view in an image context, like this:
UIWebView* view = ...
....
UIGraphicsBeginImageContext(view.bounds.size);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(viewImage);
NSString *encodedString = [imageData base64Encoding];
Another option would be using Quartz PDF engine to create a PDF.
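If you go the PDF route instead, a minimal sketch with UIKit's PDF context functions could look like this, reusing the view variable from the snippet above (the output path is just an example):
// Render the web view's layer into a single-page PDF instead of a bitmap.
NSString *pdfPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"page.pdf"];
UIGraphicsBeginPDFContextToFile(pdfPath, view.bounds, nil);
UIGraphicsBeginPDFPage();
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIGraphicsEndPDFContext();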

adding UIImageView in UIScrollview with large image blocks thread

I am loading an image in a UIImageView which I then add to a UIScrollView.
The image is a local image and is about 5000 pixels in height.
The problem is that when I add the UIImageView to the UIScrollView, the thread is blocked.
It is obvious because when I do this I cannot scroll the UIScrollView until the image is displayed.
Here's the example.
UIScrollView *myscrollview = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 0, 768, 1004)];
myscrollview.contentSize = CGSizeMake(7680, 1004);
myscrollview.pagingEnabled = TRUE;
[self.view addSubview:myscrollview];
NSString* str = [[NSBundle mainBundle] pathForResource:@"APPS.jpg" ofType:nil inDirectory:@""];
NSData *imageData = [NSData dataWithContentsOfFile:str];
UIImageView *singleImageView = [[UIImageView alloc] initWithImage:[UIImage imageWithData:imageData]];
//the line below is the blocking line
[myscrollview addSubview:singleImageView];
It is the last line that blocks the scrolling. When I leave it out everything works perfectly, except for the fact that the image is not showing, of course.
I seem to recall that multithreading does not work for UIView operations, so I guess that's out of the question as well.
Thanks for your kind help.
If you're providing these large images, you should maybe check out CATiledLayer; there's a video of a good presentation on how to use this from WWDC 2010.
If these aren't your images, and you can't downsample them or break them into tiles, you can draw the image on a background thread. You may not draw to the screen graphics context on any thread but the main thread, but that doesn't prevent you from drawing to a non-screen graphics context. On your background thread you can
create a drawing context with CGBitmapContextCreate
draw your image on it just as you would draw onto the screen in drawRect:
when you're done loading and drawing the image invoke your view's drawRect: method on the main thread using performSelectorOnMainThread:withObject:waitUntilDone:
In your view's drawRect: method, once you've fully drawn your image on the in-memory context, copy it to the screen using CGBitmapContextCreateImage and CGContextDrawImage.
This isn't trivial, you'll need to start your background thread at the right time, synchronize access to your images, etc. The CATiledLayer approach is almost certainly the better one if you can find a way to manipulate the images to make that work.
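A simplified sketch of that background-drawing idea is below: it pre-renders the image into a bitmap context off the main thread and then hands a fully decoded UIImage back to the main thread. The GCD usage is my choice rather than the answer's, and the path and singleImageView variables are assumptions carried over from the question:
// Decode and pre-render the big JPEG off the main thread, then assign it on the main thread.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *source = [UIImage imageWithContentsOfFile:path]; // `path` to the bundled JPEG is assumed
    CGSize size = source.size;

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef bitmapContext = CGBitmapContextCreate(NULL,
                                                       (size_t)size.width,
                                                       (size_t)size.height,
                                                       8,   // bits per component
                                                       0,   // let CG pick bytes per row
                                                       colorSpace,
                                                       (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);

    // Drawing into the bitmap context forces the JPEG to be fully decoded here, off the main thread.
    CGContextDrawImage(bitmapContext, CGRectMake(0, 0, size.width, size.height), source.CGImage);

    CGImageRef rendered = CGBitmapContextCreateImage(bitmapContext);
    CGContextRelease(bitmapContext);
    UIImage *decodedImage = [UIImage imageWithCGImage:rendered];
    CGImageRelease(rendered);

    dispatch_async(dispatch_get_main_queue(), ^{
        // Back on the main thread it is safe to touch UIKit.
        singleImageView.image = decodedImage;
    });
});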
Why do you want to load such a huge image into memory at one time? Split it into many small images and load/free them dynamically.
Try allocating the UIImageView without a UIImage, and add it as a subview to the UIScrollView.
Load the UIImage on a separate thread, and have the method running on the other thread set the image property of the UIImageView once the image is loaded into memory. Also, you'll probably run into memory problems, since an image of this size loaded into a UIImage will probably be 30 MB+.
UIScrollView *myscrollview = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 0, 768, 1004)];
myscrollview.contentSize = CGSizeMake(7680, 1004);
myscrollview.pagingEnabled = TRUE;
[self.view addSubview:myscrollview];
NSString* str = [[NSBundle mainBundle] pathForResource:@"APPS.jpg" ofType:nil inDirectory:@""];
UIImageView *singleImageView = [[UIImageView alloc] init];
[myscrollview addSubview:singleImageView];
//Then fire off a method on another thread to load the UIImage and set the image
//property of the UIImageView.
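That last step might look something like this; the GCD-based sketch below is one way to do it (my assumption, not part of the original answer), reusing str and singleImageView from the snippet above:
// Load and decode the image off the main thread, then assign it on the main thread.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *bigImage = [[UIImage alloc] initWithContentsOfFile:str]; // avoids the convenience constructor
    dispatch_async(dispatch_get_main_queue(), ^{
        singleImageView.image = bigImage;
        singleImageView.frame = CGRectMake(0, 0, bigImage.size.width, bigImage.size.height);
    });
});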
Just keep an eye on memory and beware of using convenience constructors with UIImage (or any object that could end up being huge)
Also where are you currently running this code?

Does anyone know how I can get the "real" size of the UIImageView instead of creating a frame

I was hoping there was some sort of function that could take the real size of my picture
CGRect myImageRect = CGRectMake(0.0f, 0.0f, 320.0, 210.0f); // 234
UIImageView *myImage = [[UIImageView alloc] initWithFrame:myImageRect];
[myImage setImage:[UIImage imageNamed:@"start.png"]];
And then read the size back with something like this:
CGFloat imageViewHeight = [myImage frame].size.height;
So that I could get the real size instead of having to define it like you can see above.
I don't really get what you're asking, but here's a shot:
If you have a UIImage, and you want to know its size, you can ask it for its -[UIImage size] property.
However, if you want to create a UIImageView that's the same size as a UIImage, then you can just use -[UIImageView initWithImage:], which will automatically set the frame of the UIImageView to correspond to the dimensions of the image.
If, however, you're just looking to change the dimensions of a currently existing view, there's really no easy way to do that without messing around with the view's frame. You could maybe apply an affine transform to scale it, but it's easier to manipulate the frame.
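For instance, a quick sketch of both of those points, using the hypothetical start.png from the question:
// Ask the image itself for its size...
UIImage *image = [UIImage imageNamed:@"start.png"];
CGSize imageSize = image.size;

// ...or let the image view size itself to the image automatically.
UIImageView *imageView = [[UIImageView alloc] initWithImage:image];

// If the view already exists, resize its frame to match the image's real size.
CGRect frame = imageView.frame;
frame.size = imageSize;
imageView.frame = frame;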
It looks like you're asking to add the image to the imageview without first creating a frame. If this is the case, you can do the following:
UIImageView *myImage = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"start.png"]];
As far as I understood, you are looking for the size of the image used in the UIImageView object. There is no function built into UIImageView for that, but you can do it this way:
NSString* image= [myImage image].accessibilityIdentifier; // Get the image's name
UIImage *img = [UIImage imageNamed:image]; // Create an image object with that name
CGSize size = img.size; // Get the size of image
Hope this helps.