Objective-C - UIImage resizing not working

I have a program that fetches an image from the library, and I'm using code I found online to resize that image so that it can fit on the screen (basically making it 640x960). That would still be too big to display, so in another UIImage I copy the first resized image and resize it again to about 1/4 of the screen (160x240). The code is this:
In ViewController.h:
UIImage *img;
UIImage *thumb;
-(UIImage*) scaleImage: (UIImage*)image toSize:(CGSize)newSize;
(This, of course, is only the code related to my problem.)
In ViewController.m:
-(UIImage*) scaleImage: (UIImage*)image toSize:(CGSize)newSize {
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
In another method in the same .m file, scaleImage is called when a button is pressed, with these lines:
[self scaleImage:img toSize:CGSizeMake(640, 960)];
thumb = img;
[self scaleImage:thumb toSize:CGSizeMake(160, 240)];
Earlier in the project I've successfully been able to provide an image for img using [info objectForKey:UIImagePickerControllerOriginalImage];, which is the image chosen from the library. I've already set up the File's Owner connections so that this method runs (and it does: a UIAlert I create inside it shows, and an NSLog prints when scaleImage starts, twice), but the image is never resized! Does anyone know why? Thank you to anyone who comments with help or suggestions!

Your scaleImage:toSize: method returns the scaled image; it does not resize the image you pass in, so calling it and ignoring the return value does nothing. Assign the result, for example:
img = [self scaleImage:img toSize:CGSizeMake(640, 960)];
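The same goes for the second call: assuming, as the question describes, that thumb should be a 160x240 copy made from the already-scaled img, that line becomes:
thumb = [self scaleImage:img toSize:CGSizeMake(160, 240)];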

Related

Render play image over existing image in Objective-C

I have a collection view with some videos and images.
Using AVFoundation I am able to capture video on the iPhone, and I generate a thumbnail using AVAssetImageGenerator. When shown in the gallery, the image should make it clear that it is a video thumbnail, so I need to transform the image by drawing a video symbol (like a play icon) over it.
Is it possible?
You could use CoreGraphics to edit the image.
First, create a UIImage with the image you would like to edit. Then, do something like this:
UIImage *oldThumbnail; //set this to the original thumbnail image
UIGraphicsBeginImageContext(oldThumbnail.size);
[oldThumbnail drawInRect:CGRectMake(0, 0, oldThumbnail.size.width, oldThumbnail.size.height)];
/*Now there are two ways to draw the play symbol.
One would be to have a pre-rendered play symbol that you load into a UIImage and draw with drawInRect */
UIImage *playSymbol = [UIImage imageNamed:@"PlaySymbol.png"];
CGRect playSymbolRect; //I'll let you figure out calculating where you should draw the play symbol
[playSymbol drawInRect: playSymbolRect];
//The other way would be to draw the play symbol directly using CoreGraphics calls. Start with this:
CGContextRef context = UIGraphicsGetCurrentContext();
//now use CoreGraphics calls. I won't go over it here, but the second answer to this question may be helpful.
//Once you have finished drawing your image, you can put it in a UIImage.
UIImage *newThumbnail = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext(); //make sure you remember to do this :)
Now you can use the newly generated thumbnail in a UIImageView, cache it so you don't need to re-render it every time, etc.
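For the second approach, a minimal sketch of drawing the play symbol directly with Core Graphics might look like the following. It assumes it runs inside the same image context, between the drawInRect: call and UIGraphicsGetImageFromCurrentImageContext(), and the triangle size and placement are just illustrative:
// Draw a simple white right-pointing triangle roughly centered in the thumbnail.
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGFloat w = oldThumbnail.size.width;
CGFloat h = oldThumbnail.size.height;
CGFloat side = MIN(w, h) * 0.3f; // arbitrary symbol size
CGContextSetFillColorWithColor(ctx, [UIColor whiteColor].CGColor);
CGContextMoveToPoint(ctx, (w - side) / 2, (h - side) / 2);    // top-left corner of the triangle
CGContextAddLineToPoint(ctx, (w - side) / 2, (h + side) / 2); // bottom-left corner
CGContextAddLineToPoint(ctx, (w + side) / 2, h / 2);          // right tip
CGContextClosePath(ctx);
CGContextFillPath(ctx);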
This should work (you may have to play with positions and sizes):
-(UIImage*)drawPlayButton:(UIImage*)image
{
    UIImage *playButton = [UIImage imageNamed:@"playbutton.png"];
    UIGraphicsBeginImageContext(image.size);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    [playButton drawInRect:CGRectMake(image.size.width/2 - playButton.size.width/2, image.size.height/2 - playButton.size.height/2, playButton.size.width, playButton.size.height)];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
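A typical call site would then be something like this (videoThumbnail standing in for whatever frame you grabbed with AVAssetImageGenerator, and thumbnailImageView for the image view in your cell; both names are placeholders):
UIImage *thumbWithPlay = [self drawPlayButton:videoThumbnail];
thumbnailImageView.image = thumbWithPlay;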

iOS 7 blur effect on videoPlayer

I want to add a blur effect to a video player.
I play video using AVPlayer, and whenever I want to share the video to social networks, a share window is displayed over the video player. I want to apply a blur effect to the share window's background.
renderInContext: doesn't render AVPlayer's layer, but I saw that Apple's newer API drawViewHierarchyInRect: will render special layers such as a video player or an OpenGL layer.
So I used drawViewHierarchyInRect:, and it works on the simulator but not on the device.
Any idea?
- (UIImage *)snapshotOfVideoPlayer
{
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 1.0);
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:NO];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
I believe the only way is to use AVAssetImageGenerator.
Assuming you have a reference to your AVPlayerItem:
AVURLAsset *asset = (AVURLAsset *)self.playerItem.asset;
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
CGImageRef thumb = [imageGenerator copyCGImageAtTime:self.playerItem.currentTime
                                          actualTime:NULL
                                               error:NULL];
self.videoScreenshotIV.image = [UIImage imageWithCGImage:thumb];
CGImageRelease(thumb); // copyCGImageAtTime: follows the Create rule, so release the CGImageRef once the UIImage has it
Notice the self.playerItem.currentTime. This will ensure the image will be exactly the same as the moment of the screenshot.
The videoScreenshotIV is a UIImageView (contentMode set to scaleAspectFit) that sits directly over the AVPlayer view with exactly the same bounds. I hide this UIImageView until I need to take a screenshot; then I unhide it, set its image, take the screenshot, and hide it again. It works perfectly! :)
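Putting the pieces together, a rough sketch of that flow, with the blur done by a plain CIGaussianBlur (nothing here is specific to the share window, and the radius is arbitrary):
// Show the overlay image view (its image was set from the generator above),
// snapshot the whole view, then hide the overlay again.
self.videoScreenshotIV.hidden = NO;
UIImage *snapshot = [self snapshotOfVideoPlayer];
self.videoScreenshotIV.hidden = YES;
// Blur the snapshot with Core Image.
CIImage *input = [CIImage imageWithCGImage:snapshot.CGImage];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setValue:input forKey:kCIInputImageKey];
[blurFilter setValue:@8.0 forKey:kCIInputRadiusKey];
CIContext *ciContext = [CIContext contextWithOptions:nil];
CGImageRef blurredRef = [ciContext createCGImage:blurFilter.outputImage fromRect:input.extent];
UIImage *blurredBackground = [UIImage imageWithCGImage:blurredRef];
CGImageRelease(blurredRef);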

Screenshot a separate UIView (not self.view)

Basically I want to screenshot a UIView called CustomViewLayout, which belongs to MyClass. MyClass's view is assigned to NormalView, so if I call self.view it references NormalView. I have made a property, viewCustom, which is an outlet for CustomViewLayout. Anyway, I want to screenshot CustomViewLayout, and I have tried this:
UIGraphicsBeginImageContextWithOptions(self.viewCustom.bounds.size, self.viewCustom.opaque, 0.0);
[self.viewCustom.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return img;
That image is then displayed/attached inside an in-app mail (MFMailComposeViewController), and it doesn't work: it shows a blue box with a question mark inside, which I presume means the image is not readable. I know there is nothing wrong with my in-app mail attachment code, because if I change my screenshot code to screenshot self.view, like below:
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return img;
Then it works fine. So what should I do to screenshot my other view?
Thanks for the help!
The solution was simply to have MyClass load the other nib, take the screenshot, and then reload the original nib afterwards.
Easy!
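The accepted fix above swaps nibs; an alternative sketch that avoids swapping is to load the layout's nib directly and render the resulting view off-screen (the nib name and size here are assumptions):
// Load the custom layout from its own nib (name assumed) and render it without showing it.
UIView *customView = [[[NSBundle mainBundle] loadNibNamed:@"CustomViewLayout" owner:self options:nil] firstObject];
customView.frame = CGRectMake(0, 0, 320, 480); // placeholder size
[customView layoutIfNeeded]; // force a layout pass before rendering
UIGraphicsBeginImageContextWithOptions(customView.bounds.size, customView.opaque, 0.0);
[customView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();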

UIGraphicsGetImageFromCurrentImageContext() retain causes crash

This is a weird one, but I think I'm just missing something simple.
I have an Objective-C class called Theme.
Themes load all the art for my game from files and store the images in an NSMutableArray.
When the app launches it creates a new Theme object.
Part of the loading process does something like this:
UIImage *image = [self mergeImage: bg toImage: overlay];
[imageArray addObject: image];
I take a background image and place the overlay on top of it, creating a new UIImage, by using this code:
- (UIImage *)mergeImage:(UIImage *)image1 toImage:(UIImage *)image2 {
    UIGraphicsBeginImageContext(image1.size);
    [image1 drawInRect:CGRectMake(0, 0, image1.size.width, image1.size.height)];
    [image2 drawInRect:CGRectMake(0, 0, image2.size.width, image2.size.height)];
    UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
    //[resultingImage retain];
    UIGraphicsEndImageContext();
    return resultingImage;
}
All of this happens in the init function of Theme.
Here's the problem, when I call [[Theme alloc] init] from
-(BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
The app crashes a second later, leaving me with no stack trace or anything (I figure it corrupts some memory and crashes later when that memory is used again, or something).
Here's the weird part. If I don't add the newly merged image to the imageArray, it never crashes. I don't have the image anymore, but there's no crash. Also, if I don't add the image to the array, but I retain it in the mergeImage function, it does crash, which makes me think that retaining this newly-made image is causing the problem.
Weirder still, if I don't call mergeImage until later on in the app, outside of the [Theme init] and didFinishLaunchingWithOptions, it's OK.
Any idea why? What's wrong with retaining this image when the app first launches?
Try the code below:
// drawImage here stands for the view (e.g. a UIImageView) whose contents you want to capture
UIGraphicsBeginImageContext(drawImage.bounds.size);
[drawImage.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return viewImage;
Thanks..!

Convert HTML doc to image in Cocoa

Is it possible to convert an HTML page to an image in Cocoa?
I have created the complete view in HTML, and now I want to convert the whole HTML preview to an image (JPEG, PNG, etc.).
I couldn't find any resource or sample on the web that covers this, so it would be highly appreciated if someone could share how I can achieve it.
Thanks in advance.
First off, I'd like to thank sergio... his answer got me started, but I thought I'd share some of the code I had to write to make it work that I didn't find obvious:
Here's how to make a thumbnail for a page without ever having it displayed:
// Your width and height can be whatever you like, but if you want this to render
// off screen, you need an x and y bigger than the superview's width and height
UIWebView* webView = [[UIWebView alloc] initWithFrame:CGRectMake(largerScreenDimension, largerScreenDimension, largerScreenDimension, largerScreenDimension)];
[self.view addSubview:webView]; // UIWebViews without an assigned superview don't load ever.
webView.delegate = self; // or whoever you have implement UIWebViewDelegate
webView.scalesToFit = YES; // This zooms the page appropriately to fill the entire thumbnail.
[webView loadRequest:[NSURLRequest requestWithURL:url]];
Then implement this in your delegate:
- (void)webViewDidFinishLoad:(UIWebView *)webView {
    UIGraphicsBeginImageContext(webView.bounds.size);
    [webView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *webViewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSData *thumbnailData = UIImagePNGRepresentation(webViewImage);
    [webView removeFromSuperview];
}
Finally, to display this thumbnail you'll need something like:
thumbnailImageView.image = [UIImage imageWithData:thumbnailData];
As a bonus, I'll mention that I wanted multiple thumbnails to be generated at once. I found objc_setAssociatedObject() and objc_getAssociatedObject() very helpful for keeping track of which webView was loading which thumbnail. Going into detail on how that worked is beyond the scope of this question, though.
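For reference, that bookkeeping can be as small as this (the key is just a static address, and thumbnailTarget is a placeholder for whatever model object the thumbnail belongs to):
#import <objc/runtime.h>
static char kThumbnailTargetKey;
// When kicking off a load, remember what this web view is generating a thumbnail for.
objc_setAssociatedObject(webView, &kThumbnailTargetKey, thumbnailTarget, OBJC_ASSOCIATION_RETAIN_NONATOMIC);
// Later, in webViewDidFinishLoad:, recover it.
id target = objc_getAssociatedObject(webView, &kThumbnailTargetKey);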
You can draw your view in an image context, like this:
UIWebView* view = ...
....
UIGraphicsBeginImageContext(view.bounds.size);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(viewImage);
NSString *encodedString = [imageData base64Encoding];
Another option would be using Quartz PDF engine to create a PDF.
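If you need the result as an actual JPEG or PNG file, as the question mentions, you could then write the data out; the file path here is just an example:
NSData *pngData = UIImagePNGRepresentation(viewImage);
// or, for JPEG: NSData *jpegData = UIImageJPEGRepresentation(viewImage, 0.8);
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"page.png"]; // example path
[pngData writeToFile:path atomically:YES];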