Render play image over existing image in Objective-C

I have a collection view that contains both videos and images. Using AVFoundation I can capture video on the iPhone, and I generate a thumbnail with AVAssetImageGenerator. When a thumbnail is shown in the gallery it should be distinguishable as a video, so I need to transform the image by drawing a video symbol (like a play icon) over it.
Is this possible?

You could use CoreGraphics to edit the image.
First, create a UIImage with the image you would like to edit. Then, do something like this:
UIImage *oldThumbnail; //set this to the original thumbnail image
UIGraphicsBeginImageContext(oldThumbnail.size);
[oldThumbnail drawInRect:CGRectMake(0, 0, oldThumbnail.size.width, oldThumbnail.size.height)];
/*Now there are two ways to draw the play symbol.
One would be to have a pre-rendered play symbol that you load into a UIImage and draw with drawInRect */
UIImage *playSymbol = [UIImage imageNamed:@"PlaySymbol.png"];
CGRect playSymbolRect; //I'll let you figure out where to draw the play symbol
[playSymbol drawInRect:playSymbolRect];
//The other way would be to draw the play symbol directly using CoreGraphics calls. Start with this:
CGContextRef context = UIGraphicsGetCurrentContext();
//now use CoreGraphics calls. I won't go over it here, but the second answer to this question may be helpful.
//Once you have finished drawing your image, you can put it in a UIImage.
UIImage *newThumbnail = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext(); //make sure you remember to do this :)
Now you can use the newly generated thumbnail in a UIImageView, cache it so you don't need to re-render it every time, etc.
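If you go the caching route, a minimal sketch using NSCache might look like this (the cache setup and the assetURL-based key are assumptions, not part of the original answer):
static NSCache *thumbnailCache = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
    thumbnailCache = [[NSCache alloc] init]; // evicts automatically under memory pressure
});
NSString *key = assetURL.absoluteString; // hypothetical: key each thumbnail by its asset URL
UIImage *thumbnail = [thumbnailCache objectForKey:key];
if (thumbnail == nil) {
    thumbnail = newThumbnail; // the composited image generated above
    [thumbnailCache setObject:thumbnail forKey:key];
}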

This should work (you may have to play with positions and sizes):
- (UIImage *)drawPlayButton:(UIImage *)image
{
    UIImage *playButton = [UIImage imageNamed:@"playbutton.png"];
    UIGraphicsBeginImageContext(image.size);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    // Center the play button over the thumbnail.
    [playButton drawInRect:CGRectMake(image.size.width/2 - playButton.size.width/2,
                                      image.size.height/2 - playButton.size.height/2,
                                      playButton.size.width,
                                      playButton.size.height)];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
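A possible call site, e.g. when configuring a collection view cell (the cell and thumbnail names here are assumptions):
// Hypothetical usage: thumbnail is the image from AVAssetImageGenerator.
cell.imageView.image = [self drawPlayButton:thumbnail];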

Related

Get Image from image View

If we add more than one view to an image view, and an NSString as well, is it possible to get an image from the image view that contains all the images, strings, etc.?
From what I understand of your question, you have added a UIImageView for the building, another for the video symbol (as a subview), and a UILabel (again as a subview) for the time. You want all of these rendered as a single UIImage. For this, you can use:
UIGraphicsBeginImageContext(imageView.frame.size);
CGContextRef context = UIGraphicsGetCurrentContext();
[imageView.layer renderInContext:context];
UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The screenShot UIImage will contain your desired image.
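One caveat: UIGraphicsBeginImageContext renders at a scale factor of 1.0, so the result looks soft on Retina screens. A sketch of the same snapshot wrapped in a scale-aware helper (the method name is made up):
- (UIImage *)imageFromView:(UIView *)view
{
    // A scale of 0.0 means "use the device's screen scale", keeping Retina sharpness.
    UIGraphicsBeginImageContextWithOptions(view.frame.size, NO, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenShot;
}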

iOS 7 blur effect on videoPlayer

I want to apply a blur effect to a video player.
I play video using AVPlayer, and whenever I want to share the video to a social network a share window is displayed over the player; I want the share window's background to blur what is behind it.
renderInContext: doesn't render AVPlayer's layer, but I saw that Apple's newer API drawViewHierarchyInRect: will render special layers such as a video player or an OpenGL layer.
So I used drawViewHierarchyInRect: and it works on the simulator, but not on a device.
Any idea?
- (UIImage *)snapshotOfVideoPlayer
{
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 1.0);
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:NO];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
I believe the only way is to use AVAssetImageGenerator.
Assuming you have a reference to your AVPlayerItem:
AVURLAsset *asset = (AVURLAsset *)self.playerItem.asset;
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
CGImageRef thumb = [imageGenerator copyCGImageAtTime:self.playerItem.currentTime
                                          actualTime:NULL
                                               error:NULL];
self.videoScreenshotIV.image = [UIImage imageWithCGImage:thumb];
CGImageRelease(thumb); // copyCGImageAtTime: returns a +1 reference that must be released
Notice the self.playerItem.currentTime. This ensures the image corresponds exactly to the moment of the screenshot.
videoScreenshotIV is a UIImageView (contentMode UIViewContentModeScaleAspectFit) that sits directly over the AVPlayer view with exactly the same bounds. I keep this UIImageView hidden until I need to take a screenshot; I then unhide it, set the image, take the screenshot, and hide it again. It works perfectly! :)
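Put together, the hide/show sequence described above might look like this sketch (currentFrame stands in for the UIImage generated above; snapshotOfVideoPlayer is the method from the question):
self.videoScreenshotIV.hidden = NO;               // reveal the overlay image view
self.videoScreenshotIV.image = currentFrame;      // the frame grabbed with AVAssetImageGenerator
UIImage *snapshot = [self snapshotOfVideoPlayer]; // drawViewHierarchyInRect: now captures the frame too
self.videoScreenshotIV.hidden = YES;              // hide the overlay again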

Objective C - UIImage resizing not working

I have a program that fetches an image from the library. I'm using code I found online to resize that image so it fits the screen (basically making it 640x960). That is still too big to display, so I copy the first resized image into another UIImage and resize it again to about a quarter of the screen (160x240). The code is this:
for ViewController.h:
UIImage *img;
UIImage *thumb;
- (UIImage *)scaleImage:(UIImage *)image toSize:(CGSize)newSize;
(this, of course, is only the code related to my problem)
for ViewController.m
- (UIImage *)scaleImage:(UIImage *)image toSize:(CGSize)newSize {
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
and in the same .m file, in another method, scaleImage: is called when a button is pressed, with these lines:
[self scaleImage:img toSize:CGSizeMake(640, 960)];
thumb = img;
[self scaleImage:thumb toSize:CGSizeMake(160, 240)];
Earlier in the project I was able to successfully provide an image for img using [info objectForKey:UIImagePickerControllerOriginalImage], which is the image chosen from the library. I've already connected everything in File's Owner so that this method runs (and it does: a UIAlert I create within it shows, and an NSLog prints when scaleImage: starts, twice), but the image is never resized! Does anyone know why? Thank you to anyone who comments with help or suggestions!
Your scaleImage: method returns the scaled image, but your code discards the return value, so img and thumb never change. Assign the results, for example:
img = [self scaleImage:img toSize:CGSizeMake(640, 960)];
thumb = [self scaleImage:img toSize:CGSizeMake(160, 240)];
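If you also want to avoid distortion when the library image's aspect ratio doesn't match the target, you could compute an aspect-fit size first and pass that to scaleImage:. A small sketch (the helper name is made up):
- (CGSize)aspectFitSize:(CGSize)imageSize inSize:(CGSize)boundingSize
{
    // Scale uniformly so the whole image fits inside the bounding box.
    CGFloat scale = MIN(boundingSize.width / imageSize.width,
                        boundingSize.height / imageSize.height);
    return CGSizeMake(imageSize.width * scale, imageSize.height * scale);
}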

Get image data for resizableImageWithCapInsets: after being resized

I have a PNG that I'm using as an image mask for another image. Both images have rounded corners that are being preserved with resizableImageWithCapInsets:. The image (not the mask) is sized automatically because it is the progressImage of a UIProgressView.
The problem I'm having is that if I use CGImageMaskCreate to create a mask and simply give it the width of the UIProgressView, the mask image is stretched (i.e., it does not preserve the caps).
What I'd like to do is create a new UIImage with resizable caps, manually resize the image (perhaps by putting it in a UIImageView), get the data representation of the resized version of the mask image, then create a new image with that data and use that image as the mask. It seems, though, that even if I create a UIImageView with the image, and then set the frame of the UIImageView appropriately, getting the image back out of the view gives my original image.
Is there any way to get the result of a resizableImageWithCapInsets: image without it actually being drawn into the UI?
I would try drawing the image into a CGContext at the target size; from there you can access the data provider, or extract NSData by creating a UIImage:
- (UIImage *)drawImage:(UIImage *)image
                atSize:(CGSize)size
{
    UIGraphicsBeginImageContext(size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // drawInRect: honors the image's cap insets, so the caps are preserved.
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *outImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    UIGraphicsEndImageContext();
    return outImage;
}
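For example, to render the cap-inset mask image at the progress view's width before building the mask (a sketch; the file name, insets, and target size are placeholders):
UIImage *maskSource = [[UIImage imageNamed:@"mask.png"]
                       resizableImageWithCapInsets:UIEdgeInsetsMake(4, 4, 4, 4)];
UIImage *resizedMask = [self drawImage:maskSource atSize:CGSizeMake(280, 9)];
// resizedMask now has a stretched middle and preserved caps; use its CGImage for masking.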

renderInContext called on MPMoviePlayerController

I am trying to programmatically create an image from the current frame of an MPMoviePlayerController video. I am using renderInContext: called on the player's view.layer; however, the image that is created is all black except for the video player controls, while I expect to see the current frame of the video.
My code (http://pastie.org/1020066):
UIGraphicsBeginImageContext(moviePlayer.view.bounds.size);
[moviePlayer.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// I also tried this code (same results)
// moviePlayer is a subview of videoView (UIView object)
UIGraphicsBeginImageContext(videoView.bounds.size);
[videoView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
If you are trying to get a thumbnail of your movie, you should probably check the thumbnailImageAtTime:timeOption: method of MPMoviePlayerController.
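For example (a sketch, assuming moviePlayer is the MPMoviePlayerController from your code):
// Nearest keyframe is fast; use MPMovieTimeOptionExact for a frame-accurate thumbnail.
UIImage *thumbnail = [moviePlayer thumbnailImageAtTime:moviePlayer.currentPlaybackTime
                                            timeOption:MPMovieTimeOptionNearestKeyFrame];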