How to convert an HTML document to an image in Cocoa (Objective-C)

Is it possible to convert an HTML page to an image in Cocoa?
I have built the complete view in HTML, and now I want to convert the whole HTML preview to an image (JPEG, PNG, etc.).
I couldn't find any resource or sample on the web that covers this, so I'd appreciate it if someone could share how to achieve it.
Thanks in advance.

First off, I'd like to thank sergio... his answer got me started, but I thought I'd share some of the non-obvious code I had to write to make it work:
Here's how to make a thumbnail for a page without ever having it displayed:
// Your width and height can be whatever you like, but if you want this to render
// off screen, you need an x and y bigger than the superview's width and height
UIWebView* webView = [[UIWebView alloc] initWithFrame:CGRectMake(largerScreenDimension, largerScreenDimension, largerScreenDimension, largerScreenDimension)];
[self.view addSubview:webView]; // UIWebViews without an assigned superview never load.
webView.delegate = self; // or whoever you have implement UIWebViewDelegate
webView.scalesPageToFit = YES; // This zooms the page appropriately to fill the entire thumbnail.
[webView loadRequest:[NSURLRequest requestWithURL:url]];
Then implement this in your delegate:
- (void)webViewDidFinishLoad:(UIWebView *)webView {
    UIGraphicsBeginImageContext(webView.bounds.size);
    [webView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *webViewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSData *thumbnailData = UIImagePNGRepresentation(webViewImage);
    // Stash thumbnailData somewhere (e.g. a property); it's used again below.
    [webView removeFromSuperview];
}
Finally, to display this thumbnail you'll need something like:
thumbnailImageView.image = [UIImage imageWithData:thumbnailData];
As a bonus, I'll mention that I wanted multiple thumbnails to be generated at once. I found objc_setAssociatedObject() and objc_getAssociatedObject() very helpful for keeping track of which webView was loading which thumbnail. Going into detail on how that worked is beyond the scope of this question, though.
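Still, here is a minimal sketch of the idea (the key and variable names below are mine, not from the original code):
#import <objc/runtime.h>

static char kThumbnailTargetKey; // the address of this variable is the association key

// When kicking off a load, remember which image view this webView's thumbnail is for:
objc_setAssociatedObject(webView, &kThumbnailTargetKey, thumbnailImageView,
                         OBJC_ASSOCIATION_RETAIN_NONATOMIC);

// Later, in webViewDidFinishLoad:, look it up again:
UIImageView *targetImageView = objc_getAssociatedObject(webView, &kThumbnailTargetKey);
targetImageView.image = [UIImage imageWithData:thumbnailData];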

You can draw your view in an image context, like this:
UIWebView* view = ...
....
UIGraphicsBeginImageContext(view.bounds.size);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(viewImage);
NSString *encodedString = [imageData base64Encoding];
Another option would be using Quartz PDF engine to create a PDF.
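As a rough sketch of that PDF option (assuming the same loaded webView as above, and using UIKit's PDF context functions):
NSMutableData *pdfData = [NSMutableData data];
UIGraphicsBeginPDFContextToData(pdfData, webView.bounds, nil);
UIGraphicsBeginPDFPage();
[webView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIGraphicsEndPDFContext();
// pdfData now holds a one-page PDF of the web view's current contents.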

Related

iOS 7 blur effect on videoPlayer

I want to add a blur effect to a video player.
I play video using AVPlayer, and when the user shares the video to a social network a share window is displayed over the player; I want the share window's background to be blurred.
renderInContext: doesn't render AVPlayer's layer, but I read that Apple's newer API drawViewHierarchyInRect: will render special layers such as a video player or an OpenGL layer.
So I used drawViewHierarchyInRect:, and it works on the simulator but not on the device.
Any idea?
- (UIImage *)snapshotOfVideoPlayer
{
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 1.0);
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:NO];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
I believe the only way is to use AVAssetImageGenerator.
Assuming you have a reference to your AVPlayerItem:
AVURLAsset *asset = (AVURLAsset *)self.playerItem.asset;
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
CGImageRef thumb = [imageGenerator copyCGImageAtTime:self.playerItem.currentTime
                                          actualTime:NULL
                                               error:NULL];
self.videoScreenshotIV.image = [UIImage imageWithCGImage:thumb];
CGImageRelease(thumb); // copyCGImageAtTime: returns an owned reference, so release it
Notice the self.playerItem.currentTime: this requests the frame at the moment of the screenshot. (By default the generator is allowed some time tolerance; for frame-exact results you can also set requestedTimeToleranceBefore and requestedTimeToleranceAfter to kCMTimeZero.)
The videoScreenshotIV is a UIImageView (contentMode set to aspect fit) that sits directly over the AVPlayer view with exactly the same bounds. I keep this UIImageView hidden until I need a screenshot; then I unhide it, set the image, take the screenshot, and hide it again. It works perfectly! :)
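Putting the two pieces together, a rough sketch of that flow might look like this (the method name is mine, not from the answer; afterScreenUpdates:YES is used so the just-set image actually gets rendered before the snapshot):
- (UIImage *)snapshotWithVideoFrame
{
    // 1. Grab the current video frame.
    AVURLAsset *asset = (AVURLAsset *)self.playerItem.asset;
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    CGImageRef frame = [generator copyCGImageAtTime:self.playerItem.currentTime
                                         actualTime:NULL
                                              error:NULL];

    // 2. Show it in the overlay image view that sits on top of the player.
    self.videoScreenshotIV.image = [UIImage imageWithCGImage:frame];
    CGImageRelease(frame);
    self.videoScreenshotIV.hidden = NO;

    // 3. Snapshot the whole hierarchy, then hide the overlay again.
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0);
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    self.videoScreenshotIV.hidden = YES;

    return snapshot;
}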

UIBarButtonItem with UIImage Always Tinted iOS 7

I'm trying to add a UIBarButtonItem containing a UIImage to a UIToolbar. The image keeps being tinted and I can't get it to show as the original colored image - all I want to do is display an image, verbatim, in a UIBarButtonItem! I'm following the directions in the iOS 7 transition guide to set the image rendering mode to UIImageRenderingModeAlwaysOriginal.
UIImage *image = [UIImage imageNamed:@"myImage.png"];
image = [image imageWithRenderingMode:UIImageRenderingModeAlwaysOriginal];
UIBarButtonItem *ratingImage = [[UIBarButtonItem alloc] initWithImage:image style:UIBarButtonItemStyleBordered target:nil action:nil];
[toolbar setItems:[NSArray arrayWithObjects:ratingImage, nil] animated:YES];
One thing to note is that I set the tintColor for the main UIWindow of my app right when it loads...maybe this isn't important with regard to my issue, but thought I'd mention it.
I spent an evening trying to figure this out as well. You were very close to the solution.
The trick is to instantiate the UIImage with the rendering mode.
Instead of doing:
UIImage *image = [UIImage imageNamed:@"myImage.png"];
image = [image imageWithRenderingMode:UIImageRenderingModeAlwaysOriginal];
do this:
UIImage *image = [[UIImage imageNamed:@"myImage.png"] imageWithRenderingMode:UIImageRenderingModeAlwaysOriginal];
and it works!
In my case, I had dragged a navigation bar onto my view controller in Interface Builder and added the UIBarButtonItem there. Don't give the item an image in Interface Builder; instead, make an outlet and assign it the UIImage (created as above) like this:
[myCustomBarButtonItem setImage:image];
Hope this works for you.
UIImageRenderingModeAlwaysOriginal can also be set by selecting the image in your Assets.xcassets "folder" in Xcode and setting the "Render As" dropdown to "Original Image".
For Swift 2.1+ it would look like this:
let image : UIImage? = UIImage(named:"myImage.png")!.imageWithRenderingMode(UIImageRenderingMode.AlwaysOriginal)
UPDATED Swift 3
let image : UIImage? = UIImage(named:"myImage.png")!.withRenderingMode(.alwaysOriginal)
The accepted answer is fine but if you placed the UIBarButtonItem in a storyboard or xib then you can just:
Go to the Assets catalog where the image lives
Select the image
Go to the attributes inspector (cmd-opt-4)
Set "Render As" to "Original Image"
Only do this if you want all instances of this image to show up without tinting.
If you want it to work on versions of iOS earlier than 7, you might need to do this:
UIImage *image = [UIImage imageNamed:@"myImage.png"];
@try {
    image = [image imageWithRenderingMode:UIImageRenderingModeAlwaysOriginal];
} @catch (NSException *exception) {
    // imageWithRenderingMode: doesn't exist before iOS 7, so just keep the plain image.
}
Since imageWithRenderingMode: is an iOS 7 method, you'll get an exception if you try to call it on an earlier version.
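An alternative to the @try/@catch approach (my suggestion, not part of the answer above) is to check for the selector before calling it:
UIImage *image = [UIImage imageNamed:@"myImage.png"];
if ([image respondsToSelector:@selector(imageWithRenderingMode:)]) {
    // iOS 7 and later
    image = [image imageWithRenderingMode:UIImageRenderingModeAlwaysOriginal];
}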

Objective C - UIImage resizing not working

I have a program that fetches an image from the library, and I'm using code I found online to resize it so that it fits the screen (basically making it 640x960). That would still be too big to display, so I copy the first resized image into a second UIImage and resize that one again to about a quarter of the screen (160x240). The code is this:
for ViewController.h:
UIImage *img;
UIImage *thumb;
-(UIImage*) scaleImage: (UIImage*)image toSize:(CGSize)newSize;
(this, of course, is only the code related to my problem)
for ViewController.m
-(UIImage*) scaleImage: (UIImage*)image toSize:(CGSize)newSize {
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
and in the same .m file, in another method, scaleImage is called when a button is pressed, with these lines:
[self scaleImage:img toSize:CGSizeMake(640, 960)];
thumb = img;
[self scaleImage:thumb toSize:CGSizeMake(160, 240)];
In the project I've previously been able to successfully provide an image for img using [info objectForKey:UIImagePickerControllerOriginalImage], which is the image chosen from the library. I've already connected everything in File's Owner so that this method runs (it shows a UIAlert I create inside it, and an NSLog prints when scaleImage starts, twice), but the image is never resized! Does anyone know why? Thank you to anyone who comments with help or suggestions!
Your scaleImage: method returns the scaled image; you need to assign the result, for example:
thumb = [self scaleImage:img toSize:CGSizeMake(640, 960)];
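Putting that together with the variables from the question, the button code would presumably need to be something like this (a sketch, assuming img already holds the picked image):
img   = [self scaleImage:img toSize:CGSizeMake(640, 960)];  // full-size copy
thumb = [self scaleImage:img toSize:CGSizeMake(160, 240)];  // quarter-screen thumbnail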

Programmatically email an image with some subviews but not others?

I'm working on something that will create an email and attach an image of the screen for the user to send. I'm using the following code to create and attach the image.
UIGraphicsBeginImageContext([self.view frame].size);
[[self.view layer] renderInContext:UIGraphicsGetCurrentContext()];
UIImage *myImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIGraphicsBeginImageContext([self.view bounds].size);
[myImage drawInRect:CGRectMake(0, 0, [self.view bounds].size.width,[self.view bounds].size.height)];
myImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(myImage);
[mailer addAttachmentData:imageData mimeType:@"image/png" fileName:@"blah"]; // PNG data, so use the image/png MIME type
However, this includes all of the subviews, and I want to exclude a toolbar and a segmented view, leaving only the view above the toolbar and the text field in that view. I've tagged all the relevant views and subviews, but how do I use those tags to create an image that includes what I want and excludes what I don't want?
Before you render the image, set all the views you don't want to show to hidden.
I have several apps that do this exact same thing, and that's what I do.
renderInContext and related functions basically take a screenshot, so WYSIWYG
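For example, something along these lines (the tag values below are placeholders, not the asker's actual tags):
// Hide the views you don't want in the capture (toolbar, segmented control, ...).
NSArray *tagsToHide = @[@101, @102]; // substitute the tags you assigned
NSMutableArray *hiddenViews = [NSMutableArray array];
for (UIView *subview in self.view.subviews) {
    if ([tagsToHide containsObject:@(subview.tag)] && !subview.hidden) {
        subview.hidden = YES;
        [hiddenViews addObject:subview];
    }
}

// Render exactly as before -- hidden views are skipped by renderInContext:.
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *myImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Restore the views you hid.
for (UIView *subview in hiddenViews) {
    subview.hidden = NO;
}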

Can't draw UIImage in UIView drawRect:

I know this seems like a simple task, which is why I don't understand why I can't get the image to render.
When I set up my UIView, I do the following:
myUiView.backgroundColor = [UIColor clearColor];
myUiView.opaque = NO;
I create and retain the UIImage in the init function of my UIView:
image = [[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"test" ofType:@"png"]] retain];
then my drawRect looks like this:
- (void)drawRect:(CGRect)rect
{
    [image drawInRect:self.bounds];
}
Ultimately I'll be manipulating that UIImage via bitmap context, and then in drawRect create a CGImage out of the context, and render that, but for now I'm just trying to get it rendering a known image.
I've been digging through this site, as well as the documentation. I've gone down the CG path and tried drawing it with CGContextDrawImage by following the numerous examples other people have posted, but that didn't work either.
So I've come back to what seems to be the most straightforward way to draw an image, but it isn't working.
Any help would be greatly appreciated.
Thanks in advance.
First of all, verify that the size and position of self.bounds are what you want them to be. If the size is {0,0} nothing will display. Check using this function:
NSLog(@"%@", NSStringFromCGRect(self.bounds));
Also make sure that the image is not nil:
NSLog(@"%@", image);
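For example, while debugging you could put both checks directly inside drawRect::
- (void)drawRect:(CGRect)rect
{
    // Log the bounds and the image; an empty rect or a nil image explains a blank view.
    NSLog(@"bounds: %@, image: %@", NSStringFromCGRect(self.bounds), image);
    [image drawInRect:self.bounds];
}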