Does anyone have an idea how I can capture the screen using Objective-C on Mac OS?
To be more specific, how can I capture the screen of the active / focused application and then save it as an image file to a specified path?
Any help is highly appreciated.
@Daniel,
You don't need to understand and implement the whole "Son of Grab" sample. You just need the code below.
The following call will give you the screenshot:
// This just invokes the API as you would if you wanted to grab a screen shot. The equivalent using the UI would be to
// enable all windows, turn off "Fit Image Tightly", and then select all windows in the list.
CGImageRef screenShot = CGWindowListCreateImage(CGRectInfinite, kCGWindowListOptionOnScreenOnly, kCGNullWindowID, kCGWindowImageDefault);
Use the following code to convert it to an NSImage:
NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithCGImage:screenShot];
// Create an NSImage and add the bitmap rep to it...
NSImage *image = [[NSImage alloc] init];
[image addRepresentation:bitmapRep];
[bitmapRep release];
bitmapRep = nil;
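Since the question also asks about writing the capture to a specified path, here is a minimal sketch (the output path is hypothetical) that writes the CGImageRef from above straight to a PNG with ImageIO, without going through NSImage:
#import <ImageIO/ImageIO.h>
#import <CoreServices/CoreServices.h>   // for kUTTypePNG

NSString *outputPath = @"/tmp/screenshot.png";   // hypothetical path of your choosing
NSURL *outputURL = [NSURL fileURLWithPath:outputPath];
CGImageDestinationRef destination =
    CGImageDestinationCreateWithURL((CFURLRef)outputURL, kUTTypePNG, 1, NULL); // cast needs __bridge under ARC
if (destination) {
    CGImageDestinationAddImage(destination, screenShot, NULL);
    CGImageDestinationFinalize(destination);   // actually writes the file to disk
    CFRelease(destination);
}
CGImageRelease(screenShot);   // CGWindowListCreateImage follows the Create rule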
Have you checked out Apple's "Son of Grab" sample for capturing images of windows with the CGWindow API?
You can also check Apple's OpenGLScreenSnapshot sample.
Now I am working with UIImagePickerController, and the cropped image is a square.
This is what I have now:
and this is what I want! Is it possible?
There are a bunch of such "modules"/"libraries" available on GitHub for free. For example, the first match to the search I ran was:
https://github.com/myang-git/iOS-Image-Crop-View
You might find a suitable one.
You can use CGImageCreateWithImageInRect to crop the image:
CGImageRef croppedImageRef = CGImageCreateWithImageInRect([uncroppedImage CGImage], bounds);
UIImage *croppedImage = [UIImage imageWithCGImage:croppedImageRef];
CGImageRelease(croppedImageRef);
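If it helps, here is a rough sketch of how the bounds rect might be computed. The cropRectInPoints rect is a hypothetical value from your crop UI; the important detail is that CGImageCreateWithImageInRect works in pixel coordinates, so a point-based rect has to be multiplied by the image's scale:
// Hypothetical crop region, in points, from your crop UI
CGRect cropRectInPoints = CGRectMake(20, 40, 280, 280);
CGFloat scale = uncroppedImage.scale;
CGRect bounds = CGRectMake(cropRectInPoints.origin.x * scale,
                           cropRectInPoints.origin.y * scale,
                           cropRectInPoints.size.width * scale,
                           cropRectInPoints.size.height * scale);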
I want to apply a blur effect on a video player.
I play the video using AVPlayer, and whenever I want to share the video to a social network, a share window is displayed over the video player. I want to apply a blur effect to the share window's background.
renderInContext: doesn't render AVPlayer's layer, but I saw that Apple's newer API drawViewHierarchyInRect: will render special layers such as a video player or an OpenGL layer.
So I used drawViewHierarchyInRect:, and it works on the simulator but not on the device.
Any idea?
- (UIImage *)snapshotOfVideoPlayer
{
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 1.0);
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:NO];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
I believe the only way is to use AVAssetImageGenerator.
Assuming you have a reference to your AVPlayerItem:
AVURLAsset *asset = (AVURLAsset *)self.playerItem.asset;
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
CGImageRef thumb = [imageGenerator copyCGImageAtTime:self.playerItem.currentTime
actualTime:NULL
error:NULL];
self.videoScreenshotIV.image = [UIImage imageWithCGImage:thumb];
Notice the self.playerItem.currentTime. This ensures the image matches the exact moment of the screenshot.
The videoScreenshotIV is a UIImageView (contentMode scaleAspectFit) that sits directly over the AVPlayer view with exactly the same bounds. I keep this UIImageView hidden until I need to take a screenshot; then I unhide it, set its image, take the screenshot, and hide it again. It works perfectly! :)
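Note that AVAssetImageGenerator uses fairly loose time tolerances by default, so if the grabbed frame has to match currentTime exactly it is worth pinning them down. A rough sketch, assuming the same imageGenerator and playerItem as above:
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
imageGenerator.requestedTimeToleranceAfter  = kCMTimeZero;
imageGenerator.appliesPreferredTrackTransform = YES;   // respect the track's rotation
CGImageRef thumb = [imageGenerator copyCGImageAtTime:self.playerItem.currentTime
                                          actualTime:NULL
                                               error:NULL];
self.videoScreenshotIV.image = [UIImage imageWithCGImage:thumb];
CGImageRelease(thumb);   // copyCGImageAtTime: follows the Copy rule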
I'm able to capture NSWindows which are visible using code similar to the SonOfGrab example.
But how can I capture a hidden NSWindow? Or what is the way to hide an NSWindow from the user but still have it appear in the window server?
Thanks in advance!
It isn't possible to capture the contents of an NSWindow which isn't visible on-screen. When a window is minimized/hidden/whatever, the visual representation is dropped to save memory.
(Not sure on exactly how this is managed, feel free to chime in if you have a deeper knowledge of the window system. I only know this from experience, trying to capture windows with CamTwist and BoinxTV.)
You can do it by:
NSImage *img = [[NSImage alloc] initWithCGImage:[window windowImageShot] size:window.frame.size];
Add this category method to NSWindow:
- (CGImageRef)windowImageShot
{
    CGWindowID windowID = (CGWindowID)[self windowNumber];
    CGWindowImageOption imageOptions = kCGWindowImageBoundsIgnoreFraming | kCGWindowImageNominalResolution;
    CGWindowListOption singleWindowListOptions = kCGWindowListOptionIncludingWindow;
    CGRect imageBounds = CGRectNull;  // CGRectNull captures the window's own bounds
    CGImageRef windowImage = CGWindowListCreateImage(imageBounds, singleWindowListOptions, windowID, imageOptions);
    return windowImage;  // created via a Create function, so the caller owns (and must release) it
}
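Since the image comes from CGWindowListCreateImage, the caller owns it. A usage sketch with the release added (assuming the category above):
CGImageRef windowImage = [window windowImageShot];
NSImage *img = [[NSImage alloc] initWithCGImage:windowImage size:window.frame.size];
CGImageRelease(windowImage);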
Is it possible to convert an HTML page to an image in Cocoa?
I have created the complete view in HTML, and now I want to convert the whole HTML preview to an image (JPEG, PNG, etc.).
I couldn't find any resource or sample on the web that provides help with this. It would be highly appreciated if someone could share their wisdom on how I can achieve this.
Thanks in advance.
First off, I'd like to thank sergio... his answer got me started, but I thought I'd share some of the code I had to write to make it work, which wasn't obvious:
Here's how to make a thumbnail for a page without ever having it displayed:
// Your width and height can be whatever you like, but if you want this to render
// off screen, you need an x and y bigger than the superview's width and height
UIWebView* webView = [[UIWebView alloc] initWithFrame:CGRectMake(largerScreenDimension, largerScreenDimension, largerScreenDimension, largerScreenDimension)];
[self.view addSubview:webView]; // UIWebViews without an assigned superview don't load ever.
webView.delegate = self; // or whoever you have implement UIWebViewDelegate
webView.scalesToFit = YES; // This zooms the page appropriately to fill the entire thumbnail.
[webView loadRequest:[NSURLRequest requestWithURL:url]];
Then implement this in your delegate:
- (void)webViewDidFinishLoad:(UIWebView *)webView {
    UIGraphicsBeginImageContext(webView.bounds.size);
    [webView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *webViewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSData *thumbnailData = UIImagePNGRepresentation(webViewImage);
    [webView removeFromSuperview];
}
Finally, to display this thumbnail you'll need something like:
thumbnailImageView.image = [UIImage imageWithData:thumbnailData];
As a bonus thing I'll mention, I wanted multiple thumbnails to be generated at once. I found using objc_setAssociatedObject() and objc_getAssociatedObject() to be very helpful with keeping track of which webView was loading which thumbnail. Going into detail on how that worked is beyond the scope of this question, though.
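For reference, that bookkeeping can look roughly like this (the key name here is hypothetical, just to illustrate the pattern):
#import <objc/runtime.h>

static const void *kThumbnailURLKey = &kThumbnailURLKey;

// When kicking off a load, remember which URL this webView is responsible for:
objc_setAssociatedObject(webView, kThumbnailURLKey, url, OBJC_ASSOCIATION_RETAIN_NONATOMIC);

// Later, in webViewDidFinishLoad:, recover it:
NSURL *loadedURL = objc_getAssociatedObject(webView, kThumbnailURLKey);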
You can draw your view in an image context, like this:
UIWebView* view = ...
....
UIGraphicsBeginImageContext(view.bounds.size);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(viewImage);
NSString *encodedString = [imageData base64Encoding];
Another option would be to use the Quartz PDF engine to create a PDF.
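A rough sketch of that route, rendering the same view into a one-page PDF (the output path is just an example):
NSString *pdfPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"page.pdf"]; // example path
UIGraphicsBeginPDFContextToFile(pdfPath, view.bounds, nil);
UIGraphicsBeginPDFPageWithInfo(view.bounds, nil);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIGraphicsEndPDFContext();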
Is it possible to assign a high-resolution custom UITabBarItem image?
UIImage *img;
img = [UIImage imageNamed:@"TabIcon51@2x.png"];
self.tabBarItem = [[UITabBarItem alloc] initWithTitle:@"more" image:img tag:5];
This doesn't work. Is there a workaround, or even better an official link / solution for this use case?
thanks
alex
If you're using imageNamed, you can exclude the @2x. The way imageNamed works is that if you're on a high res device it automatically loads the @2x file if it exists, otherwise it loads the regular file.
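So something like the following should be enough; UIKit will pick up TabIcon51@2x.png automatically on Retina devices (assuming both TabIcon51.png and TabIcon51@2x.png are in the bundle):
UIImage *img = [UIImage imageNamed:@"TabIcon51"];
self.tabBarItem = [[UITabBarItem alloc] initWithTitle:@"more" image:img tag:5];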