Take screenshot of iPhone when app is running in background - Objective-C

Thanks for helping me; my English is poor.
I am calling this method every minute from an NSTimer. I need to capture activity that happens on the iPhone, not in my application. I tried the code below, but it only captures my application's screen; I need to capture the whole iPhone screen.
For example, if the user opens Safari and types a URL, I need to capture that screen from my app. Is that possible?
If it is possible, how can I achieve it?
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
    UIGraphicsBeginImageContextWithOptions(self.window.bounds.size, NO, [UIScreen mainScreen].scale);
} else {
    UIGraphicsBeginImageContext(self.window.bounds.size);
}
[self.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(image);
if (imageData) {
    // Write into the app's Documents directory (a bare relative path like @"screenshot.png" is not writable).
    NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
    [imageData writeToFile:[documentsPath stringByAppendingPathComponent:@"screenshot.png"] atomically:YES];
} else {
    NSLog(@"error while taking screenshot");
}

The answer is NO.
Support for background execution must be declared in advance by the app that uses it, and none of the available modes lets an app capture the screen of other apps.
The background modes available to apps are: audio, location updates, VoIP, Newsstand downloads, external-accessory communication, Bluetooth accessory communication, background fetch, and remote notifications.
Please read this link for more information: iOS Background Execution
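For reference, a minimal sketch of how an app can inspect the background modes it has declared in its own Info.plist (the UIBackgroundModes key); none of these modes grants access to another app's screen:
// Log whatever background modes this app declared in its Info.plist.
// Background execution requires one of these declared modes, and none of
// them provides access to the screens of other applications.
NSArray *declaredModes = [[NSBundle mainBundle] objectForInfoDictionaryKey:@"UIBackgroundModes"];
NSLog(@"Declared UIBackgroundModes: %@", declaredModes);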

Related

How to capture fullscreen in objective-c cocoa? [duplicate]

Does anyone have any idea how I can capture the screen using Objective-C on macOS?
To be more specific: how can I capture the active/focused application's screen and then save it as an image to a specified path?
Any help is highly appreciated.
@Daniel,
You don't need to understand and implement the whole "Son of Grab". You just need the code below.
The following call will give you the screenshot:
// This just invokes the API as you would if you wanted to grab a screen shot. The equivalent using the UI would be to
// enable all windows, turn off "Fit Image Tightly", and then select all windows in the list.
CGImageRef screenShot = CGWindowListCreateImage(CGRectInfinite, kCGWindowListOptionOnScreenOnly, kCGNullWindowID, kCGWindowImageDefault);
Use the following code to convert it to an NSImage:
NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithCGImage:screenShot];
// Create an NSImage and add the bitmap rep to it...
NSImage *image = [[NSImage alloc] init];
[image addRepresentation:bitmapRep];
[bitmapRep release];
bitmapRep = nil;
Have you checked out Apple's “Son of Grab” sample for capturing images of windows with the CGWindow API?
You can also check Apple's OpenGLScreenSnapshot sample.
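Since the question also asks about saving the capture to a specified path, here is a minimal sketch built on the code above (the destination path is only an example):
// Write the captured CGImageRef to a PNG file at a chosen path.
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCGImage:screenShot];
NSData *pngData = [rep representationUsingType:NSPNGFileType properties:nil];
[pngData writeToFile:@"/tmp/screenshot.png" atomically:YES];
[rep release];
CGImageRelease(screenShot); // CGWindowListCreateImage returns a retained image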

Take screenshot in iOS 8 with Objective-C doesn't work

I have written an app that takes screenshots. If the user enables this feature, the app uses a background queue to take the screenshots.
On iOS 7 everything works fine, but on iOS 8 only white screenshots are sent to the server.
I use this method to take a screenshot:
CGImageRef cgScreen = UIGetScreenImage();
if (cgScreen) {
    result = [UIImage imageWithCGImage:cgScreen];
    CGImageRelease(cgScreen);
}
Does anyone have an idea to solve the problem?
Unfortunately I could not find Apple's official documentation for the method UIGetScreenImage(), but I have read many Stack Overflow entries saying that it has been removed in iOS 8, and especially that it does not work on iOS 7 with 64-bit binaries.
I am using the following method to take a screenshot, and it works from iOS 5 to iOS 8 without any problem.
- (UIImage *)makeScreenShotWithView:(UIView *)view scale:(float)scale {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, YES, scale);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
If you want to capture the whole screen, pass a window object as the view, for example view.window.
Try this:
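A possible call site, as a sketch (it assumes the method above is accessible from self and that self.view is already attached to a window):
// Capture the whole key window at the device's native scale.
UIImage *screenshot = [self makeScreenShotWithView:self.view.window
                                             scale:[UIScreen mainScreen].scale];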

iOS 7 blur effect on videoPlayer

I want to apply a blur effect on a video player.
I play the video using AVPlayer, and whenever I want to share the video to social media, a share window is displayed over the video player. I just want to apply a blur effect to the share window's background.
renderInContext: doesn't render the AVPlayer's layer. But I saw that Apple's new API drawViewHierarchyInRect:afterScreenUpdates: is supposed to render special layers such as a video player or an OpenGL layer.
So I used drawViewHierarchyInRect:, and it works on the simulator but not on a device.
Any idea?
- (UIImage *)snapshotOfVideoPlayer
{
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 1.0);
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:NO];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
I believe the only way is using the AVAssetImageGenerator.
Assuming you have a reference to your AVPlayerItem:
AVURLAsset *asset = (AVURLAsset *)self.playerItem.asset;
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
CGImageRef thumb = [imageGenerator copyCGImageAtTime:self.playerItem.currentTime
                                          actualTime:NULL
                                               error:NULL];
self.videoScreenshotIV.image = [UIImage imageWithCGImage:thumb];
CGImageRelease(thumb); // copyCGImageAtTime: returns a retained CGImage
Notice the self.playerItem.currentTime. This ensures the image matches exactly the moment of the screenshot.
videoScreenshotIV is a UIImageView (contentMode scaleAspectFit) that sits directly over the AVPlayer view with exactly the same bounds. I keep this UIImageView hidden until I need to take a screenshot; I then unhide it and set the image, take the screenshot, and hide it again. It works perfectly! :)
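Put together, the hide/set/snapshot/hide sequence described above might look roughly like this (a sketch; blurSourceImage and currentVideoFrame are hypothetical names, while videoScreenshotIV and snapshotOfVideoPlayer are the ones used in this answer):
- (UIImage *)blurSourceImage
{
    self.videoScreenshotIV.hidden = NO;                       // unhide the overlay image view
    self.videoScreenshotIV.image = [self currentVideoFrame];  // hypothetical helper wrapping the AVAssetImageGenerator code above
    UIImage *snapshot = [self snapshotOfVideoPlayer];         // render the view hierarchy, now showing the still frame
    self.videoScreenshotIV.hidden = YES;                      // hide it again
    return snapshot;
}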

UIPasteboard: image saving issue

I thought dealing with UIPasteboard would be easy, but it has turned out to be a time-consuming issue...
I want to store a UIImage in UIPasteboard and then paste this image into iMessage, WhatsApp, Gmail, and other apps.
This is my method where I use UIPasteboard:
- (void)postClipboard
{
    if ([[modelView currentView] isImage])
    {
        UIImage *image = [self readyImageLandscape:orientationLandscape];
        [[UIPasteboard generalPasteboard] setImage:image];
    }
}
It works on an iPhone 3GS running iOS 5.1; I've tested it with Gmail and WhatsApp.
Then I modified the method to:
- (void)postClipboard
{
    if ([[modelView currentView] isImage])
    {
        UIImage *image = [self readyImageLandscape:orientationLandscape];
        [[UIPasteboard generalPasteboard] setImage:image];
        [[UIPasteboard generalPasteboard] setPersistent:YES];
    }
}
It still works on the iPhone 3GS with iOS 5.1.
But my employer says it doesn't work on an iPhone 4S with iOS 6.0, neither in WhatsApp nor in any other application.
Am I doing it all wrong, or is there another approach to make it work on the iPhone 4S with iOS 6.0?
See this answer: https://stackoverflow.com/a/12613632/830946
It looks like that code works with a single image, but not with multiple images.
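One thing that might be worth trying (a sketch, not a verified fix for the iOS 6 behaviour) is to put the PNG data on the pasteboard explicitly and mark the pasteboard persistent:
#import <MobileCoreServices/MobileCoreServices.h> // for kUTTypePNG

// Put the image on the general pasteboard as explicit PNG data and keep it
// available after the app moves to the background.
UIPasteboard *pasteboard = [UIPasteboard generalPasteboard];
pasteboard.persistent = YES;
[pasteboard setData:UIImagePNGRepresentation(image)
  forPasteboardType:(NSString *)kUTTypePNG];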

To Convert HTML doc to image in cocoa

Is it possible to convert an HTML page to an image in Cocoa?
I have created the complete view in HTML, and now I want to convert the whole HTML preview to an image (JPEG, PNG, etc.).
I couldn't find any resource or sample on the web that helps with this. It would be highly appreciated if someone could share how I can achieve it.
Thanks in advance.
First off, I'd like to thank sergio... his answer got me started, but I thought I'd share some of the less obvious code I had to write to make it work.
Here's how to make a thumbnail for a page without ever having it displayed:
// Your width and height can be whatever you like, but if you want this to render
// off screen, you need an x and y bigger than the superview's width and height
UIWebView* webView = [[UIWebView alloc] initWithFrame:CGRectMake(largerScreenDimension, largerScreenDimension, largerScreenDimension, largerScreenDimension)];
[self.view addSubview:webView]; // UIWebViews without an assigned superview don't load ever.
webView.delegate = self; // or whoever you have implement UIWebViewDelegate
webView.scalesToFit = YES; // This zooms the page appropriately to fill the entire thumbnail.
[webView loadRequest:[NSURLRequest requestWithURL:url]];
Then implement this in your delegate:
- (void)webViewDidFinishLoad:(UIWebView *)webView {
    UIGraphicsBeginImageContext(webView.bounds.size);
    [webView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *webViewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSData *thumbnailData = UIImagePNGRepresentation(webViewImage);
    [webView removeFromSuperview];
}
Finally, to display this thumbnail you'll need something like:
thumbnailImageView.image = [UIImage imageWithData:thumbnailData];
As a bonus thing I'll mention, I wanted multiple thumbnails to be generated at once. I found using objc_setAssociatedObject() and objc_getAssociatedObject() to be very helpful with keeping track of which webView was loading which thumbnail. Going into detail on how that worked is beyond the scope of this question, though.
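For what it's worth, a rough sketch of that associated-object bookkeeping (thumbnailImageView and the key name are assumptions, not part of the original answer):
#import <objc/runtime.h>

static char kThumbnailTargetKey; // the key only needs to be a unique address

// When starting a load, remember which image view this web view's thumbnail is for.
objc_setAssociatedObject(webView, &kThumbnailTargetKey,
                         thumbnailImageView, OBJC_ASSOCIATION_RETAIN_NONATOMIC);

// Later, in webViewDidFinishLoad:, look up the matching image view again.
UIImageView *target = objc_getAssociatedObject(webView, &kThumbnailTargetKey);
target.image = webViewImage;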
You can draw your view in an image context, like this:
UIWebView* view = ...
....
UIGraphicsBeginImageContext(view.bounds.size);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(viewImage);
NSString *encodedString = [imageData base64Encoding];
Another option would be using Quartz PDF engine to create a PDF.
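A minimal sketch of that PDF route, rendering the loaded web view's layer into a one-page PDF (the output path is only an example):
// Render the web view into a single-page PDF using UIKit's PDF context.
NSString *pdfPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"page.pdf"];
UIGraphicsBeginPDFContextToFile(pdfPath, view.bounds, nil);
UIGraphicsBeginPDFPageWithInfo(view.bounds, nil);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIGraphicsEndPDFContext();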