How to fix translucent toolbar screenshot showing as black in iOS 7?

My top and bottom toolbars show as black in the screenshot, even though in the app they appear light gray. I'm assuming it has something to do with the default 'Translucent' checkbox I found in the storyboard's Attributes Inspector for the toolbar?
I'm using a UIActivityViewController to shoot this image out in an email. Here's the code for grabbing the screenshot:
// Grab a screenshot
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *imageToShare = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

Solution found here: iOS: what's the fastest, most performant way to make a screenshot programmatically?
It seems like this should be a factory method provided by an Apple framework, though. I'm putting this in a Utils.h/.m class:
+ (UIImage *)screenshot
{
    CGSize imageSize = CGSizeZero;
    UIInterfaceOrientation orientation = [UIApplication sharedApplication].statusBarOrientation;
    if (UIInterfaceOrientationIsPortrait(orientation)) {
        imageSize = [UIScreen mainScreen].bounds.size;
    } else {
        imageSize = CGSizeMake([UIScreen mainScreen].bounds.size.height, [UIScreen mainScreen].bounds.size.width);
    }

    UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();

    for (UIWindow *window in [[UIApplication sharedApplication] windows]) {
        CGContextSaveGState(context);
        // Apply the window's geometry (center, transform, anchor point) to the context
        CGContextTranslateCTM(context, window.center.x, window.center.y);
        CGContextConcatCTM(context, window.transform);
        CGContextTranslateCTM(context, -window.bounds.size.width * window.layer.anchorPoint.x, -window.bounds.size.height * window.layer.anchorPoint.y);
        // Correct for the interface orientation, since the window itself does not rotate
        if (orientation == UIInterfaceOrientationLandscapeLeft) {
            CGContextRotateCTM(context, M_PI_2);
            CGContextTranslateCTM(context, 0, -imageSize.width);
        } else if (orientation == UIInterfaceOrientationLandscapeRight) {
            CGContextRotateCTM(context, -M_PI_2);
            CGContextTranslateCTM(context, -imageSize.height, 0);
        } else if (orientation == UIInterfaceOrientationPortraitUpsideDown) {
            CGContextRotateCTM(context, M_PI);
            CGContextTranslateCTM(context, -imageSize.width, -imageSize.height);
        }
        // Prefer the iOS 7 snapshot API when available; fall back to renderInContext:
        if ([window respondsToSelector:@selector(drawViewHierarchyInRect:afterScreenUpdates:)]) {
            [window drawViewHierarchyInRect:window.bounds afterScreenUpdates:YES];
        } else {
            [window.layer renderInContext:context];
        }
        CGContextRestoreGState(context);
    }

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
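For reference, a minimal usage sketch (assuming the class is named Utils, as above) that feeds the screenshot into the UIActivityViewController mentioned in the question:
// Sketch only: assumes the +screenshot method above lives in a Utils class
// and that this code runs inside a view controller.
UIImage *imageToShare = [Utils screenshot];
UIActivityViewController *activityVC =
    [[UIActivityViewController alloc] initWithActivityItems:@[imageToShare]
                                      applicationActivities:nil];
[self presentViewController:activityVC animated:YES completion:nil];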

You should probably set a background color on your view before rendering its layer:
self.view.backgroundColor = [UIColor whateverColorYouWant];
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *imageToShare = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Take a look at that answer.
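On iOS 7, renderInContext: is known not to capture the live blur behind translucent bars, which is one reason they can come out black. A minimal sketch of the drawViewHierarchyInRect:afterScreenUpdates: alternative (the same iOS 7 API the utility method above falls back to), assuming you only need the current view controller's view:
// Sketch only: captures the view as it actually appears on screen (iOS 7+).
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0);
[self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
UIImage *imageToShare = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();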

Related

I can't draw continuous lines on another image using PAN GESTURE

The code I use to draw on images is given below; I am using a pan gesture to find where the user touches. With this code, the lines the user draws come out as disconnected points when I move my hand over the image very fast.
UIGraphicsBeginImageContextWithOptions((self.thatsMyImage.frame.size), NO, 0.0);
[self.selfieImage.image drawInRect:CGRectMake(0,0,self.thatsMyImage.frame.size.width, self.thatsMyImage.frame.size.height)];
[[UIColor whiteColor] set];
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), from.x, from.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), to.x , to.y);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 10.0f);
CGContextStrokeEllipseInRect(UIGraphicsGetCurrentContext(), CGRectMake(to.x, to.y,10,10));
CGContextSetFillColor(UIGraphicsGetCurrentContext(), CGColorGetComponents([[UIColor blueColor] CGColor]));
CGContextFillPath(UIGraphicsGetCurrentContext());
CGContextStrokePath(UIGraphicsGetCurrentContext());
self.thatsMyImage.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
This is the method that is called when the pan gesture is detected:
-(void)freeFormDrawing:(UIPanGestureRecognizer *)gesture
{
    if (gesture.state == UIGestureRecognizerStateChanged)
    {
        CGPoint p = [gesture locationInView:self.selfieImage];
        CGPoint startPoint = lastPoint;
        lastPoint = [gesture locationInView:self.selfieImage];
        [self drawLineFrom:startPoint endPoint:p];
    }
    if (gesture.state == UIGestureRecognizerStateEnded)
    {
        // lastPoint = [gesture locationInView:self.selfieImage];
    }
}
Can anybody please tell me how I can do free-form (doodle) drawing on images with smooth lines and curves? Thanks in advance, and happy coding!
I implemented the same thing. What I did was add a transparent UIImageView above the UIImageView that held the UIImage I wanted to draw on.
UIImageView *drawableView = [[UIImageView alloc] initWithFrame:self.drawImageView.bounds];
drawableView.userInteractionEnabled = YES;
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(drawingViewDidPan:)];
[drawableView addGestureRecognizer:panGesture];
Then, when the user panned on the UIImageView, I would call this method:
- (void)drawingViewDidPan:(UIPanGestureRecognizer*)sender
{
    CGPoint currentDraggingPosition = [sender locationInView:drawableView];
    if (sender.state == UIGestureRecognizerStateBegan) {
        prevDraggingPosition = currentDraggingPosition;
    }
    if (sender.state != UIGestureRecognizerStateEnded) {
        [self drawLine:prevDraggingPosition to:currentDraggingPosition];
    }
    prevDraggingPosition = currentDraggingPosition;
}
Both prevDraggingPosition and currentDraggingPosition are CGPoint.
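As a sketch (the original answer doesn't show the declarations), they can simply be instance variables on the view controller, alongside the other names this answer uses:
// Hypothetical declarations; the class name and layout are assumptions.
@interface MyDrawingViewController () {
    UIImageView *drawableView;    // transparent overlay that receives the pan gesture
    CGPoint prevDraggingPosition; // last touch point, updated as the pan moves
    UIColor *strokeColor;         // current drawing color, read in -drawLine:to:
    UIView *colorChangeView;      // hypothetical color swatch whose backgroundColor is used
}
@end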
Then I used the following method to draw a line from prevDraggingPosition to currentDraggingPosition:
-(void)drawLine:(CGPoint)from to:(CGPoint)to
{
    @autoreleasepool {
        CGSize size = drawableView.frame.size;
        UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
        CGContextRef context = UIGraphicsGetCurrentContext();
        [drawableView.image drawAtPoint:CGPointZero];
        CGFloat strokeWidth = 4.0;
        strokeColor = colorChangeView.backgroundColor;
        CGContextSetLineWidth(context, strokeWidth);
        CGContextSetStrokeColorWithColor(context, strokeColor.CGColor);
        CGContextSetLineCap(context, kCGLineCapRound);
        CGContextMoveToPoint(context, from.x, from.y);
        CGContextAddLineToPoint(context, to.x, to.y);
        CGContextStrokePath(context);
        drawableView.image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    }
}
Finally, to get the image with the drawing on it, you can build the final image by drawing the drawableView's image on top of your original image, like this:
- (UIImage*)buildImage
{
    @autoreleasepool {
        UIGraphicsBeginImageContextWithOptions(originalImageSize, NO, self.drawImageView.image.scale);
        [self.drawImageView.image drawAtPoint:CGPointZero];
        [drawableView.image drawInRect:CGRectMake(0, 0, originalImageSize.width, originalImageSize.height)];
        UIImage *tmp = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return tmp;
    }
}
Where originalImageSize is the size of your image.
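For example (a hedged sketch; the original answer doesn't show this step), saving the composited result to the photo library could look like:
// Sketch only: composite the drawing over the original and save it.
UIImage *finalImage = [self buildImage];
UIImageWriteToSavedPhotosAlbum(finalImage, nil, nil, nil);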
Hope this helps!

Taking a screenshot programmatically in iOS 7

This is the code I use:
UIScreen *screen = [UIScreen mainScreen] ;
UIWindow *keyWindow = [[UIApplication sharedApplication] keyWindow];
UIView *view = [screen snapshotViewAfterScreenUpdates:YES];
UIGraphicsBeginImageContextWithOptions(screen.bounds.size, NO, 0);
[keyWindow drawViewHierarchyInRect:keyWindow.bounds afterScreenUpdates:YES];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *data= UIImagePNGRepresentation(image);
Then I send the data to SLComposeViewController
[mySLComposerSheet addImage:[UIImage imageWithData:data]];
Everything is fine, but the image is rotated: my app is landscape-only, yet the screenshot comes out in portrait.
That's because you're snapshotting the UIWindow, which doesn't handle rotation the way a UIView / UIViewController does. You have to handle the rotation yourself.
Try this code:
- (UIImage *)makeSnapshot
{
    CGSize imageSize = CGSizeZero;
    UIInterfaceOrientation orientation = [UIApplication sharedApplication].statusBarOrientation;
    if (UIInterfaceOrientationIsPortrait(orientation)) {
        imageSize = [UIScreen mainScreen].bounds.size;
    } else {
        imageSize = CGSizeMake([UIScreen mainScreen].bounds.size.height, [UIScreen mainScreen].bounds.size.width);
    }

    UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();

    for (UIWindow *window in [[UIApplication sharedApplication] windows]) {
        CGContextSaveGState(context);
        // Apply the window's geometry (center, transform, anchor point) to the context
        CGContextTranslateCTM(context, window.center.x, window.center.y);
        CGContextConcatCTM(context, window.transform);
        CGContextTranslateCTM(context, -window.bounds.size.width * window.layer.anchorPoint.x, -window.bounds.size.height * window.layer.anchorPoint.y);
        // Correct for the interface orientation, since the window itself does not rotate
        if (orientation == UIInterfaceOrientationLandscapeLeft) {
            CGContextRotateCTM(context, M_PI_2);
            CGContextTranslateCTM(context, 0, -imageSize.width);
        } else if (orientation == UIInterfaceOrientationLandscapeRight) {
            CGContextRotateCTM(context, -M_PI_2);
            CGContextTranslateCTM(context, -imageSize.height, 0);
        } else if (orientation == UIInterfaceOrientationPortraitUpsideDown) {
            CGContextRotateCTM(context, M_PI);
            CGContextTranslateCTM(context, -imageSize.width, -imageSize.height);
        }
        // Prefer the iOS 7 snapshot API when available; fall back to renderInContext:
        if ([window respondsToSelector:@selector(drawViewHierarchyInRect:afterScreenUpdates:)]) {
            [window drawViewHierarchyInRect:window.bounds afterScreenUpdates:YES];
        } else {
            [window.layer renderInContext:context];
        }
        CGContextRestoreGState(context);
    }

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
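A short usage note (a sketch, reusing the mySLComposerSheet name from the question): call the helper and hand the result straight to the compose sheet instead of round-tripping through PNG data:
// Sketch only: attach the orientation-correct snapshot to the compose sheet.
UIImage *snapshot = [self makeSnapshot];
[mySLComposerSheet addImage:snapshot];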

Taking Screenshot of UIView using UIButton

I am trying to make an app with a button that will take a screenshot of the object drawn on the device's screen and save it to the device's photo gallery...
How will I be able to do that? Any ideas?
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenShotimage = UIGraphicsGetImageFromCurrentImageContext();
UIImageWriteToSavedPhotosAlbum(screenShotimage, nil, nil, nil);
UIGraphicsEndImageContext();
This code will take a screenshot of your screen and save the image to the photo gallery.
Apple describes a way to do it here: http://developer.apple.com/library/ios/#qa/qa1703/_index.html
- (UIImage*)screenshot
{
    // Create a graphics context with the target size
    // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration
    // On iOS prior to 4, fall back to use UIGraphicsBeginImageContext
    CGSize imageSize = [[UIScreen mainScreen] bounds].size;
    if (NULL != UIGraphicsBeginImageContextWithOptions)
        UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0);
    else
        UIGraphicsBeginImageContext(imageSize);

    CGContextRef context = UIGraphicsGetCurrentContext();

    // Iterate over every window from back to front
    for (UIWindow *window in [[UIApplication sharedApplication] windows])
    {
        if (![window respondsToSelector:@selector(screen)] || [window screen] == [UIScreen mainScreen])
        {
            // -renderInContext: renders in the coordinate space of the layer,
            // so we must first apply the layer's geometry to the graphics context
            CGContextSaveGState(context);
            // Center the context around the window's anchor point
            CGContextTranslateCTM(context, [window center].x, [window center].y);
            // Apply the window's transform about the anchor point
            CGContextConcatCTM(context, [window transform]);
            // Offset by the portion of the bounds left of and above the anchor point
            CGContextTranslateCTM(context,
                                  -[window bounds].size.width * [[window layer] anchorPoint].x,
                                  -[window bounds].size.height * [[window layer] anchorPoint].y);
            // Render the layer hierarchy to the current context
            [[window layer] renderInContext:context];
            // Restore the context
            CGContextRestoreGState(context);
        }
    }

    // Retrieve the screenshot image
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
And remember to #import <QuartzCore/QuartzCore.h>.
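To tie this to the button from the question, a minimal wiring sketch (the action name and the saving step are assumptions, not part of the answers above):
// Sketch only: hook this up to the button's Touch Up Inside event.
- (IBAction)captureButtonTapped:(id)sender
{
    UIImage *shot = [self screenshot];                    // the helper defined above
    UIImageWriteToSavedPhotosAlbum(shot, nil, nil, nil);  // save to the photo gallery
}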

Rotating a full screen UIImageView

I have two images:
Help-Portrait.png (320 x 480)
Help-Landscape.png (480 x 320)
When a user taps the help button on any view, they need to be presented with the correct image, which should also rotate when the device does. I have tried adding the image view to both the window and the navigation controller's view.
For some reason I am having issues with this.
Could anyone shed light on what I am doing wrong?
UIImage *image = nil;
CGRect frame;
if (UIInterfaceOrientationIsPortrait([[UIApplication sharedApplication] statusBarOrientation])) {
    image = [UIImage imageNamed:@"Help-Portrait.png"];
    frame = CGRectMake(0, 0, 320, 480);
} else {
    image = [UIImage imageNamed:@"Help-Landscape.png"];
    frame = CGRectMake(0, 0, 480, 320);
}
if (!helpImageView) {
    helpImageView = [[UIImageView alloc] initWithFrame:frame];
    helpImageView.autoresizingMask = UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth;
    helpImageView.image = image;
}
[[UIApplication sharedApplication] setStatusBarHidden:YES];
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(helpImageTapped:)];
helpImageView.userInteractionEnabled = YES;
[helpImageView addGestureRecognizer:tap];
[self.view addSubview:helpImageView];
[tap release];
willRotateToInterfaceOrientation:
if (helpImageView) {
    [(id)[UIApplication sharedApplication] setStatusBarHidden:NO animated:YES];
    if (UIInterfaceOrientationIsPortrait(toInterfaceOrientation)) {
        helpImageView.image = [UIImage imageNamed:@"Help-Portrait.png"];
    } else {
        helpImageView.image = [UIImage imageNamed:@"Help-Landscape.png"];
    }
}
When you rotate the device, the image and the frame don't change, and you end up with two thirds of the portrait image displayed on the left part of the screen.
What I want is for it to show the correct image for the orientation, the right way up. I would also like the image rotation to animate, but that's a side issue.
The place where you need to adjust your image is in your view controller's shouldAutorotateToInterfaceOrientation: method.
Do something like:
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    UIImage *image = NULL;
    if (UIInterfaceOrientationIsPortrait(interfaceOrientation))
    {
        image = [UIImage imageNamed:@"Help-Portrait.png"];
    } else {
        image = [UIImage imageNamed:@"Help-Landscape.png"];
    }
    [yourButton setImage:image forState:UIControlStateNormal];
    return YES;
}
Michael Dautermann's answer is most of the way there, but I'm opposed to using shouldAutorotateToInterfaceOrientation:. That method is designed only to determine whether a rotation should occur, nothing else.
You should use either didRotateFromInterfaceOrientation: or willAnimateRotationToInterfaceOrientation:duration: instead.
didRotateFromInterfaceOrientation: - interfaceOrientation is already set on your UIViewController, so you can read the current orientation. In this case the rotation animation has already completed.
willAnimateRotationToInterfaceOrientation:duration: - the benefit of this method is timing. You are inside the rotation animation, so you won't get the less-than-pretty effects that happen when you change the UI after the rotation animation completes.
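As a sketch of that suggestion (the image-swapping body is assumed from the rest of this thread, not part of this answer), updating inside the rotation animation might look like:
// Sketch only: swap the help image while the rotation animation is in flight.
- (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation
                                         duration:(NSTimeInterval)duration
{
    if (UIInterfaceOrientationIsPortrait(toInterfaceOrientation)) {
        helpImageView.image = [UIImage imageNamed:@"Help-Portrait.png"];
        helpImageView.frame = CGRectMake(0, 0, 320, 480);
    } else {
        helpImageView.image = [UIImage imageNamed:@"Help-Landscape.png"];
        helpImageView.frame = CGRectMake(0, 0, 480, 320);
    }
}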
Got it working with this code:
- (void)showHelpImage {
    NSString *imageName = @"Help_Portrait.png";
    CGRect imageFrame = CGRectMake(0, 0, 320, 480);
    helpImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:imageName]];
    helpImageView.frame = imageFrame;
    [self.view addSubview:helpImageView];
    [self updateHelpImageForOrientation:self.interfaceOrientation];
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(helpImageTapped:)];
    helpImageView.userInteractionEnabled = YES;
    [helpImageView addGestureRecognizer:tap];
    [self.view addSubview:helpImageView];
    [tap release];
}

- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
    [self updateHelpImageForOrientation:toInterfaceOrientation];
}

- (void)updateHelpImageForOrientation:(UIInterfaceOrientation)orientation {
    NSString *imageName = nil;
    CGRect imageFrame = helpImageView.frame;
    if (orientation == UIInterfaceOrientationPortrait || orientation == UIInterfaceOrientationPortraitUpsideDown) {
        imageName = @"Help_Portrait.png";
        imageFrame = CGRectMake(0, 0, 320, 480);
    } else if (orientation == UIInterfaceOrientationLandscapeLeft || orientation == UIInterfaceOrientationLandscapeRight) {
        imageName = @"Help_Landscape.png";
        imageFrame = CGRectMake(0, 0, 480, 320);
    }
    helpImageView.image = [UIImage imageNamed:imageName];
    helpImageView.frame = imageFrame;
}
Got the idea from:
http://www.dobervich.com/2010/10/22/fade-out-default-ipad-app-image-with-proper-orientation/

kCAFilterNearest magnification filter (UIImageView)

I am using QREncoder library found here: https://github.com/jverkoey/ObjQREncoder
Basically, I looked at the example code by this author, and when he creates the QR code it comes out perfectly with no pixelation. The image the library provides is 33 x 33 pixels, but he uses kCAFilterNearest to magnify it and keep it crisp (no pixelation). Here is his code:
UIImage* image = [QREncoder encode:@"http://www.google.com/"];
UIImageView* imageView = [[UIImageView alloc] initWithImage:image];
CGFloat qrSize = self.view.bounds.size.width - kPadding * 2;
imageView.frame = CGRectMake(kPadding, (self.view.bounds.size.height - qrSize) / 2,
                             qrSize, qrSize);
[imageView layer].magnificationFilter = kCAFilterNearest;
[self.view addSubview:imageView];
I have a UIImageView in a xib, and I am setting its image like this:
[[template imageVQRCode] setImage:[QREncoder encode:ticketNum]];
[[[template imageVQRCode] layer] setMagnificationFilter:kCAFilterNearest];
But the QR code comes out really blurry, while in the example it is crystal clear.
What am I doing wrong?
Thanks!
UPDATE: I found out that the problem isn't scaling or anything to do with kCAFilterNearest; it has to do with generating the PNG image from the view. Here's how it looks on the device vs. how it looks when I save the UIView to a PNG representation (notice the QR code's quality):
UPDATE 2: This is how I am generating the PNG file from UIView:
UIGraphicsBeginImageContextWithOptions([[template view] bounds].size, YES, 0.0);
[[[template view] layer] renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[UIImagePNGRepresentation(viewImage) writeToFile:plistPath atomically:YES];
I have used the function below for resizing the image.
- (UIImage *)resizedImage:(CGSize)newSize interpolationQuality:(CGInterpolationQuality)quality
{
    BOOL drawTransposed;
    switch (self.imageOrientation) {
        case UIImageOrientationLeft:
        case UIImageOrientationLeftMirrored:
        case UIImageOrientationRight:
        case UIImageOrientationRightMirrored:
            drawTransposed = YES;
            break;
        default:
            drawTransposed = NO;
    }
    return [self resizedImage:newSize
                    transform:[self transformForOrientation:newSize]
               drawTransposed:drawTransposed
         interpolationQuality:quality];
}

- (UIImage *)resizedImage:(CGSize)newSize
                transform:(CGAffineTransform)transform
           drawTransposed:(BOOL)transpose
     interpolationQuality:(CGInterpolationQuality)quality
{
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGRect transposedRect = CGRectMake(0, 0, newRect.size.height, newRect.size.width);
    CGImageRef imageRef = self.CGImage;

    // Build a context that's the same dimensions as the new size
    CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef);
    if ((bitmapInfo == kCGImageAlphaLast) || (bitmapInfo == kCGImageAlphaNone))
        bitmapInfo = kCGImageAlphaNoneSkipLast;
    CGContextRef bitmap = CGBitmapContextCreate(NULL,
                                                newRect.size.width,
                                                newRect.size.height,
                                                CGImageGetBitsPerComponent(imageRef),
                                                0,
                                                CGImageGetColorSpace(imageRef),
                                                bitmapInfo);

    // Rotate and/or flip the image if required by its orientation
    CGContextConcatCTM(bitmap, transform);

    // Set the quality level to use when rescaling
    CGContextSetInterpolationQuality(bitmap, quality);

    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, transpose ? transposedRect : newRect, imageRef);

    // Get the resized image from the context and a UIImage
    CGImageRef newImageRef = CGBitmapContextCreateImage(bitmap);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];

    // Clean up
    CGContextRelease(bitmap);
    CGImageRelease(newImageRef);

    return newImage;
}
UIImageWriteToSavedPhotosAlbum([image resizedImage:CGSizeMake(300, 300) interpolationQuality:kCGInterpolationNone], nil, nil, nil);
Please see the resulting image below and let me know if you need any help.