How to take a screenshot of a scroll view and attach it to an email? - objective-c

I have a UIScrollView with content that needs to be emailed. A screenshot only captures the visible area on the screen, but the scroll view's content size is 768 x 2000. The method I am using at the moment is the following:
- (IBAction)Email
{
    UIImage *image = nil;
    UIGraphicsBeginImageContext(_scrollView.contentSize);
    {
        CGPoint savedContentOffset = _scrollView.contentOffset;
        CGRect savedFrame = _scrollView.frame;
        _scrollView.contentOffset = CGPointZero;
        _scrollView.frame = CGRectMake(0, 0, _scrollView.contentSize.width, _scrollView.contentSize.height);
        [_scrollView.layer renderInContext:UIGraphicsGetCurrentContext()];
        image = UIGraphicsGetImageFromCurrentImageContext();
        _scrollView.contentOffset = savedContentOffset;
        _scrollView.frame = savedFrame;
    }
    UIGraphicsEndImageContext();
    NSData *imageData = UIImageJPEGRepresentation(image, 0.95);
    if ([MFMailComposeViewController canSendMail]) {
        MFMailComposeViewController *mc = [[MFMailComposeViewController alloc] init];
        mc.mailComposeDelegate = self;
        [mc addAttachmentData:imageData mimeType:@"image/jpeg" fileName:@"Attachment.jpeg"];
        [self presentModalViewController:mc animated:YES];
    }
}
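Since mailComposeDelegate is set to self, the controller presumably also implements the dismissal callback; a minimal sketch of that delegate method (assumed here, as it is not shown in the question):

- (void)mailComposeController:(MFMailComposeViewController *)controller
          didFinishWithResult:(MFMailComposeResult)result
                        error:(NSError *)error
{
    // Dismiss the composer whether the mail was sent, saved, or cancelled.
    [self dismissModalViewControllerAnimated:YES];
}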
Thanks for the replies. I realised I only had the scroll view as an outlet and not a property; changing it to a property fixed the issue.
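The corrected declaration isn't shown, but the described fix presumably boils down to something like this (a hypothetical sketch, assuming ARC):

// Hypothetical: the scroll view exposed as a property, not just a bare outlet.
@property (nonatomic, strong) IBOutlet UIScrollView *scrollView;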

Related

How to control memory usage when applying CIFilters?

When I apply CIFilters to images, the memory usage keeps growing and I don't know what to do.
I've tried everything I could think of:
using @autoreleasepool:
- (UIImage *)applySepiaToneTo:(UIImage *)img // Sepia
{
    @autoreleasepool
    {
        CIImage *ciimageToFilter = [CIImage imageWithCGImage:img.CGImage];
        CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"
                                     keysAndValues:kCIInputImageKey, ciimageToFilter,
                                                   @"inputIntensity", @1.0, nil];
        return [self retrieveFilteredImageWithFilter:sepia];
    }
}

- (UIImage *)retrieveFilteredImageWithFilter:(CIFilter *)filtro
{
    @autoreleasepool
    {
        CIImage *ciimageFiltered = [filtro outputImage];
        CGImageRef cgimg = [_context createCGImage:ciimageFiltered
                                          fromRect:[ciimageFiltered extent]];
        UIImage *filteredImage = [UIImage imageWithCGImage:cgimg];
        CGImageRelease(cgimg);
        return filteredImage;
    }
}
I'm also downsizing the image to be filtered and doing the filtering in a background thread:
- (void)filterWasSelected:(NSNotification *)notification
{
    self.darkeningView.alpha = 0.5;
    self.darkeningView.userInteractionEnabled = YES;
    [self.view bringSubviewToFront:self.darkeningView];
    [self.activityIndic startAnimating];
    [self.view bringSubviewToFront:self.activityIndic];
    int indice = [notification.object intValue];
    __block NSArray *returnObj;
    __block UIImage *auxUiimage;
    if (choosenImage.size.width == 1280 || choosenImage.size.height == 1280)
    {
        UIImageView *iv;
        if (choosenImage.size.width >= choosenImage.size.height)
        {
            float altura = (320 * choosenImage.size.height) / choosenImage.size.width;
            iv = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, altura)];
            iv.image = choosenImage;
        }
        else
        {
            float largura = (choosenImage.size.width * 320) / choosenImage.size.height;
            iv = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, largura, 320)];
            iv.image = choosenImage;
        }
        UIGraphicsBeginImageContextWithOptions(iv.bounds.size, YES, 0.0);
        [iv.layer renderInContext:UIGraphicsGetCurrentContext()];
        auxUiimage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    }
    else
        auxUiimage = choosenImage;
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
        if (artisticCollection)
            returnObj = [self.filterCoordinator setupFilterArtisticType:indice toImage:auxUiimage];
        else
            returnObj = [self.filterCoordinator setupFilterOldOrVintageType:indice toImage:auxUiimage];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.darkeningView.alpha = 0.3;
            self.darkeningView.userInteractionEnabled = NO;
            [self.activityIndic stopAnimating];
            [self.view bringSubviewToFront:stageBackground];
            [self.view bringSubviewToFront:stage];
            [self.view bringSubviewToFront:self.filtersContainerView];
            [self.view bringSubviewToFront:self.framesContainerView];
            [self.view bringSubviewToFront:self.colorsContainerView];
            if (returnObj)
            {
                auxUiimage = [returnObj firstObject];
                NSLog(@"filtered image width = %f and height = %f", auxUiimage.size.width, auxUiimage.size.height);
                returnObj = nil;
                choosenImageContainer.image = auxUiimage;
            }
        });
    });
}
I've also tried creating the context using the contextWithEAGLContext: method; nothing changed.
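For reference, a minimal sketch of that EAGL-backed context setup (the _context ivar name comes from the snippet above; the options dictionary is an assumption):

// Create the Core Image context once and reuse it for every render;
// recreating contexts per filter pass is a common source of memory growth.
EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
_context = [CIContext contextWithEAGLContext:eaglContext
                                     options:@{kCIContextWorkingColorSpace : [NSNull null]}];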
I've researched a lot, including Stack Overflow, and found nothing.
Until I place the image in the image view (the image comes from the photo album) I'm only using 23 MB of memory; when I apply a filter, the usage jumps to 51 MB and does not come down. If I keep applying other filters, the memory usage only grows.
There's no leaking in my app; I've checked in Instruments.
The bringSubviewToFront calls are not responsible either; I've checked.
The growth happens in the creation of the CIImage followed by the creation of the CIFilter object.
I know that applying a filter loads data into memory, but how do I free that memory once the filter has been applied?
Is there some secret I'm not aware of? Please help.

Crop UIImagePicker and output to CollectionView cell

I am trying to create a camera button that takes a square picture and then fills a CollectionViewController cell with it. Right now, I'm having a little trouble cropping the picture and outputting it to the cell. In the lines of code commented "Crop the image to a square", I'm getting an error saying that "image" is an undeclared identifier. How do I crop the image and set it to a cell?
- (IBAction)cameraButtonClicked:(id)sender {
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        UIAlertView *cameraAlertView = [[UIAlertView alloc] initWithTitle:@"Camera Not Available" message:@"There is no camera on this device which really defeats the purpose of this game. We suggest you get an iDevice with a camera." delegate:nil cancelButtonTitle:@"Okay" otherButtonTitles:nil];
        [cameraAlertView show];
    } else {
        UIImagePickerController *ipc = [[UIImagePickerController alloc] init];
        ipc.sourceType = UIImagePickerControllerSourceTypeCamera;
        if (ipc.sourceType == UIImagePickerControllerSourceTypeCamera) {
            // Create camera overlay
            CGRect f = ipc.view.bounds;
            f.size.height -= ipc.navigationBar.bounds.size.height;
            CGFloat barHeight = (f.size.height - f.size.width) / 2;
            UIGraphicsBeginImageContext(f.size);
            [[UIColor colorWithWhite:0 alpha:.5] set];
            UIRectFillUsingBlendMode(CGRectMake(0, 0, f.size.width, barHeight), kCGBlendModeNormal);
            UIRectFillUsingBlendMode(CGRectMake(0, f.size.height - barHeight, f.size.width, barHeight), kCGBlendModeNormal);
            UIImage *overlayImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            UIImageView *overlayIV = [[UIImageView alloc] initWithFrame:f];
            overlayIV.image = overlayImage;
            [ipc.cameraOverlayView addSubview:overlayIV];
        }
        ipc.delegate = self;
        [self presentViewController:ipc animated:YES completion:nil];
        // Crop the image to a square
        CGSize imageSize = image.size;
        CGFloat width = imageSize.width;
        CGFloat height = imageSize.height;
        if (width != height) {
            CGFloat newDimension = MIN(width, height);
            CGFloat widthOffset = (width - newDimension) / 2;
            CGFloat heightOffset = (height - newDimension) / 2;
            UIGraphicsBeginImageContextWithOptions(CGSizeMake(newDimension, newDimension), NO, 0.);
            [image drawAtPoint:CGPointMake(-widthOffset, -heightOffset)
                     blendMode:kCGBlendModeCopy
                         alpha:1.];
            image = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
        }
    }
}
#pragma mark ImagePicker Delegate
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self dismissViewControllerAnimated:YES completion:nil];
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    self.imageView.image = image;
}
- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [self dismissViewControllerAnimated:YES completion:nil];
}
@end
Move your "Crop" coded below this line:
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage]; in
didFinishPickingMediaWithInfo
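Roughly like this, reusing the asker's own crop code inside the delegate callback (a sketch, not a drop-in replacement):

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self dismissViewControllerAnimated:YES completion:nil];
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    // Crop the image to a square; "image" is now declared, so the crop compiles
    CGFloat newDimension = MIN(image.size.width, image.size.height);
    CGFloat widthOffset = (image.size.width - newDimension) / 2;
    CGFloat heightOffset = (image.size.height - newDimension) / 2;
    UIGraphicsBeginImageContextWithOptions(CGSizeMake(newDimension, newDimension), NO, 0.0);
    [image drawAtPoint:CGPointMake(-widthOffset, -heightOffset)
             blendMode:kCGBlendModeCopy
                 alpha:1.0];
    image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // Hand the cropped image to your data source here (see the snippet below)
}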
You will need to update your dataSource with the newly cropped image. You can then either call reloadData or insertItemsAtIndexPaths: on your UICollectionView to have it reflect the newly added image. I would side with calling the latter method as it will be more efficient than reloading the entire collection with reloadData (when there are lots of images in the collection).
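A minimal sketch of that insertion path (the self.images data source array and croppedImage variable are assumptions):

// Append the cropped image to the data source, then insert just that item.
[self.images addObject:croppedImage];
NSIndexPath *indexPath = [NSIndexPath indexPathForItem:self.images.count - 1 inSection:0];
[self.collectionView insertItemsAtIndexPaths:@[indexPath]];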
See the UICollectionView class reference for details.

NSMutableArray not holding UIImages properly?

I have enabled my Cocoa Touch app to be navigable by swiping left or right to alter positions in history. The animation is done roughly like Android's "card" style, where swiping to the left (<--) just moves the current screen image out of the way, revealing the previous view beneath it.
This works fine, but when I want to swipe the other way (-->), to go back, I need to get the previous image and move that over the current view. I had this working when I only stored the previous image, but if I go <-- a few times, I will not have enough images.
So the solution is obvious: use an NSMutableArray and throw the latest image at the front of the array, and when you swipe the other way, use the first image in the array. However, the image never shows when I start the animation. It just shows nothing. Here are the relevant methods:
-(void)animateBack {
    CGRect screenRect = [self.view bounds];
    UIImage *screen = [self captureScreen];
    [imageArray insertObject:screen atIndex:0]; // Insert the screenshot at the first index
    imgView = [[UIImageView alloc] initWithFrame:screenRect];
    imgView.image = screen;
    [imgView setHidden:NO];
    NSLog(@"Center of imgView is x = %f, y = %f", imgView.center.x, imgView.center.y);
    CGFloat startPointX = imgView.center.x;
    CGFloat width = screenRect.size.width;
    NSLog(@"Width = %f", width);
    imgView.center = CGPointMake(imgView.center.x, imgView.center.y);
    [self.view addSubview:imgView];
    [self navigateBackInHistory];
    [UIView animateWithDuration:.3 animations:^{
        isSwiping = 1;
        imgView.center = CGPointMake(startPointX - width, imgView.center.y);
    } completion:^(BOOL finished){
        // Your animation is finished
        [self clearImage];
        isSwiping = 0;
    }];
}

-(void)animateForward {
    CGRect screenRect = [self.view bounds];
    //UIImage *screen = [self captureScreen];
    UIImage *screen = [imageArray objectAtIndex:0]; // Get the latest image
    imgView = [[UIImageView alloc] initWithFrame:screenRect];
    imgView.image = screen;
    [imgView setHidden:NO];
    NSLog(@"Center of imgView is x = %f, y = %f", imgView.center.x, imgView.center.y);
    CGFloat startPointX = imgView.center.x;
    CGFloat width = screenRect.size.width;
    NSLog(@"Width = %f", width);
    imgView.center = CGPointMake(imgView.center.x - width, imgView.center.y);
    [self.view addSubview:imgView];
    [UIView animateWithDuration:.3 animations:^{
        isSwiping = 1;
        imgView.center = CGPointMake(startPointX, imgView.center.y);
    } completion:^(BOOL finished){
        // Your animation is finished
        [self navigateForwardInHistory];
        [self clearImage];
        isSwiping = 0;
    }];
}

-(void)clearImage {
    [imgView setHidden:YES];
    imgView.image = nil;
}

-(void)navigateBackInHistory {
    [self saveItems:self];
    [self alterIndexBack];
    item = [[[LEItemStore sharedStore] allItems] objectAtIndex:currentHistoryIndex];
    [self loadItems:self];
}

-(void)navigateForwardInHistory {
    [imageArray removeObjectAtIndex:0]; // Remove the latest image, since we just finished swiping this way.
    [self saveItems:self];
    [self alterIndexForward];
    item = [[[LEItemStore sharedStore] allItems] objectAtIndex:currentHistoryIndex];
    [self loadItems:self];
}
Note that imgView is a class-level UIImageView and imageArray is a class-level array.
Any ideas? Thanks.
Edit
Here's the code at the top of my .m to initialize it. It still does not work.
.....
NSMutableArray *imageArray;

- (void)viewDidLoad
{
    [super viewDidLoad];
    imageArray = [imageArray initWithObjects:nil];
It looks like you forgot to create the array: [imageArray initWithObjects:nil] sends a message to nil (imageArray was never alloc'd), so the result is just nil. Something like this at the appropriate time would do (assuming ARC):
imageArray = [NSMutableArray array];
Glad that worked out.

Objective C - Saving a Filtered UIImage

I have a view with a slider and a background image. The background image is updated according to the slider, where the slider controls the level of the filter.
I am using this framework https://github.com/BradLarson/GPUImage to process the image with the filter.
From the code below, the image updates with the slider, and the filter works great. However, I am trying to save the image once it has been processed by the filter (to set a UIImage in another class), but I cannot achieve this: the filtered image is not saved, the unfiltered image is. Am I saving the image wrong, or in an incorrect format?
Please help me, I have been at this for hours!
- (void)loadView
{
    CGRect mainScreenFrame = [[UIScreen mainScreen] applicationFrame];
    GPUImageView *primaryView = [[GPUImageView alloc] initWithFrame:mainScreenFrame];
    self.view = primaryView;
    imageSlider = [[UISlider alloc] initWithFrame:CGRectMake(25.0, mainScreenFrame.size.height - 50.0, mainScreenFrame.size.width - 50.0, 40.0)];
    [imageSlider addTarget:self action:@selector(updateSliderValue:) forControlEvents:UIControlEventValueChanged];
    imageSlider.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleTopMargin;
    imageSlider.minimumValue = 0.0;
    imageSlider.maximumValue = 1.0;
    imageSlider.value = 0.5;
    [primaryView addSubview:imageSlider];
    [self setupDisplayFiltering];
}

- (IBAction)updateSliderValue:(id)sender
{
    CGFloat midpoint = [(UISlider *)sender value];
    [(GPUImageBrightnessFilter *)sepiaFilter setBrightness:midpoint];
    //[(GPUImageSepiaFilter *)sepiaFilter setIntensity:[(UISlider *)sender value]];
    //[(GPUImageSaturationFilter *)sepiaFilter setSaturation:midpoint];
    //[(GPUImageRGBFilter *)sepiaFilter setGreen:midpoint];
    [sourcePicture processImage];
}

- (void)setupDisplayFiltering
{
    NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
    NSData *imageToUseData = [defaults dataForKey:@"imageToUse"];
    UIImage *inputImage = [UIImage imageWithData:imageToUseData];
    sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
    //sepiaFilter = [[GPUImageSepiaFilter alloc] init];
    sepiaFilter = [[GPUImageBrightnessFilter alloc] init];
    GPUImageView *imageView = (GPUImageView *)self.view;
    [sepiaFilter forceProcessingAtSize:imageView.sizeInPixels]; // This is now needed to make the filter run at the smaller output size
    [sourcePicture addTarget:sepiaFilter];
    [sepiaFilter addTarget:imageView];
    [sourcePicture processImage];
    UIImage *outputImage = [sepiaFilter imageFromCurrentlyProcessedOutput];
    UIImage *quickFilteredImage = [brightnessFilter imageByFilteringImage:inputImage];
    // set the image chosen for other classes to set uiimages with
    NSData *imageToStore = UIImageJPEGRepresentation(outputImage, 100);
    [defaults removeObjectForKey:@"imageToUse"];
    [defaults setObject:imageToStore forKey:@"imageToUse"];
}

Rotating a full screen UIImageView

I have two images:
Help-Portrait.png (320 x 480)
Help-Landscape.png (480 x 320)
When a user clicks the help button on any view, they need to be presented with the correct image, which should also rotate when the device does. I have tried adding the image view to both the window and the navigation controller's view.
For some reason I am having issues with this.
Could anyone shed light on what I am doing wrong?
UIImage *image = nil;
CGRect frame;
if (UIInterfaceOrientationIsPortrait([[UIApplication sharedApplication] statusBarOrientation])) {
    image = [UIImage imageNamed:@"Help-Portrait.png"];
    frame = CGRectMake(0, 0, 320, 480);
} else {
    image = [UIImage imageNamed:@"Help-Landscape.png"];
    frame = CGRectMake(0, 0, 480, 320);
}
if (!helpImageView) {
    helpImageView = [[UIImageView alloc] initWithFrame:frame];
    helpImageView.autoresizingMask = UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth;
    helpImageView.image = image;
}
[[UIApplication sharedApplication] setStatusBarHidden:YES];
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(helpImageTapped:)];
helpImageView.userInteractionEnabled = YES;
[helpImageView addGestureRecognizer:tap];
[self.view addSubview:helpImageView];
[tap release];
And in willRotateToInterfaceOrientation: I have:
if (helpImageView) {
    [(id)[UIApplication sharedApplication] setStatusBarHidden:NO animated:YES];
    if (UIInterfaceOrientationIsPortrait(toInterfaceOrientation)) {
        helpImageView.image = [UIImage imageNamed:@"Help-Portrait.png"];
    } else {
        helpImageView.image = [UIImage imageNamed:@"Help-Landscape.png"];
    }
}
When you rotate the device, the image and the frame don't change, and you end up with two thirds of the portrait image displayed on the left part of the screen.
What I want is for it to show the correct image for the orientation, the right way up. I would also like the image rotation to be animated, but that's a side issue.
The place where you need to adjust your button image is in your view controller's shouldAutorotateToInterfaceOrientation: method.
Do something like:
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    UIImage *image = nil;
    if (UIInterfaceOrientationIsPortrait(interfaceOrientation))
    {
        image = [UIImage imageNamed:@"Help-Portrait.png"];
    } else {
        image = [UIImage imageNamed:@"Help-Landscape.png"];
    }
    [yourButton setImage:image forState:UIControlStateNormal];
    return YES;
}
Michael Dautermann's answer is almost all the way there, but I'm opposed to using shouldAutorotateToInterfaceOrientation. This method is designed only to determine whether a rotation should or should not occur, nothing else.
You should use either didRotateFromInterfaceOrientation: or willAnimateRotationToInterfaceOrientation:duration: instead.
didRotateFromInterfaceOrientation: - interfaceOrientation is already set on your UIViewController, so you can read the current orientation. In this case the rotation animation has already completed.
willAnimateRotationToInterfaceOrientation:duration: - the benefit of this method is execution time: you are inside the rotation animation, so you avoid the less-than-pretty effects that occur when you change the UI after the rotation animation completes.
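A minimal sketch of the second option, using the updateHelpImageForOrientation: helper that appears in the working code below:

- (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
    // Runs inside the rotation animation block, so UI changes animate with the rotation.
    [self updateHelpImageForOrientation:toInterfaceOrientation];
}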
Got it working with this code:
- (void)showHelpImage {
    NSString *imageName = @"Help_Portrait.png";
    CGRect imageFrame = CGRectMake(0, 0, 320, 480);
    helpImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:imageName]];
    helpImageView.frame = imageFrame;
    [self.view addSubview:helpImageView];
    [self updateHelpImageForOrientation:self.interfaceOrientation];
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(helpImageTapped:)];
    helpImageView.userInteractionEnabled = YES;
    [helpImageView addGestureRecognizer:tap];
    [self.view addSubview:helpImageView];
    [tap release];
}

- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
    [self updateHelpImageForOrientation:toInterfaceOrientation];
}

- (void)updateHelpImageForOrientation:(UIInterfaceOrientation)orientation {
    NSString *imageName = nil;
    CGRect imageFrame = helpImageView.frame;
    if (orientation == UIInterfaceOrientationPortrait || orientation == UIInterfaceOrientationPortraitUpsideDown) {
        imageName = @"Help_Portrait.png";
        imageFrame = CGRectMake(0, 0, 320, 480);
    } else if (orientation == UIInterfaceOrientationLandscapeLeft || orientation == UIInterfaceOrientationLandscapeRight) {
        imageName = @"Help_Landscape.png";
        imageFrame = CGRectMake(0, 0, 480, 320);
    }
    helpImageView.image = [UIImage imageNamed:imageName];
    helpImageView.frame = imageFrame;
}
Got the idea from:
http://www.dobervich.com/2010/10/22/fade-out-default-ipad-app-image-with-proper-orientation/