Adding CIFilters to selfie camera feed from ARSCNView - objective-c

I'm setting up my AR scene with the code below, which works perfectly. I'm then trying to apply a CIFilter to the selfie camera feed and set the result back in real time. As far as I can tell, the only way to access the camera feed data is through the -session:didUpdateFrame: delegate method, but when I apply the filter and set the result back, the feed is pure white, as if I had set background.contents = [UIColor whiteColor] (which I did not).
self.sceneView = [[ARSCNView alloc] initWithFrame:self.view.frame];
self.sceneView.delegate = self;
self.sceneView.showsStatistics = NO;
[self.view addSubview:self.sceneView];
SCNScene *scene = [SCNScene new];
self.sceneView.scene = scene;
ARFaceTrackingConfiguration *facetrack = [[ARFaceTrackingConfiguration alloc] init];
[self.sceneView.session runWithConfiguration:facetrack];
self.sceneView.session.delegate = self;
- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame {
    CIImage *beginImage = [CIImage imageWithCVPixelBuffer:frame.capturedImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIComicEffect" keysAndValues:kCIInputImageKey, beginImage, nil];
    CIImage *outputImage = [[filter outputImage] imageByCroppingToRect:beginImage.extent];
    UIImage *sat_img = [UIImage imageWithCIImage:outputImage];
    self.sceneView.scene.background.contents = sat_img;
}
Is this even possible?
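Note: a common explanation for the all-white feed is that [UIImage imageWithCIImage:] produces a UIImage with no bitmap backing, which SceneKit can't use as background contents. A minimal sketch of one possible fix, rendering through a CIContext first (the _ciContext ivar is an assumption here, created once rather than per frame):

- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame {
    CIImage *beginImage = [CIImage imageWithCVPixelBuffer:frame.capturedImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIComicEffect"
                                  keysAndValues:kCIInputImageKey, beginImage, nil];
    CIImage *outputImage = [[filter outputImage] imageByCroppingToRect:beginImage.extent];
    // Render to a real CGImage so the resulting UIImage has bitmap backing.
    CGImageRef cgImage = [_ciContext createCGImage:outputImage fromRect:outputImage.extent];
    self.sceneView.scene.background.contents = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
}

Creating a CGImage every frame is expensive, though; for real-time use, rendering the CIImage into a Metal texture assigned to background.contents would likely perform better.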

Related

Change NSImage brightness using NSSlider

I am working on a simple image processing app; here is my unsuccessful attempt to change NSImage brightness:
- (IBAction)brightnessSlider:(NSSlider *)sender {
    ViewController *controller = (ViewController *)[NSApplication sharedApplication].keyWindow.contentViewController;
    controller.imageView.image = originalImage;
    CIImage *const beginImage = [self fromNSImageToCIImage:originalImage];
    filter = [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, beginImage, kCIInputBrightnessKey, [sender doubleValue], nil];
    self->filteredImage = self->filter.outputImage;
    controller.imageView.image = [self fromCIImageToNSImage:filteredImage];
}
Here is the implementation of fromNSImageToCIImage and fromCIImageToNSImage:
- (NSImage *)fromCIImageToNSImage:(CIImage *)inputImage {
    CGImageRef cg = [context createCGImage:inputImage fromRect:[inputImage extent]];
    NSImage *finalImage = [[NSImage alloc] initWithCGImage:cg size:NSZeroSize];
    return finalImage;
}
- (CIImage *)fromNSImageToCIImage:(NSImage *)inputImage {
    CGImageRef cg = [inputImage CGImageForProposedRect:nil context:nil hints:nil];
    CIImage *temp = [[CIImage alloc] initWithCGImage:cg options:nil];
    return temp;
}
Is this the right way to approach this? I don't even have a clue what isn't working. Thanks.
EDITED (I rewrote the code, and the problem is with the sender value):
- (IBAction)brightnessSlider:(NSSlider *)sender {
    ViewController *controller = (ViewController *)[NSApplication sharedApplication].keyWindow.contentViewController;
    CIImage *const beginImage = [self fromNSImageToCIImage:controller.imageView.image];
    CIFilter *brightness = [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, beginImage, @"inputBrightness", [sender doubleValue], nil];
    CIImage *outputImage = brightness.outputImage;
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    NSImage *newImage = [[NSImage alloc] initWithCGImage:cgimg size:NSZeroSize];
    controller.imageView.image = newImage;
}
It crashes when the NSSlider is used, on this line: CIFilter *brightness = [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, beginImage, @"inputBrightness", [sender doubleValue], nil];
and the debugger shows: Thread 1: EXC_BAD_ACCESS (code=1, address=0x6fbbba08)
Changing the faulty line to this: CIFilter *brightness = [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, beginImage, @"inputBrightness", [NSNumber numberWithFloat:[sender floatValue]], nil]; did the job.
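Note: the crash makes sense, because filterWithName:keysAndValues: is a variadic method that expects a nil-terminated list of Objective-C objects, so a raw double pushed into the argument list gets read back as a pointer, hence EXC_BAD_ACCESS. With a boxing literal the fix reads a little cleaner (a sketch of the same call):

CIFilter *brightness = [CIFilter filterWithName:@"CIColorControls"
                                  keysAndValues:kCIInputImageKey, beginImage,
                                                kCIInputBrightnessKey, @([sender doubleValue]),
                                                nil];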

How to take a screen shot of a scroll view and attach to an email?

I have a UIScrollView whose content needs to be emailed. A screenshot only captures the visible area of the screen, but the scroll view's content size is 768 x 2000. The method I am using at the moment is the following:
- (IBAction)Email
{
    UIImage *image = nil;
    UIGraphicsBeginImageContext(_scrollView.contentSize);
    {
        CGPoint savedContentOffset = _scrollView.contentOffset;
        CGRect savedFrame = _scrollView.frame;
        _scrollView.contentOffset = CGPointZero;
        _scrollView.frame = CGRectMake(0, 0, _scrollView.contentSize.width, _scrollView.contentSize.height);
        [_scrollView.layer renderInContext:UIGraphicsGetCurrentContext()];
        image = UIGraphicsGetImageFromCurrentImageContext();
        _scrollView.contentOffset = savedContentOffset;
        _scrollView.frame = savedFrame;
    }
    UIGraphicsEndImageContext();
    NSData *imageData = UIImageJPEGRepresentation(image, 0.95);
    if ([MFMailComposeViewController canSendMail]) {
        MFMailComposeViewController *mc = [[MFMailComposeViewController alloc] init];
        mc.mailComposeDelegate = self;
        [mc addAttachmentData:imageData mimeType:@"image/jpeg" fileName:@"Attachment.jpeg"];
        [self presentModalViewController:mc animated:YES];
    }
}
Thanks for the replies. I realised I only had the scroll view as an outlet and not a property. Changed the code to the following and fixed the issue:
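The code that followed was not included; based on the description, the change was presumably along these lines (a sketch, property name assumed):

// In the view controller's interface: a real property instead of a bare ivar outlet.
@property (nonatomic, strong) IBOutlet UIScrollView *scrollView;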

How to control memory usage when applying CIFilters?

When I apply CIFilters to images, the memory usage keeps growing and I don't know what to do.
I've tried everything I could:
using @autoreleasepool:
- (UIImage *)applySepiaToneTo:(UIImage *)img // Sepia
{
    @autoreleasepool
    {
        CIImage *ciimageToFilter = [CIImage imageWithCGImage:img.CGImage];
        CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"
                                     keysAndValues:kCIInputImageKey, ciimageToFilter,
                                                   @"inputIntensity", @1.0, nil];
        return [self retrieveFilteredImageWithFilter:sepia];
    }
}
- (UIImage *)retrieveFilteredImageWithFilter:(CIFilter *)filtro
{
    @autoreleasepool
    {
        CIImage *ciimageFiltered = [filtro outputImage];
        CGImageRef cgimg = [_context createCGImage:ciimageFiltered
                                          fromRect:[ciimageFiltered extent]];
        UIImage *filteredImage = [UIImage imageWithCGImage:cgimg];
        CGImageRelease(cgimg);
        return filteredImage;
    }
}
I'm also downsizing the image to be filtered and doing the filtering in a background thread:
- (void)filterWasSelected:(NSNotification *)notification
{
    self.darkeningView.alpha = 0.5;
    self.darkeningView.userInteractionEnabled = YES;
    [self.view bringSubviewToFront:self.darkeningView];
    [self.activityIndic startAnimating];
    [self.view bringSubviewToFront:self.activityIndic];
    int indice = [notification.object intValue];
    __block NSArray *returnObj;
    __block UIImage *auxUiimage;
    if (choosenImage.size.width == 1280 || choosenImage.size.height == 1280)
    {
        UIImageView *iv;
        if (choosenImage.size.width >= choosenImage.size.height)
        {
            float altura = (320 * choosenImage.size.height) / choosenImage.size.width;
            iv = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, altura)];
            iv.image = choosenImage;
        }
        else
        {
            float largura = (choosenImage.size.width * 320) / choosenImage.size.height;
            iv = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, largura, 320)];
            iv.image = choosenImage;
        }
        UIGraphicsBeginImageContextWithOptions(iv.bounds.size, YES, 0.0);
        [iv.layer renderInContext:UIGraphicsGetCurrentContext()];
        auxUiimage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    }
    else
        auxUiimage = choosenImage;
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
        if (artisticCollection)
            returnObj = [self.filterCoordinator setupFilterArtisticType:indice toImage:auxUiimage];
        else
            returnObj = [self.filterCoordinator setupFilterOldOrVintageType:indice toImage:auxUiimage];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.darkeningView.alpha = 0.3;
            self.darkeningView.userInteractionEnabled = NO;
            [self.activityIndic stopAnimating];
            [self.view bringSubviewToFront:stageBackground];
            [self.view bringSubviewToFront:stage];
            [self.view bringSubviewToFront:self.filtersContainerView];
            [self.view bringSubviewToFront:self.framesContainerView];
            [self.view bringSubviewToFront:self.colorsContainerView];
            if (returnObj)
            {
                auxUiimage = [returnObj firstObject];
                NSLog(@"filtered image width = %f and height = %f", auxUiimage.size.width, auxUiimage.size.height);
                returnObj = nil;
                choosenImageContainer.image = auxUiimage;
            }
        });
    });
}
I've also tried creating the context using the contextWithEAGLContext: method; nothing changed.
I've researched a lot, including Stack Overflow, and found nothing.
Until I place the image in the image view (the image comes from the photo album) I'm only using 23 MB of memory; when I apply a filter, the usage jumps to 51 MB and does not come down. If I continue to apply other filters, the memory usage only grows.
There's no leaking in my app; I've checked in Instruments.
The bringSubviewToFront calls are not responsible either; I've checked.
The growth happens at the creation of the CIImage, followed by the creation of the CIFilter object.
I know that applying the filter loads data into memory, but how do I clean up that memory afterwards?
Is there any secret that I'm not aware of? Please help.
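Note: two things worth checking. CIContext is expensive and caches aggressively, so _context should be created exactly once and reused across filter runs. And the UIImageView-based downscaling above decodes and holds the full-size bitmap while rendering; ImageIO can downsample straight from the encoded data without ever decoding the full image. A sketch under those assumptions (the 640-pixel limit is an arbitrary example):

#import <ImageIO/ImageIO.h>

// Downsample directly from encoded data; the full-resolution bitmap is never decoded.
- (UIImage *)downsampledImageFromData:(NSData *)imageData maxPixelSize:(CGFloat)maxPixelSize
{
    CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    if (!src) return nil;
    NSDictionary *options = @{
        (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id)kCGImageSourceCreateThumbnailWithTransform : @YES, // respect EXIF orientation
        (id)kCGImageSourceThumbnailMaxPixelSize : @(maxPixelSize)
    };
    CGImageRef small = CGImageSourceCreateThumbnailAtIndex(src, 0, (__bridge CFDictionaryRef)options);
    CFRelease(src);
    if (!small) return nil;
    UIImage *result = [UIImage imageWithCGImage:small];
    CGImageRelease(small);
    return result;
}

If the growth persists, it is usually Core Image's own texture caches; keeping a single shared context at least bounds that cost.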

Weird issue with Core Image and Window Resizing

OK, here's my issue:
I have an NSImageView subclass onto which the user can drag and drop an image (a.k.a. the dropper).
Once the image is dropped, a 'processed' version of it is shown in an NSImageView next to it (a.k.a. the output).
Pretty straightforward so far.
The thing is, whenever I try resizing the window (after showing the processed image), the app crashes with EXC_BAD_ACCESS and nothing shows in the console (no actual error, even though NSZombieEnabled is set to on).
If I just "copy" the image, the resizing works fine:
[_output setImage:[_dropper image]];
However, when I set _output's image to the filtered version, here comes the problem:
NSImage* processed = [[[_dropper image] copy] filteredWith:[CIFilter sepiaToneWithIntensity:0.5]];
[_output setImage:processed];
And here are the two methods mentioned above (actually, categories I've written on NSImage and CIFilter, respectively):
- (NSImage *)filteredWith:(CIFilter *)filter
{
    NSImage *start = [self copy];
    CIContext *context = [[CIContext alloc] init];
    CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)[start TIFFRepresentation], NULL);
    CGImageRef ref = CGImageSourceCreateImageAtIndex(src, 0, NULL);
    CIImage *img = [CIImage imageWithCGImage:ref];
    [filter setValue:img forKey:kCIInputImageKey];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CGImageRef res = [context createCGImage:result fromRect:[result extent]];
    NSImage *final = [[NSImage alloc] initWithCGImage:res size:[start size]];
    return final;
}
+ (CIFilter *)sepiaToneWithIntensity:(CGFloat)intensity
{
    CIFilter *newFilter = [CIFilter filterWithName:@"CISepiaTone"];
    [newFilter setValue:[NSNumber numberWithFloat:intensity] forKey:@"inputIntensity"];
    return newFilter;
}
Any ideas?
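Note: one thing that stands out is that filteredWith: creates three Core Foundation objects (src, ref, res) and releases none of them, and [[CIContext alloc] init] builds a fresh context on every call. A sketch of a tidier version using NSBitmapImageRep's CIImage support, which skips the manual context and the second ImageIO round-trip; treat it as an alternative to try rather than a diagnosis of the crash:

- (NSImage *)filteredWith:(CIFilter *)filter
{
    CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)[self TIFFRepresentation], NULL);
    CGImageRef ref = CGImageSourceCreateImageAtIndex(src, 0, NULL);
    [filter setValue:[CIImage imageWithCGImage:ref] forKey:kCIInputImageKey];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    // NSBitmapImageRep can wrap a CIImage directly; AppKit renders it on demand.
    NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCIImage:result];
    NSImage *final = [[NSImage alloc] initWithSize:[self size]];
    [final addRepresentation:rep];
    CGImageRelease(ref); // balance the Create calls
    CFRelease(src);
    return final;
}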

Objective C - Saving a Filtered UIImage

I have a view with a slider and a background image. The background image is updated according to the slider, which controls the level of the filter.
I am using this framework https://github.com/BradLarson/GPUImage to process the image with the filter.
With the code below, the image updates with the slider and the filter works great. However, I am trying to save the image once it has been processed by the filter (to set a UIImage in another class), and I cannot achieve this: the filtered image is not saved, only the unfiltered image. Am I saving the image wrong, or in an incorrect format?
Please help me, I have been at this for hours!
- (void)loadView
{
    CGRect mainScreenFrame = [[UIScreen mainScreen] applicationFrame];
    GPUImageView *primaryView = [[GPUImageView alloc] initWithFrame:mainScreenFrame];
    self.view = primaryView;
    imageSlider = [[UISlider alloc] initWithFrame:CGRectMake(25.0, mainScreenFrame.size.height - 50.0, mainScreenFrame.size.width - 50.0, 40.0)];
    [imageSlider addTarget:self action:@selector(updateSliderValue:) forControlEvents:UIControlEventValueChanged];
    imageSlider.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleTopMargin;
    imageSlider.minimumValue = 0.0;
    imageSlider.maximumValue = 1.0;
    imageSlider.value = 0.5;
    [primaryView addSubview:imageSlider];
    [self setupDisplayFiltering];
}
- (IBAction)updateSliderValue:(id)sender
{
    CGFloat midpoint = [(UISlider *)sender value];
    [(GPUImageBrightnessFilter *)sepiaFilter setBrightness:midpoint];
    //[(GPUImageSepiaFilter *)sepiaFilter setIntensity:[(UISlider *)sender value]];
    //[(GPUImageSaturationFilter *)sepiaFilter setSaturation:midpoint];
    //[(GPUImageRGBFilter *)sepiaFilter setGreen:midpoint];
    [sourcePicture processImage];
}
- (void)setupDisplayFiltering
{
    NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
    NSData *imageToUseData = [defaults dataForKey:@"imageToUse"];
    UIImage *inputImage = [UIImage imageWithData:imageToUseData];
    sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
    //sepiaFilter = [[GPUImageSepiaFilter alloc] init];
    sepiaFilter = [[GPUImageBrightnessFilter alloc] init];
    GPUImageView *imageView = (GPUImageView *)self.view;
    [sepiaFilter forceProcessingAtSize:imageView.sizeInPixels]; // This is now needed to make the filter run at the smaller output size
    [sourcePicture addTarget:sepiaFilter];
    [sepiaFilter addTarget:imageView];
    [sourcePicture processImage];
    UIImage *outputImage = [sepiaFilter imageFromCurrentlyProcessedOutput];
    UIImage *quickFilteredImage = [brightnessFilter imageByFilteringImage:inputImage];
    // set the image chosen for other classes to set UIImages with
    NSData *imageToStore = UIImageJPEGRepresentation(outputImage, 100);
    [defaults removeObjectForKey:@"imageToUse"];
    [defaults setObject:imageToStore forKey:@"imageToUse"];
}