Image Processing in iPhone - Objective-C

I am adding an image-processing feature to my iPhone application. It should support brightness, contrast, sharpen, and exposure adjustments.
But I am not able to find any article or tutorial on this on the Internet. Will you please help me find a tutorial, or tell me how I can implement this in an iPhone view-based application?
I have found one link, http://www.iphonedevsdk.com/forum/iphone-sdk-development/10094-adjust-image-brightness-contrast-fly.html, which worked for brightness, but it does not work on the iPad.
So please suggest something I can start with for my image-processing logic.
Thanks
Rick Jackson

I personally like the approach in the GLImageProcessing project from Apple's sample code. Check it out.

There are a few libraries that support image processing in Quartz. There are even a few categories on UIImage to do some basic stuff.
The following are a few examples:
https://github.com/esilverberg/ios-image-filters
https://github.com/cmkilger/CKImageAdditions
http://code.google.com/p/simple-iphone-image-processing/
But as @Felz said before, those libraries are slow because they use the Quartz codebase, which isn't that fast (for example, changing the saturation of a 1024x1024 image might take 4 to 8 seconds, depending on which device you're using).
If your project targets iOS 5 or higher, you should definitely consider using Core Image instead.
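As a minimal sketch of the Core Image route, here is roughly how brightness, contrast, saturation, and exposure can be adjusted with the built-in CIColorControls and CIExposureAdjust filters; sourceImage and the parameter values are placeholders:

    CIImage *input = [CIImage imageWithCGImage:sourceImage.CGImage];

    // Brightness, contrast, and saturation in a single filter.
    CIFilter *color = [CIFilter filterWithName:@"CIColorControls"];
    [color setValue:input forKey:kCIInputImageKey];
    [color setValue:@0.1 forKey:kCIInputBrightnessKey]; // -1.0 ... 1.0
    [color setValue:@1.2 forKey:kCIInputContrastKey];   // 1.0 = unchanged
    [color setValue:@1.0 forKey:kCIInputSaturationKey];

    // Exposure adjustment, in EV stops.
    CIFilter *exposure = [CIFilter filterWithName:@"CIExposureAdjust"];
    [exposure setValue:color.outputImage forKey:kCIInputImageKey];
    [exposure setValue:@0.5 forKey:kCIInputEVKey];

    // Render the filter chain back into a UIImage.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgResult = [context createCGImage:exposure.outputImage
                                        fromRect:[exposure.outputImage extent]];
    UIImage *result = [UIImage imageWithCGImage:cgResult];
    CGImageRelease(cgResult);

For sharpening, look at CISharpenLuminance or CIUnsharpMask in the same framework.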

You can try the GPUImage framework created by Brad Larson. It includes awesome image filters and is also easy to use.
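For instance, a one-off brightness adjustment with GPUImage looks roughly like this (a sketch based on the framework's convenience API; details may vary with the version you use):

    #import "GPUImage.h"

    // GPU-accelerated brightness adjustment on a still image.
    GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
    brightnessFilter.brightness = 0.2; // -1.0 ... 1.0, where 0.0 leaves the image unchanged
    UIImage *adjusted = [brightnessFilter imageByFilteringImage:sourceImage];

It is much faster than the Quartz-based approaches above because the filtering runs on the GPU.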

How to achieve this image distortion effect in iOS?

I'm building an iPhone photo app. I don't know how to achieve this kind of effect on an image:
When the user drags a finger across the arrow, I want the image to be distorted accordingly. How can I achieve this? Is there any framework that makes this process simple?
Thank you.
No clue about iOS, but the look reminds me of a thin-plate spline warp. It's quite easy to implement in OpenGL, and a quick Google search for example code returns plenty of hits.
From the sound of it, I think you're looking for the Core Image framework. Look into the various distortion effects, in particular CIBumpDistortionLinear.
I don't have any code of my own, so check out a Core Image tutorial and read up on the distortion filters.
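As a rough, untested sketch of what applying that filter looks like (the center, radius, scale, and angle values are placeholders you would drive from the drag gesture):

    CIImage *input = [CIImage imageWithCGImage:photo.CGImage];

    // CIBumpDistortionLinear stretches the image along a line through inputCenter.
    CIFilter *bump = [CIFilter filterWithName:@"CIBumpDistortionLinear"];
    [bump setValue:input forKey:kCIInputImageKey];
    [bump setValue:[CIVector vectorWithX:150 Y:200] forKey:kCIInputCenterKey];
    [bump setValue:@100.0 forKey:kCIInputRadiusKey];
    [bump setValue:@0.5 forKey:kCIInputScaleKey]; // distortion strength
    [bump setValue:@0.0 forKey:kCIInputAngleKey]; // direction of the stretch

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgDistorted = [context createCGImage:bump.outputImage
                                           fromRect:input.extent];
    UIImage *distorted = [UIImage imageWithCGImage:cgDistorted];
    CGImageRelease(cgDistorted);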

Standard iOS 7 blur implementation

I have read dozens of questions about mimicking the iOS 7 blur effect in earlier versions of iOS. But the fundamental question here is: does iOS 7's UIKit really have a nice and convenient way to blur any UIView? That seems quite logical to me. Any help appreciated.
Apple has provided sample code to blur any UIImage in iOS 7's style.
Go to Apple Developer downloads and log in if necessary. Search for "imageeffects" to bring up the WWDC 2013 Sample Code entry, and download the iOS_UIImageEffects sample code.
In that project, there's a UIImage category in UIImage+ImageEffects.h that you can copy into your own project, containing the applyBlurWithRadius:... method.
You'll need to link your project with the Accelerate framework.
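Usage is then a one-liner on any snapshot image. From memory, the category's full method signature is roughly the following, so double-check the parameter names against the downloaded header:

    #import "UIImage+ImageEffects.h"

    // Blur a snapshot in the iOS 7 style; tint and saturation boost are optional.
    UIImage *blurred = [snapshot applyBlurWithRadius:20.0
                                           tintColor:[UIColor colorWithWhite:1.0 alpha:0.3]
                               saturationDeltaFactor:1.8
                                           maskImage:nil];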
This won't automatically do a blur on a view in realtime — for that, see FXBlurView, which performs a similar technique by automatically snapshotting its superview. It can be pretty performance-intensive, though: consider first whether you can achieve what you want by statically blurring an image, rather than trying to "live"-blur moving content.
UPDATE
As of iOS 8.0 (not iOS 7), UIKit provides UIVisualEffectView and UIBlurEffect. You can watch WWDC 2014 Session 419 “Advanced Graphics and Animations for iOS Apps” (or just download the slides) for an introduction to these classes.
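A minimal usage sketch of those classes, assuming you want a blur layered over an existing imageView:

    // iOS 8+: live blur over any view hierarchy.
    UIBlurEffect *blurEffect = [UIBlurEffect effectWithStyle:UIBlurEffectStyleLight];
    UIVisualEffectView *blurView = [[UIVisualEffectView alloc] initWithEffect:blurEffect];
    blurView.frame = imageView.bounds;
    blurView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
    [imageView addSubview:blurView];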
ORIGINAL
No, it doesn't. Apple has not exposed an interface for doing this conveniently. For example, if you read through this discussion, Rincewind's responses should make it clear that Apple doesn't provide a public API for this, and that the private APIs they use have serious limits and are likely to change.
You must implement the blur effect yourself. You'll probably want to use the new -[UIView drawViewHierarchyInRect:afterScreenUpdates:] method to capture the appearance of the background view, and then apply a CIFilter to it to perform the blur.
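A sketch of that approach, assuming backgroundView is the view whose appearance you want to blur:

    // Capture the view into a bitmap (iOS 7+).
    UIGraphicsBeginImageContextWithOptions(backgroundView.bounds.size, NO, 0.0);
    [backgroundView drawViewHierarchyInRect:backgroundView.bounds afterScreenUpdates:NO];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Blur the snapshot with Core Image.
    CIImage *input = [CIImage imageWithCGImage:snapshot.CGImage];
    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setValue:input forKey:kCIInputImageKey];
    [blur setValue:@8.0 forKey:kCIInputRadiusKey];

    // Crop back to the original extent, since a Gaussian blur grows the image edges.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgBlurred = [context createCGImage:blur.outputImage fromRect:input.extent];
    UIImage *blurredSnapshot = [UIImage imageWithCGImage:cgBlurred];
    CGImageRelease(cgBlurred);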
UIKit does not have a convenient way of achieving this effect. However, there are a few libraries on GitHub that achieve it easily. Nick Lockwood's FXBlurView seems to be the most popular.

Detect Winks in front facing camera using CIFaceFeature

I have an app which uses AVFoundation and tracks the face, eyes, and mouth position. I use the CIFaceFeature to detect these and mark them on the screen.
Is there a simple way to detect a wink using the framework?
For iOS 7: yes, you can now do it with Core Image.
Here is the relevant API diff from iOS 7 beta 2:
    CoreImage
    CIDetector.h
        Added CIDetectorEyeBlink
        Added CIDetectorSmile
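A hedged sketch of how those detector options combine with CIFaceFeature's eye-state properties to spot a wink (frameImage stands in for whatever CIImage you pull from the camera feed):

    // Request eye-blink and smile tracking when running face detection.
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{CIDetectorAccuracy : CIDetectorAccuracyLow}];
    NSArray *features = [detector featuresInImage:frameImage
                                          options:@{CIDetectorEyeBlink : @YES,
                                                    CIDetectorSmile : @YES}];
    for (CIFaceFeature *face in features) {
        // A wink is one eye closed while the other stays open.
        if (face.leftEyeClosed != face.rightEyeClosed) {
            NSLog(@"Wink detected");
        }
    }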
Before iOS 7:
No, there is no way to do this with the built-in iOS frameworks (AVFoundation or Core Image).
You can check out OpenCV... but it's more of a research topic, with no guarantee it will work well in different situations:
First, you need to build an eye open/closed classifier. As far as I know there is no built-in eye-wink classifier in OpenCV, so you need to collect enough "closed" and "open" samples and train a binary classifier. (I would suggest Principal Component Analysis plus a Support Vector Machine; both are available in OpenCV.)
Then, in iOS, use Core Image to detect the locations of both eyes and cut a square patch around each eye center. The size of the patch should be normalized against the detected face bounds rectangle.
Next, convert the UIImage/CIImage to OpenCV's IplImage or CvMat format (a sketch of this conversion follows below) and feed it into your classifier to determine whether the eyes are open or closed.
Finally, determine whether there was a wink based on the sequence of eye-open and eye-closed states.
(You also need to check that your processing frame rate is high enough to catch a wink at all: if the whole wink happens between two processed frames, you'll never detect it.)
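Here is a sketch of the conversion step only, assuming OpenCV's iOS bindings are linked and the file is compiled as Objective-C++ (.mm); UIImageToMat comes from OpenCV's ios.h helper header, and the SVM call is illustrative:

    #import <opencv2/opencv.hpp>
    #import <opencv2/highgui/ios.h> // UIImageToMat / MatToUIImage helpers (OpenCV 2.4)

    // Convert the cropped eye patch to a grayscale cv::Mat for the classifier.
    cv::Mat eyePatch;
    UIImageToMat(eyePatchImage, eyePatch);          // yields an RGBA matrix
    cv::cvtColor(eyePatch, eyePatch, CV_RGBA2GRAY);

    // Flatten to a single row and query the trained SVM:
    // float isOpen = svm.predict(eyePatch.reshape(1, 1));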
It's a hard problem... otherwise Apple would likely have included it in the frameworks already.

360 degree video in MPMoviePlayerController

I am trying to develop an iPhone application which needs to show a 360-degree video (like the one linked) and rotate the video as the phone moves. How can I do this? Is it possible with a normal MPMoviePlayerController?
I don't think you can do this with a normal MPMoviePlayerController, but there are several libraries out there to achieve this. Have a look here:
PanoramaGL
Panorama 360
They work with OpenGL and you can embed them in your Objective-C code.
EDIT:
As @Mangesh Vyas kindly pointed out, those libraries are intended for use with fixed images only. However, they might be a suitable starting point for embedding video as well if you modify the code accordingly. They already handle direction, accelerometer input, etc., so you don't have to implement all of that yourself.

iOS: compare a slice of an image to library of options

I'm basically trying to work out how to take a slice of an image (say, a screenshot of an iPhone home screen), slice out the first icon, and compare it to a set array of images in a library. Any help on where to start?
I'm no iPhone programmer, but I might be able to suggest a few things:
The SURF feature detection implemented in OpenCV should help you with this.
There is a nice article on using OpenCV in Objective-C code.
A quick-and-dirty way might be to use the difference blend mode, which returns the difference between the first image (top) and the second image (bottom). If there is no difference, the result is completely black. So the more black pixels in the difference result, the more similar the compared images are likely to be.
I'm not an iOS developer, so I don't know if there is an image library that ships with the SDK, or a free/open-source library for basic image processing. Still, this should be trivial to implement:
e.g.

    // Per-channel difference of two pixel values; 0 (black) means identical.
    - (int)difference:(int)topPixel bottom:(int)bottomPixel
    {
        return abs(topPixel - bottomPixel);
    }
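In UIKit terms, a hedged sketch of that idea using Core Graphics' difference blend mode (imageA and imageB are placeholder names for two equal-sized slices):

    // Composite the two slices with the "difference" blend mode.
    UIGraphicsBeginImageContextWithOptions(imageA.size, YES, 1.0);
    [imageA drawInRect:(CGRect){CGPointZero, imageA.size}];
    [imageB drawInRect:(CGRect){CGPointZero, imageA.size}
             blendMode:kCGBlendModeDifference
                 alpha:1.0];
    UIImage *diff = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // A mostly-black diff means the slices are very similar; walking the bitmap
    // bytes (e.g. via CGBitmapContextCreate) turns that into a numeric score.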
HTH
This may not help you with taking a screenshot of the iOS home screen... But these articles show how to take snapshots from within a UIKit application:
https://developer.apple.com/library/prerelease/ios/#qa/qa1703/_index.html
https://developer.apple.com/library/prerelease/ios/#qa/qa1714/_index.html
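The technique those articles describe boils down to rendering a view's layer into a bitmap context, roughly (requires linking QuartzCore):

    #import <QuartzCore/QuartzCore.h>

    // Snapshot any UIView from within your own app.
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();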
Perhaps you could instruct the user to press Home+Power to take a screenshot (which is saved to the photo roll), then have your app load that screenshot and process it.
Hope this helps!