I want to use 'marker reader' technology in my iPhone application (you have a piece of paper, you point the iPhone camera at it, and information is drawn from it). What API kits can I use, and which are available?
The closest SDK I can think of is ARToolKit, which should help as a framework solution for finding markers and deriving information from them. There are a couple of other frameworks that could also be used; I've listed them below:
Layar
ARToolKit
Popcode
SGAREnvironment
I'm looking for an iPhone-based project (preferably iOS 5 with ARC) that uses the iPhone 4's gyro to look around in a spherical coordinate system. The phone is at the center of a sphere, and by looking at the sensor output it can determine where the camera is pointing in spherical coordinates.
I'm not sure whether what I have in mind can be accomplished with iOS 5's CMAttitude, which blends the iPhone 4's sensors. Can it?
I intend to use the project to control a robotic turret so it can "look" at a particular point within a spherical coordinate system.
What comes to mind is that a 360 panorama or a TourWrist-like app would be a good starting point for such a project. Is there something similar that is open source and uses the native iOS Core Motion framework?
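For reference, this is roughly the direction I have in mind so far. It is only a minimal sketch of the Core Motion side using CMMotionManager and CMAttitude; the 60 Hz update interval and the mapping of yaw/pitch to azimuth/elevation are my own assumptions:

#import <CoreMotion/CoreMotion.h>

// Assumes a retained CMMotionManager property called motionManager.
- (void)startTracking
{
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;  // assumed 60 Hz update rate

    [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                            withHandler:^(CMDeviceMotion *motion, NSError *error) {
        // CMAttitude is the sensor-fused orientation (gyro + accelerometer).
        CMAttitude *attitude = motion.attitude;
        // Treat yaw as the azimuthal angle and pitch as the elevation of the
        // direction the camera is pointing; roll is ignored here.
        double azimuth = attitude.yaw;
        double elevation = attitude.pitch;
        // ...feed azimuth/elevation to the turret (or a panorama view) here...
    }];
}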
Thank you!
If you would like to license the TourWrist technology, please let me know. For example, we license the TourWrist capture and viewer APIs/SDKs.
Dan Smigrod
via: support@TourWrist.com
I need to create an iOS 5 application that will run on an iPad 2 (I can use private APIs because the app will not be released in the App Store). It should show the live stream from the front camera, recognize the eyes, and render a pair of glasses (I have the 3D model) that follows the face movements.
What is the best approach, and which technology (e.g. OpenGL ES) should I use?
Just use the libraries included in Xcode. I have a sample here; it's got everything you need.
It uses the AVFoundation, CoreImage, CoreMedia, and CoreVideo frameworks.
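If it helps, here is a minimal sketch of the Core Image face-detection part, assuming you already have a CIImage for the current front-camera frame (the variable names are placeholders):

#import <CoreImage/CoreImage.h>

// frameImage is a CIImage built from the current front-camera frame.
NSDictionary *options = [NSDictionary dictionaryWithObject:CIDetectorAccuracyLow
                                                    forKey:CIDetectorAccuracy];
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:nil
                                          options:options];
NSArray *features = [detector featuresInImage:frameImage];

for (CIFaceFeature *face in features) {
    if (face.hasLeftEyePosition && face.hasRightEyePosition) {
        CGPoint leftEye  = face.leftEyePosition;
        CGPoint rightEye = face.rightEyePosition;
        // Use the eye positions to place and orient the 3D glasses model,
        // e.g. by updating the model-view transform of your OpenGL ES scene.
    }
}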
I am trying to develop an iPhone application which needs to show a 360-degree video (like the one referenced) and rotate the video as the phone moves. How can I do this? Is it possible with a normal MPMoviePlayerController?
I don't think you can do this with a normal MPMoviePlayerController, but there are several libraries out there to achieve this. Have a look here:
PanoramaGL
Panorama 360
They work with OpenGL and you can embed them in your Objective-C code.
EDIT:
As @Mangesh Vyas kindly pointed out, those libraries are intended for use with fixed images only. However, they might be a suitable starting point for embedding video as well, if you modify the code accordingly. They already handle direction, the accelerometer, etc., so you don't have to implement all of that yourself.
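If you go down that road, one way to pull frames out of the movie for use as OpenGL textures is AVAssetReader; a minimal sketch, with videoURL standing in for your movie file:

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

// videoURL points at the 360-degree movie file.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

NSDictionary *settings = [NSDictionary dictionaryWithObject:
                              [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                      forKey:(id)kCVPixelBufferPixelFormatTypeKey];
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:NULL];
AVAssetReaderTrackOutput *output =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:settings];
[reader addOutput:output];
[reader startReading];

CMSampleBufferRef sampleBuffer;
while ((sampleBuffer = [output copyNextSampleBuffer])) {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Upload pixelBuffer as an OpenGL texture and map it onto the panorama sphere
    // in place of the static image the library normally uses.
    CFRelease(sampleBuffer);
}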
I'm basically trying to work out how to take a slice of an image (say, a screenshot of an iPhone home screen), slice out the first icon, and compare it against a set array of images in a library. Any help on where to start?
I'm no iPhone programmer, but I might be able to suggest a few things:
The SURF feature detection implemented in OpenCV should help you with this.
There is a nice article on using OpenCV in Objective-C code.
A quick and dirty way might be to use the difference blend mode, which returns the difference between the first image (top) and the second image (bottom). If there is no difference, the result is completely black. So the more black pixels in the difference result, the more similarities there potentially are between the compared images.
I'm not an iOS developer, so I don't know if there is an image library that ships with the SDK, or if there's a free/open-source library for basic image processing. Still, this should be trivial to implement:
e.g.

// Per-channel difference between two corresponding pixel values.
- (int)difference:(int)topPixel bottomPixel:(int)bottomPixel
{
    return abs(topPixel - bottomPixel);
}
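For the slicing step itself, here is a minimal sketch using CGImageCreateWithImageInRect; the icon rectangle is just a placeholder, so measure the real home-screen layout yourself:

// screenshot is the UIImage containing the full home-screen capture.
CGRect iconRect = CGRectMake(16.0, 30.0, 57.0, 57.0);  // placeholder coordinates
CGImageRef iconRef = CGImageCreateWithImageInRect(screenshot.CGImage, iconRect);
UIImage *iconImage = [UIImage imageWithCGImage:iconRef];
CGImageRelease(iconRef);
// iconImage can now be compared pixel-by-pixel against the reference images.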
HTH
This may not help you with taking a screenshot of the iOS home screen... But these articles show how to take snapshots from within a UIKit application:
https://developer.apple.com/library/prerelease/ios/#qa/qa1703/_index.html
https://developer.apple.com/library/prerelease/ios/#qa/qa1714/_index.html
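For completeness, a minimal sketch of the approach those Q&As describe, assuming you have a reference to the view you want to capture:

#import <QuartzCore/QuartzCore.h>

// Renders the view's layer into an image context at the device's screen scale.
UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();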
Alternatively, you could instruct the user to press the Home and Power buttons together to save a screenshot to the photo roll, then load that screenshot into your app for processing.
Hope this helps!
I am adding an image-processing feature to my iPhone application. It should handle brightness, contrast, sharpening, exposure, and so on.
However, I am not able to find any article or tutorial on the Internet. Could you please help me find a tutorial, or tell me how I can implement this in an iPhone view-based application?
I have found one link, http://www.iphonedevsdk.com/forum/iphone-sdk-development/10094-adjust-image-brightness-contrast-fly.html, which worked for brightness, but it is not working on the iPad.
So please suggest something I can start with for my image-processing logic.
Thanks
Rick Jackson
I personally like the approach in the GLImageProcessing project from Apple's sample code. Check it out.
There are a few libraries that support image processing in Quartz. There are even a few categories on UIImage to do some basic stuff.
The following are a few examples:
https://github.com/esilverberg/ios-image-filters
https://github.com/cmkilger/CKImageAdditions
http://code.google.com/p/simple-iphone-image-processing/
But as said before by @Felz, those libraries are slow because they use the Quartz codebase, which isn't that fast (for example, changing the saturation of a 1024x1024 image might take 4 to 8 seconds, depending on which device you're using).
If your project targets iOS 5 or higher, then you should definitely consider using Core Image.
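For example, here is a minimal Core Image sketch for the brightness/contrast/saturation part; sourceImage and the filter parameter values are just illustrative:

#import <CoreImage/CoreImage.h>

CIImage *inputImage = [CIImage imageWithCGImage:sourceImage.CGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
[filter setValue:inputImage forKey:kCIInputImageKey];
[filter setValue:[NSNumber numberWithFloat:0.1f] forKey:@"inputBrightness"];
[filter setValue:[NSNumber numberWithFloat:1.2f] forKey:@"inputContrast"];
[filter setValue:[NSNumber numberWithFloat:1.0f] forKey:@"inputSaturation"];

// Render the filtered result back into a UIImage.
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgImage = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *result = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);

Sharpening and exposure can be handled the same way with the CISharpenLuminance and CIExposureAdjust filters.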
You can try the GPUImage framework created by Brad Larson. It includes awesome image filters and is also easy to use.
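A minimal sketch of what that looks like with GPUImage, assuming the framework is already added to the project (the brightness value is just an example):

#import "GPUImage.h"

GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
brightnessFilter.brightness = 0.2;  // range -1.0 to 1.0; 0.0 leaves the image unchanged
UIImage *filteredImage = [brightnessFilter imageByFilteringImage:sourceImage];

Because the filters run on the GPU, this stays fast even on large images, which addresses the Quartz performance problem mentioned above.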