iPhone iOS open source gyroscope-based spherical coordinate system viewer, like 360 Panorama or TourWrist - objective-c

I'm looking for an iPhone project, preferably iOS 5 with ARC, that uses the iPhone 4's gyro to look around in a spherical coordinate system. The phone is at the center of a sphere, and by reading the sensor output it can work out where the camera is pointing in spherical coordinates.
I'm not sure whether what I have in mind can be accomplished with iOS 5's CMAttitude, which fuses the iPhone 4's sensors. Can it?
I intend to use the project to control a robotic turret so that it can "look" at a particular point in a spherical coordinate system.
What comes to mind is that a 360 Panorama or TourWrist-like app would be a good starting point for such a project. Is there something similar that is open source and uses the native iOS Core Motion framework?
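To make what I mean concrete, below is a minimal, untested sketch using Core Motion's fused attitude. The TurretPointer class name and the turret output are placeholders of mine, and the rotation-matrix convention may need checking (possibly the transpose) on a real device.

#import <CoreMotion/CoreMotion.h>

@interface TurretPointer : NSObject
@property (strong, nonatomic) CMMotionManager *motionManager;
@end

@implementation TurretPointer

- (void)start
{
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 1.0 / 30.0;

    // This reference frame needs gyro + magnetometer; availability checks are omitted here.
    [self.motionManager
        startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical
        toQueue:[NSOperationQueue mainQueue]
        withHandler:^(CMDeviceMotion *motion, NSError *error) {
            if (!motion) return;

            CMRotationMatrix r = motion.attitude.rotationMatrix;

            // The back camera looks along the device's -Z axis. Expressed in the
            // reference frame (X = true north, Z = up) that direction is taken
            // from the third column here; depending on the matrix convention you
            // may need the third row (transpose) instead.
            double x = -r.m13, y = -r.m23, z = -r.m33;

            double azimuth   = atan2(y, x);  // rotation about the vertical axis
            double elevation = asin(z);      // angle above/below the horizon

            // Placeholder: this is where the angles would be sent to the turret.
            NSLog(@"azimuth %.1f°, elevation %.1f°",
                  azimuth * 180.0 / M_PI, elevation * 180.0 / M_PI);
        }];
}

@end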
Thank you!

If you would like to license the TourWrist technology, please let me know. For example, we license the TourWrist capture and viewer APIs/SDKs.
Dan Smigrod
via: support@TourWrist.com

Related

Mac OS X: Libraries for displaying human anatomy and interacting with it?

I have been asked to implement a way to display an image of a part of human anatomy, such as the spine, and let the user interact with it, using OS X and Objective-C/Swift. For example, with the spine, I should be able to pull on a vertebra horizontally, changing the angle of the spine at that location, or possibly rotate a vertebra clockwise/counterclockwise a few degrees. I have no idea how to implement this kind of code. It seems like animation work, but not a video game. I have seen discussions of Cocos2D for gaming, but I don't know if that is the right path. Are there any libraries out there that could be used, commercial or otherwise? I would really appreciate a nudge in the right direction.
Thank you for your suggestions!

Kinect: How to obtain a skeleton from back view?

Why would you ever want something like this?
I want to track a single user who is suspended above the ground in a horizontal position. The user is facing downwards to allow free movement of the legs and arms. Think of swimming, for example.
I mounted the Kinect at the ceiling facing downwards so I have a free view of all extremities.
The sensor is rotated 90° about the z-axis to get the maximum resolution (a person is usually taller than wide).
The user is therefore seen from the back, rotated by 90°. It is impossible to get a proper skeleton from OpenNI 1.5. My tests showed that OpenNI expects the user to face the camera with the head up along the y-axis (see my other answer). Microsoft's SDK behaves the same way, but I excluded it here because it doesn't let you change the source code and cannot be adapted. OpenNI 2.0 does not work with the current SensorKinect driver for interfacing the Kinect on Linux. So:
Which class is generating the skeleton in OpenNI 1.5.x?
My best guess would be to rotate the prototype skeleton by 180° about y and 90° about z. Please let me know if you know where I could find this.
EDIT: As I just learned, there is no open source software that generates a skeleton from depth images, so I fall back to the question in the title:
How can I get a user skeleton from a rotated back view?
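For what it's worth, here is a sketch of the rotation half of the problem only (the 90° mounting, not the back view): rotate the raw depth frame 90° clockwise before handing it to the tracker, so the user appears upright again. This is untested plain C (usable from Objective-C or C++), and the row-major uint16_t buffer layout is an assumption; adapt it to whatever your SDK delivers.

#include <stdint.h>
#include <stdlib.h>

/* Rotate a row-major 16-bit depth frame 90° clockwise.
   The destination has swapped dimensions: height x width.
   The caller owns (and must free) the returned buffer. */
uint16_t *rotate_depth_90_cw(const uint16_t *src, int width, int height)
{
    uint16_t *dst = malloc((size_t)width * height * sizeof(uint16_t));
    if (!dst) return NULL;

    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            /* src(x, y) maps to dst(height - 1 - y, x) for a clockwise turn. */
            dst[x * height + (height - 1 - y)] = src[y * width + x];
        }
    }
    return dst;
}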

iOS5 how to get device orientation from CMAttitude using CMAttitudeReferenceFrameXTrueNorthZVertical?

I'm building an augmented reality game in iOS 5 on devices that have a gyroscope.
I want to use CMAttitudeReferenceFrameXTrueNorthZVertical to map the device orientation and find out which CLLocation the device is looking toward. This is a new reference frame available in iOS 5, based on sensor-fusion algorithms, and it is supposed to be much smoother than accelerometer-based code.
I see a lot of pre-iOS 5 examples and older AR implementations that rely on the accelerometer. To rewrite such code, I need to understand how to combine the new CMAttitude with the current location to get the vector from the current location to some other CLLocation, i.e. the vector drawn from the center of the screen, out the back of the iPhone, toward that reference point.
Thank you for any hints!
Look at the Apple pARk sample. It applies a perspective transform that covers the screen, then projects the 3D coordinate from your location to the other geo location. https://developer.apple.com/library/ios/#samplecode/pARk/Introduction/Intro.html#//apple_ref/doc/uid/DTS40011083
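To sketch the idea (untested, and much simplified compared to what pARk actually does): build an east-north-up vector from your location to the target, rotate it into device coordinates with the CMAttitude rotation matrix, and then run it through a perspective projection to place it on screen. The function name, the rough metres-per-degree constants, and the axis conventions below are my assumptions, not pARk's code.

#import <CoreMotion/CoreMotion.h>
#import <CoreLocation/CoreLocation.h>
#include <math.h>

static void LogDirectionToTarget(CMDeviceMotion *motion,
                                 CLLocation *here,
                                 CLLocation *target)
{
    // Approximate east/north offsets in metres (fine for nearby targets).
    double latRad = here.coordinate.latitude * M_PI / 180.0;
    double metersPerDegLat = 111132.0;               // rough constant
    double metersPerDegLon = 111320.0 * cos(latRad); // shrinks with latitude

    double east  = (target.coordinate.longitude - here.coordinate.longitude) * metersPerDegLon;
    double north = (target.coordinate.latitude  - here.coordinate.latitude)  * metersPerDegLat;
    double up    = target.altitude - here.altitude;

    double len = sqrt(east * east + north * north + up * up);
    if (len < 1e-6) return;

    // Reference-frame vector for CMAttitudeReferenceFrameXTrueNorthZVertical:
    // X = true north, Z = up, Y completes the right-handed frame (pointing west,
    // as I understand it; double-check the sign of Y).
    double rx = north / len;
    double ry = -east / len;
    double rz = up / len;

    // Rotate into device coordinates. Depending on the convention of
    // CMRotationMatrix you may need the transpose here.
    CMRotationMatrix m = motion.attitude.rotationMatrix;
    double dx = m.m11 * rx + m.m12 * ry + m.m13 * rz;
    double dy = m.m21 * rx + m.m22 * ry + m.m23 * rz;
    double dz = m.m31 * rx + m.m32 * ry + m.m33 * rz;

    // dz < 0 means the target is out the back of the phone; pARk then applies a
    // perspective projection to (dx, dy, dz) to get screen coordinates.
    NSLog(@"target direction in device coordinates: (%.2f, %.2f, %.2f)", dx, dy, dz);
}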

Augmented Reality Marker Reader iPhone

I want to use 'marker reader' technology in my iPhone application (you have a piece of paper, you point the iPhone camera at it, and information is drawn from it). What API kits can I use / are available?
The closest SDK I can think of is ARToolKit, which should provide a framework solution for finding markers and deriving information. There are a couple of other frameworks that could be used, which I've listed below:
Layar
ARToolKit
Popcode
SGAREnvironment
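Whichever SDK you pick, you will typically have to feed it live camera frames yourself. Below is a minimal, untested AVFoundation capture sketch; processMarkerFrame: is a hypothetical placeholder for whatever detection call your chosen SDK exposes.

#import <AVFoundation/AVFoundation.h>

@interface MarkerCaptureController : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (strong, nonatomic) AVCaptureSession *session;
@end

@implementation MarkerCaptureController

- (void)start
{
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPreset640x480;

    // Default back camera; error handling is omitted for brevity.
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    if (input) [self.session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("marker.frames", NULL)];
    [self.session addOutput:output];

    [self.session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixels = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Hand the frame to the marker SDK here, e.g.:
    // [markerDetector processMarkerFrame:pixels];  // hypothetical call
    (void)pixels;
}

@end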

360 degree video in MPMoviePlayerController

I am trying to develop an iPhone application which needs to show a 360-degree video and rotate the video according to the phone's movement. How can I do this? Is it possible with a normal MPMoviePlayerController?
I don't think you can do this with a normal MPMoviePlayerController, but there are several libraries out there to achieve this. Have a look here:
PanoramaGL
Panorama 360
They work with OpenGL and you can embed them in your Objective-C code.
EDIT:
As @Mangesh Vyas kindly pointed out, those are intended for use with fixed images only. However, they might be a suitable starting point for embedding video as well, if you modify the code accordingly. They already handle direction, the accelerometer, etc., so you don't have to implement all of that yourself.