iOS 7: Log: "MKMapView pitch cannot be enabled on this device"

Since I updated to iOS 7, an app I have under development always logs the message "MKMapView pitch cannot be enabled on this device." The device is an iPhone 4.
Can somebody tell me what this message means and how I could get rid of it?

Maybe you've already resolved this. Anyhow, I suppose the warning appears because you have checked "3D Perspective" in the Attributes Inspector of your map view, but your device isn't able to render it. I don't have an iPhone 4, so I'm not sure whether it can or not.
Anyway, if you are not interested in the 3D map, try disabling that attribute.
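If the map view is created in code, the same setting can be turned off programmatically. This is a minimal sketch, assuming an iOS 7+ deployment target and that the map view lives in a view controller; as far as I know, pitchEnabled is the property behind the "3D Perspective" checkbox:
#import <MapKit/MapKit.h>

// Somewhere in the view controller, e.g. viewDidLoad:
MKMapView *mapView = [[MKMapView alloc] initWithFrame:self.view.bounds];
mapView.pitchEnabled = NO;   // same effect as unchecking "3D Perspective"
[self.view addSubview:mapView];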

Related

Detecting motion gestures with the Siri Remote on tvOS

I am developing an app for tvOS, and I want an event to be triggered when the user shakes the remote or moves it in a downward slash. But Apple's documentation mostly focuses on registering button presses and the focus engine.
Can anyone help me with how I can access the accelerometer?
Thank you for your help
To use the motion sensing aspects of the Siri Remote, you need to treat it as a game controller. See Working with Game Controllers in App Programming Guide for tvOS and the GCMotion class.
While it's fairly easy to port an iOS game to tvOS, note the following limitation, which slowed me down: I was originally using the rotation feature and expected it to work the same on the remote. I had overlooked it in the docs, but they say: "Although the remote supports motion data (and the GCMotion profile), the remote cannot determine the attitude or rotation of the remote. The corresponding properties always return constant values."
And the constant values as per the tvOS header GCMotion.h are:
@note Remotes can not determine a stable rotation rate so the values will be (0,0,0) at all times.
@note Remotes can not determine a stable attitude so the values will be (0,0,0,1) at all times.
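As a rough sketch of the game-controller route (assuming this runs in a tvOS view controller; the 1.5g shake threshold is purely illustrative), acceleration data from the remote can be read like this:
#import <GameController/GameController.h>

// Watch for the Siri Remote connecting; it appears as a GCController
// whose motion profile reports gravity and user acceleration only.
[[NSNotificationCenter defaultCenter] addObserverForName:GCControllerDidConnectNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    GCController *controller = note.object;
    controller.motion.valueChangedHandler = ^(GCMotion *motion) {
        // Crude shake detection: a large user acceleration on any axis.
        GCAcceleration a = motion.userAcceleration;
        if (fabs(a.x) > 1.5 || fabs(a.y) > 1.5 || fabs(a.z) > 1.5) {
            NSLog(@"Shake-like motion detected");
        }
    };
}];
If the remote is already connected when the app launches, the same handler can be attached to the entries in [GCController controllers] instead. Remember that attitude and rotation rate stay constant, as noted above.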

sony-camera-api trackingFocus

I can turn tracking focus on and use actTrackingFocus. Once actTrackingFocus is set, how can I get the coordinates back from the camera so I can draw a box in the Liveview showing what the camera is focused on?
That is not possible with the existing API unfortunately.
I appreciate that this is an old question, but if you are still trying and are OK playing in Python...
The tracking focus location is (apparently) reported via the frame info packets, so you have to enable them and then decode them.
We are attempting to do this with pysony.
Use 'python src/example/pygameLiveView -i' to see the reported locations. You might need to add your 'actTrackingFocus()' call to enable tracking focus, but the locations should be rendered (box with triangle corners) on screen.
Since none of the devs have a camera which supports tracking focus, we'd love to hear whether it works or not. :-)

How to programmatically start front camera of iPad?

I would like to start the front camera of the iPad when the app starts.
How do I do it programmatically?
Please let me know.
The first thing you need to do is detect whether your device has a front-facing camera.
Rather than iterating through the video devices yourself, try this method of UIImagePickerController:
+ (BOOL)isCameraDeviceAvailable:(UIImagePickerControllerCameraDevice)cameraDevice
This is a class method and UIImagePickerControllerCameraDevice can take two values:
- UIImagePickerControllerCameraDeviceRear
- UIImagePickerControllerCameraDeviceFront
Example code:
if( [UIImagePickerController isCameraDeviceAvailable: UIImagePickerControllerCameraDeviceFront ])
{
// do something
}
Note that this is available in iOS 4.0 and later.
Also, I am not sure if there are any APIs to start the front-facing camera up front. The camera always seems to start in the same mode the user left it in the last time it was used. Maybe by design Apple did not expose any APIs to change this and wanted users to make that call themselves.
Nevertheless, you can at least detect the availability of the front camera and provide your feature.
If I understand your question correctly, all you have to do is open the camera in front mode instead of rear mode, so write this inside the method where you set up the picker for the first time:
picker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
Hope this answers your question.
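Putting both answers together, a short sketch might look like this (assuming self is a view controller that adopts the picker's delegate protocols):
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
    picker.delegate = self;   // UIImagePickerControllerDelegate + UINavigationControllerDelegate
    [self presentViewController:picker animated:YES completion:nil];
}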

UIImagePickerController console message

I am using UIImagePickerController for selecting images from the photo library. But I am getting a message on the console saying "Using two-stage rotation animation. To use the smoother single-stage animation, this application must remove two-stage method implementations."
What is this due to?
Thanks
Several people appear to be having the same problem. None of these links provide a solution (as far as I can tell), so this post is intended just as a starting point for more searching. I have made it CW.
stackoverflow: Single-Stage vs Two-Stage Animation for iPhone Apps
cocoabuilder: how to change to the smoother single stage animation
stackoverflow: tabbarcontroller and navigationcontrollers in landscape mode
Apple support: 11186784
That normally means you have implemented one of:
willAnimateFirstHalfOfRotationToInterfaceOrientation:duration:
willAnimateSecondHalfOfRotationFromInterfaceOrientation:duration:
on one of your view controllers, even if the implementations are empty. This is neither wrong nor deprecated; the message just says the rotation could be smoother if you moved your implementation to willAnimateRotationToInterfaceOrientation:duration:.
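For illustration only (the method name is the standard UIViewController callback; the layout code is hypothetical): delete the two half-step implementations and fold their work into the single-stage callback, which runs inside the rotation animation block.
- (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation
                                         duration:(NSTimeInterval)duration
{
    // Whatever used to live in the first/second-half methods goes here.
    if (UIInterfaceOrientationIsLandscape(toInterfaceOrientation)) {
        // adjust the layout for landscape
    } else {
        // adjust the layout for portrait
    }
}
Once the two half-step methods are gone, the console message should disappear.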

Apple Magic Mouse API

I just bought a Magic Mouse and I like it pretty much, and as a Mac developer it's even cooler. But there's one problem: is there already an API available for it? I want to use it in one of my applications, for example to detect the user's finger positions, swipe or stretch gestures, etc.
Does anyone know if there's an API for it (and how to use it)?
The Magic Mouse does not use the NSTouch API. I have been experimenting with it and attempting to capture touch information, but I've had no luck so far. The only touch method that is common to both the mouse and the trackpad is the swipeWithEvent: method. It is called for a two-finger swipe on the device only.
It seems the touch input from the mouse is being interpreted somewhere else, then forwarded on to the public API. I have yet to find the private API that is actually doing the work.
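As a small sketch of that one shared callback (the view class name is made up for the example; as far as I can tell, swipe events reach the responder chain without any extra setup):
#import <Cocoa/Cocoa.h>

@interface SwipeCatcherView : NSView
@end

@implementation SwipeCatcherView

// Called for a two-finger swipe on the Magic Mouse (and on trackpads).
- (void)swipeWithEvent:(NSEvent *)event
{
    // deltaX/deltaY indicate the swipe direction.
    NSLog(@"Swipe: deltaX=%f deltaY=%f", event.deltaX, event.deltaY);
}

@end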
Take a look here: http://www.iphonesmartapps.org/aladino/?a=multitouch
There's a full working proof-of-concept using CGEventPost.
All the best!
I have not tested it, but I would be shocked if it didn't use NSTouch. NSTouch is the API you use to interact with the multi-touch trackpads on current MacBook Pros (and the new MacBooks that came out this week). You can check out the LightTable sample project to see how it is used.
It is part of AppKit, but it is a Snow Leopard only API.
I messed around with the app below before getting my Magic Mouse. I was surprised to find that the app also tracked the multi-touch points on the mouse.
There is a link in the comments to some source that gets the raw data similarly, but there is no source to this actual app.
http://lericson.blogg.se/code/2009/november/multitouch-on-unibody-macbooks.html