iOS 7 MapKit MKAnnotationView's image property type encoding weirdness

These are the results of:
method_getTypeEncoding(class_getInstanceMethod(MKAnnotationView.class, @selector(setImage:)))
method_getTypeEncoding(class_getInstanceMethod(MKAnnotationView.class, @selector(image)))
7.0:
v12#0:4^{UIImage=#^vf{?=b1b3b1b1b1b16b2}}8
^{UIImage=#^vf{?=b1b3b1b1b1b16b2}}8#0:4
and 6.1:
v12#0:4#8
#8#0:4
I don't understand why it's a ^{...} instead of #. It's causing me problems in RubyMotion.
Thanks in advance!
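For anyone who wants to reproduce this, the encodings above can be logged at runtime with an Objective-C snippet along these lines:

#import <MapKit/MapKit.h>
#import <objc/runtime.h>

// Log the type encodings of MKAnnotationView's image accessors.
Method setter = class_getInstanceMethod(MKAnnotationView.class, @selector(setImage:));
Method getter = class_getInstanceMethod(MKAnnotationView.class, @selector(image));
NSLog(@"setImage: -> %s", method_getTypeEncoding(setter));
NSLog(@"image -> %s", method_getTypeEncoding(getter));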

You might want to check out ProMotion and its MapKit support (which I wrote). It makes it REALLY easy to draw a map screen and put annotations on the map (and customize their look).
https://github.com/clearsightstudio/ProMotion

They fixed it in RubyMotion 2.7.


Can you set a vertical orientation in a QML app?
If so, how?
I have searched various sites trying to solve this problem. I found a solution in C++ code, but I need the piece of code in QML.
I'm using the project template Qt Quick Application - Empty.
I'm using Qt 5.10.1.
Thank you
You're looking for Screen.orientationUpdateMask.
Once the mask is set, Screen.orientation will contain the current orientation of the screen. You can read more about the Screen QML type here. Of course, the orientation in this case is set by the accelerometer.
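As a minimal sketch (assuming Qt 5.10 with QtQuick 2.10; the mask below is widened to all four orientations):

import QtQuick 2.10
import QtQuick.Window 2.10

Window {
    visible: true

    Item {
        anchors.fill: parent
        // Widen the update mask so Screen.orientation tracks every
        // orientation the accelerometer can report.
        Screen.orientationUpdateMask: Qt.PortraitOrientation |
                                      Qt.InvertedPortraitOrientation |
                                      Qt.LandscapeOrientation |
                                      Qt.InvertedLandscapeOrientation

        Text {
            anchors.centerIn: parent
            text: "orientation: " + Screen.orientation
        }
    }
}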
If you want to be able to go back and forth between portrait and landscape without the use of the accelerometer, while keeping the logic in QML, you will need to use the Transform, Scale and Rotation QML types. I wouldn't recommend this approach.
One alternative to using Transform would be to use two different views altogether, which might not be good for maintainability, especially if you want to support all four orientations.
If you want to force the orientation no matter what you can do it in the manifest file as you would normally without Qt.

Draggable MapKit Annotations

I am working with MapKit on macOS and trying to enable a draggable annotation that uses a custom image. I can successfully get the annotation to be draggable, but it requires the user to be quite accurate with where they click and drag, as the annotation image is larger than a conventional pin. Is there a simple way to expand the area so that any part of the image is draggable? Otherwise I imagine I will have to use some kind of NSGestureRecognizer on the view to manually set the drag state, but I was hoping there might be an easier way!
Okay, I never managed to sort this to my satisfaction using annotations. I'm not saying it can't be done; maybe someone else can comment and leave pointers to help. But I eventually achieved what I wanted using overlays instead. So if someone stumbles on this question with the same issue: you can make it work with a custom overlay rather than an annotation, implementing the dragging with an NSPanGestureRecognizer and using its translation for the movement.
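To make that concrete, here is an untested Objective-C sketch of the overlay approach. The overlay's writable coordinate property, the imageOverlay and mapView properties, and the dragging flag are illustrative assumptions, not MapKit API:

- (void)installDragRecognizer {
    NSPanGestureRecognizer *pan =
        [[NSPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handlePan:)];
    [self.mapView addGestureRecognizer:pan];
}

- (void)handlePan:(NSPanGestureRecognizer *)pan {
    CLLocationCoordinate2D coordinate =
        [self.mapView convertPoint:[pan locationInView:self.mapView]
              toCoordinateFromView:self.mapView];

    switch (pan.state) {
        case NSGestureRecognizerStateBegan:
            // Start a drag only when the click lands inside the overlay,
            // and stop the map itself from panning while dragging.
            self.dragging = MKMapRectContainsPoint(self.imageOverlay.boundingMapRect,
                                                   MKMapPointForCoordinate(coordinate));
            self.mapView.scrollEnabled = !self.dragging;
            break;
        case NSGestureRecognizerStateChanged:
            if (self.dragging) {
                self.imageOverlay.coordinate = coordinate; // assumed writable
                [[self.mapView rendererForOverlay:self.imageOverlay] setNeedsDisplay];
            }
            break;
        default:
            self.dragging = NO;
            self.mapView.scrollEnabled = YES;
            break;
    }
}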

Standard iOS 7 blur implementation

I have read dozens of questions about mimicking the iOS 7 blur effect in earlier versions of iOS. But the fundamental question that arises here is: does iOS 7's UIKit really have a nice and convenient way to make any UIView blurred? That seems quite logical to me. Any help appreciated.
Apple has provided sample code to blur any UIImage in iOS 7's style.
Go to Apple Developer downloads and log in if necessary. Search for "imageeffects" to bring up the WWDC 2013 Sample Code entry, and download the iOS_UIImageEffects sample code.
In that project, there's a UIImage category in UIImage+ImageEffects.h that you can copy into your own project, containing the applyBlurWithRadius:... method.
You'll need to link your project with the Accelerate framework.
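Once the category is in your project, usage looks something like this (the radius, tint, and saturation values are just plausible starting points, and sourceImage stands for whatever image you want blurred):

#import "UIImage+ImageEffects.h"

// Blur an existing UIImage in the iOS 7 style.
UIImage *blurred = [sourceImage applyBlurWithRadius:10.0
                                          tintColor:[UIColor colorWithWhite:1.0 alpha:0.3]
                              saturationDeltaFactor:1.8
                                          maskImage:nil];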
This won't automatically do a blur on a view in realtime — for that, see FXBlurView, which performs a similar technique by automatically snapshotting its superview. It can be pretty performance-intensive, though: consider first whether you can achieve what you want by statically blurring an image, rather than trying to "live"-blur moving content.
UPDATE
As of iOS 8.0 (not iOS 7), UIKit provides UIVisualEffectView and UIBlurEffect. You can watch WWDC 2014 Session 419 “Advanced Graphics and Animations for iOS Apps” (or just download the slides) for an introduction to these classes.
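A minimal example of those classes (someView stands for whichever view you want the blur layered over):

// iOS 8+: add a live blur over a view's content.
UIBlurEffect *blur = [UIBlurEffect effectWithStyle:UIBlurEffectStyleLight];
UIVisualEffectView *effectView = [[UIVisualEffectView alloc] initWithEffect:blur];
effectView.frame = someView.bounds;
effectView.autoresizingMask = UIViewAutoresizingFlexibleWidth |
                              UIViewAutoresizingFlexibleHeight;
[someView addSubview:effectView];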
ORIGINAL
No, it doesn't. Apple has not exposed an interface for doing this conveniently. For example, if you read through this discussion, Rincewind's responses should make it clear that Apple doesn't provide a public API for this, and that the private APIs they use have serious limits and are likely to change.
You must implement the blur effect yourself. You'll probably want to use the new -[UIView drawViewHierarchyInRect:afterScreenUpdates:] method to capture the appearance of the background view, and then apply a CIFilter to it to perform the blur.
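A rough sketch of that do-it-yourself route, snapshotting a view and then blurring the snapshot with Core Image (the radius and variable names are illustrative):

// Capture the view's current appearance into a UIImage.
UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0);
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Blur it with CIGaussianBlur.
CIImage *input = [CIImage imageWithCGImage:snapshot.CGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
[filter setValue:input forKey:kCIInputImageKey];
[filter setValue:@8.0 forKey:kCIInputRadiusKey];

CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef output = [context createCGImage:filter.outputImage fromRect:input.extent];
UIImage *blurred = [UIImage imageWithCGImage:output];
CGImageRelease(output);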
UIKit does not have a convenient way of achieving this effect. However, there are a few libraries on GitHub that do. Nick Lockwood's FXBlurView seems to be the most popular.

Image processing - How detect array similar colors from a image on ios

I want to select an area of similar colors when the user touches a point on an image view. Does anyone know of a library or technique that solves this problem on iOS?
Original image link:
http://cannshine.com/images/1.jpg
After being touched by the user, link:
http://cannshine.com/images/2.jpg
Please help, thanks!
It looks to me like you want to create a magic wand tool, correct? I can't help you come up with a way to handle this and I don't know of any libraries that can select a group of like-colored pixels based on tolerance, but there is this article here I found. It covers implementing a magic wand tool using Objective-C. I found it in the answer to this similar question on SO.
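For a sense of what's involved, here is a hedged sketch of the core of such a tool: a flood fill that collects the connected pixels whose color stays within a tolerance of the touched pixel. It assumes you have already rendered the image into a 32-bit RGBA byte buffer (extracting those bytes from a UIImage with a CGBitmapContext is a separate step):

// `pixels` is width * height RGBA8888 bytes; `mask` must be zeroed and
// the same size; selected pixels are marked with 1.
static BOOL SimilarColor(const uint8_t *a, const uint8_t *b, int tolerance) {
    return abs(a[0] - b[0]) <= tolerance &&
           abs(a[1] - b[1]) <= tolerance &&
           abs(a[2] - b[2]) <= tolerance;
}

static void MagicWandSelect(const uint8_t *pixels, uint8_t *mask,
                            int width, int height,
                            int startX, int startY, int tolerance) {
    const uint8_t *seed = pixels + 4 * (startY * width + startX);
    NSMutableArray *stack = [NSMutableArray arrayWithObject:@(startY * width + startX)];
    while (stack.count > 0) {
        int index = [stack.lastObject intValue];
        [stack removeLastObject];
        if (mask[index]) continue;                        // already selected
        if (!SimilarColor(pixels + 4 * index, seed, tolerance)) continue;
        mask[index] = 1;
        int x = index % width, y = index / width;
        // Visit the 4-connected neighbours.
        if (x > 0)          [stack addObject:@(index - 1)];
        if (x < width - 1)  [stack addObject:@(index + 1)];
        if (y > 0)          [stack addObject:@(index - width)];
        if (y < height - 1) [stack addObject:@(index + width)];
    }
}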

Image Processing in iPhone

I am adding an image-processing feature to my iPhone application. It should do brightness, contrast, sharpen, exposure....
But I am not able to find any article/tutorial on the Internet. Will you please help me find a tutorial, or tell me how I can implement it in my view-based iPhone application?
I have found one link http://www.iphonedevsdk.com/forum/iphone-sdk-development/10094-adjust-image-brightness-contrast-fly.html which worked for brightness, but it's not working on iPad.
So please suggest something I can start my image-processing logic with.
Thanks
Rick Jackson
I personally like the approach in the GLImageProcessing project from Apple's sample code. Check it out.
There are a few libraries that support image processing in Quartz. There are even a few categories on UIImage to do some basic stuff.
The following are a few examples:
https://github.com/esilverberg/ios-image-filters
https://github.com/cmkilger/CKImageAdditions
http://code.google.com/p/simple-iphone-image-processing/
But as said before by @Felz, those libraries are slow because they use the Quartz codebase, which isn't that fast (for example, changing the saturation of an image with a resolution of 1024x1024 might take 4 to 8 seconds, depending on which device you're using).
If your project targets iOS 5 or higher, then you should definitely consider using Core Image.
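For instance, brightness, contrast, saturation and exposure map directly onto built-in Core Image filters (the values below are just illustrative starting points):

#import <CoreImage/CoreImage.h>

// Brightness/contrast/saturation via CIColorControls...
CIImage *input = [CIImage imageWithCGImage:sourceImage.CGImage];
CIFilter *colors = [CIFilter filterWithName:@"CIColorControls"];
[colors setValue:input forKey:kCIInputImageKey];
[colors setValue:@0.1 forKey:kCIInputBrightnessKey];  // -1.0 ... 1.0
[colors setValue:@1.2 forKey:kCIInputContrastKey];    // 1.0 = unchanged
[colors setValue:@1.0 forKey:kCIInputSaturationKey];

// ...and exposure via CIExposureAdjust (in f-stops).
CIFilter *exposure = [CIFilter filterWithName:@"CIExposureAdjust"];
[exposure setValue:colors.outputImage forKey:kCIInputImageKey];
[exposure setValue:@0.5 forKey:kCIInputEVKey];

CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:exposure.outputImage fromRect:input.extent];
UIImage *result = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);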
You can try the GPUImage framework created by Brad Larson. It includes awesome image filters and is also easy to use.
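For example, a basic adjustment with GPUImage looks roughly like this (the filter choice and value are illustrative):

#import "GPUImage.h"

// GPU-accelerated brightness adjustment on a still image.
GPUImageBrightnessFilter *filter = [[GPUImageBrightnessFilter alloc] init];
filter.brightness = 0.1; // -1.0 ... 1.0, where 0.0 leaves the image unchanged
UIImage *result = [filter imageByFilteringImage:sourceImage];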