With which method has Apple replaced "dragImage:"? - objective-c

I have an app where I allow the user to drag/drop some data from an NSView, with a custom drag image under the cursor.
I just updated to the Xcode 6 beta, and now my drag/drop code won't compile. This is because Apple has deprecated the following NSView method:
dragImage:at:offset:event:pasteboard:source:slideBack:
Fine, but what am I supposed to use instead? All the Apple documentation I've found still merrily recommends the deprecated method.
There is also a "dragFile:" method, but my NSView represents AV data, and it's impractical to write a large new file to disk every time the user begins a drag. The user may well abort the drag, possibly multiple times in rapid succession.
What is the new way to initiate a drag operation with a custom icon?

Thanks to Kyle Sluder on Apple's Dev forums for alerting me to this. It turns out the replacement is
-[NSView beginDraggingSessionWithItems:event:source:]
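For anyone landing here, a minimal sketch of how the replacement can be used to start a drag with a custom icon; the pasteboard payload and image name below are placeholders, and the view is assumed to conform to NSDraggingSource:

- (void)mouseDragged:(NSEvent *)event {
    // Put some placeholder data on a pasteboard item.
    NSPasteboardItem *pbItem = [[NSPasteboardItem alloc] init];
    [pbItem setString:@"my payload" forType:NSPasteboardTypeString];

    // Attach a custom drag image to the dragging item.
    NSImage *dragImage = [NSImage imageNamed:@"MyDragIcon"]; // placeholder image
    NSDraggingItem *dragItem = [[NSDraggingItem alloc] initWithPasteboardWriter:pbItem];
    [dragItem setDraggingFrame:NSMakeRect(0, 0, dragImage.size.width, dragImage.size.height)
                      contents:dragImage];

    // Kick off the session; self must also implement
    // draggingSession:sourceOperationMaskForDraggingContext: from NSDraggingSource.
    [self beginDraggingSessionWithItems:@[dragItem] event:event source:self];
}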

Related

Best Practices When Using CoreBluetooth Framework

Lately I have been playing around with the Bluetooth framework and have built up enough knowledge to start building an application. The only problem is that all the examples I found, and all the practice I have done, consist of putting the Core Bluetooth code in the same file as the UIView the user is interacting with.
I would like my future application to have multiple views in which the BLE scan occurs in the background. I initially thought about creating an object named something like bleDeviceFinder and passing it to each view. However, after thinking about it, I realised that if I want something to happen in the current view, I need didDiscoverPeripheral to have direct access to the UIView objects it is supposed to affect.
I know it is probably a simple question, but what would be the best way to do this? I was thinking of posting a notification and having every view subscribe to it; is this a good solution?
Use a quasi-singleton BTLEManager that you pass around in the app. It sends NSNotifications for events like discovery, and your view controllers observe those notifications. The truth (i.e. the list of discovered devices) stays in the BTLEManager. Once a view controller has received such a notification, it asks the BTLEManager for the current list of devices and updates its views accordingly. The views should never talk to the BTLEManager directly.
That's how I would do it.
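A rough sketch of what that manager might look like; the class, property, and notification names here are my own invention, not framework API:

#import <CoreBluetooth/CoreBluetooth.h>

@interface BTLEManager : NSObject <CBCentralManagerDelegate>
@property (nonatomic, readonly) NSArray *discoveredPeripherals;
+ (instancetype)sharedManager;
@end

@implementation BTLEManager {
    CBCentralManager *_central;
    NSMutableArray *_peripherals;
}

+ (instancetype)sharedManager {
    static BTLEManager *shared;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{ shared = [[BTLEManager alloc] init]; });
    return shared;
}

- (instancetype)init {
    if ((self = [super init])) {
        _peripherals = [NSMutableArray array];
        _central = [[CBCentralManager alloc] initWithDelegate:self queue:nil];
    }
    return self;
}

- (NSArray *)discoveredPeripherals {
    return [_peripherals copy];
}

- (void)centralManagerDidUpdateState:(CBCentralManager *)central {
    if (central.state == CBCentralManagerStatePoweredOn) {
        [central scanForPeripheralsWithServices:nil options:nil];
    }
}

- (void)centralManager:(CBCentralManager *)central
 didDiscoverPeripheral:(CBPeripheral *)peripheral
     advertisementData:(NSDictionary *)advertisementData
                  RSSI:(NSNumber *)RSSI {
    if (![_peripherals containsObject:peripheral]) {
        [_peripherals addObject:peripheral];
        // View controllers observe this notification, then query discoveredPeripherals.
        [[NSNotificationCenter defaultCenter]
            postNotificationName:@"BTLEManagerDidDiscoverPeripheralNotification"
                          object:self];
    }
}

@end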

How to programmatically terminate an NSDraggingSession?

Context:
OS X Application implemented in Cocoa/ObjC
Xcode 6.1
Base SDK: 10.10
Deployment Target: 10.8+
Question:
In my app, I have a custom NSView subclass which implements NSDraggingSource. I'm using the new NSDraggingSession-based API (not the old deprecated one).
I have my dragging source fully implemented, no problems.
I am already implementing:
-[NSDraggingSource draggingSession:movedToPoint:]
I would like to additionally do some hit detection within this method and depending on the results of that hit test, possibly cancel the current drag session programmatically.
So: What is the best way to cancel/terminate the current NSDraggingSession programmatically? (preferably from -[NSDraggingSource draggingSession:movedToPoint:])
It would also be nice to be able to specify the NSDragOperation upon termination.
Notes:
I've tried calling -[NSResponder cancelOperation:] on my custom view, thinking it might have something to do with dragging sessions, but that doesn't work; I get an unrecognized selector exception.
I DO NOT need help doing hit detection.
I DO NOT need help implementing NSDraggingSource in general.

Receive remote control events without audio

Here is some background information; otherwise, skip ahead to the question in bold. I am building an app and I would like it to have access to the remote control/lock screen events. The tricky part is that this app does not play audio itself; it controls the audio of another device nearby. The communication between devices is not a problem when the app is in the foreground. As I just found out, an app does not assume control of the remote controls until it has played audio with a playback audio session, and was the last to do so. This presents a problem because, as I said, the app controls ANOTHER device's audio and has no need to play its own.
My first inclination is to have the app play a silent clip every time it is opened in order to assume control of the remote controls. The fact that I have to do this makes me wonder if I am even going to be allowed to do it by Apple or if there is another way to achieve this without fooling the system with fake audio clips.
QUESTION(S): Will Apple approve an app that plays a silent audio clip in order to assume control of the remote/lock screen controls for the purpose of controlling another device's audio? Is there any way of assuming control of the remote controls without an audio session?
P.S. I would prefer to have this functionality on iOS 4.0 and up.
P.P.S I have seen this similar question and it has gotten me brainstorming but the answer provided is not specific to what I need to know.
NOTE: As of iOS 7.1, you should be using MPRemoteCommandCenter instead of the answer below.
You add handlers to the MPRemoteCommand objects exposed as properties of [MPRemoteCommandCenter sharedCommandCenter].
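A minimal sketch of that approach, assuming it runs inside one of your own objects; the sendPlayToOtherDevice/sendPauseToOtherDevice calls are placeholders for whatever controls the remote device:

#import <MediaPlayer/MediaPlayer.h>

MPRemoteCommandCenter *center = [MPRemoteCommandCenter sharedCommandCenter];

[center.playCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
    [self sendPlayToOtherDevice];   // placeholder for your own logic
    return MPRemoteCommandHandlerStatusSuccess;
}];

[center.pauseCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
    [self sendPauseToOtherDevice];  // placeholder for your own logic
    return MPRemoteCommandHandlerStatusSuccess;
}];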
I'm keeping the rest of this around for historical reference, but the following is not guaranteed to work on recent iOS versions. In fact, it just might not.
You definitely do need an audio player, but not necessarily an explicit session, to take control of the remote control events. (AVAudioSession is implicit to any app that plays audio.) I spent a decent amount of time playing with this to confirm it.
I've seen a lot of confusion on the internet about where to set up the remoteControlReceivedWithEvent: method and various approaches to the responder chain. I know this method works on iOS 6 and iOS 7; other approaches have not worked for me. Don't waste your time handling remote control events in the app delegate (where they used to work) or in a view controller which may go away during the lifecycle of your app.
I made a demo project to show how to do this.
Here's a quick rundown of what has to happen:
You need to create a subclass of UIApplication. When the documentation says UIResponder, it means UIApplication, since your application class is a subclass of UIResponder. In this subclass, you're going to implement the remoteControlReceivedWithEvent: and canBecomeFirstResponder methods. You want to return YES from canBecomeFirstResponder. In the remote control method, you'll probably want to notify your audio player that something's changed.
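Roughly, the subclass looks like this; the notification name is my own placeholder, so forward the event however suits your app:

#import <UIKit/UIKit.h>

@interface RCApplication : UIApplication
@end

@implementation RCApplication

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        // Let the rest of the app (e.g. your audio player) know what happened.
        [[NSNotificationCenter defaultCenter]
            postNotificationName:@"RCRemoteControlEventNotification" object:event];
    }
}

@end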
You need to tell iOS to use your custom class to run the app, instead of the default UIApplication. To do so, open main.m and change this:
return UIApplicationMain(argc, argv, nil, NSStringFromClass([RCAppDelegate class]));
to look like this:
return UIApplicationMain(argc, argv, NSStringFromClass([RCApplication class]), NSStringFromClass([RCAppDelegate class]));
In my case RCApplication is the name of my custom class. Use the name of your subclass instead. Don't forget to #import the appropriate header.
OPTIONAL: You should configure an audio session. It's not required, but if you don't, audio won't play if the phone is muted. I do this in the demo app's delegate, but do so where appropriate.
Play something. Until you do, the remote controls will ignore your app. I just took an AVPlayer and gave it the URL of a streaming site that I expect to be up. If you find that it fails, put your own URL in there and play with it to your heart's content.
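Something along these lines covers steps 3 and 4; the stream URL is just a stand-in, so use one you control:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// Optional but recommended: a playback session so audio plays even when muted.
NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];

// Play something so the system hands your app the remote controls.
NSURL *streamURL = [NSURL URLWithString:@"http://example.com/stream.mp3"]; // placeholder URL
AVPlayer *player = [AVPlayer playerWithURL:streamURL];
[player play];

// Start receiving remote control events once playback has begun.
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];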
This example has a little bit more code in there to log out remote events, but it's not all that complicated. I just define and pass around some string constants.
I bet that a silent looping MP3 file would help work towards your goal.
Moshe's solution worked great for me! However, one issue I noticed is that when you pause the audio, the media controls go away and you can't play it again without going back into the app. If you set the now-playing info on the lock screen when you play the audio, this won't happen:
NSDictionary *mediaInfo = @{MPMediaItemPropertyTitle: @"My Title",
                            MPMediaItemPropertyAlbumTitle: @"My Album Name",
                            MPMediaItemPropertyPlaybackDuration: [NSNumber numberWithFloat:0.30f]};
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:mediaInfo];

iOS 5 - Camera Overlay Clone

Is there an open source nib out there that's a clone of the overlay view that Apple uses for its Camera.app? I'm currently using UIImagePickerController with picker.showsCameraControls = YES, but I need to tweak the functionality ever so slightly.
I've been thinking of subclassing the UIImagePickerController, but would that give me more control over takePicture? Specifically, I want to be able to call that method without forcing the user to leave the interface.
This is what the AVCaptureSession class is for. The documentation shows how to set up a capture session; from there it's all cake!
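For reference, a bare-bones setup might look like this (error handling omitted; it assumes this runs inside a view controller so self.view exists):

#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;

// Hook up the default camera as input.
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
if ([session canAddInput:input]) [session addInput:input];

// A still image output lets you capture a photo from your own overlay button,
// without forcing the user to leave the interface.
AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
if ([session canAddOutput:stillOutput]) [session addOutput:stillOutput];

// Show the live feed behind your custom overlay view.
AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:session];
preview.frame = self.view.bounds;
[self.view.layer insertSublayer:preview atIndex:0];

[session startRunning];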

Is Apple deprecating UIPopover?

Since iOS 5.1 was released, the default for showing the master view controller in split views is a slide-in presentation. In order to present a popover, it seems like you have to enable it using a UIPopoverController instead. Does this mean that the popover is going to go out of style?
When it comes to Apple's API's, deprecated means that Apple has specifically stated that something is in the process of going away. It's usually accompanied by advice regarding a new way to accomplish the same thing. So, if Apple ever deprecates UIPopoverController, you'll know it just from reading the documentation.
That said, it's also a good idea to read the release notes for each new version of iOS that comes along. In the iOS 5.1 release notes you'll find a note that explains what you're seeing:
In 5.1, the UISplitViewController class adopts the sliding presentation style when presenting the left view (previously only seen in Mail). This style is used when presentation is initiated either by the existing bar button item provided by the delegate methods or by a swipe gesture within the right view. No additional API adoption is required to obtain this behavior, and all existing API, including that of the UIPopoverController instance provided by the delegate, will continue to work as before.