Override / cancel iPad application switch gesture - objective-c

I want to track four finger touches in my application, but they are cancelled as the iPad uses the four finger swipe to switch applications.
Is there a way to cancel / override this gesture?

I looked into this for a game I worked on and I could not find a way to override it. Even if you found a way, it would probably get you rejected from the app store, since there doesn't seem to be any public API for it.

I'm not sure how you'd actually go about implementing this, since implementations do things differently, but it sounds as though there is a possibility. After all, these are Apple's own words regarding the multitasking gestures:
"Developers are encouraged to evaluate any existing interactions in
their applications for potential sources of interference. In order to
properly interoperate with multitasking gestures, applications must
properly handle the following methods and notifications:
touchesCancelled:withEvent: (UIResponder) cancelTrackingWithEvent:
(UIControl) applicationDidBecomeActive: and
applicationWillResignActive: (application delegate)
UIApplicationDidBecomeActiveNotification and
UIApplicationWillResignActiveNotification notifications These can be
enabled for development via Xcode so you can update your apps to
interoperate with these new gestures. Test them and give us your
feedback in the Apple Developer Forums."
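A minimal sketch of handling that cancellation, assuming a custom view tracks the touches itself (the class name and the resetTouchTracking helper are illustrative, not from any framework):

// When iOS takes over a four-finger swipe, it sends touchesCancelled:withEvent:
// rather than touchesEnded:withEvent:, so any tracking state must be reset there.
@interface FourFingerTrackingView : UIView
@end

@implementation FourFingerTrackingView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Begin tracking the touches here.
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    // The system (e.g. a multitasking gesture) cancelled our touches;
    // throw away any half-finished gesture state.
    [self resetTouchTracking]; // hypothetical helper
}

- (void)resetTouchTracking {
    // Clear whatever per-touch state the view keeps.
}

@end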

I really don't think you can. I've watched countless videos on overriding gestures on the iPhone/iPod, but there are none for the iPad; it must be so you can't cheat in games!

Related

Why does iOS sometimes disable animations?

I'm not sure what causes it, but I and others on my team have found that, for some reason, iOS sometimes decides to completely disable all animations within our app. iOS general animations (parallax, app switching, home button, etc.) are still enabled, so it's restricted to our app.
This wouldn't be so much of an issue if it weren't for the fact that some things like -[UIResponder becomeFirstResponder] don't work immediately after what would otherwise be an appearance animation (for instance, in a viewDidAppear: method, or in the completion block of -[UIViewController dismissViewControllerAnimated:completion:]).
I've checked our code to ensure this isn't something we do: we simply pass YES into the Cocoa Touch framework when it asks whether we want things animated, and at no point in our code (or, as far as I know, in our third-party SDKs) is +[UIView setAnimationsEnabled:] ever called. Likewise, I haven't enabled anything like "Reduce Motion" in the iOS Settings, and simply restarting our app or letting the iOS device sleep reverses this state.
So, what might cause iOS to disable our app's ability to use system animations? Additionally, does this affect how/when/if delegate methods and callback blocks are called?
Also, is there a way to detect, trigger, or reverse iOS's decision to disable animations?
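A diagnostic sketch of what detection and reversal might look like, using the public counterpart +[UIView areAnimationsEnabled] (where and when to call it is an open question, e.g. viewDidAppear: or applicationDidBecomeActive:):

// Check whether UIKit animations are currently enabled and, if not,
// log it and force the flag back on with the public class method.
if (![UIView areAnimationsEnabled]) {
    NSLog(@"Animations were disabled behind our back; re-enabling.");
    [UIView setAnimationsEnabled:YES];
}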

Best Practices When Using CoreBluetooth Framework

Lately I have been playing around with the Core Bluetooth framework and have built up enough knowledge to start building an application. The only problem is that all the examples I have found, and all the practice I have done, put the Core Bluetooth code inside the same file as the UIView the user is interacting with.
I would like my future application to have multiple views, with the BLE scan running in the background. I initially thought about creating an object with a name like bleDeviceFinder and passing it to each view. However, after thinking about it I realised that if I want something to happen in the current view, didDiscoverPeripheral needs direct access to the UIView objects it is supposed to affect.
I know it is probably a stupid question, but what would be the best way to do this? I was thinking of maybe setting up a notification and subscribing every view to it; is that a good solution?
A quasi-singleton BTLEManager that you pass around in the app. It sends NSNotifications for events like discovery, and your view controllers observe those notifications. The truth (i.e. the list of discovered devices) stays in the BTLEManager. Once a view controller has received such a notification, it asks the BTLEManager for the current list of devices and changes your views accordingly. The views should never talk to the BTLEManager directly.
That's how I would do it.
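A minimal sketch of that pattern, assuming the names BTLEManager and BTLEManagerDidDiscoverDeviceNotification (both illustrative):

#import <CoreBluetooth/CoreBluetooth.h>

// Hypothetical notification name the view controllers observe.
static NSString * const BTLEManagerDidDiscoverDeviceNotification =
    @"BTLEManagerDidDiscoverDeviceNotification";

@interface BTLEManager : NSObject <CBCentralManagerDelegate>
+ (instancetype)sharedManager;
- (NSArray *)discoveredPeripherals; // the "truth": list of found devices
@end

@implementation BTLEManager {
    CBCentralManager *_central;
    NSMutableArray *_peripherals;
}

+ (instancetype)sharedManager {
    static BTLEManager *shared;
    static dispatch_once_t once;
    dispatch_once(&once, ^{ shared = [[BTLEManager alloc] init]; });
    return shared;
}

- (instancetype)init {
    if ((self = [super init])) {
        _peripherals = [NSMutableArray array];
        _central = [[CBCentralManager alloc] initWithDelegate:self queue:nil];
    }
    return self;
}

- (void)centralManagerDidUpdateState:(CBCentralManager *)central {
    if (central.state == CBCentralManagerStatePoweredOn) {
        [central scanForPeripheralsWithServices:nil options:nil];
    }
}

- (void)centralManager:(CBCentralManager *)central
 didDiscoverPeripheral:(CBPeripheral *)peripheral
     advertisementData:(NSDictionary *)advertisementData
                  RSSI:(NSNumber *)RSSI {
    if (![_peripherals containsObject:peripheral]) {
        [_peripherals addObject:peripheral];
    }
    // Notify observers; they pull discoveredPeripherals rather than receiving data.
    [[NSNotificationCenter defaultCenter]
        postNotificationName:BTLEManagerDidDiscoverDeviceNotification
                      object:self];
}

- (NSArray *)discoveredPeripherals {
    return [_peripherals copy];
}

@end

A view controller then observes the notification and re-queries the manager, so the views themselves never touch Core Bluetooth.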

Receive remote control events without audio

Here is some background information; otherwise skip ahead to the question in bold. I am building an app and I would like it to have access to the remote control/lock screen events. The tricky part is that this app does not play audio itself; it controls the audio of another device nearby. The communication between devices is not a problem when the app is in the foreground. As I just found out, an app does not assume control of the remote controls until it has played audio with a playback audio session and was the last to do so. This presents a problem because, like I said, the app controls ANOTHER device's audio and has no need to play its own.
My first inclination is to have the app play a silent clip every time it is opened in order to assume control of the remote controls. The fact that I have to do this makes me wonder if I am even going to be allowed to do it by Apple or if there is another way to achieve this without fooling the system with fake audio clips.
QUESTION(S): Will Apple approve an app that plays a silent audio clip in order to assume control of the remote/lock screen controls for the purpose of controlling another device's audio? Is there any way of assuming control of the remote controls without an audio session?
P.S. I would prefer to have this functionality on iOS 4.0 and up.
P.P.S I have seen this similar question and it has gotten me brainstorming but the answer provided is not specific to what I need to know.
NOTE: As of iOS 7.1, you should be using MPRemoteCommandCenter instead of the answer below.
You attach handlers to the various system-provided MPRemoteCommand objects exposed as properties of [MPRemoteCommandCenter sharedCommandCenter].
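A minimal sketch of that approach (the handler selectors are illustrative):

#import <MediaPlayer/MediaPlayer.h>

// Attach handlers to the shared command center's commands (iOS 7.1+).
MPRemoteCommandCenter *center = [MPRemoteCommandCenter sharedCommandCenter];
[center.playCommand addTarget:self action:@selector(handlePlayCommand:)];   // hypothetical selector
[center.pauseCommand addTarget:self action:@selector(handlePauseCommand:)]; // hypothetical selector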
I'm keeping the rest of this around for historical reference, but the following is not guaranteed to work on recent iOS versions. In fact, it just might not.
You definitely do need an audio player, but not necessarily an explicit session, to take control of the remote control events. (AVAudioSession is implicit to any app that plays audio.) I spent a decent amount of time experimenting to confirm this.
I've seen a lot of confusion on the internet about where to set up the remoteControlReceivedWithEvent: method and about various approaches to the responder chain. I know this method works on iOS 6 and iOS 7; other approaches have not. Don't waste your time handling remote control events in the app delegate (where they used to work) or in a view controller, which may go away during the lifecycle of your app.
I made a demo project to show how to do this.
Here's a quick rundown of what has to happen:
You need to create a subclass of UIApplication. When the documentation says UIResponder, it means UIApplication, since your application class is a subclass of UIResponder. In this subclass, you're going to implement the remoteControlReceivedWithEvent: and canBecomeFirstResponder methods. You want to return YES from canBecomeFirstResponder. In the remote control method, you'll probably want to notify your audio player that something's changed.
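A sketch of that subclass, using the RCApplication name from the next step (the notification name is illustrative):

// RCApplication.h
#import <UIKit/UIKit.h>

@interface RCApplication : UIApplication
@end

// RCApplication.m
@implementation RCApplication

- (BOOL)canBecomeFirstResponder {
    return YES; // required, or the app never receives remote control events
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        // Tell the audio-playing code something changed, e.g. via a notification.
        [[NSNotificationCenter defaultCenter]
            postNotificationName:@"RCRemoteControlEventNotification" // hypothetical name
                          object:event];
    }
}

@end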
You need to tell iOS to use your custom class to run the app, instead of the default UIApplication. To do so, open main.m and change this:
return UIApplicationMain(argc, argv, nil, NSStringFromClass([RCAppDelegate class]));
to look like this:
return UIApplicationMain(argc, argv, NSStringFromClass([RCApplication class]), NSStringFromClass([RCAppDelegate class]));
In my case RCApplication is the name of my custom class. Use the name of your subclass instead. Don't forget to #import the appropriate header.
OPTIONAL: You should configure an audio session. It's not required, but if you don't, audio won't play if the phone is muted. I do this in the demo app's delegate, but do so where appropriate.
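A minimal sketch of that configuration, assuming the playback category is what you want:

#import <AVFoundation/AVFoundation.h>

// Use the playback category so audio still plays with the ringer switch muted.
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback error:&error];
[session setActive:YES error:&error];
if (error) {
    NSLog(@"Audio session setup failed: %@", error);
}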
Play something. Until you do, the remote controls will ignore your app. I just took an AVPlayer and gave it the URL of a streaming site that I expect to be up. If you find that it fails, put your own URL in there and play with it to your heart's content.
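Something along these lines (the URL is a placeholder; keep a strong reference to the player, e.g. in a property, or it will be deallocated):

// Start playback so iOS considers the app the "now playing" one,
// then ask for remote control events.
NSURL *streamURL = [NSURL URLWithString:@"https://example.com/stream.mp3"]; // placeholder
AVPlayer *player = [AVPlayer playerWithURL:streamURL];
[player play];
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];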
This example has a little bit more code in there to log out remote events, but it's not all that complicated. I just define and pass around some string constants.
I bet that a silent looping MP3 file would help you work towards your goal.
Moshe's solution worked great for me! However, one issue I noticed is that when you pause the audio, the media controls go away and you can't play it again without going back into the app. If you set the media info on the lock screen when you play the audio, this won't happen:
NSDictionary *mediaInfo = @{MPMediaItemPropertyTitle: @"My Title",
                            MPMediaItemPropertyAlbumTitle: @"My Album Name",
                            MPMediaItemPropertyPlaybackDuration: [NSNumber numberWithFloat:0.30f]};
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:mediaInfo];

Hacking UINavigationBar / Apple approval

Here is a screenshot from Evernote. It appears to have (but I might be wrong) a custom UINavigationBar as well as a custom UITabBar.
A quick glance at the apps on my phone shows I could equally have used a screenshot of Instagram, Path, DailyBooth, ...
From the docs:
"Because managing the navigation bar is the responsibility of the navigation controller, direct modification of the navigation bar itself is considered off limits for the most part."
Does Apple regard this kind of thing as grounds for AppStore refusal?
There's a distinction between "custom" navigation bars and using private methods.
Apple prevents you from directly modifying the navigation bar by making a lot of the required methods private or the properties read-only. If you were to modify a read-only property or use a private, undocumented method, your application would be rejected (Apple automatically analyses submitted code to detect this sort of thing).
However, there's nothing to stop you from implementing your own UINavigationBar equivalent, or customising the real one using publicly available SDK methods. The issue then becomes whether your application strays too far from the Human Interface Guidelines, although to be honest, Apple are generally fairly flexible on that.
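For instance, since iOS 5 the public appearance proxy covers a lot of this kind of customisation (the image name is a placeholder asset):

// Public-API customisation of every navigation bar in the app (iOS 5+).
[[UINavigationBar appearance]
    setBackgroundImage:[UIImage imageNamed:@"navbar_background"] // placeholder
         forBarMetrics:UIBarMetricsDefault];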
Whilst I wouldn't necessarily recommend this, I've worked on one app where we rolled our own navigation controller because we had some animation and transition requirements that we couldn't achieve with Apple's own class. As long as you're not calling anything untoward or drastically going against the HIGs, you're generally fine.

NSView vs. Webview

Are there disadvantages to using WebKit WebViews compared to using NSViews?
I'm using a WebView to create the UI for an application. The application itself does not have much interactivity. I have seen it mentioned, on this website and others, that using a WebView can be a convenient means of prototyping.
However, to our team this seems like an ideal way to produce production-ready UIs, especially with WebKit. Are we missing something?
Thanks,
Ross
Okay, so you seem to be asking if using an HTML interface (presented via a WebView) for your application has any disadvantages.
The answer to this is "no", at least "not necessarily". This is analogous to building an iPhone specific web application, and there are some excellent examples of those. The caveat would be that a lot of those sites end up recreating the look and feel of a native iPhone app, for consistency and to make the users feel "at home".
Given that you're developing a native app anyway, it seems a shame to throw away, or recreate, the responsiveness and appearance of the native chrome. Of course, for certain types of applications (games are an obvious example) a user has no expectations about the application's UI, so you're free to knock yourself out.
The other factor to consider is the amount of interactivity (although I notice that you say there isn't much in your case). The native controls will make coding a lot simpler than having to capture all user input through the "filter" of a WebView, even though using one might make the initial layout of the screens easier.
I hope that's the sort of answer you were looking for (although it's mostly non technical).
As you might have seen if you spent some time in the documentation, WebView is a subclass of NSView.
The documentation says about WebView:
WebView is the core view class in the WebKit framework that manages interactions between the WebFrame and WebFrameView classes. To embed web content in your application, you just create a WebView object, attach it to a window, and send a loadRequest: message to its main frame.
And about NSView:
NSView is a class that defines the basic drawing, event-handling, and printing architecture of an application. You typically don’t interact with the NSView API directly; rather, your custom view classes inherit from NSView and override many of its methods, which are invoked automatically by the Application Kit. If you’re not creating a custom view class, there are few methods you need to use.
So here's the answer to your question:
Is there disadvantages to using WebKit WebViews compared to using NSViews?
Yes. You can't display any web content with NSView. That's what you need WebView for.
I suggest reading some more documentation though.
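For completeness, a minimal sketch of what the WebView documentation describes, assuming an existing NSWindow named window and a placeholder URL:

#import <WebKit/WebKit.h>

// Create a WebView, attach it to the window, and load a page in its main frame.
WebView *webView = [[WebView alloc] initWithFrame:[[window contentView] bounds]
                                        frameName:nil
                                        groupName:nil];
[[window contentView] addSubview:webView];
[[webView mainFrame] loadRequest:
    [NSURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com"]]]; // placeholder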