How to pull data from an iOS app by initiating the call from a watchOS complication?

I am currently placing
[[WCSession defaultSession] transferUserInfo:applicationDict];
Inside of my Complication Controller, within
getCurrentTimelineEntryForComplication
I know that this is not the correct place for it, but I am at a loss as to where and when I should begin pulling info from iOS. This seems to be sparsely documented.
My goal is to have it make an initial call to a function in iOS which will fetch data from a server and respond by placing that data into a class on the watchOS side. I also want to be able to update it periodically afterward.
So how do I go about doing this?

I wouldn't recommend requesting data from the phone within getCurrentTimelineEntryForComplication. Apple recommends you fetch data in
handle(_ backgroundTasks: Set<WKRefreshBackgroundTask>)
in your ExtensionDelegate, or push the data to the watch from a background fetch on the iOS side using
WCSession.default.transferCurrentComplicationUserInfo(_)
within
application(_ application: UIApplication, performFetchWithCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void)
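On the watch side, the background-task handler might look something like this Swift sketch (watchOS 4+ API; reloading every active complication on each refresh is a simplification, not the only approach):

```swift
import WatchKit
import ClockKit

class ExtensionDelegate: NSObject, WKExtensionDelegate {
    func handle(_ backgroundTasks: Set<WKRefreshBackgroundTask>) {
        for task in backgroundTasks {
            switch task {
            case let refreshTask as WKApplicationRefreshBackgroundTask:
                // Pull fresh data here (e.g. via WCSession), then reload
                // the complication timelines so the new data is displayed.
                let server = CLKComplicationServer.sharedInstance()
                server.activeComplications?.forEach { server.reloadTimeline(for: $0) }
                refreshTask.setTaskCompletedWithSnapshot(false)
            default:
                task.setTaskCompletedWithSnapshot(false)
            }
        }
    }
}
```

On the iOS side, transferCurrentComplicationUserInfo(_:) (mentioned above) pushes the fetched data across; it is budgeted by the system, so it should only be called when the data actually changed.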

Related

tvOS GameController: At what point does GCController.controllers become populated after launch?

I'm trying to read the GCController.controllers() array after my app has launched to know which controllers were already connected to the AppleTV at app launch. But GCController.controllers().count is 0 until some point after viewDidAppear gets called on my initial UIViewController. Does anyone know the definitive point by which you can check GCController.controllers() to know that it has been populated with the currently connected controllers?
I am aware of the need to register for controller connection notifications with:
NSNotificationCenter.defaultCenter().addObserver(self, selector: "handleControllerDidConnectNotification:" , name: GCControllerDidConnectNotification , object: nil)
But that is for later, after launch. First I need to know which ones are already connected. Anyone know?
You can call startWirelessControllerDiscoveryWithCompletionHandler in viewDidLoad and then check GCController.controllers() in viewWillAppear; that seemed to work for the game app I just finished.
Docs:
After your app has finished launching, the operating system
automatically creates a list of connected controllers. Call the
controllers class method to retrieve an array of GCController objects
for all connected controllers. Next, use these objects to configure
the controllers or read the controller’s inputs. If there are no
connected controllers or you call this method while your app is
launching, the array will be empty.
GCController will generate GCControllerDidConnectNotification notifications for each controller, including those connected to the device prior to launch. If you're not getting notifications for already-connected controllers, confirm the following:
Double-check that the controller is paired and turned on.
Make sure it is an MFi controller.
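The two pieces of advice above can be combined into a short Swift sketch (the notification fires for controllers that were already connected before launch, and the discovery completion handler tells you when the list has settled):

```swift
import GameController

// Observe connections; this also fires for already-paired controllers.
let token = NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect, object: nil, queue: .main) { note in
    if let controller = note.object as? GCController {
        print("Connected: \(controller.vendorName ?? "unknown")")
    }
}
// Keep `token` alive for as long as you want the observation.

// Then start wireless discovery for anything not yet connected.
GCController.startWirelessControllerDiscovery {
    // Discovery finished; GCController.controllers() is now populated.
    print("Controllers: \(GCController.controllers().count)")
}
```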

Refresh NSManagedObjectContext

I am writing a WatchKit extension for an iPhone application. I share Core Data using App Groups, and the data is shared correctly. When a new NSManagedObject is created on the watch, I send a notification to the iPhone app that a new object was created; to do that I use MMWormhole. The iPhone app receives the MMWormhole notification, and now I have to do the last step: refresh the NSManagedObjectContext. How can I do it?
I tried forwarding the NSManagedObjectContextDidSaveNotification inside the MMWormhole notification and using mergeChangesFromContextDidSaveNotification in the iPhone app, but it doesn't work because MMWormhole serializes the message, and NSManagedObject doesn't support serialization.
The simple way is just to have the app reload its data. Re-do any fetches so that you get the latest data from the persistent store.
If you want to make it more sophisticated, do something like this:
In the watch extension, for every new/changed/deleted object,
Call objectID to get the NSManagedObjectID
Convert the object ID to a string using URIRepresentation
Pass these strings in the MMWormhole message
In the app, when receiving the message,
Use [NSPersistentStoreCoordinator managedObjectIDForURIRepresentation:] to convert the strings back to an NSManagedObjectID
Use [NSManagedObjectContext existingObjectWithID:] to get the managed object corresponding to the object ID.
Now you know which objects need refreshing.
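The round trip described above might be sketched in Swift like this (the function names are mine; wire the string array through MMWormhole however you already pass messages):

```swift
import CoreData

// Watch-extension side: convert changed objects to transportable URI strings.
func uriStrings(for objects: [NSManagedObject]) -> [String] {
    return objects.map { $0.objectID.uriRepresentation().absoluteString }
}

// App side: resolve the strings back into managed objects and refresh them.
func refreshObjects(from uris: [String], in context: NSManagedObjectContext) {
    guard let psc = context.persistentStoreCoordinator else { return }
    for uriString in uris {
        guard let url = URL(string: uriString),
              let objectID = psc.managedObjectID(forURIRepresentation: url),
              let object = try? context.existingObject(with: objectID)
        else { continue }
        // Re-read the object's data from the persistent store.
        context.refresh(object, mergeChanges: false)
    }
}
```

For deleted objects, existingObject(with:) will throw, so the guard simply skips them; you would handle deletions by removing them from your UI instead.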

iOS 7 - Update Data Every 24 Hours

I'm creating an app that needs to check for data once a day (at midnight). I know there is a background fetch mode in iOS 7, but as far as I know there is no way to force it to update at a given time interval. Is there any way to do this and still pass App Store review?
Thank you for any suggestions.
There is no real way to do this, since it requires your app to be running in the background, and background running modes are restricted to audio, VoIP, location, and accessory-type apps.
What you could do is check when you last updated the data whenever the user launches the app. This way you only update data when the user starts your app, and only use data when the user is actually using the app.
Background fetching will only work if the user starts your app often and iOS allows your app to fetch in the background. iOS decides when your app is allowed to do a background refresh, and you have little influence over the interval.
UIApplicationBackgroundFetchIntervalMinimum
The smallest fetch interval supported by the system.
Maybe it's not exactly the answer you expect, but iOS 7 does provide functionality that allows you to fetch data periodically.
In this scenario iOS intelligently schedules the background fetch events based on your app usage, which helps save battery life. So this is not going to run exactly every 24 hours, but you can fetch the data and, if it has been updated, refresh the app; otherwise ignore it.
In Xcode 5 -> Target -> Capabilities, turn on Background Modes (background fetch).
And in application:didFinishLaunchingWithOptions: add:
[[UIApplication sharedApplication] setMinimumBackgroundFetchInterval:UIApplicationBackgroundFetchIntervalMinimum];
This is a method which will be called:
- (void)application:(UIApplication *)application performFetchWithCompletionHandler:(void (^)(UIBackgroundFetchResult))completionHandler
{
}
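A hedged Swift sketch of what the handler's body might look like (fetchLatestData(completion:) is a placeholder for your own networking code; in a real app this method lives in the app delegate):

```swift
import UIKit

// Placeholder for your own networking code.
func fetchLatestData(completion: @escaping ([String]) -> Void) {
    completion([]) // pretend nothing new came back
}

func application(_ application: UIApplication,
                 performFetchWithCompletionHandler completionHandler:
                     @escaping (UIBackgroundFetchResult) -> Void) {
    fetchLatestData { newItems in
        // Report the outcome so iOS can tune its scheduling heuristics;
        // always call the completion handler, or iOS will throttle you.
        completionHandler(newItems.isEmpty ? .noData : .newData)
    }
}
```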

Core Location In Conjunction with RestKit

I am using RestKit in my app for all web/network calls. On launch, in the AppDelegate, I start updating the user's location. By the time the user approves the app to use the location (on first launch), the app would have moved to the RootViewController and started the RestKit HTTP call. The HTTP call requires the user's location, and therefore, will fail.
What are some ways around that? How can I work with Core Location while utilizing RestKit's concurrency?
I am aware that the way the app is structured (updating the UI upon web service retrieval) is very dangerous. I am trying to avoid as much refactoring as I can (management orders). What are some ways around this?
Do not ask for the location in the AppDelegate.
Instead, navigate to the RootViewController and, in viewDidLoad, call startUpdatingLocation and immediately show an overlay view with a processing indicator such as a UIActivityIndicatorView.
Upon getting the location successfully, the delegate method fires; remove the overlay view there and send the RestKit call with the acquired coordinates.
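The flow above might be sketched like this in Swift (RootViewController and startRestKitRequest(at:) are placeholder names; the latter stands in for your RKObjectManager call):

```swift
import UIKit
import CoreLocation

class RootViewController: UIViewController, CLLocationManagerDelegate {
    let locationManager = CLLocationManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()
        // Show a UIActivityIndicatorView overlay here while waiting.
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last else { return }
        manager.stopUpdatingLocation()
        // Hide the overlay, then fire the deferred network request.
        startRestKitRequest(at: location.coordinate)
    }

    func startRestKitRequest(at coordinate: CLLocationCoordinate2D) {
        // Your RestKit / RKObjectManager call goes here, now that
        // a valid coordinate is available.
    }
}
```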

Receive remote control events without audio

Here is some background information, otherwise skip ahead to the question in bold. I am building an app and I would like it to have access to the remote control/lock screen events. The tricky part is that this app does not play audio itself; it controls the audio of another device nearby. The communication between devices is not a problem when the app is in the foreground. As I just found out, an app does not assume control of the remote controls until it has played audio with a playback audio session, and was the last to do so. This presents a problem because, like I said, the app controls ANOTHER device's audio and has no need to play its own.
My first inclination is to have the app play a silent clip every time it is opened in order to assume control of the remote controls. The fact that I have to do this makes me wonder if I am even going to be allowed to do it by Apple or if there is another way to achieve this without fooling the system with fake audio clips.
QUESTION(S): Will Apple approve an app that plays a silent audio clip in order to assume control of the remote/lock screen controls for the purpose of controlling another device's audio? Is there any way of assuming control of the remote controls without an audio session?
P.S. I would prefer to have this functionality on iOS 4.0 and up.
P.P.S I have seen this similar question and it has gotten me brainstorming but the answer provided is not specific to what I need to know.
NOTE: As of iOS 7.1, you should be using MPRemoteCommandCenter instead of the answer below.
You attach handlers to the various system-provided MPRemoteCommand objects exposed as properties of [MPRemoteCommandCenter sharedCommandCenter].
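A minimal Swift sketch of the modern approach (what the handlers do is up to you; here they would forward the command to the remote device):

```swift
import MediaPlayer

let center = MPRemoteCommandCenter.shared()

let playToken = center.playCommand.addTarget { _ in
    // Tell the other device to start playing.
    return .success
}
let pauseToken = center.pauseCommand.addTarget { _ in
    // Tell the other device to pause.
    return .success
}
// Keep the tokens if you later need to removeTarget(_:).
```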
I'm keeping the rest of this around for historical reference, but the following is not guaranteed to work on recent iOS versions. In fact, it just might not.
You definitely do need an audio player but not necessarily an explicit session to take control of the remote control events. (AVAudioSession is implicit to any app that plays audio.) I spent a decent amount of time playing with this to confirm this.
I've seen a lot of confusion on the internet about where to set up the remoteControlReceivedWithEvent: method and various approaches to the responder chain. I know this method works on iOS 6 and iOS 7; other approaches have not. Don't waste your time handling remote control events in the app delegate (where they used to work) or in a view controller, which may go away during the lifecycle of your app.
I made a demo project to show how to do this.
Here's a quick rundown of what has to happen:
You need to create a subclass of UIApplication. When the documentation says UIResponder, it means UIApplication, since your application class is a subclass of UIResponder. In this subclass, you're going to implement the remoteControlReceivedWithEvent: and canBecomeFirstResponder methods. You want to return YES from canBecomeFirstResponder. In the remote control method, you'll probably want to notify your audio player that something's changed.
You need to tell iOS to use your custom class to run the app, instead of the default UIApplication. To do so, open main.m and change this:
return UIApplicationMain(argc, argv, nil, NSStringFromClass([RCAppDelegate class]));
to look like this:
return UIApplicationMain(argc, argv, NSStringFromClass([RCApplication class]), NSStringFromClass([RCAppDelegate class]));
In my case RCApplication is the name of my custom class. Use the name of your subclass instead. Don't forget to #import the appropriate header.
OPTIONAL: You should configure an audio session. It's not required, but if you don't, audio won't play if the phone is muted. I do this in the demo app's delegate, but do so where appropriate.
Play something. Until you do, the remote controls will ignore your app. I just took an AVPlayer and gave it the URL of a streaming site that I expect to be up. If you find that it fails, put your own URL in there and play with it to your heart's content.
This example has a little bit more code in there to log out remote events, but it's not all that complicated. I just define and pass around some string constants.
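Putting steps 1 and 2 together, the subclass might look like this Swift sketch (the RCApplication name matches main.m above; the notification name is an assumption):

```swift
import UIKit

class RCApplication: UIApplication {
    // Required so the application object can receive remote control events.
    override var canBecomeFirstResponder: Bool { return true }

    override func remoteControlReceived(with event: UIEvent?) {
        guard let event = event, event.type == .remoteControl else { return }
        // Notify the audio player that something changed,
        // e.g. via NotificationCenter.
        NotificationCenter.default.post(
            name: Notification.Name("RCRemoteControlEvent"), object: event)
    }
}
```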
I bet that a silent looping MP3 file would help work towards your goal.
Moshe's solution worked great for me! However one issue I noticed is when you paused the audio, the media controls would go away and you won't be able to play it again without going back into the app. If you set the Media Info on the lock screen when you play the audio then this won't happen:
NSDictionary *mediaInfo = @{MPMediaItemPropertyTitle: @"My Title",
MPMediaItemPropertyAlbumTitle: @"My Album Name",
MPMediaItemPropertyPlaybackDuration: [NSNumber numberWithFloat:0.30f]};
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:mediaInfo];