Apple Watch app AVSpeechSynthesizer not working in background

In my WatchKit extension's Capabilities section, there is no checkbox for Audio under Background Modes.
I have also checked the Apple docs.
There are only 3 exceptions:
1) NSURLSession
2) WKAudioFilePlayer or WKAudioFileQueuePlayer
3) HKWorkoutSession
But I need the system to speak my dynamic text.
I am using this code:
let dictSetting = UserDefaults.standard.value(forKey: "settings") as! [String : String]
let settingObj = SettingWatch(fromDict: dictSetting)
let utterance = AVSpeechUtterance(string: text)
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
self.synth = AVSpeechSynthesizer()
self.synth?.delegate = self
self.synth?.speak(utterance)
Is there any way my Apple Watch app can speak the text even while it is in the background? Or is there any way to keep my app in the foreground?
Please suggest a solution if you have come across one.

Related

swift 3 local notification: how to change display text for each notification

I know how to make a local notification that triggers each day at a specific time. But how do I change the UNMutableNotificationContent.body for each notification?
Right now it just displays the same text over and over again.
I'm coding for iOS 10.
Have you tried this?
let content = UNMutableNotificationContent()
content.title = self.notificationTitle // You could set this variable to anything you like
content.body = self.notificationBody // You could set this variable to anything you like
content.sound = UNNotificationSound.default()

Permission to take photo or get image from library not shown in iOS 9 (Xcode 7 beta, Swift 2)

The code below shows an example of my access to the image library. No matter where I call the code (view), I do not see the phone's permission dialog popping up, and therefore cannot allow my app to access either the camera or the library.
Also, the privacy settings do not show my app. Any thoughts? I'm going nuts.
let imgPicker = UIImagePickerController()
imgPicker.sourceType = UIImagePickerControllerSourceType.PhotoLibrary
imgPicker.modalPresentationStyle = UIModalPresentationStyle.Popover
self.presentViewController(imgPicker, animated: true, completion: nil)
Another way I tried:
if UIImagePickerController.isSourceTypeAvailable(UIImagePickerControllerSourceType.Camera) {
    let imagePicker: UIImagePickerController = self.imagePickerController
    imagePicker.allowsEditing = true
    imagePicker.sourceType = UIImagePickerControllerSourceType.Camera
    imagePicker.cameraCaptureMode = UIImagePickerControllerCameraCaptureMode.Photo
    imagePicker.cameraDevice = UIImagePickerControllerCameraDevice.Rear
    imagePicker.showsCameraControls = true
    imagePicker.navigationBarHidden = true
    imagePicker.toolbarHidden = true
    imagePicker.delegate = self
    self.presentViewController(imagePicker, animated: true, completion: nil)
}
You need to set a new pair of Info.plist values, NSCameraUsageDescription and NSPhotoLibraryUsageDescription, just as you would set a usage-description string for Location Services in iOS 8.
Just set your description string for those.
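For example, the Info.plist entries look like this (the description strings here are placeholders; use your own wording):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to take photos for your posts.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>This app accesses your photo library so you can pick an existing image.</string>
```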
Check Info.plist.
If "Bundle display name" is set to an empty string, iOS 8.1 will pop up a permission window like this:
"" Would Like to Access Your Photos
But on iOS 9 there is no permission popup at all, and no permission entry appears under Settings > Privacy > Photos > your app's name.
To solve this, remove "Bundle display name" from your Info.plist.
This appears to be an issue with iOS 9: for some reason the permission dialog does not pop up. I have successfully tested the same code on iOS 8.

Current Location doesn't work with Apple Maps iOS 6

Before iOS 6, I was using this URL scheme to open the native Maps app and get directions from the user's current location to an address I supplied:
http://maps.google.com/maps?daddr=" + address + "&saddr=Current+Location
This was working great, but now that Apple replaced Google Maps in iOS 6, we have to check which iOS version the user is on and send them to the new Apple Maps URL scheme if they are on iOS 6.0 or greater. The new URL scheme we are using is this:
http://maps.apple.com/maps?daddr=" + address + "&saddr=Current+Location
This is based on the new documentation for map URL schemes, which can be found here..
Anyway, I've tested it a lot, and it boils down to this: the new Apple Maps does not recognize Current Location the way Google Maps did.
Does anyone know how I can fix this?
Keep in mind I am building an HTML app with PhoneGap, so using native code to set the starting address to the current location won't help me.
I am having the same problem. I haven't found a full solution yet, but if you leave off the saddr:
http://maps.apple.com/maps?daddr=" + address
it will just ask the user where to start, and the first option is "Current Location", so when they tap "Current Location" it will show the map correctly.
If anyone finds a better solution, please post it, as I am still looking for one.
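A minimal sketch of that fallback in JavaScript (the function name is mine; encodeURIComponent guards against spaces, commas, and '&' in the address):

```javascript
// Build an Apple Maps URL with only a destination (daddr).
// With no saddr, Maps asks where to start, and "Current Location"
// is the first suggestion.
function buildMapsUrl(address) {
  return "http://maps.apple.com/maps?daddr=" + encodeURIComponent(address);
}

buildMapsUrl("1 Infinite Loop, Cupertino");
// → "http://maps.apple.com/maps?daddr=1%20Infinite%20Loop%2C%20Cupertino"
```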
You can use my method:
<script type="text/javascript">
var link = "maps:saddr=YPlat,YPlong&daddr=42.118599,-72.625122";
navigator.geolocation.getCurrentPosition(showPosition);

function showPosition(position) {
    link = link.replace("YPlat", position.coords.latitude);
    link = link.replace("YPlong", position.coords.longitude);
    window.location = link;
}
</script>
confirmed with iOS 5.1 and iOS 6
Just pass "Current Location" as the source address:
http://maps.apple.com/maps?saddr=Current%20Location&daddr=Your_Address
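Combining this with the geolocation approach above, a hedged sketch (the function name and fallback behavior are mine): prefer real coordinates when geolocation succeeds, since iOS 6 Apple Maps does not resolve the literal string the way Google Maps did.

```javascript
// Build a directions URL: use coordinates (e.g. from
// navigator.geolocation) when available, otherwise fall back to the
// literal "Current Location", which Google Maps understood.
function buildDirectionsUrl(daddr, coords) {
  var saddr = coords
    ? coords.latitude + "," + coords.longitude
    : "Current Location";
  return "http://maps.apple.com/maps?saddr=" + encodeURIComponent(saddr) +
         "&daddr=" + encodeURIComponent(daddr);
}

buildDirectionsUrl("1 Infinite Loop");
// → "http://maps.apple.com/maps?saddr=Current%20Location&daddr=1%20Infinite%20Loop"
```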
You can get the coordinates of the current location using CLLocationManager, or its wrapper DKLocationManager (on GitHub), created by Keith Pitt.
Once you have the coordinates, you can use the following code sample.
+ (void)openDirectionFrom:(CLLocation *)currentLocation To:(NSString *)daddr {
    NSString *urlStr;
    NSString *saddr = @"Current+Location";
    if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 6) {
        // iOS 6+: use maps.apple.com. "Current Location" doesn't work in
        // iOS 6, so we must provide the coordinates.
        if ((currentLocation.coordinate.latitude != kCLLocationCoordinate2DInvalid.latitude) &&
            (currentLocation.coordinate.longitude != kCLLocationCoordinate2DInvalid.longitude)) {
            // Valid location.
            saddr = [NSString stringWithFormat:@"%f,%f", currentLocation.coordinate.latitude, currentLocation.coordinate.longitude];
            urlStr = [NSString stringWithFormat:@"http://maps.apple.com/maps?saddr=%@&daddr=%@", saddr, daddr];
        } else {
            // Invalid location: Location Services disabled.
            urlStr = [NSString stringWithFormat:@"http://maps.apple.com/maps?daddr=%@", daddr];
        }
    } else {
        // < iOS 6: use maps.google.com.
        urlStr = [NSString stringWithFormat:@"http://maps.google.com/maps?saddr=%@&daddr=%@", saddr, daddr];
    }
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:urlStr]];
}

Can Adobe AIR Desktop application take full screen snapshots (aka Print Screen button)

I would like to know if it's possible to get full-screen snapshots from an AIR application.
What I am interested in is functionality similar to the Print Screen button in Windows, which takes snapshots of all screens, including third-party application windows, not just the window the AIR app is running in.
If it's not specific to AIR and the Flash/Flex API can provide such functionality, that would also be great.
Thanks a lot in advance.
Check out this article, as it explains obtaining a screenshot by calling a native process:
import flash.desktop.NativeProcess;
import flash.desktop.NativeProcessStartupInfo;
import flash.events.NativeProcessExitEvent;
import flash.filesystem.File;
import flash.system.Capabilities;

var process:NativeProcess;

if (NativeProcess.isSupported) {
    var file:File = File.applicationDirectory;
    var args:Vector.<String> = new Vector.<String>();
    if (Capabilities.os.toLowerCase().indexOf("win") > -1) {
        file = file.resolvePath("PATH/TO/WINDOWS/printscr");
        // Use your preferred screenshot tool here (e.g. https://code.google.com/p/screenshot-cmd/)
        // and set up the args as needed.
    } else if (Capabilities.os.toLowerCase().indexOf("mac") > -1) {
        file = file.resolvePath("/usr/sbin/screencapture");
        args[0] = "-i";
        args[1] = "screencapture.png";
    }
    var nativeProcessStartupInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();
    nativeProcessStartupInfo.arguments = args;
    nativeProcessStartupInfo.executable = file;
    nativeProcessStartupInfo.workingDirectory = File.desktopDirectory;
    process = new NativeProcess();
    process.addEventListener(NativeProcessExitEvent.EXIT, done);
    process.start(nativeProcessStartupInfo);
} else {
    trace("NativeProcess NOT SUPPORTED!");
}

function done(e:NativeProcessExitEvent):void {
    trace("screenshot complete");
}
One important thing to bear in mind is the AIR device profile.
If you're initially testing in ADL, be sure to use the extendedDesktop profile, otherwise NativeProcess.isSupported will return false.
For more details, check out the NativeProcess documentation and the "Communicating with native processes in AIR" developer guide.
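As a related note, the profiles an app may run under can also be pinned in the AIR application descriptor; a fragment like the following (a child of the root application element) keeps the extendedDesktop profile available when packaged:

```xml
<supportedProfiles>extendedDesktop desktop</supportedProfiles>
```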

Getting Safari document title/location with Scripting Bridge does not work in full-screen mode

I'm trying to get the URL and document title from the topmost Safari document/tab. I have an AppleScript version and an Objective-C version using Apple's Scripting Bridge framework.
Both versions work fine for most web pages; however, when I open a YouTube video in full-screen mode, the Scripting Bridge-based version fails. The AppleScript works fine for "normal" and full-screen Safari windows.
Can anyone see what is wrong with the Scripting Bridge code below that causes it to fail for full-screen Safari windows?
Here is the code (I omitted error checking for brevity):
AppleScript:
tell application "Safari"
    # Give us some time to open the video in full-screen mode
    delay 10
    do JavaScript "document.title" in document 0
end tell
Scripting Bridge:
SafariApplication *safari = [SBApplication applicationWithBundleIdentifier:@"com.apple.Safari"];
SBElementArray *windows = [safari windows];
SafariTab *currentTab = [[windows objectAtIndex:0] currentTab];
// This fails when in full-screen mode:
id result = [safari doJavaScript:@"document.title" in:currentTab];
NSLog(@"title: %@", result);
Scripting Bridge error (with added line breaks):
Apple event returned an error. Event = 'sfri'\'dojs'{
'----':'utxt'("document.title"),
'dcnm':'obj '{ 'want':'prop',
'from':'obj '{ 'want':'cwin',
'from':'null'(),
'form':'indx',
'seld':1 },
'form':'prop',
'seld':'cTab' }
}
Error info = {
ErrorNumber = -1728;
ErrorOffendingObject = <SBObject #0x175c2de0:
currentTab of SafariWindow 0 of application "Safari" (238)>;
}
I could not find details about the given error code. It complains about 'currentTab', which shows that the JavaScript event at least made it all the way to Safari. I assume the current tab receives the event but refuses to run the JS code because it is in full-screen mode. However, why does this work from AppleScript? Don't they use the same code path eventually?
Any suggestions are greatly appreciated. Thanks!
This appears to be a failure in the header file created by sdef/sdp: your Objective-C does not parallel your working AppleScript (and the header file seems to indicate that there is no doJavaScript:in: that would work on a document, as in your AppleScript). When I used the following code, which does parallel your AppleScript, it returned the title without error even for a full-screen YouTube video. The cast from SafariDocument * to SafariTab * in order to use doJavaScript:in: feels wrong, but seems to work.
SafariApplication *safari = [SBApplication applicationWithBundleIdentifier:@"com.apple.Safari"];
SBElementArray *documents = [safari documents];
SafariDocument *frontDocument = [documents objectAtIndex:0];
// This *no longer* fails when in full-screen mode:
id result = [safari doJavaScript:@"document.title" in:(SafariTab *)frontDocument];
NSLog(@"title: %@", result);
To get the URL of the frontmost tab, the equivalent AppleScript is:
tell application "Safari"
    get URL of current tab of front window
end tell
The Objective-C is then simply
SafariApplication *safari = [SBApplication applicationWithBundleIdentifier:@"com.apple.Safari"];

for (SafariWindow *window in safari.windows) {
    if ([window visible]) {
        result = window.currentTab.URL;
        break;
    }
}