Before iOS 6, I was using this URL scheme to open the native Maps app and get directions from the user's current location to an address that I created:
http://maps.google.com/maps?daddr=" + address + "&saddr=Current+Location
This was working great, but now that Google Maps is gone in iOS 6, we had to check which iOS version the user was on and refer them to the new Apple Maps URL scheme if they were on iOS 6.0 or greater. The new URL scheme we are using is this:
http://maps.apple.com/maps?daddr=" + address + "&saddr=Current+Location
This is based on the new documentation for map URL schemes, which can be found here.
Anyway, I've tested it quite a bit, and it boils down to this: the new Apple Maps does not recognize Current Location the way Google Maps did.
Does anyone know how I can fix this?
Keep in mind that I am building an HTML app with PhoneGap, so using native code to set the starting address to the current location won't help me.
I am having the same problem. I haven't found a solution yet, but if you leave off the saddr,
http://maps.apple.com/maps?daddr=" + address
it will just ask the user where to start, and the first option is "Current Location", so when they tap "Current Location" it will show the map correctly.
If anyone finds a better solution, please post it; I am still looking for one.
You can use my method:
<script type="text/javascript">
// The YPlat/YPlong placeholders are replaced with the user's actual
// coordinates once geolocation succeeds, then Maps is opened.
var link = "maps:saddr=YPlat,YPlong&daddr=42.118599,-72.625122";

navigator.geolocation.getCurrentPosition(showPosition);

function showPosition(position) {
    link = link.replace("YPlat", position.coords.latitude);
    link = link.replace("YPlong", position.coords.longitude);
    window.location = link;
}
</script>
Confirmed with iOS 5.1 and iOS 6.
Just pass "Current Location" as the source address:
http://maps.apple.com/maps?saddr=Current%20Location&daddr=Your_Address
You can get the coordinates of the current location using CLLocationManager, or its wrapper DKLocationManager (on GitHub), created by Keith Pitt.
Once you have the coordinates, you can use the following code sample.
+ (void)openDirectionFrom:(CLLocation *)currentLocation To:(NSString *)daddr {
    NSString *urlStr;
    NSString *saddr = @"Current+Location";
    if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 6) {
        // iOS 6+. Use maps.apple.com; "Current Location" doesn't work in iOS 6,
        // so the coordinate must be provided explicitly.
        if (CLLocationCoordinate2DIsValid(currentLocation.coordinate)) {
            // Valid location.
            saddr = [NSString stringWithFormat:@"%f,%f", currentLocation.coordinate.latitude, currentLocation.coordinate.longitude];
            urlStr = [NSString stringWithFormat:@"http://maps.apple.com/maps?saddr=%@&daddr=%@", saddr, daddr];
        } else {
            // Invalid location; Location Services are probably disabled.
            urlStr = [NSString stringWithFormat:@"http://maps.apple.com/maps?daddr=%@", daddr];
        }
    } else {
        // Before iOS 6, use maps.google.com.
        urlStr = [NSString stringWithFormat:@"http://maps.google.com/maps?saddr=%@&daddr=%@", saddr, daddr];
    }
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:urlStr]];
}
macOS 10.14
<key>NSMicrophoneUsageDescription</key>
<string>Record audio!</string>
This works in a Swift project:
AVCaptureDevice.requestAccess(for: .audio) { granted in
    if granted {
        //self.setupCaptureSession()
    }
}
But this does not work in an Objective-C project (Thread 8: signal SIGABRT):
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
    if (granted) {
        //self.microphoneConsentState = PrivacyConsentStateGranted;
    } else {
        //self.microphoneConsentState = PrivacyConsentStateDenied;
    }
}];
What have I done wrong or missed in the Objective-C project? (I don't want to convert my project to Swift.)
Any help appreciated. Thanks, paul
In order to use AVCaptureDevice you need to add the usage description for both items, i.e. NSCameraUsageDescription and NSMicrophoneUsageDescription.
For reference, please see the Apple documentation.
If you only want to record audio, you can also use the AVAudioSession API.
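For reference, here is a minimal Swift sketch of the consent flow (the equivalent Objective-C calls are authorizationStatusForMediaType: and requestAccessForMediaType:completionHandler:). It assumes NSMicrophoneUsageDescription is already in the Info.plist; ensureMicrophoneAccess is just an illustrative name:

import AVFoundation

// Sketch of the microphone-consent flow: check the current status first,
// and ask only if the user has not been prompted yet.
func ensureMicrophoneAccess(_ completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .authorized:
        completion(true)                 // already granted
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            completion(granted)          // user just answered the dialog
        }
    default:
        completion(false)                // denied or restricted
    }
}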
I began a new Objective-C project (macOS 10.14) and copied everything from the old project into it (xibs, etc.). It displayed the appropriate access dialog but didn't actually record. I had to check the Audio Input checkbox under Capabilities, after messing about for an hour. :-)
In my WatchKit extension's Capabilities section, there is no option to check the audio checkbox in the Background Modes section.
I have also checked the Apple documentation.
There are only 3 exceptions:
1) NSURLSession
2) WKAudioFilePlayer or WKAudioFileQueuePlayer
3) HKWorkoutSession
But I need the system to speak my dynamic text.
I am using this code:
let dictSetting = UserDefaults.standard.value(forKey: "settings") as! [String : String]
let settingObj = SettingWatch(fromDict: dictSetting)
let utterance = AVSpeechUtterance(string: text)
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
self.synth = AVSpeechSynthesizer()
self.synth?.delegate = self
self.synth?.speak(utterance)
Is there any way that my Apple Watch app can speak the text even in the background? Or is there any way to keep my app in the foreground?
Please suggest a solution if you have come across one.
I can download log files via Analytics Console > Devices > Device Search > Device Information > Download Logs.
We can search the log files by deviceId.
My question is: how can I find out the deviceId from the user?
For example, there is some problem in the application, the user reports it to the admin, and the admin searches for the user's device by deviceId.
Is there code to display the deviceId in my application, so the user can send the deviceId to the admin?
Of course, there is a JavaScript API to get the Device ID.
See WL.Device.getID()
I have seen folks use code like this in native apps, with some special view to show it to the user:
NSUUID *oNSUUID = [[UIDevice currentDevice] identifierForVendor];
or
import android.provider.Settings.Secure;
private String android_id =
Secure.getString(getContext().getContentResolver(), Secure.ANDROID_ID);
I have not found an example of this API in JavaScript, but it is very easy to pass the data up from native. For example,
in main.m:
#import "WL.h"
NSDictionary *data = #{#"id": oNSUUID};
[[WL sharedInstance] sendActionToJS:#"DeviceInfo"
withData:data];
and in your app:
// global
var nativeInfo = null;

// in wlCommonInit()
var actionReceiver = function (received) {
    WL.Logger.error(received);
    if (received.action == "DeviceInfo") {
        nativeInfo = received.data;
    }
};
WL.App.addActionReceiver("GarantiActionReceiver", actionReceiver);
I would like to know whether there is a Swift equivalent of the following Objective-C code:
NSURL *appURL = [NSURL URLWithString:@"myapp://"];
if ([[UIApplication sharedApplication] canOpenURL:appURL]) {
    NSLog(@"The app with this URL scheme is installed on the device");
}
Before reading this answer you must solemnly swear not to do any of the activities on the page you linked to. (Looking for dating apps? Seriously?)
The method is essentially the same:
if let appURL = NSURL(string: "myapp://test/url/") {
    let canOpen = UIApplication.sharedApplication().canOpenURL(appURL)
    println("Can open \"\(appURL)\": \(canOpen)")
}
Let's take two apps, named A and B.
Now I want to open app B from app A. To do that, first create a URL scheme in app B.
Then add the code below to the Info.plist file of app A:
<key>LSApplicationQueriesSchemes</key>
<array>
    <string>appBScheme</string>
</array>
Now add the code below in app A, wherever you want to open app B:
UIApplication.shared.canOpenURL(URL(string: "appBScheme://")!)
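Note that canOpenURL(_:) only reports whether the scheme can be handled; to actually launch app B you still call open(_:options:completionHandler:) (iOS 10+). A minimal sketch, where openAppB and appBScheme are illustrative placeholder names:

import UIKit

// Check whether app B is installed, then launch it.
// "appBScheme" is a placeholder for the URL scheme registered by app B.
func openAppB() {
    guard let url = URL(string: "appBScheme://"),
          UIApplication.shared.canOpenURL(url) else {
        print("App B is not installed, or its scheme is missing from LSApplicationQueriesSchemes")
        return
    }
    UIApplication.shared.open(url, options: [:], completionHandler: nil)
}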
Thank you.
I'm trying to get the URL and document title from the topmost Safari document/tab. I have an AppleScript version and an Objective-C version that uses Apple's Scripting Bridge framework.
Both versions work fine for most web pages; however, when I open a YouTube video in full-screen mode, the Scripting Bridge-based version fails. The AppleScript works fine for both "normal" and full-screen Safari windows.
Can anyone see what is wrong with the Scripting Bridge code below to cause it to fail for full-screen Safari windows?
Here is the code (I omitted error checking for brevity):
AppleScript:
tell application "Safari"
# Give us some time to open video in full-screen mode
delay 10
do JavaScript "document.title" in document 0
end tell
Scripting Bridge:
SafariApplication* safari = [SBApplication applicationWithBundleIdentifier:@"com.apple.Safari"];
SBElementArray* windows = [safari windows];
SafariTab* currentTab = [[windows objectAtIndex:0] currentTab];

// This fails when in full-screen mode:
id result = [safari doJavaScript:@"document.title" in:currentTab];
NSLog(@"title: %@", result);
Scripting Bridge error (with added line breaks):
Apple event returned an error. Event = 'sfri'\'dojs'{
'----':'utxt'("document.title"),
'dcnm':'obj '{ 'want':'prop',
'from':'obj '{ 'want':'cwin',
'from':'null'(),
'form':'indx',
'seld':1 },
'form':'prop',
'seld':'cTab' }
}
Error info = {
ErrorNumber = -1728;
ErrorOffendingObject = <SBObject @0x175c2de0:
currentTab of SafariWindow 0 of application "Safari" (238)>;
}
I could not find details about the given error code. It complains about 'currentTab', which shows that the JavaScript event at least made it all the way to Safari. I assume that the current tab receives the event but refuses to run the JS code because it is in full-screen mode. However, why does this work with AppleScript? Don't they use the same code path eventually?
Any suggestions are greatly appreciated. Thanks!
This appears to be a failure in the header file created by sdef/sdp: your Objective-C does not parallel your working AppleScript (and the header file seemed to indicate that there is no doJavaScript:in: that would work on a document, as in your AppleScript). When I used the following code, which does parallel your AppleScript, it gave the title without error, even for a full-screen YouTube video. The cast from SafariDocument* to SafariTab* in order to use doJavaScript:in: feels wrong, but seems to work.
SafariApplication* safari = [SBApplication applicationWithBundleIdentifier:@"com.apple.Safari"];
SBElementArray* documents = [safari documents];
SafariDocument* frontDocument = [documents objectAtIndex:0];

// This *no longer* fails when in full-screen mode:
id result = [safari doJavaScript:@"document.title" in:(SafariTab *)frontDocument];
NSLog(@"title: %@", result);
tell application "Safari"
get URL of current tab of front window
end tell
The Objective-C is then simply:
SafariApplication *safari = [SBApplication applicationWithBundleIdentifier:@"com.apple.Safari"];
NSString *result = nil;
for (SafariWindow *window in safari.windows) {
    if ([window visible]) {
        result = window.currentTab.URL;
        break;
    }
}