I am using the new iOS 7 developer SDK, and now the app asks the user for permission to record from the microphone the first time it tries to record.
My app records after a countdown, so the user can't see this request.
I use this code to check requestRecordPermission:
[[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
    if (granted) {
        // Microphone enabled code
    }
    else {
        // Microphone disabled code
    }
}];
But how can I trigger the request myself before I start recording?
In the new iOS 7 it's very simple; try this:
if ([[AVAudioSession sharedInstance] respondsToSelector:@selector(requestRecordPermission:)])
{
    // Passing an empty block simply triggers the permission prompt.
    [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {}];
}
Here is the final code snippet that works for me. It supports both Xcode 4 and 5, and works on iOS 5+.
#ifndef __IPHONE_7_0
typedef void (^PermissionBlock)(BOOL granted);
#endif

PermissionBlock permissionBlock = ^(BOOL granted) {
    if (granted)
    {
        [self doActualRecording];
    }
    else
    {
        // Warn no access to microphone
    }
};

// iOS 7+
if ([[AVAudioSession sharedInstance] respondsToSelector:@selector(requestRecordPermission:)])
{
    [[AVAudioSession sharedInstance] performSelector:@selector(requestRecordPermission:)
                                          withObject:permissionBlock];
}
else
{
    [self doActualRecording];
}
As "One Man Crew" claimed you can use requestRecordPermission.
Important thing to be aware of is that you must check that requestRecordPermission is implemented. The reason is that if your app would run on older iOS version (iOS 6.x for example) it would crash after this call.
To prevent that you must check that this selector is implemented (this is a good practice anyway).
Code should be something like this:
if ([[AVAudioSession sharedInstance] respondsToSelector:@selector(requestRecordPermission:)]) {
    [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
        // Handle granted / not granted here
    }];
}
Using this approach, your app will support the new OS as well as previous versions of the OS.
I use this method every time Apple adds functionality in a new OS release (that way I can support older versions fairly easily).
AVAudioSession.sharedInstance().requestRecordPermission({ (granted) -> Void in
    if !granted
    {
        let microphoneAccessAlert = UIAlertController(title: NSLocalizedString("recording_mic_access", comment: ""), message: NSLocalizedString("recording_mic_access_message", comment: ""), preferredStyle: UIAlertControllerStyle.Alert)

        let okAction = UIAlertAction(title: NSLocalizedString("OK", comment: ""), style: UIAlertActionStyle.Default, handler: { (alert: UIAlertAction!) -> Void in
            UIApplication.sharedApplication().openURL(NSURL(string: UIApplicationOpenSettingsURLString)!)
        })
        let cancelAction = UIAlertAction(title: NSLocalizedString("Cancel", comment: ""), style: UIAlertActionStyle.Cancel, handler: { (alert: UIAlertAction!) -> Void in
        })

        microphoneAccessAlert.addAction(okAction)
        microphoneAccessAlert.addAction(cancelAction)
        self.presentViewController(microphoneAccessAlert, animated: true, completion: nil)
        return
    }

    self.performSegueWithIdentifier("segueNewRecording", sender: nil)
})
Based on https://stackoverflow.com/users/1071887/idan's response.
AVAudioSession *session = [AVAudioSession sharedInstance];

// AZ DEBUG ## iOS 7+
AVAudioSessionRecordPermission sessionRecordPermission = [session recordPermission];
switch (sessionRecordPermission) {
    case AVAudioSessionRecordPermissionUndetermined:
        NSLog(@"Mic permission undetermined. Call method for undetermined stuff.");
        break;
    case AVAudioSessionRecordPermissionDenied:
        NSLog(@"Mic permission denied. Call method for denied stuff.");
        break;
    case AVAudioSessionRecordPermissionGranted:
        NSLog(@"Mic permission granted. Call method for granted stuff.");
        break;
    default:
        break;
}
Swift 4:
let session = AVAudioSession.sharedInstance()
if session.responds(to: #selector(AVAudioSession.requestRecordPermission(_:))) {
    AVAudioSession.sharedInstance().requestRecordPermission({ (granted: Bool) -> Void in
        if granted {
            print("granted")
        } else {
            print("not granted")
        }
    })
}
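If you also want to avoid re-prompting once the user has already answered, here is a minimal Swift sketch (my own illustration, not taken from the answers above) that combines the recordPermission check with the request. The function name ensureMicrophoneAccess is just for illustration, and on older SDKs recordPermission is the method recordPermission() rather than a property:

import AVFoundation

// Sketch: prompt only while the permission is still undetermined,
// otherwise act on the stored answer.
func ensureMicrophoneAccess(onGranted: @escaping () -> Void) {
    let session = AVAudioSession.sharedInstance()
    switch session.recordPermission {
    case .granted:
        onGranted()
    case .denied:
        // Point the user to Settings, as in the UIAlertController example above.
        print("Microphone access denied")
    case .undetermined:
        // This call shows the system permission dialog.
        session.requestRecordPermission { granted in
            DispatchQueue.main.async {
                if granted { onGranted() }
            }
        }
    }
}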
When a user logs in with wrong credentials, the iOS app crashes and no log is displayed, but on Android it works without crashing. Why?
Thread 26: Fatal error: Unexpectedly found nil while unwrapping an Optional value
Fatal error: Unexpectedly found nil while unwrapping an Optional value: file /Users/user/FlutterHome/flutter/.pub-cache/hosted/pub.dartlang.org/flutter_aws_amplify_cognito-1.0.0+7/ios/Classes/SwiftFlutterAwsAmplifyCognito.swift, line 275
Future<dynamic> login(
{String username,
String password,
GlobalKey<ScaffoldState> globalKey}) async {
return FlutterAwsAmplifyCognito.signIn(username, password)
.then((SignInResult result) {
debugPrint('------------------------${result}');
switch (result.signInState) {
case SignInState.SMS_MFA:
// TODO: Handle this case.
break;
case SignInState.PASSWORD_VERIFIER:
// TODO: Handle this case.
break;
case SignInState.CUSTOM_CHALLENGE:
// TODO: Handle this case.
break;
case SignInState.DEVICE_SRP_AUTH:
// TODO: Handle this case.
break;
case SignInState.DEVICE_PASSWORD_VERIFIER:
// TODO: Handle this case.
break;
case SignInState.ADMIN_NO_SRP_AUTH:
// TODO: Handle this case.
break;
case SignInState.NEW_PASSWORD_REQUIRED:
// TODO: Handle this case.
break;
case SignInState.DONE:
break;
case SignInState.UNKNOWN:
// TODO: Handle this case.
break;
case SignInState.ERROR:
// TODO: Handle this case.
break;
}
return result.codeDetails;
}).catchError((error) {
if (error.code == 'Error') {
globalKey.currentState.showSnackBar(SnackBar(
backgroundColor: Colors.red,
content: Text(LocalizationsUtils(
Locale.fromSubtags(languageCode: AppPreferences().language))
.errorIncorrectEmailPassword),
));
}
});
}
GitHub link: https://github.com/jonsaw/amazon-cognito-identity-dart
I had the same problem. There is just a missing return in the code.
static func signIn(result: @escaping FlutterResult, username: String, password: String) {
    AWSMobileClient.default().signIn(username: username, password: password) { (signinResult, error) in
        if (error != nil) {
            DispatchQueue.main.async {
                result(FlutterError(code: "Error", message: "Error signing in", details: error?.localizedDescription))
            }
            return // this return was missing!!!
        }
        // ... handle signinResult as before ...
    }
}
FYI: if you run the application from Xcode, the debugger will point you to the bug.
PS: the return is missing in every error path, so the app might crash on other function calls too, for example if you try to get tokens without being logged in.
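For illustration, here is the same guard-and-return pattern applied to a token call inside the same plugin class. The helper name getIdToken and the error message are hypothetical; I'm assuming the plugin wraps AWSMobileClient.default().getTokens(_:) in a similar static handler:

static func getIdToken(result: @escaping FlutterResult) {
    AWSMobileClient.default().getTokens { (tokens, error) in
        if (error != nil) {
            DispatchQueue.main.async {
                result(FlutterError(code: "Error", message: "Error fetching tokens", details: error?.localizedDescription))
            }
            return // without this return, the success path below runs with nil tokens and can crash
        }
        DispatchQueue.main.async {
            result(tokens?.idToken?.tokenString)
        }
    }
}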
I have a macOS app that is registered as a custom URL handler.
I'm trying to make it show a specific NSViewController if the app is started via the URL handler, or the regular window and view controller if no parameters are used.
In my AppDelegate.swift I have this:
func application(_ application: NSApplication, open urls: [URL]) {
    AppDelegate.externalCaller = true;
    NSApp.hide(self)

    let url: URL = urls[0];

    // Process the URL.
    let components = NSURLComponents(url: url, resolvingAgainstBaseURL: true);
    let method = components?.host;

    DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) {
        if (method == "DO_STH") {
            // do something with no window
            NSApplication.shared.windows.last?.close();
        } else if (method == "DO_STH_2") {
            // do something with no window
            NSApplication.shared.windows.last?.close();
        } else if (method == "PROCESS_STUFF") {
            // Show window
            DispatchQueue.main.async {
                let mainStoryboard = NSStoryboard(name: NSStoryboard.Name("Main"), bundle: nil);
                let restoringViewController = mainStoryboard.instantiateController(withIdentifier: NSStoryboard.SceneIdentifier("restoringData")) as! RestoringViewController;
                if let window = NSApp.mainWindow {
                    window.contentViewController = restoringViewController;
                }
                NSApp.activate(ignoringOtherApps: true);
                AppDelegate.restoreData();
            }
        }
    }
}
The problem is that by the time NSApp.hide(self) runs, there's already a window visible with the default view controller, creating a flickering effect.
What's the best way to make the app start without any visible window by default, show the window only when it wasn't started with any URL parameter, and show it later on demand when a specific URL parameter exists?
Unchecking "Is initial Controller" and adding this to AppDelegate solved my issue
func applicationDidFinishLaunching(_ notification: Notification) {
    if (!AppDelegate.externalCaller) {
        showWindow();
    }
}

func showWindow() {
    let mainStoryboard = NSStoryboard(name: NSStoryboard.Name("Main"), bundle: nil);
    let windowController = mainStoryboard.instantiateController(withIdentifier: NSStoryboard.SceneIdentifier("MainWindowController")) as! NSWindowController;
    windowController.showWindow(self);
}
I am developing an application. My app shows both landscape and portrait layouts on tablets/iPads. On iPads this works fine, but on Android tablets the orientation handling only works until the device is locked: if I lock the screen in a particular mode (portrait/landscape) and then rotate the device, it still shows the previous orientation instead of updating to the current one.
I followed this link: https://github.com/yamill/react-native-orientation
This is my code:
componentWillMount() {
    this.getOrientationtype()
}

getOrientationtype() {
    //alert("Hello")
    if (Platform.OS == 'ios') { // to identify Android or iOS
        if (aspectRatio > 1.6) { // code for iPhone
            // alert("phone")
            Orientation.lockToPortrait()
        }
        else {
            Orientation.getOrientation((err, initial) => {
                if (initial != 'UNKNOWN') {
                    this.setState({
                        orientation: initial
                    })
                }
                else {
                    this.setState({
                        orientation: 'PORTRAIT'
                    })
                }
            });
        }
    }
    else {
        if (DeviceInfo.isTablet()) {
            // alert("android tab")
            Orientation.getOrientation((err, initial) => {
                if (initial != 'UNKNOWN') {
                    this.setState({
                        orientation: initial
                    })
                }
                else {
                    this.setState({
                        orientation: 'PORTRAIT'
                    })
                }
            });
        }
        else {
            Orientation.lockToPortrait()
        }
    }
}
Please help me find a solution for this. I am using the library linked above.
1. You should use Orientation.addOrientationListener to listen for orientation events.
2. As you can see from the source code of OrientationModule.java, this library just calls unregisterReceiver in onHostPause, so you can't receive the onConfigurationChanged event after the screen is locked. One way is to edit onHostResume inside OrientationModule.java to do what you want:
@Override
public void onHostResume() {
    final Activity activity = getCurrentActivity();
    if (activity == null) {
        FLog.e(ReactConstants.TAG, "no activity to register receiver");
        return;
    }
    activity.registerReceiver(receiver, new IntentFilter("onConfigurationChanged"));

    // add the code below to the onHostResume function:
    // send a broadcast with the current orientation on resume
    final int orientationInt = getReactApplicationContext().getResources().getConfiguration().orientation;
    Configuration newConfig = new Configuration();
    newConfig.orientation = orientationInt;
    Intent intent = new Intent("onConfigurationChanged");
    intent.putExtra("newConfig", newConfig);
    activity.sendBroadcast(intent);
}
The whole code can be found here: OrientationModule.java
I want to pass data from my iOS app to my watchOS 3 app using WKWatchConnectivityRefreshBackgroundTask.
How do I set up code in my watchOS app to handle the data being transferred?
For example, in the past I used this iOS code to send a message from the iOS app and, if there was no connection, send a context instead:
func sendTable()
{
    let tableInfo: WatchWorkout = PhoneData().buildWatchTableData(Foundation.Date().refDays())
    let archivedTable: Data = NSKeyedArchiver.archivedData(withRootObject: tableInfo)

    if validSession
    {
        sendMessage([Keys.UpdateType : PhoneUpdateType.TableInfo.rawValue, Keys.Workout: archivedTable])
    }
    else
    {
        do
        {
            try updateApplicationContext([Keys.UpdateType : PhoneUpdateType.TableInfo.rawValue, Keys.Workout: archivedTable])
        }
        catch
        {
            print("Phone Session - error sending info: \(error)")
        }
    }
}

func sendMessage(_ message: [String : AnyObject], replyHandler: (([String : AnyObject]) -> Void)? = nil, errorHandler: ((NSError) -> Void)? = nil)
{
    print("Phone Session - phone sent message")
    session!.sendMessage(message,
        replyHandler: nil,
        errorHandler:
        {
            (error) -> Void in
            print("Phone Session - Error Message during transfer to Watch: \(error)")
        }
    )
}

func updateApplicationContext(_ applicationContext: [String : AnyObject]) throws
{
    print("Phone Session - phone sent context")
    if ((session) != nil)
    {
        do
        {
            try session!.updateApplicationContext(applicationContext)
        }
        catch let error
        {
            print("Phone Session - OPPS something wrong - context send failed")
            throw error
        }
    }
}
I'm not sure how to code the receipt of this data as a background task on the watch.
Can someone provide some example code or post a link? The only Apple example code is not very helpful:
https://developer.apple.com/library/prerelease/content/samplecode/WatchBackgroundRefresh/Introduction/Intro.html
Thanks
Greg
The Quick Switch sample code was updated together with the release of watchOS 3 to include an example of handling the WatchConnectivity background refresh task.
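For reference, here is a minimal sketch of what the watch-side handling can look like. This is my own illustration, not the sample's exact code; it assumes the WCSession delegate is set up elsewhere in the extension and that you conform to WCSessionDelegate to actually receive the data (in Swift 3 write WCSession.default() instead of WCSession.default):

import WatchKit
import WatchConnectivity

class ExtensionDelegate: NSObject, WKExtensionDelegate {

    func handle(_ backgroundTasks: Set<WKRefreshBackgroundTask>) {
        for task in backgroundTasks {
            if let connectivityTask = task as? WKWatchConnectivityRefreshBackgroundTask {
                // Make sure the session is active so queued applicationContext /
                // userInfo / file transfers are delivered via the WCSessionDelegate
                // callbacks, e.g. session(_:didReceiveApplicationContext:).
                if WCSession.default.activationState != .activated {
                    WCSession.default.activate()
                }
                // A real app may want to hold the task until WCSession reports
                // no content pending; completing immediately keeps the sketch short.
                connectivityTask.setTaskCompleted()
            } else {
                task.setTaskCompleted()
            }
        }
    }
}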
@ccjensen The Quick Switch sample code doesn't work, does it?
It crashes on my iPhone 6 with iOS 10.0 beta 3. I already sent feedback last Friday.
In my case, calling
updateApplicationContext(_:)
transferUserInfo(_:)
transferCurrentComplicationUserInfo(_:)
transferFile(_:metadata:)
on the iPhone side never triggers the handle(_:) listener.
My development environment is macOS, Appcelerator SDK 5.3.0, and I am testing on a Google Nexus running Android 6.0. Ti.Media.showCamera does not open the camera even though permissions are granted. Here is my code:
function openCamera(parms) {
if (Ti.Media.hasCameraPermissions) {
Ti.API.error("Yes has camera permission");
Ti.Media.showCamera({
success : function(event) {
parms.source.image = event.media; // the captured photo blob from the success event
},
cancel : function() {
Ti.API.error("User cancelled pictur selection");
},
error : function(error) {
var a = Ti.UI.createAlertDialog({
title : 'Camera Error'
});
if (error.code == Ti.Media.NO_CAMERA) {
a.setMessage("No Camera Found!");
} else {
a.setMessage('Unexpected Error: ' + error.code);
}
a.show();
},
mediaTypes : [Ti.Media.MEDIA_TYPE_PHOTO],
animated : true,
autoHide : true,
allowEditing : true,
saveToPhotoGallery : false,
showControls : true
});
} else {
Ti.API.error("No camera permission. Asking for Permission");
Ti.Media.requestCameraPermissions(function(e) {
Ti.API.error(JSON.stringify(e));
if (e.success === true) {
openCamera(parms);
} else {
alert("Access denied, error: " + e.error);
}
});
}
};
The console log displays this:
Yes has camera permission
[WARN] : InputEventReceiver: Attempted to finish an input event but the input event receiver has already been disposed.
Could someone point out what is wrong here?
Hi, I think you are missing parentheses after hasCameraPermissions. hasCameraPermissions() is a method defined on Ti.Media.
Use it like this:
if (Ti.Media.hasCameraPermissions()) {
    // Do your code...
}