Can AVCaptureDevice be used (from Objective-C or Swift) to access an iOS device as a camera source when it is connected via a Lightning cable, much like QuickTime does on OS X Yosemite?
[Screenshot: QuickTime's camera source selection menu]
If not, is there any other way to capture it?
I'm using AVCaptureDevice.devices() (in Swift), but it only lists the built-in Mac camera and mic.
Found the solution for this (thanks, Chris Adamson), after watching the WWDC presentation where Apple announced this capability (fast-forward to 4:09).
The following CoreMediaIO (CMIO) property needs to be set before AVCaptureDevice can detect the iOS device as a camera/capture device (example in Swift):
import CoreMediaIO

// Opt in to "screen capture" (DAL) devices, which is how
// Lightning-connected iOS devices are exposed to the system.
var prop = CMIOObjectPropertyAddress(
    mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
    mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
    mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMaster))

var allow: UInt32 = 1
CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject),
                          &prop, 0, nil,
                          UInt32(sizeofValue(allow)), &allow)
After this is done (e.g. in your AppDelegate), the standard registration of observers can be done, and the iOS device will show up in the list of available capture devices:
// register for a notification when the capture session starts running
NSNotificationCenter.defaultCenter().addObserverForName(AVCaptureSessionDidStartRunningNotification,
                                                        object: session, queue: nil) { note in
    // the iOS device shows up as a muxed device (audio + video in one stream)
    let devices = (AVCaptureDevice.devicesWithMediaType(AVMediaTypeMuxed)
                 + AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo)) as! [AVCaptureDevice]
    for device in devices {
        if device.modelID == "iOS Device" {
            // device is your AVCaptureDevice... use it as usual
        }
    }
}
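The device can take a moment to show up after the CMIO property is set, so you may also want to listen for connection events rather than polling once. A minimal sketch, assuming the standard AVFoundation connect notification:
NSNotificationCenter.defaultCenter().addObserverForName(AVCaptureDeviceWasConnectedNotification,
                                                        object: nil, queue: nil) { note in
    // the notification's object is the newly attached capture device
    if let device = note.object as? AVCaptureDevice where device.modelID == "iOS Device" {
        // configure a session input with this device as usual
    }
}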
I am using Agora's React Native library for a group-calling project, and I need the user to be able to mute/unmute their phone's microphone. Currently there is only a muteLocalAudioStream function, which mutes the whole stream, including the background sounds, since the streamer can add background sound to the call.
For Android I managed to mute/unmute the microphone with the hack below:
@ReactMethod
public void muteMic() {
    AudioManager audioManager = (AudioManager) mContext.getSystemService(Context.AUDIO_SERVICE);
    audioManager.setMode(AudioManager.MODE_IN_CALL);
    // toggle the platform microphone mute state
    audioManager.setMicrophoneMute(!audioManager.isMicrophoneMute());
}
However, I couldn't do it for iOS. I'd appreciate your help on this.
You can change the recording level with a call to
// volume: [0,100]
RtcEngine.adjustRecordingSignalVolume(volume)
Passing 0 effectively mutes the microphone input without touching the rest of the mixed stream.
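If you are bridging this yourself on the iOS side (mirroring your Android native module), the Agora iOS SDK exposes the same call on AgoraRtcEngineKit. A minimal sketch, assuming agoraKit is your engine instance and muteMic is a hypothetical bridged method:
@objc func muteMic(_ mute: Bool) {
    // hypothetical helper: 0 mutes the recording signal, 100 restores the original level
    agoraKit.adjustRecordingSignalVolume(mute ? 0 : 100)
}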
We are using ExoPlayer to play m3u8 (HLS) streams on Android TV. The streaming is working fine, but the video plays in portrait mode (even when the video was shot in landscape).
It looks like an orientation issue on the Android TV rather than an aspect-ratio problem. Here is the player initialization:
private fun initializePlayer() {
    if (mPlayer == null) {
        playerView = activity!!.findViewById<SimpleExoPlayerView>(R.id.texture_view)
        // playerView!!.setControllerVisibilityListener(this)
        playerView!!.requestFocus()

        val bandwidthMeter = DefaultBandwidthMeter()
        val videoTrackSelectionFactory = AdaptiveTrackSelection.Factory(bandwidthMeter)
        mTrackSelector = DefaultTrackSelector(videoTrackSelectionFactory)

        mPlayer = ExoPlayerFactory.newSimpleInstance(activity, mTrackSelector)
        playerView!!.player = mPlayer

        mPlayerAdapter = LeanbackPlayerAdapter(activity, mPlayer, UPDATE_DELAY)
        mPlayerGlue = VideoPlayerGlue(activity!!, mPlayerAdapter!!)
        mPlayerGlue!!.host = VideoSupportFragmentGlueHost(this)
        mPlayerGlue!!.playWhenPrepared()

        play(s1)
    }
}
Commenting out these lines:
mPlayerAdapter = LeanbackPlayerAdapter(activity, mPlayer, UPDATE_DELAY)
mPlayerGlue = VideoPlayerGlue(activity!!, mPlayerAdapter!!)
mPlayerGlue!!.host = VideoSupportFragmentGlueHost(this)
mPlayerGlue!!.playWhenPrepared()
makes the video play in landscape, but then the player controls are hidden and only the lowest quality of the video plays. Please help us with this.
The metadata of the MP4 video contains a property called Rotation=90°, but it is ignored by ExoPlayer. To fix it, you need to handle this Java callback in your code:
void onVideoSizeChanged(int width,
                        int height,
                        int unappliedRotationDegrees, // 90° or 270°
                        float pixelWidthHeightRatio);
This allows an application using TextureView to easily apply the rotation by making the appropriate call to TextureView.setTransform. Note that on Lollipop+ unappliedRotationDegrees will always be equal to 0.
You can find this function at line #74 of the MediaCodecVideoTrackRenderer page on GitHub.
If the above-mentioned method doesn't work for you, you may find another remedy in the Rotation Issue #91 thread on GitHub.
As far as I know, ExoPlayer generates its size based on the texture view's size. So try to programmatically resize your texture view with
playerView.setResizeMode(AspectRatioFrameLayout.RESIZE_MODE_FILL);
and also try to resize the player's video output programmatically:
mPlayer.setVideoScalingMode(C.VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING);
Hope this will help.
How do you change the default camera app on a Windows 10 desktop? The option is available in Settings on phones but not on desktops.
If you want to do 'advanced' photo capture, you can use the MediaCapture class. Everything about this you will find at MSDN. There are also quite nice samples at GitHub.
It also seems that my old post for WinRT is still quite relevant. You will find there that I'm using GetCameraID:
private static async Task<DeviceInformation> GetCameraID(Windows.Devices.Enumeration.Panel desired)
{
    // enumerate all video capture devices and pick the one on the desired panel
    DeviceInformation deviceID = (await DeviceInformation.FindAllAsync(DeviceClass.VideoCapture))
        .FirstOrDefault(x => x.EnclosureLocation != null && x.EnclosureLocation.Panel == desired);

    if (deviceID != null) return deviceID;
    else throw new Exception(string.Format("Camera of type {0} doesn't exist.", desired));
}
to choose the device used to capture the photo. In your app you can enumerate the devices and choose the one that suits you.
I am developing my app for iOS 10, but the default iOS functionality is not working well: I am not able to access the camera, microphone, or media library. It crashes every time. I have written it all, but nothing is working.
// inside a switch on the authorization status:
case .Authorized:
    picker!.sourceType = UIImagePickerControllerSourceType.PhotoLibrary
    if UIDevice.currentDevice().userInterfaceIdiom == .Phone {
        self.presentViewController(picker!, animated: true, completion: nil)
    }
    //handle authorized status

case .Denied, .Restricted:
    print("Denied")
    let alertController = UIAlertController(title: appName, message: "Go to Settings?", preferredStyle: .Alert)
    let settingsAction = UIAlertAction(title: "Settings", style: .Default) { (_) -> Void in
        let settingsUrl = NSURL(string: UIApplicationOpenSettingsURLString)
        if let url = settingsUrl {
            UIApplication.sharedApplication().openURL(url)
        }
    }
    let cancelAction = UIAlertAction(title: "Cancel", style: .Default, handler: nil)
    alertController.addAction(settingsAction)
    alertController.addAction(cancelAction)
    presentViewController(alertController, animated: true, completion: nil)
A significant change in iOS 10 is that you must declare ahead of time any access to private data or your app will crash. The fix is quick but easy to overlook if the usage is not a major feature of an app, so here is your reminder if you are planning an iOS 10 migration.
Don’t Forget Your Purpose Strings
Once you link with iOS 10 you must declare access to any user private data types. You do this by adding a usage key to your app’s Info.plist together with a purpose string. The list of frameworks that count as private data is a long one:
Contacts, Calendar, Reminders, Photos, Bluetooth Sharing, Microphone, Camera, Location, Health, HomeKit, Media Library, Motion, CallKit, Speech Recognition, SiriKit, TV Provider.
If you are using one of these frameworks and fail to declare the usage, your app will crash when it first makes the access. The crash log helpfully tells you which key you are missing. For example, this is the result of accessing the camera without adding the key to Info.plist:
This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app’s Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data.
To avoid the crash, we need to add the suggested key to Info.plist (Xcode 8 already contains the full list of possible keys). For example, for camera access:
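The description string below is only an illustrative placeholder; write one that explains your app's actual use:
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to scan barcodes.</string>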
The system shows the purpose string when asking the user to allow access, so you may want to localize it.
The direction from Apple is clear: if you access private data, declare your intentions up front or expect your app to crash.
You can check all the privacy settings keys in the Apple documentation: Privacy setting keys for iOS 10.
I'm working on an iPad.
I would like to detect when the user unplugs the headphones. At first I used a listener on the property kAudioSessionProperty_AudioRouteChange, and all was working well until I decided to add a button to switch to the speakers while the headphones were still plugged in. I'm now facing a problem; maybe someone has an idea to fix it.
Here is the scenario:
I plug in headphones -> my audio route change callback is called
then I switch the sound to the speakers (without unplugging the headphones) -> the audio route change callback is called
then I unplug the headphones (while sound is still going to the speakers) -> the audio route change callback is NOT called, which seems logical.
But here is my problem! So my question is: do you see a way to detect that the headphones were unplugged in this last case?
Thanks for your help
EDIT:
OK, I found a workaround.
To detect whether or not headphones are plugged in, I run a test function every time I need to know (instead of relying on a boolean). This might be worse for performance, but it works. Here is my code for those who may need it:
//set back the default audio route
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_None;
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

//check whether this default audio route is Headphones or Speaker
CFStringRef newAudioRoute;
UInt32 newAudioRouteSize = sizeof(newAudioRoute);
AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &newAudioRouteSize, &newAudioRoute);
// __bridge_transfer hands ownership to ARC, so no explicit CFRelease is needed
NSString *newAudioRouteString = (__bridge_transfer NSString *)newAudioRoute;

//if the default audio route is not Headphones, no headphones are plugged in
if ([newAudioRouteString rangeOfString:@"Headphones"].location != NSNotFound) {
    NSLog(@"Earphone available");
    return true;
}
else {
    NSLog(@"No Earphone available");
    return false;
}
Hope it will help someone!
I imagine a solution in the following way: you create a boolean for the speakers in the AppDelegate, say BOOL isSpeakerOn, and every time the audio route callback is called you check the current speaker situation and decide what to do; a sketch of the route check follows below.
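For what it's worth, on newer iOS versions the deprecated C Audio Session API can be avoided entirely. A minimal sketch in Swift, assuming AVAudioSession's route-change notification and currentRoute (note that, as with the workaround above, the override-to-speaker case may still require testing the route explicitly):
NSNotificationCenter.defaultCenter().addObserverForName(AVAudioSessionRouteChangeNotification,
                                                        object: nil, queue: nil) { _ in
    // check whether any current output is the wired headphones port
    let outputs = AVAudioSession.sharedInstance().currentRoute.outputs
    let headphonesPlugged = outputs.contains { $0.portType == AVAudioSessionPortHeadphones }
    print(headphonesPlugged ? "Earphone available" : "No Earphone available")
}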
This is the best tutorial dealing with this issue:
http://www.techotopia.com/index.php/Detecting_when_an_iPhone_Headphone_or_Docking_Connector_is_Unplugged_(iOS_4)