Route audio to specific device output on OS X - objective-c

I am looking to add a preview channel to an AV project, so that a video or audio file can be playing on the master output channel while the user previews a separate audio or video file on a different channel.
Naturally the prerequisite is that the user has a sound device capable of multiple channels, or a separate device for the master output and another for the preview. For example, the output could go to a USB audio device that provides balanced audio output, while the preview could go via the standard headphone jack.
I am struggling to find the right documentation on enumerating the sound devices connected to the Mac and then pointing AVAudioPlayer, AVPlayer/AVPlayerLayer, or an underlying API at one of them.
Anyone know where I should be looking?
Is this a case of discovering the sound devices, selecting the appropriate one, and then using something like NSSound to set the device before I start playing the file? What happens to files that are already playing?
Thanks

Note that AVAudioSession is iOS-only.
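On the NSSound route mentioned in the question: on OS X, NSSound can target a specific output device through its playbackDeviceIdentifier property. A minimal sketch, assuming deviceUID holds a Core Audio device UID obtained as described in the answer below:

    NSSound *preview = [[NSSound alloc] initWithContentsOfFile:@"/path/to/preview.aiff"
                                                   byReference:YES];
    preview.playbackDeviceIdentifier = deviceUID; // Core Audio device UID string
    [preview play];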

You can change the audio output device for a given AVPlayer instance by setting its audioOutputDeviceUniqueID property to the unique ID of the desired device. The same approach works for AVAudioPlayer.
I can confirm that it works as expected on OS X 10.11.6, using key-value coding (setValue:forKey:); a short usage sketch follows Apple's documentation excerpt below. The AVAudioPlayer API also has a dedicated method for this: setCurrentDevice:.
The messiest part of the process is obtaining the list of valid unique IDs for the current system. nil always means "System Default", but any other specific device is identified by a CFString that appears to be constructed by Core Audio at runtime from a combination of the manufacturer's name, the product name, and a number of other opaque codes. The process for getting the current list is described here: AudioObjectGetPropertyData to get a list of input devices
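A minimal sketch of that enumeration step, using the plain Core Audio C API (callable from Objective-C; error handling omitted):

    #import <Foundation/Foundation.h>
    #import <CoreAudio/CoreAudio.h>

    // Print the UID of every audio device on the system (inputs and outputs alike;
    // to keep only outputs, additionally check for streams in the output scope).
    static void ListAudioDeviceUIDs(void) {
        AudioObjectPropertyAddress addr = {
            kAudioHardwarePropertyDevices,
            kAudioObjectPropertyScopeGlobal,
            kAudioObjectPropertyElementMaster
        };
        UInt32 size = 0;
        AudioObjectGetPropertyDataSize(kAudioObjectSystemObject, &addr, 0, NULL, &size);
        UInt32 count = size / sizeof(AudioDeviceID);
        AudioDeviceID *devices = malloc(size);
        AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr, 0, NULL, &size, devices);

        AudioObjectPropertyAddress uidAddr = {
            kAudioDevicePropertyDeviceUID,
            kAudioObjectPropertyScopeGlobal,
            kAudioObjectPropertyElementMaster
        };
        for (UInt32 i = 0; i < count; i++) {
            CFStringRef uid = NULL;
            UInt32 uidSize = sizeof(uid);
            AudioObjectGetPropertyData(devices[i], &uidAddr, 0, NULL, &uidSize, &uid);
            NSLog(@"Device %u UID: %@", (unsigned)devices[i], (__bridge NSString *)uid);
            if (uid) CFRelease(uid);
        }
        free(devices);
    }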
Apple's documentation on audioOutputDeviceUniqueID:
Instance Property
audioOutputDeviceUniqueID
Specifies the unique ID of the Core Audio output device used to play audio.
Declaration
@property(nonatomic, copy) NSString *audioOutputDeviceUniqueID;
Discussion
The default value of this property is nil, indicating that the default audio output device is used. Otherwise the value of this property is a string containing the unique ID of the Core Audio output device to be used for audio output.
Core Audio's kAudioDevicePropertyDeviceUID is a suitable source of audio output device unique IDs.
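Putting the two together, a short usage sketch (player, audioPlayer, and deviceUID are assumed to exist; deviceUID comes from the enumeration above):

    // AVPlayer: public property on OS X 10.9+, or settable via KVC as noted above.
    player.audioOutputDeviceUniqueID = deviceUID;
    [player setValue:deviceUID forKey:@"audioOutputDeviceUniqueID"];

    // AVAudioPlayer has a dedicated accessor for the same purpose.
    [audioPlayer setCurrentDevice:deviceUID];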

Please check EZAudio.
It provides EZMicrophone and EZOutput.
Using these, you can route audio to a specific output device on OS X.
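A rough sketch of what device selection looks like with EZAudio (this assumes the EZOutput/EZAudioDevice API of EZAudio 1.x; the "USB" name match is just a placeholder criterion, and the data source setup is elided):

    #import "EZAudio.h"

    // Pick an output device by name from the list EZAudio exposes,
    // then point the shared EZOutput instance at it.
    for (EZAudioDevice *device in [EZAudioDevice outputDevices]) {
        if ([device.name rangeOfString:@"USB"].location != NSNotFound) {
            [[EZOutput sharedOutput] setDevice:device];
            break;
        }
    }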

Related

Get information from /dev/sda

I am able to detect whether a USB device is connected to my embedded device immediately after boot, using the
access(path, F_OK);
function, where path is /dev/sdaX with X = 1, 2, ...
The next step is to find out from /dev/sdaX which type of device is connected (e.g. pen drive, mouse, etc.). Maybe I could use udev and/or libudev, but in that case I lose the information about which sdaX is being referred to.
Any suggestions? I am looking at some IOCTLs, but I have not found anything that returns e.g. the vendor ID.
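For what it's worth, libudev can recover exactly that mapping, since it can look a device up from the /dev/sdaX node itself. A sketch (assumes libudev is available and the program is linked with -ludev; ID_VENDOR_ID and ID_MODEL are standard udev properties for USB storage devices):

    #include <stdio.h>
    #include <sys/stat.h>
    #include <libudev.h>

    /* Print udev vendor/model properties for a block device node like /dev/sda1. */
    int print_usb_info(const char *path) {
        struct stat st;
        if (stat(path, &st) != 0)
            return -1;

        struct udev *udev = udev_new();
        struct udev_device *dev =
            udev_device_new_from_devnum(udev, 'b', st.st_rdev); /* 'b' = block device */

        if (dev) {
            const char *vendor = udev_device_get_property_value(dev, "ID_VENDOR_ID");
            const char *model  = udev_device_get_property_value(dev, "ID_MODEL");
            printf("%s: vendor=%s model=%s\n", path,
                   vendor ? vendor : "?", model ? model : "?");
            udev_device_unref(dev);
        }
        udev_unref(udev);
        return 0;
    }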

WebRTC Changing Media Streams on the Go

Now that device enumeration is present in Chrome, I know I can select a device during getUserMedia negotiation. I was also wondering whether I could switch devices in the middle of a call (queue up a local track and switch tracks, or do I have to renegotiate the stream)? I am not sure whether this is still blocked or is now allowed.
I have tried to make a new track, but I can't figure out how to switch the track on the fly. I know this was previously impossible, but I was wondering whether it is possible now.
I have the same requirement: I have to record video using MediaRecorder, and for this I am using navigator.getUserMedia with audio and video constraints. You can switch video or audio tracks dynamically by getting the available devices from navigator.mediaDevices.enumerateDevices(), attaching the respective device to the constraints, and calling navigator.getUserMedia again with the new constraints. The point to note when doing this is that you have to kill the existing tracks first using the track.stop() method.
You can see my example here.
StreamTrack's readyState is getting changed to ended, just before playing the stream (MediaStream - MediaStreamTrack - WebRTC)
In Firefox, you can use the RTCRtpSender object to call replaceTrack() to replace a track on the fly (with no renegotiation). This should eventually be supported by other browsers as part of the spec.
Without replaceTrack(), you can remove the old stream, add a new one, handle onnegotiationneeded, and let the client process the change in streams.
See the replaceTrack() documentation on MDN: https://developer.mozilla.org/en-US/docs/Web/API/RTCRtpSender/replaceTrack
Have you tried calling getUserMedia() when you want to change to a different device?
There's an applyConstraints() method in the Media Capture and Streams spec that makes it possible to change constraints on the fly, but it hasn't been implemented yet:
dev.w3.org/2011/webrtc/editor/getusermedia.html#the-model-sources-sinks-constraints-and-states
dev.w3.org/2011/webrtc/editor/getusermedia.html#methods-1

Play encrypted video with AVPlayer

I'm implementing an application that contains a video player. For various reasons the video files are encrypted with AES, and they can be rather big, so loading an entire file into RAM in one piece is not an option. I'm looking for some way to play them with AVPlayer.
Tried:
1) Custom NSURLProtocol as suggested here http://aptogo.co.uk/2010/07/protecting-resources/
Didn't work; I suspect that AVPlayer uses its own protocol handling and mine never gets called.
2) Use AVAsset to chop the video into small chunks and then feed them to AVPlayer. This failed because there is no API in AVPlayer for that.
Any workaround would be greatly appreciated :)
You have two options:
If targeting iOS 7 or newer, check out AVAssetResourceLoaderDelegate (see the sketch below). It allows you to do what you would with a custom NSURLProtocol, but specifically for AVPlayer.
Emulate an HTTP server with support for the Range header and point the AVURLAsset at localhost.
I implemented #2 before and can provide more info if needed.
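To make option 1 concrete, a bare-bones sketch of an AVAssetResourceLoaderDelegate (the EncryptedAssetLoader class, the encrypted:// scheme, and the decryptChunkAtOffset:length: helper are all placeholders; the AES code itself is elided):

    #import <AVFoundation/AVFoundation.h>

    @interface EncryptedAssetLoader : NSObject <AVAssetResourceLoaderDelegate>
    // Hypothetical helper: reads the requested byte range from disk and AES-decrypts it.
    - (NSData *)decryptChunkAtOffset:(long long)offset length:(NSInteger)length;
    @end

    @implementation EncryptedAssetLoader
    - (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader
    shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest {
        AVAssetResourceLoadingDataRequest *dataRequest = loadingRequest.dataRequest;
        // Decrypt only the requested range, so the whole file never sits in RAM.
        NSData *clear = [self decryptChunkAtOffset:dataRequest.requestedOffset
                                            length:dataRequest.requestedLength];
        [dataRequest respondWithData:clear];
        [loadingRequest finishLoading];
        return YES;
    }
    @end

    // A custom (non-http) scheme makes AVFoundation route requests to the delegate:
    EncryptedAssetLoader *loader = [EncryptedAssetLoader new]; // keep a strong reference
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:@"encrypted://movie.mp4"]
                                            options:nil];
    [asset.resourceLoader setDelegate:loader queue:dispatch_get_main_queue()];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:[AVPlayerItem playerItemWithAsset:asset]];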
I just downloaded the Apple sample project https://developer.apple.com/library/ios/samplecode/sc1791/Listings/ReadMe_txt.html and it seems to do exactly what you want.
The delegate catches each of the AVURLAsset's AVAssetResourceLoader calls and makes up a brand-new .m3u8 file with a custom decryption key in it.
Then it feeds the player all the .ts file URLs in the .m3u8.
The project is a good overview of what it is possible to do with HLS feeds.

How to use the USB/HID port with objective-c under a Mac environment

I am trying to implement an application on Snow Leopard that reads data from a USB/HID device. In my application I tried the following steps:
IOHIDManagerCreate()
CreateDeviceMatchingDictionary()
IOHIDManagerSetDeviceMatching()
IOHIDManagerOpen()
IOHIDManagerCopyDevices()
Create a reference for the device (IOHIDDeviceRef)
Based on the IOHIDDeviceRef, fetch device details such as the product ID key, vendor ID key, product key, serial number key, version number key, etc.
IOHIDDeviceOpen(), i.e. open the device using the IOHIDDeviceRef
IOHIDDeviceCopyMatchingElements(), i.e. copy the matching elements from the device
Create an element reference (IOHIDElementRef)
Using the IOHIDElementRef, retrieve the device usage, cookie, usage page, etc.
Up to this point my application works fine.
My questions are:
How can I read data from endpoint 1? My device is a special-purpose device with only one endpoint (an interrupt, no-synchronization data endpoint).
Is there any pipe associated with endpoint 1 (HID device)?
Are there any ReadPipe and WritePipe functions in a HID-Manager-based application?
Is it possible to retrieve data from USB/HID using IOHIDDeviceSetReportWithCallback()? (See the sketch after this question.)
Everything I did is based on this link:
http://developer.apple.com/library/mac/#documentation/DeviceDrivers/Conceptual/HID/new_api_10_5/tn2187.html#//apple_ref/doc/uid/TP40000970-CH214-SW7 ...
Thank you so much for your help.
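On the last question: with the HID Manager, the usual way to read interrupt-IN data is to register an input report callback (IOHIDDeviceSetReportWithCallback() sends reports to the device rather than reading them). A sketch, assuming deviceRef is the IOHIDDeviceRef already opened in the steps above:

    #import <Foundation/Foundation.h>
    #import <IOKit/hid/IOHIDManager.h>

    // Invoked on the run loop each time the device delivers an input report
    // (i.e. the data arriving on the interrupt-IN endpoint).
    static void InputReportCallback(void *context, IOReturn result, void *sender,
                                    IOHIDReportType type, uint32_t reportID,
                                    uint8_t *report, CFIndex reportLength) {
        NSLog(@"Got input report %u, %ld bytes", reportID, (long)reportLength);
    }

    static void StartListening(IOHIDDeviceRef deviceRef) {
        // The buffer must outlive the registration and match the device's
        // maximum input report size (64 here is a placeholder).
        static uint8_t reportBuffer[64];
        IOHIDDeviceRegisterInputReportCallback(deviceRef, reportBuffer,
                                               sizeof(reportBuffer),
                                               InputReportCallback, NULL);
        IOHIDDeviceScheduleWithRunLoop(deviceRef, CFRunLoopGetCurrent(),
                                       kCFRunLoopDefaultMode);
    }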
You are trying to use HID functions with a non-HID device.
HID means Human Interface Device, and any HID device must conform to the "Device Class Definition for HID".
I suspect that your device does not conform to this specification.
So you should use a different OS interface to work with your device. I suggest libusb, a cross-platform library for working with USB devices at a low level. It will allow you to read and write your endpoint directly.
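For completeness, a sketch of the libusb route (the 0x1234/0x5678 vendor/product IDs and the 0x81 endpoint address are placeholders for your device's actual values):

    #include <stdio.h>
    #include <libusb-1.0/libusb.h>

    int main(void) {
        libusb_context *ctx = NULL;
        libusb_init(&ctx);

        libusb_device_handle *handle =
            libusb_open_device_with_vid_pid(ctx, 0x1234, 0x5678);
        if (!handle) { libusb_exit(ctx); return 1; }

        libusb_claim_interface(handle, 0);

        unsigned char buf[64];
        int transferred = 0;
        // Blocking read from interrupt-IN endpoint 1 (address 0x81), 1 s timeout.
        int rc = libusb_interrupt_transfer(handle, 0x81, buf, sizeof(buf),
                                           &transferred, 1000);
        printf("rc=%d, read %d bytes\n", rc, transferred);

        libusb_release_interface(handle, 0);
        libusb_close(handle);
        libusb_exit(ctx);
        return 0;
    }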

Start playing streaming audio on symbian

The tiny question is:
How do I start (RealPlayer?) playing a given online resource (e.g. http://example.com/file.mp3)?
PyS60, C++, or C# via RedFiveLabs would do.
EDIT1: Title changed from "Start RealPlayer on symbian" to the more appropriate.
I think the title is a little misleading if you just want to play back media content and not use a particular application for it.
In C++ there is CMdaAudioPlayerUtility::OpenUrlL(), but it is not widely implemented. For example, on S60 it will complete with a KErrNotSupported status. To play files you can use other open functions in CMdaAudioPlayerUtility, such as OpenFileL() or OpenDesL(), but you need a separate mechanism for retrieving the files, or at least the bytes, onto the device.
There is also CVideoPlayerUtility::OpenUrlL(), which supports RTSP audio streams but not HTTP.