How do you designate audio device channels when using AVPlayer? - objective-c

I am attempting to designate which output channels of an audio interface are used for audio during video playback on macOS. I am using AVPlayer to play a video file:
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem]; // playerItem created from a file path
player.audioOutputDeviceUniqueID = deviceID; // deviceID determined by passing kAudioObjectPropertyName to AudioObjectGetPropertyData and inspecting device name, e.g. "Babyface Pro"
/* select hardware channels? */
[player play];
The AVPlayer defaults to playing channels 1/2 of my Babyface Pro, but I'd like to select, for instance, channels 3/4. I cannot select these channel routes globally (i.e. through TotalMix FX) since I am using more than one AVPlayer and selecting separate channels for each.
I came across a nearly identical question; however, that question only involved playing audio, not video.
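For reference, audioOutputDeviceUniqueID expects the device's CoreAudio UID string, not its human-readable name. A minimal sketch of the lookup described in the code comment above (UIDForDeviceNamed is a hypothetical helper; error handling omitted):
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudio.h>

// Enumerate all audio devices, match by name, and return the matching device's UID.
static NSString *UIDForDeviceNamed(NSString *name) {
    AudioObjectPropertyAddress addr = {
        kAudioHardwarePropertyDevices,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    UInt32 size = 0;
    AudioObjectGetPropertyDataSize(kAudioObjectSystemObject, &addr, 0, NULL, &size);
    UInt32 count = size / sizeof(AudioDeviceID);
    AudioDeviceID devices[count];
    AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr, 0, NULL, &size, devices);

    for (UInt32 i = 0; i < count; i++) {
        CFStringRef deviceName = NULL;
        UInt32 propSize = sizeof(deviceName);
        addr.mSelector = kAudioObjectPropertyName;
        AudioObjectGetPropertyData(devices[i], &addr, 0, NULL, &propSize, &deviceName);
        BOOL match = deviceName && [(__bridge NSString *)deviceName isEqualToString:name];
        if (deviceName) CFRelease(deviceName);
        if (!match) continue;

        CFStringRef uid = NULL;
        propSize = sizeof(uid);
        addr.mSelector = kAudioDevicePropertyDeviceUID;
        AudioObjectGetPropertyData(devices[i], &addr, 0, NULL, &propSize, &uid);
        return uid ? (__bridge_transfer NSString *)uid : nil;
    }
    return nil;
}
player.audioOutputDeviceUniqueID = UIDForDeviceNamed(@"Babyface Pro");
Note that this only selects the device; routing to specific hardware channels (3/4 instead of 1/2) is exactly the part AVPlayer does not appear to expose, which is the open question here.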

Related

pause only one AVAudioPlayerNode

I am encountering some problems with AVFoundation and sound spatialization.
I have several player instances (AVAudioPlayerNode) attached to the audio engine.
They are connected to one environment node (AVAudioEnvironmentNode); the player nodes play short sounds. Finally, the environment node is connected to the main mixer:
[engine mainMixerNode];
It works great, but when I decide to pause one particular player with
[player pause];
It pauses that player but also all the other player nodes.
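For context, here is a minimal sketch of the graph described above (one player shown for brevity; all names are illustrative). Pausing a single AVAudioPlayerNode should not affect its siblings:
#import <AVFoundation/AVFoundation.h>

AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioEnvironmentNode *environment = [[AVAudioEnvironmentNode alloc] init];
[engine attachNode:environment];
[engine connect:environment to:[engine mainMixerNode] format:nil];

AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[engine attachNode:player];
// Sources must be mono to be spatialized by an environment node.
AVAudioFormat *mono = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100.0 channels:1];
[engine connect:player to:environment format:mono];

NSError *error = nil;
[engine startAndReturnError:&error];
[player play];
[player pause]; // should pause only this node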

AVPlayer long song buffering issue

I've got an issue with long songs while using AVPlayer.
I have tested it on a 64 min song.
The issue is: when the AVPlayer buffer is full, it stops playback (rate = 0.0f) and stops downloading new time ranges. When I resume playback manually, it plays for a few seconds and then stops again. I think it does continue downloading new content into the buffer, but the process is very slow and not suitable for gapless playback.
Is it possible to control this situation to achieve gapless playback?
Am I allowed to modify loaded time ranges (clean the buffer) during playback?
Am I allowed to increase buffer size?
Are you running it on the main thread? Try something like this:
#include <dispatch/dispatch.h>
// Run playback on a background serial queue so it doesn't block the main thread.
dispatch_queue_t playQueue = dispatch_queue_create("com.example.playqueue", NULL);
AVAudioPlayer *player = ...
dispatch_async(playQueue, ^{
    [player play];
});
If that doesn't work, I'd suggest giving OpenAL a try.
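Separately, newer SDKs (iOS 10 / macOS 10.12 and later) expose direct buffering controls that speak to the original questions; a hedged sketch, with songURL as a placeholder:
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:songURL];
// A hint (in seconds) asking for a larger forward buffer; the system may adjust it.
item.preferredForwardBufferDuration = 60.0;

AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
// YES (the default) lets AVPlayer delay playback until it can proceed without stalling.
player.automaticallyWaitsToMinimizeStalling = YES;
[player play];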

Setup separate app volume controls

How can I control my app's output volume?
I've got an app that uses https://github.com/mattgallagher/AudioStreamer to stream mp3 files from the internet. The AudioStreamer class does not provide a way to change the output volume, and I don't want to change the system volume.
Many apps do this:
iTunes
Spotify
MPlayerX
Most Audio Players
Edit: If you're here because of AudioStreamer, I've since switched to Apple's AVPlayer, which I've found far simpler and superior. Easy volume adjustment, too!
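For completeness, AVPlayer exposes a volume property that is independent of the system volume, so with the switch mentioned above the adjustment is a one-liner; streamURL is a placeholder:
AVPlayer *player = [AVPlayer playerWithURL:streamURL];
player.volume = 0.5f; // 0.0 (silent) to 1.0 (full volume), per player, not system-wide
[player play];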
AudioStreamer, and I'm guessing most OS X media players, use the AudioToolbox framework. AudioToolbox uses a programming interface called Audio Queue to play back media files. Here is how to adjust the volume using an audio queue:
AudioQueueSetParameter(audioQueue, kAudioQueueParam_Volume, 0.5);
audioQueue is an AudioQueueRef
kAudioQueueParam_Volume tells AudioQueueSetParameter() to change the volume parameter
0.5 is the volume, on a scale from 0.0 to 1.0
More details on AudioQueue: https://developer.apple.com/library/mac/#documentation/MusicAudio/Reference/AudioQueueReference/Reference/reference.html
You can use AVAudioPlayer; it has an instance method setVolume: to set the output volume:
AVAudioPlayer * audioPlayer = ...
float volume = aVolumeValue / 100.f; // aVolumeValue can be 0~100
[audioPlayer setVolume:volume];
[audioPlayer play];
By the way, you can store aVolumeValue in NSUserDefaults and control it with a UISlider object.

How to create a simultaneously playing audio stack from different audio files in Cocoa?

I'm working on a Mac OS X application using Cocoa in Xcode. One feature involves simultaneous audio playback:
I want to build some kind of audio stack: an audio file merged at runtime from a set of different source files. The set of source files differs on each run, but every audio file has exactly the same length.
After creating the audio stack (or merged file), I want to play [and store] it.
I'm new to the audio frameworks in Cocoa. Is there a high-level API that provides the appropriate functionality? Should I look at Core Audio, Audio Units, or QTKit? Do you have an implementation idea (or a sample implementation)?
If you just want to play a bunch of audio files simultaneously, this is easy; unlike Windows, OS X acts as if it has an infinite-channel mixer built in, and all the higher-level functions just grab a new channel as needed for each sound. For example:
NSURL *u1 = [[NSBundle mainBundle] URLForResource:@"1" withExtension:@"mp3"];
NSURL *u2 = [[NSBundle mainBundle] URLForResource:@"2" withExtension:@"mp3"];
NSSound *s1 = [[NSSound alloc] initWithContentsOfURL:u1 byReference:YES];
NSSound *s2 = [[NSSound alloc] initWithContentsOfURL:u2 byReference:YES];
[s1 play];
[s2 play];
This will start playing MyApp.app/Contents/Resources/1.mp3 and MyApp.app/Contents/Resources/2.mp3 at (very close to) the same time.
If you need a (real or virtual) file with a bunch of audio tracks in them, QTKit is probably the easiest way; create a movie, open all of your audio files, copy their tracks into the movie (with the same start date), and now you can play the movie (or do whatever else you want).
If you want to actually merge the audio into one stereo track (e.g., so you can save a normal audio file), instead of just playing the tracks together at runtime, you could use QTKit as above and create an export session (much like using the "Export…" feature in QuickTime Player), or use CoreAudio, or a variety of different cross-platform open source libraries.
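Since QTKit has since been deprecated, here is a hedged sketch of the same merge-and-export idea using AVFoundation's composition API instead (url1, url2, and outputURL are placeholders; error handling omitted):
#import <AVFoundation/AVFoundation.h>

AVMutableComposition *composition = [AVMutableComposition composition];
NSArray<NSURL *> *sources = @[url1, url2]; // equal-length source files

for (NSURL *url in sources) {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVAssetTrack *srcTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    AVMutableCompositionTrack *dstTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    // Insert every track at time zero so they all play simultaneously.
    [dstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                      ofTrack:srcTrack
                       atTime:kCMTimeZero
                        error:NULL];
}

// Play the stack...
AVPlayer *player = [AVPlayer playerWithPlayerItem:
                       [AVPlayerItem playerItemWithAsset:composition]];
[player play];

// ...or export it as a single audio file.
AVAssetExportSession *export =
    [AVAssetExportSession exportSessionWithAsset:composition
                                      presetName:AVAssetExportPresetAppleM4A];
export.outputURL = outputURL;
export.outputFileType = AVFileTypeAppleM4A;
[export exportAsynchronouslyWithCompletionHandler:^{ /* check export.status */ }];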

AVQueuePlayer playing items simultaneously rather than in sequence

I am trying to put together a simple app to play a pre-roll video followed by some content video.
Currently, I'm trying to use the AVQueuePlayer class to get this done. Unfortunately, it doesn't seem to want to play the videos in proper sequence.
For example, the pre-roll plays by itself for a few seconds, then (prior to the pre-roll being complete) it starts to try and play the content video. For a few seconds, the player seems to be at war with itself and switches back and forth between the two videos (neither playing very well at all). Finally, when the pre-roll is finished, the content video then plays the rest of the way normally.
I've looked through the documentation for the AVQueuePlayer and I don't see anything obvious that I'm missing.
My code is pretty basic:
AVPlayerItem *preRollItem = [AVPlayerItem playerItemWithURL: preRollUrl];
AVPlayerItem *contentItem = [AVPlayerItem playerItemWithURL: contentUrl];
self.player = [AVQueuePlayer queuePlayerWithItems:[NSArray arrayWithObjects:preRollItem, contentItem, nil]];
[self.player play];
What is the trick to getting the videos to play in sequence?
Make sure you are actually testing on a device. In my experience, the iOS 5 simulator has big problems with AVQueuePlayer and does bizarre things.
I found the iOS 4.3 simulator does a much better job when it comes to testing AVFoundation.