OpenAL sounds don't play after an incoming call, until app restart - Objective-C

I'm playing game sounds using OpenAL, and background music using standard AV. Recently I've found that after an
incoming call, none of the OpenAL sounds work, while the background music keeps playing. If I force-quit the app and start it again,
the sounds come back. Does anybody happen to know what happens to OpenAL during/after an incoming call?

OK, it seems I've found a solution.
I'm using an Objective-C sound manager, so I just added the beginInterruption and endInterruption delegate methods of AVAudioSession (and AVAudioPlayer) to my class.
beginInterruption looks like:
alcMakeContextCurrent(NULL);
and endInterruption looks something like:
NSError *audioSessionError = nil;
[audioSession setCategory:soundCategory error:&audioSessionError];
if (audioSessionError)
{
    Log(@"ERROR - SoundManager: Unable to set the audio session category");
    return;
}

// Set the audio session state to active and report any errors
audioSessionError = nil;
[audioSession setActive:YES error:&audioSessionError];
if (audioSessionError)
{
    Log(@"ERROR - SoundManager: Unable to set the audio session state to YES with error %d.", (int)audioSessionError.code);
    return;
}

// Music players handling
BOOL plays = NO;
if (musicPlayer[currentPlayer] != nil)
    plays = [musicPlayer[currentPlayer] isPlaying];
if (musicPlayer[currentPlayer] != nil && !plays)
    [musicPlayer[currentPlayer] play];

alcMakeContextCurrent(context);
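Note that these delegate methods only fire if the sound manager is actually registered as the session's delegate. A minimal sketch of that wiring, assuming the class adopts the (pre-iOS 6) AVAudioSessionDelegate protocol; the class name here is illustrative:
@interface MySoundManager : NSObject <AVAudioSessionDelegate>
@end

// Somewhere in the sound manager's setup:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setDelegate:self]; // beginInterruption/endInterruption now fire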
Yes, this works if you're using only OpenAL sounds. But to play long tracks you should use AVAudioPlayer.
And here's the Apple magic again! If you play music along with OpenAL sounds, something odd happens.
Cancel the incoming call, and AVAudioSessionDelegate's endInterruption (along with AVAudioPlayerDelegate's audioPlayerEndInterruption) will never be called. Only beginInterruption fires, never the end.
Even AppDelegate's applicationWillEnterForeground will not be called, so the app simply doesn't know that we've returned.
But the good news is that you can call your endInterruption in the AppDelegate's applicationDidBecomeActive: method, and the OpenAL context will be restored. And this works!
- (void)applicationDidBecomeActive:(UIApplication *)application
{
    if (MySoundMngr != nil)
    {
        [MySoundMngr endInterruption];
    }
    // Restart any tasks that were paused and so on...
}

I had a hard time figuring this out, so I wanted to add my answer here. This is all specifically in Xamarin, but I suspect it applies generally and is similar to @Tertium's answer.
You can prevent iOS from interrupting your audio in some situations (e.g., getting a phone call but declining it) by calling AVAudioSession.SharedInstance().SetPrefersNoInterruptionsFromSystemAlerts(true, out NSError err);
You will still be interrupted in some situations (e.g., you accept a phone call). To catch these, you must call AVAudioSession.Notifications.ObserveInterruption(myAudioInterruptionHandler); when you launch your app.
Inside this handler, you can determine if you're shutting down or coming back like so:
void myAudioInterruptionHandler(object sender, AVAudioSessionInterruptionEventArgs args) {
    args.Notification.UserInfo.TryGetValue(
        new NSString("AVAudioSessionInterruptionTypeKey"),
        out NSObject typeKey
    );
    bool isBeginningInterruption = (typeKey.ToString() == "1");
    // ...
}
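For reference outside Xamarin, the same registration in native Objective-C is a stock AVFoundation notification; a minimal sketch, not taken from the Xamarin API above:
[[NSNotificationCenter defaultCenter]
    addObserverForName:AVAudioSessionInterruptionNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
        // AVAudioSessionInterruptionTypeBegan == 1, which matches the "1" check above
        NSUInteger type = [note.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
        BOOL isBeginningInterruption = (type == AVAudioSessionInterruptionTypeBegan);
        // handle begin/end here
    }];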
Interruption Begins
When the interruption begins, stop whatever audio is playing (you'll need to handle this on your own based on your app, but probably by calling AL.SourceStop on everything).
Then, critically,
ContextHandle audioContextHandle = Alc.GetCurrentContext();
Alc.MakeContextCurrent(ContextHandle.Zero);
If you don't do this right away, iOS will fry your ALC context and you are doomed. Note that if you have a handler for AudioRouteChanged, this is too late; you must do it in the AudioInterruption handler.
Interruption Ends
When you're coming back from the interruption, first reboot your iOS audio session:
AVAudioSession.SharedInstance().SetActive(true);
You may also need to reset your preferred input (I think this step is optional if you always use the default input): AVAudioSession.SharedInstance().SetPreferredInput(Input, out NSError err)
Then restore your context:
Alc.MakeContextCurrent(audioContextHandle);

Related

Cocoa - detect event when camera started recording

In my OSX application I'm using the code below to show a preview from the camera.
[[self session] beginConfiguration];
NSError *error = nil;
AVCaptureDeviceInput *newVideoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
if (newVideoDeviceInput != nil) { // check the new input, not the device we just used
    [[self session] removeInput:[self videoDeviceInput]];
    if ([[self session] canAddInput:newVideoDeviceInput]) {
        [[self session] addInput:newVideoDeviceInput];
        [self setVideoDeviceInput:newVideoDeviceInput];
    } else {
        DLog(@"WTF?");
    }
}
[[self session] commitConfiguration];
Yet, I need to detect the exact moment when the preview from the camera becomes available.
In other words, I'm trying to detect the same moment as in FaceTime under OSX, where an animation starts once the camera provides the preview.
What is the best way to achieve this?
I know this question is really old, but I stumbled upon it too when I was looking for the same thing, and I have found answers, so here goes.
For starters, AVFoundation is too high-level; you'll need to drop down to a lower level, CoreMediaIO. There's not a lot of documentation on this, but basically you need to perform a couple of queries.
To do this, we'll use a combination of calls. First, CMIOObjectGetPropertyDataSize lets us get the size of the data we'll query for next, which we can then use when we call CMIOObjectGetPropertyData. To set up the get-property-data-size call, we need to start at the top, using this property address:
var opa = CMIOObjectPropertyAddress(
    mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyDevices),
    mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
    mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMaster)
)
Next, we'll set up some variables to keep the data we'll need:
var (dataSize, dataUsed) = (UInt32(0), UInt32(0))
var result = CMIOObjectGetPropertyDataSize(CMIOObjectID(kCMIOObjectSystemObject), &opa, 0, nil, &dataSize)
var devices: UnsafeMutableRawPointer? = nil
From this point on, we'll need to wait until we get some data out, so let's busy loop:
repeat {
    if devices != nil {
        free(devices)
        devices = nil
    }
    devices = malloc(Int(dataSize))
    result = CMIOObjectGetPropertyData(CMIOObjectID(kCMIOObjectSystemObject), &opa, 0, nil, dataSize, &dataUsed, devices)
} while result == OSStatus(kCMIOHardwareBadPropertySizeError)
Once we get past this point in our execution, devices will point to potentially many devices. We need to loop through them, somewhat like this:
if let devices = devices {
    for offset in stride(from: 0, to: dataSize, by: MemoryLayout<CMIOObjectID>.size) {
        let current = devices.advanced(by: Int(offset)).assumingMemoryBound(to: CMIOObjectID.self)
        // current.pointee is your object ID; you will want to keep track of it somehow
    }
}
Finally, clean up devices
free(devices)
Now at this point, you'll want to use that object ID you saved above to make another query. We need a new property address:
opa = CMIOObjectPropertyAddress(
    mSelector: CMIOObjectPropertySelector(kCMIODevicePropertyDeviceIsRunningSomewhere),
    mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeWildcard),
    mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementWildcard)
)
This tells CoreMediaIO that we want to know if the device is currently running somewhere (read: in any app), wildcarding the rest of the fields. Next we get to the meat of the query; camera below corresponds to the ID you saved before:
var (dataSize, dataUsed) = (UInt32(0), UInt32(0))
var result = CMIOObjectGetPropertyDataSize(camera, &opa, 0, nil, &dataSize)
if result == OSStatus(kCMIOHardwareNoError) {
    if let data = malloc(Int(dataSize)) {
        result = CMIOObjectGetPropertyData(camera, &opa, 0, nil, dataSize, &dataUsed, data)
        let on = data.assumingMemoryBound(to: UInt8.self)
        // on.pointee != 0 means it's in use somewhere; 0 means not in use anywhere
        // (remember to free(data) once you've read the value)
    }
}
With the above code samples you should have enough to test whether or not the camera is in use. You only need to get the device once (the first part of the answer); the in-use check, however, you'll have to perform whenever you want this information. As an extra exercise, consider playing with CMIOObjectAddPropertyListenerBlock to be notified of changes to the in-use property address we used above.
While this answer is nearly 3 years too late for the OP, I hope it helps someone in the future. Examples here are given in Swift 3.0.
The previous answer from the user jer is definitely the correct one, but I just wanted to add one additional important piece of information.
If a listener block is registered with CMIOObjectAddPropertyListenerBlock, the current run loop must be run, otherwise no events will be received and the listener block will never fire.
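To make that concrete, here is a rough sketch of registering such a listener in Objective-C (the question's language); camera stands for the CMIOObjectID saved earlier, and the property address mirrors the wildcard one above:
CMIOObjectPropertyAddress opa = {
    .mSelector = kCMIODevicePropertyDeviceIsRunningSomewhere,
    .mScope    = kCMIOObjectPropertyScopeWildcard,
    .mElement  = kCMIOObjectPropertyElementWildcard
};
CMIOObjectAddPropertyListenerBlock(camera, &opa, dispatch_get_main_queue(),
    ^(UInt32 numberAddresses, const CMIOObjectPropertyAddress addresses[]) {
        // Re-run the "is running somewhere" query from above here; remember
        // the main run loop must be running for this block to fire.
    });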

How would you write fetching a collection the "Reactive Cocoa" way?

The client I'm building uses ReactiveCocoa with OctoKit, and so far it has been going very well. However, now I'm at a point where I want to fetch a collection of repositories and am having trouble wrapping my head around doing this the "RAC way".
// fire this when an authenticated client is set
[[RACAbleWithStart([GHDataStore sharedStore], client)
    filter:^BOOL (OCTClient *client) {
        return client != nil && client.authenticated;
    }]
    subscribeNext:^(OCTClient *client) {
        [[[client fetchUserRepositories] deliverOn:RACScheduler.mainThreadScheduler]
            subscribeNext:^(OCTRepository *fetchedRepo) {
                NSLog(@"Received new repo: %@", fetchedRepo.name);
            }
            error:^(NSError *error) {
                NSLog(@"Error fetching repos: %@", error.localizedDescription);
            }];
    } completed:^{
        NSLog(@"Completed fetching repos");
    }];
I originally assumed that -subscribeNext: would pass an NSArray, but I now understand that it sends the message for every "next" object returned, which in this case is an OCTRepository.
Now I could do something like this:
NSMutableArray *repos = [NSMutableArray array];
// most of the code above
subscribeNext:^(OCTRepository *fetchedRepo) {
    [repos addObject:fetchedRepo];
}
// the rest of the code above
Sure, this works, but it doesn't seem to follow the functional principles that RAC encourages. I'm really trying to stick to conventions here. Any light on the capabilities of RAC/OctoKit is greatly appreciated!
It largely depends on what you want to do with the repositories afterward. It seems like you want to do something once you have all the repositories, so I'll set up an example that does that.
// Watch for the client to change
// Watch for the client to change
RAC(self.repositories) = [[[[[RACAbleWithStart([GHDataStore sharedStore], client)
    // Ignore clients that aren't authenticated
    filter:^BOOL (OCTClient *client) {
        return client != nil && client.authenticated;
    }]
    // For each client, execute the block. Returns a signal that sends a signal
    // to fetch the user repositories whenever a new client comes in. A signal
    // of signals is often used to do some work in response to some other work.
    // Often you'd want to use `-flattenMap:`, but we're using `-map:` with
    // `-switchToLatest` so the resulting signal will only send repositories
    // for the most recent client.
    map:^(OCTClient *client) {
        // -collect will send a single value--an NSArray with all of the values
        // that were sent on the original signal.
        return [[client fetchUserRepositories] collect];
    }]
    // Switch to the latest signal that was returned from the map block.
    switchToLatest]
    // Execute a block when an error occurs, but don't alter the values sent on
    // the original signal.
    doError:^(NSError *error) {
        NSLog(@"Error fetching repos: %@", error.localizedDescription);
    }]
    deliverOn:RACScheduler.mainThreadScheduler];
Now self.repositories will change (and fire a KVO notification) whenever the repositories are updated from the client.
A couple things to note about this:
It's best to avoid subscribeNext: whenever possible. Using it steps outside of the functional paradigm (as do doNext: and doError:, but they're also helpful tools at times). In general, you want to think about how you can transform the signal into something that does what you want.
If you want to chain one or more pieces of work together, you often want to use flattenMap: (see the sketch after this list). More generally, you want to start thinking about signals of signals--signals that send other signals that represent the other work.
You often want to wait as long as possible to move work back to the main thread.
When thinking through a problem, it's sometimes valuable to start by writing out each individual signal to think about a) what you have, b) what you want, and c) how to get from one to the other.
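A condensed sketch of the same pipeline using -flattenMap: (not from the original answer; note that, unlike -map: plus -switchToLatest, this merges repositories from every client rather than only the most recent one):
RAC(self.repositories) = [[[RACAbleWithStart([GHDataStore sharedStore], client)
    filter:^BOOL (OCTClient *client) {
        return client != nil && client.authenticated;
    }]
    flattenMap:^(OCTClient *client) {
        // Chain the fetch directly; -collect still reduces it to one NSArray.
        return [[client fetchUserRepositories] collect];
    }]
    deliverOn:RACScheduler.mainThreadScheduler];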
EDIT: Updated to address @JustinSpahrSummers' comment below.
There is a -collect operator that should do exactly what you're looking for.
// Collect all receiver's `next`s into a NSArray. nil values will be converted
// to NSNull.
//
// This corresponds to the `ToArray` method in Rx.
//
// Returns a signal which sends a single NSArray when the receiver completes
// successfully.
- (RACSignal *)collect;

iOS 6.1 AudioQueueStop, AudioQueueDispose error

Using Audio Queue in my iOS app, I have a problem in testing on iOS 6.1, though it has worked fine on iOS 6.0.
The problem is that AudioQueueStop and AudioQueueDispose don't return immediately, or sometimes they crash.
Like this:
if (_audioQueue)
{
    auto err = AudioQueueStop(_audioQueue, true); // Some delay before returning
    for (int i = 0; i < kNumberAudioQueueBuffers; i++) {
        AudioQueueFreeBuffer(_audioQueue, _audioQueueBuffer[i]);
    }
    err = AudioQueueDispose(_audioQueue, true); // This also has a delay
    _audioQueue = nil;
}
This isn't called on the main thread but on another thread; other calls such as AudioQueueNewOutput and AudioQueueStart are also made on that thread.
Actually, I tried to run a simple test app that uses Audio Queue, and in that case it worked fine (on both iOS 6.0 and 6.1). So some other part of my code might be interfering, but I couldn't figure out what.
Is there anyone who has had similar problems and, hopefully, fixed them?
I'm seeing similar issues with AudioQueueSetProperty() and kAudioQueueProperty_MagicCookie. My app crashes every single time this is called, when it worked fine on iOS 6.0 and earlier. I'm thinking Apple messed up the audio queue implementation in 6.1.
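One mitigation worth trying (my own suggestion, not something confirmed in this thread): both calls take an inImmediate flag, and passing false makes them asynchronous, so they return without blocking the calling thread:
if (_audioQueue)
{
    // false = stop asynchronously after queued buffers have played,
    // rather than blocking until the audio thread has fully stopped
    AudioQueueStop(_audioQueue, false);
    AudioQueueDispose(_audioQueue, false); // likewise disposes asynchronously
    _audioQueue = nil;
}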

Detect screen on/off from iOS service

I am developing a network monitor app that runs in the background as a service. Is it possible to get a notification/callback when the screen is turned on or off?
An equivalent exists in Android using the following code:
private void registerScreenOnOffReceiver()
{
    IntentFilter filter = new IntentFilter(Intent.ACTION_SCREEN_ON);
    filter.addAction(Intent.ACTION_SCREEN_OFF);
    registerReceiver(screenOnOffReceiver, filter);
}
screenOnOffReceiver is then called when the screen is turned on or off. Is there a similar solution for iOS?
Edit:
The best I've found so far is UIApplicationProtectedDataWillBecomeUnavailable (Detect if iPhone screen is on/off), but it requires the user to enable Data Protection (passcode protection) on the device.
You can use Darwin notifications to listen for the events. I'm not 100% sure, but it looks to me, from running on a jailbroken iOS 5.0.1 iPhone 4, that one of these events might be what you need:
com.apple.iokit.hid.displayStatus
com.apple.springboard.hasBlankedScreen
com.apple.springboard.lockstate
Update: also, the following notification is posted when the phone locks (but not when it unlocks):
com.apple.springboard.lockcomplete
To use this, register for the event like this (this registers for just one event, but if that doesn't work for you, try the others):
CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(), // center
                                NULL,                 // observer
                                displayStatusChanged, // callback
                                CFSTR("com.apple.iokit.hid.displayStatus"), // event name
                                NULL,                 // object
                                CFNotificationSuspensionBehaviorDeliverImmediately);
where displayStatusChanged is your event callback:
static void displayStatusChanged(CFNotificationCenterRef center, void *observer, CFStringRef name, const void *object, CFDictionaryRef userInfo) {
    NSLog(@"event received!");
    // you might try inspecting the `userInfo` dictionary, to see
    // if it contains any useful info
    if (userInfo != nil) {
        CFShow(userInfo);
    }
}
If you really want this code to run in the background as a service, and you're jailbroken, I would recommend looking into iOS Launch Daemons. As opposed to an app that you simply let run in the background, a launch daemon can start automatically after a reboot, and you don't have to worry about iOS rules for apps running tasks in the background.
Let us know how this works!
Using the lower-level notify API, you can query the lock state when a notification is received:
#import <notify.h>
int notify_token;
notify_register_dispatch("com.apple.springboard.lockstate", &notify_token, dispatch_get_main_queue(), ^(int token) {
    uint64_t state = UINT64_MAX;
    notify_get_state(token, &state);
    NSLog(@"com.apple.springboard.lockstate = %llu", state);
});
Of course your app will have to start a UIBackgroundTask in order to get the notifications, which limits the usefulness of this technique due to the limited runtime allowed by iOS.
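For what it's worth, starting such a task looks roughly like this (a minimal sketch; iOS will still end the task once the allowed background time expires):
__block UIBackgroundTaskIdentifier taskId;
taskId = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
    // Called when background time is about to run out
    [[UIApplication sharedApplication] endBackgroundTask:taskId];
    taskId = UIBackgroundTaskInvalid;
}];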
While the iPhone screen is locked, the app delegate method
- (void)applicationWillResignActive:(UIApplication *)application
will be called, so you can check for that. Hope it may help you.

Objective C: Get notifications about a user's idle state

My Cocoa app runs background tasks, which I would like to stop when the user becomes idle (no keyboard/mouse input) and then resume when the user becomes active again. Is there a way to register for idle-state notifications?
In case you can't link against Carbon (i.e., you want to compile a 64-bit binary), you can wrap this function, which returns the current idle time in seconds as a double (CFTimeInterval), in a timer:
#include <IOKit/IOKitLib.h>

CFTimeInterval CFDateGetIdleTimeInterval() {
    mach_port_t port;
    io_iterator_t iter;
    CFTypeRef value = kCFNull;
    uint64_t idle = 0;
    CFMutableDictionaryRef properties = NULL;
    io_registry_entry_t entry;

    IOMasterPort(MACH_PORT_NULL, &port);
    IOServiceGetMatchingServices(port, IOServiceMatching("IOHIDSystem"), &iter);
    if (iter) {
        if ((entry = IOIteratorNext(iter))) {
            if (IORegistryEntryCreateCFProperties(entry, &properties, kCFAllocatorDefault, 0) == KERN_SUCCESS && properties) {
                if (CFDictionaryGetValueIfPresent(properties, CFSTR("HIDIdleTime"), &value)) {
                    if (CFGetTypeID(value) == CFDataGetTypeID()) {
                        CFDataGetBytes(value, CFRangeMake(0, sizeof(idle)), (UInt8 *)&idle);
                    } else if (CFGetTypeID(value) == CFNumberGetTypeID()) {
                        CFNumberGetValue(value, kCFNumberSInt64Type, &idle);
                    }
                }
                CFRelease(properties);
            }
            IOObjectRelease(entry);
        }
        IOObjectRelease(iter);
    }
    return idle / 1000000000.0; // HIDIdleTime is reported in nanoseconds
}
You'll need to link your code against IOKit.framework.
There's a Carbon API, EventLoopIdleTimer, that will send a notification when there hasn't been a user event for a certain duration. Uli Kusterer has written a Cocoa wrapper for it (look for UKIdleTimer).
If you want something lower level, you may be able to implement the behavior you want with a combination of timers and the CoreGraphics function CGEventSourceSecondsSinceLastEventType (available in <CoreGraphics/CGEventSource.h>).
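A minimal sketch of that lower-level approach, polled from a repeating timer (the 90-second threshold is just an example):
#include <CoreGraphics/CGEventSource.h>

// Seconds since the last input event of any kind, session-wide
CFTimeInterval idle = CGEventSourceSecondsSinceLastEventType(
    kCGEventSourceStateCombinedSessionState, kCGAnyInputEventType);
if (idle > 90.0) {
    // the user has been idle long enough; pause background tasks here
}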
Apple's Technical Q&A QA1340 Registering and unregistering for sleep and wake notifications may be what you are looking for.
If you need more control than NSWorkspaceWillSleepNotification (Listing 1), use I/O Kit and register to receive power notifications (Listing 3).
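For the simpler NSWorkspace route (Listing 1 of QA1340), registration is just a notification observer; a minimal sketch, with receiveSleepNote: as a hypothetical selector:
[[[NSWorkspace sharedWorkspace] notificationCenter]
    addObserver:self
       selector:@selector(receiveSleepNote:)
           name:NSWorkspaceWillSleepNotification
         object:nil];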
I used a different approach.
Subclassing UIApplication, I override the sendEvent method, filtering touches (actually you can filter any kind of event: acceleration, touches, etc.).
Using a shared variable and a background timer, I managed the "idle" state.
Every time the user touches the screen, the variable is set to the current time interval.
The timer's fire method checks the elapsed time since the last touch; if it's greater than the threshold (in my case around 90 seconds), you can post your own notification.
I used this simple approach to create a custom set of apps that, after some idle time, automatically call the "screensaver" app.
Nothing clever, it just does the job.
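A minimal sketch of that subclass (class and variable names are illustrative):
@interface IdleAwareApplication : UIApplication
@end

@implementation IdleAwareApplication

static NSTimeInterval lastTouchTime = 0;

- (void)sendEvent:(UIEvent *)event {
    [super sendEvent:event];
    if (event.type == UIEventTypeTouches) {
        lastTouchTime = [NSDate timeIntervalSinceReferenceDate]; // reset on every touch
    }
}

@end

// In the timer's fire method:
// if ([NSDate timeIntervalSinceReferenceDate] - lastTouchTime > 90.0)
//     post your "idle" notification here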
Hope that helps.