Handling device change in OpenAL - OpenTK

I'm writing an application that uses OpenTK.OpenAL to wrap OpenAL, and I'm concerned about how to handle the situation where my default audio output device (such as headphones) gets unplugged. When I open the default device, it is labelled "OpenAL Soft" rather than the actual device name. The application doesn't respond at all when I unplug the headphones I was using, and once they're plugged back in it stays completely silent.

I've just looked through some of OpenTK's OpenAL code. OpenTK stays as true to OpenAL as possible; it simply wraps the OpenAL calls.
For something like a hardware-disconnect event, you'd have to listen for that event from the OS; I don't believe OpenAL has this capability natively. On Windows you'd probably want to handle WM_DEVICECHANGE to determine whether an audio device was connected or disconnected, and from there set the device like you normally would in OpenTK.
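To make that suggestion concrete, here is a minimal sketch of listening for WM_DEVICECHANGE in a WinForms window. The Win32 constants are from Dbt.h/WinUser.h; ReopenAudioDevice is a hypothetical placeholder for whatever OpenTK/OpenAL re-initialisation your app needs.

```csharp
using System;
using System.Windows.Forms;

public class AudioAwareForm : Form
{
    // Win32 constants from WinUser.h / Dbt.h.
    private const int WM_DEVICECHANGE = 0x0219;
    private const int DBT_DEVICEARRIVAL = 0x8000;        // a device was plugged in
    private const int DBT_DEVICEREMOVECOMPLETE = 0x8004; // a device was removed

    protected override void WndProc(ref Message m)
    {
        if (m.Msg == WM_DEVICECHANGE)
        {
            int eventType = m.WParam.ToInt32();
            if (eventType == DBT_DEVICEARRIVAL || eventType == DBT_DEVICEREMOVECOMPLETE)
            {
                // A device came or went; re-initialise audio output.
                ReopenAudioDevice();
            }
        }
        base.WndProc(ref m);
    }

    private void ReopenAudioDevice()
    {
        // Hypothetical: destroy the current ALC context and close the old
        // device here, then open the default device again (e.g. by passing
        // null as the device name) so OpenAL picks up the new default output.
    }
}
```

Note that WM_DEVICECHANGE fires for all device classes, so in practice you may also want to register for device-interface notifications to filter for audio endpoints only.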

Related

When using a P2P WebRTC connection, how can I use different resolutions for the video call and the photo capture?

I'm working on a P2P WebRTC video call between a HoloLens 2 and a PC. I also need to support capturing photos (and sending them to the server). The video and photos currently work at a resolution of 2272x1278, but I need a photo resolution of 3904x2196 (the highest value the HoloLens 2 provides).
The problem is that when I try to change resolutions, I find I can't while the call is in progress.
I use MediaCapture to take the photo, and the WebcamSource (based on MixedReality-WebRTC) runs in SharedReadOnly mode. One way I thought of to solve this is to shut the call down when taking a photo and restart it after the capture finishes. But the problems are:
How can I set the WebcamSource mode to exclusive just for capturing the photo?
Can I make sure that when the call has been shut down, the WebcamSource is released?
Or is there another way to use different resolutions for the video call and the photo capture? Thanks a lot.
How can I set the WebcamSource mode to exclusive just for capturing the photo?
No. SharingMode is hardcoded in UwpUtils, and no API is exposed to change it.
Can I make sure that when the call has been shut down, the WebcamSource is released?
To make sure everything is released, dispose of the audio and video tracks first and the media sources last:
localAudioTrack?.Dispose();
localVideoTrack?.Dispose();
microphoneSource?.Dispose();
webcamSource?.Dispose();
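A minimal sketch of the shut-down, capture, restart flow might look like the following. Everything apart from the MediaCapture calls (the track/source fields and RestartCallAsync) is an assumption about how your app is structured:

```csharp
using System;
using System.Threading.Tasks;
using Windows.Media.Capture;
using Windows.Media.MediaProperties;
using Windows.Storage;

public async Task CaptureHighResPhotoAsync()
{
    // 1. Tear down the call: tracks first, then sources, so the camera is freed.
    localAudioTrack?.Dispose();
    localVideoTrack?.Dispose();
    microphoneSource?.Dispose();
    webcamSource?.Dispose();

    // 2. While no call holds the camera, use MediaCapture at full resolution.
    var capture = new MediaCapture();
    await capture.InitializeAsync();
    StorageFile file = await KnownFolders.PicturesLibrary.CreateFileAsync(
        "photo.jpg", CreationCollisionOption.GenerateUniqueName);
    await capture.CapturePhotoToStorageFileAsync(
        ImageEncodingProperties.CreateJpeg(), file);
    capture.Dispose();

    // 3. Re-create the webcam source and restart the call (app-specific).
    await RestartCallAsync();
}
```

Disposing the MediaCapture instance before restarting is important, since it otherwise keeps an exclusive hold on the camera and the new WebcamSource may fail to initialise.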

How do I play the system sound from an osx application?

I am trying to play the default system beep sound in my program. I can play a specified sound, but I want to play the alert sound that is set in the Sound System Preferences.
Call NSBeep(). It's a part of AppKit.
An alternative to the accepted answer is to use AudioServicesPlayAlertSound, passing it kSystemSoundID_UserPreferredAlert.
However, this has the drawback of requiring the AudioToolbox framework in the app.

Capture screen and audio with Objective-C

I'm using an AVCaptureSession to create a screen recording (OS X), but I'd also like to add the computer's audio to it (not the microphone, but anything that's playing through the speakers). I'm not really sure how to do that, so the first thing I tried was adding an audio device like so:
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
After adding this device the audio was recorded, but it sounded as though it had been captured through the microphone. Is it possible to actually capture the computer's output sound this way, like QuickTime does?
Here's an open-source framework that supposedly makes capturing speaker output as easy as taking a screenshot:
https://github.com/pje/WavTap
The home page for WavTap does mention that it requires kernel-extension signing privileges to run under OS X 10.10 and newer, which requires signing into your Apple Developer account and submitting this form. More information can be found here.

Loop WAV file seamlessly in the background on Windows 8

I have managed to get seamless looping of WAV files using the SharpDX library, but this does not seem to work while the app is minimised (in the background).
Using the Metro players I do not get a seamless loop, which is why I use XAudio2 via SharpDX.
Hope someone can help with this.
When your app is in the background it no longer has access to the CPU, so your audio will stop playing.
The only way around this is to run the audio component in a background agent. The issue here is that the certification process will be hard on you if you are just playing looping audio: playing audio in the background is intended for audio-player apps (like the built-in "Music" app).
If I were a user of your app I would likely be unhappy that it clogs up the audio system when it isn't in the foreground (if, for example, I went to answer a Lync call). If the only way to stop your app playing audio is to turn it off manually or exit the app, then my opinion is that the user experience isn't great.
Of course, you may have a different opinion, or your app might be doing something I haven't considered.
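For reference, the background-audio route goes through the package manifest. In a Windows 8 Store app, a declaration along these lines (the EntryPoint value is a placeholder for your app's own class) is what the certification process looks for:

```xml
<Extensions>
  <Extension Category="windows.backgroundTasks" EntryPoint="MyApp.App">
    <BackgroundTasks>
      <Task Type="audio" />
    </BackgroundTasks>
  </Extension>
</Extensions>
```

On top of the manifest declaration, the app has to respond to the system's media-control (play/pause) events, and the audio stream itself must use a background-capable media category, or playback will still be muted when the app leaves the foreground.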

Adobe AIR: detect when a webcam is unplugged?

I have tried checking activity, FPS, and status; nothing is triggered when an active camera is unplugged in AS3/AIR. Has anyone found a working way? In my case I have a kiosk running AIR 2.7 with two webcams. In some cases a USB webcam might be unplugged and plugged back in. I have been trying to find a way to detect when it's unplugged so I can restart it. Ideas?
Unfortunately I have no USB camera to test this with (I've only got an inSight).
You could try ActivityEvent and set the motion level to some low value.
ActivityEvent fires when motion is detected by the camera. I believe that when the camera is physically disconnected, the activity event should trigger, since no activity will be detected.
Here's an example:
import flash.media.Camera;
import flash.media.Video;
import flash.events.ActivityEvent;

var camera:Camera = Camera.getCamera();
camera.setMode(stage.stageWidth, stage.stageHeight, 25);
// A low motion level makes the camera report even very small changes.
camera.setMotionLevel(3);
camera.addEventListener(ActivityEvent.ACTIVITY, activityEventHandler, false, 0, true);

var video:Video = new Video();
video.width = stage.stageWidth;
video.height = stage.stageHeight;
video.attachCamera(camera);
addChild(video);

// activating is true when activity starts and false when it stops.
function activityEventHandler(a:ActivityEvent):void {
    trace('Motion detected: ' + a.activating);
}
Note:
setMotionLevel's default value is 50, so if you set it to e.g. 3 the camera will still notice small changes, even eye blinking. That helps you detect whether there is any motion at all; if no motion is detected, the camera is probably dead.
You could even use a motion level of 1, but that value is very sensitive, and even the slightest change in room lighting will probably register as motion.
Let me know if that helps; it would be interesting to hear how this works in practice with a real USB camera.