pjsua_conf_disconnect does not mute mic on some Macs - objective-c

The question:
Is it possible that under specific conditions (hardware, ...) the pjsua_conf_disconnect(0, callInfo.conf_slot) call does not mute the mic?
If so, how can we reliably mute the mic with PJSIP?
The details:
In an OS X SIP application, the user can mute the mic, which executes:
...
pjsua_call_info callInfo;
pjsua_call_get_info([self identifier], &callInfo);
pj_status_t status = pjsua_conf_disconnect(0, callInfo.conf_slot);
...
Where [self identifier] is the pjsua_call_id of the current call.
I know for sure that after those three lines status == PJ_SUCCESS, because the UI is updated to show the mic as muted only when this condition holds.
This project uses the pjlib 1.12.0 POSIX static libraries.
Users who face this problem are on Mac OS X 10.8.1 and 10.8.2. They are all using the built-in mic of their MacBook [Pro, Air], no headset.
Note that I am NOT able to reproduce this problem myself on an OS X 10.8.2 mid-2009 MBP with the exact same build (from the Mac App Store), which is what makes this problem hard to troubleshoot.
Note that it is not a random problem, it is constant: the mute function never works for users experiencing this issue, and it always works for the others.
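If pjsua_conf_disconnect() alone proves unreliable on some hardware, one belt-and-braces approach is to also zero the signal level the bridge receives from the sound-device slot (slot 0) with pjsua_conf_adjust_rx_level(). A sketch, assuming the pjsua API as in pjlib 1.x (the helper name mute_mic is ours, not part of PJSIP; untested against 1.12.0 specifically):

```c
/* Sketch only: mute/unmute the mic with pjsua.  In addition to
 * disconnecting the sound-device slot (0) from the call's conference
 * slot, zero the level received from slot 0 into the bridge. */
#include <pjsua-lib/pjsua.h>

static pj_status_t mute_mic(pjsua_call_id call_id, pj_bool_t mute)
{
    pjsua_call_info ci;
    pj_status_t status = pjsua_call_get_info(call_id, &ci);
    if (status != PJ_SUCCESS)
        return status;

    if (mute) {
        /* Disconnect mic from the call AND zero its input level. */
        status = pjsua_conf_disconnect(0, ci.conf_slot);
        if (status == PJ_SUCCESS)
            status = pjsua_conf_adjust_rx_level(0, 0.0f);
    } else {
        status = pjsua_conf_connect(0, ci.conf_slot);
        if (status == PJ_SUCCESS)
            status = pjsua_conf_adjust_rx_level(0, 1.0f);
    }
    return status;
}
```

Even if the conference link is somehow re-established on the affected machines, the zeroed rx level should keep the mic silent.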

Related

GCGamepad Deprecated?

While testing one of our old iPad games (written in Objective-C), we found that GCGamepad is now deprecated. What is the replacement right now?
The game still runs perfectly on iOS 13, including gamepad support, but we would like to "modernise" it.
It surprised us that we could simply convert this game to Mac via Mac Catalyst! Everything works (including keyboard commands) except for the gamepad.
I think you're meant to use GCExtendedGamepad now. https://developer.apple.com/documentation/gamecontroller/gcextendedgamepad?language=objc

iOS iPhone iPad Simulator

This is what I get when I test my iPhone code on the iPad Pro 12.9-inch 2nd generation simulator. Isn't there a discrepancy? Thanks, David.
I would highly recommend using this UIDeviceHardware instead of checking the interface idiom:
https://github.com/fahrulazmi/UIDeviceHardware/blob/master/UIDeviceHardware.m
NSString *platformString = [UIDeviceHardware platformString];
I've been using it for a while now, and it works perfectly for me.
However, there is a problem when using it in the Simulator: there the platform is x86_64 or i386, which will simply return iPad or iPhone. So I feel you won't reach a satisfying conclusion unless you test this on a real device, or at least get someone to test it for you on one.
In your case, you would check for one of these two platform strings:
"iPad Pro 12.9-inch (WiFi)" or "iPad Pro 12.9-inch (Cellular)"
https://github.com/fahrulazmi/UIDeviceHardware/blob/master/UIDeviceHardware.m#L83-L84
A good way to check for those is to just check for the prefix:
[platformString hasPrefix:@"iPad Pro 12.9-inch"]

Capture screen and audio with objective-c

I'm using an AVCaptureSession to create a screen recording (OS X), but I'd also like to add the computer's audio to it (not the microphone, but whatever is playing through the speakers). I'm not really sure how to do that, so the first thing I tried was adding an audio device like so:
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
After adding this device, the audio was recorded, but it sounded like it was captured through the microphone. Is it possible to actually capture the computer's output sound this way, like QuickTime does?
Here's an open-source framework that supposedly makes capturing speaker output as easy as taking a screenshot:
https://github.com/pje/WavTap
The home page for WavTap does mention that it requires kernel-extension signing privileges to run under OS X 10.10 and newer, and that requires signing in to your Apple Developer account and submitting this form. More information can be found here.

Liveview on Android/QX1 Sony Camera API 2.01 fails

Using the supplied Android demo from
https://developer.sony.com/downloads/all/sony-camera-remote-api-beta-sdk/
Connected to the Wi-Fi network of a Sony QX1. The sample application finds the camera device and is able to connect to it.
The liveview does not display correctly. At most one frame is shown, then the code hits an exception in SimpleLiveViewSlicer.java:
if (commonHeader[0] != (byte) 0xFF) {
throw new IOException("Unexpected data format. (Start byte)");
}
Shooting a photo does not seem to work. Zooming does work - the lens moves. The camera works fine when used directly with the PlayMemories app, so it is not a hardware issue.
Hoping for advice from Sony on this one - standard hardware and the demo application should work.
Can you provide some details of your setup?
What version of Android SDK are you compiling with?
What IDE and OS are you using?
Have you installed the latest firmware? (http://www.sony.co.uk/support/en/product/ILCE-QX1#SoftwareAndDownloads)
Edit:
We tested the sample code using a QX1 lens and the same setup as you and were able to run the sample code just fine.
One thing to check is whether the liveview is ready to transfer images. To confirm whether the camera is ready to transfer liveview images, the client can check “liveviewStatus” status of “getEvent” API (see API specification for details). Perhaps there is some timing issue due to connection speed that is causing the crash.

AudioServicesPlaySystemSound not playing sounds

I am playing a small .wav file using AudioToolbox.
AudioServicesPlaySystemSound (soundFileObject);
But sometimes it is not playing.
What is the reason?
If you're in the Simulator, make sure that in System Preferences → Sound, "Play user interface sound effects" is not turned off. If you're on the device, check that the ringer switch is not set to silent.
Maybe it's this issue? Calling AudioServicesDisposeSystemSoundID() too early will stop the sound before it finishes playing.
Building on user1056521's answer, use the following code:
AudioServicesPlaySystemSoundWithCompletion(soundID, ^{
AudioServicesDisposeSystemSoundID(soundID);
});
where soundID is the SystemSoundID value you got when you called AudioServicesCreateSystemSoundID().
The completion block ensures the sound was played and completed playing before it is disposed of.
System Preferences → Sound → [x] Play user interface sound effects - this checkbox helped for me too; it solved the AudioServicesPlaySystemSound() issue. But worth noting: the sample Apple Xcode project "SysSound" worked just fine without this box checked. I'm using Xcode 4.4.1 to create my own app from scratch. Could there be differences between projects created with older and newer Xcode tools?
For those who visit this question later,
AudioServicesPlaySystemSoundID()'s volume follows the system ringer volume, in a range of roughly 0.1 ~ 1.0, not the media volume.
AudioServicesPlayAlertSoundID()'s volume also follows the system ringer, in a range of approximately 0.3 ~ 1.0.
On the device, go into Settings and turn up the volume under Sounds → Ringer and Alerts.