I am trying to implement something like the camera on/off toggle that Google Meet has. I have tried getting the tracks from getUserMedia and setting enabled = true/false, it works but the camera indicator light is still on all the time. How are they able to toggle the camera and the indicator light?
To turn off the camera indicator light you have to release the camera completely by stopping the track and removing every reference to the stream (not just setting enabled to false).
So you also have to remove the track from the peer connection.
Thanks to the Transceiver API, you can easily swap or remove a track on the peer without renegotiating ICE.
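A minimal sketch of that approach, assuming you keep a reference to the video transceiver of an existing RTCPeerConnection (the function names here are illustrative, not part of any API):

```javascript
// Turning the camera off: stop the track so the device (and its indicator
// light) is actually released, then detach it from the sender.
// RTCRtpSender.replaceTrack() does not trigger ICE renegotiation.
async function disableCamera(transceiver) {
  const track = transceiver.sender.track;
  if (track) {
    track.stop();                                 // releases the camera; light goes off
    await transceiver.sender.replaceTrack(null);  // remove from the peer, no renegotiation
  }
}

// Turning the camera back on: acquire a fresh track and swap it into the
// same sender, again without renegotiating.
async function enableCamera(transceiver) {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  await transceiver.sender.replaceTrack(stream.getVideoTracks()[0]);
}
```

Note that enabled = false only blanks the frames; it is track.stop() that releases the hardware, which is why the indicator stays lit in your version.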
We are developing a streaming app for special content. Because of the content rights and the release window, we can't allow users to screen-cast from their phone or tablet to a TV, so we would appreciate guidance on how to fully disable casting, mirroring, etc. on Android.
We've tried a few things and are able to block screenshots and digital recording (e.g. Loom): when we screen-record, the sound is captured but the image is black. So we've been able to successfully disable screenshots and screen recordings. However, we're unable to stop screen mirroring, Smart View, AirPlay, and casting.
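For context, the screenshot/recording blocking described above is typically achieved with the window's FLAG_SECURE flag; a minimal sketch (the activity name is hypothetical). FLAG_SECURE blanks the window in screenshots, recordings, and on displays the system considers non-secure, but it does not cover every OEM mirroring path, which matches the behavior described:

```kotlin
import android.os.Bundle
import android.view.WindowManager
import androidx.appcompat.app.AppCompatActivity

// Hypothetical player screen: FLAG_SECURE marks the window's content as
// secure, so the system blacks it out in screenshots, screen recordings,
// and on non-secure external displays.
class PlayerActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        window.setFlags(
            WindowManager.LayoutParams.FLAG_SECURE,
            WindowManager.LayoutParams.FLAG_SECURE
        )
    }
}
```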
I am trying to create a peer-to-peer meeting based on WebRTC. I can see the other participant through his camera, or watch his shared screen, but can I watch his screen and his camera at the same time?
Try this demo: https://www.webrtc-experiment.com/demos/screen-and-video-from-single-peer.html
For the offerer, it attaches two unique MediaStream objects:
Audio+Video
Screen
Remember, Firefox doesn't support this feature, yet!
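A rough sketch of the same idea with the current API, assuming an existing RTCPeerConnection (the function name is illustrative, and the mediaDevices parameter is only there so the helper is easy to exercise outside a browser):

```javascript
// Send camera+mic and screen capture over one peer connection as two
// distinct MediaStreams, so the receiver can tell the two sources apart
// by stream id in its ontrack handler.
async function shareCameraAndScreen(pc, mediaDevices = navigator.mediaDevices) {
  const camera = await mediaDevices.getUserMedia({ audio: true, video: true });
  const screen = await mediaDevices.getDisplayMedia({ video: true });
  // Associating each track with its own stream keeps the sources separate.
  for (const track of camera.getTracks()) pc.addTrack(track, camera);
  for (const track of screen.getTracks()) pc.addTrack(track, screen);
  return { camera, screen };
}
```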
Before iOS 7 came out, we noticed an issue:
Music remote control from the earbuds or the springboard can hijack our audio session even if we set the category to solo-ambient or another exclusive mode.
We thus tried a few things:
We tried to take ownership of the audio session back. But this requires that our audio code knows when to take it back and from whom. We thought we could let the app become the first responder to remote-control events, do our stuff, and then pass the events on to the Music app. However, we found that the events get consumed by the first responder and there is no way to pass them back up the responder chain.
We tried to become first responder and block remote-control events altogether while we are in solo-ambient. This worked fine in iOS 6 and still works with earbud controls in iOS 7, but it fails with iOS 7's Control Center. Control Center seems to bypass the remote-control event handler remoteControlReceivedWithEvent completely, which is where we put our blocking code.
I read something elsewhere that:
You can't block the music app. your app can become one though (apple
won't like that) and then the control center would control yours.
But I found no documentation whatsoever about Control Center.
And, as said above, Control Center does not go through the normal remote-control hooks even if an app is the first responder.
Another quote:
Remote Control Event handling is so your app can be controlled by
Control Center, the earbuds, etc... it is not so that your app can eat
said controls, preventing control of other apps from said sources. It
only worked in iOS6 because of a bug in iOS, now fixed in iOS7
Was what we were using relying on this bug all along? I find that hard to believe, because we got the solution on this list and on the Xcode mailing list, so I assumed it was an accepted solution.
Now we really wonder if we are missing something from the very beginning:
Is solo-ambient really an exclusive mode for the audio session, or is the Music app an exception to that exclusivity?
How can our app live in harmony with remote control and Control Center?
Where can we find up-to-date documentation on remote control and Control Center?
The remote control has been mysteriously fixed after clean-building everything against the iOS 7 SDK. Now the app delegate can receive remote-control events from Control Center. However, the play/pause events are UIEventSubtypeRemoteControlPause and UIEventSubtypeRemoteControlPlay instead of iOS 6's UIEventSubtypeRemoteControlTogglePlayPause.
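For completeness, a sketch of a handler that covers both behaviors, written in Swift terms (the Objective-C constants above map to UIEvent.EventSubtype; the class and property names are illustrative, and this is an outline rather than something verified on every iOS version):

```swift
import UIKit

// Illustrative responder override: Control Center on iOS 7 delivers separate
// play and pause subtypes, while the earbud clicker still sends the combined
// toggle, so a robust handler deals with all three.
class PlayerViewController: UIViewController {
    var isPlaying = false

    override func remoteControlReceived(with event: UIEvent?) {
        guard let event = event, event.type == .remoteControl else { return }
        switch event.subtype {
        case .remoteControlPlay:
            isPlaying = true
        case .remoteControlPause:
            isPlaying = false
        case .remoteControlTogglePlayPause:
            isPlaying.toggle()
        default:
            break
        }
    }
}
```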
I'm developing a kiosk-mode application and I want to prevent the Notification Center gesture (swipe down from the top). Since the application isn't distributed through the App Store, private APIs are allowed.
I have skimmed through the UIKit class dump but did not find any hints on how to disable it (I don't know where else to look; I tried UIApplication and UIWindow).
Has anyone tried this yet and had success?
It may be a little tricky, because you are trying to disable a native iOS feature, like the all-finger swipe gesture on the iPad. Some people hide the status bar to change which edge triggers Notification Center:
statusBarHidden = YES
and then redefine the orientation:
setStatusBarOrientation:UIInterfaceOrientationLandscapeLeft
for example.
I don't think you will find any hack to prevent the iOS Notification Center from showing at all. As a last resort, you could look into jailbreak features on unlocked devices...
Our app listens to the audio input. Can I send the app to the background (multitasking) and still have it work as if it were open? In other words, can my app stay in the background and still receive audio input and play audio output? How do I do that?
Set your audio session category to AVAudioSessionCategoryRecord. The docs explain here that this category allows you to record audio input even when the app is in the background. You can then listen to the device input in an AudioQueue recording callback function. However, while this is happening in the background, the system tints the status bar red to alert the user that a backgrounded app is listening to (and possibly recording) device input.
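A minimal sketch of that session setup in Swift (modern AVAudioSession API; the function name is illustrative, the recording callback itself is omitted, and this assumes the "audio" entry is present under UIBackgroundModes in Info.plist):

```swift
import AVFoundation

// Configure the shared audio session for recording. Combined with the
// "audio" background mode, input keeps flowing to the recording callback
// after the app is backgrounded; the system shows the red status bar tint
// while a backgrounded app captures input.
func configureBackgroundRecording() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.record, mode: .default)  // AVAudioSessionCategoryRecord
    try session.setActive(true)
}
```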
The phone is only allowed to do a few things in the background, like playing audio and monitoring location.
My guess is that this won't work, or if it does, it will be rejected by Apple.
You may be able to get it working by playing a sound in the background and using that to keep the app alive so it can listen to the input. I'm not sure whether this will work.