WebRTC: How to determine if remote user has disabled their video track?

I have a button for disabling the local video track:
button.onclick = function () {
  mediaStream.getVideoTracks()[0].enabled =
    !mediaStream.getVideoTracks()[0].enabled;
};
I want to be able to detect this on the remote side, so that I can swap the view to a user-friendly image instead of showing a black screen.
Are there any events fired, or any properties on the stream, that the remote peer can check on its received stream object to tell that the other user has shut off their video?

No, there is no direct way to detect the remote video's muted state.
You need to pass the video disabled/enabled event to the remote user over your signalling channel (e.g. a WebSocket), or you can use a data channel to relay it.
You can try to infer the remote video state from peerConnection stats, but those depend on bandwidth/network fluctuations.
Moreover, the browser will still send some video data (empty/black frames) after you disable the video track with mediaStream.getVideoTracks()[0].enabled = false.
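As a rough sketch of the data-channel approach (pc, mediaStream and remoteVideoEl are placeholders for your own objects, not names from the question): the sender toggles the track and announces the change, and the receiver swaps the view when the message arrives.
// Sender side: toggle the local track and tell the other peer about it.
// (In real code, wait for the channel's 'open' event before sending.)
const muteChannel = pc.createDataChannel('mute-events');

button.onclick = function () {
  const track = mediaStream.getVideoTracks()[0];
  track.enabled = !track.enabled;
  muteChannel.send(JSON.stringify({ kind: 'video', enabled: track.enabled }));
};

// Receiver side: react to the event instead of guessing from the frames.
pc.ondatachannel = function (event) {
  event.channel.onmessage = function (msg) {
    const state = JSON.parse(msg.data);
    if (state.kind === 'video') {
      // Show a placeholder image while the remote video is disabled.
      remoteVideoEl.style.visibility = state.enabled ? 'visible' : 'hidden';
    }
  };
};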

Related

Can't send Media Stream of captureStream() through WebRTC

I am creating a WebRTC connection with simple-peer. I need to capture the stream from a video element and send it to peers. I use vidEl.captureStream() to get a MediaStream from the video element and send it to the other peer. It works, but only for audio; the video is a black screen (nothing is displayed).
[Testing] I created another video element on the same origin as the first one, set its srcObject to the captured MediaStream, and it plays fine.
How do I send a MediaStream from video.captureStream() to a new peer through WebRTC and display the video on the remote side?
Sorry for bad english.
https://webrtc.github.io/samples/src/content/capture/video-pc/
This sample doesn't work in Chrome or Edge, but it does work in Firefox. I don't know why.
There is currently an open bug in Chrome which breaks this:
https://bugs.chromium.org/p/chromium/issues/detail?id=1156408
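For reference, the basic flow with simple-peer looks roughly like this (a sketch only; the element IDs and the signalling transport are placeholders, and per the bug above it may still fail in Chromium-based browsers):
// Sending side: capture the playing video element and hand the stream to simple-peer.
const sourceVideo = document.getElementById('source-video');
const captured = sourceVideo.captureStream();

const peer = new SimplePeer({ initiator: true, stream: captured });
peer.on('signal', (data) => {
  // forward `data` to the other peer over your signalling channel
});

// Receiving side: attach the incoming stream to a video element.
// peer.on('stream', (remoteStream) => {
//   document.getElementById('remote-video').srcObject = remoteStream;
// });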

WebRTC addTrack / addStream after createOffer

I have created a WebRTC Channel for Text Chat only.
When user wishes to have Video / Audio chat, they can turn on the Video / Audio by pressing a button.
In WebRTC, we need to call navigator.getUserMedia and add the video stream before we call createOffer.
But there is a problem: as soon as we call getUserMedia, the browser asks the user for permission to access the camera, even though we have not started a video/audio chat yet.
My question is, is there a way for us to call navigator.getUserMedia at later stage and add the stream into the connection?
Yes! You need to do a re-negotiation. It is just another round of Offer/Answer. It can be done at any time, and by either side.
Check out play-from-disk-renegotiation; it shows a peer adding and removing video after it has connected.
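A rough sketch of that flow in the browser (assuming an already-connected RTCPeerConnection named pc and a sendToPeer() signalling helper, both placeholder names):
// Start with a data-channel-only connection; add media later and renegotiate.
async function startVideoChat(pc) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  // Adding tracks requires a fresh offer/answer round.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToPeer({ type: 'offer', sdp: pc.localDescription });
}

// The answering side calls setRemoteDescription(), createAnswer() and
// setLocalDescription() again, exactly as it did for the first negotiation.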

WebRTC: View self-view while muting outgoing video in a call

Currently, the video mute functionality in WebRTC is achieved by setting the enabled property of a video track to false:
stream.getVideoTracks().forEach(function (track) {
  track.enabled = false;
});
But the above code does not only mute the outgoing video: the local self-view, which is rendered from that same local stream, also gets black frames.
Is there a way, to ONLY mute the outgoing video frames, but still be able to show a local self-view?
There's no easy way yet. Once MediaStreamTrack.clone() is supported by browsers, you could clone the video track to get a second instance of it with a separately controllable mute property, and send one track to your self-view and the other to the peerConnection. This would let you turn off video locally and remotely independently.
Today, the only workarounds I know of would be to call getUserMedia twice on Chrome (should work on https at least, where permissions will be persisted so the user won't be prompted twice) which would get you two tracks you could video-mute independently, or on Firefox you could use RTCRtpSender.replaceTrack() with a second "fake" video stream from getUserMedia using the non-standard { video: true, fake: true } constraint like this.
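For reference, once clone() is available the approach could look roughly like this (a sketch only; selfViewEl and peerConnection are placeholder names for your own objects):
// Keep the original track for the self-view, send a clone of it to the peer.
const localVideoTrack = mediaStream.getVideoTracks()[0];
const outgoingVideoTrack = localVideoTrack.clone();

selfViewEl.srcObject = new MediaStream([localVideoTrack]);
peerConnection.addTrack(outgoingVideoTrack, mediaStream);
mediaStream.getAudioTracks().forEach(function (track) {
  peerConnection.addTrack(track, mediaStream);
});

// Mute only what the remote side sees; the self-view keeps rendering.
outgoingVideoTrack.enabled = false;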

webrtc video stream stop sharing

I have created a WebRTC based video chat using peerjs.
The local and remote video elements are created with the following markup:
local:
<video id="[local_peer_id]" autoplay="true" controls="true"></video>
remote:
<video id="[remote_peer_id]" autoplay="true" controls="true"></video>
Now, during the video chat, if the local user mutes audio the remote user cannot hear anything, and that works perfectly.
The problem is with the video. If the local user pauses his own video, he can see that the video is paused, but the remote user can still see his video live.
On the other hand, if the remote user pauses his video, the local user can still see it live.
Can anyone tell me what I need to do to implement a "pause" and "resume" video feature that works in real time for both peers?
You need to know the difference between the HTML tags and the WebRTC streams...
You can have streams running without having them attached to any HTML tag, and the media can still be sent and received by each peer. Each peer can attach a stream to an audio/video tag, and the tag then only acts as a player for a stream that is already running.
So, if you mute the HTML tag, you are only muting the player, not the stream. If you want anything to have an effect on the other peer, you need to act on the stream or on the peer connection.
In particular, to pause and resume audio or video, you need to toggle the media tracks in the media stream:
// create a button to toggle video
var button = document.createElement("button");
button.appendChild(document.createTextNode("Toggle Hold"));
button.onclick = function () {
  mediaStream.getVideoTracks()[0].enabled =
    !mediaStream.getVideoTracks()[0].enabled;
};
To pause/resume audio, use getAudioTracks() instead.
Calling mediaStream.stop() will stop the camera, where mediaStream is the stream you got when calling getUserMedia.
Here is a working example:
mediaStream.getAudioTracks()[0].stop();
mediaStream.getVideoTracks()[0].stop();
Hope this works with the new standards; it's working fine in my app.

how to connect disconnect the camera device using getUserMedia and webRTC

I am creating an audio/video and chat application using WebRTC and Node.js. I need to mute and unmute the camera device.
Presently I am able to disconnect, and the other party can no longer see me, but the problem is that it doesn't disconnect the camera: it remains active and connected, as the camera light is still on.
I need help with how to disconnect the camera when muted and connect it back when unmuted. I want the same behaviour as in a Skype video call.
It varies a bit between Firefox and Chrome. These steps, in this order, work for me.
1) Set the src property on your video element to the empty string ''.
2) Make sure the stop method exists before calling it as a function. Firefox doesn't have it, and if you try to run it, your code will throw an error.
if (localStream && localStream.stop) {
  localStream.stop();
}
3) After you call localStream.stop() (or not), set localStream = null. (Maybe not strictly necessary, but it can't hurt to let the object get garbage-collected. And when the user asks to start the camera again, you can check the variable to see whether you need to clean up after the previous stream before starting a new one.)
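Putting those three steps together (a sketch; videoEl and localStream are placeholder names for your own variables):
function stopCamera() {
  videoEl.src = '';                        // 1) detach the video element
  if (localStream && localStream.stop) {   // 2) stop only if the method exists
    localStream.stop();
  }
  localStream = null;                      // 3) drop the reference
}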
When you get your media, keep the localstream in a variable in your success callback. Then, when you want to stop the stream, you can call localstream.stop();
To start again, you can simply call getUserMedia() again.
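In current browsers stop() lives on the individual tracks rather than on the stream, so a stop/restart cycle could look roughly like this (a sketch; names are placeholders):
let localStream = null;

async function startCamera(videoEl) {
  localStream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  videoEl.srcObject = localStream;
}

function stopCamera(videoEl) {
  if (localStream) {
    // Stopping every track releases the camera, so the LED goes off.
    localStream.getTracks().forEach((track) => track.stop());
    localStream = null;
  }
  videoEl.srcObject = null;
}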