I am using RTC for Web. The "unmute-audio" callback is not triggering when the remote user clicks the unmute button in the mobile app, but the "stream-added" callback does trigger. Below is the function used for mute and unmute:
client.on("stream-subscribed", function(evt){
var stream = evt.stream;
// Mutes the remote stream.
stream.unmuteAudio();
});
Try this.
"unmute-audio" callback will give userID of remote user who unmuted himself.
As I mentioned in the question, "stream-added" is triggering instead of "unmute-audio", so it is possible to get the same userId from the "stream-added" callback as well. With that I can now update the mute and unmute status in the UI.
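A minimal sketch of that wiring (assuming Agora Web SDK 3.x event payloads; updateMuteStatus is a hypothetical UI helper):

client.on("unmute-audio", function (evt) {
  // Fires with the uid of the remote user who unmuted themselves.
  updateMuteStatus(evt.uid, /* muted */ false); // updateMuteStatus is hypothetical
});
client.on("mute-audio", function (evt) {
  updateMuteStatus(evt.uid, true);
});
client.on("stream-added", function (evt) {
  // Fallback: the same uid is available from the added stream.
  updateMuteStatus(evt.stream.getId(), false);
});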
Here is a demo with the mute/unmute logic of the Agora client lib.
The main problem is that the MediaStreamTrack switches to the 'ended' state after setMuted or setEnabled and does not go back to the 'live' state after the reverse call on those LocalAudioTrack methods, so I can't use it with an AudioContext for audio processing.
Even the volume-indicator event on the AgoraClient stops firing after mute and unmute actions on the LocalAudioTrack.
So what is the proper way to mute and unmute, and to get the actual active native MediaStreamTrack?
You should be able to use the setMuted method, which doesn't destroy the track (it only stops publishing it), so for your use case the processing on the track can continue.
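A minimal sketch of that approach, assuming the Agora Web SDK 4.x API (the AudioContext wiring is illustrative):

// (inside an async function)
const localAudioTrack = await AgoraRTC.createMicrophoneAudioTrack();

// Feed the native track into an AudioContext for processing.
const audioContext = new AudioContext();
const source = audioContext.createMediaStreamSource(
  new MediaStream([localAudioTrack.getMediaStreamTrack()])
);

// setMuted() stops publishing but should keep the native track 'live',
// so the AudioContext graph stays intact across mute/unmute.
await localAudioTrack.setMuted(true);
await localAudioTrack.setMuted(false);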
I'm unable to reproduce the volume-indicator event stopping after mute as you described; I'm attaching a screenshot of the log below.
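For reference, a sketch of how the event can be logged to verify this (Agora Web SDK 4.x assumed):

client.enableAudioVolumeIndicator();
client.on("volume-indicator", (result) => {
  // result is an array of { uid, level } entries, emitted about every 2 seconds
  result.forEach(({ uid, level }) => console.log("uid", uid, "level", level));
});
await localAudioTrack.setMuted(true);
await localAudioTrack.setMuted(false); // the event kept firing here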
I'm really confused about how the notification listener works in Expo, especially for a killed app. I read the docs and learned that the addNotificationResponseReceivedListener listener is used when the app is in the background or killed. I need to know how this listener works, and when I unsubscribe the listener, does it keep listening?
This callback is called after the user taps on the notification and the app opens. You will receive an object with the push notification info (like title, message, and extra data), and from there you can decide whether to redirect the user somewhere or just resume the app flow.
If the app is not subscribed (i.e., you didn't generate the push notification token), you will not receive notifications.
If you simply didn't register the callback (responseListener.current = Notifications.addNotificationResponseReceivedListener(response => {...})), your app will show the notifications anyway; normally you only use the callback when you want to do some action with the notification info.
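For reference, a minimal sketch of registering and cleaning up the listener with expo-notifications (navigateTo and data.screen are placeholders):

import * as Notifications from 'expo-notifications';
import { useEffect } from 'react';

function useNotificationTaps() {
  useEffect(() => {
    const subscription = Notifications.addNotificationResponseReceivedListener(
      (response) => {
        const data = response.notification.request.content.data;
        // e.g. navigateTo(data.screen); // hypothetical helper and field
      }
    );
    // Once remove() is called, this callback stops firing; the OS will
    // still display incoming notifications on its own.
    return () => subscription.remove();
  }, []);
}

If the app was launched from a killed state by tapping a notification, Notifications.getLastNotificationResponseAsync() can be used after startup to retrieve that tap.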
Hope this info helps.
I have created a WebRTC Channel for Text Chat only.
When user wishes to have Video / Audio chat, they can turn on the Video / Audio by pressing a button.
In WebRTC, we need to call navigator.getUserMedia and add the video stream before we createOffer.
But there is a problem: as soon as we call getUserMedia, the browser will ask the user for permission to access the camera, even though we have not initiated a Video / Audio chat yet.
My question is: is there a way for us to call navigator.getUserMedia at a later stage and add the stream to the connection?
Yes! You need to do a re-negotiation. It is just another round of Offer/Answer. It can be done at any time, and by either side.
Check out play-from-disk-renegotiation; it shows a peer adding and removing video after it has connected.
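In the browser, a minimal sketch of deferring getUserMedia and renegotiating could look like this (pc is your existing RTCPeerConnection, and signaling stands in for whatever channel already carries your SDP):

// Renegotiate whenever tracks are added later.
pc.onnegotiationneeded = async () => {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signaling.send({ sdp: pc.localDescription });
};

// Called only when the user presses the "start video" button,
// so the permission prompt appears at that moment.
async function startVideo() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream); // triggers 'negotiationneeded'
  }
}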
I have a button for disabling the local video track:
button.onclick = function () {
  var videoTrack = mediaStream.getVideoTracks()[0];
  videoTrack.enabled = !videoTrack.enabled;
};
I want to be able to detect this on the remote side, so that I can replace the view with a user-friendly image instead of showing a black screen.
Are there any events fired or any properties of the stream that the remote user can check on its local stream object that indicate that the other user shut off their video?
No, there is no direct way to identify the remote video muted state.
You need to pass the video-disabled event to the remote user via signalling (over WebSocket), or you can use a data channel to relay the video disabled/enabled events.
You can predict the remote video state from peerConnection stats, but those depend on bandwidth/network fluctuations.
Moreover, the browser will still send some video data (empty/black frames) when the video track is disabled (mediaStream.getVideoTracks()[0].enabled = false).
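A minimal sketch of the data-channel approach (pc, button, mediaStream, and placeholderImage are assumed to already exist in your app):

// Sender: toggle the track, then tell the other peer about it.
const muteChannel = pc.createDataChannel("mute-events");
button.onclick = function () {
  const track = mediaStream.getVideoTracks()[0];
  track.enabled = !track.enabled;
  if (muteChannel.readyState === "open") {
    muteChannel.send(JSON.stringify({ videoEnabled: track.enabled }));
  }
};

// Receiver: show a friendly image while the video is disabled.
pc.ondatachannel = (event) => {
  event.channel.onmessage = (msg) => {
    const { videoEnabled } = JSON.parse(msg.data);
    placeholderImage.style.display = videoEnabled ? "none" : "block";
  };
};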
I have created a WebRTC-based video chat using PeerJS.
The local and remote video elements are created with the following markup:
local:
<video id="[local_peer_id]" autoplay="true" controls="true"></video>
remote:
<video id="[remote_peer_id]" autoplay="true" controls="true"></video>
Now, during the video chat, if the local user mutes audio the remote user cannot hear anything, and that works perfectly.
The problem is with the video. If the local user pauses his own video, he can see that the video is paused, but the remote user can still see his video live.
On the other hand, if the remote user pauses his video, the local user can still see his video live.
Can anyone tell me what I need to do to implement a "pause" and "resume" video feature that works in real time for both peers?
You need to understand the difference between the HTML tags and the WebRTC streams.
You can have streams running without having them attached to any HTML tag, and the media can still be sent and received by each peer. Each peer can attach the stream to an audio/video tag, and the tag will only act as a player for a stream that is already running.
So, if you mute the HTML tag, you will only be muting the player, not the stream. If you want anything to take effect on the other peer, you need to act on the stream or on the peer connection.
In particular, to pause and resume audio or video, you need to toggle the media tracks in the media stream:
// create a button to toggle video
var button = document.createElement("button");
button.appendChild(document.createTextNode("Toggle Hold"));
button.onclick = function () {
  var videoTrack = mediaStream.getVideoTracks()[0];
  // Disabling the track pauses what the other peer receives;
  // re-enabling it resumes the video.
  videoTrack.enabled = !videoTrack.enabled;
};
To pause/resume audio, use getAudioTracks() instead.
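For example, with the same mediaStream:

var audioTrack = mediaStream.getAudioTracks()[0];
audioTrack.enabled = !audioTrack.enabled; // mute/unmute the sent audio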
Calling mediaStream.stop() will stop the camera, where mediaStream is the stream you got when calling getUserMedia. Note that MediaStream.stop() has since been deprecated in favour of stopping the individual tracks, as below.
Here is a working example:
mediaStream.getAudioTracks()[0].stop();
mediaStream.getVideoTracks()[0].stop();
Hopefully this works with the new standards; it's working fine in my app.