WebRTC stop local tracks not disabling webcam after using replaceTrack - webrtc

I'm trying to turn off the browser's webcam-in-use indicator by stopping all local tracks with track.stop(), like this:
myLocalStream.getTracks().forEach(track => {
  track.stop();
});
In most cases this works fine. But if I have switched between different cameras during the session using replaceTrack(), it does not: the browser still shows the webcam as active. The code for switching between tracks looks like this:
pc.getSenders().forEach(sender => {
  const newTrack = myLocalStream.getTracks().find(t => t.kind === sender.track.kind);
  sender.replaceTrack(newTrack);
});
It looks like after replacing the tracks, the browser still holds references to running tracks that I don't know how to stop. Any suggestions?
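One likely cause (hedged, since the full code isn't shown): replaceTrack() only swaps what the sender transmits; it never stops the track it replaces. If each camera switch acquired a new track via getUserMedia(), the previously active tracks keep the camera busy unless each of them is stopped explicitly. A minimal sketch, assuming you can hook every place a stream is acquired (the helper names are hypothetical):

```javascript
// Hypothetical helpers: remember every track ever acquired, then stop them all
// on hang-up, including tracks that replaceTrack() swapped out earlier.
const acquiredTracks = new Set();

function rememberTracks(stream) {
  // Call this after every getUserMedia() call, including camera switches.
  stream.getTracks().forEach(track => acquiredTracks.add(track));
}

function stopAllAcquiredTracks() {
  acquiredTracks.forEach(track => track.stop());
  acquiredTracks.clear();
}
```

Stopping the replaced tracks as well as the current ones should release the browser's webcam indicator.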

Related

Tokbox audioLevelUpdated event not working in firefox

I'm experiencing an issue when trying to capture the audio levels from the Tokbox publisher. My code works perfectly on Chrome but is not working as expected on Firefox.
Here is my code:
this.publisher.on('audioLevelUpdated', (event) => {
  console.log("event.audioLevel: " + event.audioLevel);
});

Agora.io User can't display the other user but can send self image and display it

I am using the Agora SDK on the web. It works completely fine on the Google Chrome and Firefox browsers on macOS and Windows, but on Safari the user can send his own image and see himself, yet he can't see the person he is in a video call with.
Agora's Web SDK leverages the WebRTC API, and for this reason the Safari browser has some limitations. Your question does not include much detail or code, so it's difficult to make a definitive diagnosis, but it sounds like the browser's autoplay policy is blocking the remote video stream from playing. Usually the autoplay policy does not affect local streams (because a local stream does not play its own audio), so you only need to deal with the remote streams.
There are two options for working around the autoplay policy.
Option 1: Bypass the autoplay block when the playback fails.
When detecting an autoplay block, instruct the user to click on the webpage to resume the playback:
stream.play("agora_remote" + stream.getId(), function (err) {
  if (err && err.status !== "aborted") {
    // The playback failed. Guide the user to resume it by clicking.
    document.querySelector("#agora_remote" + stream.getId()).onclick = function () {
      stream.resume().then(
        function (result) {
          console.log('Resume succeeds: ' + result);
        }
      ).catch(
        function (reason) {
          console.log('Resume fails: ' + reason);
        }
      );
    };
  }
});
Option 2: Bypass the autoplay block in advance.
If you prefer dealing with the autoplay block in advance, choose one of the following methods:
Method one: Play the stream without sound first by calling Stream.play("ID", { muted: true }), since autoplay without sound is allowed.
Method two: In your UI design, instruct the user to interact with the webpage, either by clicking or touching, to play the stream.
For more details about the two implementations for bypassing the autoplay policy in advance, I would recommend taking a look at Agora's "Deal with autoplay policy" guide.
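Method one can be sketched as follows. This is a generic sketch, not definitive Agora API guidance: the stream is assumed to expose a play(elementId, { muted: true }) method as in Agora's Web SDK, and the unmute action is passed in as a callback so you can use whatever your SDK version provides:

```javascript
// Sketch: start playback muted (muted autoplay is permitted), then run the
// caller-supplied unmute action after the user's first click on the page.
function playMutedThenOnGesture(stream, elementId, pageTarget, onGesture) {
  stream.play(elementId, { muted: true });

  const handler = () => {
    onGesture(stream); // e.g. restore the volume, or replay with sound
    pageTarget.removeEventListener('click', handler);
  };
  pageTarget.addEventListener('click', handler);
}
```

A plausible call would be playMutedThenOnGesture(remoteStream, "agora_remote" + remoteStream.getId(), document, s => s.setAudioVolume(100)), though the exact unmute API depends on the SDK version, so check the guide linked above.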

Websocket disconnecting when app is on background (REACT NATIVE)

I'm building an "Uber-like" app and using a websocket to keep an eye on the driver (to get his location). Within 1–10 minutes (it varies), while the app is in the background, the connection dies. I'm guessing it's not the websocket itself, since the onclose event doesn't even fire. It's as if something is force-closing the app. Does anyone have a clue what is happening? I'm using Expo.
The websocket function is this one (I'm using reconnecting-websocket):
connect = () => {
  var URL = 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXxx';
  var token = 'XXXXXXXXXXXXXXXXXXXXXXX';
  var ws = new ReconnectingWebSocket(URL, token, { debug: true, reconnectInterval: 3000 });
  ws.onopen = () => {
    console.log('Entered!');
  };
  ws.onclose = () => {
    console.log('Left!');
  };
}
The "Entered!" is printed, the "Left!" isn't.
The operating system is killing your app after some period in the background. You can use Headless JS (Android only) to run background tasks periodically (though it can't keep a websocket connection active), or libraries like react-native-background-task or react-native-background-fetch, which work on both iOS and Android but can't run tasks more often than once per 15 minutes. To my knowledge, the only way to achieve what you want is to write separate native modules for Android and iOS suited to your needs (and even that isn't guaranteed to work, as operating systems nowadays are eager to kill battery-draining background apps). Consider enforcing foreground mode for your app.
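A more modest mitigation: rather than fighting the OS, accept the disconnect and reconnect when the app returns to the foreground. In React Native you would drive this from AppState.addEventListener('change', ...); the decision logic below is a pure-function sketch (the helper name is hypothetical):

```javascript
// Decide whether to reconnect: the app just came back to the foreground
// and the socket is closed (WebSocket.CLOSED === 3).
function shouldReconnect(prevState, nextState, socketReadyState) {
  const cameToForeground = prevState !== 'active' && nextState === 'active';
  const socketClosed = socketReadyState === 3;
  return cameToForeground && socketClosed;
}
```

In the AppState handler you would call connect() again whenever this returns true; the server then sees a gap in location updates while the app was backgrounded, which any "Uber-like" backend needs to tolerate anyway.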

ASP.NET Core (3.0) Rangeprocessing Videostream

I'm currently experiencing an issue. I'm trying to serve a video file via FileStreamResult. Basically my code looks like this:
public IActionResult Video([FromQuery] int fileId)
{
    var sfs = _sourceService.GetStreamByFileId(fileId);
    return new FileStreamResult(sfs, new MediaTypeHeaderValue("video/mp4")) { EnableRangeProcessing = true };
}
In the next step I try to present the video on a website. For this example, I've used the basic HTML5 video player:
<video controls width="640" height="264">
  <source src="UrlToVideoAction" type="video/mp4">
</video>
Everything works fine. But if I try to skip to a later position that isn't buffered yet, the video player stops buffering/seeking. My application doesn't throw any exception.
Then I analysed the requests and responses. I noticed that my browser sends the following Range header when I skip to a position that isn't buffered:
Range: bytes=101318656-
It seems like the browser tries to fetch the rest of the resource. The video player stops playing, and if I retry skipping to a new position, no further request is sent. I've tested this behaviour with Firefox, Chrome and Edge. My test file is about 250 MB. I also tested several other files, all in MP4 format (H.264, AAC).
In short, my goal is to serve a video file from ASP.NET Core in a way that supports skipping to arbitrary positions. Thanks for any help.

WebRTC: View self-view while muting outgoing video in a call

Currently, the video-mute functionality in WebRTC is achieved by setting the enabled property of a video track to false:
stream.getVideoTracks().forEach(function (track) {
  track.enabled = false;
});
But the above code not only mutes the outgoing video; the local self-view, which is rendered from that same local stream, also gets black frames.
Is there a way, to ONLY mute the outgoing video frames, but still be able to show a local self-view?
There's no easy way yet. Once MediaStreamTrack.clone() is supported by browsers, you could clone the video track to get a second instance of it with a separately controllable enabled property, and send one track to your self-view and the other to the peer connection. This would let you turn off video locally and remotely independently.
Today, the only workarounds I know of are to call getUserMedia twice on Chrome (this should work on HTTPS at least, where permissions are persisted so the user won't be prompted twice), which gets you two tracks you can video-mute independently, or, on Firefox, to use RTCRtpSender.replaceTrack() with a second "fake" video stream from getUserMedia using the non-standard { video: true, fake: true } constraint.
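On browsers where MediaStreamTrack.clone() is available, the clone-based approach described above could be sketched like this (the helper name is hypothetical):

```javascript
// Keep the original track for the self-view and send a clone to the peer
// connection; the outgoing clone can then be disabled without blacking out
// the local preview, because each clone has its own enabled flag.
function splitSelfViewAndOutgoing(videoTrack) {
  const outgoingTrack = videoTrack.clone();
  return {
    selfViewTrack: videoTrack,
    outgoingTrack,
    muteOutgoing(muted) {
      outgoingTrack.enabled = !muted; // self-view stays enabled
    },
  };
}
```

You would then attach selfViewTrack to the local video element and pass outgoingTrack to pc.addTrack(), calling muteOutgoing(true) to black out only the remote side.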