MediaRecorder has a delay of multiple seconds

I'm trying to use a MediaRecorder to record a MediaStream and display it in a video element using a MediaSource. So the setup looks like:
Request a MediaStream from the browser
Add it to the MediaRecorder
Add the recorded blobs to the MediaSource Buffer
The result looks very good, but there is one problem: there is a delay in the playback.
When displaying the MediaStream directly there is no delay, so I ruled out the first step as the source of the problem.
It seems like either the MediaRecorder or the MediaSource adds a delay of about 3 seconds to the stream.
this.screenRecording = await mediaDevices.getDisplayMedia({ video: { frameRate: 60, resizeMode: 'none' } });
const mediaRecorder = new MediaRecorder(this.screenRecording);
mediaRecorder.ondataavailable = async (event: any) => {
    if (this.screenReceiving.readyState === 'open') {
        if (this.screenReceivingBuffer == null) {
            this.screenReceivingBuffer = this.screenReceiving.addSourceBuffer('video/webm;codecs=vp8');
        }
        if (!this.screenReceivingBuffer.updating) {
            this.screenReceivingBuffer.appendBuffer(await new Response(event.data).arrayBuffer());
        }
    }
};
mediaRecorder.start(16);
The above code is just copied & pasted from the actual project, so please don't expect it to work by copy & paste ;)
Does anyone have an idea why this delay exists?
Any ideas on how to tweak the browser to not add this delay?
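MediaRecorder buffers frames internally before emitting chunks, and in practice browsers may hand out dataavailable chunks far less often than a 16 ms timeslice; the MediaSource pipeline then buffers again before playback, which is where a few seconds of latency can accumulate. One common mitigation, sketched here assuming video is the element playing the MediaSource, is to keep the playback position pinned near the live edge of the buffered range:

// `video` is assumed to be the <video> element fed by the MediaSource.
setInterval(() => {
    if (video.buffered.length > 0) {
        const liveEdge = video.buffered.end(video.buffered.length - 1);
        // Only seek if playback has fallen noticeably behind the live edge.
        if (liveEdge - video.currentTime > 0.5) {
            video.currentTime = liveEdge - 0.1;
        }
    }
}, 1000);

A gentler variant of the same idea is to raise playbackRate slightly above 1 until the gap closes instead of seeking.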

Related

WebRTC replaceTrack, getStats not returning audioInputLevel

I have a WebRTC stream which is sending audio/video, and I display the volume in a meter widget using values retrieved from a getStats call on the peerConnection.
peerConnection.getStats(function (stats) {
    var results = stats.result()
    for (let i = 0; i < results.length; i++) {
        var res = results[i]
        if (res.type == 'ssrc') {
            volume = parseInt(res.stat('audioInputLevel'))
        }
    }
})
This works fine. The issue is that when I run replaceTrack to update the stream's audio/video, the above getStats returns 0 for the audio level.
navigator.mediaDevices.getUserMedia(media)
    .then(stream => {
        const tracks = stream.getTracks()
        peerConnection.getSenders()
            .forEach(sender => {
                const newTrack = tracks.find(track => track.kind === sender.track.kind)
                sender.replaceTrack(newTrack)
            })
    })
The local stream gets updated, the remote user gets updated, and audio/video is working. But getStats is no longer returning the audioInputLevel.
Would anyone be able to help me understand why, or what a fix might be?
Thanks
audioLevel is broken in spec-stats, see https://bugs.chromium.org/p/chromium/issues/detail?id=920630#c16 and the linked bugs.
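If you move to the spec-compliant, promise-based getStats, the equivalent value is reported on the sender-side 'media-source' stats as audioLevel (a number between 0 and 1 rather than the legacy integer), though it may still be affected by the bug linked above. A rough sketch:

// Spec getStats on the same peerConnection; no selector, returns a promise.
peerConnection.getStats().then(report => {
    report.forEach(stat => {
        if (stat.type === 'media-source' && stat.kind === 'audio') {
            volume = stat.audioLevel;   // may be 0/undefined while the Chromium bug is open
        }
    });
});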

setSinkId changes multiple audio outputs

Here is the problem,
First I enumerate all the devices I have available into select elements:
navigator.mediaDevices.enumerateDevices()
When I change one output, the audio plays on the device I chose.
HTMLMediaElement.setSinkId(deviceId)
But if I then play another audio element and change its output device (setSinkId), the first one also switches to the last deviceId, so both sounds play on the same device.
Do I need the latest adapter.js version to handle this properly?
********* EDITED **********
Following the comment above, I tried Web Audio, but without success. With getUserMedia everything works fine.
navigator.getUserMedia({ audio: true, video: false },
    function (mediaStream) {
        // Create an audio context for the audio
        var ac = new (window.AudioContext || window.webkitAudioContext)();
        // Clone the stream, otherwise the id of every stream is the default
        //var streamClone = stream.clone();
        var ss = ac.createMediaStreamSource(mediaStream);
        // Create a destination
        var sd = ac.createMediaStreamDestination()
        ss.connect(sd);
        element.srcObject = sd.stream;
        // Play the sound
        element.play();
        element.setSinkId(deviceId).then(function () {
            console.log('Set deviceId(' + deviceId + ') in the selected audio element');
        });
    },
    function (error) {
        console.log(error);
    }
);
But using my remote stream, I cannot get any sound:
var ac = new (window.AudioContext || window.webkitAudioContext)();
// Create a clone of the stream, if not the id of all the stream is default
var streamClone = stream.clone();
var ss = ac.createMediaStreamSource(stream);
// Create a destination
var sd = ac.createMediaStreamDestination()
ss.connect(sd);
// Element is my HTMLMediaElement
element.srcObject = sd.stream;
// Play the sound
element.play();
element.setSinkId(deviceId).then(function () {
    console.log('Set deviceId(' + deviceId + ') in the selected audio element');
});
This is most likely caused by how Chrome renders audio. See here for a description, which also suggests using Web Audio to work around the problem.
adapter.js cannot fix this.
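As for the remote stream being silent through Web Audio: Chrome has historically only fed a remote WebRTC stream into an AudioContext once that stream is also attached to a media element somewhere. A commonly suggested workaround, sketched here under that assumption with the same stream, element and deviceId as in the question, is to park the remote stream in a muted element before creating the source node:

// Keep-alive element, never added to the DOM; it forces Chrome to pull audio from the stream.
var keepAlive = new Audio();
keepAlive.muted = true;
keepAlive.srcObject = stream;

var ac = new (window.AudioContext || window.webkitAudioContext)();
var ss = ac.createMediaStreamSource(stream);
var sd = ac.createMediaStreamDestination();
ss.connect(sd);

element.srcObject = sd.stream;
element.play();
element.setSinkId(deviceId);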

WebRTC - Change device/camera in realtime

I'm having a problem trying to change my camera in real time. It works for the local video, but the remote person cannot see the new camera and still sees the old one. I tried to stop the stream and init again, but it is still not working. This is just some of my code.
I have searched everywhere and I can't find a solution. Can someone help me out?
function init() {
    getUserMedia(constraints, connect, fail);
}

$(".webcam-devices").on('change', function () {
    var deviceID = this.value;
    constraints.video = {
        optional: [{
            sourceId: deviceID
        }]
    };
    stream.getTracks().forEach(function (track) { track.stop(); });
    init();
});
You need to actually change the track you're sending in the PeerConnection. In Firefox, you can use RTCRtpSender.replaceTrack(new_track); to change it without renegotiation (this is being added to the spec now). Otherwise, you need to add the new stream/track to the RTCPeerConnection, remove the old one, and then handle the onnegotiationneeded event and renegotiate.
See one of jib's fiddles, his replaceTrack() fiddle:
function flip() {
    flipped = 1 - flipped;
    return pc1.getSenders()[0].replaceTrack(streams[flipped].getVideoTracks()[0])
        .then(() => log("Flip! (notice change in dimensions & framerate!)"))
        .catch(failed);
}
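Adapted to the device-change handler from the question, a sketch of the replaceTrack approach could look like the following; peerConnection is assumed to be your RTCPeerConnection, localVideoElement is a placeholder for your local preview element, and the exact deviceId constraint replaces the old optional/sourceId syntax:

$(".webcam-devices").on('change', function () {
    var deviceID = this.value;
    navigator.mediaDevices.getUserMedia({ video: { deviceId: { exact: deviceID } } })
        .then(function (newStream) {
            var newTrack = newStream.getVideoTracks()[0];
            // Swap the outgoing video track without renegotiating.
            var sender = peerConnection.getSenders().find(function (s) {
                return s.track && s.track.kind === 'video';
            });
            var swap = sender ? sender.replaceTrack(newTrack) : Promise.resolve();
            return swap.then(function () {
                // Stop the old local tracks and show the new stream locally.
                stream.getVideoTracks().forEach(function (track) { track.stop(); });
                stream = newStream;
                localVideoElement.srcObject = newStream;
            });
        })
        .catch(function (err) { console.log(err); });
});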

WebRTC: Switch from Video Sharing to Screen sharing during call

Initially, I had two different webpages:
One was for the video call, and
the other was for screen sharing.
Now, I want to do both of them in one page.
Here is the scenario:
During Live call, a user wants to stop sharing his/her video and start sharing screen.
Afterwards, again he/she wishes to turn off screen sharing and start video sharing.
For clarity, here are some questions I want to ask:
On Caller Side:
1) How can I change my local stream from video to screen and vice versa?
2) Once it is done, how can I assign it to the local video element?
On Callee Side:
1) How do I handle if the current stream I am receiving is changed from video to screen?
2) How do I handle if the stream I am receiving has stopped? I mean, now I can receive neither video nor screen (just audio)
Kindly help me in this regard. If there are any open source examples available, kindly share their links too.
Just for your reference, I was trying to handle it using the following code. (I know this is naive and won't work.)
function handleUserMedia(newStream) {
    var localvideo = document.getElementById("localvideo");
    localvideo.src = URL.createObjectURL(newStream);
    localStream = newStream;
    sendMessage('got user media');
    if (isInitiator) {
        maybeStart();
    }
}

function handleUserMediaError(error) {
    console.log(error);
}

var video_constraints = {video: true, audio: true};
var screen_constraints = {video: { mandatory: { chromeMediaSource: 'screen' } }};

getUserMedia(video_constraints, handleUserMedia, handleUserMediaError);
//getUserMedia(screen_constraints, handleUserMedia, handleUserMediaError);

$scope.btnLabel = 'Share Screen';
$scope.toggleSelected = function () {
    $scope.selected = !$scope.selected;
    if ($scope.selected) {
        getUserMedia(screen_constraints, handleUserMedia, handleUserMediaError);
        $scope.btnLabel = 'Share Video';
    } else {
        getUserMedia(video_constraints, handleUserMedia, handleUserMediaError);
        $scope.btnLabel = 'Share Screen';
    }
};
Check this demo:
https://www.webrtc-experiment.com/demos/switch-streams.html
and the relevant tutorial:
https://www.webrtc-experiment.com/docs/how-to-switch-streams.html
Simply renegotiate the peer connections on both users' sides!
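A rough sketch of that renegotiation on the caller side, using the modern getDisplayMedia API instead of the chromeMediaSource constraint from the question; pc is assumed to be your RTCPeerConnection, while sendMessage, localStream and handleUserMediaError come from the question's setup:

// Switch the outgoing video from the camera to the screen, then renegotiate.
function shareScreen() {
    navigator.mediaDevices.getDisplayMedia({ video: true })
        .then(function (screenStream) {
            var screenTrack = screenStream.getVideoTracks()[0];
            // Drop the outgoing camera video and add the screen track;
            // this queues an onnegotiationneeded event on the connection.
            pc.getSenders()
                .filter(function (sender) { return sender.track && sender.track.kind === 'video'; })
                .forEach(function (sender) { pc.removeTrack(sender); });
            pc.addTrack(screenTrack, screenStream);
            // Update the local preview and remember the new stream.
            document.getElementById("localvideo").srcObject = screenStream;
            localStream.getVideoTracks().forEach(function (t) { t.stop(); });
            localStream = screenStream;
        }, handleUserMediaError);
}

// Renegotiate whenever the set of tracks changes; the callee applies the new
// offer and its ontrack handler starts receiving the screen instead of the camera.
pc.onnegotiationneeded = function () {
    pc.createOffer()
        .then(function (offer) { return pc.setLocalDescription(offer); })
        .then(function () { sendMessage(pc.localDescription); });  // via your signaling channel
};

On the callee side, ontrack fires again for the screen track, and the old video track's onended (or the stream's removetrack event) tells you the camera feed has gone away, so you mainly just re-attach the incoming stream to the remote video element.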

Stop Video Capture programmatically in WinJS

I have a WinJS app where I am recording video. While I can make it work, I want the camera to stop recording automatically after 15 seconds. Currently the camera records for longer than 15 seconds and then trims the video down to 15 seconds; I want it to turn off / stop recording automatically after 15 seconds. I have the following code:
function captureVideo() {
    WinJS.log && WinJS.log("", "sample", "status");
    // Using Windows.Media.Capture.CameraCaptureUI API to capture a video
    var dialog = new Windows.Media.Capture.CameraCaptureUI();
    dialog.videoSettings.allowTrimming = true;
    dialog.videoSettings.format = Windows.Media.Capture.CameraCaptureUIVideoFormat.mp4;
    dialog.videoSettings.maxDurationInSeconds = document.getElementById("txtDuration").value;
    dialog.captureFileAsync(Windows.Media.Capture.CameraCaptureUIMode.video).done(function (file) {
        if (file) {
            var videoBlobUrl = URL.createObjectURL(file, { oneTimeOnly: true });
            document.getElementById("capturedVideo").src = videoBlobUrl;
            localSettings.values[videoKey] = file.path;
        } else {
            WinJS.log && WinJS.log("No video captured.", "sample", "status");
        }
    }, function (err) {
        WinJS.log && WinJS.log(err, "sample", "error");
    });
}
The CameraCaptureUI that you are using sacrifices power for ease of use and a standard interface. If you need more control, such as the ability to start and stop the recording yourself, you should use the MediaCapture object. See my mediacap demo in codeSHOW. In it I am using MediaCapture to record audio, but you can likely figure out how to record video instead and add your own timing.
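A rough sketch of that idea with MediaCapture, assuming the Videos Library capability is declared in the app manifest and that the encoding settings still need tuning for your app:

var capture = new Windows.Media.Capture.MediaCapture();
capture.initializeAsync().then(function () {
    var profile = Windows.Media.MediaProperties.MediaEncodingProfile
        .createMp4(Windows.Media.MediaProperties.VideoEncodingQuality.auto);
    return Windows.Storage.KnownFolders.videosLibrary
        .createFileAsync("capture.mp4", Windows.Storage.CreationCollisionOption.generateUniqueName)
        .then(function (file) {
            return capture.startRecordToStorageFileAsync(profile, file).then(function () {
                // Stop the recording (and release the camera) after 15 seconds.
                setTimeout(function () {
                    capture.stopRecordAsync().then(function () {
                        document.getElementById("capturedVideo").src =
                            URL.createObjectURL(file, { oneTimeOnly: true });
                    });
                }, 15000);
            });
        });
}, function (err) {
    WinJS.log && WinJS.log(err, "sample", "error");
});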