Is there a way to edit the local video stream localStream before sending it to another peer via peerConnection()?
navigator.getUserMedia({video: true, audio: true}, function(localMediaStream) {
    var video = document.querySelector('video');
    // How do I, say, edit a few pixels in the localMediaStream before
    // using peerConnection() to send it to another peer?
}, onFailSoHard);
Here are a few ideas, some of which assume APIs that don't exist yet:
You can call getUserMedia and render the stream in a video element. Then use the MediaSource APIs to get at the buffers and manipulate them however you want.
Then capture a stream back from that video element.
It would be nice if the MediaSource APIs could generate streams for us directly, the way the Web Audio APIs do.
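In today's browsers, that round trip is actually possible via a canvas. A rough sketch, assuming a <video> showing the local stream and a <canvas> of the same size (the element lookups and the frame rate are assumptions):
var video = document.querySelector('video');
var canvas = document.querySelector('canvas');
var ctx = canvas.getContext('2d');

(function draw() {
    // copy the current camera frame onto the canvas
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    // edit a few pixels: frame.data is a flat RGBA array
    var frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
    frame.data[0] = 255; // e.g. make the first pixel red
    ctx.putImageData(frame, 0, 0);
    requestAnimationFrame(draw);
})();

// the edited pixels, as a MediaStream you can hand to the peer connection
var editedStream = canvas.captureStream(30);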
Well, you can attach streams like this (after applying some effects to the audio/video tracks):
peer.addStream(new webkitMediaStream(
    yourStream.audioTracks || yourStream.getAudioTracks(),
    yourStream.videoTracks || yourStream.getVideoTracks()
));
Hello, I am going to create a surveillance system. I would like to send both a webcam video and a shared screen, but when I use addTrack, the other side only receives the stream I added last. Is there any way to get both streams?
Thanks.
Here is the code on the offer side:
let stream = video.srcObject;
let stream2 = shareVideo.srcObject;
stream.getTracks().forEach(track => peerConnection.addTrack(track, stream));
stream2.getTracks().forEach(track => peerConnection.addTrack(track, stream2));
And here is the answer side:
peerConnections[id].ontrack = (event) => {
    console.log(event);
};
When I checked the log, each event has one track, and event.streams[0] contains a MediaStream, but event.streams[1] does not.
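For what it's worth, ontrack fires once per incoming track, and event.streams holds the stream(s) that track was associated with in addTrack on the offer side, so both streams should arrive across separate events. A sketch of grouping them back together (the element bookkeeping is an assumption):
const videoElements = {}; // stream.id -> <video>, assumed bookkeeping

peerConnections[id].ontrack = (event) => {
    const incoming = event.streams[0]; // the stream this track belongs to
    if (!videoElements[incoming.id]) {
        const el = document.createElement('video');
        el.autoplay = true;
        el.srcObject = incoming; // camera and screen land in separate elements
        document.body.appendChild(el);
        videoElements[incoming.id] = el;
    }
};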
I am taking a MediaStream and merging two separate tracks (video and audio) using a canvas and the Web Audio API. The MediaStream itself does not seem to fall out of sync, but after reading it into a MediaRecorder and buffering it into a video element, the audio always seems to play much earlier than the video. Here's the code that seems to have the issue:
let stream = new MediaStream();

// Get the mixed sources drawn to the canvas
this.canvas.captureStream().getVideoTracks().forEach(track => {
  stream.addTrack(track);
});

// Add mixed audio tracks to the stream
// https://stackoverflow.com/questions/42138545/webrtc-mix-local-and-remote-audio-steams-and-record
this.audioMixer.dest.stream.getAudioTracks().forEach(track => {
  stream.addTrack(track);
});

let mediaRecorder = new MediaRecorder(stream, { mimeType: 'video/webm;codecs=opus,vp8' });
let mediaSource = new MediaSource();
let video = document.createElement('video');
video.src = URL.createObjectURL(mediaSource);
document.body.appendChild(video);
video.controls = true;
video.autoplay = true;

// Source open
mediaSource.onsourceopen = () => {
  let sourceBuffer = mediaSource.addSourceBuffer(mediaRecorder.mimeType);
  mediaRecorder.ondataavailable = (event) => {
    if (event.data.size > 0) {
      const reader = new FileReader();
      reader.readAsArrayBuffer(event.data);
      reader.onloadend = () => {
        sourceBuffer.appendBuffer(reader.result);
        console.log(mediaSource.sourceBuffers);
        console.log(event.data);
      };
    }
  };
  mediaRecorder.start(1000);
};
AudioMixer.js
export default class AudioMixer {
  constructor() {
    // Initialize an audio context
    this.audioContext = new AudioContext();
    // Destination outputs one track of mixed audio
    this.dest = this.audioContext.createMediaStreamDestination();
    // Array of current streams in mixer
    this.sources = [];
  }

  // Add an audio stream to the mixer
  addStream(id, stream) {
    // Get the audio tracks from the stream and add them to the mixer
    let sources = stream.getAudioTracks().map(track =>
      this.audioContext.createMediaStreamSource(new MediaStream([track]))
    );
    sources.forEach(source => {
      // Add it to the current sources being mixed
      this.sources.push(source);
      source.connect(this.dest);
      // Connect to analyser to update volume slider
      let analyser = this.audioContext.createAnalyser();
      source.connect(analyser);
      ...
    });
  }

  // Remove all current sources from the mixer
  flushAll() {
    this.sources.forEach(source => {
      source.disconnect(this.dest);
    });
    this.sources = [];
  }

  // Clean up the audio context for the mixer
  cleanup() {
    this.audioContext.close();
  }
}
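For reference, a hypothetical sketch of how the mixer above might be driven (the stream variables are placeholders):
const mixer = new AudioMixer();
mixer.addStream('mic', micStream);       // e.g. a getUserMedia stream (assumed)
mixer.addStream('remote', remoteStream); // e.g. a remote peer's stream (assumed)

// one mixed audio track plus the canvas video, as in the snippet above
const combined = new MediaStream([
    ...canvas.captureStream().getVideoTracks(),
    ...mixer.dest.stream.getAudioTracks()
]);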
I assume it has to do with how the data is pushed into the MediaSource buffer, but I'm not sure. What am I doing that de-syncs the stream?
A late reply to an old post, but it might help someone ...
I had exactly the same problem: I have a video stream that should be supplemented by an audio stream, in which short sounds (AudioBuffer) are played from time to time. The whole thing is recorded via MediaRecorder.
Everything works fine on Chrome. But on Chrome for Android, all sounds were played back in quick succession; the "when" parameter for "play()" was ignored on Android. (audioContext.currentTime continued to increase over time, so that was not the issue.)
My solution is similar to Jacob's comment Sep 2 '18 at 7:41:
I created and connected a sine-wave oscillator at an inaudible 48,000 Hz, which played continuously in the audio stream during recording. Apparently this keeps the timeline progressing properly.
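In code, the workaround looks roughly like this (a sketch; audioContext and dest stand in for whatever AudioContext and MediaStreamDestination feed your recorder):
// permanently running, inaudible oscillator that keeps the audio
// timeline advancing while the MediaRecorder is active
const osc = audioContext.createOscillator();
osc.type = 'sine';
osc.frequency.value = 48000; // far above the audible range
osc.connect(dest);           // the MediaStreamAudioDestinationNode
osc.start();
// call osc.stop() once recording is finished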
An RTP endpoint that is emitting multiple related RTP streams that require synchronization at the other endpoint(s) MUST use the same RTCP CNAME for all streams that are to be synchronized. This requires a short-term persistent RTCP CNAME that is common across several RTP streams, and potentially across several related RTP sessions. A common example of such use occurs when lip-syncing audio and video streams in a multimedia session, where a single participant has to use the same RTCP CNAME for its audio RTP session and for its video RTP session. Another example might be to synchronize the layers of a layered audio codec, where the same RTCP CNAME has to be used for each layer.
https://datatracker.ietf.org/doc/html/rfc6222#page-2
There is a bug in Chrome that plays buffered media-stream audio at 44,100 Hz even when it was encoded at 48,000 Hz (which leads to gaps and video desync). All other browsers seem to play it fine. You can either switch to a codec that supports 44.1 kHz encoding, or play the file from a web link as the source (that way Chrome plays it correctly).
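If you go the codec route, a hedged sketch of probing for a supported combination before constructing the recorder (the candidate list is an assumption):
const candidates = [
    'video/webm;codecs=opus,vp8',
    'video/webm;codecs=pcm,vp8', // pcm keeps the original sample rate, if supported
];
const mimeType = candidates.find(t => MediaRecorder.isTypeSupported(t));
const recorder = new MediaRecorder(stream, { mimeType });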
I've created a webcam stream with:
navigator.getUserMedia({ "video": true }, function(stream){
    videoTag.src = window.URL.createObjectURL(stream);
    videoTag.play();
});
Can I access the MediaStream object stream from the global scope?*
(something like navigator.getAllMediaStreams[0])
*edit: ...without adding logic to the getUserMedia callback. My problem case is a QR-decoder library that gets the stream for me, and I don't want to change the third-party code.
There is no list of active media streams kept by the browser.
You can save the stream yourself, for example to window.stream.
Sure:
navigator.allMediaStreams = [];

navigator.mediaDevices.getUserMedia({ "video": true })
  .then(stream => {
    navigator.allMediaStreams.push(stream);
    console.log(navigator.allMediaStreams.length); // 1
  })
  .catch(e => console.error(e));

console.log(navigator.allMediaStreams.length); // 0
Just understand that the array will be empty until the success callback fires.
It's like any other JavaScript object. As long as you keep a reference to the stream (so it doesn't get garbage-collected) and don't call stop() on its tracks, you'll have a live video stream at your disposal, and the active-camera light, if there is one, will stay on during this time.
Also, like any other JavaScript variable, it is still tied to the page even if it hangs off navigator, so it won't survive page navigation.
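For example, a reference saved on window is also what lets you release the camera later from anywhere on the page:
if (window.stream) {
    window.stream.getTracks().forEach(track => track.stop()); // camera light goes off
    window.stream = null; // drop the reference so the object can be collected
}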
Initially, I had two different webpages:
One was to do a video call, and
the other was to do screen sharing.
Now I want to do both of them in one page.
Here is the scenario:
During a live call, a user wants to stop sharing his/her video and start sharing his/her screen.
Afterwards, he/she wishes to turn off screen sharing and resume video sharing.
For clarity, here are some questions I want to ask:
On Caller Side:
1) How can I change my local stream from video to screen, and vice versa?
2) Once that is done, how can I assign it to the local video element?
On Callee Side:
1) How do I handle it when the stream I am receiving changes from video to screen?
2) How do I handle it when the stream I am receiving has stopped? I mean, now I receive neither video nor screen (just audio).
Kindly help me in this regard. If there is any open-source code available, kindly share the links too.
Just for your reference, I was trying to handle it using the following code. (I know this is naive and won't work.)
function handleUserMedia(newStream) {
    var localvideo = document.getElementById("localvideo");
    localvideo.src = URL.createObjectURL(newStream);
    localStream = newStream;
    sendMessage('got user media');
    if (isInitiator) {
        maybeStart();
    }
}

function handleUserMediaError(error) {
    console.log(error);
}

var video_constraints = {video: true, audio: true};
var screen_constraints = {video: { mandatory: { chromeMediaSource: 'screen' } }};
getUserMedia(video_constraints, handleUserMedia, handleUserMediaError);
//getUserMedia(screen_constraints, handleUserMedia, handleUserMediaError);

$scope.btnLabel = 'Share Screen';
$scope.toggleSelected = function () {
    $scope.selected = !$scope.selected;
    if ($scope.selected) {
        getUserMedia(screen_constraints, handleUserMedia, handleUserMediaError);
        $scope.btnLabel = 'Share Video';
    } else {
        getUserMedia(video_constraints, handleUserMedia, handleUserMediaError);
        $scope.btnLabel = 'Share Screen';
    }
};
Check this demo:
https://www.webrtc-experiment.com/demos/switch-streams.html
and the relevant tutorial:
https://www.webrtc-experiment.com/docs/how-to-switch-streams.html
Simply renegotiate the peer connections on both users' sides!
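With today's API you can also avoid a full renegotiation by swapping the sender's track in place; a sketch, assuming pc is the existing RTCPeerConnection (the function and element names are made up):
async function switchToScreen(pc, localVideo) {
    const screen = await navigator.mediaDevices.getDisplayMedia({ video: true });
    const screenTrack = screen.getVideoTracks()[0];
    // find the sender currently carrying video and replace its track
    const sender = pc.getSenders().find(s => s.track && s.track.kind === 'video');
    await sender.replaceTrack(screenTrack); // the callee keeps receiving on the same track
    localVideo.srcObject = screen; // update the local preview
}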
I've integrated vLine into a test site, and I'm noticing that it uses picture-in-picture. Is that the only way this works? Is there a way to have the two streams separate?
The picture-in-picture (PIP) mode occurs when you enable the vLine UI widgets, specifically the uiVideoPanel widget. Note that "ui": true enables all widgets, including the uiVideoPanel widget.
If you want to lay out the video streams in a custom manner, you can disable the uiVideoPanel widget and handle the mediaSession:addLocalStream and mediaSession:addRemoteStream events, where you can create the HTML <video> element with stream.createMediaElement(). You can put the resulting <video> element in any div and adjust the layout with CSS.
The following snippet was lifted from the vline-shell example:
// $client is the vline.Client that you created with vline.Client.create()
$client.on('add:mediaSession', addMediaSession_, self);

// callback on new MediaSessions
function addMediaSession_(mediaSession) {
  // add event handler for add stream events
  mediaSession.on('mediaSession:addLocalStream mediaSession:addRemoteStream', function(event) {
    // get the vline.MediaStream
    var stream = event.stream;

    // guard against adding a local video stream twice if it is attached to two media sessions
    if ($('#' + stream.getId()).length) {
      return;
    }

    // create video or audio element, giving it the same id as the MediaStream
    var elem = $(event.stream.createMediaElement());
    elem.prop('id', stream.getId());

    // video-wrapper is the id of a div in the page
    $('#video-wrapper').append(elem);
  });

  // add event handler for remove stream events
  mediaSession.on('mediaSession:removeLocalStream mediaSession:removeRemoteStream', function(event) {
    $('#' + event.stream.getId()).remove();
  });
}