I've created a webcam stream with:
navigator.getUserMedia({ video: true }, function (stream) {
  videoTag.src = window.URL.createObjectURL(stream);
  videoTag.play();
}, function (err) {
  console.error(err); // handle getUserMedia errors here
});
Can I access the MediaStream object (stream) from the global scope?*
(something like navigator.getAllMediaStreams[0])
*edit: ...without adding logic to the getUserMedia call. My problem case is a QR-decoder library that gets the stream for me, and I don't want to change the third-party code.
There is no list of active media streams kept by the browser, so there is nothing like navigator.getAllMediaStreams to query.
You can, however, save the stream yourself, for example to window.stream.
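For example, a quick sketch using the same legacy callback API as the question:

navigator.getUserMedia({ video: true }, function (stream) {
  window.stream = stream; // now reachable from anywhere on this page
  videoTag.src = window.URL.createObjectURL(stream);
  videoTag.play();
}, function (err) {
  console.error(err);
});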
Sure:
navigator.allMediaStreams = [];

navigator.mediaDevices.getUserMedia({ video: true })
  .then(stream => {
    // Runs later, once the user has granted camera access.
    navigator.allMediaStreams.push(stream);
    console.log(navigator.allMediaStreams.length); // 1
  })
  .catch(e => console.error(e));

// Runs immediately, before the promise above has resolved.
console.log(navigator.allMediaStreams.length); // 0
Just understand that the array will be empty until the promise resolves and the success callback fires.
It's like any other JavaScript object. As long as you keep a reference to a stream (so it doesn't get garbage collected), and don't call stop() on its tracks, you'll have a live video stream at your disposal, and the active-camera light, if there is one, will be on during this time.
Also like any other JavaScript variable, it is still tied to the page, even if it hangs off navigator, so it won't survive page navigation.
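And like any stream, you can release the camera later by stopping every track (standard MediaStreamTrack API):

// Stop all tracks so the browser releases the camera
// and the active-camera light turns off.
navigator.allMediaStreams.forEach(stream =>
  stream.getTracks().forEach(track => track.stop())
);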
Related
Hello, I'm going to create a surveillance system. I would like to get both a webcam video and a shared screen, but using addTrack I only receive the media stream I declared later. Is there any way to get both streams?
Thanks.
Here is the code on the offer side:
let stream = video.srcObject;        // webcam stream
let stream2 = shareVideo.srcObject;  // screen-share stream
stream.getTracks().forEach(track => peerConnection.addTrack(track, stream));
stream2.getTracks().forEach(track => peerConnection.addTrack(track, stream2));
And here is the answer side:
peerConnections[id].ontrack = (event) => {
  console.log(event);
};

When I checked the log, the event had one track; event.streams[0] contained a MediaStream, but event.streams[1] did not.
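(Worth knowing when reading that log: ontrack fires once per incoming track, and event.streams lists the stream(s) that track was added with, so the two streams arrive across separate events. A minimal sketch, not from the original post, of collecting them:)

// Collect every distinct remote stream across ontrack events.
const remoteStreams = new Set();
peerConnections[id].ontrack = (event) => {
  event.streams.forEach(s => remoteStreams.add(s));
  console.log('distinct streams so far:', remoteStreams.size); // should reach 2
};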
I'm trying to use a MediaRecorder to record a MediaStream and display it in a video element using a MediaSource. So the setup looks like:
Request a MediaStream from the browser
Add it to the MediaRecorder
Add the recorded blobs to the MediaSource Buffer
The result looks very good but there is one problem: There is a delay in the playback.
When displaying the MediaStream directly there is no delay, so I ruled out the first bullet point as the problem.
Nevertheless, it seems like either the MediaRecorder or the MediaSource is adding a delay of about 3 seconds to the stream.
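(For reference, "displaying the MediaStream directly" just means attaching the captured stream to a video element, which plays with no noticeable delay; videoElement below stands for whatever video tag is used:)

// Delay-free baseline: play the captured stream directly,
// with no MediaRecorder/MediaSource in between.
videoElement.srcObject = this.screenRecording;

The recording path, in contrast, looks like this: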
this.screenRecording = await navigator.mediaDevices.getDisplayMedia({ video: { frameRate: 60, resizeMode: 'none' } });

const mediaRecorder = new MediaRecorder(this.screenRecording);

mediaRecorder.ondataavailable = async (event: any) => {
  // this.screenReceiving is the MediaSource feeding the video element.
  if (this.screenReceiving.readyState === 'open') {
    if (this.screenReceivingBuffer == null) {
      this.screenReceivingBuffer = this.screenReceiving.addSourceBuffer('video/webm;codecs=vp8');
    }
    // Note: chunks that arrive while the buffer is still updating are dropped.
    if (!this.screenReceivingBuffer.updating) {
      this.screenReceivingBuffer.appendBuffer(await new Response(event.data).arrayBuffer());
    }
  }
};

mediaRecorder.start(16); // request a data chunk roughly every 16 ms
The above code is copy-pasted from the actual project, so please don't expect it to work as-is ;)
Does anyone have an idea why this delay exists?
Any ideas on how to tweak the browser to not add this delay?
I have an application which embeds a live stream. To account for delays, I'd like to know the current timestamp of the stream and compare it with the time on the server.
What I have tested so far is checking the difference between the buffered time of the video and its current time:
player.bufferedEnd() - player.currentTime()
However I'd like to compare the time with the server instead, and to do so I need to get the timestamp of the last requested .ts file.
So, my question is using video.js, is there some sort of hook to get the timestamp of the last requested .ts file?
Video.js version: 7.4.1
I managed to solve this issue; however, please bear with me, as I don't remember where I found the documentation for this bit of code.
In my case I was working in an Angular application with a video component responsible for loading a live stream using video.js. Anyway, let's see some code...
Video initialisation
private videoInit() {
this.player = videojs('video', {
aspectRatio: this.videoStream.aspectRatio,
controls: true,
autoplay: false,
muted: true,
html5: {
hls: {
overrideNative: true
}
}
});
this.player.src({
src: '://some-stream-url.com',
type: 'application/x-mpegURL'
});
// on video play callback
this.player.on('play', () => {
this.saveHlsObject();
});
}
Save HLS Object
private saveHlsObject() {
  if (this.player !== undefined) {
    this.playerHls = (this.player.tech() as any).hls;
    // Get the server time (e.g. via a request to the backend),
    // then calculate the difference from the stream's segment time:
    this.diff = serverTime.getTime() - this.getVideoTime().getTime();
  }
}
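For reference, the server-time request can be as simple as the following sketch (the /api/time endpoint and its response shape are made up; use whatever your backend exposes):

// Hypothetical helper: ask the backend for its current time in epoch ms.
async function getServerTime() {
  const response = await fetch('/api/time'); // e.g. responds { "now": 1590763989033 }
  const body = await response.json();
  return new Date(body.now);
}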
Get Timestamp of Video Segment
// Access the player's playlists, get the last segment and extract its time.
// In my case segment URIs looked like: 1590763989033.ts
private getVideoTime(): Date {
  const targetMedia = this.playerHls.playlists.media();
  const lastSegment = targetMedia.segments[targetMedia.segments.length - 1];
  const uri: string = lastSegment.uri;
  // Strip the '.ts' extension, leaving the millisecond timestamp.
  const segmentTimestamp: number = +uri.substring(0, uri.length - 3);
  return new Date(segmentTimestamp);
}
So the main point above is the getVideoTime function. The time of a segment can be found in its URI, so the function extracts the timestamp from the URI and converts it to a Date object. Now, to be honest, I don't know whether this URI format is a standard for HLS or something set up for the particular stream I was connecting to. Hope this helps, and sorry I don't have any more specific information!
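For example, with a segment URI like the one mentioned above:

// '1590763989033.ts' -> 1590763989033 (milliseconds since the epoch)
const uri = '1590763989033.ts';
const ts = +uri.substring(0, uri.length - 3);
console.log(new Date(ts).toISOString()); // 2020-05-29T14:53:09.033Z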
Initially, I had two different webpages:
One was for the video call, and
The other was for screen sharing.
Now, I want to do both of them in one page.
Here is the scenario:
During a live call, a user wants to stop sharing his/her video and start sharing his/her screen.
Afterwards, he/she wishes to turn off screen sharing and resume video sharing.
For clarity, here are some questions I want to ask:
On Caller Side:
1) How can I change my local stream from video to screen and vice versa?
2) Once it is done, how can I assign it to the local video element?
On Callee Side:
1) How do I handle it if the stream I am receiving changes from video to screen?
2) How do I handle it if the stream I am receiving stops? I mean, when I receive neither video nor screen (just audio).
Kindly help me in this regard. If there is any open-source code available, kindly share the links too.
Just for your reference, I was trying to handle it using the following code. (I know this is naive and won't work.)
function handleUserMedia(newStream){
var localvideo = document.getElementById("localvideo");
localvideo.src = URL.createObjectURL(newStream);
localStream = newStream;
sendMessage('got user media');
if (isInitiator) {
maybeStart();
}
}
function handleUserMediaError(error){
console.log(error);
}
var video_constraints = {video: true, audio: true};
var screen_constraints = {video: { mandatory: { chromeMediaSource: 'screen' } }};
getUserMedia(video_constraints, handleUserMedia, handleUserMediaError);
//getUserMedia(screen_constraints, handleUserMedia, handleUserMediaError);
$scope.btnLabel = 'Share Screen';
$scope.toggleSelected = function () {
$scope.selected = !$scope.selected;
if($scope.selected)
{
getUserMedia(screen_constraints, handleUserMedia, handleUserMediaError);
$scope.btnLabel = 'Share Video';
}
else
{
getUserMedia(video_constraints, handleUserMedia, handleUserMediaError);
$scope.btnLabel = 'Share Screen';
}
};
Check this demo:
https://www.webrtc-experiment.com/demos/switch-streams.html
and the relevant tutorial:
https://www.webrtc-experiment.com/docs/how-to-switch-streams.html
Simply renegotiate the peer connections on both users' sides!
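The gist of renegotiation, as a rough modern sketch (this is not code from the linked demo; sendOffer stands for your app's signaling helper):

// Swap the outgoing stream, then renegotiate with a fresh offer.
async function switchStream(pc, newStream, sendOffer) {
  pc.getSenders().forEach(sender => pc.removeTrack(sender));             // drop old tracks
  newStream.getTracks().forEach(track => pc.addTrack(track, newStream)); // add new ones
  const offer = await pc.createOffer();                                  // renegotiate
  await pc.setLocalDescription(offer);
  sendOffer(offer); // deliver to the other peer over your signaling channel
}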
Is there a way to edit the local video stream 'localStream' before sending it to another peer via peerConnection() ?
navigator.getUserMedia({video: true, audio: true}, function(localMediaStream) {
  var video = document.querySelector('video');
  // How do I, say, edit a few pixels in the localMediaStream before
  // using peerConnection() to send it to another peer?
}, onFailSoHard);
Here are a few ideas for today, and a wish for tomorrow!
You can call getUserMedia and render the stream in a video element. Use the MediaSource APIs to get the buffers and manipulate them; do whatever you want! Then capture a stream from that video element.
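One way to realize that render-and-manipulate loop today is via a canvas; a sketch (canvas.captureStream() is the key API here, and the canvas size is an assumption):

// Draw the camera video onto a canvas, tweak pixels, then re-capture.
var video = document.createElement('video');
video.srcObject = localMediaStream; // the stream from getUserMedia
video.play();

var canvas = document.createElement('canvas');
canvas.width = 640;   // assumed size
canvas.height = 480;
var ctx = canvas.getContext('2d');

(function paint() {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  // ...edit pixels here with ctx.getImageData()/ctx.putImageData()...
  requestAnimationFrame(paint);
})();

// Send the edited frames instead of the raw camera stream.
var editedStream = canvas.captureStream(30); // 30 fps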
It would be nice if the MediaSource API itself generated streams for us, the way the Web Audio API does.
Well, you can attach streams like this (after applying some effects to the audio/video tracks):
peer.addStream(new webkitMediaStream( // legacy Chrome constructor
  yourStream.audioTracks || yourStream.getAudioTracks(),
  yourStream.videoTracks || yourStream.getVideoTracks()
));
// Modern equivalent: new MediaStream(yourStream.getTracks())