Using video.js, is it possible to get the current HLS timestamp?

I have an application which embeds a live stream. To cater for delays, I'd like to know the current timestamp of the stream and compare it with the time on the server.
What I have tested so far is checking the difference between the buffered time of the video and its current time:
player.bufferedEnd() - player.currentTime()
However, I'd like to compare the time with the server instead, and to do so I need to get the timestamp of the last requested .ts file.
So my question is: using video.js, is there some sort of hook to get the timestamp of the last requested .ts file?
Video.js version: 7.4.1

I managed to solve this issue; however, please bear with me, as I don't remember where I found the documentation for this bit of code.
In my case I was working in an Angular application, and I had a video component responsible for loading a live stream with the use of video.js. Anyway, let's see some code...
Video initialisation
private videoInit() {
  this.player = videojs('video', {
    aspectRatio: this.videoStream.aspectRatio,
    controls: true,
    autoplay: false,
    muted: true,
    html5: {
      hls: {
        overrideNative: true
      }
    }
  });
  this.player.src({
    src: '://some-stream-url.com',
    type: 'application/x-mpegURL'
  });
  // on video play callback
  this.player.on('play', () => {
    this.saveHlsObject();
  });
}
Save HLS Object
private saveHlsObject() {
  if (this.player !== undefined) {
    this.playerHls = (this.player.tech() as any).hls;
    // get and sync server time:
    // make some request to get the server time,
    // then calculate the difference
    this.diff = serverTime.getTime() - this.getVideoTime().getTime();
  }
}
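The comments above gloss over the server-time request. A minimal sketch of that step, assuming a hypothetical /api/time endpoint that returns the server's epoch time in milliseconds (adjust to whatever your backend actually exposes), could look like this:
// Sketch only: '/api/time' is a hypothetical endpoint returning epoch ms.
private async getServerTime(): Promise<Date> {
  const response = await fetch('/api/time');
  const epochMs: number = await response.json();
  return new Date(epochMs);
}
With that in place, diff can be computed exactly as in saveHlsObject above.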
Get Timestamp of Video Segment
// access the player's playlists, get the last segment and extract its time
// in my case segment URIs were, for example: 1590763989033.ts
private getVideoTime(): Date {
  const targetMedia = this.playerHls.playlists.media();
  const segments = targetMedia.segments;
  // take the last segment in the playlist, i.e. the most recently requested one
  const lastSegment = segments[segments.length - 1];
  const uri: string = lastSegment.uri;
  // strip the ".ts" extension and parse the remaining epoch milliseconds
  const segmentTimestamp: number = +uri.substring(0, uri.length - 3);
  return new Date(segmentTimestamp);
}
The main point above is the getVideoTime function. The time of a segment can be found in its URI, so that function extracts the timestamp from the segment URI and converts it to a Date object. Now, to be honest, I don't know whether this URI format is a standard for HLS or something set for the particular stream I was connecting to. Hope this helps, and sorry I don't have any more specific information!
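Since that URI format may well be specific to one stream, a slightly more defensive variant (my own sketch, not part of the original answer) could verify that the segment name is purely numeric before trusting it:
// Sketch: only treat the URI as an epoch timestamp when it matches "<digits>.ts";
// otherwise return null and let the caller decide what to do.
private tryGetVideoTime(): Date | null {
  const segments = this.playerHls.playlists.media().segments;
  const uri: string = segments[segments.length - 1].uri;
  const match = uri.match(/^(\d+)\.ts$/);
  return match ? new Date(+match[1]) : null;
}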

Related

Total duration not available when playing an MP3 URL with headers in 'react-native-track-player' in React Native

I am using "react-native-track-player" for playing an MP3 URL in React Native. But when I pass an authentication header, I am not able to get the total duration of the MP3 URL. On my screen it is necessary to show the total duration before the player loads. I am also not able to do forward and backward actions using "seekTo". The track player code is below,
var list = [currentItem].map(item => Object.assign(item, {
  artist: 'tootak',
  headers: { Authorization: Global.authenticateUser },
  url: item.is_local
    ? ('file://' + (item.url ? item.url : (item.URL ? item.URL : '')))
    : Global.getMediaLink(item.url ? item.url : (item.URL ? item.URL : '')),
  id: item.code,
  artwork: Global.getUrl(item.images),
}))
await TrackPlayer.reset()
await TrackPlayer.add(list)
and for seekTo,
var time = await TrackPlayer.getPosition()
await TrackPlayer.seekTo(time + 15)
The docs mention that the library is meant for streaming audio directly, and that you should not depend on it for something like the duration:
react-native-track-player is a streaming library, which means it slowly buffers the track and doesn’t know exactly when it ends. The duration returned by this function is determined through various tricks and may not be exact or may not be available at all.
You should not trust this function. You should retrieve the duration from a database and feed it to the duration parameter in the Track Object.
We would need to use something like FFprobe or FFmpeg as a stream analyzer to retrieve the values. There is a package called get-audio-duration which does the job for you.
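As a rough sketch, assuming a Node backend and the get-audio-duration package (whose getAudioDurationInSeconds helper wraps ffprobe), you could resolve the duration ahead of time and feed it into the Track object:
import { getAudioDurationInSeconds } from 'get-audio-duration';

// Sketch: resolve the duration server-side so the app can pass it in the
// Track object instead of relying on the player to detect it.
async function buildTrack(item: any) {
  // accepts a file path; check the package docs for stream/URL support
  const duration = await getAudioDurationInSeconds(item.url);
  return { id: item.code, url: item.url, artist: 'tootak', duration };
}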

MediaRecorder has a delay of multiple seconds

I'm trying to use a MediaRecorder to record a MediaStream and display it in a video element using a MediaSource. So the setup looks like:
Request a MediaStream from the browser
Add it to the MediaRecorder
Add the recorded blobs to the MediaSource Buffer
The result looks very good but there is one problem: There is a delay in the playback.
When displaying the MediaStream directly there is no delay, so I ruled out the first bullet point as the problem.
Nevertheless, it seems like either the MediaRecorder or the MediaSource adds a delay of about 3 seconds to the stream.
this.screenRecording = await navigator.mediaDevices.getDisplayMedia({
  video: { frameRate: 60, resizeMode: 'none' }
});
const mediaRecorder = new MediaRecorder(this.screenRecording);
mediaRecorder.ondataavailable = async (event: any) => {
  if (this.screenReceiving.readyState === 'open') {
    if (this.screenReceivingBuffer == null) {
      this.screenReceivingBuffer = this.screenReceiving.addSourceBuffer('video/webm;codecs=vp8');
    }
    if (!this.screenReceivingBuffer.updating) {
      this.screenReceivingBuffer.appendBuffer(await new Response(event.data).arrayBuffer());
    }
  }
};
// request a dataavailable event roughly every 16 ms
mediaRecorder.start(16);
The above code is copied and pasted from the actual project, so please don't expect it to work out of the box ;)
Does anyone have an idea why this delay exists?
Any ideas on how to tweak the browser to not add this delay?

How to get the UTC date for when an image was uploaded using Images.get()?

I am writing a plug-in to export/sync data from DroneDeploy.
I looked at the JSON that is returned from Images.get(). Sometimes a "date_creation" field is available; sometimes it is not present. Furthermore, within this field, a "$date" field shows what looks like a string-based date, which I have to assume is a local date? Is that date the local time of the user who uploaded the image, or has it already been converted to the time zone of the user currently logged in?
Sample snipped JSON of an image coming from the Images.get() API:
{
  "name": "camera",
  "drone_session_id": "1495472258_SUPPORTOPENPIPELINE",
  "command": "log",
  "drone_device_id": "1495472258_SUPPORTOPENPIPELINE",
  "date_creation": {
    "$date": "2017:02:09 11:37:46"
  }
}
How can I get a reliable UTC date for when this image was uploaded to DroneDeploy, so I can determine whether or not I should export the image if a user comes back and requests another sync?
The image EXIF information you are referring to can be a little unreliable, depending on which camera captured the images.
In order to determine which images you should sync, I would recommend looking at which plans you have and haven't synced.
For example:
const allPlanIds$ = dronedeployApi.Plans.all().then((plans) => plans.map((plan) => plan.id));
const alreadySyncedPlanIds$ = fetch()... // get these planIds from your server
const planIdsToSync$ = Promise.all([allPlanIds$, alreadySyncedPlanIds$])
  .then(([planIds, syncedPlanIds]) => planIds.filter((planId) => !syncedPlanIds.includes(planId)));
planIdsToSync$.then((planIdsToSync) => console.log(planIdsToSync)); // get images for these planIds
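A hypothetical follow-up step, assuming Images.get accepts a plan id and resolves to that plan's images (check the DroneDeploy API docs for the exact signature):
// Hypothetical: fetch the images for each un-synced plan.
planIdsToSync$.then((planIds) =>
  Promise.all(planIds.map((planId) => dronedeployApi.Images.get(planId)))
).then((imagesPerPlan) => {
  console.log(imagesPerPlan); // export/sync each plan's images here
});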

WebRTC - Change device/camera in realtime

I'm having a problem trying to change my camera in real time. It works for the local video, but the remote person cannot see the new camera and still sees the old one. I tried to stop the stream and init again, but it is still not working. This is just some of my code.
I have searched everywhere and I can't find a solution. Can someone help me out?
function init() {
  getUserMedia(constraints, connect, fail);
}

$(".webcam-devices").on('change', function() {
  var deviceID = this.value;
  constraints.video = {
    optional: [{
      sourceId: deviceID
    }]
  };
  stream.getTracks().forEach(function (track) { track.stop(); });
  init();
});
You need to actually change the track you're sending in the PeerConnection. In Firefox, you can use RTCRtpSender.replaceTrack(new_track); to change tracks without renegotiation (this is being added to the spec now). Otherwise, you need to add the new stream/track to the RTCPeerConnection, remove the old one, and then handle the onnegotiationneeded event and renegotiate.
See one of jib's fiddles, the replaceTrack() fiddle:
function flip() {
  flipped = 1 - flipped;
  return pc1.getSenders()[0].replaceTrack(streams[flipped].getVideoTracks()[0])
    .then(() => log("Flip! (notice change in dimensions & framerate!)"))
    .catch(failed);
}
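Applied to the camera-switching case above, a minimal sketch (assuming pc is your RTCPeerConnection and deviceId comes from the device dropdown) might look like this:
// Sketch: acquire the new camera, then swap it into the existing video
// sender so the remote side sees it without renegotiation.
async function switchCamera(pc: RTCPeerConnection, deviceId: string) {
  const newStream = await navigator.mediaDevices.getUserMedia({
    video: { deviceId: { exact: deviceId } }
  });
  const [newTrack] = newStream.getVideoTracks();
  const sender = pc.getSenders().find((s) => s.track && s.track.kind === 'video');
  if (sender) {
    await sender.replaceTrack(newTrack); // no renegotiation needed
  }
}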

WebRTC: Switch from Video Sharing to Screen sharing during call

Initially, I had two different webpages:
One was to do a video call, and
the other was to do screen sharing.
Now I want to do both of them on one page.
Here is the scenario:
During a live call, a user wants to stop sharing his/her video and start sharing his/her screen.
Afterwards, he/she wishes to turn off screen sharing and resume video sharing.
For clarity, here are some questions I want to ask:
On Caller Side:
1) How can I change my local stream from video to screen and vice versa?
2) Once it is done, how can I assign it to the local video element?
On Callee Side:
1) How do I handle it if the current stream I am receiving is changed from video to screen?
2) How do I handle it if the stream I am receiving has stopped? I mean, now I can receive neither video nor screen (just audio).
Kindly help me in this regard. If there is any open source code available, kindly share the links too.
Just for your reference, I was trying to handle it using the following code. (I know this is naive and won't work.)
function handleUserMedia(newStream) {
  var localvideo = document.getElementById("localvideo");
  localvideo.src = URL.createObjectURL(newStream);
  localStream = newStream;
  sendMessage('got user media');
  if (isInitiator) {
    maybeStart();
  }
}

function handleUserMediaError(error) {
  console.log(error);
}

var video_constraints = {video: true, audio: true};
var screen_constraints = {video: { mandatory: { chromeMediaSource: 'screen' } }};

getUserMedia(video_constraints, handleUserMedia, handleUserMediaError);
//getUserMedia(screen_constraints, handleUserMedia, handleUserMediaError);

$scope.btnLabel = 'Share Screen';
$scope.toggleSelected = function () {
  $scope.selected = !$scope.selected;
  if ($scope.selected) {
    getUserMedia(screen_constraints, handleUserMedia, handleUserMediaError);
    $scope.btnLabel = 'Share Video';
  } else {
    getUserMedia(video_constraints, handleUserMedia, handleUserMediaError);
    $scope.btnLabel = 'Share Screen';
  }
};
Check this demo:
https://www.webrtc-experiment.com/demos/switch-streams.html
and the relevant tutorial:
https://www.webrtc-experiment.com/docs/how-to-switch-streams.html
Simply renegotiate the peer connection on both users' sides!
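Alternatively, in modern browsers you can avoid renegotiation by swapping tracks with RTCRtpSender.replaceTrack, as in the previous answer. A minimal sketch, assuming pc is your RTCPeerConnection and a local <video> element with id "localvideo" as in the question:
// Sketch: toggle the outgoing video between camera and screen by
// replacing the track on the existing video sender.
async function toggleScreenShare(pc: RTCPeerConnection, shareScreen: boolean) {
  const stream = shareScreen
    ? await navigator.mediaDevices.getDisplayMedia({ video: true })
    : await navigator.mediaDevices.getUserMedia({ video: true });
  const [track] = stream.getVideoTracks();
  const sender = pc.getSenders().find((s) => s.track && s.track.kind === 'video');
  if (sender) {
    await sender.replaceTrack(track);
  }
  // show the new stream locally as well
  (document.getElementById('localvideo') as HTMLVideoElement).srcObject = stream;
}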