How come canPlayType('video/mov') returns an empty string, but I can still play the video? - html5-video

I'm on Chrome 80, and I'm able to get an HTML5 video tag to play a .mov file without any trouble. However, canPlayType('video/mov'), canPlayType('video/quicktime'), and canPlayType('video/x-quicktime') all return "". canPlayType('video/mp4'), canPlayType('video/ogg'), and canPlayType('video/webm') all work as expected, however (they at least return "maybe").

Why doesn't this work as expected for any of the .mov MIME types? And since I can clearly play a .mov, is there another way I can programmatically and reliably tell that this is true? I understand this checking is all a little iffy (hence "maybe" and "probably"), but is it really this iffy? I'd love to at least see some documentation that actually says "sometimes canPlayType() will literally return "" even though you can actually play videos of that type. Sorry, that's as good as it gets."
Here's my checking code:
const video = document.createElement('video');
const canPlay = {
  mp4: video.canPlayType('video/mp4'),
  ogg: video.canPlayType('video/ogg'),
  webm: video.canPlayType('video/webm'),
  mov: video.canPlayType('video/mov'),
  avi: video.canPlayType('video/avi'),
};
console.log(canPlay);
// outputs: { mp4: "maybe", ogg: "maybe", webm: "maybe", mov: "", avi: "" }
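
As a sketch of one possible workaround (not from the original question): instead of trusting canPlayType's MIME-based guess, probe an actual file by attaching a sample URL to an off-screen video element and waiting for 'canplay' or 'error'. The sample URL and the timeout below are assumptions, and this costs a network fetch.

// Sketch: probe real playability by loading a representative sample file.
function probePlayability(url, timeoutMs = 5000) {
  return new Promise((resolve) => {
    const video = document.createElement('video');
    let settled = false;
    const done = (ok) => {
      if (!settled) { settled = true; resolve(ok); }
    };
    setTimeout(() => done(false), timeoutMs); // give up after the timeout
    video.addEventListener('canplay', () => done(true), { once: true });
    video.addEventListener('error', () => done(false), { once: true });
    video.preload = 'auto';
    video.src = url; // e.g. '/samples/tiny.mov' (hypothetical path)
  });
}

// usage:
// probePlayability('/samples/tiny.mov').then((ok) => console.log('mov playable:', ok));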

Related

WebRTC - RTCPeerConnection.onconnectionstatechange fires with RTCPeerConnection.connectionState = 'disconnected' without any reason

I am working on a WebRTC application for video chatting. On my local network everything works well, but when I test it over the internet, RTCPeerConnection.onconnectionstatechange fires with RTCPeerConnection.connectionState = 'disconnected' for no apparent reason after some 20-30 seconds of communication. Another very confusing thing: say I have peer2 and peer3 started in the same browser in different tabs, both connected to peer1, and peer1 video-streaming to them. After 20-30 seconds RTCPeerConnection.connectionState = 'disconnected' can fire on peer2 while, at the same time, peer3 continues to receive the video stream from peer1. I have googled a bit and found this solution (which doesn't work in my case):
this.myRTCMediaMediatorConnections[id][hash].onconnectionstatechange = async function (e) {
  log('onSignalingServerMediaMediatorOfferFunc.myRTCMediaMediatorConnections[' + id + '][' + hash + '].onconnectionstatechange(' + This.myRTCMediaMediatorConnections[id][hash].connectionState + ')', 10, true)
  switch (This.myRTCMediaMediatorConnections[id][hash].connectionState) {
    case "failed":
    case "closed":
      This.disconnectMeFromMediatorConnection(targetId, logicGroupName, streamerId, streamerHash, id, hash)
      break
    case "disconnected":
      // only tear down if traffic has really stopped
      if (await This.confirmPeerDisconnection(This.myRTCMediaMediatorConnections[id][hash])) {
        This.disconnectMeFromConnection(targetId, logicGroupName, streamerId, streamerHash, id, hash)
      }
      break
  }
  log('onSignalingServerMediaMediatorOfferFunc.myRTCMediaMediatorConnections[' + id + '][' + hash + '].onconnectionstatechange', 10, false)
}
this.confirmPeerDisconnection = async function (connectionObject) {
  log('confirmPeerDisconnection', 10, true)
  // sample bytesReceived twice, two seconds apart
  var b1 = await this.confirmPeerDisconnectionFunc(connectionObject);
  await new Promise(resolve => setTimeout(resolve, 2000));
  var b2 = await this.confirmPeerDisconnectionFunc(connectionObject);
  log('confirmPeerDisconnection=>' + (b2 - b1), 10, false)
  // if bytes are still arriving, the peer is not really gone
  if (b2 - b1 > 0) return false
  return true;
}
this.confirmPeerDisconnectionFunc = async function (connectionObject) {
  var b = 0
  await connectionObject.getStats(null).then(function (stats) {
    stats.forEach((report) => {
      if (report.type == 'transport') {
        Object.keys(report).forEach((statName) => {
          if (statName === 'bytesReceived') b = parseInt(report[statName])
        })
      }
    })
  })
  return b
}
b2 - b1 always equals 0 or less than 0. Can anyone give me advice on why RTCPeerConnection.onconnectionstatechange fires like this, and how I can get rid of this bug?
Any help appreciated!
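
As an aside (not from the original thread): the spec treats 'disconnected' as a transient state that ICE may recover from on its own, so a common pattern is a grace period followed by an ICE restart rather than an immediate teardown. A minimal sketch, assuming a browser that supports RTCPeerConnection.restartIce() and an arbitrary 3-second grace period:

// Sketch: treat 'disconnected' as possibly transient. Wait a grace period;
// if the state has not recovered, ask ICE to renegotiate instead of closing.
function handleConnectionState(pc) { // pc is an RTCPeerConnection
  pc.onconnectionstatechange = () => {
    if (pc.connectionState === 'disconnected') {
      setTimeout(() => {
        if (pc.connectionState === 'disconnected') {
          pc.restartIce(); // fires 'negotiationneeded' so signaling can redo the offer
        }
      }, 3000);
    } else if (pc.connectionState === 'failed') {
      pc.close(); // unrecoverable
    }
  };
}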

How to use 3rd party camera app with Ionic React Capacitor

Well the point is the following:
import {
  Camera,
  CameraResultType,
  CameraSource,
  Photo,
} from "@capacitor/camera";

export function usePhotoGallery() {
  // import { usePhotoGallery } from "../hooks/usePhotoGallery"; to use it externally
  const scatta = async () => {
    const cameraPhoto = await Camera.getPhoto({
      resultType: CameraResultType.Uri,
      source: CameraSource.Camera,
      quality: 100,
    });
  };
  return {
    scatta,
  };
}
This pretty basic code just calls the default camera app and makes it take a shot.
It works fine, no problem. The issue, in my case, comes from the fact that I have a custom ROM with no pre-installed camera, so there is no default camera app. I know almost no one else will have my issue, but I'd like to cover the 0.1% of users that will have this problem.
So, finally, how can I use a 3rd-party camera app? Or just create one, if that's better.
For example, I can take photos from WhatsApp, Instagram, Snapchat etc., I guess because they have their own camera. So the point basically is:
How can I let the user select the app they prefer for taking the shot (like the pop-up that asks "open with ...")?
If the previous option isn't possible, then how can I build my own camera?
Sorry for my probably terrible English.
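
A sketch of one possible fallback (an assumption, not something from the Capacitor docs): catch the plugin failure and fall back to a plain file input, which on Android typically shows the system chooser listing any installed app that can capture an image.

// Sketch: try the Capacitor Camera plugin first; if it throws (e.g. no
// default camera app on the ROM), fall back to an <input type="file">.
// The capture attribute hints that a camera-capable app should be offered.
async function takePhotoWithFallback() {
  try {
    const cameraPhoto = await Camera.getPhoto({
      resultType: CameraResultType.Uri,
      source: CameraSource.Camera,
      quality: 100,
    });
    return cameraPhoto.webPath;
  } catch (err) {
    // fall back to the WebView file picker / app chooser
    return new Promise((resolve) => {
      const input = document.createElement('input');
      input.type = 'file';
      input.accept = 'image/*';
      input.capture = 'environment';
      input.onchange = () => {
        const file = input.files && input.files[0];
        resolve(file ? URL.createObjectURL(file) : null);
      };
      input.click();
    });
  }
}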

sharing image, link and title using react-native-share not working

I am using react-native-share to share content across different social platforms. For plain text it works fine, but when I include image + message (containing a link) + title, it does not work.
Here is the code which I am using to do so:
// imports assumed by this snippet:
// import Share from 'react-native-share';
// import RNFetchBlob from 'rn-fetch-blob';

try {
  let imagePath = null;
  RNFetchBlob.config({
    fileCache: true,
  })
    .fetch('GET', FILE_URL)
    .then((resp) => {
      imagePath = resp.path();
      return resp.readFile('base64');
    })
    .then(async (base64Str) => {
      const content = `Sharing Link ${link}`;
      const base64Data = `data:image/jpeg;base64,${base64Str}`;
      const shareOption = {
        message: content,
        url: base64Data,
        title: 'This is title',
        type: 'image/jpeg',
      };
      await Share.open(shareOption);
      return RNFetchBlob.fs.unlink(imagePath);
    });
} catch (error) {
  console.error(error);
}
I tried different variations of it, based on what I found on SO and other links, but none of them worked.
What I've tried that didn't work:
Sharing the URL of the image instead of base64 - only the image is shared and the message gets removed; also, in WhatsApp the image link is not converted to an image.
Removing the message key and only passing url - this way the image does get shared, but it has no relevance, because I am not able to attach any kind of message or link.
I am not sure what I am missing here.
This is an interesting project. I have to admit up front that this is a bit of a non-answer, and I apologize for that, but it seemed more useful to share my research than to keep it in my head, and it was too big for a comment.
I took a look at the project's GitHub issues to see if your problem was documented. There are 93 open issues, and a lot of people complain about combinations of these variables (message, base64) not working the way you would think, especially on iOS. Have you looked at their issue log to see if you can find better detail? Are you using iOS?
Here are a few open issues that complain about almost exactly what you posted. Some offer workarounds; some lead to dead ends. If you haven't read these, it would behoove you to get up to speed. If you already know all this, maybe I'll just leave this post up for posterity in case others come across it.
https://github.com/react-native-share/react-native-share/issues/760
https://github.com/react-native-share/react-native-share/issues/966
https://github.com/react-native-share/react-native-share/issues/831
https://github.com/react-native-share/react-native-share/issues/1025
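
For completeness, a sketch of one direction worth trying (an untested assumption, not a confirmed fix): share the cached local file via a file:// URL instead of an inline base64 data URI, since some target apps handle local file URLs better than large data URIs. This reuses FILE_URL and link from the question's code.

// Sketch: download to cache, then hand Share.open a file:// URL.
// appendExt makes RNFetchBlob keep a .jpg extension on the cached file.
const resp = await RNFetchBlob.config({ fileCache: true, appendExt: 'jpg' })
  .fetch('GET', FILE_URL);
const localPath = resp.path();
try {
  await Share.open({
    title: 'This is title',
    message: `Sharing Link ${link}`,
    url: `file://${localPath}`,
    type: 'image/jpeg',
  });
} finally {
  // clean up the cached file whether or not sharing succeeded
  await RNFetchBlob.fs.unlink(localPath);
}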

Using video.js is it possible to get current HLS timestamp?

I have an application which embeds a live stream. To cater for delays, I'd like to know the current timestamp of the stream and compare it with the time on the server.
What I have tested up till now is checking the difference between the buffered time of the video and its current time:
player.bufferedEnd() - player.currentTime()
However, I'd like to compare the time with the server instead, and to do so I need the timestamp of the last requested .ts file.
So, my question is using video.js, is there some sort of hook to get the timestamp of the last requested .ts file?
Video.js version: 7.4.1
I managed to solve this issue; however, please bear with me, as I don't remember where I found the documentation for this bit of code.
In my case I was working in an Angular application, with a video component responsible for loading a live stream via video.js. Anyway, let's see some code...
Video initialisation
private videoInit() {
  this.player = videojs('video', {
    aspectRatio: this.videoStream.aspectRatio,
    controls: true,
    autoplay: false,
    muted: true,
    html5: {
      hls: {
        overrideNative: true
      }
    }
  });
  this.player.src({
    src: '://some-stream-url.com',
    type: 'application/x-mpegURL'
  });
  // on video play callback
  this.player.on('play', () => {
    this.saveHlsObject();
  });
}
Save HLS Object
private saveHlsObject() {
  if (this.player !== undefined) {
    this.playerHls = (this.player.tech() as any).hls;
    // get and sync server time...
    // make some request to get server time...
    // then calculate the difference...
    this.diff = serverTime.getTime() - this.getVideoTime().getTime();
  }
}
Get Timestamp of Video Segment
// access the player's playlists, get the last segment and extract time
// in my case URI of segments were for example: 1590763989033.ts
private getVideoTime(): Date {
  const targetMedia = this.playerHls.playlists.media();
  // take the most recent segment in the current playlist window
  const lastSegment = targetMedia.segments[targetMedia.segments.length - 1];
  const uri: string = lastSegment.uri;
  // strip the '.ts' extension; the remainder is a millisecond timestamp
  const segmentTimestamp: number = +uri.substring(0, uri.length - 3);
  return new Date(segmentTimestamp);
}
The main point above is the getVideoTime function. The time of a segment can be found in the segment URI, so that function extracts the time from the URI and converts it to a Date object. Now, to be honest, I don't know whether this URI format is a standard for HLS or something set up specifically for the stream I was connecting to. Hope this helps, and sorry I don't have any more specific information!
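
A side note on that last caveat: timestamped segment filenames are not part of the HLS spec. If the stream carries EXT-X-PROGRAM-DATE-TIME tags instead, video.js (via videojs/http-streaming) parses them onto each segment, and something like the sketch below could read them. The dateTimeObject property is the one VHS uses, but treat this as an assumption to verify against your stream.

// Sketch: read the wall-clock time of the newest segment from
// EXT-X-PROGRAM-DATE-TIME, if the playlist provides it.
function getSegmentProgramDate(playerHls) {
  const media = playerHls.playlists.media();
  const segments = media.segments || [];
  const last = segments[segments.length - 1];
  // dateTimeObject is only set when the playlist carries
  // EXT-X-PROGRAM-DATE-TIME tags; otherwise this returns null.
  return (last && last.dateTimeObject) || null;
}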

How to create a functioning small thumbnail with small play button with Spotify Apps API?

Somewhat of a JavaScript novice here.
I'm trying to create this: http://i.imgur.com/LXFzy.png from the Spotify UI Guidelines.
Basically a 64x64 album cover with an appropriate sized play button.
This is what I have so far:
function DataSource(playlist) {
  this.count = function () {
    return playlist.length;
  }

  // make node with cover, track name, artist name
  this.makeNode = function (track_num) {
    var t = playlist.data.getTrack(track_num);
    // console.log(t);
    var li = new dom.Element('li');

    // generate cover image with play/pause button
    var track = m.Track.fromURI(t.uri, function (a) {
      var trackPlayer = new v.Player();
      trackPlayer.track;
      trackPlayer.context = a;
      dom.inject(trackPlayer.node, li, 'top')
    });

    // track name
    var trackName = new dom.Element('p', {
      className: 'track',
      text: t.name
    });

    // artist name
    var artistName = new dom.Element('p', {
      className: 'artist',
      text: t.artists[0].name
    });

    dom.adopt(li, trackName, artistName);
    return li;
  }
}
This DataSource function feeds into a pager function later in the code. It generates the image, artist name, and track name just fine, except I can't seem to get the image to be 64x64 without overriding it with my own CSS. I'm sure there is a way to set this in JavaScript, since the core Spotify CSS files include a class for it, but I'm at a loss as to how to do it.
Also, the play button renders but, when I click it, logs a console error saying the track has no method 'get'. How am I supposed to know it needs a get? Is there some way I can see this player function, so I know what I'm doing wrong with it?
Any help would be greatly appreciated; I'm sure it'll help droves of people too, as there is no documentation anywhere I can find on how to do this.
Check the code here: https://github.com/ptrwtts/kitchensink/blob/master/js/player.js
The kitchensink app demonstrates a lot of the Spotify Apps API functionality.
As for the playback button, I know it doesn't seem to work for single tracks used as the context; it really only works if you use an Artist, Album, or Playlist context. Not sure why that is.
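
Building on that observation, a sketch of what the question's makeNode callback might look like with an album context instead of a bare track. This reuses the question's m/v/dom identifiers and t/li variables; Album.fromURI and the track's album property are assumptions about the old Spotify Apps API, so verify them against the kitchensink source.

// Sketch: give the player the track's album as context, since a bare
// track context reportedly doesn't respond to the play button.
var track = m.Track.fromURI(t.uri, function (loadedTrack) {
  m.Album.fromURI(loadedTrack.album.uri, function (album) {
    var trackPlayer = new v.Player();
    trackPlayer.track = loadedTrack; // assumption: assign, not just reference
    trackPlayer.context = album;     // album context instead of the track
    dom.inject(trackPlayer.node, li, 'top');
  });
});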