Change video codec from VP8 to VP9 with native WebRTC

I'm trying to figure out how to change the video codec in WebRTC from VP8 to VP9, but can't find a suitable answer anywhere. Can someone show me how it's done? Thanks

I think you would need to munge SDP to make it happen. To my understanding the idea is that the endpoints negotiate the best possible codecs to be used.
The VP9 release notes have some hints on how to change the preferred codec from VP8 to VP9: https://developers.google.com/web/updates/2016/01/vp9-webrtc?hl=en.

As browsers start to support setCodecPreferences, you can check for the MIME type of the codec you want to use by default and set the codec preference accordingly. For example, if you want to prefer VP8 for video, you can check for the "video/VP8" MIME type and set your codec preferences to the VP8 codecs:
// note: the following should be called before calling either
// RTCPeerConnection.createOffer() or createAnswer()
let tcvr = pc.getTransceivers()[0];
let codecs = RTCRtpReceiver.getCapabilities('video').codecs;
// iterate over the supported codecs and pull out the ones we want
let vp8_codecs = codecs.filter(codec => codec.mimeType === "video/VP8");
// currently not all browsers support setCodecPreferences
if (tcvr.setCodecPreferences !== undefined) {
    tcvr.setCodecPreferences(vp8_codecs);
}
Code adapted from this Pericror blog post to force audio/video codecs.
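The same approach answers the original VP9 question: filter the capability list for "video/VP9" instead. A minimal sketch of a reusable helper (the function name preferCodec is my own, not part of the WebRTC API); it keeps the non-preferred codecs at the end of the list so negotiation can still fall back to them:

```javascript
// Hypothetical helper: reorder a codec capability list so that codecs with
// the given MIME type (e.g. "video/VP9") come first. Non-matching codecs
// are kept after the preferred ones rather than dropped.
function preferCodec(codecs, mimeType) {
    var preferred = codecs.filter(function (c) { return c.mimeType === mimeType; });
    var others = codecs.filter(function (c) { return c.mimeType !== mimeType; });
    return preferred.concat(others);
}

// Usage in a browser, before createOffer()/createAnswer(), guarded because
// not all browsers support setCodecPreferences:
// const tcvr = pc.getTransceivers()[0];
// const caps = RTCRtpReceiver.getCapabilities('video').codecs;
// if (tcvr.setCodecPreferences) {
//     tcvr.setCodecPreferences(preferCodec(caps, 'video/VP9'));
// }
```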

Related

React-Native-Video -> how to save a video

I'm currently using react-native-video and playing HLS video streams through the package.
Does anyone know how I can download the video to the phone's gallery?
Looking at the package, there aren't any methods for that; I'm wondering if there is another package to use.
Thanks!
In my app, I download videos with RNFS (react-native-fs), then play them using react-native-video. Two different libraries, each fulfilling its own purpose.
import RNFS from 'react-native-fs'
// Platform comes from 'react-native'; timestamp is any unique value used to name the file
const LOCAL_PATH_TO_VIDEO = Platform.OS === 'ios'
  ? `${RNFS.DocumentDirectoryPath}/mood-pixel-${timestamp}.mp4`
  : `${RNFS.ExternalDirectoryPath}/mood-pixel-${timestamp}.mp4`
RNFS.downloadFile({
  fromUrl: REMOTE_URI_OF_VIDEO,
  toFile: LOCAL_PATH_TO_VIDEO,
}).then(() => {
  console.log('Successful video download! Save LOCAL_PATH_TO_VIDEO on the device for later use.')
})
After successful download, save the LOCAL_PATH_TO_VIDEO onto the device and use it to play the downloaded video.
In the end I didn't have a good solution for this. I used ffmpeg to convert the HLS stream back to MP4 (on the server) and downloaded it (on the web).
The way HLS is streamed makes the data unsuitable for saving into a single file, so there is a good reason why you might be unable to save a video file from a stream intended for playback.
The other reason is that the RN Video component does not offer this capability.
Tooling that saves a file such as MP4 from streaming media content typically differs from streaming media players precisely in how it downloads the chunks of data: it does so with the constraint in mind that those video chunks are destined for a file.

How can I change the Ant Media Server WebRTC resolution?

I'm sending video from OBS Studio to Ant Media Server at 1280x720, but the WebRTC embed iframe is serving it at 560x315. How can I make the latter match the former?
You can change the WebRTC stream resolution by editing the media constraints in the /usr/local/antmedia/webapps/YOUR_APP/index.html file. For example, for 360x240 you can set the media constraints as:
var mediaConstraints = {
    video: { width: 360, height: 240 },
    audio: true
};
You may also want to change the video bitrate in proportion to the resolution. You can pass a bandwidth parameter to the WebRTCAdaptor, either bandwidth: value (in kbps) or bandwidth: "unlimited". The default is 900 kbps.
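Putting the two together, here is a sketch of adjusted constraints plus a matching bandwidth for the 1280x720 feed from the question. The bandwidth value of 2000 is an illustrative choice scaled up from the 900 kbps default, and the exact WebRTCAdaptor option names should be checked against your Ant Media Server version:

```javascript
// Illustrative sketch: media constraints and bandwidth for a 720p stream.
var mediaConstraints = {
    video: { width: 1280, height: 720 },
    audio: true
};

var webRTCAdaptorConfig = {
    mediaConstraints: mediaConstraints,
    bandwidth: 2000 // kbps; scaled up from the 900 kbps default for 720p
};
```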
I understand that I can resize the iframe, but when I do that it doesn't change the size of the video stream coming from Ant Media Server. How do I change that resolution?

Browser web cam stream has extremely low performance/frame rate

I am trying to test WebRTC and want to display my own stream as well as the peer's stream. I currently have a simple shim to obtain the camera's stream and pipe it into a video element, but the frame rate is extremely low. The strange thing is that I can try examples from the WebRTC site and they work flawlessly: the video is smooth and there are no problems. I go to the console and my code resembles theirs. What could be happening? I tried to create a fiddle and also run the code within Brackets, but it still performs horribly.
video = document.getElementById('usr-cam');
navigator.mediaDevices.getUserMedia({
    video: {
        width: { exact: 320 },
        height: { exact: 240 }
    }
})
.then(function (stream) {
    if (navigator.mozGetUserMedia) {
        video.mozSrcObject = stream;
    } else {
        video.srcObject = stream;
    }
})
.catch(function (e) {
    alert(e);
});
That's pretty much everything I do. Note that I am using the new navigator.mediaDevices API instead of navigator.getUserMedia(), but I don't see how that would matter, since (1) I am using the adapter.js shim provided by the WebRTC group, which they use themselves, and (2) I don't think how you obtain the video stream would affect performance.
Alright, I feel very stupid for this one... I was deceived by the fact that the video element will update the displayed image without you having to do anything but pipe the output stream in, which means the image updates, but only at really long intervals, making it seem as if the video is lagging. What I forgot to do was actually play() the video or add autoplay as an attribute... it works well now.
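The fix above can be wrapped in a small helper so the play() call is never forgotten (the function name attachAndPlay is my own):

```javascript
// Assign a MediaStream to a <video> element and start playback immediately.
// Without the play() call (or an autoplay attribute), the element only
// repaints at long intervals, which looks like a very low frame rate.
function attachAndPlay(videoEl, stream) {
    videoEl.srcObject = stream;
    return videoEl.play(); // returns a Promise in modern browsers
}

// Usage:
// navigator.mediaDevices.getUserMedia({ video: true })
//     .then(stream => attachAndPlay(document.getElementById('usr-cam'), stream))
//     .catch(e => alert(e));
```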

recording a remote webrtc stream with RecordRTC

I am using Opentok JavaScript WebRTC library to host a 1-to-1 video chat (peer-to-peer).
I can see my peer's video and hear the audio flawlessly.
My wish is to record audio / video of other chat party (remote). For this purpose, I'm using RecordRTC.
I was able to record the video of other chat participant (video is outputted to HTML video element), but, so far, I have not succeeded in recording audio (a dead-silence .wav file is as far as I could get). Using Chrome Canary (30.0.1554.0). This is my method:
var clientVideo = $('#peerdiv video')[0];   // peer's video (html element)
var serverVideo = $('#myselfdiv video')[0]; // my video (html element)
var context = new webkitAudioContext();
var clientStream = context.createMediaStreamSource(clientVideo.webRTCStream);
var serverStream = context.createMediaStreamSource(serverVideo.webRTCStream);
webRTCStream is a custom property I assigned to the HTMLVideoElement object by modifying the source of the OpenTok JS library. It contains the MediaStream object linked to the respective <video> element.
var recorder = RecordRTC({
    video: clientVideo,
    stream: clientStream
});
recorder.recordAudio();
recorder.recordVideo();
The video is recorded. An audio file is also created, and its length is close to the video's; however, it's completely silent (and yes, there was a lot of noise being made on the other side during recording).
I've tested this with video element which displays my webcam's video stream (and audio), and it worked: both audio and video were recorded:
...
var recorder = RecordRTC({
    video: serverVideo,
    stream: serverStream
});
...
Is there something special about streams originating from a remote location? Any guidance on this issue would be very helpful.
The same issue occurs in the following situations:
The audio is not stereo (dual-channel), i.e. it is mono audio
The number of audio input channels does not equal the number of audio output channels
The audio input device is not the default device selected in Chrome
I'm still trying to find the actual cause.
I added this experiment for testing purposes; see the console:
https://webrtc-experiment.appspot.com/demos/remote-stream-recording.html
Updated at: Saturday, 1 February 2014, 09:22:04 PKT
Remote audio recording is not supported, and this issue is considered a low-priority WontFix:
Support feeding remote WebRTC MediaStreamTrack output to WebAudio
Connect WebRTC MediaStreamTrack output to Web Audio API
Updated at March 28, 2016
Remote audio+video recording is now supported in RecordRTC, since Chrome version 49+.
Firefox, on the other hand, can merely record remote-audio.
If Chrome/WebRTC/Opus outputs mono audio by default, and that is the problem here, I see two options:
Making Opus output stereo - not sure how.
Making the RecordRTC/Recorder.js code work with mono.
Or does anyone know of any other recording library that works?
This now works fine in Firefox. I am using Firefox 29.0.1, and the Web Audio API can now work with audio stream sources grabbed from remote parties over a peer connection.
To test go to Muaz Khan's experiment page. I am not sure with what version of Firefox this rolled out but I would like to thank the team for cranking it out!
The Chrome bug was moved to the Audio API team; see the crbug link to track progress.
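Since remote-stream recording landed in Chrome 49+, the native MediaRecorder API is also an option instead of RecordRTC. A hedged sketch (the RecorderCtor parameter is only there so the logic can be exercised outside a browser; in a page you would omit it and the real MediaRecorder is used):

```javascript
// Record a (possibly remote) MediaStream for a fixed duration and resolve
// with the recorded chunks, which can then be assembled into a Blob.
function recordStream(stream, durationMs, RecorderCtor) {
    RecorderCtor = RecorderCtor || MediaRecorder; // browser default
    return new Promise(function (resolve, reject) {
        var chunks = [];
        var recorder = new RecorderCtor(stream);
        recorder.ondataavailable = function (e) {
            if (e.data && e.data.size > 0) chunks.push(e.data);
        };
        recorder.onstop = function () { resolve(chunks); };
        recorder.onerror = function (e) { reject(e.error); };
        recorder.start();
        setTimeout(function () { recorder.stop(); }, durationMs);
    });
}

// Usage in a browser:
// recordStream(remoteStream, 5000).then(function (chunks) {
//     var blob = new Blob(chunks, { type: 'video/webm' });
// });
```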

Does video.js have a callback if unsupported?

I'm using video.js for my HTML5 video, but on older devices (such as BlackBerry OS4), neither the HTML5 player nor the Flash fallback works.
Is there any way to detect this - some sort of onError callback within videojs itself, or any other way that I can detect if the video isn't supported?
I've found this code, which works for older BlackBerry devices, but it then also detects older IE, which would run happily with the Flash fallback.
function supportsVideo() {
    return !!document.createElement('video').canPlayType;
}
Any help or pointers would be appreciated.
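One way to narrow the check is to also test for a Flash plugin, so that older IE (which can still use the Flash fallback) is not caught. This is a hedged sketch, not a video.js API: the function name and detection heuristics are illustrative, and the document/navigator parameters exist only so the logic can be tested outside a browser:

```javascript
// Report whether either HTML5 video or a Flash plugin is available.
// Only when both are missing (e.g. BlackBerry OS4) should the page
// show a "video unsupported" fallback.
function canPlayAnything(doc, nav) {
    var html5 = !!doc.createElement('video').canPlayType;
    var flash = false;
    try {
        flash = !!(nav.plugins && nav.plugins['Shockwave Flash']);
        // Older IE exposes Flash via ActiveXObject rather than navigator.plugins;
        // a fuller check would also try new ActiveXObject('ShockwaveFlash.ShockwaveFlash').
    } catch (e) {
        flash = false;
    }
    return html5 || flash;
}

// Usage:
// if (!canPlayAnything(document, navigator)) { /* show a poster or download link */ }
```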