How to switch local video without switching audio in a WebRTC conference?

I have two or more cameras connected to my PC. My goal is to switch between the local cameras in an ongoing WebRTC video conference, but only to switch the video from camera 1 to camera 2, NOT the audio. Audio should always come from camera 1.
How to toggle between the two videoTracks?

See this answer.
Basically you can use replaceTrack() in Firefox to do this today for a seamless replacement of a camera. This is being added to the spec, but Chrome doesn't support it yet.
The best you can do in Chrome today is to get a new stream with the same mic but a different camera, remove the old stream/tracks from the PeerConnection, add the new one, and then handle onnegotiationneeded and renegotiate. This will likely cause a glitch and will require at least a couple of round-trip times to complete. (This will work in Firefox as well.)
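For illustration, here is a minimal sketch of the replaceTrack() approach, assuming an existing RTCPeerConnection pc that is already sending audio and video from camera 1 (switchToCamera and newCameraDeviceId are illustrative names, not part of any API):

async function switchToCamera(pc, newCameraDeviceId) {
  // Capture video only from the new camera; audio stays on camera 1's track.
  const newStream = await navigator.mediaDevices.getUserMedia({
    video: { deviceId: { exact: newCameraDeviceId } },
    audio: false
  });
  const [newVideoTrack] = newStream.getVideoTracks();

  // Find the sender that carries video and swap its track in place.
  const videoSender = pc.getSenders()
    .find(sender => sender.track && sender.track.kind === 'video');
  await videoSender.replaceTrack(newVideoTrack);
}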

Related

Using getDisplayMedia to record the screen: when a microphone is added to the system, there is no sound!

In the Chrome browser on a MacBook, I am recording a 30-minute class. In code, I use getDisplayMedia to get the screen stream:
navigator.mediaDevices.getDisplayMedia({ audio: true, video: true }).then((mediaStream) => {
  mediaRecorder = new MediaRecorder(mediaStream, { mimeType: 'video/webm' });
});
About 5 minutes after the recording started, I connected an external microphone to the MacBook and lectured through it, hoping that my voice would be recorded through the microphone. The strange thing is that from the moment the microphone was connected, no sound was recorded at all. What is going on, and how can I keep recording when switching microphones?
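No definitive answer here, but one commonly suggested workaround is to record the microphone through a Web Audio mixing node rather than the raw device track, so that the audio track MediaRecorder sees never changes even when the physical device does. A hedged sketch, with connectMic and startRecording as illustrative names and the root cause assumed rather than confirmed:

const audioCtx = new AudioContext();
const mixDest = audioCtx.createMediaStreamDestination();
let micSource = null;

async function connectMic() {
  // (Re)connect whichever microphone is currently the default input.
  if (micSource) micSource.disconnect();
  const mic = await navigator.mediaDevices.getUserMedia({ audio: true });
  micSource = audioCtx.createMediaStreamSource(mic);
  micSource.connect(mixDest);
}

async function startRecording() {
  const screen = await navigator.mediaDevices.getDisplayMedia({ video: true });
  await connectMic();

  // Record the screen's video track plus the mixer's stable audio track.
  const recorded = new MediaStream([
    ...screen.getVideoTracks(),
    ...mixDest.stream.getAudioTracks()
  ]);
  const mediaRecorder = new MediaRecorder(recorded, { mimeType: 'video/webm' });
  mediaRecorder.start();

  // When plugging in the external mic changes the device list, re-grab it.
  navigator.mediaDevices.addEventListener('devicechange', connectMic);
}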

HLS Player: Clear video.js buffer on click

I have two live videos feeding an encoder, which creates H.264 chunk files and an HLS manifest served by an Apache web server.
A browser page using video.js shows a player. Pressing "play" on the browser properly plays the video. It works well.
However, if we change video sources (by flipping a switch between the two feeds), there is a considerable delay (10 seconds) before the new content is displayed in the player. I'd like to get that down to 3 seconds.
It appears that video.js and/or the browser's HTML5 player is buffering that amount of content. (If you delete the files on the web server, kill Apache, or even pull the Ethernet cable, the video keeps on playing!)
A button on the web page controls the switch. When clicked, I would also like to clear or reset the player so that it immediately re-reads the index.m3u8 manifest and downloads the new chunks.
So far, I haven't found anything promising searching the internet or the video.js API docs. There are lots of articles on API calls for fetching the current buffer percentage, but I cannot find any API for clearing it altogether.
Any ideas?
The encoder is set for 3-second chunks, and the playlist depth is set to 10 entries.
I had a similar problem. Since I could not find a reliable API for this, I came up with a rather dirty workaround to clear the buffer:
var ctime = player.currentTime();
player.currentTime(0);      // seek out of the buffered range
player.currentTime(ctime);  // seek back, forcing the buffer to rebuild
This currently works for me in all major browsers.
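For completeness, here is a sketch of wiring the workaround to the switch button; the button id and manifest URL are placeholders. Re-setting the source is a heavier alternative that makes video.js discard everything and re-read index.m3u8 immediately:

document.getElementById('switch-source').addEventListener('click', function () {
  var ctime = player.currentTime();
  player.currentTime(0);      // seek out of the buffered range
  player.currentTime(ctime);  // seek back, dropping the stale buffer

  // Heavier alternative: reload the source so the manifest is re-fetched at once.
  // player.src({ src: '/hls/index.m3u8', type: 'application/x-mpegURL' });
  // player.play();
});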

Liveview on Android/QX1 Sony Camera API 2.01 fails

Using the supplied Android demo from https://developer.sony.com/downloads/all/sony-camera-remote-api-beta-sdk/, I connected to the Wi-Fi connection of a Sony QX1. The sample application finds the camera device and is able to connect to it.
The liveview is not displaying correctly. At most one frame is shown, and then the code hits an exception in SimpleLiveViewSlicer.java:
// Every liveview payload must begin with the fixed 0xFF start byte.
if (commonHeader[0] != (byte) 0xFF) {
    throw new IOException("Unexpected data format. (Start byte)");
}
Shooting a photo does not seem to work. Zooming does work (the lens moves). The camera works fine when using the PlayMemories app directly, so it is not a hardware issue.
Hoping for advice from Sony on this one; standard hardware and the demo application should work.
Can you provide some details of your setup?
What version of Android SDK are you compiling with?
What IDE and OS are you using?
Have you installed the latest firmware? (http://www.sony.co.uk/support/en/product/ILCE-QX1#SoftwareAndDownloads)
Edit:
We tested the sample code using a QX1 lens and the same setup as you and were able to run the sample code just fine.
One thing to check is whether the liveview is ready to transfer images. To confirm whether the camera is ready to transfer liveview images, the client can check “liveviewStatus” status of “getEvent” API (see API specification for details). Perhaps there is some timing issue due to connection speed that is causing the crash.
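Since the Camera Remote API is plain JSON-RPC over HTTP, that readiness check can be sketched independently of the Android demo roughly as follows (the endpoint and request shape follow the API specification, but treat the exact result layout as an assumption and consult the spec):

// Poll getEvent until the camera reports that liveview is ready.
async function waitForLiveviewReady(cameraUrl /* e.g. http://192.168.122.1:8080/sony/camera */) {
  for (;;) {
    const res = await fetch(cameraUrl, {
      method: 'POST',
      body: JSON.stringify({ method: 'getEvent', params: [false], id: 1, version: '1.0' })
    });
    const { result } = await res.json();
    // getEvent returns an array of event objects (some entries may be null).
    const lv = result && result.find(e => e && e.type === 'liveviewStatus');
    if (lv && lv.liveviewStatus) return;        // ready to transfer frames
    await new Promise(r => setTimeout(r, 500)); // otherwise retry shortly
  }
}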

WebRTC: View self-view while muting outgoing video in a call

Currently, the video mute functionality in WebRTC is achieved by setting the enabled property of a video track to false:
stream.getVideoTracks().forEach(function (track) {
  track.enabled = false;
});
But the above code not only mutes the outgoing video; the local self-view, which is rendered from that same local stream, also gets black frames.
Is there a way, to ONLY mute the outgoing video frames, but still be able to show a local self-view?
There's no easy way yet. Once MediaStreamTrack.clone() is supported by browsers, you could clone the video track to get a second instance of it with a separately controllable mute property, and send one track to your self-view and the other to the peerConnection. This would let you turn off video locally and remotely independently.
Today, the only workarounds I know of are: on Chrome, call getUserMedia twice (this should work at least on https, where permissions are persisted so the user won't be prompted twice), which gets you two tracks you can video-mute independently; or on Firefox, use RTCRtpSender.replaceTrack() with a second "fake" video stream from getUserMedia, using the non-standard { video: true, fake: true } constraint, like this.
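Once clone() is available, the approach would look roughly like this minimal sketch (pc and selfViewVideo are assumed to exist; startWithSelfView is an illustrative name):

async function startWithSelfView(pc, selfViewVideo) {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const selfViewTrack = stream.getVideoTracks()[0];
  const outgoingTrack = selfViewTrack.clone(); // independently mutable copy

  // The self-view keeps rendering the original track...
  selfViewVideo.srcObject = new MediaStream([selfViewTrack]);

  // ...while the clone (plus the audio) is what the peer receives.
  pc.addTrack(outgoingTrack, stream);
  stream.getAudioTracks().forEach(track => pc.addTrack(track, stream));

  // Setting outgoingTrack.enabled = false now blacks out only the remote side.
  return outgoingTrack;
}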

recording a remote webrtc stream with RecordRTC

I am using the OpenTok JavaScript WebRTC library to host a 1-to-1 video chat (peer-to-peer).
I can see my peer's video and hear the audio flawlessly.
My wish is to record audio / video of other chat party (remote). For this purpose, I'm using RecordRTC.
I was able to record the video of the other chat participant (the video is outputted to an HTML video element), but so far I have not succeeded in recording the audio (a dead-silent .wav file is as far as I could get). I am using Chrome Canary (30.0.1554.0). This is my method:
var clientVideo = $('#peerdiv video')[0];   // peer's video (html element)
var serverVideo = $('#myselfdiv video')[0]; // my video (html element)
var context = new webkitAudioContext();
var clientStream = context.createMediaStreamSource(clientVideo.webRTCStream);
var serverStream = context.createMediaStreamSource(serverVideo.webRTCStream);
webRTCStream is a custom property I assigned to the HTMLVideoElement object by modifying the source of the OpenTok JS library. It contains the MediaStream object linked to the respective <video> element.
var recorder = RecordRTC({
  video: clientVideo,
  stream: clientStream
});
recorder.recordAudio();
recorder.recordVideo();
The video is recorded. An audio file is also created, with a length close to the video's; however, it is completely silent (and yes, there was a lot of noise being made on the other side during recording).
I've tested this with the video element which displays my own webcam's video stream (and audio), and it worked: both audio and video were recorded:
...
var recorder = RecordRTC({
  video: serverVideo,
  stream: serverStream
});
...
Is there something special about streams originating from a remote location? Any guidance on this issue would be very helpful.
The same issue occurs in the following situations:
If the audio is not stereo (dual-channel), i.e. it is mono audio
If the number of audio input channels is not equal to the number of audio output channels
If the audio input device is not the default device selected in Chrome
I'm still trying to find the actual issue.
I added this experiment for testing purposes; see the console:
https://webrtc-experiment.appspot.com/demos/remote-stream-recording.html
Updated at: Saturday, 1 February 2014, 09:22:04 PKT
Remote audio recording is not supported, and this issue is considered a low-priority wontfix:
Support feeding remote WebRTC MediaStreamTrack output to WebAudio
Connect WebRTC MediaStreamTrack output to Web Audio API
Updated at March 28, 2016
Remote audio+video recording is now supported in RecordRTC, since Chrome version 49+.
Firefox, on the other hand, can only record remote audio.
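With that support in place, recording a remote stream reduces to something like the following sketch, where remoteStream is assumed to be the MediaStream received from the peer connection:

const recorder = RecordRTC(remoteStream, { type: 'video' });
recorder.startRecording();

// ... later, when the session ends ...
recorder.stopRecording(function () {
  const blob = recorder.getBlob(); // contains both remote audio and video
  // e.g. upload the blob or offer it as a download
});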
If Chrome/WebRTC/Opus outputs mono audio by default, and if that is the problem here, I see two options:
Making Opus output stereo (not sure how).
Making the RecordRTC/Recorderjs code work with mono audio.
Or does anyone know any other recording library that works?
This actually works fine in Firefox now. I am using Firefox 29.0.1, and the Audio API can now work with audio stream sources grabbed from remote parties over a peer connection.
To test, go to Muaz Khan's experiment page. I am not sure in which version of Firefox this rolled out, but I would like to thank the team for cranking it out!
The Chrome bug was moved to the Audio API team (see the cr bug to track progress).