TokBox/Vonage allowing audio capture support when screensharing

The Screen Capture API, specifically getDisplayMedia(), currently supports screensharing and sharing the audio playing on your device (e.g. YouTube) at the same time (see the docs). Is this currently supported in the TokBox/Vonage Video API? Has anyone been able to achieve this?
I guess there could be some workaround using getDisplayMedia and passing the audio source when publishing, e.g. OT.initPublisher({ audioSource: newDisplayMediaAudioTrack }), but that doesn't seem like a clean solution.
Thanks,

Manik here from the Vonage Client SDK team.
Although this feature does not exist in the Video Client SDK just yet, you can accomplish sharing audio along with the screen by creating a publisher like so:
let publisher;
try {
  // capture the screen and the tab/system audio in a single request
  // (this must run inside an async function)
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true, audio: true });
  const audioTrack = stream.getAudioTracks()[0];
  const videoTrack = stream.getVideoTracks()[0];
  // feed both tracks into the publisher
  publisher = OT.initPublisher({ audioSource: audioTrack, videoSource: videoTrack });
} catch (e) {
  // handle error (e.g. the user denied the screen-share prompt)
}

If you share a tab but the tab doesn't play audio (a static PDF or PPT, for example), the screen flickers. To avoid this, specify a frameRate constraint for the video stream; see https://gist.github.com/rktalusani/ca854ca8621c20488bea6e62ad04e341
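As a minimal sketch of that workaround (the exact frame-rate value here is an assumption, tune it for your use case):
// inside an async function
const stream = await navigator.mediaDevices.getDisplayMedia({
  // capping the frame rate avoids the flicker on static tabs
  video: { frameRate: { ideal: 15 } },
  audio: true
});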

Related

How to add a screen share function using PeerJS?

Currently, I am working on a WebRTC project where you can make and receive calls. I also want to add screen-share functionality to it.
Can anyone provide a good documentation link?
I am currently following the official PeerJS documentation.
I was able to do audio-video calling but am stuck on the screen-sharing part.
Help me!
You need to get a stream just like you do with getUserMedia, and then you give that stream to PeerJS.
It should be something like this:
var displayMediaOptions = {
  video: {
    cursor: "always"
  },
  audio: false
};
navigator.mediaDevices.getDisplayMedia(displayMediaOptions)
  .then(function (stream) {
    // add this stream to your peer
  });
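To complete the placeholder comment, handing the stream to PeerJS might look like this; peer and remotePeerId are hypothetical names for your Peer instance and the callee's ID, not part of the answer:
navigator.mediaDevices.getDisplayMedia(displayMediaOptions)
  .then(function (stream) {
    // 'peer' is your PeerJS Peer instance; 'remotePeerId' is hypothetical
    peer.call(remotePeerId, stream);
  })
  .catch(function (err) {
    console.error("getDisplayMedia failed:", err);
  });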
I'm working with and learning about WebRTC. From what I've read, I think the solution here probably hinges on getDisplayMedia. That's also what this React, Node and PeerJS tutorial suggests (though I haven't tried it myself yet).
let screenShare = document.getElementById('shareScreen');
screenShare.addEventListener('click', async () => {
  const captureStream = await navigator.mediaDevices.getDisplayMedia({
    audio: true,
    video: { mediaSource: "screen" }
  });
  // instead of adminId, pass the peerId of the peer who will receive captureStream
  myPeer.call(adminId, captureStream);
});

Set grid layout in video call UI (web SDK)

I want to achieve a grid layout in my video call application, which I am building with Agora's Web SDK.
I was browsing the docs, but I couldn't find any help on how to achieve a grid layout in video conferencing.
The best-fit and grid layouts are only available in the cloud recording APIs.
Any previous reference or GitHub repo where it is implemented would also work.
Thanks for the help!
The Agora Web SDK provides a library for video streaming; it does not enforce a UI. Building the UI is your task. That being said, Agora makes it very easy to add video chat to your application.
In your case you can build a grid layout using CSS Grid or any framework of your choosing. To connect Agora to your grid layout, use the user-published event to create a new grid element and subscribe to the new stream. Once the subscribe() promise resolves, use the video track's .play() method to play the video on a specific DOM element:
client.on("user-published", async (user, mediaType) => {
  // initiate the subscription
  await client.subscribe(user, mediaType);
  // if the subscribed track is an audio track
  if (mediaType === "audio") {
    const audioTrack = user.audioTrack;
    // play the audio
    audioTrack.play();
  } else {
    const videoTrack = user.videoTrack;
    // play the video on the given DOM element
    videoTrack.play(DOM_ELEMENT);
  }
});
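As a minimal sketch of the CSS Grid side (the #video-grid container, its styling, and the cell naming are assumptions for illustration, not part of Agora's API):
// assumed markup: <div id="video-grid"></div> styled with
//   display: grid; grid-template-columns: repeat(auto-fit, minmax(240px, 1fr));
client.on("user-published", async (user, mediaType) => {
  await client.subscribe(user, mediaType);
  if (mediaType === "video") {
    // one grid cell per remote video track
    const cell = document.createElement("div");
    cell.id = "player-" + user.uid;
    document.getElementById("video-grid").appendChild(cell);
    user.videoTrack.play(cell);
  } else if (mediaType === "audio") {
    user.audioTrack.play();
  }
});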

Implement Live Broadcast by Agora.io

I am trying to implement Live Broadcast with Agora.io in a React Native mobile application. I have previously implemented video calling successfully. I have gone through the documentation and compared video call to live broadcast (both in the Web SDK). The only difference I could find is the mode of the client, which corresponds to channelProfile in the React Native SDK. The documentation says there are three different modes: Communication, Live Broadcast, and Gaming. When I implemented video calling I assigned 1 as the value of channelProfile; it worked fine and the quality was good enough. However, when I assign 2 to channelProfile to indicate a Live Broadcast, the quality goes down heavily. Am I doing something wrong in my implementation of Live Broadcast? How can I improve the quality of the Live Broadcast?
For consideration, I put my code below:
const config = {
  appid: 'MY APP ID',
  channelProfile: this.props.navigation.getParam('channelProfile', 2),
  clientRole: this.props.navigation.getParam('clientRole', 1),
  videoEncoderConfig: {
    width: 360,
    height: 480,
    bitrate: 1,
    frameRate: FPS30,
    orientationMode: Adaptative,
  },
  audioProfile: AudioProfileDefault,
  audioScenario: AudioScenarioDefault
}
RtcEngine.on('userJoined', (data) => {
  console.warn("user joined", data);
  const { peerIds } = this.state;
  if (peerIds.indexOf(data.uid) === -1) {
    this.setState({
      peerIds: [...this.state.peerIds, data.uid]
    });
  }
});
RtcEngine.on('error', (error) => {
  console.warn("error", error);
});
RtcEngine.init(config);
In Agora's SDK there used to be three channel modes, but recently the Gaming SDK has been combined with the native SDKs, so there are only two channel modes: communication and broadcast.
Each mode optimizes for different qualities within the channel and within the streams. For broadcasting, the documentation mentions that, when using the default bitrate, broadcast mode uses twice the bitrate of communication mode.
If you are having quality issues you should consider changing your bitrate: your code currently sets the bitrate to 1, which is very low. Agora provides a list of suggested resolution profiles, frame rates, and bitrates.
Agora Video Bitrate documentation: https://docs.agora.io/en/Interactive%20Broadcast/API%20Reference/oc/Classes/AgoraVideoEncoderConfiguration.html#//api/name/bitrate
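As a sketch of the fix, you would replace the bitrate of 1 with a value from Agora's suggested table; the 600 Kbps figure below is an illustrative assumption for roughly 360p at 30 fps, not a value taken from the question:
const config = {
  appid: 'MY APP ID',
  channelProfile: 2, // Live Broadcast
  clientRole: 1,     // broadcaster
  videoEncoderConfig: {
    width: 360,
    height: 480,
    bitrate: 600,    // Kbps; a bitrate of 1 effectively starves the encoder
    frameRate: FPS30,
    orientationMode: Adaptative,
  },
  audioProfile: AudioProfileDefault,
  audioScenario: AudioScenarioDefault
}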

Audio Player in NativeScript-Vue

I have an mp3 playlist and I want to play these audio tracks in an audio player in NativeScript-Vue. However, there is no NativeScript-Vue plugin for it.
There is, however, a NativeScript plugin, nativescript-audio, which can be used for playing audio.
In the following Playground example, you will notice that it has been adapted to play in a NativeScript-Vue application.
https://play.nativescript.org/?template=play-vue&id=83Hs3D&v=19
This can work; the problem, however, is that the player is created in the mounted() hook, and even the mp3 file path is supplied there. For me, the mp3 file is loaded asynchronously, added to a Vuex store, and then available as a computed property in the component.
How can I adapt this code to take the mp3 file from a computed property rather than hard-coding it in mounted()?
Here is the documentation for this plugin - https://github.com/bradmartin/nativescript-audio
I was able to find a solution.
Watch your computed property. Let's say it's called media.
On change, update the audio track using the following code:
const playerOptions = {
  audioFile: this.media,
  loop: false,
  autoplay: false
}
this._player
  .playFromUrl(playerOptions)
  .then(function (res) {
    console.log(res);
  })
  .catch(function (err) {
    console.log('something went wrong..', err);
  });
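Wired into the component, the watcher might look roughly like this; a sketch, assuming the Vuex getter is named currentTrack and that this._player is the TNSPlayer instance created in mounted(), as in the Playground example:
import { TNSPlayer } from 'nativescript-audio';

export default {
  computed: {
    // the mp3 path loaded asynchronously into the Vuex store;
    // the getter name 'currentTrack' is an assumption
    media() {
      return this.$store.getters.currentTrack;
    }
  },
  watch: {
    media(newFile) {
      if (!newFile) return;
      // autoplay: true here (unlike the snippet above) so the new track starts on change
      this._player
        .playFromUrl({ audioFile: newFile, loop: false, autoplay: true })
        .catch(err => console.log('something went wrong..', err));
    }
  },
  mounted() {
    // create the player once; the track arrives later via the watcher
    this._player = new TNSPlayer();
  }
};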

Is it possible to broadcast audio with screensharing with WebRTC

Is it possible to broadcast audio along with screensharing in WebRTC?
Simply calling getUserMedia with audio: true fails with a permission-denied error.
Is there any workaround that could be used to broadcast audio as well?
Will audio be implemented alongside screensharing?
Thanks.
Refer to this demo: Share screen and audio/video from a single peer connection!
Multiple streams are captured and attached to a single peer connection. AFAIK, audio along with chromeMediaSource:screen is "still" not permitted.
Updated at April 21, 2016
Now you can capture audio+screen using a single getUserMedia request on both Firefox and Chrome.
However, Chrome merely supports audio+tab, i.e. you can NOT capture the full screen along with audio.
Audio+tab means any Chrome tab along with the microphone.
Updated at Jan 09, 2017
You can capture both audio and screen streams by making two parallel (UNIQUE) getUserMedia requests.
You can then use the addTrack method to add the audio track into the screen stream:
var audioStream = captureUsingGetUserMedia();
var screenStream = captureUsingGetUserMedia();
var audioTrack = audioStream.getAudioTracks()[0];
// add the audio track into the screen stream
screenStream.addTrack(audioTrack);
Now screenStream has both audio and video tracks.
nativeRTCPeerConnection.addStream( screenStream );
nativeRTCPeerConnection.createOffer(success, failure, options);
As of May 2020
To share the audio track of the screen share, you can use getDisplayMedia instead of getUserMedia (see the docs):
navigator.mediaDevices.getDisplayMedia({ audio: true, video: true })
This is currently only supported in Chrome/Edge, and only when using the "Chrome Tab" sharing option. You'll see a checkbox for Share audio in the dialog box.
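Because the user can leave that checkbox unchecked (or share a window rather than a tab), it is worth verifying that an audio track actually came back; a minimal sketch:
// inside an async function
const stream = await navigator.mediaDevices.getDisplayMedia({ audio: true, video: true });
if (stream.getAudioTracks().length === 0) {
  // the user shared a window/screen, or left "Share audio" unchecked
  console.warn("No audio track captured with this screen share");
}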
In Firefox, you can use getUserMedia to grab a screen share (etc.) and mic audio in the same request, and can attach it to a PeerConnection. You can combine it with other streams; multiple audio or video tracks in a single PeerConnection in Firefox requires Firefox 38 or later. Currently 38 is Developer Edition (formerly termed Aurora), and it should reach release in around 9 weeks or so.
Yes, you can record audio and the screen on Chrome with two requests:
getScreenId(function (error, sourceId, screen_constraints) {
  // capture the screen
  navigator.getUserMedia = navigator.mozGetUserMedia || navigator.webkitGetUserMedia;
  navigator.getUserMedia(screen_constraints, function (stream) {
    // capture the microphone with a second, separate request
    navigator.getUserMedia({ audio: true }, function (audioStream) {
      // merge the mic track into the screen stream
      stream.addTrack(audioStream.getAudioTracks()[0]);
      var mediaRecorder = new MediaStreamRecorder(stream);
      mediaRecorder.mimeType = 'video/mp4';
      mediaRecorder.stream = stream;
      // preview the combined stream
      var video = document.getElementById('screen-video');
      if (video) {
        video.src = URL.createObjectURL(stream);
        video.width = 360;
        video.height = 300;
      }
    }, function (error) {
      alert(error);
    });
  }, function (error) {
    alert(error);
  });
});