Broadcasting to multiple peers - WebRTC

How can we combine webcam video and a screen share simultaneously into one single stream using WebRTC, so it can be sent to all the peers as one stream?

You need to use a canvas HTML element and blend the streams received via calls to getUserMedia (webcam) and getDisplayMedia (screen) into that canvas element.
Then take the resulting blended picture from the canvas element and stream it with RTCPeerConnection.
Live example here:
https://unrealstreaming.net:8443/UnrealWebRTCPublishingDemo/publish.aspx
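A minimal sketch of that approach, assuming a `<canvas id="mix">` element on the page and an already-created RTCPeerConnection `pc` (both names are placeholders, and the picture-in-picture layout computed by `pipRect` is an illustrative choice, not part of the demo above):

```javascript
// Pure helper: picture-in-picture rectangle for the webcam overlay
// (bottom-right corner, 1/4 of the canvas width, 16:9 aspect).
function pipRect(canvasWidth, canvasHeight) {
  const w = Math.round(canvasWidth / 4);
  const h = Math.round(w * 9 / 16);
  return { x: canvasWidth - w - 16, y: canvasHeight - h - 16, w, h };
}

async function startMixedStream(pc) {
  const canvas = document.getElementById('mix');
  const ctx = canvas.getContext('2d');

  const cam = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const screen = await navigator.mediaDevices.getDisplayMedia({ video: true });

  // Off-DOM <video> elements act as frame sources; muted so autoplay is allowed.
  const camVideo = Object.assign(document.createElement('video'), { srcObject: cam, muted: true });
  const screenVideo = Object.assign(document.createElement('video'), { srcObject: screen, muted: true });
  await Promise.all([camVideo.play(), screenVideo.play()]);

  // Redraw both sources into the canvas on every animation frame.
  (function draw() {
    ctx.drawImage(screenVideo, 0, 0, canvas.width, canvas.height);
    const r = pipRect(canvas.width, canvas.height);
    ctx.drawImage(camVideo, r.x, r.y, r.w, r.h);
    requestAnimationFrame(draw);
  })();

  // One outgoing stream: blended video from the canvas plus the mic audio track.
  const mixed = canvas.captureStream(30);
  cam.getAudioTracks().forEach(t => mixed.addTrack(t));
  mixed.getTracks().forEach(t => pc.addTrack(t, mixed));
  return mixed;
}
```

Because every peer receives the same pre-blended canvas stream, no per-peer compositing is needed.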

Related

react native webrtc replace video stream with new stream

I am implementing screen capture inside my WebRTC application. I successfully get a MediaStream and frames from the Broadcast Extension, but I don't know why I cannot replace the video track from localVideoTrack with the screen-capture video track. Here is my code:
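The snippet from the question is not reproduced here, but in standard WebRTC (and in react-native-webrtc, which mirrors the browser API) the usual approach is to locate the RTCRtpSender currently carrying the outgoing video track and call replaceTrack() on it. A hedged sketch, where `findVideoSender` and `switchToScreenTrack` are hypothetical helper names:

```javascript
// Pure helper: pick the sender whose current track is video, or null if none.
function findVideoSender(senders) {
  return senders.find(s => s.track && s.track.kind === 'video') || null;
}

async function switchToScreenTrack(pc, screenStream) {
  const screenTrack = screenStream.getVideoTracks()[0];
  const sender = findVideoSender(pc.getSenders());
  if (!sender) throw new Error('No video sender on this connection');
  // replaceTrack swaps the outgoing track without renegotiating the connection.
  await sender.replaceTrack(screenTrack);
}
```

A common pitfall is replacing the track on the local MediaStream object instead of on the sender: the remote side only sees what the RTCRtpSender transmits.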

Streamlit-webrtc with custom video source

I'm working on a traffic detection project, where I get a video stream from a YouTube livestream using CamGear and then process each frame with OpenCV. I want to display the final video with detections in a Streamlit app. I tried passing processed frames to st.frame, but for some reason only the first few frames are shown and then the video freezes. Now I am looking at using the webrtc component, but I can't figure out how to change the video source from the webcam. Thanks.

How to close peerConnection in WebRtc

WebRtc sample
If I press call, the allow/deny menu is shown at the top of the screen. When I press allow, Chrome starts displaying a red circle in the header, which signals that the microphone is active. All other sounds in other Chrome tabs are muted (for example a YouTube video).
When I press hang up, the red circle does not disappear, and the YouTube video in the second tab is still muted. I have to press F5 to restore the state from before the call.
Is there any PeerConnection method to stop the call and stop the mic recording?
Yes, the reason the "red light" is still on is that the media tracks in the MediaStream object (called localstream on that particular page) gathered through getUserMedia are still playing/active.
To close a media stream, and thus release any hold on the local media inputs, simply call localstream.stop(); that will end the stream, stop accessing the local media input, and stop all associated media tracks. (Note that MediaStream.stop() has since been deprecated and removed from modern browsers.)
You can also do this by calling stop() on the individual media tracks (obtained from getVideoTracks() or getAudioTracks() on the MediaStream); in modern browsers this is the only supported approach.
As for other audio being turned down in other pages/apps, here is the crbug handling that issue.
Side note: if the media is being pushed through a PeerConnection, then you will probably have to renegotiate the connection because the MediaStream changed.
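A sketch of the teardown described above, assuming `localStream` is the stream obtained from getUserMedia and `pc` is the open RTCPeerConnection (both parameter names are placeholders):

```javascript
// Release the mic/camera and close the call.
// Stopping every track is what turns off the browser's device indicator.
function hangUp(pc, localStream) {
  localStream.getTracks().forEach(track => track.stop());
  // Also stop any tracks still attached to outgoing senders.
  pc.getSenders().forEach(sender => sender.track && sender.track.stop());
  pc.close();
}
```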

Could one WebRtcConnection handle both screen sharing and camera video?

I am trying to create a peer-to-peer meeting based on WebRTC. I am able to see the other person on their camera, or watch their shared screen, but can I watch their screen while also watching them on their camera?
Try this demo: https://www.webrtc-experiment.com/demos/screen-and-video-from-single-peer.html
For the offerer, it attaches two unique MediaStream objects:
Audio+Video
Screen
Remember, Firefox doesn't support this feature, yet!
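The Firefox caveat in that demo is dated; current major browsers let you attach multiple MediaStreams to a single RTCPeerConnection via addTrack. A minimal sketch, assuming an already-created RTCPeerConnection `pc`:

```javascript
// Send camera and screen on one connection as two distinct streams.
async function attachCameraAndScreen(pc) {
  const cam = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const screen = await navigator.mediaDevices.getDisplayMedia({ video: true });
  // The second argument to addTrack associates the track with a stream id,
  // which the remote side sees in its ontrack events.
  cam.getTracks().forEach(t => pc.addTrack(t, cam));
  screen.getTracks().forEach(t => pc.addTrack(t, screen));
}

// Pure helper for the receiving side: group incoming tracks by stream id,
// so the UI can tell the camera stream apart from the screen stream.
function groupTracksByStream(trackEvents) {
  const byStream = {};
  for (const ev of trackEvents) {
    const id = ev.streams[0].id;
    (byStream[id] = byStream[id] || []).push(ev.track);
  }
  return byStream;
}
```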

HTML5 camera on video element filter not copying to canvas

I am making a new collage application using Google Chrome's new WebRTC features, which let me access the camera in JavaScript. I have been able to put the camera feed on a video element, take snapshots of the camera, store them in variables, and draw them onto my canvas.
My new problem is that even when the CSS -webkit-filter is changed on the video element (by clicking the video preview), the copied data is raw and not filtered. Is there any way to copy and draw the filtered data from the video element? Or to draw a filter onto a region of a canvas?
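A CSS filter (including -webkit-filter) only changes how the element is composited on screen; drawImage() always copies the unfiltered frames. One fix is to apply the equivalent filter on the canvas context itself via ctx.filter, which is supported in current Chrome and Firefox (it did not exist when -webkit-filter was new). A sketch, where the FILTERS preset table and the function names are hypothetical:

```javascript
// Hypothetical preset table mapping UI filter names to canvas filter strings.
const FILTERS = {
  none:   'none',
  mono:   'grayscale(100%)',
  warm:   'sepia(60%)',
  blurry: 'blur(4px)',
};

// Resolve a preset name, falling back to no filter for unknown names.
function filterFor(name) {
  return FILTERS[name] || 'none';
}

function snapshotWithFilter(video, canvas, name) {
  const ctx = canvas.getContext('2d');
  ctx.filter = filterFor(name);   // affects subsequent draw calls
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  ctx.filter = 'none';            // reset so later draws are unfiltered
}
```

The alternative, drawing raw pixels and filtering per-pixel with getImageData/putImageData, also works but is much slower for live video.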