How do I send my local video stream to multiple remote peers? Do I need to instantiate one PeerConnection per remote peer? Or can the same PeerConnection be used for all remote peers at the same time?
According to user dom on #webrtc on irc.w3.org, each PeerConnection is associated with a single remote peer. The developer is responsible for sharing the same stream instance with multiple PeerConnections:
<Cow_woC> Can a single PeerConnection connect to multiple remote peers, or only a single one at a time? If I want to stream the same video to multiple remote peers, what am I supposed to do?
<dom> Cow_woC, you need to manage several PeerConnection objects
<dom> and plug your video stream to each of them
<Cow_woC> dom: How do I share the camera feed with multiple PeerConnections? Is getUserMedia() allowed to return the same resource (and share it) multiple times?
<Cow_woC> dom: Or am I responsible for keeping the reference around and passing it to multiple PeerConnections?
<dom> the latter
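The pattern dom describes can be sketched as follows: call getUserMedia() once, keep the stream reference, and add it to one RTCPeerConnection per remote peer. The `signalTo` callback and the STUN server URL are assumptions standing in for your own signaling setup.

```javascript
// Sketch: one camera capture, shared across several RTCPeerConnections.
// `signalTo(peerId, message)` is a placeholder for your signaling channel.
const peerConnections = new Map(); // remote peer id -> RTCPeerConnection

async function startBroadcast(remotePeerIds, signalTo) {
  // Capture the camera ONCE and keep the reference around.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });

  for (const peerId of remotePeerIds) {
    const pc = new RTCPeerConnection({
      iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
    });

    // Plug the SAME stream's tracks into each connection.
    stream.getTracks().forEach(track => pc.addTrack(track, stream));

    pc.onicecandidate = e => {
      if (e.candidate) signalTo(peerId, { candidate: e.candidate });
    };

    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);
    signalTo(peerId, { sdp: pc.localDescription });

    peerConnections.set(peerId, pc);
  }
  return stream;
}
```

Each connection negotiates independently, but they all reuse the tracks of the single captured stream.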
Related
I am studying this sample and working to modify it so that video can be streamed from one browser instance and then appear on the second instance.
I have a few questions:
Is the ICEServers array empty because this runs on localhost? Would it still be okay when running in two browser instances?
I also don't understand why there are two RTCPeerConnection objects. Shouldn't there only be a single object, since the RTCPeerConnection class contains methods for both the local and remote ends?
In a scenario where an endpoint adds multiple video streams to a peer connection, the onaddstream event handler is invoked multiple times on the peer end.
Is there any means by which an application (on the peer end) can distinguish between the different video streams (within the onaddstream handler)? By identifying each stream, it can be associated with a different video element.
RTCPeerConnection has a couple of methods:
getRemoteStreams
getLocalStreams
These methods return an array of the MediaStreams associated with the remote/local end of the connection. Every MediaStream has an id, so you will be able to identify them.
There is also getStreamById, but it is deprecated in favor of the two methods above.
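A minimal sketch of using the stream id to give each incoming stream its own video element (the handler wiring is an assumption; newer code would use ontrack instead of the legacy onaddstream):

```javascript
// Sketch: distinguishing incoming streams by MediaStream.id so each one
// gets its own <video> element.
const seenStreams = new Map(); // MediaStream.id -> MediaStream

// Pure helper: true the first time a given stream id is seen.
function isNewStream(stream, seen) {
  if (seen.has(stream.id)) return false;
  seen.set(stream.id, stream);
  return true;
}

function handleAddStream(event) {
  const stream = event.stream; // onaddstream delivers one MediaStream
  if (!isNewStream(stream, seenStreams)) return;

  const video = document.createElement('video');
  video.autoplay = true;
  video.srcObject = stream; // stream.id ties this element to this stream
  document.body.appendChild(video);
}

// Wire it up: pc.onaddstream = handleAddStream;
```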
I am able to stream video with Kurento using WebRTC. Now I need to implement a multi-party audio conference using the MCU feature of Kurento Media Server, so that the audio coming from all clients is merged and the combined audio is sent back to all clients efficiently over WebRTC.
If that works, each client needs only two connections (one to send and one to receive); otherwise every client would need a peer connection to every other client, which is not feasible.
Please suggest sample code that implements an audio MCU with Kurento Media Server, or guide me on implementing it myself.
I'm afraid there's no code that allows that in Kurento out of the box. There is the Composite media element, but that is usually for audio AND video: it combines the streams into a single stream matrix of the required size. With more than 9 streams you may run into performance problems, but if you only want to process audio it can surely handle many more. To use only audio, just connect the AUDIO stream to the HubPort.
EDIT 1
The code to generate the media elements needed, and the correct way to establish an audio-only connection, is as follows:
// Media elements, created in an existing MediaPipeline
WebRtcEndpoint webrtc = new WebRtcEndpoint.Builder(pipeline).build();
Composite composite = new Composite.Builder(pipeline).build();
HubPort hubport = new HubPort.Builder(composite).build();

// Feed only the client's audio into the mixer
webrtc.connect(hubport, MediaType.AUDIO);
Please note that the connection is from the WebRtcEndpoint to the HubPort. If you need it to be bidirectional, you'll need to connect that way also.
hubport.connect(webrtc, MediaType.AUDIO);
I'm currently trying to make 3 arduinos talking to each other with ZigBee, and it's kinda working.
But I currently use AT mode on the XBees, and it's a bit cumbersome to switch the destination address on the Coordinator of the network (1 Coordinator and 2 Routers).
Can I put the Coordinator in API mode (to make it easier to switch addresses with xbee-api for Arduino) but still communicate with the AT-mode routers and send/receive data from them?
Thanks for your answer :)
Absolutely, and it's common to set up a network like that. You can have AT routers connected to "dumb" hosts that just send streams of serial data, and an API coordinator that receives from multiple routers, identifying the source of each message using the headers of the API frames, and able to send unicast messages back to individual routers or broadcast messages to all routers.
Make use of the 0x10 Transmit Request API frame to send from the coordinator to the routers. You'll receive either 0x90 or 0x91 frames (depending on the setting of ATAO).
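A Transmit Request frame is just a byte array with a start delimiter, a length header, and a checksum. A minimal sketch of building one by hand (the address and payload are made-up examples; in practice a library such as xbee-api does this for you):

```javascript
// Sketch: building a ZigBee Transmit Request (0x10) API frame by hand.
function buildTransmitRequest(frameId, dest64, payload) {
  const data = [
    0x10,       // frame type: Transmit Request
    frameId,    // frame ID (non-zero so you get a Transmit Status back)
    ...dest64,  // 64-bit destination address, 8 bytes MSB-first
    0xFF, 0xFE, // 16-bit address: 0xFFFE = unknown
    0x00,       // broadcast radius: 0 = maximum hops
    0x00,       // transmit options
    ...payload, // RF data
  ];
  // API checksum: 0xFF minus the low byte of the frame-data sum.
  const checksum = 0xFF - (data.reduce((sum, b) => sum + b, 0) & 0xFF);
  return [0x7E, (data.length >> 8) & 0xFF, data.length & 0xFF, ...data, checksum];
}

// Example: send the byte 0x41 ('A') to the coordinator (64-bit address 0).
const frame = buildTransmitRequest(0x01, [0, 0, 0, 0, 0, 0, 0, 0], [0x41]);
```

The routers stay in transparent AT mode and just see the raw payload bytes on their serial port; only the coordinator deals with frames.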
I am trying to create an application which requires a user to send his local video stream to multiple peers using WebRTC. As far as I've seen, I am responsible for managing several PeerConnection objects, because a PeerConnection can only connect to a single peer at a time. What I want to know is whether it is possible to create a connection and send my local stream to a peer without him sending his local stream to me.
Simply don't call peer.addStream() on the viewer's side to make it one-way streaming!
You can disable audio/video media lines in the session description by setting OfferToReceiveAudio and OfferToReceiveVideo to false.
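A sketch of the send-only side, using the modern lowercase option names (older browsers took the capitalized form as mandatory constraints); the signaling step is assumed:

```javascript
// Sketch: a send-only offer. Disabling both "receive" directions means the
// resulting m-lines are sendonly, so the remote peer streams nothing back.
const offerOptions = {
  offerToReceiveAudio: false,
  offerToReceiveVideo: false,
};

async function createSendOnlyOffer(pc, localStream) {
  // Add our tracks so there is something to send...
  localStream.getTracks().forEach(track => pc.addTrack(track, localStream));
  // ...but don't ask for anything in return.
  const offer = await pc.createOffer(offerOptions);
  await pc.setLocalDescription(offer);
  return offer; // deliver via your signaling channel (not shown)
}
```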
A 3-way handshake hasn't been drafted by the RTCWeb IETF WG yet, because the browser would need to take care of a lot of things simultaneously, such as multiple tracks and multiple media lines, where each m-line would have to point to a unique peer.