For screen sharing, whenever we invoke navigator.mediaDevices.getDisplayMedia(), the user has to select the screen to share before streaming can start. This works fine for one-to-one sharing.
But for one-to-many screen sharing, we have to create a separate WebRTC connection to each peer. The problem is that with 5 peers I have to select the screen 5 times, because every call to getDisplayMedia() for a different peer requires me to select the screen again.
There is no need to call getDisplayMedia() multiple times; you can add the track/stream from the first getDisplayMedia() call to more than one peer connection.
See https://webrtc.github.io/samples/src/content/peerconnection/multiple/ for a sample illustrating this. There is only a single capture call (to getUserMedia in this case).
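A minimal sketch of that idea: capture once, then add the same tracks to every connection. The peer list and the sendOffer() signaling helper are hypothetical placeholders for your own signaling layer.

async function shareToAll(peerIds) {
  // One capture call -> one screen-picker prompt, regardless of peer count.
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true });
  const connections = new Map();

  for (const peerId of peerIds) {
    const pc = new RTCPeerConnection();
    // Reuse the same captured tracks on every connection -- no new prompt.
    for (const track of stream.getTracks()) {
      pc.addTrack(track, stream);
    }
    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);
    sendOffer(peerId, pc.localDescription); // hypothetical signaling helper
    connections.set(peerId, pc);
  }
  return connections;
}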
So I'm trying to set up a 3-way video call with WebRTC.
The first two participants connect fine. I have it set up so that the third person to join the room sends an offer to the first 2 sockets in the room (I'm using Node.js with socket.io as a signaling server). The offer is successfully sent to both sockets, and both sockets send back their answers. Both answers contain 1 audio track and 1 video track; however, for some reason, the connection only works with one of them.
The result is that with 3 users, the first user (usually) ends up seeing both peers, while the second and third users only see one other person. When I look at the connectionState of each peer connection, one of them is stuck in "connecting", while everything else looks fine.
Any ideas?
Are you creating a new PeerConnection for each of the first two people in the room? When you say "first 2 sockets", it sounds like you are trying to use the same PeerConnection for multiple connections.
Each PeerConnection handles only a single 1:1 connection. What you are trying to create is, I believe, a mesh topology: every participant holds one connection per remote peer.
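A rough sketch of what that looks like with socket.io signaling. The `socket` object and the event names ('offer', 'candidate') are assumptions, not code from your question:

const peers = new Map(); // socketId -> RTCPeerConnection

function getOrCreatePeer(socketId, localStream) {
  let pc = peers.get(socketId);
  if (pc) return pc;
  pc = new RTCPeerConnection();
  // Send our local media on this dedicated connection.
  localStream.getTracks().forEach((track) => pc.addTrack(track, localStream));
  // Relay ICE candidates to this specific peer via the signaling server.
  pc.onicecandidate = (e) => {
    if (e.candidate) socket.emit('candidate', { to: socketId, candidate: e.candidate });
  };
  peers.set(socketId, pc);
  return pc;
}

// The third person to join offers to each existing member separately:
async function offerTo(socketId, localStream) {
  const pc = getOrCreatePeer(socketId, localStream);
  await pc.setLocalDescription(await pc.createOffer());
  socket.emit('offer', { to: socketId, sdp: pc.localDescription });
}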
I'm following the Open-Duo sample for Android (from the AgoraIO GitHub), and I'm stuck on this case:
User A and user B are connected in a video call. User B closes the app by tapping Recents and clearing it, so user B has left the app, but user A is still in the call with no signal from B.
The video call continues, but it is one-sided.
How can I detect that user B has left the conversation in this way?
If the user clears/swipes the app away from Recents, you will get a callback called onTaskRemoved in your service; in this callback you can send a signal to user A before the task is destroyed.
Note: for this to work, you need to move your signaling code to a service.
From A's perspective, in this situation there is no difference between B leaving accidentally and B's network going down:
A will not know B is gone until a timeout expires.
A will not know why B is gone.
B can rejoin the old channel once the app is restarted or the network is back up.
I have been using the SimpleWebRTC library for my project.
How can I dynamically change the remote video resolution during a call (like Google Hangouts does when resizing the browser)?
Resizing the browser in Hangouts changes the remote video resolution (.videoWidth / .videoHeight).
Is this associated with WebRTC Plan B?
I would like to know how it is implemented across many peer connections.
Tell the sending end (say, via DataChannels) to change resolution to NxM. At the sending end, until new APIs are available to change a getUserMedia/MediaStream capture size on the fly, you can request a second camera/mic stream and replace the existing streams with it. (Note: this will fire onnegotiationneeded, i.e. renegotiation, and the far side will see a new output stream.)
Smoother (but only in Firefox thus far -- it is in the standardization process) would be to use RTPSender.replaceTrack() to change the video track without touching audio or renegotiating; a sketch follows below.
Another option that will exist (though it doesn't yet in either browser) is to use RTPSender.width/height (or whatever syntax gets agreed on) to scale outgoing video before encoding.
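Here is a minimal sketch of the replaceTrack approach, written against the name that was eventually standardized (RTCRtpSender.replaceTrack()); treat it as an illustration of the idea rather than code from the answer above:

async function switchResolution(pc, width, height) {
  // Capture a new stream at the requested size (a second getUserMedia call).
  const newStream = await navigator.mediaDevices.getUserMedia({
    video: { width: { ideal: width }, height: { ideal: height } },
  });
  const [newTrack] = newStream.getVideoTracks();

  // Swap the new track into the existing video sender: no renegotiation,
  // and the audio track is untouched.
  const sender = pc.getSenders().find((s) => s.track && s.track.kind === 'video');
  await sender.replaceTrack(newTrack);
}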
Plan B for multistream/BUNDLE (which Chrome implements) was not adopted; Firefox has now (in Fx38, which ships in a few days) implemented Unified Plan. Expect to see a blog post soon from someone on how to force the two to work together (until Chrome gets around to implementing Unified Plan).
I developed a game for Android with the Google Play Services real-time multiplayer feature. I currently have a problem when matching players. I don't use any invite feature, so all players just use the auto-match functionality.
My game can be played with 4 players, but games with just 2 or 3 players are also possible. For my testing with 2 devices I use:
RoomConfig.createAutoMatchCriteria(minNumberOfOpponents, maxNumberOfOpponents, 0);
If I keep starting, ending, and restarting games a number of times, it often happens that the clients are not connected correctly. In the working cases, the game's onRoomConnected is called correctly and the game starts. In some cases, though, this does not happen. In those cases, one device finds the other device, and its onPeerJoined() and onRoomConnecting() callbacks are called, but onRoomConnected() is never called. That's because the other device gets no information whatsoever; only its onRoomCreated callback is called, and that's it.
So one device finds the other and learns that another device joined the room. It is also informed when that device leaves the room again. But the other device doesn't register any of this.
In case it helps: I had some issues with losing the connection before, and fixed them by restarting the apiClient every time a room was left on any client. I don't think this is related, though.
I thought it might be a problem with not leaving the current room correctly and somehow joining the old room again, but it also happens when starting the app for the first time. Also, the apiClient reconnect should avoid this problem.
Thanks in advance.
Edit: It seems like it's just my Nexus 5 that produces the error. Every other device I tested works fine, and the Nexus 5 does too in most cases: if the clients get connected and the game starts, there has never been any problem. The error only happens on this one device, and only in maybe 5 out of 6 cases, when searching for an online game.
It simply stops getting any callbacks: sometimes right after onRoomCreated(), sometimes after it has found another peer and onRoomConnecting() was called, and sometimes after onRoomConnected() has been called.
The other device gets its appropriate callbacks in these cases, though.
So if the failing device stops at onRoomCreated(), the other device still finds the client.
If the failing device finds the other device, gets onRoomConnecting() called, and stops after that, the other device still gets its onRoomConnected().
And if the failing device gets its onRoomConnected() called, it sometimes even stops getting any messages from then on, while the other device is already in the game.
Maybe this helps someone. I'm not 100% sure I fixed my problem; I haven't tested it in depth, but it seems everything is working fine now.
My problem was that I have 2 different threads in my application: the standard activity UI thread starts the apiClient and handles the callbacks, while the game-engine thread initiates the room creation and sends the reliable messages via the apiClient.
It seems things sometimes get messed up while the peers connect and exchange their first data. I now avoid calling any apiClient actions directly from the game-engine thread, and instead use runOnUiThread to run these actions on the activity's UI thread.
I have created a web-service app, and I want to populate my view controllers according to the response I fetch (via GET) on the main thread. But I also want to create a scheduled timer that checks my server; if anything has changed (let's say the count of an array), I will create a local notification. As far as I have read here and in some Google results, I can't run my app in the background for more than ten minutes, except in some special situations (audio, VoIP, GPS). But I need to check the server at least once per minute. Can anyone offer an idea or a link, please?
EDIT
I will not sell the app in the store; it's just for a local area network. Let's say the server sends some text messages to the users; when a new message arrives, the count of the messages array increments, and in that situation I will create a notification. I need to keep this 'checking' routine alive forever, whether in the foreground or the background. Does GCD give such a solution? Does anyone have any idea?
Simply play a muted audio file in a loop in the background, or ping the user's location in the background. Yes, that will drain the battery a bit, but it's a simple hack for in-house applications. Just remember to enable the background modes in your Info.plist!
Note: "[...] I fetch (via GET) in main thread." This is not a good approach. You should never fetch any network resources on the main thread. Why? Because your GUI, which is maintained by the main thread, will become unresponsive whenever a fetch isn't instantaneous. Any lag spike on the network results in a less than desirable user experience.
Answer: Aside from the listed special situations, you can't run background apps. The way I see it:
Don't put the app in the background. (A crappy solution.)
Try putting another "entity" between the app and the "server". I don't know why you "need to control the server at least once per minute", but perhaps you can delegate this "control" to another process outside the device? A sketch of this idea follows the diagram below.
iOS app -> some form of proxy server -> server which requires "babysitting" every minute.