How do messengers (IM) work (listening)? - instant-messaging

My task is to write a messenger program for both internal and external staff, and I have actually built it. However, I think this is really not a good approach: the client software keeps sending "check-mesg" requests to the server. So I feel I am only simulating an IM program.
I want to make the client app act as a listening server, and let users talk p2p without a "mesg-centre" on the main server (except when offline messages are involved). The question is: how do I tell an external user (another client app) my location while I am behind a router?
Do other IM programs run on the client machine as a server too? And how do they get through?
Thanks in advance!

It's quite complicated to connect to systems behind a router, and it is not always possible. A well-documented way to do this with UDP is the STUN protocol (used mainly for SIP-based VoIP). If it is not possible to get through the router, your only option is a server on the open network acting as an intermediary (some P2P systems also promote well-connected peers to such intermediaries). SIP uses TURN as its intermediary protocol, and ICE is SIP's protocol for finding the right solution for a given client.
See also NAT traversal.
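For what it's worth, the discovery step is easy to see from a browser: here is a rough TypeScript sketch (using WebRTC's ICE machinery, with a public Google STUN server as a placeholder) that gathers candidates and prints them; the srflx entries carry the public IP/port the STUN server observed. The same idea applies to a custom IM client, just with a STUN library instead of the browser API.

```ts
// Minimal browser sketch: discover your server-reflexive (public) address the
// same way ICE does, by gathering candidates against a STUN server.
// The STUN URL below is a public Google server; swap in your own if needed.
const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});

// A throwaway data channel is enough to trigger candidate gathering.
pc.createDataChannel("probe");

pc.onicecandidate = (event) => {
  if (event.candidate) {
    // "srflx" candidates carry the public IP/port the STUN server saw.
    console.log(event.candidate.candidate);
  }
};

pc.createOffer().then((offer) => pc.setLocalDescription(offer));
```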

Related

WebRTC: do I need a TURN server? (Would it help?)

I have a webcam chat room application (so it's many-to-many video sharing) using WebRTC and a mediasoup server.
I am having problems with SOME of my users not being able to get incoming video feeds to work. It's a difficult problem because I can't reproduce it at all, and I can't easily "remote-debug" it since most of my users are very non-technical. So far the only thing I can tell for certain is that it seems to be network-related, not browser-related, as I have had bug reports from people using Firefox, Chrome, Safari, and Edge. I'm running my server (mediasoup v2) on port 443 with no firewall on the server box, so that should make the door as wide as possible. I just don't know what the exact problem is yet, so I'm feeling around in the dark.
So I'm trying solutions. I don't think(?) I have a TURN server set up, but from what I have read it seems like adding one certainly can't hurt, and it could help with my situation.
I don't fully understand the entire WebRTC protocol or RFC 7118 (this stuff is really complicated!), or exactly what/where/how a TURN server fits into the bigger picture. It would help, right? A lot of Googling has led to no clear answers. I would love some help! Thank you!
WebRTC tries everything it can to make a p2p connection, but there are times when it will fail. The TURN server acts as a last resort: both peers connect through the TURN server instead. Obviously this is not a p2p connection, so there will be extra latency, and you will have to make sure that your TURN server has enough bandwidth to cover all of the connections you expect.
TL;DR: if you need 100% connection rates, you should have a TURN server.
I believe AWS has a ready-made instance you can spin up, or you could use the open-source coturn server: https://github.com/coturn/coturn
On a debugging note... check your ICE candidate types. You should see host and srflx if you only have a STUN server, but if you have a TURN server you will also see relay. You can replicate this issue by discarding the ICE candidates that have the host and srflx types.
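As a rough sketch of that check (TypeScript; the STUN/TURN URLs and credentials below are placeholders, not anything from your setup), you can log each candidate's type as it is gathered, or force relay-only to simulate discarding host and srflx:

```ts
// Sketch: log the type of each gathered ICE candidate. The stun/turn URLs and
// credentials here are placeholders; substitute your own servers.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: "stun:stun.example.com:3478" },
    { urls: "turn:turn.example.com:3478", username: "user", credential: "pass" },
  ],
  // Uncomment to keep only relay candidates, i.e. simulate the case where
  // host/srflx candidates are unusable and everything must go through TURN:
  // iceTransportPolicy: "relay",
});

pc.createDataChannel("probe");

pc.onicecandidate = ({ candidate }) => {
  if (!candidate) return;
  // candidate.type is "host", "srflx", "prflx", or "relay".
  console.log("candidate type:", candidate.type, candidate.candidate);
};

pc.createOffer().then((offer) => pc.setLocalDescription(offer));
```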
I'm running my server (mediasoup v2) on port 443 with no firewall on the server box, so that should make the door as wide as possible
That is the WebSocket port. The media traffic typically runs over UDP, and mediasoup uses random ports for it. A TURN server configured on UDP port 443 may help in some cases.
The other problem is UDP being blocked, which is easy to reproduce with a local firewall. Mediasoup supports something called ice-tcp, which allows media to run over a TCP connection. You should check whether your mediasoup installation uses ice-tcp. If it does not, a TURN server with TURN/TCP will help.
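If you do add one, here is a hedged example of what the client-side configuration for TURN over TCP on port 443 might look like (hostname and credentials are placeholders for your own coturn or similar deployment):

```ts
// Sketch: point the client at a TURN server that listens on TCP port 443, so
// media can fall back to TURN/TCP when UDP is blocked. Hostname and
// credentials are placeholders.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: "stun:turn.example.com:3478" },
    // "?transport=tcp" asks the client to reach the TURN server over TCP.
    {
      urls: "turn:turn.example.com:443?transport=tcp",
      username: "webrtc-user",
      credential: "webrtc-pass",
    },
  ],
});
```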

With WebRTC, is it possible to connect successfully every time without a TURN server?

These days, I'm really into webRTC technology, and I've been studying webRTC. But, I'm faced with a problem.
I understand that webRTC uses the ICE framework, which has TURN and STUN servers for relay and signaling. But as this article said, webRTC doesn't need a TURN server.
So I'm really curious whether it is possible to connect successfully every time without a TURN server?
If it is, please tell me the way, and if it isn't, how often do peers use the TURN server on average?
Thank you.
(PS: Azar (one of the biggest apps using webRTC) also says on their website that they don't use a TURN server.)
Yes it's possible to connect without a TURN server. Every time? Yes. Everyone? No. Because firewalls.
The Holy Grail of WebRTC is a direct client-to-client network connection without going through an intermediary server (a relay).
TURN is an intermediary server. It's used as a fallback when peers are behind symmetric NATs.
Negotiating this is the purpose of ICE. There are articles written on how, but in short, "ICE agents" (the browsers) collaborate on both ends, communicating through your JS signaling channel, to poke holes from inside the firewall on each end and connect up.
This related answer suggests TURN usage is ~20%.
STUN is not a relay, but merely a mirror server for agents to learn their own external IPs.
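To make the "communicating through your JS signaling channel" part concrete, here is a rough caller-side sketch in TypeScript. The wss:// endpoint and the message shapes are made up for illustration; only the RTCPeerConnection calls are standard.

```ts
// Sketch of the ICE/offer-answer dance from the caller's side. "signaling" is a
// placeholder for whatever channel you already have (WebSocket, HTTP, etc.).
const signaling = new WebSocket("wss://example.com/signal"); // hypothetical endpoint

const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});

// Trickle our ICE candidates to the remote agent as they are discovered.
pc.onicecandidate = ({ candidate }) => {
  if (candidate) signaling.send(JSON.stringify({ kind: "candidate", candidate }));
};

signaling.onmessage = async ({ data }) => {
  const msg = JSON.parse(data);
  if (msg.kind === "answer") {
    await pc.setRemoteDescription(msg.description);
  } else if (msg.kind === "candidate") {
    await pc.addIceCandidate(msg.candidate);
  }
};

// Kick things off once the signaling channel is open.
signaling.onopen = async () => {
  pc.createDataChannel("chat");
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signaling.send(JSON.stringify({ kind: "offer", description: offer }));
};
```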

How can I implement my own WebRTC server in my project?

I want to implement a WebRTC server in my project. I want to make my own WebRTC server and deploy it on an Amazon server. How can I achieve this?
WebRTC is a peer-to-peer protocol, so you don't need a server for this.
You will need a signaling server for session negotiation, though. How you implement it depends on the technology you choose, both client-side (polling, AJAX, WebSockets, STOMP, etc.) and server-side.
For STUN/TURN you can deploy an existing server, or follow the RFCs and develop your own from scratch.
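As an illustration of how small a signaling server can be, here is a hedged sketch of a WebSocket relay in TypeScript using the ws package (the port and the "forward to everyone else" policy are arbitrary choices for a two-party demo, not a standard):

```ts
// Minimal signaling relay sketch (Node + the "ws" package, assumed installed).
// It does no session logic at all: every message from one client is forwarded
// to all other connected clients, which is enough for a two-party demo.
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (socket) => {
  socket.on("message", (data) => {
    // Relay offers, answers and ICE candidates verbatim to the other peers.
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(data.toString());
      }
    }
  });
});
```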
@Adrian Ber is correct; you need a signalling server such as this one:
https://github.com/peers/peerjs-server
You can set one of these up on AWS
You'll also need some code on the client side. There is a matching javascript client library (which does most of the work) here: http://peerjs.com/
There are some examples on the PeerJS website. They either need to be run on your local machine or on HTTPS servers (browsers no longer allow camera access over plain HTTP).
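For a sense of the client side, here is a rough PeerJS sketch in TypeScript; the host, port and path are placeholders that you would match to your own peerjs-server deployment.

```ts
// Hedged sketch of a PeerJS client talking to a self-hosted peerjs-server.
// Host, port and path are placeholders; match them to your own deployment.
import Peer from "peerjs";

const peer = new Peer({ host: "your-server.example.com", port: 9000, path: "/" });

peer.on("open", (id) => {
  console.log("My peer ID is:", id); // share this ID with the other side
});

// Answer incoming calls with our own camera/microphone stream.
peer.on("call", async (call) => {
  const localStream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  call.answer(localStream);
  call.on("stream", (remoteStream) => {
    // Attach remoteStream to a <video> element here.
  });
});

// To place a call instead (given the other side's peer ID):
async function callPeer(remoteId: string) {
  const localStream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  const call = peer.call(remoteId, localStream);
  call.on("stream", (remoteStream) => {
    // Attach remoteStream to a <video> element here.
  });
}
```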
Ignore the people saying that WebRTC is peer to peer only. There is no reason why you can't implement an application, run it on a server, and treat it as a 'peer' for the sake of webRTC when it is actually a server.
That said, we've looked into pulling the WebRTC implementation out of Chrome, but it is a huge task. Depending on what you want to do, you will likely only need to support a subset of WebRTC functionality (Data channel / unreliable for example if you're doing a multiplayer web game).
There might be a few implementations out there that have cropped up now, but last I checked there wasn't anything of note.

Can I simplify WebRTC signalling for computers on the same private network?

WebRTC signalling is driving me crazy. My use-case is quite simple: a bidirectional audio intercom between a kiosk and a control-room webapp. Both computers are on the same network. Neither has internet access, and all machines have known static IPs.
Everything I read wants me to use STUN/TURN/ICE servers. The acronyms are endless and contributing to my migraine, but if this were a standard application I'd just open a port, tell the other client about it (I can do this via the webapp if I need to), and have it connect.
Can I do this with WebRTC? Without running a dozen signalling servers?
For the sake of examples, how would you connect a browser running on 192.168.0.101 to one running on 192.168.0.102?
STUN/TURN is different from signaling.
STUN/TURN in WebRTC is used to gather ICE candidates. Signaling is used to transmit the session description (offer and answer) between the two PCs.
You can use a free STUN server (like stun.l.google.com or stun.services.mozilla.org). There are also free TURN servers, but not many (they are resource-expensive); one is numb.viagenie.ca.
There is no standard signaling server, because signaling is custom and can be done in many ways. Here's an article that I wrote; I ended up using STOMP on the client side and Spring on the server side.
I guess you could tamper with the SDP and inject the ICE candidates statically, but you'd still need to exchange the SDP (which is dynamically generated each session) between the two PCs somehow. That said, since the configuration will not change, I guess you could exchange it once (by means of copy-paste :) ), store it somewhere, and reuse it every time.
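As a concrete version of the copy-paste idea, here is a rough browser sketch in TypeScript: machine A prints its offer once candidate gathering finishes, you carry that text to machine B by hand, and paste B's answer back into A. Nothing here is specific to your kiosk setup.

```ts
// Copy-and-paste "signaling" sketch: no server at all.
// Run this on machine A:
const pc = new RTCPeerConnection(); // no STUN/TURN needed on a flat LAN

pc.onicecandidate = ({ candidate }) => {
  if (candidate === null) {
    // Gathering finished: copy this JSON string over to machine B.
    console.log("OFFER:", JSON.stringify(pc.localDescription));
  }
};

const channel = pc.createDataChannel("intercom-control");
channel.onopen = () => console.log("connected");

pc.createOffer().then((offer) => pc.setLocalDescription(offer));

// Machine B would call setRemoteDescription(offer), createAnswer(),
// setLocalDescription(answer) and print its own localDescription, which you
// paste back on machine A:
async function applyAnswer(answerJson: string) {
  await pc.setRemoteDescription(JSON.parse(answerJson));
}
```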
If your end-points have static IPs then you can ignore STUN, TURN and ICE, which are just power-tools to drill holes in firewalls. Most people aren't that lucky.
Due to how WebRTC is structured, end-points do need a way to exchange call setup information (SDP) like media ports and key information ahead of time. How you get that information from A to B and back to A, is entirely up to you ("signaling server" is just a fancy word for this), but most people use something like a web socket server, the tic-tac-toe of client-initiated communication.
I think the simplest way to make this work on a private network without an internet connection is to install a basic web socket server on one of the machines.
As an example I recommend the very simple https://github.com/emannion/webrtc-web-socket which worked on my private network without an internet connection.
Follow the instructions to install the web socket server on, e.g., 192.168.0.101, then have both end-points connect to 192.168.0.101:1337 with Chrome or Firefox. Share the camera on both ends in the basic demo web UI, hit Connect, and you should be good to go.
If you need to do this entirely without any server, then this answer to a related question at least highlights the information you'd need to send across (in a cut'n'paste demo).

How can I broadcast UDP packets to the browser

I am at a beginner level.
I am trying to broadcast data to browsers on the local network (same router, by sending to the x.x.x.255 broadcast address).
I need to implement a real-time streaming service for browsers on the local network.
But it will generate high traffic as the number of client browsers increases.
Broadcasting data seems to require the UDP protocol.
But web browsers are based on TCP.
So I investigated WebRTC, which is based on UDP.
But I don't really know how to use it.
Is it possible to broadcast data to web browsers like Chrome on a local network?
If not, why is it impossible to implement? Just because of the risk of DDoS? How can I solve this high-traffic problem?
(High traffic really does occur when each client responds to every piece of data from the server (TCP), or when the server sends the same data to every client individually, scaling with the number of clients, rather than broadcasting.
I just want the server to send one broadcast datagram packet to the local network, and have each local client receive that same data from the server without responding to it.)
From a web app (as opposed to a modified web browser itself), you cannot create or manipulate raw (UDP/TCP) sockets. The sandboxing and other mechanisms won't let you.
With WebRTC, you will need to perform a handshake and use ICE.
=> You cannot push to a peer knowing only its IP/port.
=> You have to have the receiver accept and acknowledge the transfer.
You might have more luck with WebSockets, but that requires additional mechanisms as well, and not all parties will be able to support WebSockets (or accept the upgrade from HTTP to WS).
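For what it's worth, here is a hedged sketch of that WebSocket approach in TypeScript using the ws package: the server pushes the same payload to every connected browser and the clients never reply. It is not a true link-layer broadcast (each client still has its own TCP stream), but it avoids the per-client request/response chatter you describe.

```ts
// Sketch: a WebSocket "broadcast" over TCP. The server keeps one TCP/WS
// connection per browser and pushes the same payload to all of them; clients
// only listen and never reply. Uses the "ws" package (assumed installed).
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8081 });

// Push one frame of data to every connected browser, e.g. on a timer.
setInterval(() => {
  const payload = JSON.stringify({ ts: Date.now(), data: "stream chunk" });
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(payload);
  }
}, 100);
```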
For illustration purposes, you can look at Feross's work on a web-based BitTorrent client; he has exactly the same problems: https://github.com/feross/webtorrent