With all these P2P video applications, I am curious how one could view the WebRTC data streams in an external player such as VLC. I started using netstat to watch incoming connections, but that didn't get me anywhere. I was hoping there is a way to view WebRTC data streams outside of the browser.
For example, in Firebug it's easy to view POST and GET requests, but there's nothing on WebRTC connections.
If you want to actually play the media, there is no way. It is encrypted with a key that is exchanged in the DTLS handshake at the beginning of the peer connection.
You can see the UDP packets in Wireshark (their source and destination ports), but determining the media type and actually playing the media is not possible unless you are privy to the master key that was exchanged, so that you can decrypt it (which is not possible if you are using the browser JavaScript APIs).
As far as I have searched, all WebRTC handshakes are done through some signaling server [HTTP, WebSocket, etc.], or even through mail or WhatsApp.
But I expect to connect without using any of them. Is there any way to achieve this?
If yes, please give me a brief solution. Thank you!
WebRTC (as implemented in browsers) requires signalling. You can choose a different medium for signalling such as a piece of paper or calling someone on the phone, but it needs signalling.
If you want to be able to initiate unsolicited connections you need to use TCP or UDP directly which (if we ignore NAT and firewalls) don't have that restriction.
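To illustrate "a piece of paper as the signalling medium": the offer and answer are plain text, so the channel can literally be copy-and-paste. A minimal browser-side sketch (the function names are my own, not a standard API; run in a page, not in Node):

```javascript
// Browser-side sketch of "signalling by hand". Each side creates its own
// RTCPeerConnection; the SDP blobs travel by any medium you like
// (chat, email, paper). Nothing below runs until you call the functions.
async function makeOffer(pc) {
  // Set the offer, then wait for ICE gathering so the single SDP blob
  // already contains all candidates (no separate trickle messages needed).
  await pc.setLocalDescription(await pc.createOffer());
  await new Promise(resolve => {
    pc.onicegatheringstatechange = () => {
      if (pc.iceGatheringState === 'complete') resolve();
    };
  });
  return JSON.stringify(pc.localDescription); // hand this to the other user
}

async function acceptOfferAndAnswer(pc, offerJson) {
  await pc.setRemoteDescription(JSON.parse(offerJson));
  await pc.setLocalDescription(await pc.createAnswer());
  // (wait for gathering as in makeOffer, then hand the answer back)
  return JSON.stringify(pc.localDescription);
}

// In a browser: const pc = new RTCPeerConnection();
// paste makeOffer(pc)'s output to the other side, and paste its answer back.
console.log(typeof makeOffer, typeof acceptOfferAndAnswer);
```

The signalling requirement is about *how* those two strings travel, not *that* they travel; WebRTC itself never dictates the transport.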
I am at a beginner level.
I am trying to broadcast data to browsers on a local network (behind the same router, by sending to . . . 255).
I need to implement a real-time streaming service for the browsers at the local level.
But it will cause high traffic as the number of client browsers increases.
Broadcasting data seems to need the UDP protocol, but web browsers are based on TCP.
So I investigated WebRTC, which is based on UDP.
But I don't really know how to use it.
Is it possible to broadcast data to a web browser like Chrome on a local network?
If not, why is it impossible to implement? Just because of the hazard of DDoS? How can I solve this high-traffic problem?
(High traffic really does occur when each client responds to every piece of data from the server (TCP), or when the server sends the same data to every client, one copy per client (not broadcasting).
I just want the server to send one broadcast datagram packet to the local network, with each client on the local network receiving that same data from the server but not responding to it.)
From a web app (not a modified web browser itself), you cannot create or manipulate raw (UDP/TCP) sockets. The sandboxing and other mechanisms won't let you.
With WebRTC, you will need to make a handshake and use ICE:
=> You cannot push to a peer knowing only its IP/port.
=> You have to have the receiver accept and acknowledge the transfer.
You might have more luck with WebSockets, but that requires additional mechanisms as well, and not all parties will be able to support WebSockets (or accept the upgrade from HTTP to WS).
For illustration purposes, you can see the work of Jess on a web-based BitTorrent client; he ran into exactly the same problems. https://github.com/feross/webtorrent
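To make the traffic point concrete: without UDP broadcast, the server must send one copy of each message per connected client, so upstream traffic grows linearly with the client count. A minimal sketch (all names hypothetical; real clients would be WebSocket connections, not stubs):

```javascript
// Fan-out from a server to N clients: one send() call, and one copy of the
// payload on the wire, per client -- there is no "send once, everyone hears
// it" primitive available to a web app.
function broadcast(clients, message) {
  let bytesSent = 0;
  for (const client of clients) {
    client.send(message);        // one copy per client (e.g. one WebSocket)
    bytesSent += message.length;
  }
  return bytesSent;              // N * message.length, not message.length
}

// Stand-in clients that just record what they receive.
const clients = Array.from({ length: 3 }, () => ({
  received: [],
  send(msg) { this.received.push(msg); },
}));

const total = broadcast(clients, 'frame-0');
console.log(total); // 7 bytes * 3 clients = 21
```

A true UDP broadcast would cost `message.length` bytes regardless of N, which is exactly the capability the browser sandbox withholds.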
I'm playing around with WebRTC, and what I'd like to achieve is:
User1 opens the browser at 192.168.x.x
User2 opens the browser at 192.168.x.x
The same page
User 1 clicks call, user2 displays the stream on his screen.
I've created a signaling server with Node and socket.io, and I'm able to exchange messages between users using socket.io rooms.
The steps I'm following are:
Get user media
Create peerconnection1 (no ICE servers)
Add the stream to peerconnection1
Create the offer
Send the offer via sockets
Receive the offer and create peerconnection2 (no ICE servers)
Send the answer
I've also put some logging in "onicecandidate" and "onaddstream" to see when they are called, and in "onaddstream" I create the video element.
When I press the call button, I see on the other computer that the video element becomes black, but I don't see any video or hear any audio.
I'm surely missing some vital steps. Could someone tell me the steps I have to follow to make a correct call and exchange all the necessary data to display the stream on the other side?
Thank you very much
A STUN server is used to get an external network address.
TURN servers are used to relay traffic if direct (peer to peer) connection fails.
This image describes how a peer connection works:
webRTC Basics
You should still have at least a STUN server referenced as one of your ICE servers. I would use 'stun:stun.l.google.com:19302' or 'stun:23.21.150.121', even though you do not technically need one here.
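For reference, that recommendation as a configuration object (for a LAN-to-LAN call, host candidates alone usually suffice, so an empty iceServers array would also work):

```javascript
// RTCPeerConnection configuration with the STUN servers suggested above.
const config = {
  iceServers: [
    { urls: 'stun:stun.l.google.com:19302' },
    { urls: 'stun:23.21.150.121' },
  ],
};

// In the browser you would then create the connection like this:
// const pc = new RTCPeerConnection(config);
console.log(config.iceServers.length); // 2
```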
But if you do not have ICE servers, you do not need to worry about gathering candidates. A couple of things could be happening:
Make sure you add your streams to each connection BEFORE creating your offer and creating your answer; you say you get the user media, but not that you add it to your peer connection.
You are not setting your local and remote descriptions:
The offering computer should set its local description when creating the offer.
The answering computer should set its remote description from the received offer and set its local description from the answer it creates.
Make sure you send the answer SDP to the initial offering computer and that the offering computer sets it as its remote description.
Streams WILL NOT flow between peers unless you add the needed streams, create your descriptions, and then set the local and remote descriptions accordingly.
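That ordering can be sketched as browser-side code. This uses the modern addTrack API (the older addStream works similarly), and `signaling` is a hypothetical placeholder standing in for your socket.io room:

```javascript
// Browser-only sketch; nothing executes in Node at top level.
const CONFIG = { iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] };

async function startCall(signaling) {
  const pc = new RTCPeerConnection(CONFIG);
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  // 1. Add tracks BEFORE creating the offer.
  stream.getTracks().forEach(track => pc.addTrack(track, stream));
  pc.onicecandidate = e => e.candidate && signaling.send({ candidate: e.candidate });
  // 2. Create the offer and set it as the LOCAL description, then send it.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signaling.send({ offer });
  // 3. When the answer arrives, set it as the REMOTE description.
  signaling.on('answer', answer => pc.setRemoteDescription(answer));
  return pc;
}

async function answerCall(signaling, offer) {
  const pc = new RTCPeerConnection(CONFIG);
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  stream.getTracks().forEach(track => pc.addTrack(track, stream));
  pc.onicecandidate = e => e.candidate && signaling.send({ candidate: e.candidate });
  // Remote description first (the offer), then create and set the answer.
  await pc.setRemoteDescription(offer);
  const answer = await pc.createAnswer();
  await pc.setLocalDescription(answer);
  signaling.send({ answer });
  return pc;
}

console.log(typeof startCall, typeof answerCall);
```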
If that does not work, you should probably post your code for the page.
For a few days now I've been trying to build a basic WebRTC video chat. I've got some demos running locally, even via LAN. But now I want to build one on my own, at the very basics, without the overhead many demos come with.
But I still don't get a complete peer connection.
E.g. this example seems to be broken, because I can't call "createSignalingChannel();": w3.org/TR/webrtc/#simple-example
Some other examples (https://webrtc-experiment.appspot.com/) want me to link their scripts, but I won't do this, because I want to understand the magic of the peer connection and how to get a handshake between two browsers.
I also explored examples with the Google App Engine, but that's not what I want.
I want to run it in really simple JS and HTML, with just the minimum of what is necessary.
Here is my code:
https://github.com/mexx91/basicVideoRTC EDIT: Should work now
So what will I have to add to get a handshake and peer connection, so that the two sides can send e.g. the media stream to each other?
Thanks a lot!
createSignalingChannel() is only pseudo-code to illustrate the existence of a separate channel. You need a separate message channel for the initial connection handling.
You can achieve that with hosted services like Pusher, Brightcontext or PubNub, or you can host your own backend with open-source projects like socket.io or SignalR.
Then you just need to send the offers, answers and ICE candidates through your separate channel.
List of Realtime Services: http://www.leggetter.co.uk/real-time-web-technologies-guide
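The channel itself can be very small. A minimal in-memory relay sketch (the class and message shapes are my own for illustration; a real deployment would put this behind socket.io or WebSockets):

```javascript
// A "room" that forwards every message a peer sends to all other peers in
// the room -- the entire job of a basic WebRTC signalling server.
class SignalingRoom {
  constructor() { this.peers = new Map(); } // peerId -> message handler
  join(peerId, onMessage) { this.peers.set(peerId, onMessage); }
  send(fromId, message) {
    for (const [peerId, onMessage] of this.peers) {
      if (peerId !== fromId) onMessage({ from: fromId, ...message });
    }
  }
}

// Usage: route an offer, an answer, and an ICE candidate between two peers.
const room = new SignalingRoom();
const inboxA = [], inboxB = [];
room.join('A', msg => inboxA.push(msg));
room.join('B', msg => inboxB.push(msg));

room.send('A', { type: 'offer', sdp: '(SDP from createOffer)' });
room.send('B', { type: 'answer', sdp: '(SDP from createAnswer)' });
room.send('A', { type: 'candidate', candidate: '(ICE candidate string)' });

console.log(inboxB.map(m => m.type)); // [ 'offer', 'candidate' ]
```

The relay never inspects the payloads; it only moves opaque strings between browsers, which is why any transport (hosted service or self-hosted socket.io/SignalR) works equally well.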
Imagine a video-conferencing web app, which users A and B originally access from some web server. Suppose that web app supports presence, so the web server knows who's currently online. Imagine the UI allows A to try to place a video call to B. Via, say, XMLHttpRequest, A's browser informs the server that this is wanted, and B's JavaScript pops up something saying that A wants to call B. No WebRTC has happened at all yet. But at this stage, A can indirectly communicate with B by sending messages through the server, e.g. using XMLHttpRequest. In WebRTC parlance, this is the "signalling channel".

So, A and B can both interact with their ICE agents to discover candidate addresses and SDP descriptions, and send these to each other, via the server, over this signalling channel. E.g. the web app on A calls a WebRTC API to get its ICE candidates and packages these up as it sees fit to send to B. B's browser receives this message from the server (e.g. over a WebSocket or long poll) and hence it can unpack it and format it as needed to pass to the ICE agent on B, using the RTCPeerConnection object. Similarly, the SDP offer/answer can be sent between the two apps and passed through into the ICE agents in the browsers, to get agreed media formats etc.

At that stage, media connections can get set up by the browser. Media streams are added to the RTCPeerConnection initially (which aren't communicating yet, but which have attributes that can be queried to describe the codec etc.); when the API is asked to create an SDP description, it does so using these attributes, but adjusts the IP address and port based on how the ICE agent on each local browser has figured out which addresses can reach that local browser/port (NAT traversal).
In my setup, I have a custom server in the cloud handling audio and video, so I don't need (and don't want) the whole "where am I and what's my private and public address etc." discovery process. Essentially I want the SDP offer and don't care about the IP address/port; that offer goes to the server, the server chooses codecs, gets the SRTP key, and replies with an SDP answer to the browser containing a public address, the codec choice and its key. Ideally the browser starts sending media to the server, and the server simply sends "peer" media back from whence it came (which would tunnel back through any UDP-friendly NAT devices).
I know this is technically possible because I already do this with Win32/OSX desktop clients... the question is, is this possible with WebRTC and RTCPeerConnection? I've tried a few configuration types, e.g. {} and { "iceServers": [] } but it still seems to go through discovery gyrations. Are there perhaps other ways to shortcut the process? Thanks!
No, you cannot skip the process, since the WebRTC implementation forces the use of ICE and STUN connectivity checks to fix some security problems. So the current Chrome implementation will require that STUN checks are made against the IP/ports negotiated in the ICE candidates.
But yes, there are many applications working without this requirement. One day we have to move to better and more secure implementations. That day is now...
No, you can't skip it in WebRTC browsers, but WebRTC devices (here, your gateway) can simplify the process by implementing only ICE Lite.