I want to study how the ICE/STUN/TURN protocols relate to the network packets exchanged during a WebRTC session. I have found many testing products, but because I want to observe the ICE/STUN/TURN modes of operation step by step, those products do not help me. I am looking for a tool like the pjsip icedemo tool (http://www.pjsip.org/pjnath/docs/html/ice_demo_sample.htm). Has anyone used that tool?
I think the best test tools are Firefox and Chrome.
Turn on the WebRTC and ICE logs and you will be able to see all the details. In Chrome, the chrome://webrtc-internals page exposes the gathered candidates and connection states; in Firefox, about:webrtc does the same.
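If you want to drive the machinery yourself rather than rely on a packaged test product, a few lines of JavaScript in the browser console are enough to log every candidate and state change. This is only a minimal sketch; the STUN/TURN URLs and credentials are placeholders you would replace with your own servers:

```javascript
// Minimal sketch: watch ICE/STUN/TURN behaviour from the browser console.
// The STUN/TURN URLs and credentials are placeholders, not real servers.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.example.org:3478' },  // hypothetical STUN server
    { urls: 'turn:turn.example.org:3478', username: 'user', credential: 'pass' }  // hypothetical TURN server
  ]
});

// Each gathered candidate tells you which mechanism produced it:
// "typ host" (local interface), "typ srflx" (STUN reflexive), "typ relay" (TURN).
pc.onicecandidate = (e) => {
  if (e.candidate) console.log('candidate:', e.candidate.candidate);
};
pc.onicegatheringstatechange = () => console.log('gathering:', pc.iceGatheringState);
pc.oniceconnectionstatechange = () => console.log('ice state:', pc.iceConnectionState);

// Creating a data channel and a local offer is enough to start candidate gathering.
pc.createDataChannel('probe');
pc.createOffer().then((offer) => pc.setLocalDescription(offer));
```

Watching which candidate types appear, and in what order, shows you the ICE steps one by one.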
I suspect a network issue between two servers of my application. How can I test this network leg? The simplest solution please, maybe a test application?
You can simply ping between the devices to check for packet loss, and you have tools like hping for advanced tests of TCP and other protocols. Another recommended tool for checking performance is iperf: for example, run iperf3 -s on one server and iperf3 -c <other-server> on the other to measure the throughput between them.
https://en.wikipedia.org/wiki/Iperf
I would like to implement a video/audio call feature in the browser. The goal is to allow two users to communicate remotely without having to install anything third-party (by third-party, I mean a piece of software or a browser extension).
I know about WebRTC, which is very popular today and free. However, it is quite difficult to implement, and the documentation is hard to understand (not very beginner-friendly).
Here is the official WebRTC documentation, and honestly, where do you even start? https://webrtc.org/start/
If you have experience with WebRTC, could you share it, with the positive and negative points? This would be very useful for the community.
Moreover, if you have experience with another library, I think it would be interesting to hear about it.
There is no practical way to develop a calling service in a website today without using WebRTC.
The alternatives are:
Use WebRTC
Use Flash (which is... dead)
Use a plugin (which is... dying as a mechanism in browsers)
Use an app you download (not exactly a service in a website)
Node.js is the way to go, but you will need to learn some new technology, especially when it comes to the backend.
The servers you will need are:
1. The traditional web application server
2. A signaling server (the one you plan on using Node.js for - you can use that for the web application server as well)
3. A STUN/TURN server (for NAT traversal)
4. Maybe a media server, depending on your use case
For some alternative open source and commercial products, you can check this WebRTC Developer Tools Landscape
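To make the signaling server (item 2) concrete, here is a minimal relay sketch in Node.js. It assumes the ws npm package and simply forwards whatever one client sends (SDP offers/answers, ICE candidates) to the other connected clients; a real server would add rooms, authentication and error handling:

```javascript
// Minimal signaling relay (assumes the 'ws' npm package).
// It forwards whatever one client sends (SDP offers/answers, ICE candidates)
// to every other connected client; a real server would add rooms and auth.
const { WebSocketServer } = require('ws');

const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', (socket) => {
  socket.on('message', (message) => {
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === 1 /* OPEN */) {
        client.send(message.toString());
      }
    }
  });
});
```

The signaling server never touches media; it only shuttles small JSON blobs between the browsers, which is why a plain WebSocket relay is enough for a first prototype.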
Are there currently solutions where your server can act as the peer of a WebRTC connection?
The reason I am interested in WebRTC is not the peer-to-peer part of it, but because it enables you to use UDP. You could let players participate in a fast-paced game like Quake without needing any plugins.
It seems that essentially this same question was asked before, but surely things must now be quite different as 2 years have passed.
Yes, it is possible to deploy your WebRTC peer code on a server. But since you need to run it on a server, it is done differently from how you run WebRTC code within the browser, i.e. through JavaScript.
For a server-based WebRTC peer, you would need to use the WebRTC native code, which is available for Windows, Mac OS X, Linux, Android and iOS. You can get the WebRTC native code from https://webrtc.org/native-code/development/
Follow the instructions there to download and build the environment. Sample applications are also present in the repository under src/webrtc/examples and src/talk/examples.
In summary, you use the same WebRTC source code that is embedded in the browser inside your own application, and call the relevant methods/APIs for the WebRTC functionality.
I have answered a similar question at: WebRTC Data Channel server to clients UDP communication. Is it currently possible?
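If building the native C++ stack is heavier than you need, one alternative worth mentioning (an assumption on my part, not something covered above) is the node-webrtc (wrtc) npm package, which exposes the browser-style RTCPeerConnection API to Node.js. A rough sketch of a server-side peer offering an unordered, no-retransmit data channel, which is the closest WebRTC gets to raw UDP for a Quake-style game:

```javascript
// Sketch of a server-side peer, assuming the node-webrtc ('wrtc') npm package.
// Signaling (getting the offer to the browser and the answer back) is out of
// scope here and represented by the hypothetical sendToBrowser() callback.
const { RTCPeerConnection } = require('wrtc');

async function createServerPeer(sendToBrowser) {
  const pc = new RTCPeerConnection({ iceServers: [] });

  // ordered: false + maxRetransmits: 0 gives UDP-like semantics:
  // late or lost messages are dropped instead of retransmitted.
  const channel = pc.createDataChannel('game', { ordered: false, maxRetransmits: 0 });
  channel.onopen = () => channel.send('server says hello');
  channel.onmessage = (e) => console.log('from client:', e.data);

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToBrowser(pc.localDescription);  // deliver via your signaling channel

  // The browser's answer must later be applied with pc.setRemoteDescription(answer).
  return pc;
}
```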
We have implemented exactly that: a server/client way of using WebRTC. Besides that, we also implemented data port multiplexing, so that the server only needs to expose one data port for all RTCDataChannels.
Here is a 2018 update for you: your off-the-shelf solutions are:
Red 5 Pro, Wowza, Kurento, Unreal Media Server, Flashphoner
Also note that in modern public networks TCP is not much slower than UDP, but UDP may suffer considerable packet loss, so try WebRTC over TCP for your Quake idea.
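If you want to experiment with keeping the WebRTC traffic on TCP, one way (sketched here under the assumption that you run your own TURN server listening on TCP; the URL and credentials are placeholders) is to restrict ICE to relayed candidates over a TURN/TCP server:

```javascript
// Hedged sketch: force all WebRTC traffic through a TURN server over TCP.
// The TURN URL and credentials are placeholders for your own server.
const pc = new RTCPeerConnection({
  iceTransportPolicy: 'relay',  // only use relayed (TURN) candidates
  iceServers: [{
    urls: 'turn:turn.example.org:443?transport=tcp',  // hypothetical TURN-over-TCP listener
    username: 'user',
    credential: 'pass'
  }]
});
```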
As indicated in the original question: Cast Sender running on Android Chrome or Firefox
Is there a way to launch Chromecast via WebRTC?
I have no doubt that, when launched, the Chromecast will be as big as the original iPhone, and the only thing preventing full HTML5 functionality is the use of the DIAL protocol.
As I already know this isn't possible, #ali-naddaf, could you enter this as a feature request by any chance, please?
WebRTC is working great on our dev CC devices; the only thorn is having to use the Chrome extension to launch the app before WebRTC and WebSockets take over. Avoiding DIAL for launching pure web apps would be a real killer deal, IMO.
BTW, I haven't been able to find a formal way to make feature requests so this Q/A mechanism was my approach.
With WebRTC, I always see implementations of peer-to-peer connections and how to get video streaming from one client to another. What about server-to-client?
Is it possible for WebRTC to stream a video file from server to client?
(I am thinking about using the WebRTC native C++ API to create my own server application that connects to the current client implementation in the Chrome or Firefox browser.)
OK, if it is possible, will it be faster than many current video streaming services?
Yes, it is possible, as the server can be one of the peers in that peer-to-peer session.
If you respect the protocols and send the video in SRTP packets using VP8, the browser will play it. To help you build these components in other applications or servers, you can check this page and this project as a guide.
Now, comparing WebRTC with other streaming services... it will depend on several variables like the codec or the protocol. But, for instance, comparing WebRTC (SRTP over UDP with the VP8 codec) against Flash (RTMP over TCP with the H.264 codec), I would say that WebRTC wins.
The player would be Flash Player versus the native <video> tag.
The transport would be TCP versus UDP.
But of course, everything depends on what you are sending to the client.
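On the browser side, playing whatever the server-side peer sends comes down to attaching the received track to the native <video> tag. A minimal sketch of the receiving client; the signaling object that carries the offer/answer to and from the server is hypothetical:

```javascript
// Browser-side sketch: play a stream pushed by a server-side WebRTC peer.
// 'signaling' is a hypothetical object (e.g. a WebSocket wrapper) that carries
// the SDP offer/answer between this page and the server.
const pc = new RTCPeerConnection();

// We only want to receive video, not send it.
pc.addTransceiver('video', { direction: 'recvonly' });

pc.ontrack = (event) => {
  // The incoming stream ends up in the native <video> tag.
  document.querySelector('video').srcObject = event.streams[0];
};

pc.createOffer()
  .then((offer) => pc.setLocalDescription(offer))
  .then(() => signaling.send(JSON.stringify(pc.localDescription)));

signaling.onmessage = async (msg) => {
  // Apply the answer produced by the server-side peer.
  await pc.setRemoteDescription(JSON.parse(msg.data));
};
```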
I have written some apps and plugins using the native WebRTC API, and there isn't a lot of information out there yet, but here are a few useful resources to get you started:
QT Example: http://research.edm.uhasselt.be/jori/qtwebrtc
Native to Browser example: http://sourcey.com/webrtc-native-to-browser-video-streaming-example/
I started with the WebRTC Native C++ to Browser Video Streaming Example, but it no longer builds with the current WebRTC native code.
Then I made modifications, merging the following into a standalone process:
management of the peerConnection (the peerconnection_server)
access to Video4Linux capture (the peerconnection_client).
Removing the stream from the browser to the WebRTC native C++ client gives a simple way to access a Video4Linux device through a WebRTC browser; it is available on GitHub as webrtc-streamer.
Live Demo
We are attempting to replace MJPEG with WebRTC in our server software and have a prototype module for doing this using a smattering of components tied to the OpenWebRTC project. It has been an absolute bear to do, and we have frequent ICE negotiation errors (even over a simple LAN), but it mostly works.
We also built a prototype with the Google WebRTC module, but it had many dependencies. I find it easier to work with the OpenWebRTC modules because Google's stuff is so tightly tied to general peer-to-peer scenarios in the browser.
I compiled the following from scratch:
libnice 0.1.14
gstreamer-sctp-1.0
usrsctp
Then I have to interact with libnice a bit directly to gather candidates, and also write out the SDP files by hand. But the amount of control (being able to control the source of the pipeline) makes it worthwhile. The resulting pipeline (with two clients off one server source) is below:
Of course. I'm writing a program using the native WebRTC API which can join a conference as a peer and record both video and audio.
See: How to stream audio from browser to WebRTC native C++ application.
You can definitely stream media from a native app. I'm sure you can use dummy_audio_file to stream audio from a local file, and you can find your own way to tap into the video streaming process.
Yes, it is. We have developed a load test tool to publish and play streams for Ant Media Server. This tool can broadcast a media file. We used the same native WebRTC library that is used in Ant Media Server.
Sure, it's possible; you can convert a live stream to WebRTC, for example:
OBS/FFmpeg ---RTMP---> Server ---WebRTC--> Chrome/Client
This scenario allows ultra-low-latency live streaming, about 600~800 ms, when playing the live stream via WebRTC. Please take a look at this demo.