Are there currently solutions where your server can act as the peer of a WebRTC connection?
The reason I am interested in WebRTC is not the peer-to-peer part, but the fact that it enables you to use UDP. You could let players participate in a fast-paced game like Quake without needing any plugins.
It seems that essentially this same question was asked before, but surely things must now be quite different as 2 years have passed.
Yes, it is possible to deploy your WebRTC peer code on a server. But since it runs on a server, it is essentially different from how you run WebRTC code within the browser, i.e. through JavaScript.
For a server-based WebRTC peer, you need the WebRTC native code, which is available for Windows, Mac OS X, Linux, Android, and iOS. You can get it from https://webrtc.org/native-code/development/
Follow the instructions there to download and build the environment. Sample applications are also present in the repository under src/webrtc/examples and src/talk/examples.
In summary, you embed the same WebRTC source code that ships inside the browser into your own application and call the relevant methods/APIs for the WebRTC functionality.
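To make that concrete, here is a minimal sketch of a server-side peer built against the native C++ API. Header paths and factory signatures have changed between WebRTC releases, so treat the exact calls as approximate; the observer is a class you implement yourself.

#include "api/create_peerconnection_factory.h"
#include "api/peer_connection_interface.h"
#include "api/audio_codecs/builtin_audio_decoder_factory.h"
#include "api/audio_codecs/builtin_audio_encoder_factory.h"
#include "api/video_codecs/builtin_video_decoder_factory.h"
#include "api/video_codecs/builtin_video_encoder_factory.h"

// 'observer' implements webrtc::PeerConnectionObserver (OnIceCandidate,
// OnDataChannel, ...). Signaling is yours to provide, e.g. over a WebSocket.
rtc::scoped_refptr<webrtc::PeerConnectionInterface> CreateServerPeer(
    webrtc::PeerConnectionObserver* observer) {
  auto factory = webrtc::CreatePeerConnectionFactory(
      /*network_thread=*/nullptr, /*worker_thread=*/nullptr,
      /*signaling_thread=*/nullptr, /*default_adm=*/nullptr,
      webrtc::CreateBuiltinAudioEncoderFactory(),
      webrtc::CreateBuiltinAudioDecoderFactory(),
      webrtc::CreateBuiltinVideoEncoderFactory(),
      webrtc::CreateBuiltinVideoDecoderFactory(),
      /*audio_mixer=*/nullptr, /*audio_processing=*/nullptr);

  webrtc::PeerConnectionInterface::RTCConfiguration config;
  webrtc::PeerConnectionInterface::IceServer stun;
  stun.uri = "stun:stun.l.google.com:19302";
  config.servers.push_back(stun);

  return factory->CreatePeerConnection(config, /*allocator=*/nullptr,
                                       /*cert_generator=*/nullptr, observer);
}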
I have answered a similar question at: WebRTC Data Channel server to clients UDP communication. Is it currently possible?
We have implemented exactly this: a server/client way of using WebRTC. In addition, we implemented data-port multiplexing, so that the server only needs to expose one data port for all RTCDataChannels.
Here is a 2018 update for you; the off-the-shelf solutions are:
Red 5 Pro, Wowza, Kurento, Unreal Media Server, Flashphoner
Also note that in modern public networks TCP is not much slower than UDP, but UDP may suffer considerable packet loss, so try WebRTC over TCP for your Quake idea.
I am researching implementation of a WebRTC-SIP gateway/bridge. That is, for example, to make a WebRTC call to a SIP endpoint via a SIP server like Asterisk. I know that Asterisk already supports this, but I need an intermediary server for various needs like logging, recording, integration with local auth/signalling, and other app modules. I looked at Kurento, OpenWebRTC (Ericsson), and the lesser-known Intel Collaboration Suite for WebRTC.
I need a server-side solution to interact with my Node application server. Specifically, the server API should be able to generate SDP for an RTP endpoint and convert WebRTC SDP to the more generic SDP used by legacy SIP servers, or have a way to bridge these two endpoints. I feel fairly confident that this is possible with Kurento, except that I am not aware of any jsSIP/sipML5 kind of API for Kurento, and Kurento itself is not meant to provide signalling. For example, if the SDP generated by Kurento for the rtpEndpoint has to be used in a SIP call/INVITE, how would one implement that? For that matter, how would one initiate a SIP INVITE from Kurento? Are there third-party modules to do this?
Has anyone used any of the servers listed above for a similar use case?
This is a programming question. I am looking for server APIs to implement a WebRTC to SIP gateway/bridge for media transcoding (if required), SDP transformation and SIP signalling.
I've got WebRTC peer-to-peer working, but when I want to broadcast a single camera to multiple clients, peer-to-peer obviously isn't suitable.
I've found solutions like
http://lynckia.com
and
http://www.medooze.com/products/mcu/webrtc-support.aspx
But the first I can't get set up (and it seems to have cross-browser issues), and the second feels like hitting a nail with a nuclear missile.
All I need is a relay, I don't need to decode / recode streams.
I just need
The Broadcaster to connect to the server (peer to peer)
The clients to connect to the server (peer to peer)
The server to relay the stream from the broadcaster to the clients.
Is there any software out there that offers this that I've missed? Is there a working, scalable alternative?
Thanks
Jitsi Video Bridge works pretty much exactly how you describe.
On your server you can run Janus, to which your broadcaster can provide a stream via RTP.
Have a look at an example configuration file.
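For instance, a minimal RTP mountpoint in the streaming plugin's configuration looks roughly like this (ports, IDs, and payload types are illustrative; check the sample file shipped with your Janus version):

[broadcast-test]
type = rtp
id = 1
description = Broadcaster feed (VP8/Opus over RTP)
audio = yes
video = yes
audioport = 5002
audiopt = 111
audiortpmap = opus/48000/2
videoport = 5004
videopt = 100
videortpmap = VP8/90000

The broadcaster then pushes RTP to ports 5002/5004, and each viewer attaches to mountpoint id 1 through the streaming plugin, so the server only relays packets and never decodes or re-encodes them.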
After writing a configuration file which defines how the server receives the stream from the broadcaster, you should be able to launch janus in the background from the command line:
$ janus --daemon --config=config_file.conf
Also, see streaming test demo.
Note: I have not tested this thoroughly.
Have a look at this GitHub repo, inspired by Muaz Khan's WebRTC p2p scalable broadcast. It can work great on a LAN. Over the internet, I am not sure how well it works as of now, though we are improving it as we go.
If you just want to broadcast from one peer to a set of peers, and they don't care about latency, the best solution is to convert WebRTC to live streaming, with no transcoding, just remuxing:
Peer(Publisher) ---WebRTC--> Server --RTMP/HLS/DASH--> Peers/Players
If this works for you, SRS is able to convert WebRTC to live streaming.
Live streaming lets you use a CDN, or TCP, to deliver the streams, but the latency is about 3~5s, so this solution is only viable when the Peers/Players never need to communicate back to the Peer (Publisher).
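If you try SRS for this, the relevant configuration is roughly the following sketch; the directive names are from the SRS 4.x era and may differ in other versions:

listen              1935;            # RTMP ingest/delivery
rtc_server {
    enabled         on;
    listen          8000;            # WebRTC over UDP
}
vhost __defaultVhost__ {
    rtc {
        enabled     on;
        rtc_to_rtmp on;              # remux the WebRTC publish into RTMP, no transcode
    }
    hls {
        enabled     on;              # also package as HLS for CDN delivery
    }
}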
If you want all those peers to talk to each other, it's much more complex and needs a WebRTC SFU cluster, because the number of streams explodes: with 100 peers all talking to each other, each peer subscribes to the other 99, giving roughly 100 x 100 = 10k streams.
It's complicated enough that I don't think there is a good open-source solution right now (as of 2022-02).
In WebRTC, I always see the implementation about peer-to-peer and how to get video streaming from one client to another client. How about server-to-client?
Is it possible for WebRTC to stream a video file from server to client?
(I am thinking about using WebRTC Native C++ API to create my own server application to connect to the current implementation on chrome or firefox browser client application.)
OK, if it is possible, will it be faster than many current video streaming services?
Yes, it is possible, as the server can be one of the peers in that peer-to-peer session.
If you respect the protocols and send the video in SRTP packets using VP8, the browser will play it. To help you build these components on other applications or servers, you can check this page and this project as a guide.
Now, comparing WebRTC with other streaming services... It will depend on several variables like the Codec or the protocol. But, for instance, comparing WebRTC (SRTP over UDP with VP8 Codec) against Flash (RTMP over TCP with H264 Codec), I would say that WebRTC wins.
The player will be Flash Player against the native <video> tag.
The transport would be TCP against UDP.
But of course, everything depends on what you are sending to the client.
I have written some apps and plugins using the native WebRTC API, and there isn't a lot of information out there yet, but here are a few useful resources to get you started:
QT Example: http://research.edm.uhasselt.be/jori/qtwebrtc
Native to Browser example: http://sourcey.com/webrtc-native-to-browser-video-streaming-example/
I started with the WebRTC Native C++ to Browser Video Streaming Example, but it no longer builds with the current WebRTC native code.
Then I made modifications, merging into a standalone process:
management of the peer connection (the peerconnection_server part)
access to Video4Linux capture (the peerconnection_client part).
Removing the stream from the browser to the WebRTC native C++ client gives a simple solution for accessing a Video4Linux device through a WebRTC browser; it is available on GitHub as webrtc-streamer.
Live Demo
We are attempting to replace MJPEGs with Webrtc for our server software and have a prototype module for doing this using a smattering of components tied to the Openwebrtc project. It has been an absolute bear to do, and we have frequent ICE negotiation errors (even over a simple LAN), but it mostly works.
We also built a prototype with the Google Webrtc module, but it had many dependencies. I find it easier to work with the Openwebrtc modules because Google's stuff is so tightly tied to general peer-to-peer scenarios on the browser.
I compiled the following from scratch:
libnice 0.1.14
gstreamer-sctp-1.0
usrsctp
Then I have to interact with libnice a bit directly to gather candidates, and also write out the SDP files by hand. But the amount of control (being able to control the source of the pipeline) makes it worthwhile. The resulting pipeline serves two clients off one server source.
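For reference, the candidate-gathering part against libnice reduces to something like the sketch below; the callback name is ours, the rest is the public libnice/GLib API:

#include <agent.h>   /* libnice; include path from `pkg-config --cflags nice` */

/* Fired once gathering finishes: read the local candidates and write them
   into the hand-made SDP. */
static void on_gathering_done(NiceAgent *agent, guint stream_id, gpointer data)
{
    /* e.g. nice_agent_get_local_candidates(agent, stream_id, 1); */
}

static NiceAgent *start_ice(GMainContext *ctx)
{
    NiceAgent *agent = nice_agent_new(ctx, NICE_COMPATIBILITY_RFC5245);
    g_object_set(agent, "stun-server", "203.0.113.1",  /* libnice wants an IP */
                 "stun-server-port", 3478, NULL);

    guint stream_id = nice_agent_add_stream(agent, 1); /* one component */
    g_signal_connect(agent, "candidate-gathering-done",
                     G_CALLBACK(on_gathering_done), NULL);
    nice_agent_gather_candidates(agent, stream_id);
    return agent;
}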
Of course. I'm writing a program using the native WebRTC API which can join a conference as a peer and record both video and audio.
see: How to stream audio from browser to WebRTC native C++ application
and you can definitely stream media from a native app.
I'm sure you can use dummy_audio_file to stream audio from a local file, and you can find a way to tap into the video streaming process on your own.
Yes, it is. We have developed a load-test tool for Ant Media Server that publishes and plays streams. This tool can broadcast a media file. We used the same native WebRTC library used in Ant Media Server.
Sure it's possible: the server can convert live streaming to WebRTC, for example:
OBS/FFmpeg ---RTMP---> Server ---WebRTC--> Chrome/Client
This scenario gives ultra-low-latency live streaming, about 600~800 ms, when playing the stream by WebRTC. Please take a look at this demo.
I am attempting to build a client/server game architecture and would like to begin testing the game using my local Mac as the server. I have found several articles on Bonjour, but that is for local network traffic only. My goal is to make this application work over the Internet, connecting to a hosted application on a static address to facilitate turn data. However, I'm at a loss as to which Cocoa APIs to use for this purpose. Should I use NSConnection, NSStream subclasses, or good 'ol C sockets and whatnot. The game state is already built in Objective-C and is ready to be set in motion once I have the server facilities ready. Any insight?
NSConnection, NSStream, and C sockets are built for different needs. You need to specify the needs of your game and the kind of service in order to get more help. If you want to develop a client-server application that works over the Internet rather than the local network, Bonjour will not be able to help.
C sockets, and Cocoa APIs that wrap around them are intended to operate on an open network stream between the client and the server. The advantage of having an open stream is that you can have the server send data to the client without the client having requested for it. NSURLConnection in Cocoa works differently. With it, you can perform HTTP requests and receive formatted responses from a server.
If your application is based on HTTP requests, I recommend you take a look at NSURLConnection, or at AFNetworking as a 3rd-party alternative. If your application relies on open streams, I recommend you take a look at CFNetwork from Apple (a C wrapper around BSD sockets that originates from the days Macs had Carbon, with great performance, stability, and versatility) and GCDAsyncSocket, a 3rd-party library wrapped around BSD sockets that supports Grand Central Dispatch, is Objective-C ready, and does the job wonderfully.
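To make the open-stream idea concrete, here is what it looks like at the lowest level, with plain BSD sockets (the layer that CFNetwork and GCDAsyncSocket wrap); error handling is trimmed and the address is an example:

#include <arpa/inet.h>
#include <sys/socket.h>
#include <string.h>
#include <unistd.h>

int connect_to_game_server(void)
{
    int fd = socket(AF_INET, SOCK_STREAM, 0);           /* TCP stream socket */
    if (fd < 0)
        return -1;

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof addr);
    addr.sin_family = AF_INET;
    addr.sin_port   = htons(4000);                      /* example port */
    inet_pton(AF_INET, "203.0.113.10", &addr.sin_addr); /* example static IP */

    if (connect(fd, (struct sockaddr *)&addr, sizeof addr) < 0) {
        close(fd);
        return -1;
    }
    /* The stream now stays open: the server can push turn data whenever it
       likes, without the client having to poll for it. */
    return fd;
}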
I hope I helped.
I suggest you use sockets, since they're not hard to use and are the standard way. I've even written an asynchronous wrapper class around BSD sockets: https://github.com/H2CO3/TCPHelper
This is for simple, one-to-one TCP protocol connections, supporting both IPv4 and IPv6. You can send and receive raw NSData and possibly build a protocol around it.
Foundation classes such as NSURLConnection are not particularly meant for this purpose; they are for interacting with standard HTTP servers (and I suppose you don't want to implement a full HTTP server for a game).
NSNetServices may suit you just like CFNetwork, but the latter is a bit harder to use. If you'd like to use Foundation classes, I'd recommend NSNetServices.
Hope it helps.
There are many different ways to accomplish this. It really depends on how you'll be passing the data and what it will be used for.
First, I would set up a hostname that you can use for development purposes with your game. You can use something like http://dyn.com/dns/ to set this up for your Mac. Then you can enable a compiler setting to switch between the development and production URLs.
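The dev/production switch can be as simple as a preprocessor conditional keyed off the build configuration; the hostnames here are placeholders:

#ifdef DEBUG
#define GAME_SERVER_HOST "mygame-dev.dyndns.example"  /* your Mac, via dynamic DNS */
#else
#define GAME_SERVER_HOST "game.example.com"           /* hosted static address */
#endif

Recent Xcode project templates define DEBUG in the Debug configuration, so the right host is baked in automatically.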
Next, I would recommend using TCP sockets for your game (using CocoaAsyncSocket - https://github.com/robbiehanson/CocoaAsyncSocket). This method should work fine for your use case. Since you are passing turn-based data (and since all of that data is vital), I would not recommend UDP sockets (those would work if you were solely passing position, video, or audio data, where a dropped packet might not matter).
I am working with an electronics appliance manufacturer to embed LAN based control systems into the products. The idea is to serve up a system configuration/control interface through a web browser so clients never need to install software. We can communicate with the appliance by sending and receiving serial data through the embedded module. Since the appliance can also be controlled from a front panel UI, it creates a challenge to keep a remote web interface in sync with very low latency. It seems like websockets or some sort of Push is what we need for handling real time events from the server to clients.
I am using a Lantronix Mathport AR embedded device server. Out of the box the unit will serve up any custom HTML and Java servlets/applets. We have the option to install a lightweight Linux distro if we need more flexibility. I am not sure how to implement any server-side apps since the device is not running standard Apache; I believe it is using Boa.
Can anyone guide me in the right direction of how to do this?
Some general info... The WebSocket protocol (draft spec here) is a simple layer on top of TCP. What this means is that, if you already have a TCP server for your platform, implementing WebSocket support is just a matter of hours. The protocol specifies a handshake and two ways of sending data frames.
I strongly suggest you start by reading the 39-page spec.
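To give a feel for its size: the whole handshake is one HTTP/1.1 upgrade round trip. The key and accept values below are the spec's own example:

GET /chat HTTP/1.1
Host: server.example.com
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
Sec-WebSocket-Version: 13

HTTP/1.1 101 Switching Protocols
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Accept: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=

The server computes Sec-WebSocket-Accept by appending the fixed GUID 258EAFA5-E914-47DA-95CA-C5AB0DC85B11 to the client's key, SHA-1 hashing the result, and base64-encoding the digest; after the 101 response, both sides exchange data frames over the same TCP connection.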
As Tihauan already mentioned, start by reading the spec, and also note that there are still some ongoing changes, although WebSockets is now more stable than it was a year ago.
A key point for me was the requirement that websocket data is entirely UTF-8 text (later revisions of the protocol added binary frames), which lends itself nicely to JSON-based message definitions.
Our system uses a form of embedded linux, so we then added and made use of the following libraries:
"libwebsockets" from:
http://git.warmcat.com/cgi-bin/cgit/libwebsockets/
"jansson" from:
http://www.digip.org/jansson/
Using the above as support libraries, we created an internal lightweight "client/server" that allowed our other software modules to register for certain applicable WebSocket messages and respond as needed. Worked great.
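As a sketch of what such a dispatcher looks like (modern libwebsockets naming; the message-type registry itself is application-specific and omitted):

#include <libwebsockets.h>
#include <jansson.h>

static int control_callback(struct lws *wsi, enum lws_callback_reasons reason,
                            void *user, void *in, size_t len)
{
    switch (reason) {
    case LWS_CALLBACK_ESTABLISHED:
        /* Browser connected: a good place to push the current appliance state. */
        break;
    case LWS_CALLBACK_RECEIVE: {
        json_error_t err;
        json_t *msg = json_loadb((const char *)in, len, 0, &err);
        if (msg) {
            /* Look up the module registered for the message's "type" field
               and let it respond; then release the parsed tree. */
            json_decref(msg);
        }
        break;
    }
    default:
        break;
    }
    return 0;
}

static const struct lws_protocols protocols[] = {
    { "appliance-control", control_callback, 0, 4096 }, /* name, cb, session size, rx buffer */
    { NULL, NULL, 0, 0 }                                /* list terminator */
};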
Good luck and best regards,
I'm a bit late, but Mozilla posted a guide entitled "Writing WebSocket servers", which literally guides you through writing a websocket server.
You will need to already know how HTTP works and have medium programming experience. Depending on language support, knowledge of TCP sockets may be required. The scope of this guide is to present the minimum knowledge you need to write a WebSocket server.
https://developer.mozilla.org/en-US/docs/Web/API/WebSockets_API/Writing_WebSocket_servers