We have a mobile application that has historically used RTSP streaming to let users watch a live stream, currently published via Wowza Streaming Engine. We need to lower the stream latency, so we have gravitated towards WebRTC to achieve this.
The problem is that there seems to be a lack of documentation or examples for implementing a react-native WebRTC viewer that connects to a remote stream.
Does anyone out there have any documentation or code examples for this kind of implementation?
I do note there is a react-native-webrtc library; however, all of its examples demonstrate connecting two peers on mobile phones with their video cameras, i.e. like FaceTime. We are after an example of someone on a phone connecting to a remote streaming server serving a video feed.
Cheers,
If you want a WebRTC client to connect to a server, you need a server doing WebRTC with signaling that fits your needs. WebRTC doesn't care which signaling you use, so you have to choose it, or choose the platform you need.
There are a lot of different media servers and libraries that support WebRTC on the server side, each with its own specific signaling (e.g. FreeSWITCH, Kurento, etc.) or with no signaling at all (e.g. mediasoup). Few will have a react-native version, as media streaming is not really something that lives on the JavaScript/UI side, but you can do something with the react-native-webrtc lib.
Twilio supports a lot of platforms and could be a good start if you're looking for a ready-to-use solution.
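For what it's worth, a receive-only viewer with react-native-webrtc boils down to a plain RTCPeerConnection with no local tracks. Here is a minimal sketch, assuming a hypothetical sendOfferToServer() that carries the SDP over whatever signaling your server (Wowza, Janus, ...) expects; event and option names can differ slightly between library versions.

```tsx
import { RTCPeerConnection, RTCSessionDescription, MediaStream } from 'react-native-webrtc';

// Hypothetical signaling helper: send the offer SDP to your media server
// and get the answer SDP back (HTTP, WebSocket, whatever the server uses).
declare function sendOfferToServer(sdp: string): Promise<string>;

export async function watchRemoteStream(onStream: (stream: MediaStream) => void) {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
  });

  // Newer react-native-webrtc versions emit 'track'; older ones emit 'addstream'.
  pc.addEventListener('track', (event: any) => {
    if (event.streams && event.streams[0]) {
      onStream(event.streams[0]);
    }
  });

  // Receive-only: we never attach a local camera or microphone.
  const offer = await pc.createOffer({ offerToReceiveAudio: true, offerToReceiveVideo: true });
  await pc.setLocalDescription(offer);

  const answerSdp = await sendOfferToServer(offer.sdp);
  await pc.setRemoteDescription(new RTCSessionDescription({ type: 'answer', sdp: answerSdp }));
}

// Render the received stream with react-native-webrtc's view component:
// <RTCView streamURL={stream.toURL()} style={{ flex: 1 }} />
```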
I am trying to build a live stream video app.
I built an RTMP server which is ready for publishing and playing streams. I need a way to capture the mobile user's camera and send the live stream to my RTMP server.
I use react-native on the client side. I found react-native-camera, which is great for dealing with the camera, but I couldn't find any event/API for accessing the camera stream in its documentation.
Another problem is how to send the stream to the RTMP server. I have no knowledge in this area, so any kind of help would be appreciated.
For anyone else who faces the same issue, this repo is the ultimate solution:
https://github.com/NodeMedia/react-native-nodemediaclient
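As a rough idea of how that library is used for publishing, here is a minimal sketch based on its README; exact prop names and values may differ between versions, and the RTMP URL is a placeholder.

```tsx
import React, { useRef } from 'react';
import { NodeCameraView } from 'react-native-nodemediaclient';

export function Publisher() {
  // Calling start()/stop() on this ref begins and ends publishing to the RTMP URL.
  const camRef = useRef<any>(null);

  return (
    <NodeCameraView
      style={{ height: 400 }}
      ref={camRef}
      outputUrl={'rtmp://your.server/live/stream1'}   // placeholder: your RTMP ingest URL
      camera={{ cameraId: 1, cameraFrontMirror: true }}
      audio={{ bitrate: 32000, profile: 1, samplerate: 44100 }}
      video={{ preset: 12, bitrate: 400000, profile: 1, fps: 15, videoFrontMirror: false }}
      autopreview={true}
    />
  );
}
```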
Is it possible to use a TURN server to play the role of a media server (like Janus or Kurento) and relay streams?
user1<----->turn<------>user2
TURN servers only help in 1-to-1 calls, as a relay for the (encrypted) media through NAT. We can't decrypt the media at the TURN server.
WebRTC media servers/gateways like Janus will help with advanced use cases like streaming, conferencing, PSTN/SIP, and recording.
Read the tutorials and choose a media server based on your use case.
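To make the distinction concrete: a TURN server only ever appears in the clients' ICE configuration; it never terminates the media. A minimal sketch (the URLs and credentials are placeholders):

```ts
// The TURN server just relays encrypted packets between the two peers;
// unlike Janus/Kurento, it never decodes, records, or redistributes the media.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.example.com:3478' },
    { urls: 'turn:turn.example.com:3478', username: 'user', credential: 'secret' },
  ],
});
```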
I am building a web-based project which has a one-way webcam broadcasting part. (A user can open their own cam, and viewers can join the room just to watch and listen.)
So I have decided to use Kurento Media Server (KMS), as I don't have any experience with Flash.
My questions:
Do I need anything extra besides KMS to let a user broadcast their webcam?
Can Kurento provide live streaming to a webpage?
And what is the difference between using Red5 and Kurento?
Thanks in advance
Do I need anything extra besides KMS to let a user broadcast their webcam?
You'll probably need a TURN server for users that have some port limitations.
Can Kurento provide live streaming to a webpage?
Sure! Check the tutorials and the documentation for a full list of features.
And what is the difference between using Red5 and Kurento?
Kurento is more than just a media server. It is a pluggable platform that offers computer vision and augmented reality capabilities, on top of video and audio streaming, recording and playing. It also offers WebRTC out of the box, which is something Red5 can't do as of today.
Disclaimer: I'm part of the Kurento team.
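For a rough picture of the one-to-many (broadcast) pattern with KMS, here is a hedged sketch using the kurento-client npm package, loosely following the official one2many tutorial; the browser-side SDP/ICE signaling and error handling are left out.

```ts
import kurentoClient from 'kurento-client';

// One WebRtcEndpoint receives the presenter's webcam; each viewer gets its own
// endpoint fed from the presenter's. SDP offers/answers and ICE candidates are
// exchanged with the browsers over your own signaling channel
// (endpoint.processOffer / gatherCandidates / addIceCandidate).
async function createBroadcast(kmsUri: string) {
  const client = await kurentoClient(kmsUri);            // e.g. 'ws://localhost:8888/kurento'
  const pipeline = await client.create('MediaPipeline');

  const presenterEndpoint = await pipeline.create('WebRtcEndpoint');
  const viewerEndpoint = await pipeline.create('WebRtcEndpoint');
  await presenterEndpoint.connect(viewerEndpoint);

  return { pipeline, presenterEndpoint, viewerEndpoint };
}
```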
In WebRTC, I always see implementations of peer-to-peer and how to get video streaming from one client to another. What about server-to-client?
Is it possible for WebRTC to stream a video file from server to client?
(I am thinking about using the WebRTC Native C++ API to create my own server application that connects to the current implementation in the Chrome or Firefox browser client.)
OK, if it is possible, will it be faster than many current video streaming services?
Yes it is possible as the server can be one of the peers in that peer-to-peer session.
If you respect the protocols and send the video in SRTP packets using VP8, the browser will play it. To help you build these components in other applications or servers, you can check this page and this project as a guide.
Now, comparing WebRTC with other streaming services... it will depend on several variables like the codec or the protocol. But, for instance, comparing WebRTC (SRTP over UDP with the VP8 codec) against Flash (RTMP over TCP with the H264 codec), I would say that WebRTC wins.
The player would be Flash Player versus the native <video> tag.
The transport would be TCP versus UDP.
But of course, everything depends on what you are sending to the client.
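On the browser side, playing whatever the server-peer sends really is just the native <video> tag. A minimal sketch of the receiving end (signaling with the server is assumed to happen elsewhere):

```ts
const pc = new RTCPeerConnection();

// Attach whatever the remote (server) peer sends to a plain <video> element.
pc.ontrack = (event) => {
  const video = document.querySelector('video');
  if (video && video.srcObject !== event.streams[0]) {
    video.srcObject = event.streams[0];
    video.play();
  }
};
```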
I have written some apps and plugins using the native WebRTC API, and there isn't a lot of information out there yet, but here are a few useful resources to get you started:
QT Example: http://research.edm.uhasselt.be/jori/qtwebrtc
Native to Browser example: http://sourcey.com/webrtc-native-to-browser-video-streaming-example/
I started with the WebRTC Native C++ to Browser Video Streaming Example, but it no longer builds with the current WebRTC native code.
Then I made modifications, merging into a standalone process:
management of the peerConnection (the peerconnection_server)
access to Video4Linux capture (the peerconnection_client).
Removing the stream from the browser to the WebRTC native C++ client gives a simple solution for accessing a Video4Linux device through a WebRTC browser; it is available on GitHub as webrtc-streamer.
Live Demo
We are attempting to replace MJPEG with WebRTC for our server software and have a prototype module for doing this using a smattering of components tied to the OpenWebRTC project. It has been an absolute bear to do, and we have frequent ICE negotiation errors (even over a simple LAN), but it mostly works.
We also built a prototype with the Google WebRTC module, but it had many dependencies. I find it easier to work with the OpenWebRTC modules because Google's stuff is so tightly tied to general peer-to-peer scenarios in the browser.
I compiled the following from scratch:
libnice 0.1.14
gstreamer-sctp-1.0
usrsctp
Then I had to interact with libnice a bit directly to gather candidates, and also write out the SDP files by hand. But the amount of control (being able to control the source of the pipeline) makes it worthwhile. The resulting pipeline ran two clients off one server source.
Of course. I'm writing a program using the native WebRTC API which can join a conference as a peer and record both video and audio.
See: How to stream audio from browser to WebRTC native C++ application
You can definitely stream media from a native app.
I'm sure you can use dummy_audio_file to stream audio from a local file, and you can find a way to handle the video streaming part on your own.
Yes, it is. We have developed a load test tool for Ant Media Server that publishes and plays streams. This tool can broadcast a media file. We used the same native WebRTC library that is used in Ant Media Server.
Sure, it's possible; the server can convert a live stream to WebRTC, for example:
OBS/FFmpeg ---RTMP---> Server ---WebRTC--> Chrome/Client
In this scenario you get ultra-low-latency live streaming, about 600~800 ms, when playing the stream with WebRTC. Please take a look at this demo.