I am trying to build a live stream video app.
I built an RTMP server which is ready for publishing and playing streams. I need a way to capture the mobile user's camera and send the live stream to my RTMP server.
I use react-native on the client side. I found react-native-camera, which is great at dealing with the camera, but I couldn't find any event/API for accessing the camera stream in its documentation.
Another problem is how to send the stream to the RTMP server. I have no knowledge in this area, so any kind of help will be appreciated.
For anyone else who faces the same issue, this repo is the ultimate solution:
https://github.com/NodeMedia/react-native-nodemediaclient
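Here is a minimal publishing sketch based on that repo's README (the RTMP URL is a placeholder, and exact prop values may differ between versions):

```javascript
// Publish the device camera to an RTMP server with react-native-nodemediaclient.
import React, { useRef } from 'react';
import { Button, View } from 'react-native';
import { NodeCameraView } from 'react-native-nodemediaclient';

export default function Publisher() {
  const cameraRef = useRef(null);
  return (
    <View style={{ flex: 1 }}>
      <NodeCameraView
        style={{ flex: 1 }}
        ref={cameraRef}
        outputUrl={'rtmp://your-server/live/stream-key'} // placeholder URL
        camera={{ cameraId: 1, cameraFrontMirror: true }}
        audio={{ bitrate: 32000, profile: 1, samplerate: 44100 }}
        video={{ preset: 12, bitrate: 400000, profile: 1, fps: 15 }}
        autopreview={true}
      />
      {/* start()/stop() begin and end the RTMP publish */}
      <Button title="Go live" onPress={() => cameraRef.current.start()} />
      <Button title="Stop" onPress={() => cameraRef.current.stop()} />
    </View>
  );
}
```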
We have a mobile application that has historically used RTSP streaming to let a user watch a live stream, currently published via Wowza Streaming Engine. We need to lower stream latency, so we have gravitated towards WebRTC to achieve this.
The problem is that there seems to be a lack of documentation or examples for implementing a react-native WebRTC viewer that connects to a remote stream.
Does anyone out there have any documentation, or code examples for this kind of implementation?
I do note there is a react-native-webrtc library; however, all the examples demonstrate connecting two peers on mobile phones with their video cameras, i.e., like FaceTime. We are after an example of a phone connecting to a remote streaming server and receiving a video feed.
Cheers,
If you want a WebRTC client to connect to a server, you need a server doing WebRTC with signaling that fits your needs. WebRTC doesn't care which signaling you use, so you have to choose one, or choose the platform you need.
There are a lot of different media servers and libraries that support WebRTC on the server side, each with its own specific signaling (e.g. FreeSWITCH, Kurento) or no built-in signaling (e.g. mediasoup). Few will have a React Native version, as media streaming is not really a JavaScript/UI concern, but you can do something with the react-native-webrtc lib.
Twilio supports a lot of platforms and could be a good start if you are looking for a ready-to-use solution.
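As a starting point, here is a minimal receive-only viewer sketch using react-native-webrtc (assuming a recent release with unified-plan transceivers; the WebSocket endpoint and message shapes are placeholders you would adapt to your media server's signaling):

```javascript
// Receive-only WebRTC viewer: no local camera, just a remote stream.
import { RTCPeerConnection, RTCSessionDescription } from 'react-native-webrtc';

export async function watchRemoteStream(onStream) {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
  });

  // Ask for media in one direction only; this is what makes it a viewer.
  pc.addTransceiver('video', { direction: 'recvonly' });
  pc.addTransceiver('audio', { direction: 'recvonly' });

  // Hand the remote stream to the UI, e.g. <RTCView streamURL={stream.toURL()} />.
  pc.ontrack = (event) => onStream(event.streams[0]);

  const signaling = new WebSocket('wss://example.com/signal'); // placeholder endpoint
  signaling.onmessage = async ({ data }) => {
    const msg = JSON.parse(data); // assumed shape: an SDP answer from the server
    if (msg.type === 'answer') {
      await pc.setRemoteDescription(new RTCSessionDescription(msg));
    }
  };
  signaling.onopen = async () => {
    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);
    signaling.send(JSON.stringify(pc.localDescription));
  };

  return pc;
}
```

(ICE candidate exchange is omitted; most servers also expect candidates, or you can wait for ICE gathering to complete before sending the offer.)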
I built a React.js and Node.js app, inspired by the KVS example.
It works for a few participants: a Master can stream his webcam video & audio via WebRTC and receive every viewer's webcam video. But we realized it won't be suitable for 50 people, as the Master's CPU and network usage grow with each viewer connection.
For every viewer, a peerConnection is created with the Master (roughly as in the sketch below). We'd rather have the Master send his stream only once to a server (which then forwards it to the viewers).
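For illustration, the fan-out pattern described above looks roughly like this (all helper names are hypothetical); each viewer costs the Master another encoder instance and another upstream copy of the stream, which is exactly the cost an SFU removes:

```javascript
// One RTCPeerConnection per viewer: CPU and upload grow linearly with viewers.
const viewerConnections = new Map();

async function onViewerJoined(viewerId, localStream, sendSignal) {
  const pc = new RTCPeerConnection();

  // The same camera tracks are encoded and encrypted once per connection.
  localStream.getTracks().forEach((track) => pc.addTrack(track, localStream));

  pc.onicecandidate = ({ candidate }) => {
    if (candidate) sendSignal(viewerId, { candidate }); // sendSignal is hypothetical
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendSignal(viewerId, { sdp: pc.localDescription });

  viewerConnections.set(viewerId, pc);
}
```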
We would like to go for an SFU solution; would it be possible with the JavaScript SDK?
For the C SDK, it is advised to use PutMedia & GStreamer; is there an equivalent in JavaScript?
I saw that mediasoup could do the SFU part; could it be used with Kinesis?
I'm new to WebRTC and JavaScript. I'm trying to build a video chat application with recording functionality on the server. Currently, I use EasyRTC as a WebRTC wrapper to provide the video chat functionality, and it's working great. I also set up a TURN server in the cloud using Coturn and use it in the EasyRTC config.
I would now like to add video recording on the server, and I've learned that this is achieved via a media server. I'm keeping an eye on Kurento for this.
I'm just confused about media servers in general.
Can a media server replace a TURN server?
If both a TURN server and a media server are required, can Kurento be installed on the same machine as Coturn?
Can I keep EasyRTC and add Kurento for video recording? If yes, how can Kurento record the video stream from EasyRTC/Coturn? I would appreciate pseudocode if possible.
Am I on the right track? Any other advice to consider?
I would highly appreciate your comments.
Thank you!
In WebRTC, I always see implementations of peer-to-peer, getting video streaming from one client to another. How about server-to-client?
Is it possible for WebRTC to stream a video file from server to client?
(I am thinking about using the WebRTC Native C++ API to create my own server application that connects to the current implementation in a Chrome or Firefox browser client application.)
OK, if it is possible, will it be faster than many current video streaming services?
Yes, it is possible, as the server can be one of the peers in that peer-to-peer session.
If you respect the protocols and send the video in SRTP packets using VP8, the browser will play it. To help you build these components for other applications or servers, you can check this page and this project as a guide.
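To make the idea concrete, here is a sketch of a server acting as a peer, using the node-webrtc (wrtc) package as a stand-in for the native C++ API (the nonstandard RTCVideoSource API follows node-webrtc's documentation; signaling and frame decoding are omitted, and the frame-producing helper is an assumption):

```javascript
// A Node.js "server peer" that feeds raw frames into a WebRTC video track.
const { RTCPeerConnection, nonstandard } = require('wrtc');

const pc = new RTCPeerConnection();

// RTCVideoSource lets the application supply frames programmatically;
// wrtc encodes them (VP8 by default) and ships them as SRTP to the browser.
const source = new nonstandard.RTCVideoSource();
pc.addTrack(source.createTrack());

// Push one raw I420 frame (e.g. decoded from a video file) into the track.
function pushFrame(i420Data, width, height) {
  source.onFrame({ width, height, data: i420Data });
}

// From here, exchange an SDP offer/answer with the browser over your own
// signaling channel (HTTP, WebSocket, ...), exactly as in a two-browser call.
```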
Now, comparing WebRTC with other streaming services... it will depend on several variables, like the codec or the protocol. But, for instance, comparing WebRTC (SRTP over UDP with the VP8 codec) against Flash (RTMP over TCP with the H.264 codec), I would say that WebRTC wins.
The player would be Flash Player against the native <video> tag.
The transport would be TCP against UDP.
But of course, everything depends on what you are sending to the client.
I have written some apps and plugins using the native WebRTC API, and there isn't a lot of information out there yet, but here are a few useful resources to get you started:
QT Example: http://research.edm.uhasselt.be/jori/qtwebrtc
Native to Browser example: http://sourcey.com/webrtc-native-to-browser-video-streaming-example/
I started with the WebRTC native C++ to browser video streaming example, but it no longer builds with the current WebRTC native code.
Then I made modifications, merging into a standalone process:
management of the peerConnection (from peerconnection_server)
access to Video4Linux capture (from peerconnection_client)
Removing the stream sent from the browser to the WebRTC native C++ client gives a simple solution for accessing a Video4Linux device from a WebRTC browser; it is available on GitHub as webrtc-streamer.
Live Demo
We are attempting to replace MJPEGs with WebRTC for our server software and have a prototype module for doing this, using a smattering of components tied to the OpenWebRTC project. It has been an absolute bear to do, and we have frequent ICE negotiation errors (even over a simple LAN), but it mostly works.
We also built a prototype with the Google WebRTC module, but it had many dependencies. I find it easier to work with the OpenWebRTC modules because Google's stuff is so tightly tied to general peer-to-peer scenarios in the browser.
I compiled the following from scratch:
libnice 0.1.14
gstreamer-sctp-1.0
usrsctp
Then I had to interact with libnice a bit directly to gather candidates, and also write out the SDP files by hand. But the amount of control (being able to control the source of the pipeline) makes it worthwhile. The resulting pipeline serves two clients off one server source.
Of course. I'm writing a program using the native WebRTC API which can join a conference as a peer and record both video and audio.
see: How to stream audio from browser to WebRTC native C++ application
You can definitely stream media from a native app.
I'm sure you can use dummy_audio_file to stream audio from a local file, and you can find a way to access the video stream on your own.
Yes, it is. We have developed a load test tool to publish and play streams for Ant Media Server. This tool can broadcast a media file. We used the same native WebRTC library that is used in Ant Media Server.
Sure, it's possible; you can convert a live stream to WebRTC, for example:
OBS/FFmpeg ---RTMP---> Server ---WebRTC--> Chrome/Client
This scenario allows ultra-low-latency live streaming, about 600~800 ms, when playing the live stream via WebRTC. Please take a look at this demo.
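For the publishing side of that pipeline, here is a sketch of pushing a local file to the RTMP ingest with FFmpeg spawned from Node.js (the file name and URL are placeholders; -c copy assumes the input is already H.264/AAC):

```javascript
// Push a local file to an RTMP server in real time using FFmpeg.
const { spawn } = require('child_process');

const ffmpeg = spawn('ffmpeg', [
  '-re',                 // read input at its native frame rate (live pacing)
  '-i', 'demo.mp4',      // placeholder input file
  '-c', 'copy',          // no re-encode; assumes H.264 video + AAC audio
  '-f', 'flv',           // RTMP requires the FLV container
  'rtmp://your-server/live/stream', // placeholder ingest URL
]);

ffmpeg.stderr.pipe(process.stdout); // FFmpeg writes its log to stderr
```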
Does anyone know about RTSP streaming using the Wowza server?
I want to play it in an MPMoviePlayer controller in iOS 6, but it says there is not enough buffer to keep it up. My web service URLs work fine, because I have also checked them in a browser, but I can't find anything about RTSP streaming.
Does anyone have any tutorials about RTSP streaming on the iPhone using the Wowza server?
RTSP streaming is not possible on the iPhone, iPad, and iPod touch. Please refer to the following link:
http://www.wowza.com/forums/content.php?62
MPMoviePlayer only supports HTTP Live Streaming. To have RTSP work on the iPhone, you must implement the client yourself on iOS.
There is the live555 library, which implements RTSP for you, but you must integrate it with your code. Decoding of the stream must also be implemented in software, by you or a third-party library.
Wowza supports restreaming the content as HLS. If this is your Wowza server, there are easy instructions on the www.wowza.com site; if it's not, perhaps they are already streaming HLS.
If you have to use RTSP, there are several good players available on the App Store, or you can build your own using live555 as mentioned, or one of our open- or closed-source frameworks.
https://github.com/mooncatventures-group/AVDemoPlay2L