Streaming RTSP to WebRTC using Kurento

I have been testing out Kurento for a while now.
I have gone through one2many sample, and got everything working.
Now I would like to do the same, but have the "presenter" be an RTSP source.
I don't have much experience with RTSP, so I might be missing something. I have looked over several samples and they all use the PlayerEndpoint, which receives an rtsp://... address.
For my implementation, I would rather the camera access a Kurento URL in order to initiate the RTSP stream.
Since I have very limited experience with RTSP, I'm not sure if this is possible and if it's a common practice.
If not, what are the alternatives in a case where I don't know the RTSP URI in advance and don't have a UI to input it at runtime?

Follow this example project; it shows how to make Kurento stream RTSP to WebRTC on the fly.
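The essential pipeline behind that project is small: a PlayerEndpoint pulls the RTSP feed into Kurento, and a WebRtcEndpoint serves it to the browser. Below is a minimal sketch using the Node kurento-client package; the KMS address and RTSP URI are placeholders, and trickle-ICE candidate exchange is omitted for brevity.

    import kurentoClient from 'kurento-client';

    // Placeholders: point these at your own KMS instance and camera.
    const KMS_WS_URI = 'ws://localhost:8888/kurento';
    const RTSP_URI = 'rtsp://camera.example.com/stream';

    async function startRtspToWebRtc(sdpOffer: string): Promise<string> {
      const client = await kurentoClient(KMS_WS_URI);
      const pipeline = await client.create('MediaPipeline');

      // PlayerEndpoint pulls the RTSP feed into the pipeline...
      const player = await pipeline.create('PlayerEndpoint', { uri: RTSP_URI });
      // ...and WebRtcEndpoint pushes it out to the browser.
      const webRtc = await pipeline.create('WebRtcEndpoint');
      await player.connect(webRtc);

      // Answer the browser's SDP offer, then start pulling from the camera.
      const sdpAnswer = await webRtc.processOffer(sdpOffer);
      await webRtc.gatherCandidates();
      await player.play();

      return sdpAnswer; // send back to the browser over your signaling channel
    }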

Related

React Native - Connecting to remote WebRTC stream

We have a mobile application that historically has used RTSP streaming to let users watch a live stream, which is currently published via Wowza Streaming Engine. We need to lower stream latency, so we have gravitated towards WebRTC to achieve this.
The problem is that there seems to be a lack of documentation, or examples regarding the implementation of a react-native WebRTC viewer which connects to a remote stream.
Does anyone out there have any documentation, or code examples for this kind of implementation?
I do note there is a react-native-webrtc library; however, all examples demonstrate connecting two peers on mobile phones with their video cameras, i.e. like FaceTime. We are after an example demonstrating someone on a phone connecting to a remote streaming server with a video feed.
Cheers,
If you want a WebRTC client to connect to a server, you need a server doing WebRTC with signaling that fits your needs. WebRTC doesn't care which signaling you use, so you have to choose one, or choose a platform that provides it.
There are a lot of different media servers and libraries that support WebRTC on the server side, each with its own specific signaling (e.g. FreeSWITCH, Kurento) or with no built-in signaling at all (e.g. mediasoup). Few will have a React Native version, since media streaming is not really a JavaScript/UI concern, but you can do something with the react-native-webrtc library.
Twilio supports a lot of platforms and could be a good start if you are looking for a ready-to-use solution.
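As a starting point, here is a rough sketch of a receive-only viewer with react-native-webrtc. The wss://media.example.com/signaling endpoint and its JSON message shape are hypothetical stand-ins for whatever signaling your server exposes; note that older react-native-webrtc releases used onaddstream instead of the standard track API shown here.

    import { RTCPeerConnection, RTCSessionDescription, MediaStream } from 'react-native-webrtc';

    // Hypothetical signaling endpoint exposed by your media server.
    const signaling = new WebSocket('wss://media.example.com/signaling');

    const pc = new RTCPeerConnection({
      iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
    });

    // Receive-only: we want the server's stream, not the phone's camera.
    pc.addTransceiver('video', { direction: 'recvonly' });
    pc.addTransceiver('audio', { direction: 'recvonly' });

    pc.ontrack = (event: any) => {
      // Hand the remote stream to an <RTCView streamURL={...} /> component.
      const remoteStream: MediaStream = event.streams[0];
      console.log('remote stream ready', remoteStream.toURL());
    };

    signaling.onopen = async () => {
      const offer = await pc.createOffer({});
      await pc.setLocalDescription(offer);
      signaling.send(JSON.stringify({ type: 'offer', sdp: offer.sdp }));
    };

    signaling.onmessage = async (msg) => {
      const { type, sdp } = JSON.parse(msg.data as string);
      if (type === 'answer') {
        await pc.setRemoteDescription(new RTCSessionDescription({ type, sdp }));
      }
    };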

Videochat without using webrtc?

I say "without" using webrtc because, webrtc currently supports very few codecs, I would like to make a videochat using a custom or experimental codecs, so I'm starting with implementing basic udp websockets, but how do I implement something like SDP? Since the ip of the clients will keep changing, how do I maintain connection? Can anyone point me to a decent way of implementing this?

Is it possible to deliver an RTSP stream via Kurento? WebRTC to RTSP

I want to use Kurento as a media server that takes WebRTC as input and provides an RTSP stream at a URL such as rtsp://kurento/streamName
Is this possible?
I saw the https://github.com/lulop-k/kurento-rtsp2webrtc/ project, which does the opposite.
My final goal is to deliver a stream to mobile browsers via JSMPEG.
This is not possible; as the Kurento team says: "We can consume it, but not produce it."
Now, as a common workaround, you could stream from Kurento to a Wowza media server using an RtpEndpoint, and then re-stream RTSP from Wowza. In the KMS google group there is a lot of content related to the integration between the two.
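Here is a sketch of that Kurento-to-Wowza leg with the Node kurento-client package. The Wowza address, port, and the SDP rewriting below are assumptions based on the approach usually discussed in the KMS google group; only the video m-line is rewritten for brevity, and Wowza still needs a matching .stream file describing the incoming RTP.

    import kurentoClient from 'kurento-client';

    // Placeholders for your deployments.
    const KMS_WS_URI = 'ws://localhost:8888/kurento';
    const WOWZA_IP = '203.0.113.10'; // where Wowza listens for RTP ingest
    const WOWZA_PORT = 5004;

    async function webRtcToWowza(browserSdpOffer: string): Promise<string> {
      const client = await kurentoClient(KMS_WS_URI);
      const pipeline = await client.create('MediaPipeline');

      // The browser publishes into Kurento over WebRTC...
      const webRtc = await pipeline.create('WebRtcEndpoint');
      const sdpAnswer = await webRtc.processOffer(browserSdpOffer);
      await webRtc.gatherCandidates();

      // ...and an RtpEndpoint pushes plain RTP out towards Wowza.
      const rtp = await pipeline.create('RtpEndpoint');
      await webRtc.connect(rtp);

      // Generate our side of the SDP, then fake an "answer" that points
      // the media at Wowza's ingest address and port.
      const offer = await rtp.generateOffer();
      const fakeAnswer = offer
        .replace(/c=IN IP4 \S+/g, `c=IN IP4 ${WOWZA_IP}`)
        .replace(/m=video \d+/, `m=video ${WOWZA_PORT}`);
      await rtp.processAnswer(fakeAnswer);

      return sdpAnswer; // back to the publishing browser
    }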

Can Kurento Media Server provide webcam broadcasting like Red5 does?

I am building a web-based project which has a one-way webcam broadcasting part (a user can open their own cam, and viewers can join the room just to watch and listen).
So I have decided to use Kurento Media Server (KMS), since I don't have any experience with Flash.
The questions in my head:
Do I need anything extra besides KMS to let a user broadcast their webcam?
Can Kurento provide live streaming to a web page?
And what is the difference between using Red5 and Kurento?
Thanks in advance
Do I need anything extra besides KMS to let a user broadcast their webcam?
You'll probably need a TURN server for users whose networks block the ports WebRTC needs.
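On the browser side, a TURN server is just one more entry in the RTCPeerConnection configuration; the host and credentials below are placeholders, and KMS has a corresponding turnURL setting in its WebRtcEndpoint configuration file.

    // Placeholder TURN credentials; in production generate short-lived ones.
    const pc = new RTCPeerConnection({
      iceServers: [
        { urls: 'stun:stun.example.org:3478' },
        {
          urls: 'turn:turn.example.org:3478?transport=udp',
          username: 'webrtc-user',
          credential: 'secret',
        },
      ],
    });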
Can Kurento provide live streaming to a web page?
Sure! Check the tutorials and the documentation for a full list of features.
And what is the difference between using Red5 and Kurento?
Kurento is more than just a media server. It is a pluggable platform that offers computer vision and augmented reality capabilities on top of video and audio streaming, recording and playback. It also offers WebRTC out of the box, which is something Red5 can't do as of today.
Disclaimer: I'm part of the Kurento team.
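To give a flavor of that pluggability, here is a rough sketch (again with the Node kurento-client package; the KMS address and overlay image URL are placeholders) that drops one of Kurento's computer-vision filters, FaceOverlayFilter, into a WebRTC loopback.

    import kurentoClient from 'kurento-client';

    async function pipelineWithOverlay(sdpOffer: string): Promise<string> {
      const client = await kurentoClient('ws://localhost:8888/kurento');
      const pipeline = await client.create('MediaPipeline');

      const webRtc = await pipeline.create('WebRtcEndpoint');
      const faceOverlay = await pipeline.create('FaceOverlayFilter');

      // Draw an image over detected faces (offsets/scale relative to the face).
      await faceOverlay.setOverlayedImage(
        'https://example.com/hat.png', -0.35, -1.2, 1.6, 1.6);

      // Loopback through the filter: the publisher sees their own augmented
      // stream; viewers would connect additional WebRtcEndpoints.
      await webRtc.connect(faceOverlay);
      await faceOverlay.connect(webRtc);

      const sdpAnswer = await webRtc.processOffer(sdpOffer);
      await webRtc.gatherCandidates();
      return sdpAnswer;
    }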

Is it possible to use WebRTC to streaming video from Server to Client?

With WebRTC, I always see implementations of peer-to-peer and how to get video streaming from one client to another. What about server-to-client?
Is it possible for WebRTC to stream a video file from server to client?
(I am thinking about using WebRTC Native C++ API to create my own server application to connect to the current implementation on chrome or firefox browser client application.)
OK, if it is possible, will it be faster than many current video streaming services?
Yes, it is possible, as the server can be one of the peers in that peer-to-peer session.
If you respect the protocols and send the video in SRTP packets using VP8, the browser will play it. To help you build these components on other applications or servers, you can check this page and this project as a guide.
Now, comparing WebRTC with other streaming services... It will depend on several variables like the codec or the protocol. But, for instance, comparing WebRTC (SRTP over UDP with the VP8 codec) against Flash (RTMP over TCP with the H.264 codec), I would say that WebRTC wins.
The player will be Flash Player against the native <video> tag.
The transport would be TCP against UDP.
But of course, everything depends on what you are sending to the client.
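To make the "server is just another peer" point concrete, here is a rough sketch using the node-webrtc (wrtc) package; wrtc and its nonstandard RTCVideoSource are my own assumptions for illustration, not something this answer prescribes, and the signaling transport is left out.

    import { RTCPeerConnection, nonstandard } from 'wrtc';

    // The server is simply a peer: it answers the browser's offer and
    // attaches a server-side media source instead of a camera.
    export async function answerBrowser(offerSdp: string): Promise<string> {
      const pc = new RTCPeerConnection();

      // nonstandard.RTCVideoSource lets a Node process inject raw frames
      // (e.g. decoded from a file) as a WebRTC video track.
      const source = new nonstandard.RTCVideoSource();
      pc.addTrack(source.createTrack());

      await pc.setRemoteDescription({ type: 'offer', sdp: offerSdp });
      const answer = await pc.createAnswer();
      await pc.setLocalDescription(answer);

      // In a real app, push frames with source.onFrame({ width, height, data }).
      return pc.localDescription!.sdp; // return over your signaling channel
    }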
I have written some apps and plugins using the native WebRTC API, and there isn't a lot of information out there yet, but here are a few useful resources to get you started:
QT Example: http://research.edm.uhasselt.be/jori/qtwebrtc
Native to Browser example: http://sourcey.com/webrtc-native-to-browser-video-streaming-example/
I started with the WebRTC Native C++ to Browser Video Streaming Example, but it no longer builds with the current WebRTC native code.
Then I made modifications, merging both into a standalone process:
management of the peer connection (from the peerconnection_server)
access to Video4Linux capture (from the peerconnection_client)
Removing the stream from the browser to the WebRTC native C++ client gives a simple way to access a Video4Linux device from any WebRTC browser; it is available on GitHub as webrtc-streamer.
Live Demo
We are attempting to replace MJPEG with WebRTC in our server software and have a prototype module for doing this, using a smattering of components tied to the OpenWebRTC project. It has been an absolute bear to do, and we have frequent ICE negotiation errors (even over a simple LAN), but it mostly works.
We also built a prototype with the Google WebRTC module, but it had many dependencies. I find it easier to work with the OpenWebRTC modules because Google's stuff is so tightly tied to general peer-to-peer scenarios in the browser.
I compiled the following from scratch:
libnice 0.1.14
gstreamer-sctp-1.0
usrsctp
Then I have to interact with libnice a bit directly to gather candidates, and also write out the SDP files by hand. But the amount of control (being able to control the source of the pipeline) makes it worthwhile. The resulting pipeline (with two clients off one server source) was shown as a diagram, omitted here.
Of course. I'm writing a program using the native WebRTC API which can join a conference as a peer and record both video and audio.
See: How to stream audio from browser to WebRTC native C++ application
You can definitely stream media from a native app.
I'm sure you can use dummy_audio_file to stream audio from a local file, and you can find your own way to handle the video streaming process.
Yes, it is. We have developed a load test tool that publishes and plays streams for Ant Media Server. This tool can broadcast a media file. We used the same native WebRTC library that is used in Ant Media Server.
Sure it's possible; a server can convert a live stream to WebRTC, for example:
OBS/FFmpeg ---RTMP---> Server ---WebRTC--> Chrome/Client
In this scenario, the live stream can be played over WebRTC with ultra-low latency, about 600~800 ms. Please take a look at this demo.