Check media pipeline on kurento media server - webrtc

How can I check the media pipelines on Kurento Media Server in real time, and see how many media pipelines are opened or closed?

You can get statistics and the existing pipelines/elements through Kurento's ServerManager class (Java docs, JS docs). That class contains a method called getPipelines.
An example of implementing the Server Manager can be seen in the Kurento Monitor project.
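For a quick real-time check from code, here is a minimal sketch using the kurento-client Node module; the WebSocket URI is a placeholder for your own KMS endpoint:

```typescript
// Minimal sketch: list the media pipelines currently open on a Kurento Media
// Server via ServerManager.getPipelines(). Assumes the kurento-client Node
// module; ws://localhost:8888/kurento is a placeholder for your KMS endpoint.
import kurentoClient from 'kurento-client';

async function listPipelines(): Promise<void> {
  const client = await kurentoClient('ws://localhost:8888/kurento');
  const serverManager = await client.getServerManager();
  const pipelines = await serverManager.getPipelines();

  console.log(`Open pipelines: ${pipelines.length}`);
  for (const pipeline of pipelines) {
    // getChildren() lists the media elements created inside a pipeline.
    const elements = await pipeline.getChildren();
    console.log(`Pipeline ${pipeline.id}: ${elements.length} elements`);
  }

  client.close();
}

listPipelines().catch(console.error);
```

Polling this periodically gives you a running count of pipelines being opened and closed; if I recall the API correctly, the ServerManager also emits ObjectCreated/ObjectDestroyed events you can subscribe to instead.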

Related

How to keep streaming if initial client loses connection?

I'm working on an app that streams out multiple presenters via the Agora Live Streaming protocol. Everything works great so long as the person who started the live stream stays connected; however, if they lose internet, the stream stops, even if other presenters are still online.
Is there a way to tell the live stream to keep going until "stop live streaming" is called (or all presenters are offline)? My code can handle updating the transcoding config (e.g. video layout) when they go offline.
After multiple discussions with Agora Support, it appears the answer is no if you are only using the Web SDK; however, they are introducing a new server-side feature to make this possible.
It's currently in beta, so you'll have to ask Agora Support to enable it for your account, but once you've done so you can create and update an RTMP converter via their server-side API instead of relying on the client SDK to manage the stream: https://docs-preprod.agora.io/en/Interactive%20Broadcast/streaming_restful
I'm assuming you're using the startLiveStreaming method of the Agora Web SDK. You can attach event listeners on all hosts to track the primary host's online status; if the primary host (the one that calls the start method) goes offline, a secondary host can call the start (and transcoding) methods.
You can also use Agora RTM to signal this status.
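A rough sketch of that failover idea, assuming the Agora Web SDK v3 (agora-rtc-sdk); the push URL, uids, and layout are placeholders:

```typescript
// Hedged sketch: if the primary host (who started the RTMP push) drops,
// a designated secondary host restarts the push. Assumes Agora Web SDK v3;
// PUSH_URL and PRIMARY_HOST_UID are placeholders.
import AgoraRTC from 'agora-rtc-sdk';

const PUSH_URL = 'rtmp://your-cdn/live/stream-key'; // placeholder
const PRIMARY_HOST_UID = 1;                         // placeholder

const client = AgoraRTC.createClient({ mode: 'live', codec: 'h264' });

client.on('peer-leave', (evt: any) => {
  if (evt.uid === PRIMARY_HOST_UID) {
    // Rebuild the layout from the hosts still online, then take over the push.
    client.setLiveTranscoding({
      width: 640,
      height: 360,
      videoBitrate: 400,
      videoFramerate: 15,
      transcodingUsers: [
        { uid: 2, x: 0, y: 0, width: 640, height: 360, zOrder: 0, alpha: 1 },
      ],
    });
    client.startLiveStreaming(PUSH_URL, true); // true = transcoding enabled
  }
});
```

Whether via peer-leave events or an RTM signal, the key design point is that exactly one secondary host should be elected to restart the push, or you risk two hosts fighting over the same RTMP URL.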

Apache + Nginx Server With RTMP Live Streaming

I am using CWP Pro (Control Web Panel)
I have selected webserver = Apache + Nginx
I want to install RTMP and want to live stream on my website with obs studio.
My queries are:
Do I need to install NGINX even if I am using the Apache + Nginx web server?
Most tutorials / search results show an NGINX + RTMP installation guide. Do I need to install NGINX too, or only the RTMP module?
After installing RTMP, I created a URL for streaming (e.g. rtmp://my_ip_address/live/stream_key), added it in OBS Studio, and started OBS streaming. But I am stuck on the code to embed this live stream in an HTML page of my website. How can I embed it with a video player like video.js, or do you have other suggestions?
Please consider this solution in two parts:
CWP, the admin control dashboard, to manage your system and live streams.
Media System, the live streaming system, to publish with OBS and to play the live stream over suitable protocols.
Generally, there are HTTP callbacks and HTTP APIs between the two systems, so it's better to deploy and build them separately.
For the Media System, the general workflow is:
Generate the live stream URL with your CWP system, like the RTMP URL you mentioned.
Use an encoder, such as OBS, to publish the RTMP stream. RTMP is the protocol most widely used by encoders; SRT is an option, and WebRTC is now also able to publish a live stream, see this post.
Depending on your scenario, H5 or mobile, use a player to play the live stream. It's complex, but RTMP definitely doesn't work for playback in modern browsers; use HLS/HTTP-FLV/DASH/WebRTC instead, see this post and the embedding sketch below.
There are commercial solutions too, which do the same things.
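For the embedding part of the question, a minimal sketch with video.js, assuming your media server remuxes the RTMP ingest to an HLS URL (the .m3u8 URL below is a placeholder):

```typescript
// Hedged sketch: play the live stream in the browser with video.js over HLS.
// Assumes video.js 7+ (HLS playback built in) and a <video id="live-player"
// class="video-js"> element in the page; the .m3u8 URL is a placeholder.
import videojs from 'video.js';

const player = videojs('live-player', { liveui: true, autoplay: true, muted: true });
player.src({
  src: 'https://my_ip_address/live/stream_key.m3u8', // placeholder HLS URL
  type: 'application/x-mpegURL',
});
```

Note that the player consumes the HLS (or HTTP-FLV/DASH/WebRTC) playback URL, never the rtmp:// ingest URL you gave to OBS.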

Is it possible to save a video stream between two peers in WebRTC on the server, in real time?

Suppose I have 2 peers exchanging video with WebRTC. Now I need both of the streams to be saved as video files on the central server. Is it possible to do it in real time? (Storing/uploading the video from the peers is not an option.)
I thought of making a 3 node webRTC connection, with the 3rd node being the server. This way, I can screen record the 3rd node's stream or save it using some other way. But I am not sure about the reliability/feasibility of the implementation.
This is for a mobile application, and I would avoid any method that involves uploading/saving.
PS: I'm using Agora.io for the purpose of video-conference.
In my opinion, you can do it like the record demo: https://webrtc.github.io/samples/src/content/getusermedia/record/. Record each stream to blobs and push them to your server with a WebSocket, then convert the blobs to a WebM file, or just play them in a video element.
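A minimal sketch of that approach, assuming a WebSocket endpoint on your server (the URL is a placeholder):

```typescript
// Hedged sketch: record a MediaStream (local or remote) with MediaRecorder and
// push each chunk to the server over a WebSocket. The server can append the
// chunks to a file to obtain a playable WebM recording.
function streamToServer(stream: MediaStream): MediaRecorder {
  const ws = new WebSocket('wss://example.com/record'); // placeholder endpoint
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm;codecs=vp8' });

  recorder.ondataavailable = (event: BlobEvent) => {
    if (event.data.size > 0 && ws.readyState === WebSocket.OPEN) {
      ws.send(event.data);
    }
  };

  ws.onopen = () => recorder.start(1000); // emit a chunk roughly every second
  return recorder;
}
```

The same function works for a remote peer's stream, since MediaRecorder accepts any MediaStream, including one built from the tracks of an RTCPeerConnection.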
Agora doesn't offer on-premise recording out of the box, but they do provide the code for you to be able to launch your own on-premise recording using your own server. Agora has the code and instructions to deploy on GitHub: https://github.com/AgoraIO/Basic-Recording
The way it works: once you have set up the Agora Recording SDK, the client triggers the recording to start, via user interaction (button tap) or some other event (i.e. peer-joined or stream-subscribed); this causes the recording service to join the channel and record the streams. The service outputs the video file once recording has stopped.
You need a WebRTC media server.
WebRTC media servers make it possible to support more complex scenarios. They are servers that act as WebRTC clients but run on the server side, termination points for the media where we'd like to take action. Popular tasks done on WebRTC media servers include:
Group calling
Recording
Broadcast and live streaming
Gateway to other networks/protocols
Server-side machine learning
Cloud rendering (gaming or 3D)
The adventurous and strong-hearted will go and develop their own WebRTC media server. Most would pick a commercial service or an open source one. For the latter, check out these tips for choosing a WebRTC open source media server framework.
In many cases, the thing developers are looking for is support for group calling, something that almost always requires a media server. In that case, you need to decide if you'd go with the classic (and now somewhat old) MCU mixing model or with the more accepted and modern SFU routing model. You will also need to think a lot about the sizing of your WebRTC media server.
For recording WebRTC sessions, you can either do that on the client side or the server side. In both cases you'll be needing a server, but what that server is and how it works will be very different in each case.
If it is broadcasting you're after, then you need to think about the broadcast size of your WebRTC session.
Link: https://bloggeek.me/webrtc-server/

Adobe Media Server Alternative for VideoChat

I currently have a video chat app working on the web (Flash) and on Android via Adobe AIR. It uses Adobe Media Server (RTMP) as the backend for video streaming and shared objects. My question is whether there is another server or solution that provides many-to-many live video broadcast, maybe using the H.264 codec from Android and iOS, with some sort of user list and room list stored in a database or similar. I want to move away from Adobe, as it has many limitations on mobile devices.
Live video is crucial in 1 to many broadcasts that will have hundreds of viewers at the same time.
Thanks for reading!
Ulex.fr created an RTMP connector for Asterisk (the free PBX platform).
Used with the Asterisk conference application, it allows you to create conference rooms for one-to-many configurations, with audio and video. The only limitation is the power of your server. You can plan a scalable architecture in order to broadcast one video to many (where "many" could be unlimited). We developed a specific protocol to connect and manage the connection based on telephony events; we have also already done a direct RTMP connection that skips this protocol.
Everything ulex.fr has done on this project is free, open source, and GPL-licensed.
Get the full project here: https://github.com/voximal/asterisk-rtmp
(a live demo is available)
We have also developed an RTMP stack for Android with video (using the camera), which allows you to create your own application without using AIR.
You can check Adobe Cirrus; it's still in the beta stage (actually, IMHO, Adobe forgot about it), but it works on web, desktop, and mobile too. Check the Video Phone example; it can handle chat applications without a problem.
http://labs.adobe.com/technologies/cirrus/samples/
You could take a look at Red5 Media Server, which is an open source solution. There are other options like Wowza's solutions on AWS, but they come at a higher cost...
OK, as of today we have decided that we can manage the users, rooms, and messages via the Google Firebase Realtime Database, and the live video stream using Ant Media Server.
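For completeness, a sketch of the Firebase side of that setup, using the Firebase v9 modular SDK; the config and database paths are placeholders, and Ant Media Server still carries the video:

```typescript
// Hedged sketch: keep users/rooms/messages in the Firebase Realtime Database
// while the media server handles the live video. Config values are placeholders.
import { initializeApp } from 'firebase/app';
import { getDatabase, ref, set, onValue } from 'firebase/database';

const app = initializeApp({ databaseURL: 'https://your-app.firebaseio.com' });
const db = getDatabase(app);

// Publish a room so other clients can discover it in the lobby.
function createRoom(roomId: string, name: string): Promise<void> {
  return set(ref(db, `rooms/${roomId}`), { name, createdAt: Date.now() });
}

// Subscribe to the room list to render the lobby UI.
onValue(ref(db, 'rooms'), (snapshot) => {
  console.log('Rooms:', snapshot.val());
});
```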

Is it possible to use WebRTC to stream video from server to client?

In WebRTC, I always see implementations of peer-to-peer and how to get video streaming from one client to another. How about server-to-client?
Is it possible for WebRTC to stream a video file from server to client?
(I am thinking about using WebRTC Native C++ API to create my own server application to connect to the current implementation on chrome or firefox browser client application.)
OK, if it is possible, will it be faster than many current video streaming services?
Yes it is possible as the server can be one of the peers in that peer-to-peer session.
If you respect the protocols and send the video in SRTP packets using VP8, the browser will play it. To help you build these components on other applications or servers, you can check this page and this project as a guide.
Now, comparing WebRTC with other streaming services... It will depend on several variables like the Codec or the protocol. But, for instance, comparing WebRTC (SRTP over UDP with VP8 Codec) against Flash (RTMP over TCP with H264 Codec), I would say that WebRTC wins.
The player will be Flash Player against the native <video> tag.
The transport would be TCP against UDP.
But of course, everything depends on what you are sending to the client.
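To make the "server can be one of the peers" point concrete, here is a hedged sketch in Node using the node-webrtc (wrtc) package; the signaling transport and the media source are assumptions, not part of the original answer:

```typescript
// Hedged sketch: a Node process acting as the answering peer. How you exchange
// SDP (signaling) and how you produce media frames are up to your application.
import { RTCPeerConnection } from 'wrtc';

async function answerBrowserOffer(offerSdp: string): Promise<string> {
  const pc = new RTCPeerConnection();

  // Attach video/audio tracks here (e.g. from a file decoder or capture
  // source) before answering; producing frames is implementation-specific.

  await pc.setRemoteDescription({ type: 'offer', sdp: offerSdp });
  const answer = await pc.createAnswer();
  await pc.setLocalDescription(answer);
  return pc.localDescription!.sdp; // return this to the browser via signaling
}
```

From the browser's point of view this server is just another peer, which is why the native `<video>` tag plays it with no plugin.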
I have written some apps and plugins using the native WebRTC API, and there isn't a lot of information out there yet, but here are a few useful resources to get you started:
QT Example: http://research.edm.uhasselt.be/jori/qtwebrtc
Native to Browser example: http://sourcey.com/webrtc-native-to-browser-video-streaming-example/
I started with the WebRTC Native C++ to Browser Video Streaming Example, but it does not build anymore with the current WebRTC native code.
Then I made modifications, merging it into a standalone process that handles:
management of the peer connection (the peerconnection_server)
access to Video4Linux capture (the peerconnection_client).
Removing the stream from the browser to the WebRTC native C++ client gives a simple way to access a Video4Linux device from a WebRTC browser; it is available on GitHub as webrtc-streamer.
Live Demo
We are attempting to replace MJPEGs with WebRTC for our server software and have a prototype module for doing this using a smattering of components tied to the OpenWebRTC project. It has been an absolute bear to do, and we have frequent ICE negotiation errors (even over a simple LAN), but it mostly works.
We also built a prototype with the Google WebRTC module, but it had many dependencies. I find it easier to work with the OpenWebRTC modules because Google's stuff is so tightly tied to general peer-to-peer scenarios in the browser.
I compiled the following from scratch:
libnice 0.1.14
gstreamer-sctp-1.0
usrsctp
Then I have to interact with libnice a bit directly to gather candidates, and also have to write out the SDP files by hand. But the amount of control, being able to control the source of the pipeline, makes it worthwhile. The resulting pipeline runs two clients off one server source.
Of course. I'm writing a program using the native WebRTC API which can join the conference as a peer and record both video and audio.
See: How to stream audio from browser to WebRTC native C++ application
You can definitely stream media from a native app.
I'm sure you can use dummy_audio_file to stream audio from a local file, and you can find your own way to access the video streaming process.
Yes, it is. We have developed a load test tool to publish and play streams for Ant Media Server. This tool can broadcast a media file. We used the same native WebRTC library that is used in Ant Media Server.
Sure, it's possible; a server can convert a live stream to WebRTC, for example:
OBS/FFmpeg ---RTMP---> Server ---WebRTC--> Chrome/Client
In this scenario you get ultra-low-latency live streaming, about 600~800 ms, by playing the live stream over WebRTC. Please take a look at this demo.