I am building a web-based project which has a one-way webcam broadcasting part (a user can open their own cam, and viewers can join their room only to view and listen).
So I have decided to use Kurento Media Server (KMS), because I don't have any experience with Flash.
My questions:
Do I need anything extra besides KMS to let a user broadcast their webcam?
Can Kurento provide live streaming to a web page?
And what is the difference between using Red5 and Kurento?
Thanks in advance
Do I need anything extra besides KMS to let a user broadcast their webcam?
You'll probably need a TURN server for users whose networks block the ports WebRTC needs.
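For context, on the browser side the TURN server is just another entry in the RTCPeerConnection ICE server list. A minimal sketch, assuming a coturn deployment at a placeholder address with placeholder credentials:

    // Minimal sketch: passing STUN and TURN servers to the browser's RTCPeerConnection.
    // The URLs and credentials are placeholders for your own deployment.
    const pc = new RTCPeerConnection({
      iceServers: [
        { urls: 'stun:stun.example.org:3478' },
        {
          urls: 'turn:turn.example.org:3478',
          username: 'demo-user',
          credential: 'demo-secret'
        }
      ]
    });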
Can Kurento provide live streaming to a web page?
Sure! Check the tutorials and the documentation for a full list of features.
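To give a feel for the API, here is a rough sketch of the one-to-many case in Node.js with the kurento-client library. It loosely follows the official one-to-many tutorial; the WebSocket URI is a placeholder, and the signaling and ICE exchange with the browsers are omitted:

    // Sketch: one presenter WebRtcEndpoint connected to one WebRtcEndpoint per viewer.
    // Error handling and the trickle-ICE exchange with the browsers are omitted.
    const kurentoClient = require('kurento-client');

    async function startBroadcast(presenterSdpOffer) {
      const client = await kurentoClient('ws://localhost:8888/kurento');
      const pipeline = await client.create('MediaPipeline');

      // Presenter endpoint: receives the broadcaster's webcam/mic stream.
      const presenter = await pipeline.create('WebRtcEndpoint');
      const presenterSdpAnswer = await presenter.processOffer(presenterSdpOffer);
      await presenter.gatherCandidates();

      // Each viewer gets its own endpoint, fed by the presenter.
      async function addViewer(viewerSdpOffer) {
        const viewer = await pipeline.create('WebRtcEndpoint');
        const viewerSdpAnswer = await viewer.processOffer(viewerSdpOffer);
        await viewer.gatherCandidates();
        await presenter.connect(viewer);
        return viewerSdpAnswer;
      }

      return { presenterSdpAnswer, addViewer };
    }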
And what is the difference between using Red5 and Kurento?
Kurento is more than just a media server. It is a pluggable platform that offers computer vision and augmented reality capabilities on top of video and audio streaming, recording, and playback. It also offers WebRTC out of the box, which is something Red5 can't do as of today.
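As an illustration of the pluggable part, filters and recorders are just additional elements dropped into the same pipeline. A sketch that continues from the broadcast example above; the overlay image URI and the recording path are placeholders:

    // Sketch: attach a computer-vision filter and a recorder to a Kurento pipeline.
    // `pipeline` and `presenter` are the MediaPipeline and presenter WebRtcEndpoint
    // created in the previous sketch.
    async function addOverlayAndRecording(pipeline, presenter) {
      const faceOverlay = await pipeline.create('FaceOverlayFilter');
      await faceOverlay.setOverlayedImage(
        'http://example.org/hat.png', // image drawn on top of detected faces
        -0.35, -1.2, 1.6, 1.6         // offsets and size relative to each face
      );

      const recorder = await pipeline.create('RecorderEndpoint', {
        uri: 'file:///tmp/broadcast.webm'
      });

      // presenter -> filter -> recorder
      await presenter.connect(faceOverlay);
      await faceOverlay.connect(recorder);
      await recorder.record();
    }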
Disclaimer: I'm part of the Kurento team.
We have a mobile application that has historically used RTSP streaming to let a user watch a live stream, which is currently published via Wowza Streaming Engine. We have needed to lower stream latency, so we have gravitated towards WebRTC to achieve this.
The problem is that there seems to be a lack of documentation or examples regarding the implementation of a react-native WebRTC viewer that connects to a remote stream.
Does anyone out there have any documentation, or code examples for this kind of implementation?
I do note there is a react-native-webrtc library; however, all the examples demonstrate connecting two peers on mobile phones with their video cameras, i.e. like FaceTime. We are after an example demonstrating someone on a phone connecting to a remote streaming server with a video feed.
Cheers,
If you want a WebRTC client to connect to a server, you need a server that speaks WebRTC with signaling that fits your needs. WebRTC doesn't care which signaling you use, so you have to choose it yourself, or choose the platform that provides it.
There are a lot of different media servers and libraries that support WebRTC on the server side, each with its own signaling (e.g. FreeSWITCH, Kurento), or with no signaling at all (e.g. mediasoup). Few of them will have a React Native version, as media streaming is not really handled on the JavaScript/UI side, but you can do something with the react-native-webrtc library.
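As a rough illustration, the receive-only side with react-native-webrtc is a normal RTCPeerConnection; everything named `signaling` below is an assumption, because the SDP/ICE exchange depends entirely on the server you choose:

    // Sketch: a receive-only WebRTC viewer using react-native-webrtc.
    // `signaling` is a placeholder for whatever transport your media server
    // expects (WebSocket, REST, ...) -- it is not part of the library.
    import { RTCPeerConnection, RTCSessionDescription } from 'react-native-webrtc';

    async function watchRemoteStream(signaling) {
      const pc = new RTCPeerConnection({
        iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
      });

      // We only want to receive media from the server.
      pc.addTransceiver('audio', { direction: 'recvonly' });
      pc.addTransceiver('video', { direction: 'recvonly' });

      const remoteStream = new Promise((resolve) => {
        // Older react-native-webrtc versions expose onaddstream instead of ontrack.
        pc.ontrack = (event) => resolve(event.streams[0]);
      });

      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);

      // Hand the offer to the server however it expects, and get its answer back.
      const answerSdp = await signaling.sendOfferAndWaitForAnswer(pc.localDescription.sdp);
      await pc.setRemoteDescription(new RTCSessionDescription({ type: 'answer', sdp: answerSdp }));

      // Render the resulting stream with: <RTCView streamURL={stream.toURL()} />
      return remoteStream;
    }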
Twilio supports a lot of platforms and could be a good start if you are looking for a ready-to-use solution.
I'm new to WebRTC and JavaScript. I'm trying to build a video chat application with recording functionality on the server. Currently, I use EasyRTC as a WebRTC wrapper to provide the video chat functionality, and it's working great. I also set up a TURN server in the cloud using coturn and use it in the EasyRTC config.
I would now like to add video recording on the server and have learned that this is achieved via a media server. I'm keeping an eye on Kurento for this.
I'm just confused about media servers in general.
Can a media server replace a TURN server?
If both a TURN server and a media server are required, can Kurento be installed on the same machine as coturn?
Can I keep EasyRTC and add Kurento for video recording? If yes, how can Kurento record the video stream from EasyRTC/coturn? I would appreciate pseudocode if possible.
Am I on the right track? Any other advice to consider?
I would highly appreciate your comments.
Thank you!
I currently have a video chat app working on the web (Flash) and on Android via Adobe AIR. It uses Adobe Media Server (RTMP) as the backend for video streaming and shared objects. My question is whether there is another server or solution that provides many-to-many live video broadcasting, ideally using the H.264 codec from Android and iOS, with some sort of user list and room list stored in a database or similar. I want to move away from Adobe as it has many limitations on mobile devices.
Live video is crucial: the one-to-many broadcasts will have hundreds of viewers at the same time.
Thanks for reading!
Ulex.fr created an RTMP connector for Asterisk (the free PBX platform).
Used with the Asterisk conference application, it allows you to create conference rooms for one-to-many configurations, with audio and video. The only limitation is the power of your server. You can plan a scalable architecture in order to broadcast one video to many viewers (potentially unlimited). We developed a specific protocol to connect and manage the connection based on telephony events. I think we have also already implemented a direct RTMP connection that skips this protocol.
Everything in this project by ulex.fr is free, open source, and GPL-licensed.
Get the full project here: https://github.com/voximal/asterisk-rtmp
(a live demo is available)
We have already developed an RTMP stack for Android with video (using the camera), which allows you to create your own application without using AIR.
You can check Adobe Cirrus; it's still in the beta stage (actually, IMHO, Adobe forgot about it), but it works on web, desktop and mobile too. Check this Video Phone example; it can handle chat applications without a problem.
http://labs.adobe.com/technologies/cirrus/samples/
You could take a look at Red5 Media Server, which is an open-source solution. There are other options like Wowza's solutions on AWS, but they come at a higher cost...
OK, as of today we have decided to manage the users, rooms and messages via the Google Firebase Realtime Database, and the live video stream using Ant Media Server.
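For anyone following the same route, the room/user bookkeeping in the Realtime Database only takes a few refs. A sketch; the data layout and field names are our own choice, not anything required by Firebase or Ant Media:

    // Sketch: keep rooms and users in the Firebase Realtime Database while
    // Ant Media Server carries the actual video.
    import { initializeApp } from 'firebase/app';
    import { getDatabase, ref, push, set, onValue } from 'firebase/database';

    const app = initializeApp({ /* your Firebase project config */ });
    const db = getDatabase(app);

    // Create a room and remember which Ant Media stream id it maps to.
    function createRoom(name, streamId) {
      const roomRef = push(ref(db, 'rooms'));
      return set(roomRef, { name, streamId, createdAt: Date.now() });
    }

    // Listen for the live room list to populate the lobby UI.
    function watchRooms(onRooms) {
      onValue(ref(db, 'rooms'), (snapshot) => onRooms(snapshot.val() || {}));
    }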
Short version:
I need an in-browser solution to deliver the webcam and mic streams to a server.
Long version:
I'm trying to create a live streaming application. So far I've only managed to figure out this workflow:
Client creates stream (some transcoder is probably required here)
Client sends (publishes?) the stream to the server (basically hosts an RTMP/other stream that should be accessible by my server)
Server transcodes, transrates, etc. and publishes the stream to a CDN
Viewers watch published stream
Ideally, I'd like a browser-based solution that requires minimal setup from the client's end (a Flash plugin download might be acceptable) and streams the webcam and mic inputs to the server. I'm either unaware of the precise keywords or am looking for the wrong thing, but I can't find an apt solution.
Solutions that involve using ffmpeg or vlc to publish a stream aren't really what I'm looking for, since they require additional download and setup, and aren't restricted to just webcam and mic inputs. WebRTC probably won't serve the same quality but if all else fails, I think it can get the job done, at least for some browsers.
I'm using Ubuntu for development and have just activated a trial license for Wowza streaming server and cloud.
Is ffmpeg/vlc et al. the only way out? Or is there something that can do the job in a single browser tab?
If you go the RTMP way, Adobe Flash Player supports H.264 encoding directly. Since you mentioned Wowza, you can find an example and complete source code (including the .fla) in its examples directory. There's also a demo here. There are many other open-source Flash capture plugins.
You can also use the aforementioned Flash recorder without Wowza. In this case you'll need an RTMP server, a notable example being the Nginx RTMP module, which supports recording (to FLV) and also offers callbacks that let you launch the transcoding once the recording is done.
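For reference, a sketch of what that nginx-rtmp-module setup can look like; paths and the callback URL are placeholders:

    # Sketch: record incoming RTMP publishes to FLV and notify an HTTP endpoint
    # when the recording finishes, so transcoding can be kicked off there.
    rtmp {
        server {
            listen 1935;

            application live {
                live on;
                record all;
                record_path /var/recordings;
                record_unique on;

                # Called once the FLV file is closed.
                on_record_done http://localhost:8080/hooks/record-done;
            }
        }
    }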
With WebRTC you can either record small media chunks (getUserMedia, MediaStreamRecorder) and send them to the server, where they get concatenated, or use the peer-to-peer communication features of WebRTC (RTCPeerConnection). For a detailed overview see my answer here.
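The chunked-recording approach can be sketched as follows, using the now-standard MediaRecorder API in place of the MediaStreamRecorder library; the /upload-chunk endpoint is an assumption for illustration:

    // Sketch: capture webcam+mic with getUserMedia, record WebM chunks with
    // MediaRecorder, and POST each chunk to the server for concatenation.
    async function startChunkedUpload() {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
      const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });

      recorder.ondataavailable = async (event) => {
        if (event.data.size === 0) return;
        await fetch('/upload-chunk', {
          method: 'POST',
          headers: { 'Content-Type': 'video/webm' },
          body: event.data
        });
      };

      // Emit a chunk every 5 seconds.
      recorder.start(5000);
      return () => { recorder.stop(); stream.getTracks().forEach((t) => t.stop()); };
    }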
In both cases you'll have issues with devices/browsers that don't support Flash or WebRTC, e.g. iPhones and Safari. Plus, getUserMedia doesn't capture the same format across all browsers: Firefox records audio/video in WebM, while Chrome records audio in WAV and video in WebM.
For mobile devices you'll probably have to write apps.
First, I just want to know: what is the difference between FreeSWITCH and Red5?
I have very good working experience with Red5 and have made many apps that stream video/audio using it.
But now I am not able to understand: if Red5 can do video/audio conferencing and stream live video, then what is the use of FreeSWITCH in conferencing or other things?
I want to make an app in PHP, Rails, or Django (Python) where users can record their voice by participating in a conference, and while recording, the voice of all users is broadcast to the other members.
So now I really want to know: what would be the right solution for this?
FreeSWITCH is primarily a telephony application server, so it is oriented towards solving telephony tasks. There is also support for WebRTC, and some work is being done on video conferencing.
What you can easily do with FreeSWITCH is allow users to join your Red5 conference from the telephony network.