How to play a UDP stream in the browser? - udp

I'm new to Stack Overflow and I have a question: I want to make an online TV website. I've got the UDP streaming links for each TV channel (e.g. udp://#225.1.2.249:30120), but I can't find a way to play that stream in a video player in the browser. (I've already tried Video.js and hls.js, but neither works, although the stream plays easily in VLC.)
I know that my grammar is terrible; I hope you'll forgive that.

You don't. Browsers support HTTP, WebSockets, and WebRTC. Any other protocol requires a browser plugin.
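For completeness, here is a minimal sketch of the browser side once the UDP stream has been repackaged into an HTTP-deliverable format on a server (for example, HLS produced by ffmpeg or a similar tool). The playlist URL and element id below are placeholders, and the page is assumed to have already loaded the hls.js library (global Hls).

// Hypothetical setup: a server has already turned the UDP multicast feed into an
// HLS playlist reachable over HTTP; this shows only the playback side.
const video = document.getElementById('player');            // placeholder element id
const src = 'https://example.com/live/channel1/index.m3u8'; // placeholder playlist URL

if (window.Hls && Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(video);
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  // Safari can play HLS natively without hls.js.
  video.src = src;
}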

Related

Using WebRTC for audio broadcast

I'm trying to stream a microphone/audio feed to multiple clients.
The broadcaster is a screenless Raspberry Pi, so I can't open a web browser and click on "share microphone".
The clients will be using their smartphones to listen.
The latency must be very low.
I did not find any WebRTC demo that worked. All of them are either p2p, or the scalable broadcasting demo from Muaz Khan works only for the initiator, not the clients.
I came across Janus (though I didn't really understand what exactly it does), but I don't get how to install or configure it.
Is there any way to easily share the microphone's output via WebRTC? Something like Apache hosting a simple website where the microphone audio is served?
Thanks for all the ideas on how to solve this!
Is there any way to easily share the microphone's output via WebRTC?
No. There's nothing easy or simple about WebRTC.
The broadcaster is a screenless Raspberry Pi, so I can't open a web browser and click on "share microphone"
This is the simplest option... running a browser. Are you sure you actually need to click to allow it access to the audio device?
In the past, I've used a flag on Chromium to get around this problem. I don't remember exactly what that flag was, but looking at the list, it might have been...
--use-fake-ui-for-media-stream
You might also be able to use --enable-kiosk-mode.
At a minimum, if you were to open the browser interactively and enable access, that page would get automatic access in the future.
I did not find any WebRTC demo that worked. All of them are either p2p
WebRTC is peer-to-peer, but remember that the "server" can be one of those "peers".
Finally, you can look into using GStreamer, but don't expect anything quick and easy. https://github.com/centricular/gstwebrtc-demos
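As a rough illustration of this answer (not a complete solution), here is what the page loaded by the Pi's headless browser might look like, assuming the Chromium flag above suppresses the permission prompt; how the offer actually reaches the listeners (the signaling) is left out entirely.

// Hypothetical broadcaster page for the headless Chromium on the Pi.
const pc = new RTCPeerConnection();

navigator.mediaDevices.getUserMedia({ audio: true, video: false })
  .then((stream) => {
    // Attach the microphone track(s) to the peer connection.
    stream.getAudioTracks().forEach((track) => pc.addTrack(track, stream));
    return pc.createOffer();
  })
  .then((offer) => pc.setLocalDescription(offer))
  .then(() => {
    // Deliver pc.localDescription to the listening clients over your own signaling channel.
    console.log(JSON.stringify(pc.localDescription));
  })
  .catch((err) => console.error('getUserMedia failed:', err));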

How to capture a live stream from a camera and watch it live or save it for later viewing?

I am trying to make an online examination portal. When students start the exam, their webcam will start automatically, record the stream live, and store it on the server. Invigilators will either watch the students live or watch the saved streams later.
I researched this and found WebRTC as a possible solution, along with a gateway server like Kurento. But I later found out that WebRTC is not supported in Safari, which is a setback! My application should run successfully in any modern browser, including Safari, and also on Android or iPhone.
So can anyone suggest a possible solution to my problem? Which technology should I use that can support all browsers and operating systems?
Also, it would be helpful if you can provide links to good documentation or tutorials.
Note from the future (2020): This answer really isn't accurate anymore.
WebRTC is one problem... capture from the camera with getUserMedia is another. Safari doesn't support either.
There is currently no video capture API in Safari. The only thing you can do is make a native app for iOS.
Worse yet, because of Apple's restrictive policies, alternative browsers, such as Chrome, are crippled on iOS as they aren't allowed to use their own browser engines.
Use standards based technologies like getUserMedia and WebRTC for your primary web-based application. If you decide that the economics of your situation enable it, you can make an iOS app to work alongside until Apple decides to participate in modern browser standards like everyone else.
You can use MediaDevices.getUserMedia (https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia) to capture the webcam stream in the browser (Chrome and Firefox).
To work with the webcam stream on Safari, you would have to use a polyfill - https://github.com/Temasys/AdapterJS
To record the video/audio stream, you can make use of the MediaRecorder API: https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder (Note: recording the stream is still a challenge in Safari, as there is no support/polyfill. However, it works perfectly in recent versions of Chrome and Firefox).
Helpful demonstrations :
https://webrtc.github.io/samples/
https://mozdevs.github.io/MediaRecorder-examples/index.html
https://codepen.io/collection/XjkNbN/
https://hacks.mozilla.org/2016/04/record-almost-everything-in-the-browser-with-mediarecorder/
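To make the flow described in this answer concrete, here is a minimal sketch of capturing the webcam with getUserMedia and recording it with MediaRecorder; the duration, MIME type, and how the resulting Blob is uploaded are placeholders, and (as noted above) this will not work in Safari.

async function recordWebcam(durationMs) {
  // Capture camera and microphone (Chrome/Firefox).
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
  const chunks = [];

  recorder.ondataavailable = (e) => { if (e.data.size > 0) chunks.push(e.data); };
  const stopped = new Promise((resolve) => { recorder.onstop = resolve; });

  recorder.start();
  setTimeout(() => recorder.stop(), durationMs);
  await stopped;

  // The Blob can then be uploaded to the server, e.g. with fetch() and FormData.
  return new Blob(chunks, { type: 'video/webm' });
}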

WebRTC -- can getUserMedia use local stream?

I want WebRTC to encode and play an H.264 (NAL) stream from a local file.
In the WebRTC tutorials, getUserMedia is used to get the local camera connected to the system; I don't know whether getUserMedia supports capturing a local stream file such as an H.264 stream.
If it doesn't work that way, maybe I should modify the WebRTC source code (I'm studying it).
Here is the question: if I change the WebRTC code, how can I integrate the new code into the browser? Make it a plugin?
Firefox supports an extension to the <video> element that you can use to do this.
First, set the source of a video element:
v1.src = "file:///...";
Then you can call the (currently prefixed) mozCaptureStream or mozCaptureStreamUntilEnded function to get a MediaStream.
stream = v1.mozCaptureStream();
The proposed specification.
Note however that you need to ensure that the file is same-origin with respect to the page; otherwise your MediaStream isn't going to be accessible to you, and the same-origin rules for file:/// URLs are probably going to cause issues. One way to ensure that is not to set the location directly, but to load the file using an <input type="file"> element.
As noted in other answers, Firefox currently only supports the baseline profile of H.264.
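Putting the pieces of this answer together, a minimal sketch (assuming the page already contains an <input type="file"> and a <video> element) might look like this; it uses the unprefixed captureStream() where available and falls back to Firefox's prefixed mozCaptureStream().

const input = document.querySelector('input[type="file"]');
const v1 = document.querySelector('video');

input.addEventListener('change', () => {
  const file = input.files[0];
  // Loading via a blob: URL avoids the file:/// same-origin problem mentioned above.
  v1.src = URL.createObjectURL(file);
  v1.play();

  const stream = v1.captureStream ? v1.captureStream() : v1.mozCaptureStream();
  // stream is a MediaStream whose tracks can be fed to an RTCPeerConnection.
});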
First, you are right: getUserMedia will not work for you. However, there are a couple of options.
Hack a stream together using RTCDataChannel, breaking up the media stream, delivering each packet, and then handling it on the client side (a rough sketch follows this answer).
Take a look at this demo for prerecorded media streams. I do not believe that H.264 is addressed, but it could help you on your way (probably Firefox only).
Use some sort of native WebRTC breaker/endpoint to stream the file. I know specifically that others (including myself) have streamed H.264 to Firefox through the Janus-Gateway.
A couple of asides:
Firefox only supports the baseline profile when streaming H.264 over a WebRTC PeerConnection.
Chrome does not support H.264 for WebRTC at all.
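A rough sketch of the RTCDataChannel option from the first point above, showing only the sending side; the 'media' label, chunk size, and end-of-file marker are arbitrary, and reassembling the chunks on the receiving side (e.g. into a MediaSource buffer) is not shown.

const pc = new RTCPeerConnection();
const channel = pc.createDataChannel('media'); // arbitrary label
const CHUNK_SIZE = 16 * 1024; // stay well under typical SCTP message limits

async function sendFile(file) {
  const buffer = await file.arrayBuffer();
  for (let offset = 0; offset < buffer.byteLength; offset += CHUNK_SIZE) {
    // A real implementation should watch channel.bufferedAmount for backpressure.
    channel.send(buffer.slice(offset, offset + CHUNK_SIZE));
  }
  channel.send('EOF'); // arbitrary end-of-file marker
}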
Are you trying to have getUserMedia return an H.264-encoded stream?
In that case, today it is only possible with Firefox, in a specific environment (Cisco's H.264 plugin installed), and only for the baseline profile.
Chrome promised in November to add this capability, but there is no timeline that I know of. Expect at least Q2 2015.
Using our (Temasys) commercial plugin you will soon be able to do that in IE and Safari.
Those are the only options on the client side I can think of. On the server side you can use whatever you want to transcode, including Janus, Kurento, PowerMedia, Licode/Lynckia, ...
Note: using other means like the DataChannel or WebSockets is fine for transferring files, but it would greatly reduce the user experience, as you would not have the recovery (and security) mechanisms included in SRTP and DTLS, and would also not have the media-specific enhancements that are built into WebRTC, such as jitter buffers and NetEQ.

All the examples of WebRTC are video chat; is it possible to send any type of video over WebRTC?

So I want to be able to send a normal video from a video file (AVI or any other) through WebRTC; can that be done? The only examples I see of WebRTC are video chats, so I feel as if it's only geared towards webcams and chat.
So my question is: technically, can a normal video from a video file (not a webcam) be sent over WebRTC?
Try: "Pre-recorded media streaming" --- Documentation and Source Code.
This experiment uses the MediaSource API to render Blobs in a <video> element. The experiment has some issues that need to be fixed, e.g. it can't send longer WebM videos.
You can try this experiment as well.
The codecs typically used in AVI are not directly supported by WebRTC clients, but if you are writing your own standalone client then of course it could read an AVI or other video file, transcode it to VP8 video and Opus audio (or whatever other codecs you were able to negotiate), and transmit it via RTP. If you are trying to do video transcoding in JavaScript in a browser, that will be very slow.
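In current browsers, the closest in-page equivalent of that "standalone client" idea is to play a file the browser can already decode (e.g. WebM or MP4, not AVI) in a <video> element, capture its playback, and attach the tracks to a peer connection; this is only a sketch, and the offer/answer signaling is omitted.

const video = document.querySelector('video');
const pc = new RTCPeerConnection();

video.addEventListener('playing', () => {
  // Capture the element's playback as a MediaStream and send its tracks.
  const stream = video.captureStream ? video.captureStream() : video.mozCaptureStream();
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
  // createOffer()/setLocalDescription() and your signaling would follow here.
});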

Lots of noise in WebRTC audio/video

I have developed a video chat app with the WebRTC API. I followed the steps given by WebRTC. Video works fine, but there is a lot of noise from my laptop; the sound is not clear.
But Google's demo site https://apprtc.appspot.com/ works without any noise (better compared with ours).
I followed the same procedure they did, but no luck.
With a headset this echo is not audible; it happens when we play the sound through the laptop speakers without a headset.
Please give me some suggestions on this.
Thanks in advance. Looking forward to your response.
Take a look at this demo of WebRTC conferencing that supports four callers. This link describes the implementation details and architecture.
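One commonly used mitigation, not mentioned in the answer above, is to explicitly request the browser's built-in echo cancellation and noise suppression when calling getUserMedia; whether it removes the speaker echo depends on the device.

navigator.mediaDevices.getUserMedia({
  audio: {
    echoCancellation: true,
    noiseSuppression: true,
    autoGainControl: true
  },
  video: true
}).then((stream) => {
  // Attach the stream to the RTCPeerConnection as usual.
}).catch((err) => console.error(err));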