Azure Media Services - Can't connect to RTMP

I wanted to stream from my mobile device (iPhone 7 Plus, iOS 14.1) with the Larix and GoPro apps to Azure Media Services:
rtmp://test-livestream-usso.channel.media.azure.net:1935/live/xxxxxxxxxxxxxx70af
Neither app can connect to the RTMP endpoint on Azure. When I type the RTMP URL into OBS, it works perfectly. Any idea what the issue could be here?
Best,
Yanick

OBS and a lot of other encoders automatically append a stream key to the end of the path.
Just add an additional path segment (e.g. /whatever) after the GUID to make it work.
Also make sure you are not sending more than 30 fps; we only accept up to 1080p at 30 fps.
BTW, my GoPro Hero 8 works just fine after adding an additional stream key segment after the GUID in the ingest path.
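For example, with the ingest URL from the question (the GUID is shortened exactly as shown there), appending a hypothetical stream key named mystream would give:

rtmp://test-livestream-usso.channel.media.azure.net:1935/live/xxxxxxxxxxxxxx70af/mystream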

Related

How to prepare a live video stream to be fed to an HTML5 Video MediaSource?

I am transferring a live video stream from a server to a JavaScript function in a client browser:
server: gstreamer x264enc-hardware ! whatever-I-want ! appsink
=== transfer of data stream with a proprietary protocol ===>
HTML5 browser client: javascript function receives data sent by the appsink
In other words, I'm trying to display an H.264 live stream created on a server and carried over a proprietary transfer protocol, with the data re-appearing in a JavaScript function inside an HTML5 browser.
I was thinking of using Media Source Extensions (MSE) in the browser to decode the H.264 and display the picture.
Note that the video stream settings (video only, resolution, bandwidth) are fixed and known on both sides, so everything can be hard-coded; the purpose is not to implement a generic solution.
What could I use on the server side (as a replacement for the "whatever-I-want" GStreamer element) so that the work in the HTML5 browser is not too complicated?
One solution would be to do nothing on the server side and use the broadway.js library to decode the H.264 NAL units in JavaScript, but that obviously doesn't leverage MediaSource and the decoding capability of the browser.
Could I use GStreamer's avmux_dash and hope that MediaSource can ingest the transmitted data?
Alternatively, how could I create "MP4 fragments", and could MediaSource read them "easily"?
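To make the target concrete, here is a minimal sketch of the browser side, assuming the server can be made to emit fragmented MP4 (an init segment followed by moof/mdat fragments); onFragment is a hypothetical hook where my proprietary transport would hand over each chunk:

// Hypothetical transport hook: delivers the fMP4 init segment first, then fragments.
declare function onFragment(cb: (chunk: ArrayBuffer) => void): void;

const video = document.querySelector('video') as HTMLVideoElement;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  // The codec string must match what the encoder actually produces.
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
  const queue: ArrayBuffer[] = [];
  const pump = () => {
    // appendBuffer must not be called while a previous append is in flight.
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift()!);
    }
  };
  sourceBuffer.addEventListener('updateend', pump);
  onFragment((chunk) => {
    queue.push(chunk);
    pump();
  });
});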
One approach, which has been used by some major players in the past to translate from one streaming protocol to another, is to receive the stream in your proprietary transfer protocol and then re-package it into HLS or DASH on a local server on the device.
You can then stream from that local host to a regular HLS or DASH player on the device.
It sounds inefficient (it is inefficient), but it works, even on mobile devices with their lower processing and power capabilities.
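For example, if the local server exposed the repackaged stream as HLS at a hypothetical http://127.0.0.1:8080/live.m3u8, the playback side could be as small as this sketch (using the hls.js library where the browser has no native HLS support):

import Hls from 'hls.js';

const video = document.querySelector('video') as HTMLVideoElement;
const localUrl = 'http://127.0.0.1:8080/live.m3u8'; // hypothetical local playlist

if (Hls.isSupported()) {
  // hls.js fetches the playlist and feeds segments into MSE for you.
  const hls = new Hls();
  hls.loadSource(localUrl);
  hls.attachMedia(video);
} else {
  // Safari plays HLS natively through the video element.
  video.src = localUrl;
}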

How to turn a webcam into an RTSP stream

I have a product that can analyze video once given an RTSP URL.
I would like to use a webcam as the stream and feed my product the webcam's RTSP URL.
How can I do that?
It will depend on the webcam you are using - most support RTSP, but many do not publish the interface to access the stream, as they are designed to be used with the webcam's own companion app.
There are some web resources that list the RTSP URLs for common webcams. You may find it hard to find an exact match as new webcam models roll out, but they should give you a feel for how to access a vendor's camera if you have a specific model to test against. Some examples (at the time of writing):
https://www.getscw.com/decoding/rtsp
https://soleratec.com/get-support/rtsp/
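The listed URLs tend to follow a common shape; a purely illustrative example of a typical vendor URL is:

rtsp://user:password@192.168.1.10:554/stream1

where 554 is the default RTSP port and the path segment varies per vendor.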
If you can't find the info for the camera you are using, and you have the companion app, you can also use a network sniffer tool like Wireshark (https://www.wireshark.org) and search the traffic for the 'rtsp://' pattern.
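For instance, once a capture is running, a display filter such as the following will usually surface the RTSP exchange (the second variant catches the URL itself in any packet):

rtsp
frame contains "rtsp://"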
If you just need to test your app and have access to a Raspberry Pi with a camera module, you can use that to generate an RTSP stream. There are several approaches, but one I have found reliable is the v4l2rtspserver server:
https://github.com/mpromonet/v4l2rtspserver
There are specific instructions for setting it up on the Pi (https://github.com/mpromonet/v4l2rtspserver/wiki/Setup-on-Pi), and you can verify it is working using VLC player on a laptop, etc., before testing in your specific application.
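As a rough illustration (the flags and device path are assumptions that may differ per setup), starting the server on the Pi can look like:

v4l2rtspserver -W 1280 -H 720 -F 25 -P 8554 /dev/video0

which by default exposes the camera at rtsp://<pi-address>:8554/unicast, easy to sanity-check in VLC.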
There are also a small number of test RTSP URLs available on the web. The most reliable seems to be the one at this link, provided by Wowza (again, link valid at time of writing):
https://www.wowza.com/html/mobile.html

Kurento Media WebRTC to RTP

I am using Kurento's git master to build a WebRTC to RTP bridge.
// Create the media pipeline and the WebRTC and HTTP endpoints.
MediaPipeline pipeline = kurento.createMediaPipeline();
WebRtcEndpoint webRtcEndpoint = new WebRtcEndpoint.Builder(pipeline).build();
HttpGetEndpoint httpEndpoint = new HttpGetEndpoint.Builder(pipeline).build();

// Try to force H.264 video and PCMU audio caps on the HTTP endpoint.
org.kurento.client.Fraction fr = new org.kurento.client.Fraction(1, 30);
VideoCaps vc = new VideoCaps(VideoCodec.H264, fr);
httpEndpoint.setVideoFormat(vc);
AudioCaps ac = new AudioCaps(AudioCodec.PCMU, 65536);
httpEndpoint.setAudioFormat(ac);

// Feed the WebRTC media into the HTTP endpoint.
webRtcEndpoint.connect(httpEndpoint);
However, in spite of this, the output video plays back encoded as WebM. I have tried various other approaches as well (using RtpEndpoint, using a GStreamer filter, using VLC as an HTTP to RTP streamer), but no method gives me a video playable on Safari and IE, i.e. H.264 encoded. Requesting help from media developers and the Kurento team.
Safari and IE do not support RTP/H.264. From your code, I understand that you are trying to create a WebRTC to HTML video tag bridge. In that case, the HttpGetEndpoint will provide media through HTTP pseudostreaming. However, Kurento only provides that type of live HTTP pseudostreaming in WebM format. To the best of my knowledge, neither Safari nor IE supports WebM, hence what you want to do will not work independently of the caps you force on the HttpGetEndpoint. You will only be able to see it working on Chrome, Firefox, or other browsers with WebM support.
The only solution for you could be the HttpGetEndpoint providing media in MP4 format (or any other format supported by IE and Safari), but creating the live stream in that format is very tricky, and we (the Kurento team) have not had the time to implement it; this feature is not in our short-term roadmap.
However, we have many users integrating WebRTC with IE and Safari using RTMP. In that case, you need to integrate Kurento with an RTMP-capable media server (this can be done in different ways) and then let the RTMP media server serve the media to the browsers.
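As a rough sketch of that kind of integration, here is the WebRTC-to-RTP half using the Node kurento-client API (the Java API is equivalent); the ffmpeg hand-off mentioned at the end is an assumption about one possible RTMP path, not a built-in Kurento feature:

import kurentoClient from 'kurento-client';

async function bridgeWebRtcToRtp(): Promise<void> {
  // Connect to the Kurento Media Server and build the pipeline.
  const client = await kurentoClient('ws://localhost:8888/kurento');
  const pipeline = await client.create('MediaPipeline');
  const webRtc = await pipeline.create('WebRtcEndpoint');
  const rtp = await pipeline.create('RtpEndpoint');
  await webRtc.connect(rtp);

  // The RtpEndpoint negotiates plain RTP via SDP; this SDP could be handed
  // to an external tool (e.g. ffmpeg) that re-muxes the RTP into RTMP
  // for the RTMP-capable media server.
  const sdpOffer = await rtp.generateOffer();
  console.log('RTP SDP offer:', sdpOffer);
}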

How to play local audio on a Chromecast from an iPhone

I want to play a local audio file on a Chromecast device. I am getting an MPMediaItem, and I have created a server on the iPhone using iPhonehttpserver.
My problem is how to send the MPMediaItem to the Chromecast device using the HTTP server. Please help me.
You need to send Chromecast the URL that points to your locally served media: your local server on the iPhone provides a URL through which other clients can access the media; send that URL to the Chromecast so it can fetch and play the media itself.
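For illustration, this is roughly what that hand-off looks like with the Cast Web Sender API (the iOS sender SDK flow with GCKMediaInformation is analogous); the URL is hypothetical, and note it must be the phone's LAN address, since the Chromecast fetches the media itself and cannot reach 127.0.0.1 on the phone:

// Minimal sketch; assumes the Cast sender framework is loaded and a session exists.
declare const chrome: any;
declare const session: any;

const url = 'http://192.168.1.20:8080/song.mp3'; // hypothetical LAN URL served by the phone
const mediaInfo = new chrome.cast.media.MediaInfo(url, 'audio/mpeg');
const request = new chrome.cast.media.LoadRequest(mediaInfo);
session.loadMedia(
  request,
  () => console.log('Chromecast is playing the media'),
  (err: any) => console.error('load failed', err),
);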

Android MediaPlayer: how to play an HTTP or RTSP stream?

I'm trying to use MediaPlayer to play HTTP or RTSP protocol URIs from a server. When I play an address like http://**.wma or *.mp3 it works, but when I try an address like "http://qr.fm.qq.com/qqradio?qqradio", it doesn't work.
I have also tried using VideoView to play RTSP protocol URIs from a server. When I play an address like "rtsp://*.sdp" it works, but when I try an address like "rtsp://vs1.thmz.com/radio31", it doesn't work.
Can anybody help me and tell me how?
These are live streams, not static files, so while MediaPlayer may play back some static .wma and .mp3 content, these live streams are not served that way.
Are you sure the first stream link is valid? After a quick scan with nmap, it seems you may need to be in China to connect to this feed (the qq.com registrant country code is CN): I get 1000 scanned ports, all filtered, which usually means a firewall blocking specific geographic regions.
rtsp://vs1.thmz.com/radio31 -> this is a Windows Media Audio stream, using the WMA2 codec, delivered via RTSP, which according to the Android Supported Media Formats documentation (http://developer.android.com/guide/appendix/media-formats.html) is NOT supported.