Android MediaPlayer: how to play an HTTP or RTSP stream? - android-mediaplayer

I'm trying to use MediaPlayer to play an HTTP or RTSP URI from a server. When I play an address like http://**.wma or *.mp3 it works, but when I try an address like "http://qr.fm.qq.com/qqradio?qqradio" it doesn't work.
I also tried using VideoView to play an RTSP URI from a server. When I play an address like "rtsp://*.sdp" it works, but when I try "rtsp://vs1.thmz.com/radio31" it doesn't.
Can anybody help me and tell me how?

These are live streams, not static files, so while MediaPlayer may play back some .wma and .mp3 content, these live streams are not served that way.
Are you sure the first stream link is valid? After a quick scan with nmap, it seems you may need to be in China to connect to this feed (qq.com Registrant Country Code: CN). I get 1000 scanned ports, all filtered, which usually means a firewall blocking specific geographic regions.
rtsp://vs1.thmz.com/radio31 -> This is a Windows Media Audio stream, using the WMA2 codec, delivered via RTSP, which according to the Android Supported Media Formats page (http://developer.android.com/guide/appendix/media-formats.html) is NOT supported.
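For completeness, the basic MediaPlayer streaming setup looks like the sketch below (a minimal illustration of my own, not from the question; the URL is the one asked about). Note that even with correct code, an unsupported container/codec such as WMA2-over-RTSP will simply surface through the error callback rather than play.

import android.media.AudioManager;
import android.media.MediaPlayer;
import android.util.Log;
import java.io.IOException;

// Minimal sketch: play a network audio stream with MediaPlayer
MediaPlayer player = new MediaPlayer();
player.setAudioStreamType(AudioManager.STREAM_MUSIC);
try {
    player.setDataSource("http://qr.fm.qq.com/qqradio?qqradio");
} catch (IOException e) {
    Log.e("Stream", "Bad data source", e);
}
player.setOnPreparedListener(MediaPlayer::start); // start once buffering completes
player.setOnErrorListener((mp, what, extra) -> {
    // Unsupported formats typically land here instead of throwing
    Log.e("Stream", "MediaPlayer error: " + what + " / " + extra);
    return true;
});
player.prepareAsync(); // prepare asynchronously for network sources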

Related

Azure Media Services - Can't connect to RTMP

I wanted to stream from my mobile device (iPhone 7 Plus / iOS 14.1) with Larix and the GoPro app to Azure Media Services:
rtmp://test-livestream-usso.channel.media.azure.net:1935/live/xxxxxxxxxxxxxx70af
Neither device can connect to the RTMP endpoint on Azure. When I type the RTMP URL into my OBS, it works perfectly. Any idea what the issue could be here?
Best,
Yanick
OBS and a lot of encoders automatically append a stream key to the end of the path.
Just add an additional /whatever after the GUID to make it work.
Also make sure you are not sending more than 30 fps. We only accept up to 30 fps at 1080p.
BTW, my GoPro Hero 8 works just fine after adding an additional stream key path after the ingest path GUID.
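For example, keeping the (partially redacted) ingest URL from the question and appending an arbitrary stream-key segment ("mystream" here is just a placeholder; any value works, per the answer above):
rtmp://test-livestream-usso.channel.media.azure.net:1935/live/xxxxxxxxxxxxxx70af/mystream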

How to turn webcam to rtsp

I have a product that can analyze video after inputting an RTSP URL.
I would like to use a webcam to stream and feed my product the webcam's RTSP URL.
How can I do that?
It will depend on the webcam you are using: most support RTSP, but many do not publish the interface to access the stream, as they are designed to be used with the webcam's own companion app.
There are some web resources which provide the RTSP URLs for common webcams. You may find it hard to find a match as new versions of webcams roll out, but they should give you a feel for how to access a vendor's camera if you have a specific webcam you are testing against. Some examples (at the time of writing):
https://www.getscw.com/decoding/rtsp
https://soleratec.com/get-support/rtsp/
If you can't find the info for the camera you are using, and you have the companion app, you can also use a network sniffer tool like Wireshark (https://www.wireshark.org) and search the traffic for the 'rtsp://' pattern.
If you just need to test your app and have access to a Raspberry Pi with a camera module, you can also use it to generate an RTSP stream. There are several approaches for this, but one I have found reliable is the v4l2rtspserver:
https://github.com/mpromonet/v4l2rtspserver
There are specific instructions for setting it up on the Pi (https://github.com/mpromonet/v4l2rtspserver/wiki/Setup-on-Pi), and you can verify it is working using VLC player on a laptop, etc., before testing in your specific application.
There are also a small number of test RTSP URLs available on the web. The most reliable seems to be the one at this link provided by Wowza (again, link valid at time of writing):
https://www.wowza.com/html/mobile.html
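If you would rather sanity-check an RTSP URL from code than from VLC, a bare-bones probe is possible over a plain TCP socket. The sketch below is my own illustration (the host and stream path are hypothetical): it sends a standard RTSP OPTIONS request and prints the status line; "RTSP/1.0 200 OK" means the server is answering.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

public class RtspProbe {
    public static void main(String[] args) throws Exception {
        String host = "192.168.1.10";                 // hypothetical camera address
        String url = "rtsp://" + host + "/stream1";   // hypothetical stream path
        try (Socket socket = new Socket(host, 554)) { // 554 is the default RTSP port
            PrintWriter out = new PrintWriter(socket.getOutputStream());
            // Minimal RTSP OPTIONS request (RFC 2326)
            out.print("OPTIONS " + url + " RTSP/1.0\r\nCSeq: 1\r\n\r\n");
            out.flush();
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(socket.getInputStream()));
            System.out.println(in.readLine()); // e.g. "RTSP/1.0 200 OK"
        }
    }
}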

How to play a live FLV stream?

I am capturing video from a webcam on my PC and converting it to FLV on the fly (using ffmpeg).
As a result I have a continuously growing .flv file.
Now I would like to play it as a live stream.
I tried VLC, but it plays the file no longer than the duration read from the file at initialization.
What player can I use to play FLV live?
I am working on Ubuntu 16.04.
Thank you in advance for your answers!
You cannot play live FLV directly, but there is a tricky protocol, popular among Chinese live-streaming platforms, called "HTTP-FLV" that plays live FLV within the HTTP framework.
Why http-flv?
Latency for HLS / DASH is long: about 10 to 20+ seconds.
HTTP-FLV reduces the end-to-end latency to ~5 seconds, and it can be played in browsers with MSE support.
How does it work?
FLV is a simple container that "supports" file-based progressive streaming, because a client can fetch a partial byte range of an FLV video and still play it (for MP4, you would need metadata such as the moov atom for playback).
On the file server, host the growing FLV file and remove the "Content-Length" HTTP response header, so that when a client requests the file, it does not know the response body size. It keeps the connection open and receives video segments until the connection ends.
On the client side, use flv.js to fetch only the latest segments of the FLV file and perform the playback.
A number of other tricks are needed to make the pipeline work; a server-side sketch follows below.
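To make the "no Content-Length" trick concrete, here is a minimal sketch of the server side using Java's built-in com.sun.net.httpserver (my own illustration, not taken from the projects linked below; the file path is hypothetical). Passing 0 to sendResponseHeaders switches the response to chunked transfer encoding, so no Content-Length is sent, and the handler then tails the growing file. A real deployment would use something like nginx-http-flv-module instead, with flv.js as the client.

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.io.RandomAccessFile;
import java.net.InetSocketAddress;

public class LiveFlvServer {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/live.flv", exchange -> {
            exchange.getResponseHeaders().set("Content-Type", "video/x-flv");
            // Response length 0 = unknown size: chunked transfer, no Content-Length
            exchange.sendResponseHeaders(200, 0);
            try (RandomAccessFile file = new RandomAccessFile("/tmp/live.flv", "r");
                 OutputStream body = exchange.getResponseBody()) {
                byte[] buf = new byte[4096];
                long pos = 0;
                while (true) { // tail the growing file, pushing new bytes as they appear
                    file.seek(pos);
                    int n = file.read(buf);
                    if (n > 0) { body.write(buf, 0, n); body.flush(); pos += n; }
                    else Thread.sleep(100); // wait for ffmpeg to append more data
                }
            } catch (InterruptedException ignored) {
            }
        });
        server.start(); // serves http://localhost:8080/live.flv
    }
}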
There are a lot of resources online you can play around with. Here are some references:
https://github.com/Bilibili/flv.js/
https://github.com/winshining/nginx-http-flv-module
A blog about how to achieve this: https://www.yanxurui.cc/posts/server/2017-11-25-http-flv/

Kurento Media WebRTC to RTP

I am using Kurento's master branch from git to make a WebRTC-to-RTP bridge.
// Create the media pipeline and the two endpoints
MediaPipeline pipeline = kurento.createMediaPipeline();
WebRtcEndpoint webRtcEndpoint = new WebRtcEndpoint.Builder(pipeline).build();
HttpGetEndpoint httpEndpoint = new HttpGetEndpoint.Builder(pipeline).build();

// Try to force H.264 video at 30 fps and PCMU audio on the HTTP endpoint
org.kurento.client.Fraction fr = new org.kurento.client.Fraction(1, 30);
VideoCaps vc = new VideoCaps(VideoCodec.H264, fr);
httpEndpoint.setVideoFormat(vc);
AudioCaps ac = new AudioCaps(AudioCodec.PCMU, 65536);
httpEndpoint.setAudioFormat(ac);

// Feed the WebRTC media into the HTTP endpoint
webRtcEndpoint.connect(httpEndpoint);
However, in spite of this, the output video is encoded as WebM. I have tried various other approaches as well (using an RtpEndpoint, using a GStreamer filter, using VLC as an HTTP-to-RTP streamer); however, no method gives me a video playable on Safari and IE, i.e. H.264 encoded. Requesting help from media developers and the Kurento team.
Safari and IE do not support RTP/H.264. From your code, I understand that you are trying to create a WebRTC-to-video-tag bridge. In that case, the HttpGetEndpoint will provide media through HTTP pseudostreaming. However, Kurento only provides that type of live HTTP pseudostreaming in WebM format. To the best of my knowledge, neither Safari nor IE supports WebM, hence what you want to do will not work independently of the caps you force on the HttpGetEndpoint. You will only be able to see it working in Chrome, Firefox, or other browsers with WebM support.
The only solution for you could be the HttpGetEndpoint providing media in MP4 format (or any other format supported by IE and Safari), but creating the live stream in that format is very tricky, and we (the Kurento team) have not had the time to implement it; this feature is not in our short-term roadmap.
However, we have many users integrating WebRTC with IE and Safari using RTMP. In that case, you need to integrate Kurento with an RTMP-capable media server (this can be done in different ways) and then let the RTMP media server serve the media to the browsers.
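On the Kurento side, that kind of RTMP integration usually starts by bridging the media out as plain RTP. A rough sketch with the Kurento Java client is below (my illustration; the SDP answer would come from whatever consumes the RTP, e.g. an ffmpeg process that re-publishes to the RTMP server, which is not shown here):

// Bridge WebRTC media out of Kurento as plain RTP (error handling elided)
MediaPipeline pipeline = kurento.createMediaPipeline();
WebRtcEndpoint webRtcEndpoint = new WebRtcEndpoint.Builder(pipeline).build();
RtpEndpoint rtpEndpoint = new RtpEndpoint.Builder(pipeline).build();

webRtcEndpoint.connect(rtpEndpoint);

// Kurento generates an SDP offer describing the RTP it will send;
// hand it to the external consumer and feed that consumer's answer back
String sdpOffer = rtpEndpoint.generateOffer();
// String sdpAnswer = ...obtained from the RTP consumer...
// rtpEndpoint.processAnswer(sdpAnswer);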

WebRTC -- can getUserMedia use a local stream?

I want WebRTC to encode and play an H.264 (NAL) stream from a local file.
In the WebRTC tutorials, getUserMedia is used to get the local camera connected to the system; I don't know whether getUserMedia supports capturing a local stream file such as an H.264 stream.
If it doesn't work that way, maybe I should modify the WebRTC source code (I'm studying it).
Here is the question: if I change the WebRTC code, how can I integrate the new code into a browser? Make it a plugin?
Firefox supports an extension to the <video> element that you can use to do this.
First, set the source of a video element:
v1.src = "file:///...";
Then you can call the (currently prefixed) mozCaptureStream or mozCaptureStreamUntilEnded function to get a MediaStream.
stream = v1.mozCaptureStream();
The proposed specification.
Note, however, that you need to ensure that the file is same-origin with respect to the page. The same-origin rules for file:/// URLs are probably going to cause issues, and then your MediaStream isn't going to be accessible to you. One way to ensure access is not to set the location directly, but to load the file using an <input type="file"> element.
As noted in other answers, Firefox currently only supports the baseline profile of H.264.
First, you are right: getUserMedia will not work for you. However, there are a couple of options.
Hack a stream together using RTCDataChannel: break up the media stream, deliver each packet, and then handle it on the client side.
Take a look at this demo for prerecorded media streams. I do not believe that H.264 is addressed, but it could help you on your way (probably for Firefox only).
Use some sort of WebRTC breaker/endpoint that is native to stream the file. I know specifically that others (including myself) have streamed H.264 to Firefox through the Janus-Gateway.
A couple of asides:
Firefox only supports the Baseline profile when streaming H.264 over a WebRTC PeerConnection.
Chrome does not support H.264 for WebRTC at all.
Are you trying to have getUserMedia return an H.264-encoded stream?
In that case, it is only possible with Firefox today, under a specific environment (Cisco's H.264 plugin installed), and only for the Baseline profile.
Chrome promised in November to add this capability, but there is no timeline that I know of; expect at least Q2 2015.
Using our (Temasys) commercial plugin you will soon be able to do that in IE and Safari.
Those are the only client-side options I can think of. On the server side you can use whatever you want to transcode, including Janus, Kurento, PowerMedia, Licode/Lynckia, ....
Note: using other means like DataChannel or WebSocket is fine for transferring files, but it would greatly reduce the user experience, as you would not have all the added recovery (and security) mechanisms included in SRTP and DTLS, and you would also not have the media-specific enhancements that are in WebRTC, like jitter buffers, NetEQ, etc.