getUserMedia: access system output stream - webrtc

I know how to access audio input devices via getUserMedia() and route them to the WebAudio API. This works fine for microphones and such. But in my use-case, I would rather like to hook into the audio stream of an output device. The use case is that I want to create a spectrum analyser for audio coming from a digital audio workstation (DAW) running on the same PC.
I tried enumerating the devices and calling getUserMedia() with the device ID of an audio output device, but the returned stream contained only silence. The only solution I found so far is to install an audio loopback device (like Soundflower on Macs), route the DAW's output to it, and then use it as an input device for getUserMedia(). But this requires the user to install third-party software.
Is there any way to hook directly into the audio stream of an output device instead, before it is actually sent to the physical device (speaker or external soundcard)?

This can be achieved using the desktop capture APIs (chrome.desktopCapture.chooseDesktopMedia). An example for Chrome is sketched below.
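A minimal sketch, assuming the code runs inside a Chrome extension whose manifest declares the "desktopCapture" permission; the chromeMediaSource constraints are Chrome-specific and non-standard, and Chrome generally requires requesting a video track alongside desktop audio:

// Ask the user to pick a capture source that includes system audio.
chrome.desktopCapture.chooseDesktopMedia(['screen', 'audio'], function (streamId) {
    navigator.mediaDevices.getUserMedia({
        audio: {
            mandatory: { chromeMediaSource: 'desktop', chromeMediaSourceId: streamId }
        },
        // Chrome expects a video track to be requested along with desktop audio.
        video: {
            mandatory: { chromeMediaSource: 'desktop', chromeMediaSourceId: streamId }
        }
    }).then(function (stream) {
        // Route the captured audio into an AnalyserNode for the spectrum analyser.
        var audioCtx = new AudioContext();
        var source = audioCtx.createMediaStreamSource(stream);
        var analyser = audioCtx.createAnalyser();
        analyser.fftSize = 2048;
        source.connect(analyser);
        var bins = new Uint8Array(analyser.frequencyBinCount);
        analyser.getByteFrequencyData(bins); // poll this per animation frame to draw the spectrum
    }).catch(function (err) {
        console.error('getUserMedia failed:', err);
    });
});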

Related

How to prepare a live video stream to be fed to an HTML5 Video MediaSource?

I am transferring a live video stream from a server to a JavaScript function in a client browser:
server: gstreamer x264enc-hardware ! whatever-I-want ! appsink
=== transfer of data stream with a proprietary protocol ===>
HTML5 browser client: javascript function receives data sent by the appsink
In other words, I'm trying to display an H.264 live stream created on a server and sent over a proprietary transfer protocol, with the data re-appearing in a JavaScript function inside an HTML5 client browser.
I was thinking of using Media Source Extensions (MSE) in the browser to decode the H.264 and display the video.
Note that the video stream settings (video only, resolution, bandwidth) are fixed and known on both sides. So, everything can be hard-coded and the purpose is not to implement a generic solution.
What could I use on the server side (replacement of the "whatever-I-want" gstreamer plugin) so that the work in the HTML5 browser is not too complicated?
One solution would be to do nothing on the server side and use the broadway.js library to decode the H.264 NAL units in JavaScript, but that obviously doesn't leverage MediaSource and the native decoding capability of the browser.
Could I use GStreamer's avmux_dash and hope that MediaSource can consume the transmitted data?
Alternatively, how could I create "MP4 fragments" and could MediaSource read them "easily"?
One approach, which has been used by some major players in the past to translate from one streaming protocol to another, is to receive the stream over your proprietary transfer protocol and then re-package it into HLS or DASH on a local server on the device.
You can then stream from that local host to a regular HLS or DASH player on the device.
It sounds inefficient (it is inefficient), but it works, even on mobile devices with their lower processing and power capabilities.
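If you do pursue the direct MSE route instead, the browser side mostly reduces to appending fragmented-MP4 segments to a SourceBuffer as they arrive. A minimal sketch, assuming your proprietary transport delivers fMP4 segments; the WebSocket URL and the codec string here are hypothetical and must match what your encoder actually produces:

// Feed incoming fMP4 segments into a <video> element via Media Source Extensions.
var video = document.querySelector('video');
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
    var sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
    var queue = []; // appendBuffer() is asynchronous, so queue segments while it is updating

    sourceBuffer.addEventListener('updateend', function () {
        if (queue.length && !sourceBuffer.updating) {
            sourceBuffer.appendBuffer(queue.shift());
        }
    });

    var ws = new WebSocket('ws://example.local/stream'); // hypothetical endpoint
    ws.binaryType = 'arraybuffer';
    ws.onmessage = function (event) {
        if (sourceBuffer.updating || queue.length) {
            queue.push(event.data);
        } else {
            sourceBuffer.appendBuffer(event.data);
        }
    };
});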

Unable to stream in stereo

I am using a 3D mic that works like a charm on the iPhone using a 1/8-inch jack into an adapter. It works great with the camera app, so I know the hardware is able to receive the stereo.
However in my agora.io iPhone app I have the following settings:
audio.setChannelProfile(.liveBroadcasting)
audio.setAudioProfile(.musicHighQualityStereo, scenario: .showRoom)
Is there anything else I need to do for it to work?
I was able to reach Agora Support. The following answer is what I received:
iOS devices do not support stereo audio capture. You would need to use an external video source which supports stereo audio to do the capture.
I wish this were included in the iOS documentation.
For my use case, a Mac app would be better, so I'm just going to go with that!

How to turn a webcam into an RTSP stream

I have a product that can analyze video after inputting an rtsp url.
I would like to use a webcam to stream and feed my product the webcam rtsp.
How can I do that?
It will depend on the webcam you are using - most support RTSP but many do not publish the interface to access the stream as they are designed to be used with the webcam's own companion app.
There are some web resources which provide the RTSP URLs for common webcams. You may find it hard to find a match as new versions of webcams roll out, but they should give you a feel for how to access a vendor's camera if you have a specific webcam you are testing against. Some examples (at the time of writing):
https://www.getscw.com/decoding/rtsp
https://soleratec.com/get-support/rtsp/
If you can't find the info for the camera you are using, and you have the companion app, you can also use a network sniffer tool like Wireshark (https://www.wireshark.org) and search the traffic for the 'rtsp://' pattern.
If you just need to test your app and have access to a Raspberry Pi with a camera module, you can also use that to generate an RTSP stream. There are several approaches for this, but one I have found reliable is the v4l2rtspserver server:
https://github.com/mpromonet/v4l2rtspserver
There are specific instructions for setting it up on a Pi (https://github.com/mpromonet/v4l2rtspserver/wiki/Setup-on-Pi), and you can verify it is working using VLC player on a laptop etc. before testing in your specific application.
There are also a small number of test RTSP URLs available on the web. The most reliable seems to be the one at this link provided by Wowza (again, link valid at the time of writing):
https://www.wowza.com/html/mobile.html

Creating a WebRTC receiver

I am new to WebRTC and trying to figure out how to create a program outside a browser which receives a WebRTC audio stream and outputs it on speakers.
Are there any WebRTC libraries for Java or C#?
The receiver will be running on a Linux machine.
--
I've been thinking about using getUserMedia() to access the microphone. But then:
In what format will such a stream be transmitted?
Let's say I use WebRTC2SIP and build a Java endpoint using JSIP;
or I just use a socket and send the stream over HTTP.
What audio format will I get on the receiver side? So far I have read that WebRTC compresses the stream somehow.
I guess there are two ways for you:
1. Build the whole WebRTC voice engine for Android/iOS or Mac etc., and just use the API provided by the voice engine (VoE).
2. Build standalone NS/VAD/AECM/AGC modules and use them in your project. For example, to build a standalone NS (noise suppression) module for Android, you use AudioRecord (Java layer, Android side) to record sound from the mic, do the noise-suppression processing on that data (JNI layer, WebRTC side), and finally play back the processed data using AudioTrack (Java layer, Android side).
EDIT:
For the 2nd situation, the format is raw PCM data.
Check out the working Audio demo and code at demo.easyrtc.com
The code is all open source and can be checked out at https://github.com/priologic/easyrtc
You can look for any known issues around easyRTC at our forum at https://groups.google.com/forum/#!forum/easyrtc
Also check out our main site at easyrtc.com
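For an audio-only connection, the browser-side setup with easyrtc looks roughly like this (a sketch based on the easyrtc API at the time of writing; the application name 'audioDemo' and the 'remoteAudio' element ID are hypothetical):

easyrtc.enableVideo(false); // audio-only call
easyrtc.enableAudio(true);
// Play each incoming remote stream through an <audio> element.
easyrtc.setStreamAcceptor(function (callerEasyrtcid, stream) {
    var audio = document.getElementById('remoteAudio'); // hypothetical element
    audio.srcObject = stream;
    audio.play();
});
easyrtc.initMediaSource(
    function () {
        // Got local media; now connect to the signaling server.
        easyrtc.connect('audioDemo', function (myEasyrtcid) {
            console.log('connected as ' + myEasyrtcid);
        }, function (errorCode, errorText) {
            console.error('connect failed: ' + errorText);
        });
    },
    function (errorCode, errorText) {
        console.error('could not get local media: ' + errorText);
    }
);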

Android MediaPlayer: How to play an HTTP or RTSP stream?

I'm trying to use MediaPlayer to play an HTTP or RTSP protocol URI from a server. When I play an address like http://**.wma or *.mp3 it works, but when I try an address like "http://qr.fm.qq.com/qqradio?qqradio", it doesn't work.
I'm also trying to use VideoView to play an RTSP protocol URI from a server. When I play an address like "rtsp://*.sdp" it works, but when I try an address like "rtsp://vs1.thmz.com/radio31", it doesn't work.
Can anybody help me and tell me how?
These are live streams, not static files, so while MediaPlayer may play back some static .wma and .mp3 content, these live streams are not served that way.
Are you sure the first stream link is valid? After a quick scan with nmap, it seems you may need to be in China to connect to this feed (qq.com registrant country code: CN). I get 1000 scanned ports, all filtered, which usually means a firewall blocking specific geographic regions.
rtsp://vs1.thmz.com/radio31 -> This is a Windows Media Audio stream, using the WMA2 codec, delivered via RTSP, which, according to the Android Supported Media Formats page (http://developer.android.com/guide/appendix/media-formats.html), is NOT supported.