How can I find out what codec I am using in WebRTC?

How can I get information about which codec I am using in WebRTC? And how can I change it to a different one?
With the default settings I get bad-quality audio, as if it were coming out of a speaker.

The getStats API provides that information. See this sample,
or alternatively Chrome's chrome://webrtc-internals page.
AppRTC has an information window that shows the codec in use when you press 'i' during a call.
The default is Opus, which should give you 'HD quality'.
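If you want to read it programmatically, here is a minimal sketch of the getStats approach, assuming you already have a connected RTCPeerConnection called pc; it looks up the codec entry linked from the outbound audio RTP stats:

// Print the audio codec currently being sent on a connected RTCPeerConnection.
async function logAudioCodec(pc) {
  const stats = await pc.getStats();
  stats.forEach(report => {
    if (report.type === "outbound-rtp" && report.kind === "audio") {
      const codec = stats.get(report.codecId);   // linked "codec" stats entry
      if (codec) {
        console.log("Sending", codec.mimeType, "at", codec.clockRate, "Hz");
      }
    }
  });
}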

To change the codec you actually need to tamper with the SDP. I needed to do the same, and to make my life easier I even wrote an SDP parser.
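For illustration, here is a rough sketch of that kind of SDP munging, applied to the offer before calling setLocalDescription. The helper name and the approach (moving the preferred codec's payload types to the front of the m=audio line) are just one common technique, not part of any library:

// Move the payload types of codecName (e.g. "ISAC" or "PCMU") to the front
// of the m=audio line so the other side prefers that codec.
function preferAudioCodec(sdp, codecName) {
  const lines = sdp.split("\r\n");
  const mIndex = lines.findIndex(l => l.startsWith("m=audio"));
  if (mIndex === -1) return sdp;
  // Collect the payload types whose a=rtpmap entry matches the codec name.
  const pts = lines
    .filter(l => /^a=rtpmap:\d+/.test(l) && l.toLowerCase().includes(codecName.toLowerCase() + "/"))
    .map(l => l.match(/^a=rtpmap:(\d+)/)[1]);
  if (pts.length === 0) return sdp;
  const parts = lines[mIndex].split(" ");
  const header = parts.slice(0, 3);                         // "m=audio <port> <proto>"
  const rest = parts.slice(3).filter(pt => !pts.includes(pt));
  lines[mIndex] = header.concat(pts, rest).join(" ");
  return lines.join("\r\n");
}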

Related

How can I send audio data from a microphone to an API?

I would like to use an API that checks pronunciation. Its input is audio. I would like to achieve the following: the user speaks into the microphone, an audio file is generated, and it is sent to the API. The API sends back the answer: the evaluation of the pronunciation. How can I achieve this?
I would also be interested in how to present the microphone (recording control) to the users.
My main aim is to make it work in a browser.
Thank you very much for your answer.
I haven't been able to set up the audio recording yet.
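Not a full answer, but a minimal sketch of the usual browser-side flow: capture the microphone with getUserMedia, record with MediaRecorder, and POST the resulting blob to the API. The endpoint URL and form field name below are placeholders for whatever the pronunciation API actually expects:

// Record ~5 seconds of microphone audio and upload it to a (placeholder) API endpoint.
async function recordAndSend() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream);
  const chunks = [];
  recorder.ondataavailable = e => chunks.push(e.data);
  recorder.onstop = async () => {
    const blob = new Blob(chunks, { type: recorder.mimeType });   // typically audio/webm (Opus)
    const form = new FormData();
    form.append("audio", blob, "speech.webm");
    const response = await fetch("https://example.com/pronunciation", { method: "POST", body: form });
    console.log("API verdict:", await response.json());
    stream.getTracks().forEach(t => t.stop());                    // release the microphone
  };
  recorder.start();
  setTimeout(() => recorder.stop(), 5000);
}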

Webrtc-sip codec transcoding

I have a setup for making a SIP-WebRTC audio call, and I have a scenario where my browser uses only the Opus codec and my Asterisk uses only the iLBC codec. How can we make audio work in this scenario? From what I read on Google, we can do this with codec transcoding (codec conversion), but I am not sure where and how I can implement it.
Can someone help me out on this please?
Regards,
Aravind
The WebRTC spec supports iLBC; can you get your browser to use it?
If you can't, then the fallback option is always the G.711 audio codec (ulaw or alaw), which has pretty much universal support, including by WebRTC, Asterisk and every SIP phone I've ever come across.
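As a sketch of forcing that fallback from the browser side, newer browsers let you restrict the offered codecs with RTCRtpTransceiver.setCodecPreferences instead of hand-editing the SDP; this example keeps only PCMU/PCMA in the offer (assuming the browser implements that method):

// Build an offer that only advertises G.711 (PCMU/PCMA) for audio.
async function makeG711OnlyOffer() {
  const pc = new RTCPeerConnection();
  const transceiver = pc.addTransceiver("audio");
  const g711 = RTCRtpSender.getCapabilities("audio").codecs
    .filter(c => c.mimeType === "audio/PCMU" || c.mimeType === "audio/PCMA");
  if (g711.length > 0 && transceiver.setCodecPreferences) {
    transceiver.setCodecPreferences(g711);
  }
  await pc.setLocalDescription(await pc.createOffer());
  return pc.localDescription;
}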

AT&T Enhanced WebRTC + play .mp3

I'm trying to use AT&T's WebRTC JavaScript development framework. I don't see documentation for modifying a live call. Is it possible to modify a live call by playing an .mp3 audio file?
http://developer.att.com/enhanced-webrtc
I'm not familiar with the AT&T framework, but you can simply add another AUDIO tag that you play during the call whenever you want. If you also want to interrupt the call, you can simply mute the AUDIO/VIDEO tag with the remote stream attached.
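A rough sketch of that idea (the element id and .mp3 path are placeholders):

// remoteAudio is the element the remote stream is attached to;
// announcement.mp3 is the clip played into the local speakers during the call.
const remoteAudio = document.getElementById("remoteAudio");
const announcement = new Audio("announcement.mp3");

function playClip(interruptCall) {
  if (interruptCall) remoteAudio.muted = true;                 // silence the far end while the clip plays
  announcement.onended = () => { remoteAudio.muted = false; };
  announcement.play();
}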

SMIL adaptive streaming in Videojs

What is required to use a SMIL file for adaptive streaming in a video.js player? I have created the SMIL file in my Wowza application, and it is creating my 4 separate streams and making them available. However, I cannot get my webpage, which uses video.js, to correctly play the SMIL file. Hints on that coding, or where to find the correct documentation, would be greatly appreciated.
There aren't many implementations of SMIL players. I'm sure I've seen Wowza URLs suggesting it will output the SMIL as other formats, something like whatever.smil/manifest.m3u8. That's HLS, which can be played natively on mobile and Safari, and with videojs-contrib-hls elsewhere.
I know the question is old, but I've been struggling with this recently, so I want to share my experience in case anyone is interested. My scenario is very similar: I want to deliver adaptive-bitrate streaming from Wowza to clients using video.js.
There is a master link that explains how to set up and run the Wowza Transcoder for live streaming, and how to set up your adaptive-bitrate streams using a SMIL file. Following the video there you can get a stream that uses ABS, but the SMIL file is tied to the stream name, so it is not a solution if you have streams that come to Wowza from another media-server origin and need to be transcoded before being served to the clients. The article mentions a few key things (like Stream Name Groups), but somehow it didn't seem very clear, at least to me. So here is some clarification of what I understood from all the articles I read and what I did to achieve ABS:
You can achieve ABS in Wowza either with SMIL files or with Stream Name Groups (NGRP). An NGRP refers to a block of streams defined in the Transcoder template that can be played back using multi-bitrate streaming (dynamically) (<- this is what I used). SMIL files are used to create a "static" list of streams for multi-bitrate VOD streaming. If you are using Wowza origin-edge delivery you'll need the .smil file, because NGRPs do not get forwarded to the edge. (Source for all this information: here.)
In case you need the SMIL file, you probably need to generate a new one for every stream, and you probably want to do that in an automated way, so the best way would be through an HTTP request (the link above explains how to achieve this).
In case you can live with NGRPs, things are a bit easier:
You need to enable the Wowza Transcoder (this is pretty easy and the steps are in the video I mention above).
You should create your own Transcoder template with the different stream presets you want to deliver; as an example you can check the default ones that are already there. The more presets you add, the more work Wowza will need to do whenever a stream comes in, since it will generate a new stream for every preset you have defined.
Now is when we generate the NGRPs. In your Transcoder template you can create as many NGRPs as you want (to clarify: these are groups of streams that you point the client's video player at; each NGRP contains the streams the player can switch between when doing adaptive bitrate streaming). For instance, the default template already defines groups such as "_all" and "_mobile".
If you play the "_mobile" NGRP in the client's video player, the ABS algorithm in the player can adapt itself to play either the 240p or the 160p stream, based on the client's capabilities.
So imagine you have these two NGRPs. In order to play them in video.js, you need to set the source to:
http://[wowza-ip-address]:1935/<name-of-your-application>/ngrp:myStream_all/playlist.m3u8
or
http://[wowza-ip-address]:1935/<name-of-your-application>/ngrp:myStream_mobile/playlist.m3u8
... based on how many options you want the client player to have for ABS. (For instance, if your targets are old mobile devices, you probably just want to offer a couple of low-bitrate streams.)
(This is for an HLS stream. For other formats the extension changes; for instance, for a DASH stream you would have "/manifest.mpd" instead of "/playlist.m3u8".)
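For reference, a minimal video.js setup for one of these NGRP URLs might look like the following (assuming videojs-contrib-hls or videojs-http-streaming is loaded; the host, application and stream names are placeholders):

// Point a video.js player at the Wowza NGRP playlist and start playback.
var player = videojs("player");
player.src({
  src: "http://wowza.example.com:1935/live/ngrp:myStream_all/playlist.m3u8",
  type: "application/x-mpegURL"   // HLS; use "application/dash+xml" for a DASH manifest
});
player.play();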
That is all. There is also a very helpful link in the video.js documentation explaining how it does the bitrate switching: here.
I hope it helps someone! At least clarifying things! :)

Streaming web video to Roku

Does anyone know how, technically, to send videos (e.g. YouTube videos) to a Roku player? There is a "Twonky Beam" app that allows streaming, and what it appears to do is send .mp4 files to the Roku for playback. See the demo here: http://gigaom.com/video/youtube-on-roku-twonky-airplay/
This is done without a "Twonky Beam" Roku app. It looks like something Roku supports natively, although I cannot find anything documented.
I want to know how they were able to accomplish this without the Roku being a UPnP or DLNA device.
Any insights here would be great!
There are discussions on how to extract the mp4 URL from YouTube here and here.
In terms of how to do AirPlay-style video playback on a Roku, you would use the External Control Protocol (ECP) to launch a channel with the URLs of the video you wish to play back, or, once your channel is launched, use ECP in combination with the roInput component to send the URLs to your channel. Your channel would then pass the URLs to a video playback component, which would initiate playback from YouTube or whatever source you send it. If you want to play URLs from your device (Android/iOS), you would need to run a web server on the device to serve the videos.
Here is an Open Source YouTube project referenced in that second thread.
Any unofficial project that plays videos from YouTube is subject to a DMCA takedown by YouTube, should they decide your project does not fit with their goals.
roInput is not really well documented; here is an example that demonstrates both roInput and launch parameters (launch parameters are keywords you include in an HTTP POST):
function main(params as object)
    ' launch parameters arrive as an associative array
    if params.parameter <> invalid then
        print "This channel was launched with Launch Parameters!"
        print params
    else
        print "launched without input parameters"
    end if

    ' listen for roInput events sent via ECP after launch
    port = CreateObject("roMessagePort")
    input = CreateObject("roInput")
    input.SetMessagePort(port)
    while true
        msg = wait(100, port)
        if type(msg) = "roInputEvent" then
            params = msg.GetInfo()
            print params
        end if
    end while
end function
So your parameters might be "vidurl=http://myserver.com/video300k.mp4&vidurl=http://myserver.com/video600k.mp4" if you wanted to send multiple bitrate videos.
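For instance, a companion app could kick off playback with an ECP launch request like this sketch (the Roku IP address is a placeholder, and "dev" is the channel id of a sideloaded channel):

// Send launch parameters to the Roku over ECP (HTTP on port 8060).
const rokuIp = "192.168.1.20";
const params = new URLSearchParams();
params.append("vidurl", "http://myserver.com/video300k.mp4");
params.append("vidurl", "http://myserver.com/video600k.mp4");
fetch(`http://${rokuIp}:8060/launch/dev?${params}`, { method: "POST" })
  .then(res => console.log("ECP launch status:", res.status));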
There are plenty of examples of how to play video on a Roku in the Roku SDK, the simplest being the simplevideoplayer example.
As to the last part of the question regarding UPnP: you can find a Roku on your LAN either by brute force (trying port 8060 on every IP) or by using SSDP, which is also documented in the ECP guide linked above.