WebRTC - change the video resolution at the other side

I want to reduce internet usage in a WebRTC call by changing the resolution of the incoming video.
I have read content related to the Media Streams API but did not find anything similar.
Is this possible?
I'm just a user and do not have access to a WebRTC server.

Related

Send real-time video via wifi

I would like to make a personal application to be installed on two iPhones. The first would be used as a webcam that transmits to the second via wifi.
Having no experience with Xcode, I am looking for a code example to connect two devices via wifi and transmit a real-time video stream.
Unfortunately, the documentation and examples I have found are deprecated, partial, or inconsistent.
Where can I find some code examples to help me solve my problem, preferably in Objective-C (but Swift would also work)?
Thank you

How to measure the performance of my site's video streaming and playback?

I have developed a site that hosts user videos. I store the video files in AWS S3, deliver them through AWS CloudFront, and use video.js as the site's player, with HTML5 as the default and Flash as a fallback.
Generally the video streaming seems to work fine, but in some cases I receive complaints from users about slow or choppy video playback. I want to create some tests to measure streaming performance so that I can distinguish between problems on the user's side (e.g. a slow connection) and problems with my service.
Are there any best practices or tools for collecting video delivery metrics? I'm interested in open source solutions or something I can implement myself, because it's just a personal project, but I don't want to reinvent the wheel.
Testing progressive download means checking the transmission bandwidth and its continuity. For example, with a high transmission rate the initial client buffer fills faster and playback starts sooner. However, losing that transmission capacity at some later point can cause re-buffering. The total transmission time of your file must be lower than the video duration.
To identify potential issues you can start with the S3 bucket logs and the CloudFront cache statistics and access logs.
There's a load testing tool written in Java called Apache JMeter. It cannot execute JavaScript, so it must be configured to request the video files directly.
The disadvantage of running a load test from a single location is pretty evident: different geographical areas and carriers have different characteristics, so test results will differ between them.
There are online, non-open-source tools that can load test from multiple locations, but they are generally paid, though some offer free trials.
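To make that progressive-download check concrete, here is a rough sketch that simply times a full download of the file and compares it to the clip duration. The URL and duration below are placeholders, and it assumes a runtime with the Fetch API (a recent browser or Node 18+).

// Rough throughput check for a progressively downloaded file:
// the full download must finish faster than the video plays back.
// videoUrl and durationSeconds are placeholders for your own asset.
const videoUrl = "https://dxxxxxxxxxxxx.cloudfront.net/videos/sample.mp4";
const durationSeconds = 120;

async function measureDownload(url: string): Promise<void> {
  const start = Date.now();
  const response = await fetch(url);
  if (!response.ok || !response.body) {
    throw new Error(`Request failed: ${response.status}`);
  }

  // Stream the body and count the bytes as they arrive.
  let bytes = 0;
  const reader = response.body.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done || !value) break;
    bytes += value.length;
  }

  const elapsed = (Date.now() - start) / 1000;
  const mbps = (bytes * 8) / elapsed / 1_000_000;
  console.log(`Downloaded ${bytes} bytes in ${elapsed.toFixed(1)} s (${mbps.toFixed(2)} Mbit/s)`);
  console.log(elapsed < durationSeconds
    ? "Download stays ahead of playback."
    : "Download is slower than playback; expect re-buffering.");
}

measureDownload(videoUrl).catch(console.error);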
Here's another way to look at this.
"but in some cases I receive complaints from users for slow or choppy video playback."
If you're using an adaptive HLS stream, and you're using CloudFront, and the video is still choppy for some users, that's probably because of their own internet connection speeds.
In that case, you can encode your video in multiple resolutions (using just one AWS MediaConvert job, btw) - like 1080p, 720p, 360p, 240p, 144p etc.
And then Videojs has a stream switcher plugin that will 1) automatically start playing the highest possible resolution - and no higher - that's right for the viewer's connection and 2) give the user the option via a "Settings" (gear) icon in the control bar that they can use to switch resolutions manually.
That way, even those with really poor internet connections should be able to watch your video.
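For reference, wiring that up with the Video.js resolution switcher plugin looks roughly like the sketch below. The element id, source URLs, labels and option names are placeholders based on the plugin's usual usage, so double-check them against the demo linked below.

// Sketch: Video.js + the resolution switcher plugin, both loaded via <script> tags,
// so `videojs` is available as a global. Ids, URLs and labels are placeholders.
declare const videojs: any;

const player = videojs("my-video", {
  controls: true,
  plugins: {
    videoJsResolutionSwitcher: {
      default: "high",    // start on the highest listed rendition
      dynamicLabel: true  // show the active quality label in the control bar
    }
  }
});

// One entry per rendition produced by the encoding job; the plugin
// builds the gear menu in the control bar from these labels.
player.updateSrc([
  { src: "https://dxxxx.cloudfront.net/video-1080p.mp4", type: "video/mp4", label: "1080p", res: 1080 },
  { src: "https://dxxxx.cloudfront.net/video-720p.mp4",  type: "video/mp4", label: "720p",  res: 720 },
  { src: "https://dxxxx.cloudfront.net/video-360p.mp4",  type: "video/mp4", label: "360p",  res: 360 }
]);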
Of course, the other alternative is to use progressive download videos, where the viewer can simply click play, then immediately click pause, wait for the video to buffer, and then play it after it has fully downloaded.
Check out the Videojs Resolution Switcher demo here.
-- Ravi Jayagopal

How can I limit HTML5 video bandwidth usage in Video.js?

Trying to roll out a video to a client with limited bandwidth. The client is concerned that the video will eat up all the bandwidth at their field office. In testing, I've discovered that even though my video is encoded at 420 kbps, the client still uses about 1.5 Mbps while downloading it. Is there a way to control the maximum bandwidth used by video.js or the video tag?
Unfortunately, no. The HTML5 video element doesn't have any throttling options. It's completely up to the browser to decide how to fetch the video data. Some will download the whole thing at once, others will download pieces as you need them, but I think all of them will use as much of the pipe as you give them.
The Media Source Extensions proposal hopes to add some ability here, but that won't be available for a while.
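To give a sense of the control Media Source Extensions offers, here is a rough sketch in which the page fetches the file in byte ranges and appends them to a SourceBuffer on its own schedule, which is effectively a bandwidth cap. The URL, codec string, chunk size and pacing interval are placeholders, and the file has to be a fragmented MP4 for appendBuffer to accept it.

// Sketch: throttle video delivery yourself with Media Source Extensions.
// Assumes the page contains a <video> element and the file is a fragmented MP4.
const videoUrl = "https://example.com/video-fragmented.mp4";
const mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';
const chunkSize = 256 * 1024;  // 256 KiB per request
const intervalMs = 2000;       // one chunk every 2 s, i.e. roughly a 1 Mbit/s ceiling

const video = document.querySelector("video") as HTMLVideoElement;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  const sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
  let offset = 0;

  // appendBuffer is asynchronous; wait for updateend before appending again.
  const appendChunk = (buffer: ArrayBuffer) =>
    new Promise<void>((resolve) => {
      sourceBuffer.addEventListener("updateend", () => resolve(), { once: true });
      sourceBuffer.appendBuffer(buffer);
    });

  const pump = async () => {
    // Request only the next byte range instead of letting the browser
    // pull the whole file as fast as the pipe allows.
    const response = await fetch(videoUrl, {
      headers: { Range: `bytes=${offset}-${offset + chunkSize - 1}` }
    });
    if (!response.ok) {           // e.g. 416 once we run past the end of the file
      mediaSource.endOfStream();
      return;
    }
    const data = await response.arrayBuffer();
    offset += data.byteLength;
    await appendChunk(data);
    setTimeout(pump, intervalMs); // the pacing interval is the crude bandwidth cap
  };

  pump().catch(console.error);
});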
I would find somewhere else besides the office to host the video, like Amazon S3.

Streaming web video to Roku

Does anyone know how, technically, to send videos (i.e. YouTube videos) to a Roku player? There is a "Twonky Beam" app that allows streaming, and what it appears to do is send .mp4 files to Roku for playback. See the demo here: http://gigaom.com/video/youtube-on-roku-twonky-airplay/
This is done without a "Twonky Beam" Roku app. Looks like something that Roku supports natively, although I cannot find anything documented.
I want to know how they were able to accomplish this without Roku being a UPNP or DLNA device.
Any insights here would be great!
There are discussions on how to extract the mp4 URL from YouTube here and here
In terms of how to do AirPlay-style video playback on Roku, you would use the External Control Protocol (ECP) to launch a channel with the URLs of the video you wish to play back, or, once your channel is launched, use ECP in combination with the roInput component to send the URLs to your channel. Your channel would then pass the URLs to a video playback component, which would initiate playback from YouTube or whatever source you send it. If you want to play URLs from your own device (Android/iOS), you would need to run a web server on that device to serve the videos.
Here is an open source YouTube project referenced in that second thread.
Any unofficial project that plays videos from YouTube is subject to a DMCA takedown by YouTube, should they decide your project does not fit with their goals.
roInput is not really well documented; here is an example that demonstrates both roInput and launch parameters (launch parameters are keywords you include in an HTTP POST):
function main(params as object)
    ' Launch parameters arrive as an associative array passed to main()
    if params.parameter <> invalid then
        print "This channel was launched with Launch Parameters!"
        print params
    else
        print "launched without input parameters"
    end if

    ' Listen for roInput events sent over ECP while the channel is running
    port = CreateObject("roMessagePort")
    input = CreateObject("roInput")
    input.SetMessagePort(port)

    while true
        msg = wait(100, port)
        if type(msg) = "roInputEvent" then
            params = msg.GetInfo()
            print params
        end if
    end while
end function
So your parameters might be "vidurl=http://myserver.com/video300k.mp4&vidurl=http://myserver.com/video600k.mp4" if you wanted to send multiple bit-rate videos.
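On the sending side, that launch is just an HTTP POST to ECP on port 8060. A minimal sketch, where the Roku IP, the "dev" channel ID (a side-loaded channel) and the video URLs are placeholders:

// Sketch: launch a side-loaded Roku channel over ECP, passing video URLs
// as launch parameters. IP address, channel ID and URLs are placeholders.
const rokuIp = "192.168.1.50";
const channelId = "dev";  // side-loaded channels are addressed as "dev"

const params = new URLSearchParams();
params.append("vidurl", "http://myserver.com/video300k.mp4");
params.append("vidurl", "http://myserver.com/video600k.mp4");

// ECP listens on port 8060; launch parameters ride along as the query string of a POST.
fetch(`http://${rokuIp}:8060/launch/${channelId}?${params.toString()}`, { method: "POST" })
  .then((res) => console.log(`Launch request returned HTTP ${res.status}`))
  .catch(console.error);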
There are plenty of examples of how to play video on a Roku in the Roku SDK, the simplest being the simplevideoplayer example.
As to the last part of the question regarding UPnP, you can find a Roku on your LAN either via brute force (telnetting to port 8060 on every IP) or by using SSDP, which is also documented in the ECP guide linked above.
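If you go the SSDP route, discovery is a multicast M-SEARCH for the "roku:ecp" search target. A rough Node sketch, with no error handling:

// Sketch: discover Roku devices on the LAN via SSDP (M-SEARCH for "roku:ecp").
import dgram from "node:dgram";

const SSDP_ADDRESS = "239.255.255.250";
const SSDP_PORT = 1900;

const request = [
  "M-SEARCH * HTTP/1.1",
  `HOST: ${SSDP_ADDRESS}:${SSDP_PORT}`,
  'MAN: "ssdp:discover"',
  "ST: roku:ecp",
  "MX: 3",
  "", ""
].join("\r\n");

const socket = dgram.createSocket("udp4");
socket.on("message", (msg, rinfo) => {
  // Responding Rokus include a LOCATION header pointing at their ECP base URL.
  console.log(`Reply from ${rinfo.address}:\n${msg.toString()}`);
});
socket.send(request, SSDP_PORT, SSDP_ADDRESS);

// Stop listening after a few seconds.
setTimeout(() => socket.close(), 5000);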

Grab all frames from IP camera

I have a Sony IPELA snc-ch180 camera. It's a network camera.
Is there a way I can save all frames from the live images onto my hard drive?
Any idea what software can do this?
Yes, you can. You need to check the camera's reference documentation to find out which protocols it supports; some of them are likely to be well-known ones (alternatively, you can reverse engineer them using a network sniffer). Then you implement a client that connects to the camera and receives the video stream. Having received it, you parse it, decode it into frames, decompress/recompress if necessary, and save them to the hard drive.
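Rather than writing that client from scratch, an existing tool such as ffmpeg can already handle the connect/decode/save steps. The sketch below just drives it from a small script, assuming the camera exposes an RTSP stream; the URL, credentials and stream path are placeholders, so check the SNC-CH180 documentation for the real ones.

// Sketch: save one JPEG frame per second from an IP camera's RTSP stream
// by delegating the protocol and decoding work to ffmpeg.
// The RTSP URL, credentials and output pattern are placeholders.
import { spawn } from "node:child_process";

const rtspUrl = "rtsp://user:password@192.168.1.100/media/video1";

const ffmpeg = spawn("ffmpeg", [
  "-rtsp_transport", "tcp",  // TCP is usually more reliable than UDP on a LAN
  "-i", rtspUrl,             // input: the camera's live stream
  "-vf", "fps=1",            // keep one frame per second (raise this for more)
  "-q:v", "2",               // JPEG quality
  "frame_%06d.jpg"           // numbered output files on the local disk
]);

ffmpeg.stderr.on("data", (chunk) => process.stderr.write(chunk));
ffmpeg.on("close", (code) => console.log(`ffmpeg exited with code ${code}`));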