Gems/packages/APIs for generating a virtual webcam to stream a video file as a webcam

Goal: Create a testing framework that uses a video file to mimic users streaming their webcam video to our server.
Thoughts so far: The two approaches I'm trying to evaluate are generating my own virtual webcam package, or finding a way to inject a video file into the webcam as a live stream.
Anyone got any suggestions? I'm trying to find options for Ruby, C#, or Python.

So the only easy solution I found for this in the end is in ChromeDriver. You can pass the following options to ChromeDriver when it is launched:
--use-file-for-fake-video-capture=<filename>
--use-file-for-fake-audio-capture=<filename>
--use-fake-ui-for-media-stream
--use-fake-device-for-media-stream
This lets you feed an audio file and a video file in as the browser's default fake audio/video devices.
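For example, with Selenium's Python bindings you could wire these flags up roughly like this (the file paths and test-page URL are placeholders; as far as I know Chrome expects the fake video file in Y4M or MJPEG format and the fake audio file as a WAV):

from selenium import webdriver

options = webdriver.ChromeOptions()
options.add_argument("--use-fake-device-for-media-stream")
options.add_argument("--use-fake-ui-for-media-stream")
# Placeholder paths: the fake capture files stand in for the user's webcam/mic
options.add_argument("--use-file-for-fake-video-capture=/path/to/sample.y4m")
options.add_argument("--use-file-for-fake-audio-capture=/path/to/sample.wav")

driver = webdriver.Chrome(options=options)
driver.get("https://example.com/page-that-calls-getUserMedia")  # hypothetical page under test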

Related

How to turn a webcam into an RTSP stream

I have a product that can analyze video once it is given an RTSP URL.
I would like to use a webcam as the source and feed my product the webcam's RTSP stream.
How can I do that?
It will depend on the webcam you are using: most support RTSP, but many do not publish the interface to access the stream, as they are designed to be used with the webcam's own companion app.
There are some web resources which provide the RTSP URLs for common webcams. You may find it hard to find a match as new versions of webcams roll out, but they should give you a feel for how to access a vendor's camera if you have a specific webcam you are testing against. Some examples (at the time of writing):
https://www.getscw.com/decoding/rtsp
https://soleratec.com/get-support/rtsp/
If you can't find the info for the camera you are using, and you have the companion app, you can also use a network sniffer tool like Wireshark (https://www.wireshark.org) and search the traffic for the 'rtsp://' pattern.
If you just need to test your app and have access to a Raspberry Pi with a camera module, you can also use that to generate an RTSP stream. There are several approaches for this, but one I have found reliable is v4l2rtspserver:
https://github.com/mpromonet/v4l2rtspserver
There are specific instructions for setting it up on the Pi (https://github.com/mpromonet/v4l2rtspserver/wiki/Setup-on-Pi), and you can verify it is working using VLC player on a laptop etc. before testing in your specific application.
There are also a small number of test RTSP URLs available on the web; the most reliable seems to be the one at this link provided by Wowza (again, link valid at the time of writing):
https://www.wowza.com/html/mobile.html
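Whichever source you end up with (your own webcam, a Pi running v4l2rtspserver, or a public test URL), a quick way to smoke-test it before pointing your product at it is to grab a single frame with OpenCV in Python; the URL below is just a placeholder:

import cv2  # pip install opencv-python

rtsp_url = "rtsp://192.168.1.50:8554/unicast"  # placeholder - substitute your camera/server URL

cap = cv2.VideoCapture(rtsp_url)
ok, frame = cap.read()
cap.release()

if ok:
    print("Stream is reachable, first frame size:", frame.shape)
else:
    print("Could not read a frame - check the URL, credentials, and network")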

webcam cannot be accessed by WSL

I'm very new to WSL. I want to run some Python code in an Ubuntu shell on my Windows 10 PC. The code needs access to the webcam, but it seems that the webcam is not opened properly. I have checked online and found several posts from 1-2 years ago which said that the integrated webcam cannot be accessed by WSL. Is there any update or trick for using the webcam with WSL?
Many thanks!
You still can't access the webcam directly in WSL, but there are several ways to get at its video stream. The most obvious one is converting your webcam output to a network streaming protocol such as RTSP or MPEG (for example, via FFmpeg), which gives you an almost identical experience.
Also, there are many applications that can expose your webcam as an IP-camera stream; you can then use that stream URL instead of the camera index in OpenCV or any other application.
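As a rough sketch of that last point, assuming you have already exposed the Windows webcam as a network stream (the URL below is hypothetical), the only change on the OpenCV side is passing the URL where you would normally pass the camera index:

import cv2

# cap = cv2.VideoCapture(0)  # camera index - does not work under WSL
stream_url = "http://192.168.1.20:8080/video"  # hypothetical IP-cam / MJPEG URL from a streaming app
cap = cv2.VideoCapture(stream_url)

frames = 0
while frames < 100:  # read a bounded number of frames as a sanity check
    ok, frame = cap.read()
    if not ok:
        break
    frames += 1
cap.release()
print("Read", frames, "frames from the stream")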

How can an RTSP stream be loaded into an HTML5 player to support multiple operating systems and browsers?

I am trying to play an RTSP URL in the browser. I used Video.js to do it, but I can't play it, or the browser is not streaming it.
I tested in IE, Chrome, and Firefox.
Could you please help me find a solution?

Creating a WebRTC receiver

I am new to WebRTC and trying to figure out how to create a program outside a browser which receives a WebRTC audio stream and outputs it on the speakers.
Are there any WebRTC libraries for Java or C#?
The receiver will be running on a Linux machine.
--
I've been thinking about using getUserMedia() to access the microphone. But then:
In what format will such a stream be transmitted?
Let's say I use WebRTC2SIP and build a Java endpoint using JSIP;
or I just use a socket and send the stream over HTTP.
What audio format will I get on the receiver side? So far I have read that WebRTC compresses the stream somehow.
I guess there are two ways for you:
1. Build the whole WebRTC voice engine for Android/iOS or Mac etc., and just use the API provided by the VoE.
2. Build standalone NS/VAD/AECM/AGC modules and use them in your project. For example, if you build a standalone NS (noise suppression) module for Android, you use AudioRecord (Java layer, Android side) to record sound from the mic, do the noise-suppression processing on that data (JNI layer, WebRTC side), and finally play back the processed data using AudioTrack (Java layer, Android side).
EDIT:
For the second option, the format is raw PCM data.
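To make that concrete: if you end up with a dump of the raw PCM samples, you can wrap them in a WAV header with Python's standard wave module so any player can open them. This is just a sketch assuming 16-bit signed, mono, 16 kHz samples and hypothetical file names; adjust the parameters to whatever your capture actually used:

import wave

with open("captured.pcm", "rb") as f:  # hypothetical dump of the processed PCM samples
    raw_pcm = f.read()

with wave.open("captured.wav", "wb") as wav:
    wav.setnchannels(1)       # mono
    wav.setsampwidth(2)       # 16-bit samples = 2 bytes each
    wav.setframerate(16000)   # 16 kHz sample rate - match your capture settings
    wav.writeframes(raw_pcm)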
Check out the working audio demo and code at demo.easyrtc.com.
The code is all open source and can be checked out at https://github.com/priologic/easyrtc
You can look for any known issues around EasyRTC on our forum at https://groups.google.com/forum/#!forum/easyrtc
Also check out our main site at easyrtc.com.

play music stream with gstreamer-sharp

I am looking for an example showing me how to play an MP3 stream from a URL.
I am trying to build a command-line client for Apache using Mono with GStreamer.
So far I haven't found any clue how to use gstreamer-sharp. Does anybody have any experience?
Have a look at the Banshee source code; we have a GStreamerSharp backend.
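If it helps as a reference point, the underlying GStreamer pipeline is tiny; here is the same idea using GStreamer's Python bindings (the stream URL is a placeholder), and the GStreamerSharp calls map onto the same elements and properties:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# playbin resolves the URI, picks a decoder for the MP3 stream, and plays it
player = Gst.ElementFactory.make("playbin", "player")
player.set_property("uri", "http://example.com/stream.mp3")  # placeholder stream URL
player.set_state(Gst.State.PLAYING)

loop = GLib.MainLoop()
bus = player.get_bus()
bus.add_signal_watch()

def on_message(bus, message):
    # Stop on end-of-stream or error
    if message.type in (Gst.MessageType.EOS, Gst.MessageType.ERROR):
        player.set_state(Gst.State.NULL)
        loop.quit()

bus.connect("message", on_message)
loop.run()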