WebRTC Safari integration test: How to mock getUserMedia video? - safari

I would like to implement an integration test that mocks a webcam video stream in the Safari browser. According to the WebRTC testing documentation, Chrome provides the --use-file-for-fake-video-capture=<filename> flag to fake the video stream, but I have not found any comparable documentation for Safari.
Is there a way to fake a webcam video for testing using the Safari browser?
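For reference, a sketch of the Chrome side of this (there is no documented Safari equivalent; the .y4m path below is a placeholder). These flags are typically passed at browser launch, e.g. via Selenium's ChromeOptions or Puppeteer's args:

```javascript
// Chrome's fake-capture flags; /path/to/clip.y4m is a placeholder path.
// --use-fake-ui-for-media-stream auto-accepts the camera permission prompt,
// --use-fake-device-for-media-stream swaps real devices for generated ones,
// and --use-file-for-fake-video-capture feeds a .y4m file as the camera.
const fakeCaptureArgs = [
  "--use-fake-ui-for-media-stream",
  "--use-fake-device-for-media-stream",
  "--use-file-for-fake-video-capture=/path/to/clip.y4m",
];
console.log(fakeCaptureArgs.join(" "));
```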

Related

Does Google's Handwriting API work in web browsers yet?

I am looking for a handwriting recognition API that I can embed in my web application.
I found this: Handwriting Recognition API from Google.
The post also refers to a Demo.
It is my understanding that this API works in all newer Chromium-based web browsers, e.g. Google Chrome or MS Edge.
I opened the demo in both browsers (Google Chrome 106.0.5249.119 and MS Edge 106.0.1370.42) on my Win-11-64-bit machine. Same error message in both browsers:
Handwriting recognition API is not supported on this platform.
Any idea why I get this error message? Is there anything I can do to fix this?

My browsers won't allow live streaming, video calling, or audio calling even after enabling WebRTC on all browsers

I am working on a social network platform and want to integrate live streaming as well as chat, voice, and video calling. They are all active, but I keep getting "Sorry, your browser does not have WebRTC" even after installing a WebRTC-enabling extension in all my browsers. It still says the same in Chrome, Firefox, and Microsoft Edge; the only browser working perfectly well is Apple Safari. I don't know how to fix this.
WebRTC only works in a secure (https) context. That is why your website isn't working. Deploy the application behind SSL/TLS.
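A quick sanity check along those lines (a minimal sketch, assuming a plain browser page): getUserMedia is only exposed in secure contexts (https or localhost), so feature-detecting it distinguishes "browser lacks WebRTC" from "page served insecurely":

```javascript
// Returns true only in a context that actually exposes WebRTC.
// Over plain http, navigator.mediaDevices is undefined in modern
// browsers, which produces exactly this "no WebRTC" symptom even
// though the browser itself supports it.
function webrtcAvailable() {
  return typeof navigator !== "undefined" &&
         !!navigator.mediaDevices &&
         typeof navigator.mediaDevices.getUserMedia === "function" &&
         typeof RTCPeerConnection !== "undefined";
}
console.log(webrtcAvailable());
```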

Use WebRtc to deliver hls to Safari on IOS

I found this GitHub repo which allows me to use WebRTC to deliver HLS over the WebRTC DataChannel.
The problem is that it does not support Safari on iOS.
The newest versions of Safari on iOS do support WebRTC data channels, and they have native support for HLS playback.
This is my problem: is it only possible to use WebRTC for data transfer in browsers that support Media Source Extensions, or can I also use WebRTC to deliver the .ts files to the Safari browser on iOS?
I am a developer of P2P Media Loader and we are working on iOS support right now. Hopefully, we will have a prototype soon.
On iOS Safari you can exchange video and audio data using WebRTC Data Channels, but you cannot feed that data into an HTML video element for playback without an API like Media Source Extensions.
We are currently testing a different approach to do that on iOS Safari.
iOS Safari doesn't support the Media Source Extensions API of the HTML5 video element.
Therefore, you cannot play synchronized audio and video on iOS Safari with any hand-rolled approach built on APIs such as Canvas, the Web Audio API, etc.
iOS Safari has two built-in methods for playing synchronized audio and video:
a. Native HLS playback
b. WebRTC PeerConnection
If you choose WebRTC PeerConnection, you will have to transcode the AAC audio used in HLS to the Opus audio required by WebRTC, and transmux the HLS stream into WebRTC.
Ugly, CPU-consuming, and fairly pointless. What does it buy you? Why not use native HLS playback on iOS Safari? But if you insist on option b, there are a number of software media servers that will do it for you.
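The decision described above can be sketched as a capability check (a minimal illustration, not a full player):

```javascript
// Pick the playback path per the constraints above: with Media Source
// Extensions you can fetch segments yourself (e.g. over a WebRTC Data
// Channel) and append them to a <video> element; without MSE, as on
// iOS Safari, fall back to native HLS playback of the .m3u8 URL.
function choosePlaybackPath() {
  if (typeof MediaSource !== "undefined") {
    return "mse";      // append fetched segments via a SourceBuffer
  }
  return "native-hls"; // set video.src to the playlist URL directly
}
console.log(choosePlaybackPath());
```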

Agora SDK compatibility with Safari - both macOS and iOS

I am building a PWA with the Agora broadcasting API. I managed to get the video stream playing in desktop Chrome, but not in Safari. The documentation says Safari is supported on both macOS and iOS, but that doesn't seem to be the case.
When I open the client page in Safari, instead of playing the video stream, it just creates a video player without content. I don't see any data being streamed in the inspector view; there isn't any activity going on at all.
Do I need to do something different with Safari?
Agora.io provides an auto-diagnostic page for their Web SDK, which may be useful for you:
agora_webrtc_troubleshooting
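One Safari-specific pitfall worth ruling out (a hedged guess, not Agora-specific): Safari's autoplay policy blocks unmuted programmatic playback without a user gesture, which can leave exactly this kind of empty player. A minimal sketch of attaching a remote MediaStream follows; attachRemoteStream and its arguments are illustrative names, not Agora SDK calls:

```javascript
// Attach a remote WebRTC MediaStream so Safari will render it: muted
// playback is exempt from the autoplay gesture requirement, and the
// playsinline attribute prevents iOS from forcing fullscreen playback.
function attachRemoteStream(videoEl, mediaStream) {
  videoEl.srcObject = mediaStream;
  videoEl.muted = true;
  videoEl.setAttribute("playsinline", "");
  return videoEl.play(); // a Promise; rejection means autoplay was blocked
}
```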

WebRTC won't play GarageBand-manipulated sound after redirecting with Soundflower - only broken in Chrome

I am writing a web-based app that requires real-time audio manipulation, specifically a pitch shift of a user's voice. For now my prototype uses GarageBand to pitch-shift and Soundflower to redirect the audio as my input audio source in the browser. Then, using WebRTC (the SimpleWebRTC library), I send a user's webcam video and the manipulated audio stream to other browsers. This works great in Firefox, but I have not had luck with Chrome: the video channel is received fine, but the audio is silent in Chrome. Any ideas?
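Not a confirmed fix, but one hedged thing to try: Chrome applies audio processing (echo cancellation, noise suppression, auto gain control) to getUserMedia audio by default, and that processing can mute or mangle loopback audio coming from a virtual device like Soundflower. Disabling it with standard MediaTrackConstraints looks like:

```javascript
// Standard constraint names from the Media Capture spec that ask the
// browser for the unprocessed track. Pass this object to
// navigator.mediaDevices.getUserMedia(...) instead of
// { audio: true, video: true }.
const constraints = {
  video: true,
  audio: {
    echoCancellation: false,
    noiseSuppression: false,
    autoGainControl: false,
  },
};
console.log(JSON.stringify(constraints.audio));
```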