Android WebRTC: stream data from a source other than the camera directly to a web browser

Use case - I am using the Android projection API to capture my device's screen. The output is displayed in a SurfaceView. Next, I want to stream the SurfaceView content to a web browser using WebRTC.
I have seen lots of examples that use the device camera and stream its output to a web browser. How can I stream video that is being rendered to a SurfaceView/TextureView to a web browser?
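A minimal sketch of one way to do this, assuming the prebuilt org.webrtc Android library (e.g. the google-webrtc AAR): instead of trying to re-capture pixels from your own SurfaceView, hand the MediaProjection permission Intent to ScreenCapturerAndroid, which turns the captured screen into an ordinary WebRTC VideoTrack. The resolution, frame rate, thread name and track id below are placeholders.

import android.content.Context
import android.content.Intent
import android.media.projection.MediaProjection
import org.webrtc.*

// Builds a WebRTC video track backed by the screen capture instead of a camera.
// permissionResultData is the Intent delivered to onActivityResult after launching
// MediaProjectionManager.createScreenCaptureIntent().
fun createScreenShareTrack(
    context: Context,
    factory: PeerConnectionFactory,
    eglBase: EglBase,
    permissionResultData: Intent
): VideoTrack {
    // ScreenCapturerAndroid wraps MediaProjection as a WebRTC VideoCapturer.
    val capturer = ScreenCapturerAndroid(
        permissionResultData,
        object : MediaProjection.Callback() {
            override fun onStop() { /* the user revoked the projection */ }
        }
    )

    // isScreencast = true hints the encoder to favour detail over motion.
    val videoSource = factory.createVideoSource(/* isScreencast = */ true)
    val helper = SurfaceTextureHelper.create("ScreenCaptureThread", eglBase.eglBaseContext)
    capturer.initialize(helper, context, videoSource.capturerObserver)
    capturer.startCapture(1280, 720, 30) // width, height, fps placeholders

    // Add this track to your PeerConnection; the browser renders it like a camera track.
    return factory.createVideoTrack("screen0", videoSource)
}

The signalling and the browser side stay exactly the same as in the camera examples; only the capturer changes. If you specifically need to stream arbitrary frames you draw yourself, the more involved route is a custom VideoCapturer that pushes frames through the CapturerObserver.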

Related

Agora Web RTC SDK video zoomed in and back camera is not being mirrored on mobile browser

I am using Agora Web RTC SDK to implement a live stream scenario. The video is being zoomed in automatically and on a mobile browser the back camera is not being mirrored.
I have tried setting the encoder configuration when creating the camera video track with createCameraVideoTrack, as follows:
const videoTrack = await AgoraRTC.createCameraVideoTrack({
    encoderConfig: "720p_1"
});
But I am not getting anywhere: the video is mirrored on the remote viewer's screen and the aspect ratio is all messed up.
Can someone help, please?

How to integrate with Google home hub to get camera stream?

For now, the only media target is the Chromecast service, per the documentation here: https://developers.google.com/assistant/smarthome/traits/camerastream.html#response-nodejs
But according to the documentation here, you can stream video from these security cameras to your Google Home Hub:
https://support.google.com/googlenest/answer/9137164?hl=en-AU&ref_topic=7029100
So how do I integrate with the Google Home Hub to get a camera stream?
Thank you!
If you follow the guide on CameraStream, it will show you how to integrate with any cast target. The Google Home Hub is a cast target, and a security camera will stream to it if the Hub is the target receiver.
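For reference, a sketch of the device states a fulfillment returns when the Assistant executes action.devices.commands.GetCameraStream, written here as a Kotlin map purely for illustration. The field names follow the CameraStream trait documentation; the URL, receiver app id and token are placeholders.

// Illustrative only: the "states" object returned from the EXECUTE handler for
// GetCameraStream. The Home Hub (a Cast target) loads cameraStreamAccessUrl;
// cameraStreamReceiverAppId optionally names a custom Cast receiver app.
val getCameraStreamStates = mapOf(
    "cameraStreamAccessUrl" to "https://example.com/streams/front-door.m3u8", // placeholder stream URL
    "cameraStreamReceiverAppId" to "1A2B3C4D",                                // placeholder custom receiver
    "cameraStreamAuthToken" to "opaque-token"                                 // placeholder token
)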

Agora SDK compatibility with Safari - both macos and ios

I am building a PWA with the Agora broadcasting API. I managed to get the video stream playing in desktop Chrome, but not in Safari. The documentation says Safari is supported on both macOS and iOS, but that doesn't seem to be the case.
When I open the client page in Safari, instead of playing the video stream, it just creates a video player with no content. I don't see any data being streamed in the inspector view; in fact, there doesn't seem to be any activity going on at all.
Do I need to do something different with Safari?
Agora.io provided an auto-diagnostic page for their Web SDK, which may be useful for you:
agora_webrtc_troubleshooting

WebRTC won't play GarageBand-manipulated sound after it is redirected with Soundflower; only failing in Chrome

I am writing a web-based app that requires real-time audio manipulation, specifically a pitch shift of the user's voice. For now my prototype uses GarageBand to pitch-shift and Soundflower to redirect the audio as the input audio source in the browser. Then, using WebRTC (the SimpleWebRTC library), I send the user's webcam video and the manipulated audio stream to other browsers. This works great in Firefox, but I have not had luck with Chrome: the video channel is received fine, but the audio is just silent. Any ideas?

Can iOS devices stream m3u8 segmented video from the local file system using HTML5 video and PhoneGap/Cordova?

I'm using PhoneGap/Cordova 2.1 and my app has locally stored assets (on the device) which need to be encrypted at build time and decrypted in memory at runtime. The app is for iPad only.
For the videos I want to implement something similar to http://codebycoffee.com/2012/01/08/decrypting-http-streaming-video-with-nsurlprotocol where mediafilesegmenter is used to segment and encrypt each video file and then a custom encrypted file url protocol serves the key for each video when the video player requests the m3u8 file.
My problem is that I can't seem to play m3u8 files using the HTML5 video tag under PhoneGap/Cordova. I have even created unencrypted video segments with mediafilesegmenter as a test. These play fine when the m3u8 file is opened with VLC on OS X, but when using the video tag in PhoneGap/Cordova I get a 'loading...' message followed by a popup saying "The operation could not be completed".
OK, so it turns out that m3u8 files have to be served over HTTP, and I have gone down the route of bundling CocoaHTTPServer (https://github.com/robbiehanson/CocoaHTTPServer) into the app. This way I can request the video streams with:
<video src="http://127.0.0.1:12345/path.m3u8"></video>