Disabling Local Stream on Remote Side after Call is connected via WebRTC in Android - webrtc

I'm trying to isolate video and audio. I am able to control the video feed from the caller side, but I am unable to turn off the local video stream on the remote side when it's an audio-only call. Any suggestions on how to isolate the video and audio feeds? Simply removing the streams obtained via getStream doesn't work.
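A common fix is to disable the local video track rather than removing the whole stream: a disabled video track keeps the connection negotiated but stops sending camera frames. Below is a minimal sketch using the browser WebRTC API (on Android, the `org.webrtc` analogue is calling `setEnabled(false)` on the local video track); `pc` is assumed to be your existing peer connection:

```typescript
// Sketch: toggle the outgoing video track instead of removing the stream.
// A disabled video track sends no camera frames but requires no renegotiation,
// so the call cleanly degrades to audio-only and back.
function setAudioOnly(pc: RTCPeerConnection, audioOnly: boolean): void {
  for (const sender of pc.getSenders()) {
    if (sender.track && sender.track.kind === "video") {
      sender.track.enabled = !audioOnly; // Android analogue: videoTrack.setEnabled(!audioOnly)
    }
  }
}
```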

Related

How to stream video from Inskam endoscope to PC via Wi-Fi?

I have an Inskam Wi-Fi endoscope.
After launch, it starts its own Wi-Fi network. To see the video stream, you connect to that network with a phone that has the Inskam application for Android or iOS installed.
But I need to capture the video stream on my PC.
I think the camera runs a streaming application, and my idea is to access the streaming resource directly.
I've tried to access it from my PC after connecting to the camera's Wi-Fi network:
http://192.168.29.102:8080/
http://192.168.29.102:8080/?action=stream
http://192.168.1.1:8080/
http://192.168.1.1:8080/?action=stream
The connection fails.
How can I capture the camera's video stream on a PC?
Do you want to use your PC to play the stream from your Wi-Fi endoscope, which only provides an iOS/Android app to watch the stream?
Maybe there is another solution: use iOS or Android to play the stream, then use OBS to capture the screen of your mobile phone (see the screen-capture tutorials for iOS or Android). You can then use OBS to stream it anywhere.
It works like this:
Wi-Fi endoscope ---Wi-Fi--> iOS/Android ---USB--> OBS (PC)
OBS runs on your PC, where you can record the stream.
If you want to broadcast the stream to the internet or to other mobile phones, you can use OBS to publish it to a media server (such as SRS) or a live streaming platform, like below:
OBS (PC) ---RTMP--> SRS/YouTube/Twitch ---> Players (PC/Mobile)
It enables multiple users to watch the stream.

How to send a camera stream to an RTMP server with React Native

I am trying to build a live-stream video app.
I built an RTMP server that is ready for publishing and playing streams. I need a way to capture the mobile user's camera and send the live stream to my RTMP server.
I use React Native on the client side. I found react-native-camera, which handles the camera well, but I couldn't find any event/API in its documentation for accessing the camera stream.
Another problem is how to send the stream to the RTMP server. I have no knowledge in this area, so any kind of help will be appreciated.
For anyone else who faces the same issue, this repo is the solution:
https://github.com/NodeMedia/react-native-nodemediaclient
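A minimal publisher sketch based on that repo's README (the RTMP URL and encoder settings below are illustrative assumptions; check the README for the current prop names):

```tsx
import React, { useRef } from "react";
import { Button, View } from "react-native";
import { NodeCameraView } from "react-native-nodemediaclient";

export default function Publisher() {
  const cameraRef = useRef<any>(null);

  return (
    <View style={{ flex: 1 }}>
      <NodeCameraView
        style={{ flex: 1 }}
        ref={cameraRef}
        outputUrl={"rtmp://your.server/live/stream1"} // your RTMP ingest URL (assumption)
        camera={{ cameraId: 1, cameraFrontMirror: true }}
        audio={{ bitrate: 32000, profile: 1, samplerate: 44100 }}
        video={{ preset: 1, bitrate: 500000, profile: 1, fps: 15, videoFrontMirror: false }}
        autopreview={true}
      />
      {/* start() begins encoding the camera/mic and publishing to outputUrl */}
      <Button title="Go live" onPress={() => cameraRef.current?.start()} />
      <Button title="Stop" onPress={() => cameraRef.current?.stop()} />
    </View>
  );
}
```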

Live streaming audio with WebRTC browser => server

I'm trying to send an audio stream from my browser to a server (UDP; I also tried WebSockets).
I'm recording the audio stream with WebRTC, but I'm having trouble transmitting the data from a Node.js client to my server.
Any ideas? Is it possible to send an audio stream to the server using WebRTC (OpenWebRTC)?
To get audio from the browser to the server, you have a few different possibilities.
Web Sockets
Simply send the audio data over a binary web socket to your server. You can use the Web Audio API with a ScriptProcessorNode to capture raw PCM and send it losslessly. Or, you can use the MediaRecorder to record the MediaStream and encode it with a codec like Opus, which you can then stream over the Web Socket.
There is a sample for doing this with video over on Facebook's GitHub repo. Streaming audio only is conceptually the same thing, so you should be able to adapt the example.
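A minimal sketch of the MediaRecorder variant (the WebSocket URL is an assumption; the server just needs to accept binary messages and handle the WebM/Opus chunks):

```typescript
// Capture the microphone, encode with Opus via MediaRecorder, and push
// each encoded chunk over a binary WebSocket.
async function streamMicToServer(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ws = new WebSocket("wss://example.com/audio-ingest"); // assumed endpoint

  ws.onopen = () => {
    const recorder = new MediaRecorder(stream, {
      mimeType: "audio/webm;codecs=opus",
    });
    recorder.ondataavailable = (e: BlobEvent) => {
      // Each chunk is a Blob holding a slice of the WebM/Opus stream.
      if (e.data.size > 0 && ws.readyState === WebSocket.OPEN) {
        ws.send(e.data); // WebSocket.send accepts Blobs directly
      }
    };
    recorder.start(250); // emit a chunk roughly every 250 ms
  };
}
```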
HTTP (future)
In the near future, you'll be able to use a ReadableStream as the request body with the Fetch API, allowing you to make a normal HTTP PUT with a streaming source from a browser. This is essentially the same as what you would do with a Web Socket, just without the Web Socket layer.
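A sketch of such a streaming upload with a ReadableStream body (Chromium-based browsers require the duplex: "half" option for streaming request bodies; the endpoint URL is an assumption):

```typescript
// PUT a continuous byte stream to the server over plain HTTP.
async function uploadAudioStream(source: ReadableStream<Uint8Array>): Promise<void> {
  await fetch("https://example.com/audio-ingest", {
    method: "PUT",
    headers: { "Content-Type": "application/octet-stream" },
    body: source,
    duplex: "half", // required for streaming bodies; not yet in all TS lib typings
  } as RequestInit);
}
```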
WebRTC (data channel)
With a WebRTC connection and the server as a "peer", you can open a data channel and send that exact same PCM or encoded audio that you would have sent over Web Sockets or HTTP.
There's a ton of complexity added to this with no real benefit. Don't use this method.
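For completeness, the mechanics are a few lines (a sketch only, given the advice above; `pc` is a peer connection whose remote peer is your server, signaling not shown):

```typescript
// Open a data channel and push the same PCM/Opus bytes you would have
// sent over a WebSocket.
function sendAudioOverDataChannel(
  pc: RTCPeerConnection,
  chunks: AsyncIterable<ArrayBuffer>
): void {
  const channel = pc.createDataChannel("audio", { ordered: true });
  channel.onopen = async () => {
    for await (const chunk of chunks) {
      channel.send(chunk);
    }
  };
}
```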
WebRTC (media streams)
WebRTC calls support direct handling of MediaStreams. You can attach a stream and let the WebRTC stack take care of negotiating a codec, adapting for bandwidth changes, dropping data that doesn't arrive, maintaining synchronization, and negotiating connectivity around restrictive firewall environments. While this makes things easier on the surface, that's a lot of complexity as well. There aren't any packages for Node.js that expose the MediaStreams to you, so you're stuck dealing with other software... none of it as easy to integrate as it could be.
Most folks going this route will execute GStreamer as an RTP server to handle the media component. I'm not convinced this is the best way, but it's the best way I know of at the moment.

Kurento media server not recording remote audio

I have extended the one-to-one call tutorial to support recording.
Original http://doc-kurento.readthedocs.io/en/stable/tutorials.html#webrtc-one-to-one-video-call
Extended https://github.com/gaikwad411/kurento-tutorial-node
Everything works except recording the remote audio.
When the caller and callee videos are recorded, the callee's voice is absent from the caller's recording and vice versa.
I have searched the Kurento docs and mailing lists but did not find a solution.
The workarounds I have in mind:
1. Use ffmpeg to combine the two videos.
2. Use composite recording; I will also need to combine the remote audio stream.
My questions are:
1) Why is this happening? I can hear the remote audio in the ongoing call, but not in the recording; in the recording I can hear only my own voice.
2) Is there another solution apart from composite recording?
This is perfectly normal behaviour. When you connect a WebRtcEndpoint to a RecorderEndpoint, you only get the media that the endpoint is pushing into the pipeline. As the endpoint is one peer of a WebRTC connection between the browser and the media server, the media that the endpoint pushes into the pipeline is whatever it receives from the browser that has negotiated that WebRTC connection.
The only options that you have, as you have stated already, are post-processing or composite mixing.
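For the composite route, a sketch with kurento-client's promise API (the recording URI is an assumption, error handling is omitted, and `pipeline`, `callerEp`, and `calleeEp` come from your existing one-to-one setup):

```typescript
// Mix both parties through a Composite hub and record the mixed output.
async function recordMixedCall(pipeline: any, callerEp: any, calleeEp: any): Promise<void> {
  const composite = await pipeline.create("Composite");
  const callerPort = await composite.createHubPort();
  const calleePort = await composite.createHubPort();
  const recorderPort = await composite.createHubPort();

  // Feed both WebRtcEndpoints into the mixer.
  await callerEp.connect(callerPort);
  await calleeEp.connect(calleePort);

  // Record the mixed stream: both videos composed, both audio tracks mixed.
  const recorder = await pipeline.create("RecorderEndpoint", {
    uri: "file:///tmp/call-mixed.webm", // assumed output location
  });
  await recorderPort.connect(recorder);
  await recorder.record();
}
```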

Real-time live streaming with an iPhone for robotics

For a research project, I developed an app to control a wheeled mobile robot using the gyro and the accelerometer of an iPhone. The robot has an IP address, and I control it by sending messages through a socket. Since the robot has to be controlled from anywhere in the world, I mounted a camera on top of it. I tried to stream the video from the camera using the HTTP Live Streaming protocol and VLC, but the latency is too high (15-30 seconds) to control the robot properly.
Now, VLC can stream over UDP or HTTP, but the question is: how do I decode the stream on the iPhone? How should I treat the data coming into the socket in order to display it as continuous live video?