How can we add pause or mute functionality to the injected RTMP stream?

So I am using the RTMP Injection sample app. Is there a way to add 'mute' and 'pause' controls to the injected stream?

Read this blog to learn how to implement a live broadcast using RTMP Injection: https://www.agora.io/en/blog/how-to-build-a-live-broadcasting-web-app/.
The agora-broadcast-client.js file has the code for toggling the stream (play and pause) as well as muting and unmuting.
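For reference, a condensed sketch of the same controls using the Agora Web SDK v3 Stream API might look like the following. Here, client is an AgoraRTC client created elsewhere, the container id "injected-stream" and the helper names are hypothetical, the sketch assumes the client has already subscribed to the injected feed (which the SDK delivers as a remote stream with the reserved uid 666), and these controls act on local playback of that stream:

var injectedStream = null;
var muted = false;
var playing = true;

client.on("stream-subscribed", function (evt) {
  // the injected RTMP feed is delivered as a remote stream with uid 666
  if (evt.stream.getId() === 666) {
    injectedStream = evt.stream;
    injectedStream.play("injected-stream"); // id of a container div
  }
});

function toggleMute() {
  if (!injectedStream) return;
  muted ? injectedStream.unmuteAudio() : injectedStream.muteAudio();
  muted = !muted;
}

function togglePause() {
  if (!injectedStream) return;
  playing ? injectedStream.stop() : injectedStream.play("injected-stream");
  playing = !playing;
}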

Related

capture MediaStream from a local video file in React Native

The purpose is to play a local video on the host side and stream it on the participants' side. The video should pause and seek for everyone when the host does it.
For this, I want to capture a stream (of MediaStream type) of a local video file while it is playing and pass it into WebRTC.
Just like on the web we have a captureStream() method to capture stream from video or canvas, do we have anything similar in react-native? Or any other way to achieve the same goal?
I could not find a relevant solution with the RTCView of react-native-webrtc or with react-native-video. Any kind of solution or suggestion would be helpful.
Thank you in advance.
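For context, the captureStream() API mentioned above works like this in browser JavaScript (a minimal sketch; the element id is hypothetical and pc is assumed to be an existing RTCPeerConnection):

// capture the output of a playing <video> element as a MediaStream
var video = document.getElementById("localVideo"); // hypothetical id
var stream = video.captureStream(); // mozCaptureStream() in Firefox

// hand the captured tracks to the peer connection
stream.getTracks().forEach(function (track) {
  pc.addTrack(track, stream);
});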

WebRTC Audio Streaming without attaching to an HTML element

With reference to this tutorial: WebRTC - Voice Demo.
The audio stream is attached to an <audio /> element for both Local and Remote audio.
I am a bit confused now. Aren't we supposed to connect the local audio to a mic, instead of playing it?
Besides, for remote audio, is it possible for me to play the audio only without attaching it to an <audio /> element?
No, you need to create an audio element to play local audio as well.
The moment you call navigator.webkitGetUserMedia (navigator.mediaDevices.getUserMedia in modern browsers) with the constraints { video: false, audio: true }, you get your mic stream as a parameter of the callback. Then, in order to play it, you need to attach it to an audio element, just as you would for remote audio; the remote audio stream, however, will come from the ontrack event instead.
But honestly, there is no reason for you to play your own local stream, since the sound of your voice will overlap with the sound from the audio player. I guess they did it in the tutorial for demonstration purposes.
By the way, the tutorial you referred to looks quite old, so if things don't work, that might be the reason.
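A minimal sketch of both cases with the current navigator.mediaDevices.getUserMedia API (the element ids are hypothetical and pc is assumed to be an existing RTCPeerConnection):

// local: capture the mic, optionally play it back, and send it to the peer
navigator.mediaDevices.getUserMedia({ video: false, audio: true })
  .then(function (micStream) {
    document.getElementById("localAudio").srcObject = micStream; // playback (optional)
    micStream.getTracks().forEach(function (track) {
      pc.addTrack(track, micStream); // send the mic to the remote peer
    });
  });

// remote: the incoming stream arrives on the peer connection's ontrack event
pc.ontrack = function (evt) {
  document.getElementById("remoteAudio").srcObject = evt.streams[0];
};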

WebRTC: stream a local video file

How would one stream a local media file (a video file) to peers? (I am using janus-gateway with the videoroom plugin for this.)
For audio there is the Web Audio API, but what about video?
Thanks!
Update: Maybe someone has an example? Or a small code snippet? Maybe a link to some lib?
Render the local video onto a canvas and create a stream object from the canvas element.
You can then add that stream to a PeerConnection.
The stream will then be sent to the remote peer (Janus, a browser, or any server).
Demo: https://webrtc.github.io/samples/src/content/capture/canvas-pc/
Source: https://github.com/webrtc/samples/blob/gh-pages/src/content/capture/canvas-pc/js/main.js#L45
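In condensed form, the approach looks roughly like this (a sketch; the element ids are hypothetical and pc is assumed to be an existing RTCPeerConnection):

var video = document.getElementById("localFile"); // the playing local file
var canvas = document.getElementById("mirror");
var ctx = canvas.getContext("2d");

// copy the video onto the canvas frame by frame
function draw() {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  requestAnimationFrame(draw);
}
video.addEventListener("play", draw);

// capture the canvas as a MediaStream and send it over the peer connection
var stream = canvas.captureStream(25); // 25 fps
stream.getTracks().forEach(function (track) {
  pc.addTrack(track, stream);
});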

WebRTC video stream stop sharing

I have created a WebRTC-based video chat using PeerJS.
The local and remote video elements are created with the following tags:
local:
<video id="[local_peer_id]" autoplay="true" controls="true">
remote:
<video id="[remote_peer_id]" autoplay="true" controls="true">
During the video chat, if the local user mutes audio, the remote user cannot hear anything; that part works perfectly.
The problem is with video. If the local user pauses his own video, he can see that it is paused, but the remote user still sees his video live.
On the other hand, if the remote user pauses his video, the local user still sees it live.
Can anyone tell me what I need to do to implement "pause" and "resume" video that works in real time for both peers?
You need to know the difference between the HTML tags and the WebRTC streams...
You can have streams running without having them attached to any HTML tag, and the media can still be sent and received by each peer. So each peer can attach the stream to an audio/video tag, and the tag will only act as a player for a stream that is already running.
So, if you mute the HTML tag, you will only be muting the player, not the stream. If you want anything to take effect on the other peer, you need to act on the stream or on the peer connection.
In particular, to mute and resume audio or video, toggle the media tracks in the media stream:
// create a button to toggle video
var button = document.createElement("button");
button.appendChild(document.createTextNode("Toggle Hold"));
button.onclick = function () {
  // disabling the track stops its media from reaching the remote peer,
  // so the pause takes effect on both sides
  mediaStream.getVideoTracks()[0].enabled =
    !mediaStream.getVideoTracks()[0].enabled;
};
To pause/resume audio, use getAudioTracks() instead.
Calling mediaStream.stop() will stop the camera, where mediaStream is the stream you got when calling getUserMedia. Note that MediaStream.stop() has since been deprecated; under the current standard you stop the individual tracks instead. Here is a working example:
mediaStream.getAudioTracks()[0].stop();
mediaStream.getVideoTracks()[0].stop();
Hope this works with the new standards. It's working fine in my app.

What's the recommended way to programmatically play an audio stream using the iPhone SDK?

I have an MP3 on a remote server. I want to play it as a stream, using my own UI and my own ViewController.
How would you recommend doing that?
I used the code from this page to build a streamer; it worked well and was easy to implement.
http://cocoawithlove.com/2009/06/revisiting-old-post-streaming-and.html