Twilio remote video is dark on iOS Safari

I am using Twilio Video to build a video/chat application, and the remote video tracks render dark on iOS Safari (using Safari Technology Preview), as shown in the picture below. (One-way video.)
I suspected the browser's autoplay policy, but I don't think that is the case, since the audio track plays while the video track remains dark.
I also make sure the user presses a "Join Call" button to guarantee a user interaction; that interaction renders the component which runs this React useEffect.
The video codec is H264, to ensure all Safari users can join the Room (Group Room):
useEffect(() => {
  // Only connect once we have a token and room name and are not already connected or connecting
  const canConnectToRoom = !room && !isConnectingToRoom && token && roomName
  if (canConnectToRoom) {
    connectToRoom(token, {
      video: false,
      name: roomName,
    })
  }
}, [room, isConnectingToRoom, token, roomName, connectToRoom])
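For context, a minimal sketch of how a connectToRoom wrapper like the one above might pass the H264 preference to twilio-video; only connect() and its documented options (name, audio, video, preferredVideoCodecs) come from the library, the rest is an assumption:

import { connect } from 'twilio-video'

// Illustrative sketch only: assumes the app's connectToRoom helper forwards its
// options to twilio-video's connect().
async function connectToRoom(token, options) {
  const room = await connect(token, {
    name: options.name,
    audio: true,
    video: options.video,            // false in the effect above: no local video is published
    preferredVideoCodecs: ['H264'],  // prefer H264 so Safari participants can decode the stream
  })
  return room
}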
Any help would be appreciated, thanks.
UPDATE 1:
TO RECEIVE THE REMOTE VIDEO STREAM
1. Render a RemoteParticipant component:
The collapsed code renders a fallback UI when the remote camera is disabled. That fallback shows correctly when the remote camera is off, but when the remote camera is enabled I only get a dark screen.
2. Extract the participant publications
3. Render the publication tracks as:
4. Render the Video Track
5. Warnings in Safari Console
The warnings are printed to the console until I allow microphone access. I also tried joining the room after allowing microphone access, which eliminates the console warnings, but the remote video is still dark.
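For reference, a minimal sketch of how steps 3 and 4 above (rendering a publication's video track) are commonly done with twilio-video in React; only attach() and detach() come from the twilio-video API, the component shape itself is an assumption:

import { useEffect, useRef } from 'react'

// Illustrative sketch: attaches a twilio-video RemoteVideoTrack to a <video> element.
function VideoTrack({ track }) {
  const ref = useRef(null)

  useEffect(() => {
    const el = ref.current
    track.attach(el)     // twilio-video wires the track into the element's srcObject
    return () => {
      track.detach(el)   // clean up when the track or component goes away
    }
  }, [track])

  // playsInline matters on iOS Safari so the video plays inside the page
  return <video ref={ref} autoPlay playsInline />
}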

Related

How to publish LocalScreenShare tracks from iOS to a web browser in React Native using Twilio

I want to integrate a screen-share feature into my React Native application, which uses Twilio for video communication. On the web we achieve this with the following steps.
1: We get the display media stream using
navigator.mediaDevices.getDisplayMedia({
  video: true,
});
2: Then we get the first video track of the stream using
const newScreenTrack = first(stream.getVideoTracks());
3: After that we set this newScreenTrack in some useState
const localScreenTrack = new TwilioVideo.LocalVideoTrack(
  newScreenTrack
);
4: After that we first unpublish the previous tracks and publish the new tracks using
videoRoom.localParticipant.publishTrack(newScreenTrack, {
  name: "screen_share",
});
5: And finally we pass these tracks to our ScreenShare component and render them to view the screen share from the remote participant.
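Putting the web-side steps above together, a rough sketch of the whole flow (assuming twilio-video's LocalVideoTrack and publishTrack APIs; the function and variable names are placeholders):

import { LocalVideoTrack } from 'twilio-video'

// Sketch of the web flow described in steps 1-5; `videoRoom` is the connected Room.
async function shareScreen(videoRoom) {
  // 1: capture the screen
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true })
  // 2: take the first video track of the capture stream
  const screenMediaTrack = stream.getVideoTracks()[0]
  // 3: wrap it in a Twilio LocalVideoTrack, naming it for the remote side
  const localScreenTrack = new LocalVideoTrack(screenMediaTrack, { name: 'screen_share' })
  // 4: publish it so remote participants receive it
  await videoRoom.localParticipant.publishTrack(localScreenTrack)
  return localScreenTrack
}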
I need to do the same thing in my React Native application, where the local participant asks another participant for screen-share permission; once that participant accepts, the local screen-share tracks should be published.
If anyone knows how to do this, please help. Thank you.
I think this is an issue with the react-native-twilio-video-webrtc package. It seems, as you discovered in this issue, that screen sharing was previously a feature of the library and was removed as part of a refactor.
Sadly, the library does more work than the underlying Twilio libraries to look after the video and audio tracks. The Twilio library is built to publish more than one track at a time; however, this React Native library only lets you publish a single audio track and a single camera video track at a time.
In order to add screen sharing, you can either support pull requests like this one or refactor the library to separate getting access to the camera from publishing a video track, so that you can publish multiple video tracks at a time, including screen tracks.

Agora.io User can't display the other user but can send self image and display it

I am using the Agora SDK on the web. It works completely fine in Google Chrome and Firefox on macOS and Windows, but in Safari the user can send his own image and can see himself, yet he can't see the person he is in a video call with.
Agora's Web SDK is built on the WebRTC API, and Safari has some limitations in that regard. Your question does not include much detail or code, so it's difficult to make a definitive diagnosis, but it sounds like the browser's autoplay policy is blocking the remote video stream from playing. Usually the autoplay policy does not affect local streams (because local streams do not play local audio), so you only need to deal with the remote streams.
There are two options for working around the autoplay policy.
Bypass the autoplay block when the playback fails.
When detecting an autoplay block, instruct the user to click on the webpage to resume the playback:
stream.play("agora_remote"+ stream.getId(), function(err){
if (err && err.status !== "aborted"){
// The playback fails. Guide the user to resume the playback by clicking.
document.querySelector("#agora_remote"+ stream.getId()).onclick=function(){
stream.resume().then(
function (result) {
console.log('Resume succeeds: ' + result);
}).catch(
function (reason) {
console.log('Resume fails: ' + reason);
});
}
}
});
Bypass the autoplay block in advance.
If you prefer dealing with the autoplay block in advance, choose one of the following methods:
Method one: Play the stream without sound first by Stream.play("ID", { muted: true }), because autoplay without sound is allowed.
Method two: In your UI design, instruct the user to interact with the webpage, either by clicking or touching, to play the stream.
For more details about the two ways of bypassing the autoplay policy in advance, I would recommend taking a look at Agora's "Deal with autoplay policy" guide.
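As a quick sketch of method one, assuming the same Agora Web SDK stream object and container id as in the snippet above (the #unmute-button element is a placeholder):

// Method one sketch: start playback muted, which autoplay policies allow.
var elementId = "agora_remote" + stream.getId();
stream.play(elementId, { muted: true }, function(err) {
  if (err) {
    console.log("Muted playback failed: ", err);
  }
});

// After a user gesture, restore the sound; one plain-DOM option is to unmute
// the media element the SDK rendered inside the container.
document.querySelector("#unmute-button").onclick = function() {
  var media = document.querySelector("#" + elementId + " video, #" + elementId + " audio");
  if (media) {
    media.muted = false;
  }
};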

How to post a video on the Facebook timeline using the Share Dialog with the JavaScript SDK?

I have developed an application and need to upload a video to the Facebook timeline from it through the Facebook JavaScript SDK, using the FB.ui method.
I have shared the part of my code with which I tried to post the video to the Facebook timeline. When I use this code, the video gets uploaded as a link; it opens in a new tab and plays when I click that link. (My video type is mp4.)
FB.ui({
  method: 'feed',
  display: 'popup',
  type: 'mp4',
  source: filePath,
  picture: filePath,
}, function (response) {
  if (response && !response.error_message) {
    alert('Posting completed.');
  } else {
    alert('Error while posting.');
  }
});
I expect the video to play on my timeline instead of being posted as a link.
"I expect the video to play on my timeline instead of being posted as a link."
That expectation is simply unfounded – this isn’t supposed to work this way, and never has.
You would need to share a link to an HTML document that has the video embedded via Open Graph meta tags; see https://developers.facebook.com/docs/sharing/webmasters#video
But Facebook has begun limiting the occasions on which they actually play such videos inline; so even if you implement this properly and technically correctly, there is no longer a guarantee it will play in the news feed; users clicking on such a post might simply get redirected to your external site to play the video there.
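As a sketch, the Share Dialog call would then point at the page that carries the og:video tags rather than at the mp4 file itself; the URL below is a placeholder:

// Sketch: share the URL of an HTML page whose <head> carries the og:video
// meta tags described in the linked docs. The URL is a placeholder.
FB.ui({
  method: 'share',
  href: 'https://example.com/video-page.html',
}, function (response) {
  if (response && !response.error_message) {
    alert('Posting completed.');
  } else {
    alert('Error while posting.');
  }
});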

WebRTC: View self-view while muting outgoing video in a call

Currently, video mute functionality in WebRTC is achieved by setting the enabled property of a video track to false:
stream.getVideoTracks().forEach(function (track) {
  track.enabled = false;
});
But the above code not only mutes the outgoing video; the local self-view, which is rendered from the same local stream, also gets black frames.
Is there a way, to ONLY mute the outgoing video frames, but still be able to show a local self-view?
There's no easy way yet. Once MediaStreamTrack.clone() is supported by browsers, you could clone the video track to get a second instance of it with a separately controllable mute property, and send one track to your self-view and the other to the peerConnection. This would let you turn off video locally and remotely independently.
Today, the only workarounds I know of are to call getUserMedia twice on Chrome (this should work at least on https, where permissions are persisted so the user won't be prompted twice), which gives you two tracks you can video-mute independently, or, on Firefox, to use RTCRtpSender.replaceTrack() with a second "fake" video stream from getUserMedia using the non-standard { video: true, fake: true } constraint like this.
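For what it's worth, a minimal sketch of the clone() approach described above, assuming an existing RTCPeerConnection pc, a getUserMedia stream, and a selfView video element (all three names are placeholders):

// Sketch of the clone() approach: keep the original track for the self-view,
// send a clone to the peer connection, and toggle only the clone.
var cameraTrack = stream.getVideoTracks()[0];
var outgoingTrack = cameraTrack.clone();              // independently controllable copy

selfView.srcObject = new MediaStream([cameraTrack]);  // self-view keeps the original
pc.addTrack(outgoingTrack, stream);                   // only the clone goes to the peer

function setOutgoingVideoMuted(muted) {
  outgoingTrack.enabled = !muted;  // peer gets black frames, self-view is unaffected
}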

WebRTC video stream stop sharing

I have created a WebRTC-based video chat using PeerJS.
The local and remote video elements are created using these controls:
local:
<video id="[local_peer_id]" autoplay="true" controls="true"></video>
remote:
<video id="[remote_peer_id]" autoplay="true" controls="true"></video>
Now, during the video chat, if the local user mutes the audio, the remote user cannot hear anything, and that works perfectly.
The problem is with the video. If the local user pauses his own video, he can see that the video is paused, but the remote user can still see his video live.
On the other hand, if the remote user pauses his video, the local user can still see his video live.
Can anyone tell me what I need to do to implement this feature:
"pause" and "resume" video that works in real time for both peers?
You need to know the difference between the HTML tags and the WebRTC streams...
You can have streams running without having them attached to any HTML tag, and the media can still be sent and received by each peer. So, each peer can attach the stream to an audio/video tag, and the tag will only act as a player that you use to play a stream that is already running.
So, if you mute the HTML tag, you will only be muting the player, and not the stream. If you want to make anything to have effect on the other peer, you need to do stuff on the stream or in the peer connection.
In particular, to mute and resume audio or video, you need to toggle the media tracks in the media stream
// create a button to toggle video
var button = document.createElement("button");
button.appendChild(document.createTextNode("Toggle Hold"));
button.onclick = function() {
  mediaStream.getVideoTracks()[0].enabled =
    !(mediaStream.getVideoTracks()[0].enabled);
};
To pause/resume audio, use getAudioTracks() instead.
Calling mediaStream.stop() will stop the camera, where mediaStream is the stream you got from calling getUserMedia.
Here is a working example:
mediaStream.getAudioTracks()[0].stop();
mediaStream.getVideoTracks()[0].stop();
Hope this works with the new standards. It's working fine in my app.