React Native WebView HTML5 video sound not working in iOS silent mode - react-native

I am using WebView to load a webpage which has an embedded video player. It works fine when the app is in ringer mode, but there is no sound when the app is in silent mode. I am not very familiar with iOS. Any help would be appreciated.
<WebView
  startInLoadingState={true}
  mediaPlaybackRequiresUserAction={false}
  javaScriptEnabled={true}
  source={{ uri: 'http://ab24.live/player' }}
/>

Since I can't comment yet, just adding that this has been fixed (see the last comment of the GitHub issue).
So, to avoid needing to call that hacky workaround function, you now just need to add useWebKit={true} to the WebView component.
The fix was implemented last month and should work with Expo SDK 32+ versions.
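For reference, with that fix in place the component from the question should need only the extra prop (a minimal sketch, reusing the same URL):
<WebView
  useWebKit={true}
  startInLoadingState={true}
  mediaPlaybackRequiresUserAction={false}
  javaScriptEnabled={true}
  source={{ uri: 'http://ab24.live/player' }}
/>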

Assuming you're using Expo and have come up against this bug, you can get around the problem with the following:
import { Audio } from "expo";
...
async playInSilentMode() {
  // To get around the fact that audio in a `WebView` will be muted in silent mode
  // See: https://github.com/expo/expo/issues/211
  //
  // Based on a crazy hack to get the sound working on iOS in silent mode (ringer muted/on vibrate)
  // https://github.com/expo/expo/issues/211#issuecomment-454319601
  await Audio.setAudioModeAsync({
    playsInSilentModeIOS: true,
    allowsRecordingIOS: false,
    interruptionModeIOS: Audio.INTERRUPTION_MODE_IOS_MIX_WITH_OTHERS,
    shouldDuckAndroid: false,
    interruptionModeAndroid: Audio.INTERRUPTION_MODE_ANDROID_DO_NOT_MIX,
    playThroughEarpieceAndroid: true
  });
  await Audio.setIsEnabledAsync(true);
  // Play a looping, muted, silent mp3 to keep the iOS audio session active
  const sound = new Audio.Sound();
  await sound.loadAsync(
    require("./500-milliseconds-of-silence.mp3") // from https://github.com/anars/blank-audio
  );
  await sound.playAsync();
  await sound.setIsMutedAsync(true);
  await sound.setIsLoopingAsync(true);
}
Then call this from componentDidMount():
async componentDidMount() {
  await this.playInSilentMode();
}

Related

React Native Fetch Blob / React Native Blob Util fail in production but not in development

I am trying to download a PDF generated through my own API. It worked normally for me until yesterday, when it stopped working without any modification. Reviewing in developer mode through Metro, everything seems to work correctly (I download the PDF normally), but when the application is deployed to the Play Store it closes unexpectedly, leaving me without knowing why this happens.
First I was using React Native Fetch Blob, then I switched to React Native Blob Util hoping it would solve the problem, but it keeps happening. Does anyone have any idea why this happens?
This function downloads the PDF file:
const generarReciboPdf = async (datosDeuda: FormularioScreenParams, formaPago: ReactText, nroCheque?: string) => {
  const { config, fs } = ReactNativeBlobUtil;
  // Payment type 4 (cheque) requires a cheque number; !nroCheque covers both "" and undefined
  if (formaPago === 4 && !nroCheque) {
    return console.log('debe llenar todos los campos'); // "all fields must be filled in"
  }
  const { DownloadDir } = fs.dirs;
  const token = await AsyncStorage.getItem('token');
  let datosDeudaCompleto = {
    ...datosDeuda,
    forma_pago: formaPago,
    nro_cheque: nroCheque
  };
  let url = 'https://sys.arco.com.py/api/appRecibos/generarReciboPdf/' + JSON.stringify(datosDeudaCompleto);
  const options = {
    fileCache: true,
    addAndroidDownloads: {
      useDownloadManager: true, // true uses the native manager and shows progress in the notification bar
      notification: true,
      mime: 'application/pdf',
      path: `${DownloadDir}/recibo_${datosDeuda.mes_deuda.replace(/ /g, "")}_${datosDeuda.razon_social.replace(/ /g, "")}.pdf`,
      description: 'Downloading.',
    },
  };
  config(options)
    .fetch('GET', url, { Authorization: 'Bearer ' + token })
    .then((res) => {
      console.log('se imprimió correctamente el pdf'); // "the pdf printed correctly"
      setErrorPdf(1);
    })
    .catch((error) => {
      console.log(error);
      setErrorPdf(-1);
    });
};
Also, this error appears in Play Console (screenshot: "PlayConsole Error Image").

How to add a screen share function using PeerJS?

Currently, I am working on a WebRTC project where you can make and receive calls. I also want to add screen-share functionality to it.
Can anyone provide me with a good documentation link?
I am currently following the official documentation of PeerJS.
I was able to do audio-video calling but got stuck on the screen-sharing part.
Help me!
You need to get a stream just like you do with getUserMedia, and then you hand that stream to PeerJS.
It should be something like this:
var displayMediaOptions = {
  video: {
    cursor: "always"
  },
  audio: false
};
navigator.mediaDevices.getDisplayMedia(displayMediaOptions)
  .then(function (stream) {
    // add this stream to your peer
  });
I'm working with and learning about WebRTC. From what I've read, I think the solution here probably hinges on getDisplayMedia. That's also what this React, Node and PeerJS tutorial suggests (though I haven't tried it myself yet).
let screenShare = document.getElementById('shareScreen');
screenShare.addEventListener('click', async () => {
  const captureStream = await navigator.mediaDevices.getDisplayMedia({
    audio: true,
    video: { mediaSource: "screen" }
  });
  // Instead of adminId, pass the peerId of the peer who will receive captureStream in the call
  myPeer.call(adminId, captureStream);
});
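On the receiving side, the other peer answers the incoming call and attaches the shared screen to a video element. A minimal sketch using PeerJS's documented call events (the 'remoteVideo' element id is just an assumption for illustration):
myPeer.on('call', (call) => {
  call.answer(); // answer without sending a stream back
  call.on('stream', (remoteStream) => {
    // 'remoteVideo' is a hypothetical <video> element in your page
    document.getElementById('remoteVideo').srcObject = remoteStream;
  });
});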

Issue with WebRTC/getUserMedia in iOS 14 Safari and phone sleep/unlock

I seem to have noticed a regression with getUserMedia in iOS 14 Safari. Here are the steps to reproduce:
1. Go to https://webrtc.github.io/samples/src/content/getusermedia/gum/ on iOS 14 Safari.
2. Click "Open camera" and accept camera permissions; you should see local camera video.
3. Click the power button and lock the phone; let the phone go to sleep.
4. Unlock/wake the phone; the local camera video is gone.
This does not happen on devices running iOS 13.
My questions are:
Can anyone else confirm this on their devices? I have only tested on iPhone 11 so far.
Has anyone found a solution yet?
Yes, I am having a similar strange issue with iOS 14.2 and getUserMedia. I can only get
navigator.mediaDevices.getUserMedia({ video: true })
to work. If I change it to:
navigator.mediaDevices.getUserMedia({ audio: true, video: true })
it will fail.
It's not an issue with the code, as I tested my project on Safari for macOS, Chrome for macOS, and Firefox on Linux.
As a temporary fix, so I could move on with my life for the moment, I did this:
const constraints = navigator.userAgent.includes("iPhone")
  ? { video: true }
  : {
      audio: true,
      video: {
        width: { ideal: 640 },
        height: { ideal: 400 }
      }
    };
Yes, same here!
I checked this behavior in BrowserStack with iOS:
12.x: ✓
13.x: ✓
14.x: ✗
Try this:
navigator.mediaDevices.getUserMedia({ audio: true, video: true })
  .then(stream => {
    const videoTracks = stream.getVideoTracks();
    console.log(videoTracks[0].enabled);
    document.querySelector('video').srcObject = stream;
  });
// Output: true <-- ?
Then, if you try to get the camera again but replace the video track on the previous MediaStream, it works.
Sometimes using video constraints with facingMode: 'user' also works; why, I don't know.
I still can't find a consistent solution.
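A rough sketch of that track-replacement workaround, assuming existingStream is the MediaStream acquired before the phone slept (untested, and to be run inside an async function):
// Request a fresh camera track after unlocking, then swap it
// for the stale track on the existing stream.
const freshStream = await navigator.mediaDevices.getUserMedia({ video: true });
const staleTrack = existingStream.getVideoTracks()[0];
existingStream.removeTrack(staleTrack);
staleTrack.stop();
existingStream.addTrack(freshStream.getVideoTracks()[0]);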
Having the same issue on an iPad Pro 2nd generation with iOS 14.7.1 and an iPhone 7 with iOS 14.6.x. The only solution I found that seems to consistently work is to call getUserMedia separately with audio and video constraints. As an example:
async function getMedia(constraints) {
  let videoStream = null;
  let audioStream = null;
  try {
    videoStream = await navigator.mediaDevices.getUserMedia({ video: true });
    audioStream = await navigator.mediaDevices.getUserMedia({ audio: true });
    /* use the streams */
  } catch (err) {
    /* handle the error */
  }
}
You can replace {video: true} or {audio: true} with your desired constraints. Then you can either work with the separate MediaStream objects or construct your own MediaStream object from the audio and video tracks of your streams.
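For the second option, a small sketch that combines the two streams from the function above into a single MediaStream:
// Build one MediaStream from the tracks of both streams
const combinedStream = new MediaStream([
  ...videoStream.getVideoTracks(),
  ...audioStream.getAudioTracks()
]);
document.querySelector('video').srcObject = combinedStream;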

How to add an in-app browser in React Native (Expo)

How do I open a web browser in Expo rather than using the system browser? I want to open the browser inside the app.
You can use the library react-native-inappbrowser-reborn.
Follow the installation instructions from its GitHub page.
import { Linking, Alert } from 'react-native'
import InAppBrowser from 'react-native-inappbrowser-reborn'
...
async openLink() {
  try {
    const url = 'https://www.google.com'
    if (await InAppBrowser.isAvailable()) {
      const result = await InAppBrowser.open(url, {
        // iOS Properties
        dismissButtonStyle: 'cancel',
        preferredBarTintColor: '#453AA4',
        preferredControlTintColor: 'white',
        readerMode: false,
        animated: true,
        modalPresentationStyle: 'overFullScreen',
        modalTransitionStyle: 'partialCurl',
        modalEnabled: true,
        // Android Properties
        showTitle: true,
        toolbarColor: '#6200EE',
        secondaryToolbarColor: 'black',
        enableUrlBarHiding: true,
        enableDefaultShare: true,
        forceCloseOnRedirection: false,
        // Specify the full animation resource identifier (package:anim/name)
        // or only the resource name (for animations bundled with the app).
        animations: {
          startEnter: 'slide_in_right',
          startExit: 'slide_out_left',
          endEnter: 'slide_in_left',
          endExit: 'slide_out_right'
        },
        headers: {
          'my-custom-header': 'my custom header value'
        },
        waitForRedirectDelay: 0
      })
      Alert.alert(JSON.stringify(result))
    } else {
      Linking.openURL(url)
    }
  } catch (error) {
    Alert.alert(error.message)
  }
}
...
You can check the example app here.
For Expo it becomes a little bit complicated; please check the related tutorial on Medium.
If you want reader mode on iOS, please refer to this link:
reader-mode-webview-component-for-react-native
The expo-web-browser package opens an in-app browser.
This worked for me with Expo. I tried react-native-inappbrowser-reborn first, but isAvailable() was throwing an error.
import * as WebBrowser from 'expo-web-browser'
...
WebBrowser.openBrowserAsync(url, {showTitle: true})
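If you need the result, openBrowserAsync returns a promise that resolves once the browser is dismissed. A small sketch building on the snippet above (the exact result fields vary by Expo SDK version):
import * as WebBrowser from 'expo-web-browser'
...
async function openLink(url) {
  // Resolves when the in-app browser is closed
  const result = await WebBrowser.openBrowserAsync(url, { showTitle: true })
  console.log(result.type) // e.g. 'cancel' or 'dismiss' on iOS
}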

How to register a headset button event in react native?

I'm using the latest version of React Native. I'm trying to console.log something whenever the headset button is clicked on Android. So far, I've been unsuccessful.
I tried react-native-music-control. The docs say that
MusicControl.on('togglePlayPause', () => { console.log('clicked') })
should work, but I'm not sure if it's only for iOS or for Android too.
This is my componentDidMount (the render returns a 'hello' text).
componentDidMount() {
  MusicControl.enableControl('play', true);
  MusicControl.enableControl('pause', true);
  MusicControl.enableControl('stop', true);
  MusicControl.enableControl('togglePlayPause', true);
  MusicControl.on('play', () => { console.log('----'); });
  MusicControl.on('pause', () => { console.log('----'); });
  MusicControl.on('togglePlayPause', () => { console.log('----'); });
}
'----' is logged only when I disconnect the headset, and not at any other time.
The official doc on GitHub says:
MusicControl.on('togglePlayPause', () => {}); // iOS only
The "iOS only" comment states pretty clearly that this event is only available on iOS and will not trigger on Android.
So: the package react-native-keyevent works really well with Android. After linking the package, I had to configure it in MainActivity.java (a step I had been skipping by mistake).
After I did that, it worked properly on Android.
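For reference, a minimal sketch of the JavaScript side once MainActivity.java is configured; keycode 79 is Android's KEYCODE_HEADSETHOOK, and the listener API follows the react-native-keyevent README:
import KeyEvent from 'react-native-keyevent';
...
componentDidMount() {
  // Key events forwarded from the configured MainActivity.java
  KeyEvent.onKeyDownListener((keyEvent) => {
    if (keyEvent.keyCode === 79) { // KEYCODE_HEADSETHOOK
      console.log('headset button clicked');
    }
  });
}

componentWillUnmount() {
  KeyEvent.removeKeyDownListener();
}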