I'm using react-native-incall-manager and React-Native-WebRTC to make audio calls.
I want to know the available audio device list to choose an audio route.
The only thing I have discovered so far is this event payload:
Event {"availableAudioDeviceList": "["WIRED_HEADSET","EARPIECE","SPEAKER_PHONE"]", "selectedAudioDevice": "EARPIECE"}
which I receive when calling:
DeviceEventEmitter.addListener('onAudioDeviceChanged', function (data) {
  console.log('Event', data);
});
I just want to read
availableAudioDeviceList
on demand, without subscribing to an event. How can I do that?
Thank you.
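One workaround, in the absence of a documented getter, is to cache the list the first time that event fires and read it on demand afterwards. A minimal sketch (the event name and payload come from the log above; parsing availableAudioDeviceList as a JSON string is my assumption):

import { DeviceEventEmitter } from 'react-native';

// Cache of the last device list reported by the native side.
let availableAudioDevices = [];

DeviceEventEmitter.addListener('onAudioDeviceChanged', (data) => {
  // The event payload above shows the list arriving as a JSON-encoded string.
  availableAudioDevices = JSON.parse(data.availableAudioDeviceList);
});

// Read the cached list anywhere, without registering another listener.
export function getAvailableAudioDevices() {
  return availableAudioDevices;
}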
This is a known problem: if an iPhone is in silent mode, video won't play back with sound. There are various solutions, including this one, but they are always written for functional components, and I've been unsuccessful in translating them to work in my class components. I understand the basic idea is to set the audio mode:
await Audio.setAudioModeAsync({ playsInSilentModeIOS: true });
But I don't know how to associate that with the video playback. The solutions make use of refs, so I tried something like:
async TriggerAudio() {
  try {
    await Audio.setAudioModeAsync({ playsInSilentModeIOS: true });
  } catch (error) {
    console.log("error for trigger audio " + error);
  }
  this.videoRef.current.playAsync();
}
And called that in a function that returns the Video tag. I declared the videoRef variable in the constructor, but I don't think I understand how to use refs in a class component, because there's no playAsync function associated with the ref. I also put setAudioModeAsync in componentDidMount and even asked for permissions, but the video still plays with no sound when the phone is silenced.
How do I solve this for a class component?
UPDATE
If I create a ref in the constructor:
this.videoRef = React.createRef();
And set the ref parameter of Video to that value:
<Video
  source={{ uri: url }}
  ref={this.videoRef}
  useNativeControls
  resizeMode="contain"
  style={styles.videoComment}
/>
And create a button with onPress set to the TriggerAudio function above, it works great. The challenge I have is that the ref variables have to be unique for each video (I believe), but I don't know which videos will be loaded until I make API calls in componentDidMount. How do I create refs for each video so that each video's audio will start correctly?
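One common pattern, sketched here under the assumption that the videos arrive as an array from your API (fetchVideos and the videos field are hypothetical placeholders), is to create one ref per video after the data loads and store them in an array:

componentDidMount() {
  // Hypothetical API call; replace with your own fetch logic.
  fetchVideos().then((videos) => {
    // Create one ref per video once we know how many there are.
    this.videoRefs = videos.map(() => React.createRef());
    this.setState({ videos });
  });
}

renderVideos() {
  return this.state.videos.map((video, i) => (
    <Video
      key={video.id}
      source={{ uri: video.url }}
      ref={this.videoRefs[i]}
      useNativeControls
      resizeMode="contain"
      style={styles.videoComment}
    />
  ));
}

TriggerAudio can then take the video's index and call this.videoRefs[i].current.playAsync() after setting the audio mode.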
I'm using React Native Track Player in a React Native project and playing an array of music tracks.
After the audio tracks end, I'm getting the following error in the emulator:
(This error is thrown not only when I queue an array, but also after playing an individual music file.)
Error:
Attempt to invoke virtual method 'double java.lang.Double.doubleValue()' on a null object reference.
As a solution, I tried stopping the player once the queue of tracks ended, but the error still occurs.
useTrackPlayerEvents([Event.PlaybackQueueEnded], async event => {
  if (event.type === Event.PlaybackQueueEnded) {
    TrackPlayer.stop();
  }
});
Can anybody who is familiar with react-native-track-player please help me solve this issue?
Thank you.
The above code should be modified to check whether nextTrack is undefined (not null), as follows:
useTrackPlayerEvents([Event.PlaybackQueueEnded], async event => {
  if (event.type === Event.PlaybackQueueEnded && event.nextTrack !== undefined) {
    TrackPlayer.stop();
  }
});
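If stop() still misbehaves in your version, another workaround worth trying (my suggestion, not part of the answer above) is to reset the player instead, since reset() clears the queue and stops playback in a single call:

import TrackPlayer, { Event, useTrackPlayerEvents } from 'react-native-track-player';

useTrackPlayerEvents([Event.PlaybackQueueEnded], async event => {
  if (event.type === Event.PlaybackQueueEnded && event.nextTrack !== undefined) {
    // reset() clears the queue and stops playback, avoiding a stop()
    // call against player state that may already have been torn down.
    await TrackPlayer.reset();
  }
});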
I am developing a React Native project and loading some graphs from a server response.
It is a tab-based app and this code is written in the first tab.
But in some use cases the data does not load into the graph properly.
I wrote that code in componentDidMount(), but that is called only once. My requirement is to fetch the data every time the view is loaded, and on subsequent visits only the render method is called.
I tried adding a listener for navigation events, but because this screen is not in a navigation stack, it throws an error.
I found a workaround like the one below.
componentDidMount() {
}

fetchGraphData = () => {
  // some code fetching from DB and redux based on conditions
};

render() {
  this.fetchGraphData();
  return (
  );
}
But this is not good practice per code standards.
I am not receiving props, but we use some graphs that are loaded from the fetched data. My requirement is to call the API fetch method every time after the screen loads.
Any suggestions on how to call fetchGraphData() each time the render method runs or the view is loaded?
Your problem is that every time you navigate to the screen containing fetchGraphData, you need it to run again. This can be solved more simply than you might think.
componentDidMount() {
  this.fetchGraphData();
}
You can make the screen mount (and therefore fetch) again each time you navigate to it by pushing a new instance:
this.props.navigation.push('functionMoveScreen') // Rendering the screen again.
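If the screen does end up inside a react-navigation stack, a focus listener is the more idiomatic fix, because it refetches on every visit without remounting the screen. A sketch, assuming react-navigation 4.x (where the event is named 'didFocus'; in 5.x and later it is 'focus'):

componentDidMount() {
  // Fires on the initial mount and every time the screen regains focus.
  this.focusSubscription = this.props.navigation.addListener('didFocus', () => {
    this.fetchGraphData();
  });
}

componentWillUnmount() {
  // Clean up the subscription to avoid leaks.
  this.focusSubscription.remove();
}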
import React from "react";
import { Player } from "@react-native-community/audio-toolkit";

export default class RNAudiotoolkit extends React.Component {
  componentDidMount() {
    new Player("some_audio_file.mp3").play();
    console.log(Player.isPlaying);
  }
}
Above is the minimal code I've whittled it down to. The audio track does play, but console.log(Player.isPlaying) always returns false even though the audio file is running. Any input on why this isn't working? I can only suspect it has something to do with MediaStates, but I have been unable to get anything to work. If you have experience with this package, your input is greatly appreciated.
documentation: https://github.com/react-native-community/react-native-audio-toolkit/blob/master/docs/API.md
Edit: Fixed and tested; the answer is a combination of my original answer and mAhMoUdDaFeR's.
If you look above the documentation for the play method, you will see the documentation for prepare(Function callback). In it, it states:
...Otherwise the file is prepared when calling play() which may result in a small delay.
This means that if you check the .isPlaying property immediately after calling play() as you are doing, it is not guaranteed that the file is actually playing by the time your console.log(Player.isPlaying) executes.
There is also the second issue that .isPlaying is not a static property on the Player class despite how it appears in the docs. It is actually a property of the Player instance that you need to create to play an audio file.
If you want to see that .isPlaying is indeed working correctly, one potential check is to run your code in a callback function passed into .play() as the docs show:
play(Function ?callback)
Start playback.
If callback is given, it is called when playback has started.
So a simple example would be to write your example code like this (saving the created instance and then logging in a callback):
componentDidMount() {
  const p = new Player("some_audio_file.mp3").play(() => console.log('in callback', p.isPlaying));
  console.log('immediately after play', p.isPlaying);
}
I created a new project to test this and if you run the above code, you'll see the following printed out illustrating the issue:
immediately after play false
in callback true
isPlaying is not a static property on the Player class, so you can't use Player.isPlaying; you can get isPlaying from the created instance (object) of the Player.
Try keeping a reference to the player object and then accessing its properties:
this.player = new Player("some_audio_file.mp3").play()
and then log:
console.log(this.player.isPlaying)
I'm programming a React Native game app for blind kids to help them with maths. There is a game in which they have to count how many animals are on the screen; when an animal is pressed, it emits a sound.
In React Native there is the onPress property of <TouchableWithoutFeedback />, which lets me play the sound, but for visually impaired users I have to announce that there is an animal instead of just playing the sound.
How can I know if a certain View is focused by the screen reader and call a function to do that?
There doesn't seem to be any way to react to the screen reader focusing on a particular object. Instead, you need to use the accessibilityLabel property on each animal object.
<TouchableOpacity accessible={true} accessibilityLabel="This is a tiger">
...
</TouchableOpacity>
When the user selects this object with say, a single tap, they will hear "This is a tiger." Then, after double-tapping the screen, they should hear the associated sound that all other users would normally hear.
I don't think there's really much more you can do than this with the given APIs. Not sure if the limitations are at the OS SDK or React Native level.
Check out the React Native docs on Accessibility for further details.
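Putting the pieces together, a minimal sketch of one animal (the playSound prop and label text are hypothetical placeholders; accessible, accessibilityLabel, and onPress are standard React Native props):

import React from "react";
import { TouchableOpacity, Image } from "react-native";

// playSound is a hypothetical helper wrapping whatever audio library you use.
function Animal({ label, image, playSound }) {
  return (
    <TouchableOpacity
      accessible={true}
      accessibilityLabel={label} // read aloud when the screen reader focuses it
      onPress={playSound}        // a screen-reader double-tap triggers this
    >
      <Image source={image} />
    </TouchableOpacity>
  );
}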
There's currently no way to detect whether an element has VoiceOver or TalkBack focus. (React Native implements neither UIAccessibilityFocus on iOS nor TYPE_VIEW_ACCESSIBILITY_FOCUSED on Android.)
The only way to solve this is to develop a native module that adds native listeners for accessibility events. On Android, for example:
public void installAccessibilityDelegate() {
    setAccessibilityDelegate(new AccessibilityDelegate() {
        @Override
        public boolean onRequestSendAccessibilityEvent(ViewGroup viewGroup, View child, AccessibilityEvent event) {
            if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_ACCESSIBILITY_FOCUSED) {
                sendReactNativeEvent("start");
                return false;
            }
            if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_ACCESSIBILITY_FOCUS_CLEARED) {
                sendReactNativeEvent("end");
                return false;
            }
            return super.onRequestSendAccessibilityEvent(viewGroup, child, event);
        }
    });
}
My group developed an iOS/Android component which exposes those events for an image; it is distributed via npm: https://www.npmjs.com/package/react-native-accessible-image
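On the JavaScript side, events from such a native module are typically received through an emitter. A sketch (the module name AccessibleImage and the event name accessibilityFocus are hypothetical; only the emitter APIs are standard React Native):

import { NativeModules, NativeEventEmitter } from "react-native";

// 'AccessibleImage' stands in for however the native module is registered.
const emitter = new NativeEventEmitter(NativeModules.AccessibleImage);

// The native code above sends "start" when accessibility focus lands on the
// view and "end" when it leaves; the event name and payload shape are assumed.
const subscription = emitter.addListener("accessibilityFocus", (state) => {
  if (state === "start") {
    // Announce the animal here, e.g. with AccessibilityInfo.announceForAccessibility.
  }
});

// Later, e.g. in componentWillUnmount:
subscription.remove();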