I'm writing a sampler app in React Native using Expo and have run into the following error:
react-native - require() must have a single string literal argument
Pre-recorded samples are played like this:
const SAMPLES = [
  { name: 'air horn', uri: require('../samples/air_horn.mp3') },
  { name: 'rim shot', uri: require('../samples/rimshot.mp3') },
]

playSample = async (uri) => {
  const { sound: soundObject, status } = await Expo.Audio.Sound.create(
    uri,
    { shouldPlay: true }
  )
}
It's somewhat based off the Expo documentation and works fine. However, I also want the user to record their own samples, which means it's coming from a custom URL:
playCustomSample = async () => {
  const customSample = this.props.navigation.state.params
    ? require(this.props.navigation.state.params.custom.toString())
    : require('./samples/blank.mp3')

  try {
    const { sound: soundObject, status } = await Expo.Audio.Sound.create(
      customSample,
      { shouldPlay: true }
    )
  } catch (error) {
    console.log('ERROR:', error)
  }
}
When I log the custom param I'm being passed by navigation, it's there. So I get that I'm doing it conceptually wrong: require() takes a string literal, but I'm not going to know the name/location of the file until it is recorded.
How would I get access to the audio file without knowing the link ahead of time so I can include it?
Also, I don't have access to a Mac, so ejecting from Expo and using something like react-native-audio or react-native-sound isn't an option.
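For reference, Expo's Audio API also accepts a { uri } source object instead of a require() result, so a recorded file can be played straight from its URI. A minimal sketch, assuming the recording's URI arrives via the same navigation param as in the question:

playCustomSample = async () => {
  // URI of the user's recording, e.g. from Audio.Recording.getURI(),
  // passed along as the `custom` navigation param from the question.
  const uri = this.props.navigation.state.params.custom

  try {
    // Pass a { uri } source object instead of a require() result.
    const { sound } = await Expo.Audio.Sound.create(
      { uri },
      { shouldPlay: true }
    )
  } catch (error) {
    console.log('ERROR:', error)
  }
}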
I have a button with an address, and when it's pressed, I want to open the "default" maps app that's installed. The reason is that, for example, many iOS users uninstall the Apple Maps app, so they only have Google Maps. Checking Platform.OS === 'ios' ? 'maps' : 'google' isn't safe, because the right choice can't be purely platform dependent.
This is using Expo SDK 46.
I then read that I should try something like:
const openUrl = () => {
  const mapNames = ['comgooglemaps', 'maps'];

  const hasApp = mapNames.find(async name => {
    try {
      return await Linking.canOpenURL(
        `${name}://?center=${vehicle.coordinates.latitude}, ${vehicle.coordinates.longitude}`,
      );
    } catch (_e) {
      return false;
    }
  });

  openMap({
    provider: hasApp,
    end: vehicle.streetAddress,
  });
};
but this isn't working because find doesn't await an async callback: the callback returns a Promise, and a Promise is always truthy, so the first item always "meets" the test and is returned.
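For reference, a pattern that avoids this pitfall is to await each check in sequence and keep the first scheme that resolves to true. A minimal sketch, reusing mapNames and the coordinates from the snippet above:

const findInstalledMapApp = async () => {
  const mapNames = ['comgooglemaps', 'maps'];
  for (const name of mapNames) {
    try {
      // Await each result before moving on, unlike Array.find,
      // which only sees the (truthy) Promise itself.
      const canOpen = await Linking.canOpenURL(
        `${name}://?center=${vehicle.coordinates.latitude},${vehicle.coordinates.longitude}`,
      );
      if (canOpen) return name;
    } catch (_e) {
      // Treat errors as "scheme not available" and try the next one.
    }
  }
  return undefined;
};

Note that on iOS, canOpenURL only reports true for schemes that are declared under LSApplicationQueriesSchemes in the app's Info.plist (via infoPlist in app.json for Expo), so comgooglemaps has to be listed there for the check to ever succeed.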
So I tried an alternate option, based on research into other suggestions:
const openUrl = async () => {
  let hasGoogleMaps = false;

  await Linking.canOpenURL('comgooglemaps').then(canOpen => {
    if (canOpen) {
      hasGoogleMaps = true;
    }
  });

  openMap({
    provider: hasGoogleMaps ? 'google' : 'apple',
    end: vehicle.streetAddress,
  });
};
This too fails to open Google Maps on iOS.
My question is: how can I know for sure whether Google Maps is installed, without basing it on Platform.OS itself?
Bonus question: is it true I cannot install Google Maps on a simulator?
I have a bunch of audio files local to my app and I want to load them dynamically based on a component's state. The only way I found to load the audio with expo-av is to use require, but this method keeps returning "invalid call" whenever I use a variable of any sort, or any template literal, in its path string.
I even tried storing the paths in a JSON file and then referring to the path directly there, and still got the invalid call error.
const { sound } = await Audio.Sound.createAsync(require(audioPaths['paths'][fileKey]), {}, playbackStatusUpdate);
How do you go about this issue? My files are local, so I can't take advantage of streaming/loading them from the network. Does expo-av offer any alternative to require? Any tips or advice you might have are appreciated.
PS: If you need any more details about the situation, please ask and I will fill you in.
Edit: this is what my paths JSON looks like:
{
  "paths": [
    "../assets/Records/1.mp3",
    "../assets/Records/2.mp3",
    "../assets/Records/3.mp3",
    "../assets/Records/4.mp3"
  ]
}
The issue is that the bundler resolves require() statically at build time, so its argument must be a string literal; paths stored as data can't be required dynamically.
You should define the paths in a JS module rather than in JSON (a JSON file can't contain require() calls), like this:
export default {
  paths: [
    require('./assets/one.mp3'),
    require('./assets/two.mp3'),
    require('./assets/three.mp3'),
  ],
}
and call it without require:
const { sound } = await Audio.Sound.createAsync(audioPaths['paths'][fileKey], {}, playbackStatusUpdate);
Here is a Snack I used.
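Putting the two pieces together, a minimal sketch (the file names, fileKey, and playbackStatusUpdate are placeholders carried over from the question):

// audioPaths.js, a JS module instead of the JSON file, so require() is allowed
export default {
  paths: [
    require('../assets/Records/1.mp3'),
    require('../assets/Records/2.mp3'),
    require('../assets/Records/3.mp3'),
    require('../assets/Records/4.mp3'),
  ],
};

// elsewhere, in the component
import { Audio } from 'expo-av';
import audioPaths from './audioPaths';

const playbackStatusUpdate = (status) => {}; // placeholder from the question

const playFile = async (fileKey) => {
  // fileKey just indexes into the pre-bundled assets; no dynamic require needed.
  const { sound } = await Audio.Sound.createAsync(
    audioPaths.paths[fileKey],
    {},
    playbackStatusUpdate,
  );
  await sound.playAsync();
};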
Generally, when you want to upload files in a React Native app, you will use FormData, a file-upload library, react-native-fs, or expo-file-system.
I recommend expo-file-system, since you're using Expo.
See the complete implementation here.
But saying "I have a bunch of audio files local to my app" means that your audio files are already in a directory in your project folder, and you just want those audios played dynamically using expo-av's Audio.Sound.createAsync() with require(). This is how I would do that:
import * as React from 'react';
import { View, StyleSheet, Button } from 'react-native';
import { Audio } from 'expo-av';

export default function App() {
  const [sound, setSound] = React.useState();

  async function playSound() {
    console.log('Loading Sound');
    const { sound } = await Audio.Sound.createAsync(
      require('./assets/Hello.mp3')
    );
    setSound(sound);

    console.log('Playing Sound');
    await sound.playAsync();
  }

  // Unload the sound from memory when it changes or the component unmounts.
  React.useEffect(() => {
    return sound
      ? () => {
          console.log('Unloading Sound');
          sound.unloadAsync();
        }
      : undefined;
  }, [sound]);

  return (
    <View style={styles.container}>
      <Button title="Play Sound" onPress={playSound} />
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
  },
});
This sample plays one audio file, but in your question you want the audio to be played dynamically. For that you can use React's useEffect hook to create a kind of repeatable action. I would first create a playSound method like this:
playSound = async (source) => {
  await Audio.Sound.createAsync(source);
};
Here source is the audio asset passed in as a variable (one of the entries required up front), and you may want functions like goToNext() and resumePlayList() to change what source points to, like:
const goToNext = () => {
  for (let i = 0; i < noGuest; i++) {
    source = JsonPath[i];
  }
}
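To make the dynamic part concrete, here is a minimal sketch of a small player that advances through pre-required assets with component state (the asset paths are assumptions based on the question's JSON):

import * as React from 'react';
import { View, Button } from 'react-native';
import { Audio } from 'expo-av';

// All assets are require()d up front so the bundler can package them.
const paths = [
  require('../assets/Records/1.mp3'),
  require('../assets/Records/2.mp3'),
  require('../assets/Records/3.mp3'),
  require('../assets/Records/4.mp3'),
];

export default function Playlist() {
  const [index, setIndex] = React.useState(0);
  const [sound, setSound] = React.useState();

  const playCurrent = async () => {
    const { sound: newSound } = await Audio.Sound.createAsync(
      paths[index],
      { shouldPlay: true },
    );
    setSound(newSound);
  };

  // The goToNext() idea from above: advance and wrap around at the end.
  const goToNext = () => setIndex(i => (i + 1) % paths.length);

  // Unload the previous sound whenever a new one replaces it.
  React.useEffect(() => {
    return sound ? () => { sound.unloadAsync(); } : undefined;
  }, [sound]);

  return (
    <View>
      <Button title="Play" onPress={playCurrent} />
      <Button title="Next" onPress={goToNext} />
    </View>
  );
}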
I'm implementing deep linking with expo in my react native app. I've managed to do it using this code with this tutorial and this documentation for adjusting it to my nested stacks:
const linking = {
  prefixes: [prefix],
  config: {
    screens: {
      Drawer: {
        screens: {
          Tabs: {
            screens: {
              Profile: "profile",
            },
          },
        },
      },
    },
  },
}
  return (
    <NavigationContainer linking={linking}>
      <RootStackScreen actions={actions} showLoader={showLoader} user={user} {...props} />
    </NavigationContainer>
  )
}
If I use myscheme://profile it works as expected, but only if the app is open in the background. When the app is closed, it just opens to my initial home screen. I tried googling and searching but couldn't find an explanation that fits what I did. I also tried adding the getInitialURL function to linking, which runs when the app was closed and is opened from a deep link, but I couldn't figure out how to use it to trigger the navigation.
async getInitialURL() {
  const url = await Linking.getInitialURL(); // This returns the link that was used to open the app
  if (url != null) {
    //const { path, queryParams } = Linking.parse(url);
    //console.log(path, queryParams)
    //Linking.openURL(url)
    return url;
  }
},
I suppose you've confirmed that your getInitialURL function is getting called when your app is launched? Also, the commented-out code within the if (url != null) { block isn't supposed to be commented, right?
If the above is fine, then the issue could be related to the debugger being enabled. As per React Native's documentation (https://reactnative.dev/docs/linking#getinitialurl):
getInitialURL may return null while debugging is enabled. Disable the debugger to ensure it gets passed.
I was experiencing this same issue, and doing the following helped me.
From the component at the root of your navigation stack, where you configure deep linking, add the following code:
const ApplicationNavigator = () => {
  useEffect(() => {
    // THIS IS THE MAIN POINT OF THIS ANSWER
    const navigateToInitialUrl = async () => {
      const initialUrl = await Linking.getInitialURL()
      if (initialUrl) {
        await Linking.openURL(initialUrl)
      }
    }
    navigateToInitialUrl()
  }, [])

  const linking = {
    prefixes: ['<your_custom_scheme>://'],
    config: {
      /* configuration for matching screens with paths */
      screens: {},
    },
  }

  return (
    // Your components/navigation setup
  )
}
So apparently, your app receives the URL but only "uses" it to wake the app up. Once it is in the foreground, the useEffect runs and uses the URL to navigate to the intended screen.
PS: Make sure that your linking tree matches your app tree
There are a couple of things you can check.
Verify that the structure for linking.config matches your navigation structure. I've had a similar issue in the past, and resolved it by making sure my config structure was correct.
Ensure that the linking object is set up properly. Refer to the docs to verify. From the looks of it, the linking object you've shown doesn't have the getInitialURL property in it.
Confirm that you've set up the native side of things as documented.
Hopefully something works out! Let me know if it doesn't. 🙂
Based on https://documentation.onesignal.com/v7.0/docs/react-native-sdk#handlers
Deep linking in iOS from an app closed state
You must modify application:didFinishLaunchingWithOptions in your AppDelegate.m file to use the following:
NSMutableDictionary *newLaunchOptions = [NSMutableDictionary dictionaryWithDictionary:launchOptions];
if (launchOptions[UIApplicationLaunchOptionsRemoteNotificationKey]) {
  NSDictionary *remoteNotif = launchOptions[UIApplicationLaunchOptionsRemoteNotificationKey];
  if (remoteNotif[@"custom"] && remoteNotif[@"custom"][@"u"]) {
    NSString *initialURL = remoteNotif[@"custom"][@"u"];
    if (!launchOptions[UIApplicationLaunchOptionsURLKey]) {
      newLaunchOptions[UIApplicationLaunchOptionsURLKey] = [NSURL URLWithString:initialURL];
    }
  }
}
RCTBridge *bridge = [[RCTBridge alloc] initWithDelegate:self launchOptions:newLaunchOptions];
Also, in React Navigation:
https://reactnavigation.org/docs/deep-linking/
const linking = {
  prefixes: ["https://example.com", "example://"],
  config,
  async getInitialURL() {
    const url = await Linking.getInitialURL();
    if (url != null) {
      return url;
    }
  },
};
<NavigationContainer linking={linking}>
...
</NavigationContainer>
I was having the same problem. On iOS (a Flutter build) I solved this by adding "content-available"; the article is here: Apple Content Available Document. I am using OneSignal, so in the API call I added that field. Now even if the app is force-closed, it wakes and deep links work. For OneSignal I had to use "content_available": true. The complete OneSignal Postman payload is:
{
  "app_id": "1234",
  "included_segments": ["Test"],
  "content_available": true,
  "contents": {
    "en": "Hi"
  },
  "data": {
    "dynamic_link": "https://google.com"
  },
  "headings": {
    "en": "Testing"
  }
}
After using rn-fetch-blob to get the .mp3 file, I ran console.log(filePathIos) and got a path like this: "/var/mobile/Containers/Data/Application/80D7496B-862A-44D6-9D3A-F0EF31B565CF/Documents/xxx.mp3"
My code for playing the track:
let filePathIos = `${fs.dirs.DocumentDir}/${fileName}`;

const thisSong = props.songs[0]

const track = {
  id: thisSong.id.toString(),
  url: filePathIos,
  title: thisSong.title,
  artist: "xxx"
}

const togglePlayback = async () => {
  setAudioStatus(!AudioStatus)
  const currentTrack = await TrackPlayer.getCurrentTrack();
  if (currentTrack == null) {
    await TrackPlayer.add(track);
    await TrackPlayer.play();
  } else {
    if (AudioStatus) {
      await TrackPlayer.play();
    } else {
      await TrackPlayer.pause();
    }
  }
}
The problem is: the local file downloads successfully and I can open it locally, but the above code doesn't work. When I press the toggle button, Xcode shows logs like "Starting/Resuming playback", but the song is still not played.
If I change the code above a little (pointing url at an online music URL) like this:
const track = {
  id: thisSong.id.toString(),
  url: 'https://audio-previews.elements.envatousercontent.com/files/103682271/preview.mp3',
  title: thisSong.title,
  artist: "xxx"
}
then the app works like a charm and it can play the song.
I'm using these packages to build the app, and for specific reasons I cannot upgrade/downgrade them:
platform: react native for IOS
react-native-track-player#1.1.3
react-native#0.59.2
rn-fetch-blob#0.10.15
Things I tried with no luck:
changing the url from url: filePathIos to url: 'file://' + encodeURIComponent(filePathIos)
Thank you for any ideas!
Well, found it. The problem is that version 1.2.3 has an error in Track.swift that is fixed on the dev branch:
func getSourceUrl() -> String {
    return url.isLocal ? url.value.path : url.value.absoluteString
}
In version 1.2.3 it is:
func getSourceUrl() -> String {
    return url.value.absoluteString
}
This prevents local files from being played.
You can use the dev branch as a workaround. I will file a bug report.
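For reference, one way to pin the dev branch in package.json until the fix is released (the GitHub org/repo path below is an assumption; double-check it against the project's actual repository):

{
  "dependencies": {
    "react-native-track-player": "github:react-native-kit/react-native-track-player#dev"
  }
}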
I'm trying to adapt an augmented reality app I wrote in JS, which only works in Firefox on Android, into a React Native app that can work on either Android or iOS. Since I need the camera input, I'm using react-native-webrtc (rather than importing the HTML and JS I have been using, since I'm also trying to reduce framerate lag). I've been trying to parse the demo here:
https://github.com/oney/RCTWebRTCDemo/blob/master/main.js
But the demo app is quite complex since it is a video chatroom (from what I can surmise). I just need to access the camera and keep it as the background of the app. This is what I have so far:
import React, { Component } from 'react';
import {
  AppRegistry,
  View,
} from 'react-native';
import {
  RTCPeerConnection,
  RTCMediaStream,
  RTCIceCandidate,
  RTCSessionDescription,
  RTCView,
  MediaStreamTrack,
  getUserMedia,
} from 'react-native-webrtc';

let localStream;

function getLocalStream(isFront, callback) {
  MediaStreamTrack.getSources(sourceInfos => {
    let videoSourceId;
    for (let i = 0; i < sourceInfos.length; i++) {
      const sourceInfo = sourceInfos[i];
      if (sourceInfo.kind == "video" && sourceInfo.facing == (isFront ? "front" : "back")) {
        videoSourceId = sourceInfo.id;
      }
    }
    getUserMedia({
      audio: false,
      video: {
        mandatory: {
          minWidth: 500,
          minHeight: 300,
          minFrameRate: 30
        },
        facingMode: (isFront ? "user" : "environment"),
        optional: [{ sourceId: videoSourceId }]
      }
    }, function(stream) {
      console.log("dddd", stream);
      callback(stream);
    }, logError);
  });
}

function logError(error) {
  console.log("logError: ", error);
}

let container;

var CanvasTest = React.createClass({
  getInitialState: function() {
    return {
      isFront: true,
      selfViewSrc: null,
    };
  },
  componentDidMount: function() {
    container = this;
  },
  render() {
    return (
      <View>
        <RTCView streamURL={this.state.selfViewSrc} />
        {console.log("this.state: ", this.state)}
        {getLocalStream(true, function(stream) {
          //localStream = stream;
          //container.setState({selfViewSrc: stream.toURL()});
        })
        }
      </View>
    );
  }
});

AppRegistry.registerComponent('CanvasTest', () => CanvasTest);
Everything is okay until I try to call the getLocalStream function. I get an "undefined is not an object" error for that line. (I've commented out the lines inside the callback to see if they are causing the problem; they are not.)
This is what I get from the console in Android Studio:
E/ReactNativeJS: undefined is not an object (evaluating 'WebRTCModule.mediaStreamTrackGetSources')
E/EGL_emulation: tid 3106: eglSurfaceAttrib(1165): error 0x3009 (EGL_BAD_MATCH)
W/OpenGLRenderer: Failed to set EGL_SWAP_BEHAVIOR on surface 0xa0899300, error=EGL_BAD_MATCH
I think I'm calling the function in the wrong place. I want the view to load up the camera stream when the app starts. What am I doing wrong?
Is there a simpler example somewhere of using WebRTC in react native?
About "undefined is not an object"
It may be because the library was not installed properly.
I'd suggest restarting with a fresh build:
remove the npm module: rm -rf $YourProject/node_modules/react-native-webrtc
clean the npm cache: npm cache clean
clear Gradle build intermediate files, or
clean the Xcode project via Product -> Clean
(it depends on your env)
npm install react-native-webrtc
follow the documentation step by step carefully (Android / iOS)
be sure to grant all permissions mentioned in the documentation, then try again.
Where to execute getLocalStream()
In your case, you can execute it in componentDidMount.
Otherwise, in some cases, the app may warn that you can't setState() in render().
(setState() normally triggers render(); the warning is there to prevent an infinite loop.)
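A minimal sketch of that, built from the commented-out lines in the question's own render():

componentDidMount: function() {
  container = this;
  // Start the camera stream once the component has mounted,
  // instead of kicking it off inside render().
  getLocalStream(true, function(stream) {
    localStream = stream;
    container.setState({ selfViewSrc: stream.toURL() });
  });
},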
Suggestion
I would suggest NOT testing on simulators, as far as possible, for libraries that need access to a lot of hardware functionality.