Deep linking - doesn't work if app is closed - react-native

I'm implementing deep linking with Expo in my React Native app. I've managed to get it working with the code below, following a tutorial and the documentation for adjusting it to my nested stacks:
const linking = {
  prefixes: [prefix],
  config: {
    screens: {
      Drawer: {
        screens: {
          Tabs: {
            screens: {
              Profile: "profile"
            }
          }
        }
      },
    }
  }
}
return (
  <NavigationContainer linking={linking}>
    <RootStackScreen actions={actions} showLoader={showLoader} user={user} {...props} />
  </NavigationContainer>
)
}
If I use myscheme://profile it works as expected, but only if the app is already open in the background. When the app is closed, the link just opens the app on my initial home screen. I tried googling and searching but couldn't find an explanation that fits what I did. I also tried adding the getInitialURL function to linking, which fires when the app was closed and is opened from a deep link, but I couldn't figure out how to use it to trigger the navigation.
async getInitialURL() {
  const url = await Linking.getInitialURL(); // This returns the link that was used to open the app
  if (url != null) {
    //const { path, queryParams } = Linking.parse(url);
    //console.log(path, queryParams)
    //Linking.openURL(url)
    return url;
  }
},

I suppose you have confirmed that your getInitialURL function is actually called when the app is launched? Also, the commented-out code inside if (url != null) { isn't supposed to be commented, right?
If the above is fine then the issue could be related to the debugger being enabled. As per React Native's documentation (https://reactnative.dev/docs/linking#getinitialurl):
getInitialURL may return null while debugging is enabled. Disable the debugger to ensure it gets passed.

I was experiencing this same issue, and doing the following helped me.
In the component at the root of your navigation stack, where you configure deep linking, add the following code:
import React, { useEffect } from 'react'
import { Linking } from 'react-native'

const ApplicationNavigator = () => {
  useEffect(() => {
    // THIS IS THE MAIN POINT OF THIS ANSWER
    const navigateToInitialUrl = async () => {
      const initialUrl = await Linking.getInitialURL()
      if (initialUrl) {
        await Linking.openURL(initialUrl)
      }
    }
    navigateToInitialUrl()
  }, [])

  const linking = {
    prefixes: ['<your_custom_scheme>://'],
    config: {
      /* configuration for matching screens with paths */
      screens: {},
    },
  }

  return (
    // Your components/navigation setup
  )
}
So apparently, your app receives the URL but only "uses" it to wake the app up from the closed state. Once the app is in the foreground, the useEffect runs and re-opens the URL, which triggers navigation to the intended screen.
PS: Make sure that your linking tree matches your app tree
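For illustration, here is roughly what "linking tree matches app tree" means for a Drawer > Tabs > Profile nesting like the one in the question. This is only a sketch; the navigator and screen names (RootStack, Drawer, Tabs, Profile, ProfileScreen) are assumed from the question's linking config and must match your real components, and it assumes React Navigation v5+ packages:

import React from 'react'
import { createStackNavigator } from '@react-navigation/stack'
import { createDrawerNavigator } from '@react-navigation/drawer'
import { createBottomTabNavigator } from '@react-navigation/bottom-tabs'

const RootStack = createStackNavigator()
const Drawer = createDrawerNavigator()
const Tab = createBottomTabNavigator()

const ProfileScreen = () => null // placeholder for the real screen

// Each screen name registered below must match a key in linking.config.
const TabScreens = () => (
  <Tab.Navigator>
    <Tab.Screen name="Profile" component={ProfileScreen} /> {/* opened by myscheme://profile */}
  </Tab.Navigator>
)

const DrawerScreens = () => (
  <Drawer.Navigator>
    <Drawer.Screen name="Tabs" component={TabScreens} />
  </Drawer.Navigator>
)

const RootStackScreen = () => (
  <RootStack.Navigator>
    <RootStack.Screen name="Drawer" component={DrawerScreens} />
  </RootStack.Navigator>
)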

There are a couple of things you can check.
Verify that the structure of linking.config matches your navigation structure. I've had a similar issue in the past, and resolved it by making sure my config structure was correct.
Ensure that the linking object is set up properly. Refer to the docs to verify. From the looks of it, the linking object you've shown doesn't have the getInitialURL property in it.
Confirm that you've set up the native side of things as documented (see the sketch below).
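Since the question uses Expo, the native piece is mostly declaring the scheme in app.json. A minimal sketch, assuming "myscheme" from the question's example URL (the rest of app.json is omitted):

{
  "expo": {
    "scheme": "myscheme"
  }
}

In a bare React Native app you would instead follow the platform-specific setup in the React Navigation deep-linking docs.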
Hopefully something works out! Let me know if it doesn't. 🙂

Based on https://documentation.onesignal.com/v7.0/docs/react-native-sdk#handlers
Deep linking in iOS from an app closed state
You must modify application:didFinishLaunchingWithOptions in your AppDelegate.m file to use the following:
NSMutableDictionary *newLaunchOptions = [NSMutableDictionary dictionaryWithDictionary:launchOptions];
if (launchOptions[UIApplicationLaunchOptionsRemoteNotificationKey]) {
  NSDictionary *remoteNotif = launchOptions[UIApplicationLaunchOptionsRemoteNotificationKey];
  if (remoteNotif[@"custom"] && remoteNotif[@"custom"][@"u"]) {
    NSString *initialURL = remoteNotif[@"custom"][@"u"];
    if (!launchOptions[UIApplicationLaunchOptionsURLKey]) {
      newLaunchOptions[UIApplicationLaunchOptionsURLKey] = [NSURL URLWithString:initialURL];
    }
  }
}
RCTBridge *bridge = [[RCTBridge alloc] initWithDelegate:self launchOptions:newLaunchOptions];
Also, in React Navigation:
https://reactnavigation.org/docs/deep-linking/
const linking = {
  prefixes: ["https://example.com", "example://"],
  config,
  async getInitialURL() {
    const url = await Linking.getInitialURL();
    if (url != null) {
      return url;
    }
  },
};

<NavigationContainer linking={linking}>
  ...
</NavigationContainer>

I was having the same problem. In iOS (Flutter build) I solved this by adding "Content Available." The article is here: Apple Content Available Document. I am using OneSignal, so I added that field in the API call. Now even if the app is force-closed it wakes up and deep links work. For OneSignal I had to use "content_available": true. The complete OneSignal Postman body is:
{
  "app_id": "1234",
  "included_segments": ["Test"],
  "content_available": true,
  "contents": {
    "en": "Hi"
  },
  "data": {
    "dynamic_link": "https://google.com"
  },
  "headings": {
    "en": "Testing"
  }
}

Related

React Native component not making new query after mount

We're using react-native-web, so native and web are in one code base. I have an instance where a user clicks the back button to return to a main page, and this should fire a re-query of the backend. We're also using Apollo hooks for queries (useQuery).
So far, this works for web but not for native. I tried creating a useEffect hook to check if navigation exists and, specifically, whether navigation.isFocused(), like so:
const {
  data,
  loading: childProfilesLoading,
  error: childProfilesError,
  refetch: refetchChildProfiles,
} = useQuery(LIST_PROFILES, {
  fetchPolicy: 'no-cache',
})

// this method also exists on the previous page
const goBack = () => {
  if (history) {
    history.goBack()
  } else if (navigation) {
    navigation.goBack()
  }
}

useEffect(() => {
  if (navigation?.isFocused()) {
    refetchChildProfiles()
  }
}, [navigation, refetchChildProfiles])
but this doesn't work. Is there something I'm missing in forcing a refetch on native?
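One thing to note: navigation.isFocused() is only evaluated when that effect runs, and its dependencies don't change when the screen regains focus, so the refetch never fires on the way back. A sketch of subscribing to the 'focus' event instead, assuming React Navigation v5+ and that navigation and refetchChildProfiles come from the component above:

useEffect(() => {
  if (!navigation) return // web path: no native navigation object
  // Re-run the query every time this screen comes back into focus.
  const unsubscribe = navigation.addListener('focus', () => {
    refetchChildProfiles()
  })
  return unsubscribe
}, [navigation, refetchChildProfiles])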

how to perform deep linking on React Native Navigation

I have a request to implement deep linking in our React Native application, whereby clicking a link takes the user directly into the installed app. I am able to successfully direct them to the app; however, it must also navigate to a certain page.
To address the problem, I use the code below. If there is a better approach to handling this, I would appreciate it if you could share it with me!
const useUrl = async () => {
  const url = await Linking.getInitialURL();
  if (url) {
    Navigation.push(componentId, {
      component: {
        name: 'screen',
      },
    });
  }
};
react-native and react-navigation both handle this as part of the "Linking" feature set that they offer, but I can't find a way to handle it with React Native Navigation.
For me, I just added a path to my stack navigator, like this:
Product: {
  screen: ProductScreen,
  path: 'product/:productId'
},
and make sure the web site that handles the deep link has a similar path in its routing, for example https://yourweb.com/product/:productId.
And in your screen file, add this
useEffect(() => {
  Linking.addEventListener('url', _handleDeepLink)
  return () => {
    Linking.removeEventListener('url', _handleDeepLink);
  }
}, [])

const _handleDeepLink = (event) => {
  if (event) {
    const route = event.url.replace(/.*?:\/\//g, '')
    const id = route.match(/\/([^\/]+)\/?$/)[1];
    if (id !== undefined) {
      // do your code here if screen just opened via deep link
    }
  }
}
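If you need to actually navigate once the id is extracted, a sketch of that branch, assuming a navigation prop on the screen and the Product route from the path example above (names are illustrative only):

const _handleDeepLink = (event) => {
  if (event) {
    const route = event.url.replace(/.*?:\/\//g, '')
    const id = route.match(/\/([^\/]+)\/?$/)[1]
    if (id !== undefined) {
      // Hypothetical: 'Product' and productId come from the path example above
      navigation.navigate('Product', { productId: id })
    }
  }
}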

react-native - require() must have a single string literal argument

I'm writing a sampler app in React Native using Expo and have run into the following error:
react-native - require() must have a single string literal argument
Pre-recorded samples are played like this:
const SAMPLES = [
  { name: 'air horn', uri: require('../samples/air_horn.mp3') },
  { name: 'rim shot', uri: require('../samples/rimshot.mp3') },
]

playSample = (uri) => {
  const { soundObject, status } = Expo.Audio.Sound.create(
    uri,
    { shouldPlay: true }
  )
}
It's somewhat based on the Expo documentation and works fine. However, I also want users to record their own samples, which means the sound comes from a custom URL:
playCustomSample = () => {
  const customSample = this.props.navigation.state.params
    ? require(this.props.navigation.state.params.custom.toString())
    : require('./samples/blank.mp3')
  try {
    const { soundObject, status } = Expo.Audio.Sound.create(
      customSample,
      { shouldPlay: true }
    )
  } catch(error) {
    console.log('ERROR:', error)
  }
}
When I log the custom param I'm being passed by navigation, it's there. So I get that I'm doing it conceptually wrong: require() needs a static string, but I'm not going to know the name/location of the file until it is recorded.
How would I get access to the audio file without knowing the link ahead of time, so I can include it?
Also, I don't have access to a Mac, so ejecting from Expo and using something like react-native-audio or react-native-sound isn't an option.
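For what it's worth, Expo's Audio source can also be a { uri: ... } object instead of a require() call, so a recorded file's URI can be passed through without requiring it. A minimal sketch under the question's assumptions (the param name custom and the Expo.Audio.Sound.create call are taken from the question; only bundled assets need require()):

playCustomSample = async () => {
  // Hypothetical: the recorded file's URI arrives via navigation params, as in the question.
  const customUri = this.props.navigation.state.params
    ? this.props.navigation.state.params.custom.toString()
    : null
  try {
    const source = customUri
      ? { uri: customUri }              // dynamic/recorded file: plain URI object
      : require('./samples/blank.mp3')  // static fallback stays a require()
    // shouldPlay: true starts playback as soon as the sound is loaded
    await Expo.Audio.Sound.create(source, { shouldPlay: true })
  } catch (error) {
    console.log('ERROR:', error)
  }
}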

How to navigate screen on notification open in React Native with One Signal?

Here is my code. How can I navigate the user to the desired screen when they click on a notification, or on a button in a notification?
componentWillMount() {
  OneSignal.addEventListener('received', this.onReceived);
  OneSignal.addEventListener('opened', this.onOpened);
  OneSignal.addEventListener('registered', this.onRegistered);
  OneSignal.addEventListener('ids', this.onIds);
  OneSignal.inFocusDisplaying(2);
  OneSignal.requestPermissions({
    alert: true,
    badge: true,
    sound: true
  });
}

componentWillUnmount() {
  this.isUnmounted = true;
  OneSignal.removeEventListener('received', this.onReceived);
  OneSignal.removeEventListener('opened', this.onOpened);
  OneSignal.removeEventListener('registered', this.onRegistered);
  OneSignal.removeEventListener('ids', this.onIds);
}

onReceived(notification) {
  console.log("Notification received: ", notification);
}

onOpened(openResult) { // HERE I WANT TO NAVIGATE TO ANOTHER SCREEN INSTEAD OF HOME SCREEN
  this.isNotification = true;
  let data = openResult.notification.payload.additionalData;
  let inFocus = openResult.notification.isAppInFocus;
  console.log('Message: ', openResult.notification.payload.body);
  console.log('Data: ', openResult.notification.payload.additionalData);
  console.log('isActive: ', openResult.notification.isAppInFocus);
  console.log('openResult: ', openResult);
}

onRegistered(notifData) {
  console.log("Device had been registered for push notifications!", notifData);
}

onIds(device) {
  try {
    AsyncStorage.setItem("@SC:deviceInfo", JSON.stringify(device));
  } catch (error) {
    console.log(error);
  }
}
Does anyone have knowledge about all of this (React Native + OneSignal + React Navigation + Redux)? Please help!
To achieve the desired behavior you can do a couple of things. You can manually check the notification and the state of the router and, if necessary, redirect the user to the screen, or you can use the deep linking functionality.
To use deep linking, you attach a url parameter to your notification while sending it. To direct the user to the correct screen in your app, you can use the react-navigation deep linking functionality.
From One Signal Documentation
url string The URL to open in the browser when a user clicks on the
notification. Example: http://www.google.com
Note: iOS needs https or updated NSAppTransportSecurity in plist
From React Navigation Documentation
Deep Linking
In this guide we will set up our app to handle external URIs. Let's start with the SimpleApp that we created in the
getting started guide. In this example, we want a URI like
mychat://chat/Taylor to open our app and link straight into Taylor's
chat page.
You can dispatch a NavigationAction or perform a navigate action when onOpened is fired. The following snippet should work:
componentWillMount() {
  OneSignal.inFocusDisplaying(0);
  OneSignal.removeEventListener('opened', this.onOpened.bind(this));
  OneSignal.addEventListener('opened', this.onOpened.bind(this));
}

onOpened(openResult) {
  let data = openResult.notification.payload.additionalData;
  // ScreenName is the name of the screen you defined in StackNavigator
  this.props.navigation.navigate('ScreenName', data)
}
While searching for a solution I landed on this question, and I think most of the answers are now old. So, in case anyone is looking for a solution, they can try this.
OneSignal.setNotificationOpenedHandler((notificationResponse) => {
  const { notification } = notificationResponse;
  if (notification) {
    const { additionalData = null } = notification;
    if (additionalData) {
      const { type } = additionalData;
      navigateToScreen(type);
    }
  }
});

const navigateToScreen = (type) => {
  switch (type) {
    case "post":
    case "track":
      navigation.navigate("SinglePost");
      return;
    default:
      return;
  }
};
In case someone else comes across a similar problem to mine, I want to add onto what @Mostafiz Rahman said. The app I was working on had a bunch of nested stacks and tabs (react-navigation v1) inside of a drawer, and if Stack1 was backgrounded and the notification was for Stack2, I couldn't get them to jump around.
I ended up putting the logic described by Mostafiz Rahman in every one of the stacks' first screens (1st screen of Stack1, 1st screen of Stack2, etc.), and that did it!

Simple example for accessing the camera via WebRTC in react-native (Android)

I'm trying to adapt an augmented reality app I wrote in JS, which only works in Firefox on Android, into a React Native app that can work on either Android or iOS. Since I need the camera input, I'm using react-native-webrtc (rather than importing the HTML and JS I have been using, since I'm also trying to reduce framerate lag). I've been trying to parse the demo here:
https://github.com/oney/RCTWebRTCDemo/blob/master/main.js
But the demo app is quite complex since it is a video chatroom (from what I can surmise). I just need to access the camera and keep it as the background of the app. This is what I have so far:
import React, { Component } from 'react';
import {
  AppRegistry,
  View,
} from 'react-native';
import {
  RTCPeerConnection,
  RTCMediaStream,
  RTCIceCandidate,
  RTCSessionDescription,
  RTCView,
  MediaStreamTrack,
  getUserMedia,
} from 'react-native-webrtc';

let localStream;

function getLocalStream(isFront, callback) {
  MediaStreamTrack.getSources(sourceInfos => {
    let videoSourceId;
    for (let i = 0; i < sourceInfos.length; i++) {
      const sourceInfo = sourceInfos[i];
      if (sourceInfo.kind == "video" && sourceInfo.facing == (isFront ? "front" : "back")) {
        videoSourceId = sourceInfo.id;
      }
    }
    getUserMedia({
      audio: false,
      video: {
        mandatory: {
          minWidth: 500,
          minHeight: 300,
          minFrameRate: 30
        },
        facingMode: (isFront ? "user" : "environment"),
        optional: [{ sourceId: sourceInfos.id }]
      }
    }, function(stream) {
      console.log("dddd", stream);
      callback(stream);
    }, logError);
  });
}

function logError(error) {
  console.log("logError: ", error);
}

let container;

var CanvasTest = React.createClass({
  getInitialState: function() {
    return {
      isFront: true,
      selfViewSrc: null
    };
  },
  componentDidMount: function() {
    container = this;
  },
  render() {
    return (
      <View>
        <RTCView streamURL={this.state.selfViewSrc} />
        {console.log("this.state: ", this.state)}
        {getLocalStream(true, function(stream) {
          //localStream = stream;
          //container.setState({selfViewSrc: stream.toURL()});
        })
        }
      </View>
    );
  }
});

AppRegistry.registerComponent('CanvasTest', () => CanvasTest);
Everything is okay until I try to call the getLocalStream function. I get an "undefined is not an object" error on that line. (I've commented out the lines inside the callback to see if they were causing the problem; they are not.)
This is what I get from the console in Android Studio:
E/ReactNativeJS: undefined is not an object (evaluating 'WebRTCModule.mediaStreamTrackGetSources')
E/EGL_emulation: tid 3106: eglSurfaceAttrib(1165): error 0x3009 (EGL_BAD_MATCH)
W/OpenGLRenderer: Failed to set EGL_SWAP_BEHAVIOR on surface 0xa0899300, error=EGL_BAD_MATCH
I think I'm calling the function in the wrong place. I want the view to load up the camera stream when the app starts. What am I doing wrong?
Is there a simpler example somewhere of using WebRTC in react native?
About "undefined is not an object"
It may be because the library was not installed properly.
I'd suggest starting with a fresh build again:
remove the npm module: rm -rf $YourProject/node_modules/react-native-webrtc
clean the npm cache: npm cache clean
clear the Gradle build intermediate files, or clean the Xcode project via Product -> Clean (it depends on your environment)
npm install react-native-webrtc
follow the documentation step by step carefully (Android / iOS)
be sure to grant all permissions mentioned in the documents, then try again.
Where to execute getLocalStream()
In your case, you can execute it in componentDidMount.
Otherwise, in some cases, the app may warn that you can't setState() in render()
(setState() normally triggers render(); the warning is there to prevent an infinite loop).
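A minimal sketch of that idea, reusing the question's getLocalStream and container pattern (the stream.toURL() call is taken from the question's commented-out code; this is only illustrative, not the library's required usage):

componentDidMount: function() {
  container = this;
  // Kick off the camera stream once the component is mounted,
  // instead of calling getLocalStream() inside render().
  getLocalStream(true, function(stream) {
    localStream = stream;
    container.setState({ selfViewSrc: stream.toURL() });
  });
},
render() {
  // render() now only reads state; no side effects here
  return (
    <View>
      <RTCView streamURL={this.state.selfViewSrc} />
    </View>
  );
}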
Suggestion
I would suggest NOT testing on simulators, if possible, for libraries that need access to a lot of the hardware's functionality.