I have a OneSignal account.
I have a front-end PWA built with Vue.js that collects users and sends their location as tags:
OneSignal.push(function () {
  OneSignal.sendTags({
    latitude: latitude,
    long: longitude
  });
});
I have successfully sent push messages via the API when addressing all users or a specific segment.
My problem is when using the location filter.
I have tried with this JSON:
{
  "app_id": "APIKEY",
  "contents": {"en": "English Message"},
  "filters": [
    {"field": "location", "radius": "1000", "lat": "50.747164", "long": "3.345545"}
  ]
}
I realise it's counterintuitive to send latitude and expect to filter on lat... but this is what I found in the documentation. Also, some things made me think 'location' is a field of its own, not part of the tags. But I could not find this field when creating a segment manually, so I've hit a dead end here.
If I should use a different platform, or a whole different approach, I'm open to that. Currently I'm stuck with no extra info.
Register for native devices on OneSignal.
If this is happening to you, you are possibly not registered on native platforms.
Check in your OneSignal dashboard, under Settings, whether you have registered native devices.
Location-based sending only works with a native device setup.
If 'location' is not offered when building a segment, then it is simply not active.
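Since the app already stores latitude and long as tags, one workaround while the native location field is unavailable could be to filter on those tags directly with a bounding box. A rough sketch against the create-notification endpoint (the coordinate bounds are illustrative and assume the tags from the question; consecutive filters are ANDed):

// Sketch: approximate a radius filter with a tag-based bounding box.
// Assumes the "latitude" and "long" tags set in the question; the bounds
// below are illustrative only. Run inside an async function.
const response = await fetch("https://onesignal.com/api/v1/notifications", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": "Basic YOUR_REST_API_KEY" // REST API key, not the app_id
  },
  body: JSON.stringify({
    app_id: "YOUR_APP_ID",
    contents: { en: "English Message" },
    filters: [
      { field: "tag", key: "latitude", relation: ">", value: "50.70" },
      { field: "tag", key: "latitude", relation: "<", value: "50.80" },
      { field: "tag", key: "long", relation: ">", value: "3.30" },
      { field: "tag", key: "long", relation: "<", value: "3.40" }
    ]
  })
});
console.log(await response.json());

Note that a bounding box is not a true radius; this is only a crude box around the target point.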
I want to integrate a screen share feature in my React Native application, in which I am using Twilio for video communication. On the web we are able to achieve this by following these steps.
1: We get the display media stream using
navigator.mediaDevices.getDisplayMedia({
  video: true,
});
2: Then we take the first of the stream's video tracks using
const newScreenTrack = first(stream.getVideoTracks());
3: After that we wrap this newScreenTrack in a LocalVideoTrack and keep it in some useState
const localScreenTrack = new TwilioVideo.LocalVideoTrack(
  newScreenTrack
);
4: After that we first unpublish the previous tracks and publish the new track using
videoRoom.localParticipant.publishTrack(newScreenTrack, {
  name: "screen_share",
});
5: And finally we pass these tracks into our ScreenShare component and render them to view the screen share from the remote participant. A consolidated sketch of the whole web flow follows.
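Putting steps 1-4 together, the web flow looks roughly like this (a sketch assembled from the fragments above; videoRoom, TwilioVideo and currentVideoTrack are assumed to exist as in the question):

// Sketch of the web screen-share flow described in steps 1-4.
// Assumes `videoRoom` is a connected twilio-video Room and
// `currentVideoTrack` is the camera track published earlier.
async function startScreenShare(videoRoom, currentVideoTrack) {
  // 1: ask the browser for a screen-capture stream
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true });

  // 2: take the first (and only) video track of that stream
  const newScreenTrack = stream.getVideoTracks()[0];

  // 3: wrap it in a Twilio LocalVideoTrack so it can also be rendered locally
  const localScreenTrack = new TwilioVideo.LocalVideoTrack(newScreenTrack);

  // 4: unpublish the previous camera track, then publish the screen track
  videoRoom.localParticipant.unpublishTrack(currentVideoTrack);
  await videoRoom.localParticipant.publishTrack(newScreenTrack, {
    name: "screen_share",
  });

  return localScreenTrack;
}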
I need to do the same thing in my React Native application: the local participant asks another participant for screen share permission, and once that participant accepts, they should be able to publish their local screen share tracks.
If anyone knows how to do this, please help me. It would be really helpful. Thank you.
I think this is an issue with the react-native-twilio-video-webrtc package. It seems, as you discovered in this issue, that screen sharing was previously a feature of the library and was removed as part of a refactor.
Sadly, the library does more work than the underlying Twilio libraries to look after the video and audio tracks. The Twilio library is built to be able to publish more than one track at a time; however, this React Native library only allows you to publish a single audio track and a single camera video track at a time.
In order to add screen sharing, you can either support pull requests like this one or refactor the library to separate getting access to the camera from publishing a video track, so that you can publish multiple video tracks at a time, including screen tracks.
I am working on a React Native app that uses TalkJS. We want users to get a push notification for every message they receive. The messages are going through, but we aren't getting notifications. We have uploaded our .p12 to the TalkJS dashboard and followed the docs for setting up a React Native project. Below is the relevant code we're injecting into the TalkUI loadScript. We followed https://talkjs.com/docs/Features/Notifications/Mobile_Push_Notifications.html
// ${deviceToken} is interpolated when this snippet is injected into loadScript
const res = await window.talkSession.registerDevice({ platform: "ios", pushRegistrationId: "${deviceToken}" });
alert("registering deviceToken ${deviceToken} response: " + res);
We are getting the alert with the correct deviceToken, but this method does not return anything. We tried .then and an error-first callback, but nothing is coming back from this method.
Edit: this method is not designed to return anything, so the response is expected to be empty.
The Promise was designed to not return anything, as Kapobajza mentioned.
From their support chat: "We have an issue regarding push notifications on iOS when using React Native."
Edit: this issue has been resolved. In addition to work that needed to be done by the TalkJS team, push notifications require a user to have a 'role', as is very loosely implied in the Overview for Push Notifications
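For anyone hitting the same wall: the role is assigned when constructing the TalkJS user. A minimal sketch (the id, name and role values are placeholders; the role itself must exist in your TalkJS dashboard):

// Sketch: push notifications are only sent for users that have a role.
// The id/name/role values are placeholders.
const me = new Talk.User({
  id: "user-123",
  name: "Alice",
  role: "default" // without a role, no push notifications are sent
});
const session = new Talk.Session({ appId: "YOUR_APP_ID", me: me });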
I have a React Native messaging app built with Expo. I got notifications to work, but the problem is that each message arrives as a separate notification.
I would like to group notifications sent by the same person. Currently, I have:
[Notification]
John - Hey, how are you?
[Notification]
John - Long time no see!
and I would like them to merge into a single notification when the second message is received, like this:
[Notification]
John |
Hey, how are you?
Long time no see!
I might be missing something, because I cannot find anyone else asking about such common functionality.
The code I use to send notifications from my backend (Python):
import requests

session = requests.Session()

headers = {
    'Accept': 'application/json',
    'Accept-encoding': 'gzip, deflate',
    'Content-Type': 'application/json',
}
session.post(
    "https://exp.host/--/api/v2/push/send",
    json={
        "to": expo_token,
        "title": username,
        "body": message_content,
    },
    headers=headers,
)
On iOS you should use apns-collapse-id:
https://developer.apple.com/documentation/usernotifications/setting_up_a_remote_notification_server/sending_notification_requests_to_apns
An identifier you use to coalesce multiple notifications into a single notification for the user. Typically, each notification request causes a new notification to be displayed on the user's device. When sending the same notification more than once, use the same value in this header to coalesce the requests. The value of this key must not exceed 64 bytes.
Update
To use the collapse feature, you can use another notification service that supports collapsing and React Native, for example OneSignal:
https://documentation.onesignal.com/docs/how-notifications-work#section-notification-collapsing
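For illustration, OneSignal's create-notification endpoint accepts a collapse_id, so a newer notification can replace an earlier one. A sketch (the ids, keys and variables are placeholders; run inside an async function):

// Sketch: send via OneSignal with collapse_id so the newest message from a
// sender replaces the previous notification. All ids/keys are placeholders.
await fetch("https://onesignal.com/api/v1/notifications", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": "Basic YOUR_REST_API_KEY"
  },
  body: JSON.stringify({
    app_id: "YOUR_APP_ID",
    include_player_ids: [recipientPlayerId],
    headings: { en: username },
    contents: { en: messageContent },
    collapse_id: "chat-" + senderId // same id per sender -> collapses
  })
});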
It seems like you cannot control how notifications are grouped in an Expo app right now, but I noticed that on iOS they are grouped together automatically, while on Android you need to set notification.androidMode to collapse.
Take a look at the current documentation: https://docs.expo.dev/versions/latest/config/app/#androidmode
app.json
Path: notification.androidMode
Type: enum
Show each push notification individually (default) or collapse into one (collapse).
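For reference, the corresponding entry in app.json would look like this:

{
  "expo": {
    "notification": {
      "androidMode": "collapse"
    }
  }
}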
All your messages will still be separate notifications, but at least they will collapse into one stack.
Note that this only works in a standalone app, not in Expo Go.
In Titanium there is the following property:
Titanium.Geolocation.lastGeolocation
As explained in this doc: http://docs.appcelerator.com/platform/latest/#!/api/Titanium.Geolocation
It is described as follows:
JSON representation of the last geolocation received.
LastEvent is the JSON version of the last geolocation sent by the OS. This does not trigger a geolocation attempt, nor wait for such. If no geolocation has happened, this value may be null or undefined.
My question is: when it says "lastGeolocation", does it mean the last geolocation received by the device ever (e.g. the app could have got a location from another app such as Google Maps), or just the last geolocation received by the app?
I believe it should be the device's last location, irrespective of which app requested it.
It is easy to check with the following steps:
1) Get a fresh current location from the Titanium app and check the value of lastGeolocation.
2) Request a location from somewhere else, with a different app such as Google Maps.
3) Then check lastGeolocation again in the Titanium app.
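A quick sketch of the check inside the Titanium app (assuming lastGeolocation is the JSON string described in the docs quoted above):

// Sketch: read and log the last known geolocation without triggering a fix.
// Assumes lastGeolocation is a JSON string, per the docs quoted above.
var last = Ti.Geolocation.lastGeolocation;
if (last) {
    var event = JSON.parse(last);
    Ti.API.info('last geolocation: ' + JSON.stringify(event.coords || event));
} else {
    Ti.API.info('no geolocation received yet');
}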
At the time of this post I believe there are no geofencing modules available for React Native, so I would like to implement an alternative poor man's strategy. I discovered React Native's Geolocation module, but the official doc is not clear:
1) Does the Geolocation module run in the background and get the current user coordinates automatically (even if the app is in the background)? If yes, are these stored in a variable or in state?
2) If (1) is true, how can I detect a change in state? Once I detect a change in state (i.e. the user's location), I would like to push this new location to a remote server and store it in a database. On the other hand, I do not want to store each and every inch the user moves!
Does this strategy make sense at all? My main concern is battery consumption, of course.
1) You need to enable this feature in Xcode. Then save the location you get from the navigator.geolocation.getCurrentPosition callback somewhere in your app, e.g. in AsyncStorage.
2) The callback is only called when the location changes, so you don't need to explicitly detect any location change.
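A sketch of that approach using watchPosition with a distanceFilter, so you don't record every inch (the 100 m threshold and the uploadLocation helper are assumptions, not part of the API):

// Sketch: watch the location and push updates to a server, throttled by
// distanceFilter. uploadLocation() is a hypothetical helper that POSTs the
// coordinates to your backend; the 100 m threshold is illustrative.
const watchId = navigator.geolocation.watchPosition(
  (position) => {
    const { latitude, longitude } = position.coords;
    uploadLocation(latitude, longitude);
  },
  (error) => console.warn(error.message),
  {
    enableHighAccuracy: false, // coarser fixes save battery
    distanceFilter: 100 // only fire after moving ~100 meters
  }
);
// later: navigator.geolocation.clearWatch(watchId);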