I have been playing around with the new iBeacons in iOS 7. I have one device set up as a beacon, and the other device ranging to detect when I am near, far, immediate, etc. I'd like to know very quickly when I cross between these ranges. Is there any way to adjust the latency? I find that I have to move my device around very slowly or I will not know when I cross these thresholds.
No, you are not able to adjust that latency. As Apple says in the Region Monitoring Guide:
To prevent spurious notifications, iOS does not deliver region
notifications until certain threshold conditions are met.
Specifically, the user’s location must cross the region boundary and
move away from that boundary by a minimum distance and remain at that
minimum distance for at least 20 seconds before the notifications are
reported.
Apple does not specify what the latency is; it is evidently not fast enough for your application.
As a trade-off, you can implement beacon ranging yourself using Core Bluetooth: listen for peripheral advertisement events while scanning, and estimate range from the RSSI reported in:
centralManager:didDiscoverPeripheral:advertisementData:RSSI:
If you are using a custom beacon, such as the RadiusNetworks VirtualiBeacon VM image, you can adjust the advertising frequency. The flip side is that your app must run in the foreground, as opposed to Core Location, which delivers beacon events even when your app is not running.
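To illustrate, here is a minimal Swift sketch of that approach (foreground only). The class name and the RSSI thresholds are made-up values you would calibrate against your own beacons, and this works best with a beacon whose advertisement you can actually identify from Core Bluetooth, such as the configurable one mentioned above.

import CoreBluetooth

// Sketch only: scan with duplicates enabled so every advertisement delivers
// a fresh RSSI sample, then bucket the signal strength into rough ranges.
final class RSSIRanger: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil,
                                   options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        guard RSSI.intValue != 127 else { return } // 127 means the RSSI reading is unavailable
        switch RSSI.intValue {
        case (-55)...0:     print("roughly immediate") // assumed threshold
        case (-75)...(-56): print("roughly near")      // assumed threshold
        default:            print("roughly far")
        }
    }
}

The latency is then bounded by the beacon's advertising interval rather than by Core Location's smoothing, at the cost of doing your own filtering of the noisy RSSI values.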
Context
iOS background and suspended mode
Library version: react-native-ble-plx 1.0.3
Platform: iOS.
Expected Behavior
Even when the app is in the background or suspended on iOS, the callback passed to startDeviceScan gets invoked.
Current Behavior
When the app is in the background or suspended on iOS, the callback passed to startDeviceScan never gets invoked. I assume no discovery event is ever delivered, according to Apple's BLE documentation. Is it possible to configure startDeviceScan so that the app keeps scanning in the background and suspended modes?
After some research, I found a few things that help explain BLE scanning in iOS background and suspended modes.
1.) Scanning is throttled down when iOS devices are in the background.
While scanning in the foreground will likely immediately discover a device advertising next to it, discovery in the background can take up to ~60 times longer. The iOS system makes no assumptions that the user would prefer one app to have better Bluetooth functionality than another (or that only one app wants to use it). And since it is shared functionality, they want users to have a uniform experience across apps. You should check out the technical specifications regarding Advertising and Scanning intervals to get a better idea of what Apple has to do under the covers.
2.) Your devices may have already discovered each other before entering the background.
We must remember that Apple disables the CBCentralManagerScanOptionAllowDuplicatesKey scanning flag when we enter the background. Since you're not even specifying this flag, it defaults to NO anyway. So once they've seen each other, you will not get another discovery callback while they are in the background (see the sketch after these points).
3.) Respect the Limits of Advertising Data
Although advertising packets in general can hold a variety of information about the peripheral device, you may advertise only your device’s local name and the UUIDs of any services you want to advertise. There are also limits as to how much space you can use when advertising data. When your app is in the foreground, it can use up to 28 bytes of space in the initial advertisement data for any combination of the two supported advertising data keys. If this space is used up, there are an additional 10 bytes of space in the scan response that can be used only for the local name. Any service UUIDs that do not fit in the allotted space are added to a special “overflow” area; they can be discovered only by an iOS device that is explicitly scanning for them. While your app is in the background, the local name is not advertised and all service UUIDs are placed in the overflow area.
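To make these points concrete at the native layer, here is a hedged Core Bluetooth sketch in Swift. The service UUID is a placeholder for whatever your peripheral actually advertises; the point is that background discovery only happens when you scan for explicit service UUIDs, and the allow-duplicates option is honored only in the foreground.

import CoreBluetooth

// Placeholder UUID: replace with the service your peripheral advertises.
let targetService = CBUUID(string: "0000FFE0-0000-1000-8000-00805F9B34FB")

func startScan(_ central: CBCentralManager) {
    central.scanForPeripherals(
        withServices: [targetService],  // required for any discovery while backgrounded
        options: [CBCentralManagerScanOptionAllowDuplicatesKey: true]  // ignored once the app is in the background
    )
}

On the react-native-ble-plx side the equivalent, as far as I can tell, is passing explicit service UUIDs to startDeviceScan rather than null.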
We are currently using ExoPlayer for one of our applications, which is very similar to the HQ Trivia app, and we use HLS as the streaming protocol.
Due to the nature of the game, we are trying to keep all the viewers of this stream to have the same latency, basically to keep them in sync.
We noticed that with the current backend configuration the latency is somewhere between 6 and 10 seconds. Based on this fact, we assumed that it would be safe to “force” the player to play at a bigger delay (15 seconds, further off the live edge), this way achieving the same (constant) delay across all the devices.
We’re using the EXT-X-PROGRAM-DATE-TIME tag to get the server time of the currently playing content, and we also have a master clock with the current time (NTP). We’re constantly comparing the two clocks to check the current latency. We’re pausing the player until it reaches the desired delay, then we’re resuming playback.
The problem with this solution is that the latency can get worse (accumulating delay) over time, and we have no choice other than to restart playback and redo the steps described above if the delay gets too big (steps over a specified threshold). Before restarting the player we also try to slightly increase the playback speed until it reaches the specified delay.
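To make the logic concrete, here is roughly the decision we apply, sketched in Swift purely for illustration; the names and constants are ours, and in the app this logic sits on top of ExoPlayer's pause/seek/setPlaybackParameters calls rather than standing alone.

import Foundation

// Illustrative constants matching the setup described above.
let targetDelay: TimeInterval = 15.0  // desired distance behind the live edge
let maxDrift: TimeInterval = 2.0      // restart if we fall this far past the target
let catchUpRate: Float = 1.05         // slight speed-up tried before restarting

enum PlayerAction {
    case keepPlaying
    case pause
    case speedUp(rate: Float)
    case restart
}

// latency = NTP wall-clock time minus the EXT-X-PROGRAM-DATE-TIME of the
// content currently being rendered.
func action(forCurrentLatency latency: TimeInterval) -> PlayerAction {
    if latency < targetDelay {
        return .pause                       // too close to live: wait until we are 15 s behind
    } else if latency > targetDelay + maxDrift {
        return .restart                     // drifted too far: reload and realign
    } else if latency > targetDelay {
        return .speedUp(rate: catchUpRate)  // small drift: play slightly faster to catch up
    }
    return .keepPlaying
}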
The ExoPlayer instance is set up with a DefaultLoadControl, DefaultRenderersFactory, DefaultTrackSelector, and the media source uses a DefaultDataSourceFactory.
The server-side configuration is as follows:
cupertinoChunkDurationTarget: 2000 (default: 10000)
cupertinoMaxChunkCount: 31 (default: 10)
cupertinoPlaylistChunkCount: 15 (default: 3)
My first question is whether this is even achievable with a protocol like HLS. Why does the player drift away, accumulating more and more delay?
Is there a better setup for the ExoPlayer instance considering our specific use case?
Is there a better way to achieve a constant playback delay across all the playing devices? How important are the parameters on the server side in trying to achieve such a behaviour?
I would really appreciate any kind of help because I have reached a dead-end. :)
Thanks!
The only solution for this is provided by:
https://netinsight.net/product/sye/
Their solution includes frame-accurate sync with no drift and stateful ABR. This probably can’t be done with HTTP-based protocols, hence their solution is built on UDP transport.
I know that the present mode of the swap chain can be used to sync the frame rate to the refresh rate of the screen (with VK_PRESENT_MODE_FIFO_KHR for example).
But is there a way of limiting the frame rate to a fraction of the monitor refresh rate? (E.g. I want my application to run at 30 FPS instead of 60.)
In other words, is there a way of emulating what wglSwapIntervalEXT(2) does for OpenGL?
Vulkan is a low-level API. It tries to give you the tools you need to build the functionality you want.
As such, when you present an image, the API assumes that you want the image presented as soon as possible (within the restrictions of the swapchain). If you want to delay presentation, then you delay presentation. That is, you don't present the image until it's near the time to present a new image, based on your own CPU timings.
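For example, a minimal sketch of that CPU-side pacing, shown here in Swift only for brevity (the same few lines translate directly into the C or C++ render loop you would use with Vulkan); the 30 FPS target is just an example value.

import Foundation

let targetFrameDuration = 1.0 / 30.0  // e.g. half the rate of a 60 Hz display
var nextPresentDeadline = Date().timeIntervalSinceReferenceDate

// Call once per frame, passing your own "submit the present" routine
// (the equivalent of vkQueuePresentKHR in the real render loop).
func throttleThenPresent(_ present: () -> Void) {
    let now = Date().timeIntervalSinceReferenceDate
    if now < nextPresentDeadline {
        Thread.sleep(forTimeInterval: nextPresentDeadline - now)  // wait out the remainder of the frame
    }
    present()
    nextPresentDeadline = max(nextPresentDeadline + targetFrameDuration,
                              Date().timeIntervalSinceReferenceDate)
}

Combined with VK_PRESENT_MODE_FIFO_KHR, holding the present past one vertical blank like this effectively behaves like a swap interval of 2.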
After I searched for a very long time for a way to play notification noises only through the headphones (when plugged in), on a stream separate from STREAM_MUSIC, in a way that could interrupt and be completely audible over any background music, Android finally came out with the AudioAttributes API. By using the following code, I'm able to achieve exactly what I want for notifications, at least in API 21 or higher (STREAM_MUSIC is the best option I've found for lower versions):
AudioAttributes audioAttributes = new AudioAttributes.Builder()
        .setUsage(AudioAttributes.USAGE_ASSISTANCE_SONIFICATION)
        .setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
        .build();
Unfortunately, there doesn't appear to be any way to adjust the volume of the sonification in my app's settings. I currently use the AudioManager in the following way, but it only allows volume adjustments to streams, and none of STREAM_ALARM, STREAM_NOTIFICATION, STREAM_RING, or STREAM_MUSIC applies to whatever routing strategy is used for the sonification:
audioManager.setStreamVolume(AudioManager.STREAM_NOTIFICATION, originalVolume, 0);
Does anyone have any suggestion on how to set the volume corresponding to the AudioAttributes output? Keep in mind that the audio is actually played in a BroadcastReceiver that's used for the actual notification, and the audio setting would be specified in just some settings Activity.
Well, it appears that I missed a critical table in the API documentation:
https://source.android.com/devices/audio/attributes.html
It seems that STREAM_SYSTEM is the equivalent of what I was attempting to do with AudioAttributes. Basically, the code above is sufficient for API 21 and later, and using STREAM_SYSTEM covers both the AudioManager volume control and the APIs prior to 21.
When developing an app that uses Bluetooth Low Energy, there comes a time when the iOS device loses connection to the peripheral. (Sometimes for hours.)
In order to reconnect to an existing peripheral, the app must keep scanning at some rate throughout the day(s), even when it is in the background.
The problem is, iOS will not guarantee that your app will not get killed, due to memory constraints, etc.
Information found in the iPhone OS Programming guide states that:
Apps that work with Bluetooth peripherals can ask to be woken up if
the peripheral delivers an update when the app is suspended. This
support is important for Bluetooth-le accessories that deliver data at
regular intervals, such as a Bluetooth heart rate belt. When an app
includes the UIBackgroundModes key with the bluetooth-central value in
its Info.plist file, the Core Bluetooth framework keeps open any
active sessions for the corresponding peripheral. In addition, new
data arriving from the peripheral causes the system to wake up the app
so that it can process the data. The system also wakes up the app to
process accessory connection and disconnection notifications.
The problem does not arise when the phone is connected to a device and the application is backgrounded. It does happen, however, when the device is disconnected and the app is backgrounded. In this specific case, the phone is no longer connected to the peripheral, and therefore no longer getting notifications.
Many people have discussed this before, either on Stack Overflow or the Apple forums, and I believe one of the Apple developers has responded saying:
We're aware of this issue and are trying to come up with a solution.
Currently, there is no workaround.
My question is, is there a way to at least improve your chances of not getting killed by iOS due to memory constraints?
For example, an instant messaging app (IMO) seems to run quite nicely in the background. After days and days of not being used, the app will wake up and display a gChat message.
I’m wondering about things such as:
Strong pointers
Overall memory size
Reducing memory size when app is backgrounded or minimized
Reducing frequency of background operation
Etc.
Why do you need background execution even when the bluetooth hardware is disconnected?
I don't think that you need to "rescan continuously" to reconnect: if the hardware is "paired" with the iPhone/iPad, it will reconnect by itself, like a Bluetooth headset. Or not?
AFAIK you have no way to accomplish what you are asking for.
A normal app is always suspended when the user goes back to the home screen. The app has approximately 5 seconds of background time to stop timers, save state, and so on.
There are special background modes that allow you to have more background time, and each of these modes (explained in the page you linked) has a different behavior.
About the bluetooth mode:
The described behavior is not an issue; it is by design:
the app is suspended
when the app is suspended, it can be killed by the OS to free RAM (and there are no tricks to avoid this), but the system will wake it up or relaunch it if needed.
the app is then woken every time a notification is received (resumed from the suspended state, or launched again from the "previously killed" state)
the app has 10 seconds to do its work (save information, etc.). Moreover, it can request about 10 more minutes of background time for a particular task
after the 10 seconds (or 10 minutes) the app is suspended again
The example you gave about the chat app is incorrect: chat apps usually don't use any background mode; they simply deliver messages to you using push notifications. When you open the app, it connects to a server that stores all your messages and downloads them.
You can get "more uptime" by using the location background mode (routing apps can work in the background), or by combining significant location changes (which wake the app) with the 10 minutes of background time, but I think Apple will reject an app that abuses this.
In short, you have to design your app to support this behavior.
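One concrete piece of that design, if the goal is to survive being killed while a connection is pending, is Core Bluetooth state preservation and restoration (it requires the bluetooth-central background mode and does not help if the user force-quits the app). A hedged Swift sketch; the restore identifier and class name are arbitrary placeholders:

import CoreBluetooth

final class BLEManager: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        // Opting in to state restoration: if the system kills the app while a
        // scan or connection attempt is pending, it can relaunch the app and
        // hand the session back via centralManager(_:willRestoreState:).
        central = CBCentralManager(delegate: self, queue: nil,
                                   options: [CBCentralManagerOptionRestoreIdentifierKey: "my.central.restore-id"])
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Required delegate method; reissue scans or connects here once .poweredOn.
    }

    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        // Peripherals the system was tracking on our behalf when the app was killed.
        if let peripherals = dict[CBCentralManagerRestoredStatePeripheralsKey] as? [CBPeripheral] {
            peripherals.forEach { central.connect($0, options: nil) }
        }
    }
}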
I found this in some more Apple documentation:
Memory Usage for Background Apps
Every app should free up as much memory as is practical upon entering
the background. The system tries to keep as many apps in memory at the
same time as it can, but when memory runs low it terminates suspended
apps to reclaim that memory. Apps that consume large amounts of memory
while in the background are the first apps to be terminated.
Practically speaking, your app should remove strong references to
objects as soon as they are no longer needed. Removing strong
references gives the compiler the ability to release the objects right
away so that the corresponding memory can be reclaimed. However, if
you want to cache some objects to improve performance, you can wait
until the app transitions to the background before removing references
to them.
Some examples of objects that you should remove strong references to
as soon as possible include:
Image objects
Large media or data files that you can load again from disk
Any other objects that your app does not need and can recreate easily later
To help reduce your app’s memory footprint, the system automatically purges some data allocated on behalf of your app when your app moves to the background:
The system purges the backing store for all Core Animation layers. This effort does not remove your app’s layer objects from memory, nor does it change the current layer properties. It simply prevents the contents of those layers from appearing onscreen, which given that the app is in the background should not happen anyway.
It removes any system references to cached images. (If your app does not have a strong reference to the images, they are subsequently removed from memory.)
It removes strong references to some other system-managed data caches.
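Acting on that guidance mostly means dropping anything that is cheap to recreate as soon as the app moves to the background. A small sketch; the cache and buffer here are hypothetical properties, not anything from the question:

import UIKit

final class AppDelegate: UIResponder, UIApplicationDelegate {
    // Hypothetical recreatable resources.
    let imageCache = NSCache<NSString, UIImage>()
    var preloadedMedia: Data?

    func applicationDidEnterBackground(_ application: UIApplication) {
        // Shrink the suspended footprint so the app is a less attractive
        // target when the system needs to reclaim memory.
        imageCache.removeAllObjects()
        preloadedMedia = nil
    }
}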
From the Apple documentation I have always assumed the following:
connectPeripheral:options:
Establish a connection to the peripheral.
- (void)connectPeripheral:(CBPeripheral *)peripheral options:(NSDictionary *)options;
Parameters
peripheral
The peripheral to connect to.
options
A dictionary to customize the behavior of the connection. See CBConnectPeripheralOptionNotifyOnDisconnectionKey.
Discussion
**This never times out**. Use cancelPeripheralConnection: to cancel a pending connection.
The important part is that this never times out. I would assume this hands off to the system so that it will connect automatically to the peripheral when it comes into range, thus removing the need for full backgrounding. Someone correct me if I'm wrong, though!
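To sketch how that is typically used for reconnection (the class name is just a stand-in): keep the CBPeripheral reference and reissue the connect as soon as you are told about the disconnect; the pending connection then completes whenever the device comes back into range, with no rescanning required.

import CoreBluetooth

final class ReconnectingCentral: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Required delegate method; start the initial scan/connect once .poweredOn.
    }

    func centralManager(_ central: CBCentralManager,
                        didDisconnectPeripheral peripheral: CBPeripheral,
                        error: Error?) {
        // connect(_:options:) never times out, so this queues a pending
        // connection that completes whenever the peripheral reappears.
        central.connect(peripheral, options: nil)
    }
}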