How to accurately measure a time difference in Xcode (Objective-C)

I need to measure a time difference very accurately in Xcode. I'm fairly new to Xcode and Objective-C, and I'm not sure how to start.
I need to measure the time difference between when a sound is played through the speaker and when the sound is recorded through the mic.
I'm not sure how to measure time this accurately, since the interval will be very short.
Any pointers or help would be much appreciated!
Thanks

The most precise clock is probably mach_absolute_time(), which returns ticks since the device was last booted. See Apple's Technical Q&A QA1398 for an example of how to use it and convert the result into a nicer unit.
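For reference, a minimal sketch of the usual pattern: timestamp both events with mach_absolute_time() and convert the tick difference via mach_timebase_info(). The hard part, not shown here, is choosing where to take the timestamps; audio buffering adds latency, so for best accuracy you would timestamp inside the audio I/O callbacks rather than around high-level play/record calls.

    #import <Foundation/Foundation.h>
    #import <mach/mach_time.h>

    // Converts a mach tick count to seconds using the hardware timebase.
    static double MachTicksToSeconds(uint64_t ticks) {
        static mach_timebase_info_data_t timebase;
        if (timebase.denom == 0) {
            mach_timebase_info(&timebase); // fills in the ticks -> nanoseconds ratio
        }
        return (double)ticks * timebase.numer / timebase.denom / 1e9;
    }

    int main(void) {
        uint64_t start = mach_absolute_time();
        // ... play the sound here, then block until the mic picks it up ...
        uint64_t end = mach_absolute_time();
        NSLog(@"Elapsed: %.9f s", MachTicksToSeconds(end - start));
        return 0;
    }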

Related

What is the best way to play back an animation exactly N times in Blender?

As in the title. It can be anything, a Python script or whatever. I also need to measure the playback time and/or the average frame rate. Thanks in advance

React Native, iBeacons, RSSI value to distance conversion

How to stabilize the RSSI (Received Signal Strength Indicator) of low energy Bluetooth beacons (BLE) for more accurate distance calculation?
We are trying to develop an indoor navigation system and came across the problem that the RSSI fluctuates so much that the distance estimate is nowhere near the correct value. We tried using an advanced averaging calculator, but to no avail.
The device constantly receives RSSI values. How do we filter them and get a mean value? I am completely lost; please help.
Can anyone suggest an npm library or point me in the right direction? I have been searching for many days but have not gotten anywhere.
Front end: React Native. Back end: Node.js.
In addition to @davidgyoung's answer, we would like to point out that any filtering method is a compromise between the quality of noise reduction and the time lag introduced by the filtering (depending on the characteristic filtering time your method uses). As @davidgyoung noted, if you take a characteristic filtering period T, you will get an average time lag of about T/2.
Thus, I think the best approach to your problem is not to search for the best filtering method but to make changes on the transmitter's end itself.
First, you can increase the number of signals the transmitter sends per second (most modern beacons allow this through the manufacturer's applications and APIs).
Second, you can increase the beacon's power (also usually one of the beacon's settings), which generally improves the signal-to-noise ratio.
Finally, you can compare beacons from different vendors. At Navigine we have experimented with and tested lots of beacons from multiple manufacturers, and the signal-to-noise ratio can vary significantly between them. For our part, we recommend taking a look at kontakt.io beacons (https://kontakt.io/), one of the recognized leaders with 5+ years of experience in the area.
It is unlikely that you will find a pre-built package that does exactly what you want, as your needs are quite specific. You will most likely have to write your own filtering code.
A key challenge is deciding the parameters of your filtering, as indoor nav use cases are often hurt by time lag. If you average RSSI over 30 seconds, for example, the output of your filter effectively gives you the RSSI of where a moving object was, on average, 15 seconds ago. That may be unacceptable if you are dealing with moving objects. Reducing the averaging interval to 5 seconds might help, but it still introduces some lag while smoothing less of the noise. A filter called an auto-regressive moving average (ARMA) filter might be a good choice, but I only have an implementation in Java, so you would need to translate it to JavaScript; a simplified sketch follows.
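To make the smoothing-versus-lag trade-off concrete, here is a minimal exponential-moving-average sketch (a first-order low-pass, simpler than the ARMA filter mentioned above). It is written in Objective-C to match the rest of this page, but the logic translates to JavaScript line for line; the class name and alpha value are illustrative, not from any library.

    #import <Foundation/Foundation.h>
    #include <math.h>

    // Exponential moving average: smoothed = alpha*sample + (1-alpha)*smoothed.
    // Smaller alpha means smoother output but a longer time lag.
    @interface RSSIFilter : NSObject
    - (instancetype)initWithAlpha:(double)alpha;
    - (double)addSample:(double)rssi;
    @end

    @implementation RSSIFilter {
        double _alpha;
        double _smoothed;
    }

    - (instancetype)initWithAlpha:(double)alpha {
        if ((self = [super init])) {
            _alpha = alpha;
            _smoothed = NAN; // no sample seen yet
        }
        return self;
    }

    - (double)addSample:(double)rssi {
        if (isnan(_smoothed)) {
            _smoothed = rssi; // seed the filter with the first reading
        } else {
            _smoothed = _alpha * rssi + (1.0 - _alpha) * _smoothed;
        }
        return _smoothed;
    }

    @end

With alpha = 1 the filter passes raw readings through unchanged; lowering alpha smooths more but lags more, which is exactly the T/2 effect described in the previous answer.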
Finally, do not expect a filter to solve all your problems. Even if you smooth out the noise in the RSSI, you may find that the distance estimates are still not accurate enough for your use case. Make sure you understand the limits of what is possible with this technology. I wrote a deep dive on this topic here.

OxyPlot: IsValidPoint on a real-time LineSeries

I've been using OxyPlot for a month now and I'm pretty happy with what it delivers. I'm getting data from an oscilloscope and, after some fast processing, plotting it in real time on a graph.
However, if I compare my application's CPU usage to that of the application provided by the oscilloscope manufacturer, mine loads the CPU a lot more. Maybe they're using some GPU-based plotter, but I think I can reduce my CPU usage with some modifications.
I'm capturing 10,000 samples per second and adding them to a LineSeries. I'm not plotting all of that data; I decimate it down to a constant number of points, say 80 points for a 20-second measurement, so I have 4 points/sec when fully zoomed out and a bit more detail when I zoom in to a specific range.
With the aid of ReSharper, I've noticed that the application calls the IsValidPoint method a huge number of times (something like 400,000,000 calls across my 6 plots), which takes a lot of time.
I think the problem is that, when I add new points to a series, it re-checks the validity of every point instead of only the newly added values.
Also, it spends a lot of time in the MeasureText/DrawText method.
My question is: is there a way to override those methods and adapt them to my needs? I'm adding 10,000 new values every second, but the earlier ones remain the same, so there's no need to re-validate them. Also, the text shown doesn't change.
Thank you in advance for any advice you can give me. Have a good day!

Frame rate drops over time

I am using cocos2d. My game works great, but after a while the frame rate drops lower and lower...
I have checked with Instruments and there are no leaks or unexpected allocations.
I am not allocating anything during gameplay, and I remove unused frames from the cache during the game.
The only way the frame rate comes back to normal is if I exit the scene and return to it.
I just can't work out what the cause is! My app is done and I can't publish it like this.
Any help? How can I find the cause?
Thanks
What exactly are you doing in your game? There are many optimization tips, such as using CCSpriteBatchNode when you have lots of sprites that share the same texture, etc.
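For illustration, a minimal batch-node sketch (this would live in a CCLayer's init method; the texture file name, sprite rect, and counts are placeholders):

    // One texture, one batch node: all children are drawn in a single
    // draw call instead of one call per sprite.
    CCSpriteBatchNode *batch = [CCSpriteBatchNode batchNodeWithFile:@"sprites.png"];
    [self addChild:batch];

    for (int i = 0; i < 100; i++) {
        // Each child must use (a region of) the batch node's texture.
        CCSprite *sprite = [CCSprite spriteWithTexture:batch.texture
                                                  rect:CGRectMake(0, 0, 32, 32)];
        sprite.position = ccp(arc4random_uniform(480), arc4random_uniform(320));
        [batch addChild:sprite];
    }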
If you aren't allocating anything, as you say (which I find unlikely), then perhaps you're doing some heavy (and unnecessary) game-logic calculations every frame; for instance, having hundreds of sprites and calculating the distance between every pair of them each frame...
Also, what device are you using?

Objective-C - detect sound

How can I detect sound on the iPhone?
I currently have it somewhat working using the method described in this article: http://mobileorchard.com/tutorial-detecting-when-a-user-blows-into-the-mic/
However, as mentioned in that article, in noisy rooms for example, the method won't work.
So, in a way, I'm looking for methods to separate background noise from the sound I actually want to detect.
There probably won't be existing code for this on the internet, but if someone could point me in the right direction, it would be much appreciated.
Thank you,
Tee
It's definitely not a great solution, but you could determine a baseline amplitude (the ambient sound level) and trigger your events when the measured amplitude stays a certain amount above that baseline for a set amount of time; see the sketch below.
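A minimal sketch of that idea, assuming the same AVAudioRecorder-based level metering used in the linked tutorial; the threshold, adaptation rate, and format settings here are illustrative, not prescribed values.

    #import <AVFoundation/AVFoundation.h>

    // Setup: record to /dev/null purely to get level metering; the audio
    // itself is thrown away (same trick as the Mobile Orchard tutorial).
    NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
    NSDictionary *settings = @{ AVFormatIDKey: @(kAudioFormatAppleLossless),
                                AVSampleRateKey: @44100.0,
                                AVNumberOfChannelsKey: @1 };
    AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:url
                                                            settings:settings
                                                               error:NULL];
    recorder.meteringEnabled = YES;
    [recorder record];

    // In a repeating timer callback: keep a slow-moving baseline of the
    // ambient level and fire only when the current level clears it by a margin.
    static double baseline = -60.0;   // dB; seeded pessimistically
    const double kAlpha = 0.05;       // how quickly the baseline adapts
    const double kMarginDb = 15.0;    // how far above ambient counts as "sound"

    [recorder updateMeters];
    double level = [recorder averagePowerForChannel:0]; // dB, roughly -160..0
    baseline = kAlpha * level + (1.0 - kAlpha) * baseline;
    if (level > baseline + kMarginDb) {
        NSLog(@"Sound detected (%.1f dB vs baseline %.1f dB)", level, baseline);
    }

Because the baseline keeps adapting, a persistently noisy room raises the trigger threshold with it, which is roughly the background/foreground separation you are after.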