Detecting heartbeat peak power using the iPhone SDK? - objective-c

I want to detect heart rate using the iPhone SDK. Does anyone know of a method for calculating the heartbeat rate?

The Fast Fourier Transform (FFT) is a class of algorithms that can quickly turn a block of samples into an analysis of how prominently certain frequencies occur in it. For more, check out:
Wikipedia: FFT
Literate program example: Cooley-Tukey FFT
This is relevant to your problem because: (1) heart rate is itself a frequency, and (2) most of the sound that comes through the body that you can measure will be within a certain frequency range. Dropping frequencies outside this range means dropping content that is mostly or entirely noise.
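To make that concrete, here is a rough, illustrative sketch (my own, not from any iPhone API) of point (2): only frequencies inside a plausible heart-rate band are evaluated, and the strongest one is reported as beats per minute. The sample buffer and sample rate are assumed inputs from whichever sensor you end up using.

    /* Illustrative sketch: evaluate a single-bin DFT only over the
     * physiological heart-rate band (~42-210 BPM) and report the strongest
     * frequency as beats per minute. `samples` is assumed to hold sensor
     * values (microphone/accelerometer) captured at `sampleRate` Hz. */
    #include <math.h>

    static double dominant_bpm(const float *samples, int n, double sampleRate)
    {
        double bestPower = 0.0, bestFreq = 0.0;

        /* Scan candidate frequencies in 1 BPM steps across the band. */
        for (double bpm = 42.0; bpm <= 210.0; bpm += 1.0) {
            double f = bpm / 60.0;               /* frequency in Hz */
            double re = 0.0, im = 0.0;

            for (int i = 0; i < n; i++) {        /* single-bin DFT */
                double phase = 2.0 * M_PI * f * i / sampleRate;
                re += samples[i] * cos(phase);
                im -= samples[i] * sin(phase);
            }

            double power = re * re + im * im;
            if (power > bestPower) {
                bestPower = power;
                bestFreq  = f;
            }
        }
        return bestFreq * 60.0;                  /* Hz back to BPM */
    }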
Good luck!

Well, I've seen various implementations. Some of them use the accelerometer to detect the minute movements of your arm/hand as you hold the phone, some of them use the microphone, and you could also do a manual 'tap' interface where you tap the screen while checking your own pulse.
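Of those, the manual 'tap' interface is trivial to prototype; purely as an illustration (the timestamps below are made up), the arithmetic is just averaging the intervals between taps:

    /* Sketch of the manual 'tap' approach: given the timestamps (in seconds)
     * at which the user tapped the screen in time with their pulse, estimate
     * beats per minute from the average interval between taps. */
    #include <stdio.h>

    static double bpm_from_taps(const double *tapTimes, int count)
    {
        if (count < 2)
            return 0.0;                          /* need at least two taps */

        double totalSpan    = tapTimes[count - 1] - tapTimes[0];
        double meanInterval = totalSpan / (count - 1);
        return 60.0 / meanInterval;              /* seconds per beat -> BPM */
    }

    int main(void)
    {
        /* Made-up tap timestamps roughly 0.8 s apart (~75 BPM). */
        double taps[] = { 0.0, 0.82, 1.61, 2.44, 3.23, 4.05 };
        printf("Estimated heart rate: %.1f BPM\n", bpm_from_taps(taps, 6));
        return 0;
    }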

Related

How do you measure the RSSIs of different parts of the spectrum (like FM, DVB-T and so on) using LabVIEW?

I am doing a project on indoor localisation using fingerprinting. Is it possible to build a system in LabVIEW that can scan the entire spectrum and provide me with the RSSI measurements of different types of signals (say FM, GSM, DVB-T and so on)? In case it has to be done separately, can someone please point me to some resources that would help me find the RSSIs of, say, FM signals? I am new to SDRs and would really appreciate some help. I have used this paper as a reference:
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7444902
There can be no general method. RSSI is inherently signal-specific, and hence, for each signal type, you will need a different estimator. An estimator that estimates the received signal strength of FM broadcasts will only see noise in a DVB-T signal (incredibly high noise power with no stable signal at all), whereas a DVB-T RSSI indicator would only see a slowly moving interferer in an FM signal.
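To illustrate what 'signal-specific' means, here is a deliberately naive sketch (my own illustration, not a LabVIEW recipe or a real estimator) in which the only thing distinguishing an 'FM RSSI' from a 'DVB-T RSSI' is the bandwidth you integrate over; real estimators differ far more than that (pilot tracking, guard intervals, de-emphasis, and so on):

    /* Naive band-power 'RSSI': integrate magnitude-squared FFT bins over a
     * fixed bandwidth around a carrier. An FM broadcast estimator might use
     * ~200 kHz, a DVB-T estimator ~8 MHz; pointing either at the other signal
     * type measures mostly the wrong thing. All parameters are placeholders. */
    #include <math.h>

    /* spectrum: magnitude-squared FFT bins; binHz: frequency per bin */
    static double band_power_db(const double *spectrum, int numBins,
                                double binHz, double centerHz, double bandwidthHz)
    {
        int first = (int)((centerHz - bandwidthHz / 2.0) / binHz);
        int last  = (int)((centerHz + bandwidthHz / 2.0) / binHz);
        if (first < 0) first = 0;
        if (last >= numBins) last = numBins - 1;

        double power = 0.0;
        for (int k = first; k <= last; k++)
            power += spectrum[k];                /* integrate power in the band */

        return 10.0 * log10(power + 1e-30);      /* relative dB, not calibrated dBm */
    }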
I see you're using the indoor-positioning-system tag. That is a very bad thing: in an indoor scenario, fading is extremely important, and your received signal strength allows absolutely no conclusions about the distance to the transmitter. This is pretty much the definition of the indoor channel, where lots and lots of reflections overlap and interfere, and there's not always a direct line of sight between transmitter and receiver.
I'm afraid you still have quite some theory to read up on.

Using NAudio, How do I get the amplitude and rhythm of an MP3 file?

The wife asked for a device to make the Xmas lights 'rock' along with the music. I am going to use an Arduino microcontroller to control relays hooked up to the lights, sending down 6 signals from a C# WinForms app to turn them off and on. I want to use NAudio to separate out the amplitude and rhythm to drive the six signals: a specific range of hertz for each, like an equalizer with six bars, plus the timing from the rhythm. I have seen the WPF demo, and the waveform seems like the answer. I want to know how to get those values in real time while the song is playing.
I'm thinking ...
1. Create a simple mp3 player and load all my songs.
2. Start the songs playing.
3. Sample the current dynamics of the song and turn that into integers I can send to the appropriate channels on the Arduino microcontroller via USB.
I'm not sure how to capture the current sound information in real time and turn it into integer values for that moment. I can read the e.MaxSampleValues[0] values in real time while the song is playing, but I want to be able to distinguish which frequency range is active at that moment.
Any help or direction would be appreciated for this interesting project.
Thank you
Sounds like a fun signal processing project.
Using the NAudio.Wave.WasapiLoopbackCapture object you can get the audio data being produced from the sound card on the local computer. This lets you skip the 'create an MP3 player' step, although at the cost of a slight delay between sound and lights. To get better synchronization you can do the MP3 decoding and pre-calculate the beat patterns and output states during playback. This will let you adjust the delay between sending the outputs and playing the audio block those outputs were generated from, getting near perfect synchronization between lights and music.
Once you have the samples, the next step is to use an FFT to find the frequency components. Fortunately NAudio includes a class to help with this: NAudio.Dsp.FastFourierTransform. (Thank you Mark!) Take the output of the FFT() function and sum the frequency ranges you want for each controlled light.
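As a language-agnostic illustration of that summing step (sketched here in C rather than the project's C#, with made-up band edges you would tune by ear), it amounts to bucketing FFT magnitudes into six ranges, one per light channel:

    /* Sum FFT magnitudes into six bands, one per light channel. The band
     * edges are placeholders; in the real project this runs in C# on the
     * output of NAudio's FFT. */
    #define NUM_CHANNELS 6

    /* Hypothetical edges in Hz: sub-bass, bass, low-mid, mid, high-mid, treble */
    static const double bandEdges[NUM_CHANNELS + 1] = {
        20.0, 60.0, 250.0, 500.0, 2000.0, 6000.0, 20000.0
    };

    /* magnitudes: FFT magnitude per bin; binHz: sample rate / FFT size */
    static void sum_bands(const float *magnitudes, int numBins, double binHz,
                          double bandEnergy[NUM_CHANNELS])
    {
        for (int b = 0; b < NUM_CHANNELS; b++)
            bandEnergy[b] = 0.0;

        for (int k = 0; k < numBins; k++) {
            double freq = k * binHz;
            for (int b = 0; b < NUM_CHANNELS; b++) {
                if (freq >= bandEdges[b] && freq < bandEdges[b + 1]) {
                    bandEnergy[b] += magnitudes[k];
                    break;
                }
            }
        }
    }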
The next step is Beat Detection. There's an interesting article on this here. The main difference is that instead of doing energy detection on a stream of sample blocks you'll be using the data from your spectral analysis stage to feed the beat detection algorithm. Those ranges you summed become inputs into individual beat detection processors, giving you one output for each frequency range you defined. You might want to add individual scaling/threshold factors for each frequency group, with some sort of on-screen controls to adjust these for best effect.
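A minimal sketch of one such per-band detector (history length and threshold are placeholders for those on-screen controls) compares the band's instantaneous energy against a running average of its recent history:

    /* One beat detector per frequency band. Zero-initialise the struct, set
     * a threshold (e.g. 1.3-1.5), then feed it one band energy per block. */
    #define HISTORY_LEN 43   /* roughly 1 s of history at ~43 blocks/s */

    typedef struct {
        double history[HISTORY_LEN];
        int    index;
        double threshold;    /* tuned per band for best visual effect */
    } BeatDetector;

    /* Returns 1 if this block's energy counts as a beat for this band. */
    static int detect_beat(BeatDetector *d, double instantEnergy)
    {
        double average = 0.0;
        for (int i = 0; i < HISTORY_LEN; i++)
            average += d->history[i];
        average /= HISTORY_LEN;

        /* Record this block so the detector keeps adapting to the song. */
        d->history[d->index] = instantEnergy;
        d->index = (d->index + 1) % HISTORY_LEN;

        return average > 0.0 && instantEnergy > d->threshold * average;
    }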
At the end of the process you will have a stream of sample blocks, each with a set of output flags. Push the flags out to your Arduino and queue the samples to play, with a delay on either of those operations to achieve your synchronization.

Plot a graph of Time vs RSSI for a 433Mhz RF ASK Receiver

Hi, I'm using the following RF module:
http://www.apogeekits.com/rf_receiver_module_rx433.htm
on an embedded board with the PIC16F628A. Sadly, I realized that the signal strength output is analog, and I couldn't come up with any way to get the RSSI reading off the pin because, well, my PIC is digital. DUH!
My basic idea was:
To get the RSSI value from my Receiver
Send it to the PIC
Link the PIC to a PC via RS232
Plot a graph of time vs RSSI of the receiver (so I can make out how close my TX is to my RX)
I thought it was bloody brilliant at first, but I've hit a dead end here. Any ideas on getting the RSSI data to my PC from this receiver would be nice.
Thanks in advance
You can get a PIC that has an integrated ADC for sampling the analog signal. Or, you can use an external ADC chip to do the conversion. You would connect that to your PIC using SPI or I2C.
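As a rough sketch of that route (the helpers below are placeholders for whatever your compiler or HAL provides, not real library calls), the firmware side is just: read the ADC, frame the value, and push it out the UART so the PC can plot time vs. RSSI:

    /* Sketch for a PIC that has an ADC (or an external ADC you have already
     * wired up). adc_read(), uart_send_byte() and delay_ms() are hypothetical
     * stand-ins for your toolchain's equivalents. */
    #include <stdint.h>

    extern uint16_t adc_read(uint8_t channel);      /* hypothetical ADC helper  */
    extern void     uart_send_byte(uint8_t byte);   /* hypothetical UART helper */
    extern void     delay_ms(uint16_t ms);          /* hypothetical delay       */

    void stream_rssi(void)
    {
        for (;;) {
            uint16_t rssi = adc_read(0);            /* sample the RSSI pin */

            /* Sync byte plus the reading split across two bytes, so the PC
             * side can reassemble it and plot it against time. */
            uart_send_byte(0xAA);
            uart_send_byte((uint8_t)(rssi >> 8));
            uart_send_byte((uint8_t)(rssi & 0xFF));

            delay_ms(10);                           /* ~100 samples per second */
        }
    }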
The simplest thing to do is obviously to use a more appropriate microcontroller - one with an ADC! There are many (most have one), including PICs (though that wouldn't be my first choice).
Attaching an external SPI or I2C ADC might be a bit tedious: having no SPI or I2C peripheral on your part, you'd have to bit-bang it. If you do that, use an SPI part - it's simpler. Your sample rate will suffer and may end up being a bit jittery if you are not careful.
Another solution is to use a voltage-controlled PWM, then use the timer input capture to time the pulse width. That will give you good regularity and potentially good resolution. You can get a chip (example) to do that, or roll your own. That last option requires a triangle-wave input as well as the measured (control) voltage, but there are parts for that on the same site...
In a similar vein, you could use a low-frequency VCO (example) and use its output to clock one of the timers, then use a second timer to periodically sample the first and reset it. The count will relate to the voltage, though not necessarily linearly; linearisation could be done on the PIC or at the receiving PC - I'd go for the latter, since your micro will suck at arithmetic (performance-wise), even integer arithmetic, especially if it involves division.
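For the pulse-width variant, a hedged sketch of the measurement step is below; the capture helpers are hypothetical stand-ins for the CCP/timer setup on your particular part, and the resulting count would be streamed to the PC over RS232 just like the ADC reading in the other answer, with linearisation done on the PC:

    /* Measure how long the voltage-controlled PWM output stays high, in timer
     * ticks. The duty cycle (and hence the RSSI voltage) is recovered and
     * linearised on the PC side. */
    #include <stdint.h>

    extern void     capture_wait_rising(void);   /* hypothetical: block until rising edge  */
    extern void     capture_wait_falling(void);  /* hypothetical: block until falling edge */
    extern uint16_t capture_timestamp(void);     /* hypothetical: timer count at last edge */

    uint16_t measure_pulse_width(void)
    {
        capture_wait_rising();
        uint16_t start = capture_timestamp();

        capture_wait_falling();
        return (uint16_t)(capture_timestamp() - start);
    }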

How to detect heart pulse rate without using any instrument in the iOS SDK?

I am currently working on an application where I need to find the user's heart rate. I have found plenty of applications that do this, but I have not been able to find a single private or public API that supports it.
Is there any framework available that could help with this? I was also wondering whether the UIAccelerometer class could be useful here, and what level of accuracy it could provide.
How could this feature be implemented: by putting a finger on the iPhone camera, by placing the microphone against the jaw or wrist, or some other way?
Is there any way to detect changes in blood circulation and find the heart beat using those, or using UIAccelerometer? Any API or sample code? Thank you.
There is no API for detecting heart rates; these apps do so in a variety of ways.
Some use the accelerometer to measure how the device shakes with each pulse. Others use the camera lens, with the flash on, and detect blood moving through the finger from the changes in the light levels that can be seen.
Various DSP techniques can be used to possibly discern very low-level periodic signals out of a long enough set of samples taken at an appropriate sample rate (from the accelerometer or the reflected light colour).
Some of the advanced math functions in the Accelerate framework API can be used as building blocks for these various DSP techniques. An explanation would require several chapters of a Digital Signal Processing textbook, so that might be a good place to start.
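As a hedged starting point (an illustration only, not Apple sample code), the Accelerate/vDSP route boils down to: collect a window of slowly varying samples (per-frame average red level from the camera, or accelerometer magnitude), run a real FFT, and pick the strongest bin inside a plausible heart-rate band. The buffer contents and sample rate are assumptions; n must be a power of two:

    /* Find the dominant frequency in a buffer of sensor samples using
     * Accelerate's vDSP real FFT, and report it as beats per minute.
     * Compile with -framework Accelerate. */
    #include <Accelerate/Accelerate.h>
    #include <math.h>

    static double heart_rate_bpm(float *samples, int n, double sampleRate)
    {
        vDSP_Length log2n = (vDSP_Length)log2((double)n);
        FFTSetup setup = vDSP_create_fftsetup(log2n, kFFTRadix2);

        /* Pack the real samples into the split-complex layout vDSP expects. */
        float real[n / 2], imag[n / 2];
        DSPSplitComplex split = { .realp = real, .imagp = imag };
        vDSP_ctoz((DSPComplex *)samples, 2, &split, 1, n / 2);

        vDSP_fft_zrip(setup, &split, 1, log2n, FFT_FORWARD);

        /* Magnitude squared of each bin. */
        float mags[n / 2];
        vDSP_zvmags(&split, 1, mags, 1, n / 2);

        /* Only look at bins inside a plausible heart-rate band (40-220 BPM). */
        double binHz = sampleRate / n;
        int lo = (int)ceil((40.0 / 60.0) / binHz);
        int hi = (int)floor((220.0 / 60.0) / binHz);
        if (lo < 1) lo = 1;                   /* skip DC (bin 0 also packs Nyquist) */
        if (hi > n / 2 - 1) hi = n / 2 - 1;

        int best = lo;
        for (int k = lo + 1; k <= hi; k++)
            if (mags[k] > mags[best]) best = k;

        vDSP_destroy_fftsetup(setup);
        return best * binHz * 60.0;           /* dominant frequency as BPM */
    }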

Detecting heartbeat signals with a "Digital heart beat rate sensor (IC)" - iOS

I just bought a digital heart beat rate sensor:
http://www.dealextreme.com/p/digital-heart-beat-rate-sensor-3-5mm-data-port-16009
and I'm looking into how I can make an iOS application work with it.
The sensor has a 3.5mm jack, and I can detect its signal with the audio framework on iOS.
Can you give me some guidelines on how to start turning these signals into heart beat rates?
That sensor looks rather like one I have here in my junk box. If so, it generates a voltage signal which depends on the pressure exerted on it by the skin against which it is pressed. If there is a strong pulse at the point of pressure, I see a signal on an oscilloscope which has a component at the pulse rate: so it is at a frequency of around 1-2Hz.
This is WAY below the audio range, and in most audio interfaces would be filtered out before it ever got to the audio in ADC. I don't have a handy iPhone to check this on, but it would be bad design if the audio input did let such frequencies through. And Mr Jobs (R.I.P.) did not approve of bad design!
There is also a lot of interference at other frequencies: mains hum (50Hz here), and at lower frequencies spurious signals from muscle twitches.
To make this work, you would need some sort of signal conditioning. If it were up to me, I would use a high-input-impedance amplifier with about a 0.1Hz - 10Hz passband, followed by a voltage-to-frequency converter. That would give me a tone, which I could set in the audio band, whose frequency varied up and down as the pressure on the sensor changed. That would let me use fairly simple frequency-detection software to recover the pressure waveform, which could then be processed using autocorrelation or similar techniques to recover the heartbeat frequency. A DTMF decoder is not the right tool, though.
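As an illustration of that autocorrelation step (a sketch that assumes you have already recovered a roughly zero-mean pressure waveform at a known sample rate), the heart rate falls out of the lag with the strongest self-similarity:

    /* Estimate heart rate by autocorrelation: find the lag, within plausible
     * beat-interval limits, at which the recovered pressure waveform best
     * matches a shifted copy of itself, then convert that lag to BPM. */
    static double bpm_by_autocorrelation(const float *x, int n, double sampleRate)
    {
        int minLag = (int)(sampleRate * 60.0 / 220.0);   /* fastest ~220 BPM */
        int maxLag = (int)(sampleRate * 60.0 / 40.0);    /* slowest ~40 BPM  */
        if (minLag < 1) minLag = 1;
        if (maxLag > n - 1) maxLag = n - 1;

        int bestLag = minLag;
        double bestScore = -1.0;

        for (int lag = minLag; lag <= maxLag; lag++) {
            double score = 0.0;
            for (int i = 0; i + lag < n; i++)
                score += x[i] * x[i + lag];
            score /= (n - lag);                          /* normalise by overlap */
            if (score > bestScore) {
                bestScore = score;
                bestLag = lag;
            }
        }
        return 60.0 * sampleRate / bestLag;              /* lag in samples -> BPM */
    }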
I did find when I played about with the sensor that it was very touchy, responding to almost everything going, and it wouldn't be easy to pick out the heartbeat. Your sensor may be different, though.