I was working on understanding modulation and demodulation when I ran into a rather weird problem with my graphs. It seems the spectrum of my modulated signal is displayed against time while my signal is displayed against frequency. Is there something I did wrong in the setup? Thanks!
It looks like the Spectrum of Modulated Signal is in the frequency domain, but the X-axis is just the index of the non-redundant points of the FFT, not scaled to a normalized frequency.
Check the Spectral Measurements VI that is attached to Spectrum of Modulated Signal and adjust its settings to output a normalized frequency. You can also change the X-axis label to read Frequency instead of Time, but that part is cosmetic (and proper).
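For reference, the scaling in question is just the bin-index-to-frequency mapping; a minimal numpy sketch (outside LabVIEW, purely to illustrate what the properly configured VI should put on the axis; the sampling rate and record length here are assumed values):

```python
import numpy as np

fs = 10_000.0              # assumed sampling rate (Hz)
N = 1024                   # assumed record length
x = np.random.randn(N)     # stand-in for the modulated signal

X = np.fft.rfft(x)         # non-redundant half of the spectrum
k = np.arange(X.size)      # raw bin index: what the graph's X-axis shows now
f_norm = k / N             # normalized frequency (cycles/sample)
f_hz = k * fs / N          # or absolute frequency (Hz)
# Plotting abs(X) against f_norm (or f_hz) instead of k fixes the axis.
```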
I want to measure the spectrum occupancy of one of the GSM bands for 24 hours, using GNU Radio and a USRP.
Is there any way to save the GNU Radio waterfall plot to an image file or any other format?
If not, is there any other way to show the spectrum occupancy over a certain amount of time in one image or graph?
Is there any way to save the GNU Radio waterfall plot to an image file or any other format?
Middle-mouse-button -> Save.
If not, is there any other way to show the spectrum occupancy over a certain amount of time in one image or graph?
This is a typical case for "offline processing and visualization". I'd recommend you build a GNU Radio flow graph that takes the samples from the USRP, applies decimating band-pass filters (ideally matched to the GSM pulse shape), computes the power of the resulting sample streams (complex_to_mag_squared), and saves these power vectors.
You could then easily visualize them later with e.g. numpy/matplotlib, or whatever tool you prefer; a sketch follows.
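A minimal sketch of that offline step, assuming the power vector was written as raw float32 by a file sink; the file name, bin size, and threshold are placeholders you would have to adapt:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical file written by a GNU Radio file sink (float32 power samples).
power = np.fromfile("channel_power.f32", dtype=np.float32)

# Reduce the microsecond-scale data to coarse bins so 24 h fits in one plot:
# the fraction of samples per bin above an (assumed) power threshold.
samples_per_bin = 100_000   # assumption: depends on your decimated sample rate
threshold = 1e-6            # assumption: calibrate against your noise floor

n_bins = power.size // samples_per_bin
occupancy = (power[:n_bins * samples_per_bin]
             .reshape(n_bins, samples_per_bin) > threshold).mean(axis=1)

plt.plot(occupancy)
plt.xlabel("time bin")
plt.ylabel("fraction of time channel occupied")
plt.show()
```

Binning down to coarse occupancy fractions like this is what makes a 24-hour record fit in a single plot at all.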
The problem really is that GSM spectrum access happens on the order of microseconds, and you want to observe for 24 hours; no visualization in this world can both accurately represent what's happening and still be compact. You will need to come up with some intelligent measure built on top of the pure occupancy information.
I have a set of data which includes the position of a car and the signal level of an unknown emitter. I have to estimate the distance based on this. Basically, the signal level varies inversely with the square of the distance, but once we include effects like multipath and reflections, we need a different equation. This is where the Hata-Okumura model comes in: it gives the path loss as a function of distance. However, the distance is unknown, as I don't know where the emitter is. I only have access to different lat/long sets and the received signal level.
What I am asking is: could you please point me toward techniques which would help me estimate the distance based on the current position and signal strength? All I am asking for is guidance toward a technique that might be useful.
I have looked into How to calculate distance from Wifi router using Signal Strength?, but there the asker has 3 fixed WiFi signals and can use FSPL, which does not work in an urban environment.
Since the car is moving, using any diffraction model would be very difficult. The multipath environment is constantly changing due to the moving car, and any reflection/diffraction model requires well-known object geometry around the car. In your problem you have a known time series of the moving car's position, [x(t), y(t)]. You also have a time series of rough measurements of the distance between the car and the emitter, r(t), where the emitter's position is unknown. You need to solve for the stationary unknown emitter position (X, Y). So you have many noisy measurements with two unknown parameters to estimate. This is a classic least-squares estimation problem. You can formulate r(ti) = sqrt((x(ti) - X)^2 + (y(ti) - Y)^2), feed your data into this equation, and do a least-squares fit. The data is obviously noisy due to multipath, but the emitter is stationary, so over time the estimation process can more or less smooth the noise out.
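A minimal sketch of that fit with scipy's least_squares (the arrays below are synthetic stand-ins for the measured track and ranges; in practice you would first project lat/long into a local metric coordinate frame):

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic stand-ins for the measured data: car track and noisy ranges.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
x, y = 1000 * t, 300 * np.sin(3 * t)           # known car positions [x(t), y(t)]
X_true, Y_true = 400.0, 250.0                  # emitter position we pretend not to know
r = np.hypot(x - X_true, y - Y_true) + rng.normal(0, 20, t.size)  # noisy r(t)

def residuals(p):
    X, Y = p
    return np.hypot(x - X, y - Y) - r          # model range minus measured range

fit = least_squares(residuals, x0=[0.0, 0.0])  # initial guess for (X, Y)
print("estimated emitter position:", fit.x)
```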
My goal is to achieve something that was previously asked on another site (outside SO). On that site the question is unanswered, so to give it more visibility and to try to get an answer, I am translating it here:
The issue is:
I have a small simulation of particles flowing through a wire mesh structure, and I'm interested in calculating the mass flow rate and volume fraction of particles at certain cross sections. I think I understand how to calculate mass flow rate by setting up small regions and dumping particle count and velocity from that region. I assume that volume fraction works in a similar fashion, except I only need to know the size of my particles and my dump region.
What I'm wondering is this - is it possible to do these things in Paraview? I can set up planes and slices and such, but I can't seem to extract much useful information out of them.
Further on down the road, what I would like to do would be to plot contours of volume fraction at certain planes, and plot the volume fraction along the vertical axis so I can see how high the particles are piling up on top of the screen, based on particle size, wire size, etc. Can Paraview do any of this?
This is a visualization issue, and I don't know how to do it with Paraview. The idea is to count how many particles cross the slice.
My first approach was the pipeline DataReader | Spherical Glyph | Slice, with the normal fixed manually along the z axis, but it produced no results. I also tried adding the Surface Flow filter, and that did nothing either. Probably I am piping the data in a bad way.
To show the pipelining process I have added an image (focus on PlotOverLine1 and the pipes above it):
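For what it's worth, the counting described in the quoted question is simple to do outside Paraview if you can export the particle positions and velocities; a rough numpy sketch, where every name and value is a made-up placeholder:

```python
import numpy as np

# Hypothetical exported particle data: positions (m) and velocities (m/s).
pos = np.random.rand(5000, 3)    # stand-in for a real particle dump
vel = np.random.randn(5000, 3)
radius = 0.001                   # assumed common particle radius (m)
rho = 2500.0                     # assumed particle density (kg/m^3)

# Counting region: a thin slab normal to z, mimicking a slice at z0.
z0, dz = 0.5, 0.01
area = 1.0                       # assumed cross-section area of the domain (m^2)
in_slab = np.abs(pos[:, 2] - z0) < dz / 2

# Volume fraction: total particle volume inside the slab over slab volume.
v_particle = 4 / 3 * np.pi * radius**3
volume_fraction = in_slab.sum() * v_particle / (area * dz)

# Mass flow rate through the slice: sum of m * v_z / dz over slab particles.
mass_flow = rho * v_particle * vel[in_slab, 2].sum() / dz
print(volume_fraction, mass_flow)
```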
I'm trying to make an embedded thingy that detects the presence of a 19kHz tone from an electret microphone. I have a multistage bandpass filter/preamp hooked into the ADC of a microcontroller, and am trying to figure out the best way to digitally condition the signal in order to detect the presence of the tone.
I have implemented a Goertzel filter to look for the frequency of interest. My ADC takes 400 samples at a frequency of 4000 kHz, then the micro processes the block and adds the result to a 100-point moving average. Looking at the terminal output after each block, I can definitely see an overall jump in the numbers when the transmitter is turned on. However, there's a lot of noise in the power readings when the thing is turned on, and the noise floor in the room I'm in keeps changing, too. I am not sure how to tune the thresholding level or filter out all of this noise.
I've tried a few things, but they all seem to be pretty noisy as the baseline of my signal drifts all over the place:
Preprocessing the block with Hamming/Blackman windows
Ratio of total received block power to band power in filter output
Ratio of power in the band of interest (19 kHz) to a band outside of, but near, the band of interest (18.5 kHz)
EDIT: I've done some more reading since posting this. Is calculating 2*Ew / (N*Et), where Ew is the output from my filter and Et is the sum of the squares of the samples in my block, the best way to do this test?
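For concreteness, here is that normalized measure as I read it; a minimal Python sketch of the Goertzel recurrence plus the ratio (the block here is random noise, and the sample rate is taken from the post as written):

```python
import numpy as np

def goertzel_power(block, fs, f_target):
    """Power of the DFT bin nearest f_target via the Goertzel recurrence."""
    n = len(block)
    k = round(n * f_target / fs)                # nearest integer bin
    coeff = 2 * np.cos(2 * np.pi * k / n)
    s1 = s2 = 0.0
    for x in block:
        s0 = x + coeff * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - coeff * s1 * s2  # |X[k]|^2

fs = 4_000_000.0                 # "4000 kHz" as stated in the post
block = np.random.randn(400)     # stand-in for one 400-sample ADC block
Ew = goertzel_power(block, fs, 19_000.0)
Et = np.sum(block ** 2)          # total block energy
ratio = 2 * Ew / (len(block) * Et)
print(ratio)
```

If I have the normalization right, this ratio approaches 1 for a pure on-bin tone and stays near 0 for broadband noise, which gives a natural 0-to-1 scale to threshold against.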
Any advice on how to deal with this, or on a better method of signal extraction?
Thanks!
I am new to LabVIEW and I need help.
I am using a myRIO with a gyroscope, and when I display the gyroscope values I get noise.
My question is: how can I implement a lowpass filter to reduce the noise in the X, Y, and Z rates of the gyroscope?
I searched a lot, but I did not understand how to determine the sampling frequency or the low and high cutoff frequencies.
Thank you so much.
If your data is noisy, you should try to fix the problem before you digitize the data. If a physical low-pass filter will do the trick, install one. The better the signal before the DAQ, the better the data will be once it's digitized.
Some other signal conditioning considerations: reduce the length of wire from the gyroscope to the DAQ to only what's necessary; if possible, eliminate any sources of noise from the environment (like any large rotating magnets; seriously, I once helped someone who was complaining about noise while using an unshielded wire next to an MRI machine); and if you're going to add any signal conditioning, try to amplify close to your sensor.
If you would still like to filter in software, there's an example included with LabVIEW that demonstrates both the point-by-point VIs and the array-based VIs. It's called PtByPt and Array Based Filter.vi and can be found in the Example Finder under Analysis, Signal Processing and Mathematics >> Filtering and Conditioning.
Please install this FREE toolkit from ni.com: http://sine.ni.com/nips/cds/view/p/lang/en/nid/212733
It includes examples and good ready-to-use applications showing how to use the myRIO gyroscope and how to do proper DSP.
The sampling frequency is how fast you sample; look for this value in the ADC settings. For the low and high cutoffs, experiment with the values. Doing an FFT on your signal may help you determine its spectral density and decide where to cut; see the sketch below.
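A rough sketch of that inspect-then-filter workflow, in Python/scipy rather than LabVIEW purely for illustration (the sampling rate and cutoff are assumed placeholders):

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                          # assumed gyro sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rate_x = np.sin(2 * np.pi * 0.5 * t) + 0.3 * np.random.randn(t.size)  # stand-in X rate

# Step 1: FFT to see where the signal ends and the noise begins.
spectrum = np.abs(np.fft.rfft(rate_x))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
# (Plot freqs vs. spectrum; pick a cutoff just above the signal band.)

# Step 2: low-pass filter with the chosen cutoff.
cutoff = 5.0                         # assumed cutoff (Hz), below Nyquist fs/2
b, a = butter(4, cutoff / (fs / 2))  # 4th-order Butterworth, normalized cutoff
filtered = filtfilt(b, a, rate_x)    # zero-phase filtering of the rate signal
```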