Hi, I am trying to demodulate a GFSK signal using GNU Radio.
I connected an osmocom source to a FIR filter, and the filter to a Quadrature Demod block (which outputs the signal to a file),
as shown here:
my flow graph
The Quadrature Demod gain is samp_rate/(2*pi*deviation/8).
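In code, the chain corresponds roughly to this (just a sketch; the deviation, filter cutoff and decimation are placeholder values, not necessarily what my flowgraph uses):

    # Rough equivalent of the flowgraph: osmocom source -> low-pass FIR -> quadrature demod -> file sink.
    # samp_rate and the 868 MHz centre frequency come from the flowgraph; deviation, cutoff,
    # transition width and decimation below are placeholders.
    import math
    import osmosdr
    from gnuradio import gr, blocks, analog, filter
    from gnuradio.filter import firdes

    class gfsk_rx(gr.top_block):
        def __init__(self):
            gr.top_block.__init__(self, "GFSK demod sketch")
            samp_rate = 4e6
            deviation = 50e3        # placeholder FSK deviation
            decim = 8               # placeholder FIR decimation

            src = osmosdr.source(args="numchan=1")
            src.set_sample_rate(samp_rate)
            src.set_center_freq(868e6, 0)

            taps = firdes.low_pass(1.0, samp_rate, 200e3, 50e3)
            lpf = filter.fir_filter_ccf(decim, taps)

            # Gain as in the question; note that after decimating by 8 the usual
            # choice would be (samp_rate / decim) / (2 * pi * deviation).
            demod = analog.quadrature_demod_cf(samp_rate / (2 * math.pi * deviation / 8))

            sink = blocks.file_sink(gr.sizeof_float, "demod.raw")
            self.connect(src, lpf, demod, sink)

    if __name__ == "__main__":
        tb = gfsk_rx()
        tb.start()
        input("Receiving; press Enter to stop.")
        tb.stop()
        tb.wait()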
Using Audacity, I opened the file containing the demodulated signal. However, no matter what packet I send (even if I don't send anything), I don't see the peaks that are supposed to represent a packet.
Demodulated signal shown in Audacity
You can see an example here; it seems that my device is sending packets constantly.
What am I doing wrong?
I think you should change the sample rate from 4 Msps to a value greater than the Ch0 frequency (currently 868 MHz).
I'm using two USRP B200 boards to make an RF link between them, wired with an SMA cable.
I'm using GNU Radio Companion with DVB-S2 flowgraphs to send and receive some test videos.
Thankfully, these flowgraphs did work.
https://github.com/drmpeg/gr-dvbs2
https://github.com/drmpeg/gr-dvbs2rx
What I want is to check the error rate of the received signal (e.g. displaying "Current error rate: 5%" in the terminal), to show whether the signal (video) is received well or not. I tried to find this kind of error-rate output in the LDPC or other decoder blocks, but I couldn't.
Alternatively, is there any way to check the error rate between my original video file and the received file, even after all RF communication is done?
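If an offline comparison is acceptable, I suppose even a simple script like this could estimate the bit error rate between the two files (just a sketch; the file names are placeholders and it assumes the files are byte-aligned):

    # Rough offline bit-error-rate estimate between the sent and received files.
    # Assumes the two files are byte-aligned (no dropped frames or sync offset);
    # file names are placeholders.
    def file_ber(sent_path, recv_path):
        errors = 0
        total_bits = 0
        with open(sent_path, "rb") as f_sent, open(recv_path, "rb") as f_recv:
            while True:
                a = f_sent.read(4096)
                b = f_recv.read(4096)
                if not a or not b:
                    break
                for x, y in zip(a, b):
                    errors += bin(x ^ y).count("1")   # differing bits in this byte pair
                total_bits += 8 * min(len(a), len(b))
        return errors / total_bits if total_bits else 0.0

    if __name__ == "__main__":
        print("Current error rate: {:.2%}".format(file_ber("original.ts", "received.ts")))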
Many thanks in advance.
I'm trying to implement the UPLINK of a Ground Station controlling a small satellite. The idea is that the link should remain active at all times between transmitted telecommands. For this, I need to insert some DUMMY or IDLE sequence bytes such as 0xAA or similar.
I have found that some people already faced a similar issue and posted their questions here:
https://www.ruby-forum.com/t/constant-carrier-digital-transmission/163379
https://lists.gnu.org/archive/html/discuss-gnuradio/2016-08/msg00148.html
So far, the best I could achieve was to modify the EventStream Source block from https://github.com/osh/gr-eventstream in order to preload the vectors with my dummy sequence (i.e. 0xAA) instead of preloading them with zeroes. This is a general overview of the GNURadio graph I'm using:
GNURadio Flowgraph Picture
This solution, however, introduces huge latency: the sent message does not appear at the output until a long time has passed (on the order of several seconds).
Is there a way of programming the USRP using GNURadio so that it constantly sends a fixed sequence which is only interrupted when an incoming message is passed in? I assume that the USRP has the ability to read tagged streams in order to schedule transmissions. However, I'm not sure how to fit this into my specific application.
Thanks beforehand!
Joa
I believe this could be done using a TCP or UDP source block.
Your control information could be sent to the socket using TCP/UDP. GNU Radio would then collect and transmit the packets. Your master control program would then have to handle the IDLE stuffing, but solving the problem outside of GNU Radio is easier.
Your master control program would basically do the following (see the sketch after this list):
1. Transmit control data as needed.
2. If no control data is ready before the next packet must be sent, send an IDLE packet.
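A minimal sketch of that loop, assuming a GNU Radio UDP Source block is listening on 127.0.0.1:52001 (the address, port, packet period and the get_next_telecommand() helper are all placeholders):

    # Master-control loop: send a pending telecommand if there is one, otherwise
    # send the IDLE filler, at a fixed packet period.
    import socket
    import time

    IDLE = bytes([0xAA]) * 64        # dummy/idle filler frame (placeholder length)
    PACKET_PERIOD = 0.05             # seconds between packets (placeholder)

    def get_next_telecommand():
        """Return the next telecommand as bytes, or None if nothing is pending."""
        return None                  # placeholder: hook this into your command queue

    def main():
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        dest = ("127.0.0.1", 52001)  # the UDP Source block in the flowgraph listens here
        while True:
            tc = get_next_telecommand()
            sock.sendto(tc if tc else IDLE, dest)   # 1) control data, 2) else IDLE
            time.sleep(PACKET_PERIOD)

    if __name__ == "__main__":
        main()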
I have an NI DAQ USB-6341 with BNC termination. I have a device that outputs a +/- 10 V signal indicating its current status. The device operates correctly; the signal was confirmed with a multimeter.
The DAQ is also CAPABLE of reading the signal correctly: when attached to an analog input channel, the test panel successfully reads the full 10 V range.
However, when I create a simple VI to read the voltage, it absolutely refuses to read anything exceeding 5.473 V or below -5.306 V.
Is there some sort of configuration I'm missing here? Some setting to 'unlock' the full range? I have used the analog output to put out a +/- 10 V signal before with no problem.
In DAQmx Create Channel.vi, you can configure the voltage range.
Please check the LabVIEW context help.
If you want to set the voltage range to -10 V to +10 V,
please try this setup.
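For reference, if you ever read the same channel from a script instead of a VI, the Python nidaqmx wrapper exposes the same range parameters; something like this (the "Dev1/ai0" name is just an example):

    # Read one sample from an analog input with an explicit +/- 10 V range using
    # the Python nidaqmx package; "Dev1/ai0" is an assumed device/channel name.
    import nidaqmx

    with nidaqmx.Task() as task:
        # Without min_val/max_val the driver falls back to a default range,
        # which can clip the signal; request the full +/- 10 V range explicitly.
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0", min_val=-10.0, max_val=10.0)
        print(task.read())   # single on-demand voltage reading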
Since nothing is specified in terms of configuration, LabVIEW will generate a default configuration, which in this case is wrong. You can do two things: either specify the correct configuration manually or use Express VIs.
I would use an Express VI; you can configure and check it much like a test panel. When satisfied, you can convert the Express VI to LabVIEW code for further programming.
I have a GPS circuit board from China. The only information I can find on this thing is: "amoj GPS 04C www.amoj.com".
It has a serial (DB9) connection, and I would like to work out how to connect to it with PuTTY or something similar.
How can I determine the port settings required to access it?
Pictures below:
Photos in Dropbox
The Jupiter TU60 serial interface is 9600 8N1 by default. The only sentence it will output automatically is the flash checksum message, about a second after power-up. Google the datasheet for the device and it will tell you about this.
To have it output the position and other information, you must command it to do so. There is a default set of commands that are active after power up. They begin with ## and are from the protocol used by Motorola. Refer to the M12+ Users Guide and Supplement (available online) for information on how to use these commands. I have been able to enter them from Realterm. The only tricky part is calculating the checksum. You can use most hex calculators to do that.
According to the datasheet, the unit goes into survey mode automatically and after about 24 hours goes into position hold. The 1 PPS and 10 kHz signals are valid to better than a microsecond within a few minutes of power-up, and to 50 ns after a day. I have compared this against another standard I have to verify it. You can use the ##Ea command to get the status of the unit, and the M12+ manual will tell you how to decode it.
Look for $GP... messages at 4800 and 9600 bps, as yegorich suggests. Common NMEA messages output by GPS devices are $GPGGA, $GPVTG, and $GPRMC. If you see that data coming out, look up the NMEA 0183 sentence structure and you will have what you need.
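If it helps, here is a rough script to probe those baud rates and sanity-check the NMEA checksums (needs pyserial; the port name is just an example):

    # Probe the suggested baud rates and look for NMEA "$GP..." sentences,
    # verifying their checksums. Needs pyserial; the port name is a placeholder.
    import serial

    def nmea_checksum_ok(sentence):
        """NMEA checksum: XOR of all characters between '$' and '*'."""
        sentence = sentence.strip()
        if not sentence.startswith("$") or "*" not in sentence:
            return False
        body, _, given = sentence[1:].partition("*")
        calc = 0
        for ch in body:
            calc ^= ord(ch)
        return "{:02X}".format(calc) == given[:2].upper()

    for baud in (4800, 9600):
        with serial.Serial("COM1", baud, timeout=2) as port:
            for _ in range(10):
                line = port.readline().decode("ascii", errors="replace")
                if line.startswith("$GP"):
                    ok = "checksum ok" if nmea_checksum_ok(line) else "checksum bad"
                    print(baud, line.strip(), ok)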
I have the same board, with the Navman Jupiter T TU60 GPS 1 PPS / 10 kHz module on it. I just received my SMA antenna and have hooked it up. I am applying 12.6 V to the centre pin.
It outputs 1 PPS on the LED with no signal present, so that is not to be trusted. Mine is labelled 1 PPS and 10 kHz underneath the PCB, but these are actually swapped! I put the 10 kHz output on my DSO and get a 10 kHz square wave with 50% duty cycle, but there is ringing on the rising edge, so I have to set the trigger level to 0.8 V to get the DSO to register the 10 kHz frequency. I suspect this may be because the output expects a load and is not seeing one. (Now, was I using AC or DC coupling?)
I too am getting nothing on the serial port. I tried 9600 and 4800 baud using PuTTY on COM1 (I have a nice old motherboard) and then tried swapping RX and TX, but no luck. At the moment I am checking the serial signals with the DSO to see if I can work out what is happening. I suspect these boards are rubbish and useful only as power supplies.
It reads 10.0000 kHz on my HP 5328A counter and sometimes reads 9.9999 kHz. It would be nice to be able to talk to the GPS to see whether it has satellite lock.
Please let me know how you get on and if you find out any further info.
Brett VK6EZ.
I successfully use WasapiLoopbackCapture() to record audio played on the system, but I'm looking for a way to record what the user would actually hear through the speakers.
I'll explain: if an application plays music, WASAPI loopback will capture the music samples even if the main Windows volume control is set to 0, meaning even if no sound is actually heard through the audio card's output jack (speakers/headphones/etc.).
I'd like to intercept the audio actually "reaching" the output jack (after ALL the mixers on the audio path have "done their job").
Is this possible using NAudio (or some other infrastructure)?
A code sample or a link to one would come in handy.
Thanks much.
No, this is not directly possible. The loopback capture provided by WASAPI is the stream of data being sent to the audio hardware. It is the hardware that controls the actual output sound, and this is where the volume level is applied to change the output signal strength. Apart from some hardware- and driver-specific options - or some interesting hardware solutions like loopback cables or external ADC - there is no direct method to get the true output data.
One option is to get the volume level from the mixer and apply it as a scaling factor on any data you receive from the loopback stream. This is not a perfect solution, but possibly the best you can do without specific hardware support.
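As a trivial illustration of that scaling idea (not NAudio code, just the arithmetic on float samples; reading the actual mixer volume is a separate, platform-specific step):

    # Illustration of the scaling idea only: multiply the float samples from the
    # loopback buffer by the current output volume (a value between 0.0 and 1.0).
    import numpy as np

    def apply_volume(samples, volume):
        """Approximate what reaches the output jack at the given volume level."""
        return samples * volume

    buf = np.array([0.5, -0.5, 1.0, -1.0], dtype=np.float32)   # example loopback buffer
    print(apply_volume(buf, 0.25))                             # heard at 25% system volume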