Having an issue with serial connection between UAV and Gimbal device - embedded

I have a gimbal device which takes MAVLink commands over a UART port. I need to connect it to a UAV through the TELEM port of a Pixhawk board.
When I use a serial-to-USB cable to connect the gimbal to my PC and send the appropriate serialized MAVLink commands over that connection, the gimbal works as expected. When I connect the TELEM output of the Pixhawk board to my PC, save a chunk of the data (a byte string) going through, and send that same data to the gimbal using the process above, the gimbal also works as expected. But connecting the gimbal to the Pixhawk board in the same way fails to make it work. The baud rates on both sides are the same.
Any idea what the issue might be?
I don't have a background in hardware and I'm not sure how data transfer over serial actually takes place, but I think the issue might be at the "network layer" of the communication. I tried slowing down the message rate on the Pixhawk side, but it didn't help. To try to reproduce the issue, I added randomized strings between messages and inserted delays between parts of a message while sending from my PC to the gimbal, but the gimbal still worked as expected, so I'm at my wits' end trying to identify where the problem is.
Another idea I had was saving the "stream" of data coming from the TELEM port and replaying it to the gimbal from my PC, but I haven't found a way to do this. Is there an equivalent to pcap files for serial communication?
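There isn't a standard pcap equivalent for a plain UART, but you can approximate one by opening the port yourself and dumping every byte to a file. A minimal sketch, assuming a Linux PC with the Pixhawk TELEM port on /dev/ttyUSB0 at 57600 baud (both are placeholders for your setup):

// serial_dump.cpp - dump raw bytes from a serial port into a binary log file.
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdio>
#include <fstream>

int main() {
    int fd = open("/dev/ttyUSB0", O_RDONLY | O_NOCTTY);   // placeholder device path
    if (fd < 0) { std::perror("open"); return 1; }

    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                 // raw mode: no line editing, no CR/LF translation
    cfsetispeed(&tio, B57600);       // match the TELEM baud rate (assumed 57600 here)
    cfsetospeed(&tio, B57600);
    tio.c_cc[VMIN]  = 1;             // read() returns as soon as one byte is available
    tio.c_cc[VTIME] = 0;
    tcsetattr(fd, TCSANOW, &tio);

    std::ofstream log("telem_capture.bin", std::ios::binary);
    unsigned char buf[256];
    while (true) {
        ssize_t n = read(fd, buf, sizeof(buf));
        if (n <= 0) break;
        log.write(reinterpret_cast<const char*>(buf), n);   // append raw bytes to the log
    }
    close(fd);
    return 0;
}

Replaying that file byte-for-byte at the same baud rate lets you check whether the gimbal accepts the exact byte stream the Pixhawk produces.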

Related

Issues communicating with devices over a USB hub

I'm facing issues when communicating with devices over a USB hub. Enumeration works when devices are attached directly to the host port, but some devices have issues when connected through a hub.
Setup: STM32F103C8 - MAX3421E - LUFA (USB stack), ported to MAX3421E (host) and STM32F103C8T6 (device) - USB Full-Speed setup
Scenario:
When I attach a device directly to the host, I don't experience any issues enumerating almost all devices (a few seem to be faulty and have weird/nonstandard behavior). But when I try to enumerate through a USB hub, devices start to behave very strangely. I receive far more NAKs from devices than when they are connected directly to the host. Some devices are able to return the Device Descriptor, but retrieving the Configuration Descriptor fails. Some devices return a Toggle Error after several NAKs; so far this could be remedied by delaying the retried IN token. Behavior also differs between hubs: one device has no problems when connected to HUB1 but has issues on HUB2. Then I have HUB3 (7-port), which internally acts as a hub behind a hub. On HUB3 the device works fine on a port behind the secondary internal hub, but not on the primary ports exposed by the "root" hub.
I suspect the hub's Transaction Translator (TT) could somehow be interfering with the USB communication, but according to the information I have found, the TT should not be active in a Full-Speed setup.
I have checked (many times) that I'm setting the correct device address assigned during the SetAddress phase (which is confirmed by the device returning its Device Descriptor). When I step through in the debugger it seems I can also get the Configuration Descriptor, but during a normal run it isn't retrieved, and only when going through a hub.
Does anyone have any ideas what to look for? I've run out of ideas after a week of trying to find the root cause.
Thanks
So...
- As usual, after days of searching for the root cause, the solution came naturally right after asking somewhere (this always happens to me, although I do always try before asking).
- When using hubs, make sure you don't suspend SOF generation during control transfers. LUFA only resumes the bus inside its control transfer routines, so make sure you don't stop and re-enable SOF within them (my fault, as I'm using a version ported to the MAX3421E).
- If you have a tight main loop, make sure you don't reinitialize a USB transfer before the previous attempt has completed, and if you do, check that you don't overwrite data which hasn't been fully processed yet (especially when using interrupt-driven transfer-complete processing). Things can appear to work when you have a lot of debug output, because it delays those time-critical transfers. See the sketch below.
The enumeration-over-hub issues are now gone; the remaining small glitches are a matter of tweaking.
Unfortunately, because I suspected electrical issues, I had already unsoldered the USB host shield and soldered on another one, which in light of the new information seems to have been unnecessary. Never mind, I got to practice my soldering skills.
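To illustrate that last bullet, here is a minimal sketch of gating a new transfer on completion of the previous one. This is generic pseudo-driver code, not the actual LUFA/MAX3421E API; the names transfer_busy, try_start_transfer and on_transfer_complete are made up for illustration:

#include <atomic>
#include <cstdint>
#include <cstring>

// Hypothetical driver glue: the identifiers below are illustrative, not LUFA/MAX3421E APIs.
static std::atomic<bool> transfer_busy{false};
static uint8_t tx_buf[64];

// Called from the interrupt handler when the controller signals completion.
void on_transfer_complete() {
    transfer_busy.store(false, std::memory_order_release);
}

// Called from the tight main loop; refuses to clobber an in-flight transfer.
bool try_start_transfer(const uint8_t* data, std::size_t len) {
    bool expected = false;
    if (!transfer_busy.compare_exchange_strong(expected, true,
                                               std::memory_order_acq_rel)) {
        return false;                    // previous transfer not finished yet; retry later
    }
    std::memcpy(tx_buf, data, len);      // safe now: the hardware is not using this buffer
    // start_hw_transfer(tx_buf, len);   // hand the buffer to the host controller (placeholder)
    return true;
}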

MBED Serial dropping data

I use MBED (online IDE & libraries) for my application, with a NUCLEO-F411RE host board and a 4D Systems touch display connected by full-duplex serial communication.
I am able to send data successfully from the host to the display without errors. However, when sending data from the display back to the host, I am losing data.
Reducing the baud rate to 9600 does not solve the problem.
The host processor runs in a super loop whose first action is to check whether the LCD has sent serial data ( lcd4D.readable() ).
The host then receives one character at a time ( lcd4D.getc() ), echoes it to the PC via USB ( pc.printf(&recChar) ) and does some further processing.
I am also monitoring the physical host receive pin in a separate terminal session. From this I am certain that the LCD sends its data correctly; however, the data is not received and echoed correctly by the host processor (the echo to the PC is only used for debugging purposes).
Refer to the super-loop code snippet:
do {
    if ( lcd4D.readable() ) {       // poll: has the display sent a byte?
        recChar = lcd4D.getc();     // read one character
        pc.putc(recChar);           // echo to PC; printf(&recChar) is unsafe because recChar is not a null-terminated string
        lcd4D_intr_Rx();            // further processing
    }
} while (true);                     // loop condition not shown in the original snippet
Also refer to the attached screen print: the terminal on the left shows the PC echo (with data loss) and the terminal on the right shows the hardware pin monitor (confirming the data is sent correctly).
Implementing a serial RX interrupt also does not help; the data loss still occurs.
Thanks for any suggestions; I am out of ideas.
I have solved the problem.
The issue was that the host processor needed to respond fast enough to the incoming serial data. I basically implemented a fast serial receive buffer and ensured that received characters are buffered immediately in the RX interrupt, so the super loop can process them at its own pace.
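For reference, a minimal sketch of that kind of interrupt-driven receive buffer on classic mbed. The pins PA_9/PA_10 and the 256-byte buffer size are assumptions, and the sketch omits overflow handling; the point is only that the RX interrupt just stores the byte and the super loop does the slow work later:

#include "mbed.h"

Serial lcd4D(PA_9, PA_10);              // TX, RX: placeholder pins, use your wiring
Serial pc(USBTX, USBRX);

const int BUF_SIZE = 256;
volatile char rxBuf[BUF_SIZE];
volatile int head = 0, tail = 0;

// RX interrupt: do as little as possible here, just stash the byte.
void onLcdRx() {
    while (lcd4D.readable()) {
        rxBuf[head] = lcd4D.getc();
        head = (head + 1) % BUF_SIZE;   // no overflow check in this sketch
    }
}

int main() {
    lcd4D.attach(&onLcdRx, Serial::RxIrq);
    while (true) {
        if (tail != head) {             // buffered characters are waiting
            char c = rxBuf[tail];
            tail = (tail + 1) % BUF_SIZE;
            pc.putc(c);                 // slow work (echo, parsing) happens here, outside the ISR
        }
    }
}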

Capture RAW data from Ethernet using Wireshark

I am new to Wireshark, capturing packets and all that stuff, so let me get straight to it.
I have a piece of hardware which outputs its data over Ethernet using UDP broadcast. I can plug an Ethernet cable directly between an in-line RJ-45 coupler (attached to the hardware) and my PC running Wireshark.
REQUIREMENTS: I need to capture the raw data my hardware is broadcasting so that it can be given to another team, so they can work out the format it is provided in for further post-processing.
What I did: Initially, I connected the Ethernet cable from my home network and started capturing the packets, which didn't make any sense to me.
Can you please point out whether I am going in the correct direction? Sorry if it's a very basic question, but the raw data from the hardware is important for my further tasks.
As far as any software is concerned, what comes off the wire is always a packet. Between you (sitting in front of the computer) and the cable in the RJ-45 jack sits a NIC (network interface controller, i.e. your network card).
Your Ethernet NIC reads the signal on the cable (Manchester-encoded in classic Ethernet) and synchronizes itself to any Ethernet traffic on that cable. What does "synchronizing" mean here? In front of any Ethernet traffic come 64 alternating bits of 0s and 1s (the preamble), which are meant to synchronize the clocks on both communicating NICs. Without proper clock synchronization some data may be misinterpreted.
But why am I talking about clock synchronization? Because if you want the data as raw as it is on the cable, you will not get it. A NIC never passes the synchronization bits on to the rest of the computer, so it is impossible to read exactly what is on the cable using software.
On the other hand, I find it hard to believe you want the data quite that raw. After the synchronization bits comes an Ethernet-encapsulated packet. Yup, Ethernet uses packets. They're link-layer packets (layer 2 in the OSI model).
And Wireshark gives you exactly that (in most cases; see the notes at the end for two exceptions to this rule): every Ethernet packet that the NIC understands, manages to sync to, and manages to read without collision is sent to the kernel and then read by Wireshark. A cable has electrical interference and no provision against collisions (it's just a piece of copper!), so the NIC abstracts away things like interference and collisions.
I'll repeat it once more: after abstracting away the synchronization bits, sender collisions (which turn the cable into one huge source of interference) and plain interference, all that remains is a stream of packets, one after the other.
Extra Notes
NICs sometimes do ignore some Ethernet packets: packets that are not addressed to their MAC. This can be changed by enabling promiscuous mode (available on most NICs). It is irrelevant for broadcast packets, though.
There are exceptions to the rule of Wireshark getting all the traffic coming from the NIC:
If the traffic comes in incredibly fast, Wireshark may lose its share of kernel scheduling and not see some packets. It happens; nothing can be done about it.
If you listen on all interfaces (as opposed to selecting a single interface to listen on), Wireshark will strip the Ethernet (or Wi-Fi) headers. This is a Wireshark hack needed to make the output files uniform (and readable by other applications).
TL;DR: Wireshark's output (pcap) is pretty much just the stream of packets it got from the NIC, one after the other. That is as raw as you can get with software.
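If the other team just needs the hardware's payload bytes rather than full frames, a small libpcap-based tool can strip the Ethernet/IP/UDP headers from a capture saved by Wireshark. This is only a rough sketch: it assumes Ethernet II framing, IPv4 and UDP with no VLAN tags, and the file name capture.pcap is a placeholder:

// extract_udp_payload.cpp - dump the UDP payload of every packet in a pcap file.
#include <pcap.h>
#include <cstdio>

int main(int argc, char** argv) {
    char errbuf[PCAP_ERRBUF_SIZE];
    pcap_t* pc = pcap_open_offline(argc > 1 ? argv[1] : "capture.pcap", errbuf);
    if (!pc) { std::fprintf(stderr, "pcap: %s\n", errbuf); return 1; }

    pcap_pkthdr* hdr;
    const u_char* pkt;
    while (pcap_next_ex(pc, &hdr, &pkt) == 1) {
        if (hdr->caplen < 14 + 20 + 8) continue;        // too short for Eth + IP + UDP
        const u_char* ip = pkt + 14;                    // skip the Ethernet header
        if ((ip[0] >> 4) != 4 || ip[9] != 17) continue; // keep IPv4 + UDP only
        int ihl = (ip[0] & 0x0F) * 4;                   // IP header length in bytes
        const u_char* payload = ip + ihl + 8;           // skip IP and UDP headers
        int plen = (int)hdr->caplen - 14 - ihl - 8;
        if (plen > 0) std::fwrite(payload, 1, plen, stdout);  // raw payload to stdout
    }
    pcap_close(pc);
    return 0;
}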

AT command sent to modem from microcontroller is ignored

Please help, I've spent hours trying to root-cause this problem.
Setup: MSP430 (F5529 microcontroller), HW UART (USCI_A0) and a modem connected (uBlox Leon G100). I am sure the USCI_A0 port works well, because if I connect it via USB-to-RS232 to the PC I can see the correct traffic.
A simple
uart_puts(UART_MODEM, "AT+CPWROFF\r");
should send AT+CPWROFF to the modem. And it probably does, but the modem does not power off. If I simply connect the modem to the PC and type "AT+CPWROFF" followed by Enter, the modem powers off.
I also confirmed on the PC console that the "\r" works well: it returns the carriage correctly. I tried "\n\r" as well; that does not work either. I checked the modem's S3 setting (command line termination character) and it is 013 (CR, i.e. \r).
I have no idea what is wrong. PC-to-modem works (I can power off the modem), MSP430-to-PC works (I can see that the microcontroller really does put AT+CPWROFF\r on the UART), but MSP430-to-modem does not work.
I can determine if the modem is turned on or off by looking at the current consumed.
Please, any hints?
OK, issue solved. If you take a look at the detected baud rate in the links I posted, you will notice that with the PC the measured baud rate was 9585 (a 0.14% mismatch), while with the MCU it was 7862 (a 2.73% mismatch). I switched to a 4 MHz crystal and now the error is approximately 0.23%. It's really surprising that the DCO gives such an unstable clock.
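For context, an asynchronous UART receiver typically tolerates only around 2% of total baud-rate mismatch across a frame, so a clock source that is off by a few percent (like an uncalibrated DCO) can break the link even though each byte looks fine elsewhere. A small sanity-check sketch (simplified integer-divider model; it ignores the USCI modulator, so it won't exactly reproduce the figures above):

#include <cstdio>
#include <cmath>

// Simplified UART baud check: the MCU derives its baud rate by dividing the
// source clock by an integer, so the achievable rate is rarely exactly the target.
void report(double clock_hz, double target_baud) {
    double divider = std::round(clock_hz / target_baud);   // integer baud-rate divider
    double actual  = clock_hz / divider;                   // baud rate actually generated
    double err_pct = 100.0 * std::fabs(actual - target_baud) / target_baud;
    std::printf("clock %.0f Hz -> divider %.0f, actual %.1f baud, error %.2f%%\n",
                clock_hz, divider, actual, err_pct);
}

int main() {
    report(4000000.0, 9600.0);   // 4 MHz crystal, as in the fix above
    report(1048576.0, 9600.0);   // a typical default DCO frequency (illustrative only)
    return 0;
}

On top of the divider error, the DCO's own frequency tolerance and drift add to the mismatch, which is why switching to a crystal helped.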

Using WinPCap for UDP receiving

I would like to use the WinPCap library for "reliable" UDP receiving in my C++ application. All the examples I have found use the library for capturing and then processing afterwards. Is there any way (an example) to configure pcap for a streaming mode, receiving only UDP and only on a user-defined port, or some other way to solve this? At the moment I have a reliable UDP server able to receive 0.5 Gb/s, but on a slower PC I get packet loss: I can see the packets in Ethereal but not in the application.
thanks
vsm
I assume that you have already tried all of the more standard methods of increasing the number of datagrams you are able to process? Things like increasing the receive buffer size, speeding up the processing you do per datagram, using IOCP to bring more threads to bear on the problem, or using RIO if you can target Windows 8?
If so, then using WinPCap might work, but it sounds like a bit of an extreme solution.
What you need to do is create a filter so that you only capture the datagrams you are interested in. The docs include examples which use filters; see the sketch below.
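A rough sketch of that approach with the pcap API (WinPCap/Npcap/libpcap); the adapter name and port 5000 are placeholders, and error handling is minimal:

// Sketch: capture only UDP datagrams for one port via a compiled pcap filter.
#include <pcap.h>
#include <cstdio>

int main() {
    char errbuf[PCAP_ERRBUF_SIZE];

    // Placeholder adapter name; enumerate with pcap_findalldevs() to find the real one.
    pcap_t* pc = pcap_open_live("\\Device\\NPF_{YOUR-ADAPTER-GUID}",
                                65536, 1 /*promiscuous*/, 100 /*ms timeout*/, errbuf);
    if (!pc) { std::fprintf(stderr, "open: %s\n", errbuf); return 1; }

    // Only hand the application UDP traffic for the port we care about.
    bpf_program prog;
    if (pcap_compile(pc, &prog, "udp dst port 5000", 1, 0) < 0 ||
        pcap_setfilter(pc, &prog) < 0) {
        std::fprintf(stderr, "filter: %s\n", pcap_geterr(pc));
        return 1;
    }

    pcap_pkthdr* hdr;
    const u_char* pkt;
    while (pcap_next_ex(pc, &hdr, &pkt) == 1) {
        // pkt points at the Ethernet frame; the UDP payload follows the
        // Ethernet (14 bytes), IPv4 (20 bytes if no options) and UDP (8 bytes) headers.
        std::printf("captured %u bytes\n", hdr->caplen);
    }
    pcap_close(pc);
    return 0;
}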
I have the server from here: http://www.gamedev.net/topic/533159-article-using-udp-with-iocp/. The code works with IOCP. It works fine on Windows XP; there is no problem receiving 0.5 Gb/s. But on Win7 it is now a little unreliable: sometimes there are packet-ordering errors. (My device generates UDP packets, and in the payload there is a PacketNumber, a number that increases with each packet. When an error occurs I write all the packet numbers to a file, and I can see, for example: 10, 11, 290, 13, 14...). Are there any known differences between WinXP and Win7 regarding IOCP and multithreading? Or do you know of any free UDP server with IOCP processing?
In the processing loop I only add packets to a buffer and check their numbers.
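For what it's worth, a tiny sketch of the kind of sequence check described above (PacketNumber is assumed to be a 32-bit little-endian counter at offset 0 of the payload; both are assumptions about the device's format):

#include <cstdint>
#include <cstdio>
#include <cstring>

// Compare each received datagram's PacketNumber with the previous one and
// log anything that is not simply "previous + 1" (a gap or a reordering).
void check_sequence(const uint8_t* payload, std::size_t len, uint32_t& lastSeq) {
    if (len < 4) return;
    uint32_t seq;
    std::memcpy(&seq, payload, sizeof(seq));   // counter assumed at offset 0, little-endian
    if (lastSeq != 0 && seq != lastSeq + 1) {
        std::printf("sequence break: %u -> %u\n", lastSeq, seq);
    }
    lastSeq = seq;
}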