Compact Framework serial port and balance

So, to open a serial port and successfully receive the data the balance transmits through it, I need to make sure that the settings on the serialPort object match the actual settings of the balance.
Now, the question is: how do I detect that the connection hasn't been established because the settings differ? serialPort.Open throws no exception to indicate whether the connection has actually been established. The settings are valid, but if they don't match the device (balance) settings, I am in the dark as to why the weight from the balance is not being captured.
Any input here?

Without more information about the format of the data you expect from your balance, only general techniques for detecting serial port settings mismatches apply.
If the UART settings are significantly wrong, you'll likely see a lot of framing errors: where the UART expects a stop bit (a logic 1), it instead samples a 0. You can detect this with the ErrorReceived event on the port.
// Subscribe after creating the port:
//   serialPort.ErrorReceived += OnErrorReceived;
private void OnErrorReceived(object sender, SerialErrorReceivedEventArgs e)
{
    // Framing errors are a strong hint that the UART settings are wrong.
    if ((e.EventType & SerialError.Frame) == SerialError.Frame)
    {
        // Your settings don't match; try something else.
    }
}

If the settings are close but still incorrect, the .NET serial port object may not give you an error at all (that is, until something catastrophic occurs).
The most common serial port communication failure I see is a mismatched baud rate. If you have a message that you know you can get an 'echo' for, try that as part of a handshaking effort. Perhaps the device you're connecting to has a 'status' message; no harm will come from requesting it, and you will find out whether communication is flowing correctly.
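As a minimal sketch using System.IO.Ports (the "STATUS?" command is hypothetical - substitute whatever query your balance's manual actually documents):

private bool TryProbe(SerialPort port)
{
    port.ReadTimeout = 500;        // ms; tune for your device
    port.DiscardInBuffer();        // drop any stale or garbled bytes
    port.Write("STATUS?\r\n");     // hypothetical query command

    try
    {
        // Blocks until a newline arrives or the timeout expires.
        string reply = port.ReadLine();
        return reply.Length > 0;
    }
    catch (TimeoutException)
    {
        return false;              // no answer: settings (or cabling) likely wrong
    }
}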
For software handshaking (XON/XOFF), there's very little you can do to detect whether it's configured correctly. Depending on the underlying serial port driver implementation, the serial port object can do anything from ignoring it completely to throwing exceptions on its worker threads. I've had serial port drivers that completely ignore XON/XOFF and pass those characters straight through to the program - yikes!
For hardware handshaking, the basic echo strategy for baud rate may work, depending on how your device works. If you know that it will do hardware handshaking, you may be able to detect it and turn it on. If the device requires hardware handshaking and it's not on, you may get nothing, and vice versa.
Another setting that's more rarely used is the DTR pin - data terminal ready. Some serial devices require that it be asserted (i.e., set to true) before they will start sending data. It's false by default; give toggling it a whirl.
Note that the serial port object is ... finicky. While not necessarily required, I would close the port before making any changes.
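A sketch of that sequence, using the System.IO.Ports.SerialPort property names:

// Close before reconfiguring; the port object can misbehave if you
// change settings while it is open.
if (serialPort.IsOpen) serialPort.Close();
serialPort.Handshake = Handshake.RequestToSend;  // hardware (RTS/CTS) handshaking
serialPort.DtrEnable = true;                     // assert DTR: "data terminal ready"
serialPort.Open();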
Edit:
Thanks to your comments, it looks like this is your device. It says the default settings should be:
1200 baud
Odd parity
1 stop bit
Hardware handshaking
It doesn't specify how many data bits, but the device says it supports 7 and 8. I'd try both of those. It also says it supports 600, 1200, 2400, 4800, 9600, and 19200 baud.
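If manual trial and error gets tedious, a brute-force scan over the documented combinations is easy to sketch (TryProbe is the hypothetical echo helper from earlier; adapt it to whatever query your balance supports):

private bool FindWorkingSettings(SerialPort serialPort)
{
    // Documented baud rates and data-bit options for this balance.
    int[] baudRates = { 600, 1200, 2400, 4800, 9600, 19200 };
    int[] dataBitOptions = { 7, 8 };

    foreach (int baud in baudRates)
    {
        foreach (int bits in dataBitOptions)
        {
            if (serialPort.IsOpen) serialPort.Close();
            serialPort.BaudRate = baud;
            serialPort.DataBits = bits;
            serialPort.Parity = Parity.Odd;      // documented default
            serialPort.StopBits = StopBits.One;  // documented default
            serialPort.Open();

            if (TryProbe(serialPort))
                return true;  // this combination answered the probe
        }
    }
    return false;  // nothing answered: start suspecting the cable
}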
If you've turned on hardware handshaking, enabled DTR (two different things), and cycled through all the different baud rates, there's a good chance the problem isn't your settings. The serial cable in use may be wired incorrectly for your device. Some serial cables are 'passthrough' cables, where pins 1-9 on one side match exactly with pins 1-9 on the other. Then you have 'crossover' cables, where the TX and RX lines are swapped (so that when one side transmits, the other side receives - a very handy cable).
Consider looking at the command table in the back of the manual there; there's a "print software version" command you could issue to get some type of echo back.

Serial ports use a very, very old communications technology built on a very, very old protocol called RS-232. This is pretty much as simple as it gets: the two endpoints run clocks at an agreed rate and test the line voltage every bit period to see whether it is high or low (with high meaning 0 and low meaning 1, which is the opposite of most conventions - again an artifact of the protocol's age). The receiver re-synchronizes its clock on the edge of each start bit; the stop bits are really just guaranteed rest time in between bytes so that the next start bit can be detected. There are also a few other things thrown into the more advanced uses of the protocol, such as parity bits, XON/XOFF, etc., but those all ride on top of this very basic communication layer.

Detecting a mismatch of the clocks on each end of the serial line is going to be nearly impossible - you'll just get incorrect data on the receiving end. The protocol itself has no built-in way to identify this situation, and I am unaware of any serial driver smart enough to notice the input data being clocked at an inappropriate frequency. If you're using one of the error detection schemes such as parity bits, probabilistically almost every byte will be declared an error. In short, the best you can do is check the incoming data for errors: parity errors should be detected by your driver/software layer, whereas errors in the data your app receives from that layer will need to be checked by your program (the latter can be assisted by the use of checksums).
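To put rough numbers on that (my arithmetic, not the original answer's): with 8-N-1 framing a byte costs 10 bit times, and since the receiver only re-times itself at each start bit, the two clocks must stay within about half a bit over those 10 bits - roughly a 5% rate tolerance:

// Back-of-the-envelope RS-232 timing at 9600 baud, 8-N-1 framing.
const int baud = 9600;
const int bitsPerFrame = 10;                    // 1 start + 8 data + 1 stop

double bitTimeUs = 1e6 / baud;                  // ~104.2 us per bit
double frameTimeUs = bitsPerFrame * bitTimeUs;  // ~1042 us per byte
double bytesPerSecond = (double)baud / bitsPerFrame;  // 960 bytes/s max

Console.WriteLine("bit: {0:F1} us, byte: {1:F0} us, max {2} bytes/s",
    bitTimeUs, frameTimeUs, bytesPerSecond);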

Related

Capture RAW data from Ethernet using Wireshark

I am new to Wireshark and packet capture in general, so let me get straight to it.
I have hardware that outputs its data over Ethernet using UDP broadcast. I can plug an Ethernet cable directly into an in-line RJ-45 coupler (attached to the hardware) and into my PC running Wireshark.
Requirements: I need to capture the raw data my hardware is broadcasting so it can be given to another team to work out the format it uses, for further post-processing.
What I did: initially, I connected the Ethernet cable from my home network and started capturing packets, which didn't make any sense to me.
Can you please point out whether I am going in the correct direction? Sorry if it's a very basic question, but raw data from the hardware is important for my further tasks....
As far as any software can understand a wire, you will always get a packet. Between you (in front of the computer) and the cable in the RJ-45 jack sits a NIC (network interface controller, i.e. your network card).
Your Ethernet NIC will read the signal on the cable (Manchester-encoded, for Ethernet) and synchronize itself to any Ethernet traffic on that cable. What does "synchronizing" mean here? In front of any Ethernet traffic come 64 alternating 0 and 1 bits, which are meant to synchronize the clocks on both communicating NICs. Without proper clock synchronization, some data may be misinterpreted.
But why am I talking about clock synchronization? Because if you want the data as raw as it is on the cable, you will not get it. A NIC never sends the synchronization bits to the rest of the computer, so it is absolutely impossible to read exactly what is on the cable using software.
On the other hand, I find it hard to believe you want the data that raw. After the synchronization bits comes an Ethernet-encapsulated packet. Yup, Ethernet uses packets. They're link layer packets (layer 2 in the OSI model).
And Wireshark gives you exactly that (in most cases; see the notes at the end for two exceptions to this rule): every Ethernet packet that the NIC understands, manages to sync with, and manages to read without collision is sent to the kernel and then read by Wireshark. A cable has electrical interference and no provision against collisions (it's just a piece of copper!), so the NIC abstracts away things like interference and collisions.
I'll repeat it once more: after abstracting away the synchronization bits, sender collisions (which turn the cable into one huge burst of interference), and plain interference, all that remains is a stream of packets, one after the other.
Extra Notes
NICs do ignore some Ethernet packets: those not addressed to their MAC address. This can be changed by enabling promiscuous mode (available on most NICs), but it is irrelevant for broadcast packets.
There are exceptions to the rule of Wireshark getting all the traffic coming from the NIC:
If the traffic comes in incredibly fast, Wireshark may lose its kernel scheduling slot and miss some packets. It happens; nothing can be done about it.
If you listen on all interfaces (as opposed to selecting a single interface to listen on), Wireshark will strip the Ethernet (or Wi-Fi) headers. This is a Wireshark hack needed to make output files uniform (and readable by other applications).
TL;DR: Wireshark's output (pcap) is pretty much just the stream of packets it got from the NIC, one after the other. That is as raw as you can get with software.
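As a side note, since the hardware already broadcasts UDP, the payload bytes can also be dumped directly with a few lines of code, which may be easier to hand to the other team than a pcap. A minimal C# sketch (the port number 5000 is a placeholder for whatever your hardware actually uses):

using System;
using System.Net;
using System.Net.Sockets;

class BroadcastDump
{
    static void Main()
    {
        // Bind to the UDP port the hardware broadcasts on (placeholder: 5000).
        using (var client = new UdpClient(5000))
        {
            var remote = new IPEndPoint(IPAddress.Any, 0);
            while (true)
            {
                byte[] payload = client.Receive(ref remote);  // one datagram
                Console.WriteLine("{0} bytes from {1}: {2}",
                    payload.Length, remote, BitConverter.ToString(payload));
            }
        }
    }
}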

Sending AT commands from an embedded system to a Fastrack Supreme Wavecom Module

I have an embedded system that controls a motor using PWM, among other things. I send commands over a serial connection, which is connected to a Fastrack Wavecom Supreme GSM module. However, the module connected to the embedded system (the client) fails to send the message to the server module.
I have been able to send messages back and forth between the two Wavecom modules; however, when I try to send from my PIC18F45K22 to the Wavecom module, it fails.
Any ideas of what could be going wrong?
You did not specify what type of serial communication you are using. For instance, if you are using the PIC's SPI module, you may be sampling on the wrong edge of the clock. There are at least 2 SPI modes in wide use and 4 altogether. If you are using the PIC's UART, there is a whole bucket full of settings that may be off: speed, number of bits, in-band signaling, out-of-band signaling, parity, etc.

How come I do not receive a response from the SIM (ISO-7816)?

I've got a SIM connected to my microcontroller. The RST, I/O, and CLK pins are wired correctly. There is a hardware UART on my board, but since it is full-duplex and not half-duplex, I've jumpered RX/TX together.
So far, I toggle RST according to ISO-7816, and my UART buffer fills up with the ATR the SIM card responds with. Once I've received the ATR, I switch the UART to TX mode and send it a PPS. After sending, I switch the UART back to RX-only mode. The PPS follows the correct format as stated in ISO-7816, but I do not receive the confirmation bytes from the SIM. The confirmation is supposed to be a repeat of the settings I sent.
I suppose your problem has the same origin as one I had with GSM modems.
You send a command and get an acknowledgement from the device, then send the next command, get an ack, and so on. Sooner or later the device hangs up.
The key is the interpretation of the acknowledgement.
You may think the acknowledgement means the command was accepted AND executed. However - at least on ALL the GSM modems I know of - it means no more than that the command was accepted and INTERPRETED, not executed. With time-consuming commands, you end up sending your next command while the previous one is still executing. You do it because you assume the acknowledgement means the command is done - but that is not true.
The device may or may not buffer the accumulating commands, but sooner or later it runs out of resources and hangs up.
I have no experience with the device you use, but the phenomenon seems to be the same.
While I'm not a protocol expert, the most likely cause seems to me to be that you send the PPS too early - "after sending" can easily be too early on modern microcontrollers. ISO 7816-3 states that the guard time applies as usual and that the waiting time is 9,600 etus.
Sending the PPS too early means that the card is not yet listening, which perfectly explains receiving no response at all. A wrong format would instead cause an error block, which should also be visible on the scope - which supports my assumption.
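For a sense of scale (my arithmetic, assuming the ISO 7816-3 defaults F = 372 and D = 1, and a hypothetical 4 MHz card clock):

// ISO 7816-3 default timing: 1 etu = (F/D) / f, with F = 372 and D = 1.
const double cardClockHz = 4000000.0;           // assumed card clock
const double etuSeconds = 372.0 / cardClockHz;  // ~93 us per etu at 4 MHz

double waitingTime = 9600 * etuSeconds;  // ~0.89 s: how long the card may take
double guardTime = 12 * etuSeconds;      // default 12-etu character spacing

Console.WriteLine("etu = {0:F1} us, WT = {1:F2} s, GT = {2:F2} ms",
    etuSeconds * 1e6, waitingTime, guardTime * 1e3);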

USB CDC device stalling

I'm writing a simple virtual serial port device to replace an older serial port. By this point I'm able to enumerate the device and send/receive characters.
After a varying number of bulk OUT transmissions from the host to the device, the endpoint appears to give up and stop transferring data. On the PC side I receive a write error, and judging from a USBlyzer trace the music stops on a stall (USBD_STATUS_STALL_PID). However, my code never intentionally issues a STALL condition on that endpoint, and the status flag for having generated one never gets set.
Given the short amount of time elapsed (<300 µs) between issuing the request and the STALL it would appear to be an invalid response of some sort, and not a time-out. On the device side the output endpoint is ready to go, with data in the buffer and proper DATA0/1 synchronization, but nothing further ever happens.
Note that the device appears to work fine even for long periods of time until I start sending "large" quantities of data. As near as I can tell the device enumeration/configuration also appears to complete successfully. Oh, and the bulk-in endpoint continues to work just fine after this.
For the record I'm using the standard Windows usbser.sys driver and an XMega128A4U µP. I'm also seeing the same behaviour across multiple Windows Vista and 7 machines.
Any ideas what I'm doing wrong or what further tests I might run to narrow things down?
USBlyzer log, USB CDC stack, test project
For the record this eventually turned out to be an oscillator problem. (Apparently the FLL's reference is always 1,024 Hz even when the 1,000 Hz USB frames are chosen. The slight clock error meant that a packet occasionally got rejected if it happened to contain one too many 1-bits in a row.)
I guess the moral of the story is to check the basics before assuming you've got a problem with the higher-level protocol. Also, in retrospect, a hardware USB analyzer would have been a worthwhile investment; the software alternatives mostly seem to spit out a generic error code, or nothing at all, when something goes awry.
Stalling of the OUT endpoint may happen on an overflow of the output buffer on the host side. Are you sure that the device actually fetches the data it receives via the OUT endpoint - and if so, does it fetch the data at least as fast as the data is sent to the device?

"Note that the device appears to work fine even for long periods of time until I start sending 'large' quantities of data."

This seems to be a hint of an output buffer overflow.

Thermal Printer interfacing with the AM1808

I have to interface a thermal printer with my AM1808 board running embedded Linux.
I have already interfaced a printer with only unidirectional communication, meaning I only need to send data and never receive anything back from the printer for verification.
Now I have a printer that needs bidirectional communication: I have to send the data, and likewise I need to receive something from the printer to verify whether it has successfully printed or not.
My printer hangs after it has printed around 4000 bytes, so I have to reinitialize it to empty its built-in buffer.
Now my question is: once I have configured a UART port, do I have to enable or disable transmission or reception? Can it work with both transmission and reception enabled? How can I do this?
And do I have to put the printer on an interrupt?
Thank you.
All UARTs I've ever worked with have independent TX and RX hardware. Assuming no hardware flow control is enabled, if you can transmit OK, you should be able to receive.
"Whether I have to put the printer on an interrupt?" - Well, on a preemptive multitasker, it's usual to use an interrupt-driven driver (or some variant, e.g. DMA with an interrupt on completion), yes.
I have already interfaced a printer with only unidirectional communication, meaning I only need to send data and never receive anything back from the printer for verification.

"... no need to receive anything ..." is probably a faulty assumption.
Your printer should have some kind of flow control to prevent data overrun. Character displays and line printers can often receive data faster than they can display or print it. These devices use a simple comm protocol that has no facility for retransmission of lost data, so there's flow control to notify the host to (temporarily) stop sending data when the device's receive buffer is full.
An EIA/RS-232 serial interface can use either hardware (typically the CTS control line) or software (in-band data, typically the XON and XOFF characters) for single-ended flow control. Linux serial port drivers and the line discipline make flow control invisible to the application program once the serial port is configured.
"My printer hangs after it has printed around 4000 bytes, so I have to reinitialize it to empty its built-in buffer."

This appears to be evidence that you are ignoring whatever flow control the printer is providing, and causing data overrun.
"Once I have configured a UART port, do I have to enable or disable transmission or reception?"
That's not the salient question. You need to determine what kind of flow control the printer needs, and then implement (i.e. configure) that.
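For illustration, in System.IO.Ports terms (matching the snippets earlier on this page; on embedded Linux the equivalent is done through termios flags), the two flavours are one property away - which one your printer needs has to come from its manual:

// Hardware flow control: the driver pauses transmission whenever the
// printer deasserts CTS, so the printer's buffer can't overrun.
serialPort.Handshake = Handshake.RequestToSend;

// ...or software flow control: the printer sends XOFF when its buffer is
// nearly full and XON once it has drained.
serialPort.Handshake = Handshake.XOnXOff;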