Implementing an SPI slave ISR on PIC32?

I have two PIC32MX microcontrollers connected over a 1.53 MHz SPI bus with chip select. I am having trouble getting my slave-side interrupt service routine to transmit data correctly. As a test case, I'm having the master send out two bytes (0x01, 0x00) every 10 ms. The slave is supposed to receive the 0x01 command ID and respond with 0x02 when the master sends the second byte (the dummy 0x00).
Ideally each transfer should look like this:

Master  Slave
0x01    0x00
0x00    0x02
I'm really not sure where to start with the slave interrupt, though. I'm using a FIFO buffer called airsysTx to hold data that needs to be shifted out the next time the master makes a request. The slave receives the 0x01 from the master just fine and writes 0x02 to the FIFO buffer when it does. I'm not sure how to code the interrupt so that it will transmit correctly. The code I have below is a start, but it's wrong. Suggestions?
/*******************************************************************************
 * Interrupt service routine for SPI3 interrupts from Air MCU.
 * The user's code at this vector should perform any application specific
 * operations and MUST clear the SPI3 interrupt flags before exiting.
 ******************************************************************************/
void __ISR(_SPI_3_VECTOR, ipl7) _SPI3Interrupt()
{
    BYTE MasterCMD;
    SET_D1();                       // Set debug LED

    // RX INTERRUPT
    if (IFS0bits.SPI3RXIF)          // receive data available in SPI3BUF Rx buffer
    {
        MasterCMD = SPI3BUF;
        if (AirCMD == 0x01)
        {
            airsysTxFlush();
            airsysTxWrite(0x02);
        }
    }

    // Transmit data if needed.
    if (SPI3STATbits.SPITBE)
    {
        if (!airsysTxIsEmpty())
        {
            SPI3BUF = airsysTxRead();
        }
        else
        {
            // Else write 0 to the tx buffer to clear the spi shift reg
            SPI3BUF = 0x00;
        }
    }

    IFS0bits.SPI3RXIF = 0;
    IFS0bits.SPI3TXIF = 0;
    IFS0bits.SPI3EIF = 0;
    SPI3STATbits.SPIROV = 0;        // clear the Overflow
    CLEAR_D1();                     // Clear debug LED
}  // end ISR
What this code actually transmits is something like this:

Master  Slave
0x01    0x02
0x00    0x01

Generally you can't write a slave SPI driver to interact in the way you describe, because as the slave you can't control the timing precisely. What generates your ISR: reception of the first byte from the master, or assertion of chip select?
As the slave, you need to have set up the data bytes you want to transmit before the master starts the transaction. You usually don't have time to react to the first byte. There are a couple of ways to do this:
1) You could use a protocol where the master does a 1- or 2-byte write-only transaction that tells the slave what it wants to read. The master then waits a few milliseconds to allow the slave to prepare the response, and finally does a read-only transaction to fetch the slave's response.
2) If using DMA or a FIFO, the slave preloads the first padding byte(s) into the FIFO before the master starts the transaction. Then, when you get the ISR, you put the remaining response data into the FIFO (without a flush). You need enough pad bytes to absorb the slave's ISR latency in forming the response. For example, you may define your protocol so that the master knows the first N bytes of the response are pad bytes, followed by the response data. The padding requirement depends on your master clock speed and your slave's CPU speed and interrupt latency. A minimal sketch of this approach follows this list.
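For illustration, here is a rough sketch of option 2 applied to the ISR from the question, assuming the same airsysTx FIFO helpers and a protocol where the master treats the first response byte as a pad byte. It is a sketch under those assumptions, not a drop-in driver.

#define PAD_BYTE 0x00

// Called before the master is allowed to start a transaction
// (e.g. after initialisation or once the previous exchange completes).
void airsysTxPrime(void)
{
    airsysTxFlush();
    SPI3BUF = PAD_BYTE;             // preload the pad byte; it covers the ISR latency of the reply
}

void __ISR(_SPI_3_VECTOR, ipl7) _SPI3Interrupt(void)
{
    if (IFS0bits.SPI3RXIF)          // a byte from the master has arrived
    {
        BYTE cmd = SPI3BUF;

        if (cmd == 0x01)
        {
            // Queue the response behind the pad byte already in flight.
            // Do not flush here, or the preloaded pad byte is lost.
            airsysTxWrite(0x02);
        }
    }

    // Keep the transmit register topped up for the next byte the master clocks out.
    if (SPI3STATbits.SPITBE)
    {
        SPI3BUF = airsysTxIsEmpty() ? PAD_BYTE : airsysTxRead();
    }

    IFS0bits.SPI3RXIF = 0;
    IFS0bits.SPI3TXIF = 0;
    IFS0bits.SPI3EIF = 0;
    SPI3STATbits.SPIROV = 0;        // clear overflow
}

With one preloaded pad byte, the slave shifts out 0x00 while it receives the 0x01 command, and the queued 0x02 goes out with the master's dummy byte, which matches the desired exchange.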


STM32F407 SPI while reading data from the LIS3DSH

uint8_t SPI_Rx(uint8_t address)
{
    GPIO_ResetBits(GPIOE, GPIO_Pin_3);
    address = (0x80) | (address);
    while (!SPI_I2S_GetFlagStatus(SPI1, SPI_FLAG_TXE));
    SPI_I2S_SendData(SPI1, address);
    while (!SPI_I2S_GetFlagStatus(SPI1, SPI_FLAG_RXNE));
    SPI_I2S_ReceiveData(SPI1);
    while (!SPI_I2S_GetFlagStatus(SPI1, SPI_FLAG_TXE));
    SPI_I2S_SendData(SPI1, 0x00);
    while (!SPI_I2S_GetFlagStatus(SPI1, SPI_FLAG_RXNE));
    SPI_I2S_ReceiveData(SPI1);
    GPIO_SetBits(GPIOE, GPIO_Pin_3);
    return SPI_I2S_ReceiveData(SPI1);
}
Hi, what you see above is the SPI read function for the LIS3DSH sensor. After I send the register address, the SPI RX buffer receives data from the LIS3DSH. But why does it then need to receive the sensor's data by sending 0x00? In short, why do we need to send more data when we have already received data? Thanks in advance.
When reading data from a register of the LIS3DSH, the entire SPI transaction is two bytes long:
Byte 1: The master sends the register address (incl. a read/write flag) to the sensor.
Byte 2: The sensor sends the register value to the master.
However, SPI is a full-duplex protocol. For each bit and byte, both the master and the slave (sensor) send data. So the real communication in fact looks like so:
Byte 1
Master sends register address.
Slave sends 0 or any random value.
Byte 2
Master sends 0.
Slave sends register value.
This is what you see in the code:
while(!SPI_I2S_GetFlagStatus(SPI1,SPI_FLAG_TXE)); // Wait for SPI bus to be ready
SPI_I2S_SendData(SPI1,address); // Submit byte 1 to be sent
while(!SPI_I2S_GetFlagStatus(SPI1,SPI_FLAG_RXNE)); // Wait for byte 1 from slave
SPI_I2S_ReceiveData(SPI1); // Retrieve byte 1 from slave and ignore it
while(!SPI_I2S_GetFlagStatus(SPI1,SPI_FLAG_TXE)); // Wait for SPI bus to be ready
SPI_I2S_SendData(SPI1,0x00); // Send 0 as byte 2
while(!SPI_I2S_GetFlagStatus(SPI1,SPI_FLAG_RXNE)); // Wait for byte 2 from slave
SPI_I2S_ReceiveData(SPI1); // Retrieve byte 2 and ??
GPIO_SetBits(GPIOE,GPIO_Pin_3); // Deassert CS?
return SPI_I2S_ReceiveData(SPI1); // Retrieve byte 2 again and return it
The last part is somewhat strange. SPI_I2S_ReceiveData() is called twice at the end, probably on the assumption that it will return the same value both times. It would be cleaner to call it only once, for example:
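A version of the original function that reads the data register exactly once might look like the following (same SPI1 and GPIOE pin 3 usage as above; the local variable value is just for illustration):

uint8_t SPI_Rx(uint8_t address)
{
    uint8_t value;

    GPIO_ResetBits(GPIOE, GPIO_Pin_3);                  // Assert CS
    address |= 0x80;                                    // Set the read flag

    while (!SPI_I2S_GetFlagStatus(SPI1, SPI_FLAG_TXE));
    SPI_I2S_SendData(SPI1, address);                    // Byte 1: register address
    while (!SPI_I2S_GetFlagStatus(SPI1, SPI_FLAG_RXNE));
    SPI_I2S_ReceiveData(SPI1);                          // Discard the byte clocked in alongside the address

    while (!SPI_I2S_GetFlagStatus(SPI1, SPI_FLAG_TXE));
    SPI_I2S_SendData(SPI1, 0x00);                       // Byte 2: dummy byte to clock the register value out
    while (!SPI_I2S_GetFlagStatus(SPI1, SPI_FLAG_RXNE));
    value = SPI_I2S_ReceiveData(SPI1);                  // Read the register value once

    GPIO_SetBits(GPIOE, GPIO_Pin_3);                    // Deassert CS
    return value;
}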

How does one get high-speed UART data on a low-speed MSP430

My project has an MSP430 connected via UART to a Bluegiga Bluetooth module. The MCU must be able to receive variable-length messages from the BG module. In the current architecture, each received byte generates a UART interrupt to allow message processing, and power constraints impose a limit on the clock speed of the MSP430. This makes it difficult for the MSP430 to keep up with any UART speed faster than 9600 bps. The result is a slow communication interface. Speeding up the data rate results in overrun errors, lost bytes, and broken communication.
Any ideas on how communication speed can be increased without sacrificing data integrity in this situation?
I was able to achieve a 12x speed improvement by employing two of the three available DMA channels on the MSP430 to populate a ring buffer that the CPU can then process. It was a bit tricky because the MSP430 DMA interrupts are only generated when the size register reaches zero, so I couldn't populate a ring buffer directly, since the message size is variable.
Using one DMA channel as a single-byte buffer that is triggered on every byte received by the UART, and having that channel in turn trigger a second DMA channel that populates the ring buffer, does the trick.
Below is example C code that illustrates the method. Note that it relies on the MSP430 driver library.
#include "dma.h"
#define BLUETOOTH_RXQUEUE_SIZE <size_of_ring_buffer>
static int headIndex = 0;
static int tailIndex = 0;
static char uartRecvByte;
static char bluetoothRXQueue[BLUETOOTH_RXQUEUE_SIZE];
/*!********************************************************************************
* \brief Initialize DMA hardware
*
* \param none
*
* \return none
*
******************************************************************************/
void init(void)
{
// This is the primary buffer.
// It will be triggered by UART Rx and generate an interrupt.
// It's purpose is to service every byte recieved by the UART while
// also waking up the CPU to let it know something happened.
// It uses a single address of RAM to store the data, so it needs to be
// serviced before the next byte is received from the UART.
// This was done so that any byte received triggers processing of the data.
DMA_initParam dmaSettings;
dmaSettings.channelSelect = DMA_CHANNEL_2;
dmaSettings.transferModeSelect = DMA_TRANSFER_REPEATED_SINGLE;
dmaSettings.transferSize = 1;
dmaSettings.transferUnitSelect = DMA_SIZE_SRCBYTE_DSTBYTE;
dmaSettings.triggerSourceSelect = DMA_TRIGGERSOURCE_20; // USCA1RXIFG, or any UART recieve trigger
dmaSettings.triggerTypeSelect = DMA_TRIGGER_RISINGEDGE;
DMA_init(&dmaSettings);
DMA_setSrcAddress(DMA_CHANNEL_2, (UINT32)&UCA1RXBUF, DMA_DIRECTION_UNCHANGED);
DMA_setDstAddress(DMA_CHANNEL_2, (UINT32)&uartRecvByte, DMA_DIRECTION_UNCHANGED);
// This is the secondary buffer.
// It will be triggered when DMA_CHANNEL_2 copies a byte and will store bytes into a ring buffer.
// It's purpose is to pull data from DMA_CHANNEL_2 as quickly as possible
// and add it to the ring buffer.
dmaSettings.channelSelect = DMA_CHANNEL_0;
dmaSettings.transferModeSelect = DMA_TRANSFER_REPEATED_SINGLE;
dmaSettings.transferSize = BLUETOOTH_RXQUEUE_SIZE;
dmaSettings.transferUnitSelect = DMA_SIZE_SRCBYTE_DSTBYTE;
dmaSettings.triggerSourceSelect = DMA_TRIGGERSOURCE_30; // DMA2IFG
dmaSettings.triggerTypeSelect = DMA_TRIGGER_RISINGEDGE;
DMA_init(&dmaSettings);
DMA_setSrcAddress(DMA_CHANNEL_0, (UINT32)&uartRecvByte, DMA_DIRECTION_UNCHANGED);
DMA_setDstAddress(DMA_CHANNEL_0, (UINT32)&bluetoothRXQueue, DMA_DIRECTION_INCREMENT);
DMA_enableTransfers(DMA_CHANNEL_2);
DMA_enableTransfers(DMA_CHANNEL_0);
DMA_enableInterrupt(DMA_CHANNEL_2);
}
/*!********************************************************************************
 * \brief DMA Interrupt for receipt of data from the Bluegiga module
 *
 * \param none
 *
 * \return none
 *
 * \par Further Detail
 * \n Dependencies: N/A
 * \n Processing: Clear the interrupt and update the circular buffer head
 * \n Error Handling: N/A
 * \n Tests: N/A
 * \n Special Considerations: N/A
 *
 ******************************************************************************/
void DMA_Interrupt(void)
{
    DMA_clearInterrupt(DMA_CHANNEL_2);
    headIndex = BLUETOOTH_RXQUEUE_SIZE - DMA_getTransferSize(DMA_CHANNEL_0);
    if (headIndex == tailIndex)
    {
        // This indicates ring buffer overflow.
    }
    else
    {
        // Perform processing on the current state of the ring buffer here.
        // If only partial data has been received, just leave. Either set a flag
        // or generate an event to process the message outside of the interrupt.
        // Once the message is processed, move the tailIndex.
    }
}
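To complete the picture, a hypothetical consumer that drains the ring buffer from the main loop could look like the sketch below; headIndex, tailIndex and bluetoothRXQueue are the variables from the listing above, and the function name is illustrative only. In real code the tailIndex update would need protecting against the DMA interrupt, e.g. by briefly disabling interrupts.

// Sketch: pop one byte from the ring buffer, if available.
// Returns 1 and stores the byte in *out, or 0 if the buffer is empty.
int bluetoothRxPop(char *out)
{
    if (tailIndex == headIndex)
    {
        return 0;                                       // nothing to read
    }
    *out = bluetoothRXQueue[tailIndex];
    tailIndex = (tailIndex + 1) % BLUETOOTH_RXQUEUE_SIZE;
    return 1;
}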

Can't get my DAC (PT8211) to work correctly using a PIC32MX µC and SPI

I'm just trying to learn to use an external ADC and DAC (PT8211) with my PIC32MX534F06H.
So far, my code just samples a signal with the ADC every time a timer interrupt fires, then sends the same signal out to the DAC.
The interrupt and ADC parts work fine and have been tested independently, but the voltages my DAC outputs don't make much sense to me and stay at 2.5 V (it is powered from 0 to 5 V).
I've tried feeding the DAC various values ranging from 0 to 65534 (it's a 16-bit DAC, so I guess that should be the expected input range, right?); the voltage stays at 2.5 V.
I've tried changing the SPI configuration, using different SPI modules (3 and 4) and different DACs (one soldered to my PCB on SPI3, and one on a breadboard wired to SPI4, in case the one soldered to my board was defective).
I made sure that the chip select line works as expected.
I couldn't check the data and clock being transmitted since I don't have a scope yet.
I'm a bit out of ideas now.
Chip selection and SPI configuration settings
signed short adc_value;
signed short DAC_output_value;
int Empty_SPI3_buffer;

#define Chip_Select_DAC_Set() {LATDSET=_LATE_LATE0_MASK;}
#define Chip_Select_DAC_Clr() {LATDCLR=_LATE_LATE0_MASK;}
#define SPI4_CONF  0b1000010100100000   // SPI on, 16-bit master, CKE=1, CKP=0
#define SPI4_BAUD  100                  // clock divider
DAC output function
// output to external DAC
void DAC_Output(signed int valueDAC)
{
    INTDisableInterrupts();
    Chip_Select_DAC_Clr();
    while (!SPI4STATbits.SPITBE);   // wait for TX buffer to empty
    SPI4BUF = valueDAC;             // write word to TX buffer
    while (!SPI4STATbits.SPIRBF);   // wait for RX buffer to fill
    Empty_SPI3_buffer = SPI4BUF;    // read RX buffer
    Chip_Select_DAC_Set();
    INTEnableInterrupts();
}
ISR sampling the data, triggered by Timer 1. This works fine.
ADC_Input() stores the data in the global variable adc_value (12 bits, signed).
// ISR to sample data
void __ISR(_TIMER_1_VECTOR, IPL7SRS) Test_data_sampling_in(void)
{
    IFS0bits.T1IF = 0;
    ADC_Input();
    // rescale the signed 12-bit audio values to unsigned 16-bit wide values
    DAC_output_value = adc_value + 2048;        // first unsign the signed 12-bit values (0 - 4095, centre 2048)
    DAC_output_value = DAC_output_value * 16;   // the scale factor between 12 and 16 bits is 16 = 65536/4096
    DAC_Output(DAC_output_value);
}
main function with SPI, IO, Timer configuration
void main()
{
    SPI4CON = SPI4_CONF;
    SPI4BRG = SPI4_BAUD;
    TRISE = 0b00100000;
    TRISD = 0b000000110100;
    TRISG = 0b0010000000;
    LATD = 0x0;

    SYSTEMConfigPerformance(80000000L);
    INTCONSET = _INTCON_MVEC_MASK;  /* Set the interrupt controller for multi-vector mode */

    T1CONbits.TON = 0;              /* turn off Timer 1 */
    T1CONbits.TCKPS = 0b11;         /* pre-scale = 1:1 (T1CLKIN = 80MHz (?) ) */
    PR1 = 1816;                     /* T1 period ~ ? */
    TMR1 = 0;                       /* clear Timer 1 counter */

    IPC1bits.T1IP = 7;              /* Set Timer 1 interrupt priority to 7 */
    IFS0bits.T1IF = 0;              /* Reset the Timer 1 interrupt flag */
    IEC0bits.T1IE = 1;              /* Enable interrupts from Timer 1 */
    T1CONbits.TON = 1;              /* Enable Timer 1 peripheral */
    INTEnableInterrupts();

    while (1)
    {
    }
}
I would expect the voltage at the output of my DAC to mimic the voltage I put at the input of my ADC; instead, the DAC output is always constant, no matter what I feed into the ADC.
What am I missing?
Also, when turning the SPI modules on, should I still manually manage the I/O configuration of the SDI/SDO/SCK pins using TRIS, or is that taken care of automatically?
First of all, I agree that the documentation I first found for the PT8211 is rather poor; I found extended documentation here. Your DAC (PT8211) is actually an I2S device, not SPI. WS is not chip select, it is word select (left/right channel). In I2S, setting WS to 0 normally means the left channel; however, in the extended datasheet I found, WS = 0 is actually the right channel (go figure).
The PIC you've chosen doesn't seem to have any I2S hardware, so you might have to bit-bang it. There is a lot of information on I2S, though; see the I2S bus specification.
There are some slight differences between SPI and I2S. Notice that the first bit clocked after a WS transition still belongs to the previous word: for example, the first bit after WS goes from high to low is the LSB of the right-channel word, not the MSB of the new word. Note that the output should be between 0.4 V and 2.4 V (per the I2S standard), not between 0 and 5 V. (The maximum is about 2.5 V, which is what you've been seeing.)
[I2S bus timing diagram]
Basically, I'd try the proper protocol first, with a bit-banging algorithm that continuously flip-flops between the left and right channels. A rough sketch follows.
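As a purely illustrative starting point, a heavily simplified bit-banged transmit could look like the sketch below. BCK_HIGH/BCK_LOW, DIN_HIGH/DIN_LOW and WS_HIGH/WS_LOW are hypothetical GPIO macros (e.g. LATxSET/LATxCLR writes) for whichever pins you wire to the PT8211, and the sketch ignores the WS/LSB alignment discussed above, so the exact justification and WS timing still need to be checked against the PT8211 datasheet.

// Shift out one 16-bit word, MSB first: data changes while BCK is low
// and is assumed to be latched on the rising edge of BCK.
static void i2s_shift_word(unsigned short word)
{
    int bit;
    for (bit = 15; bit >= 0; bit--)
    {
        BCK_LOW();
        if (word & (1u << bit))
            DIN_HIGH();
        else
            DIN_LOW();
        BCK_HIGH();
    }
}

// Send one stereo sample by flip-flopping WS between the two channels.
void i2s_write_sample(unsigned short left, unsigned short right)
{
    WS_LOW();               // one channel (check the datasheet: the PT8211 may treat WS=0 as right)
    i2s_shift_word(left);
    WS_HIGH();              // the other channel
    i2s_shift_word(right);
}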
First of all, thanks a lot for your comment. It helps a lot to know that I'm not looking at an SPI transmission, and that explains why it's not working.
A few reflections about it:
I googled bit banging and it seems to be CPU intensive, which I would definitely try to avoid.
I have seen a (successful) project (in MikroC) where someone transmits data from that exact same PIC to the same DAC, using SPI, with apparently no problems whatsoever. So I guess it SHOULD work, somehow?
Maybe he's transforming the data so that it works? I'm not sure what happens with the F15 bit toggle; I was thinking it was done to manage the LSB shift problem. Here is the piece of (working) MikroC code I'm talking about:
valueDAC = valueDAC + 32768;
valueDAC.F15 = ~valueDAC.F15;
Chip_Select_DAC = 0;
SPI3_Write(valueDAC);
Chip_Select_DAC = 1;
From my understanding, the two biggest differences between SPI and I2S are that SPI sends "bursts" of data whereas I2S sends data continuously, and that the data sent right after the word-select change is the LSB of the previous word.
So I was thinking: my SPI transfer is triggered by a timer, which is always the same, so even if the data is not sent continuously it will just make the sound wave a bit more 'aliased', and if it's triggered regularly enough (say at 44 kHz) it should not be SO different from sending I2S data at the same rate, right?
If that is so, and I understand correctly, the 'only' problem left is to manage the LSB-next-word-MSB placement problem, but I thought that the LSB is virtually negligible over 16-bit values, so if I could just bit-shift my value to the right and then fix the LSB to 0 or 1, the error would be small and the format would be right.
Does it sound like I have a valid 'MacGyver-I2S-from-my-SPI', or am I forgetting something important?
I have tried to implement it, so far without success, but I need to check my SPI configuration since I'm not sure it's configured correctly.
Here is the code so far
SPI config
#define Chip_Select_DAC_Set() {LATDSET=_LATE_LATE0_MASK;}
#define Chip_Select_DAC_Clr() {LATDCLR=_LATE_LATE0_MASK;}
#define SPI4_CONF 0b1000010100100000
#define SPI4_BAUD 20
DAC output function
// output audio to external DAC
void DAC_Output(signed int valueDAC)
{
    INTDisableInterrupts();
    valueDAC = valueDAC >> 1;       // put the MSB of valueDAC one bit to the right (because the MSB of
                                    // what is transmitted will be seen by the DAC as the LSB of the
                                    // last value, after a word-select change)

    // Left channel
    Chip_Select_DAC_Set();          // Select left channel
    SPI4BUF = valueDAC;
    while (!SPI4STATbits.SPITBE);   // wait for TX buffer to empty
    SPI4BUF = valueDAC;             // write 16-bit word to TX buffer
    while (!SPI4STATbits.SPIRBF);   // wait for RX buffer to fill
    Empty_SPI3_buffer = SPI4BUF;    // read RX buffer (don't know why we need to do this here, but we do)
    //SPI3_Write(valueDAC);         // MikroC option

    // Right channel
    Chip_Select_DAC_Clr();
    SPI4BUF = valueDAC;
    while (!SPI4STATbits.SPITBE);   // wait for TX buffer to empty
    SPI4BUF = valueDAC;             // write 16-bit word to TX buffer
    while (!SPI4STATbits.SPIRBF);   // wait for RX buffer to fill
    Empty_SPI3_buffer = SPI4BUF;
    INTEnableInterrupts();
}
The data I send here is signed, in the 16-bit range; I think you said that's all right with this DAC, right?
Or maybe I could use framed SPI? The clock seems to be continuous in that mode, but I would still have the LSB/MSB shifting problem to solve.
I'm a bit lost here, so any help would be appreciated.

Every SPI send results in receiving a 0 on MSP430

I run this simplified program for SPI communication on a TI MSP430FR5969 on its corresponding LaunchPad, MSP-EXP430FR5969, and set breakpoints just before TX and just after RX in CCS (Code Composer Studio). The breakpoints are labelled with comments.
My LaunchPad is not connected to anything. (Once I figure this out, I intend to connect it to some other device for real communication.)
I do not expect to receive any data because the LaunchPad is not connected to anything, yet I receive exactly one zero for every send. The breakpoints are hit in alternating order, starting with the first TX breakpoint.
Why am I receiving data? Is it because I need to enable pull-up resistors on some of the pins? I believe the LaunchPad itself uses the USCI "A" module(s), so the "B" module that I am using should have nothing connected to it.
#include <msp430.h>

int main(void) {
    WDTCTL = WDTPW | WDTHOLD;

    P1SEL0 &= ~BIT3;                // UCB0STE
    P1SEL0 &= ~BIT6;                // UCB0SIMO
    P1SEL0 &= ~BIT7;                // UCB0SOMI
    P2SEL0 &= ~BIT2;                // UCB0CLK
    P1SEL1 |= BIT3;                 // UCB0STE
    P1SEL1 |= BIT6;                 // UCB0SIMO
    P1SEL1 |= BIT7;                 // UCB0SOMI
    P2SEL1 |= BIT2;                 // UCB0CLK

    PM5CTL0 &= ~LOCKLPM5;

    CSCTL0_H = CSKEY_H;
    CSCTL1 &= ~DCORSEL;
    CSCTL1 = (CSCTL1 & ~0x000e) | DCOFSEL_0;    // 1 MHz
    CSCTL3 |= DIVA__1 | DIVS__1 | DIVM__1;      // clock dividers = 1
    CSCTL0_H = 0;

    UCB0CTLW0 |= UCSWRST;
    UCB0CTLW0 |= UCCKPH;
    UCB0CTLW0 |= UCCKPL;
    UCB0CTLW0 |= UCMSB;
    UCB0CTLW0 |= UCMST;
    UCB0CTLW0 |= UCMODE_2;
    UCB0CTLW0 |= UCSYNC;
    UCB0CTLW0 |= UCSSEL__SMCLK;
    UCB0CTLW0 |= UCSTEM;
    // UCB0STATW |= UCLISTEN;       // OK, if enabled I receive what I send
    UCB0CTLW0 &= ~UCSWRST;
    UCB0IE |= UCRXIE;

    _enable_interrupts();
    _delay_cycles(100000);

    int send = 0;
    while (1) {
        while (!(UCB0IFG & UCTXIFG));
        UCB0TXBUF = send;           // BREAKPOINT 1
        send = (send + 1) % 100;
        _delay_cycles(100000);
    }
    return 0;
}
#pragma vector = USCI_B0_VECTOR
__interrupt void isr_usci_b0(void) {
    static volatile int received = 0;
    switch (__even_in_range(UCB0IV, USCI_SPI_UCTXIFG)) {
    case USCI_NONE:
        break;
    case USCI_SPI_UCRXIFG:
        received = UCB0RXBUF;
        UCB0IFG &= ~UCRXIFG;        // BREAKPOINT 2
        _no_operation();
        break;
    case USCI_SPI_UCTXIFG:
        break;
    }
}
The SPI peripheral does two things if MISO and MOSI are enabled (and CLK as well, of course). Assuming master-mode operation, it clocks out data from the TX shift register on the MOSI line and simultaneously clocks data into the RX shift register from the MISO line.
In your circuit, the MISO input is floating since you have not enabled either the internal pull-up or pull-down resistor. Thus, observing 0x00 would not be out of the ordinary. If you had enabled the pull-up resistor, you would have seen 0xFF in the receive buffer.
Another rule of thumb:
If you are using the peripheral functions, configure the GPIO pins of the MSP430 as output/input accordingly (i.e. MOSI, CLK = output; MISO = input, for SPI master mode). A sketch of enabling a pull-up on the MISO pin is shown below.
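For example, with the pin mapping used in the question (P1.7 = UCB0SOMI, i.e. MISO in master mode), enabling the internal pull-up might look like the sketch below. Whether the internal resistor is honoured while the pin is in peripheral-function mode should be verified in the device user's guide; an external pull-up is the unambiguous alternative.

// Sketch: enable the internal pull-up on P1.7 (UCB0SOMI / MISO in the code above)
// so a floating MISO line reads back as 0xFF instead of 0x00.
P1REN |= BIT7;      // enable the internal pull resistor on P1.7
P1OUT |= BIT7;      // with PxREN set, PxOUT = 1 selects pull-up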
Answers to the questions in the comments:
The MSP430 is configured in the listed code to be the SPI master. I see little point in using a dedicated RX interrupt service routine, unless you want the controller to do something else in the time between shifting data from the TX buffer to the shift register and shifting data from the RX shift register to the RX buffer, i.e. one "byte" transfer period. You could just as well have polled for the RX interrupt flag as you did for the TX flag. But you must wait for the RX flag.
Excerpt from the user guide:
The eUSCI initiates data transfer when data is moved to the transmit data buffer UCxTXBUF. The UCxTXBUF data is moved to the transmit (TX) shift register when the TX shift register is empty, initiating data transfer on UCxSIMO starting with either the MSB or LSB, depending on the UCMSB setting. Data on UCxSOMI is shifted into the receive shift register on the opposite clock edge. When the character is received, the receive data is moved from the receive (RX) shift register to the received data buffer UCxRXBUF and the receive interrupt flag UCRXIFG is set, indicating the RX/TX operation is complete.
A set transmit interrupt flag, UCTXIFG, indicates that data has moved from UCxTXBUF to the TX shift register and UCxTXBUF is ready for new data. It does not indicate RX/TX completion.
To receive data into the eUSCI in master mode, data must be written to UCxTXBUF, because receive and transmit operations operate concurrently.
The client will not send data to the MSP430 by itself. The client device may need some time to execute the command the master just sent, typically something like an "erase flash" command for SPI flash chips.
In this case the master, i.e. the MSP430, must poll the client device to see if it has completed the command or has data to send. This is typically done either by polling a status register of the client device or by using a dedicated IRQ line: the client signals "completion of command"/"availability of data" via the status byte (or IRQ), and on this event the master can read the data out of the client.
At first glance it may seem rather counter-intuitive that data (dummy bytes) needs to be written in order to read data; perhaps that is your source of confusion as well :)
Perhaps reading about an SPI client may help, for example this SPI memory.
The SPI peripheral transmits a bit and receives a bit for every clock cycle. Instead of wondering how some unconnected device has sent a byte, think that your SPI peripheral has clocked in a receive byte even though nothing is connected. The byte you receive is 0 because the MISO line happens to be low while nothing is connected.
The SPI peripheral does not know the meaning of the data and does not know how many bytes must be transmitted and received for any particular command. It's up to your application to know when to transmit and receive dummy bytes. For example, if the slave responds to a command in the next byte then your application has to transmit two bytes (the command byte followed by a dummy byte) and at the same time receive two bytes (a dummy byte, followed by the response). Some slaves may send a generic status byte instead of a dummy byte as the first byte of all responses. It's up to your application to use or ignore the status byte.
The MSP430's SPI documentation is not going to tell you when you need to send/receive dummy bytes. You'll have to read the SPI documentation for the slave device for that information. Each slave may have different requirements. Some slaves may receive a command byte and send a reply. Other slaves may receive a command and address byte before sending a reply. Some slaves may reply with multiple bytes. You'll have to program your application to transmit/receive the appropriate number of bytes.
There are no stop bits. The master is both transmitting and receiving with every clock. If you want to stop receiving then stop transmitting. If you want to continue receiving then transmit dummy bytes.
Yes, you should use the RX interrupt. The RX interrupt indicates that it is safe for your application to read the received byte from the RX register. The SPI peripheral shifts receive bits into a shift register with each clock, but after a byte has been received it still has to copy the contents of the shift register to the RX register and then set the RX interrupt flag. You shouldn't assume that the received byte can be read from the RX register until the RX interrupt is indicated. A polled sketch of the command-plus-dummy-byte pattern described above is shown below.
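For illustration, a minimal polled helper for the "command byte followed by a dummy byte" pattern on the eUSCI_B0 registers from the question might look like the sketch below. The function names, and the assumption that the response arrives in the second byte, are hypothetical; the slave's datasheet dictates the actual byte counts. It also assumes UCRXIE is disabled, since the ISR in the question would otherwise consume UCRXIFG first.

// Sketch: transmit one byte and return the byte clocked in at the same time.
static unsigned char spi_transfer(unsigned char tx)
{
    while (!(UCB0IFG & UCTXIFG));   // wait until the TX buffer can accept data
    UCB0TXBUF = tx;                 // start the transfer
    while (!(UCB0IFG & UCRXIFG));   // wait until the whole byte has been received
    return UCB0RXBUF;               // reading RXBUF clears UCRXIFG
}

// Sketch: hypothetical one-byte command with a one-byte response.
unsigned char spi_read_response(unsigned char command)
{
    (void)spi_transfer(command);    // byte 1: the slave's reply here is a dummy/status byte
    return spi_transfer(0x00);      // byte 2: clock out a dummy byte to receive the response
}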
The behavior you describe is to be expected. With SPI, it is movement on the clock line that indicates the presence of data. The input line can be idle and data will still be "received", because in order to send a byte the clock must be toggled back and forth to latch the transmitted data, and at the same time data is latched into the receive buffer.
An SPI bus is intended to be a closed pathway. The TX line from your processor goes out, is daisy-chained through one or more slave devices, and then loops back to your RX pin. Every transition on the clock line drives a data bit. This means that for every transition your hardware will shift one bit into your receive buffer; it's up to your code to know how many of those bits to discard before you start reading real data.
You are reading 0s because nothing is driving the RX pin. When you're connected to a real device, the first several bytes you send will also likely produce 0x00 on your RX pin. Usually you'll have to send some sort of command byte to the slave device, which will then start sending real data. The length of that command should be discarded, because the slave will not have started driving its output pin until the command byte (word, string, whatever) is complete.

Tx LPC2148 with XBee in API mode to Rx XBee with relay does not work

I have started working with an LPC2148 and XBee Series 2 modules.
On the transmitting side I am using the LPC2148 with an XBee coordinator in API mode, and on the Rx side I am using an XBee on a shield in router AT mode.
I want the remote XBee to drive its D3 pin high, which could be used to turn on a relay on the Rx side.
The API frame is built as in the C code below.
#define Delimeter 0x7E

void Init_UART1(void)   // This function sets up UART1
{
    unsigned int Baud16;
    U1LCR = 0x83;                       // DLAB = 1
    Baud16 = (Fpclk / 16) / UART_BPS;
    U1DLM = Baud16 / 256;
    U1DLL = Baud16 % 256;
    U1LCR = 0x03;
}

void setRemoteState(char value)
{
    unsigned int sum;

    UART1_Write(Delimeter);             // start byte
    UART1_Write(0);                     // high part of length
    UART1_Write(0x10);                  // low part of length
    UART1_Write(0x17);                  // remote AT command
    UART1_Write(0x0);                   // frame id, 0 for no reply
    UART1_Write(0x0);
    UART1_Write(0x0);
    UART1_Write(0x0);
    UART1_Write(0x0);
    UART1_Write(0x0);
    UART1_Write(0x0);
    UART1_Write(0xFF);                  // broadcast
    UART1_Write(0xFF);                  // broadcast
    UART1_Write(0xFF);
    UART1_Write(0xFE);
    UART1_Write(0x02);                  // apply changes immediately on remote
    UART1_Write('D');                   // writing to the AD3 pin
    UART1_Write('3');
    UART1_Write(value);
    sum = 0x17 + 0xFF + 0xFF + 0xFF + 0xFE + 0x02 + 'D' + '3' + value;
    UART1_Write(0xFF - (0xFF & sum));   // checksum
    Delay(25);
}

void main()
{
    Init_UART1();
    LED1_ON();
    setRemoteState(0x5);                // AD3 configured as digital output, HIGH
    Delay(25);
    LED1_OFF();
    setRemoteState(0x4);                // AD3 configured as digital output, LOW
    Delay(25);
}
I am not able to get any communication or data on my Rx side; the D3 pin voltage stays low.
Please guide me on this point.
This program works fine with an Arduino using the Serial.write function.
Regards,
Vijay
Are you using the correct baud rate? Are you sure you've connected TX/RX correctly and haven't crossed them? If you have hardware flow control enabled, is the RTS signal into the XBee asserted? Is the XBee module powering up and receiving enough current?
If you monitor the XBee transmit signal on another device (e.g. a computer via FTDI's TTL-to-USB cable), are you seeing bytes at startup (I believe it sends a Modem Status frame during its startup)? If you monitor the LPC2148 transmit signal, are you seeing the byte stream you think you're sending (confirming that you're driving UART1 correctly)?
Can you tell if the XBee module is receiving your requests, perhaps by toggling ATD0 between high and low output and checking with an LED or scope? Do you have any hardware you can use to monitor the serial stream between the two devices to see if it's sending the bytes you think you're sending? Are you sure it's calculating the correct checksum? Dump the bytes somehow and try running them through X-CTU to see if they work; a sketch of building and checking the frame in a buffer is shown below.
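As an aid to that last point, one option is to assemble the frame into a buffer first, so the bytes (and the checksum) can be dumped and compared against X-CTU before anything touches the UART. The sketch below mirrors the frame from the question; UART1_Write and the frame layout are taken from the original code, while the buffer and helper names are just for illustration.

// Sketch: build the Remote AT Command frame in a buffer, compute the checksum
// over the bytes after the length field, then send (and/or dump) the whole frame.
static void sendRemoteState(char value)
{
    unsigned char frame[] = {
        0x7E,                   // start delimiter
        0x00, 0x10,             // length = 16 bytes that follow, excluding checksum
        0x17,                   // Remote AT Command request
        0x00,                   // frame ID 0: no response requested
        0x00, 0x00, 0x00, 0x00, // 64-bit destination address...
        0x00, 0x00, 0xFF, 0xFF, // ...0x000000000000FFFF = broadcast
        0xFF, 0xFE,             // 16-bit destination address: unknown/broadcast
        0x02,                   // apply changes immediately
        'D', '3',               // AT command "D3"
        (unsigned char)value    // parameter: 0x04 = output low, 0x05 = output high
    };
    unsigned int i, sum = 0;

    for (i = 3; i < sizeof(frame); i++)     // checksum covers everything after the length field
        sum += frame[i];

    for (i = 0; i < sizeof(frame); i++)
        UART1_Write(frame[i]);
    UART1_Write((unsigned char)(0xFF - (sum & 0xFF)));
}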
If you're going to be doing a lot of communications between the LPC2148 and the XBee module, you might want to try porting this Open Source ANSI C XBee Host Library to the platform. It includes multiple layers of XBee frame processing that should reduce the amount of software you need to write.