I have an application running on STM32F429ZIT6 using USB stack to communicate with PC client.
The MCU receives one type of message of 686 bytes every second, followed by another message of 14 bytes, with a 0.5-second delay between them. The 14-byte message is a heartbeat, so the MCU needs to reply to it.
After 5 to 10 minutes of continuous operation, the MCU is no longer able to send data because
hcdc->TxState is always busy. Reception keeps working fine.
In the Rx interrupt, the application only pushes the data into a ring buffer; the buffer is later read out and processed by the main function.
static int8_t CDC_Receive_HS(uint8_t* Buf, uint32_t *Len)
{
  /* USER CODE BEGIN 11 */
  /* Message RX completed, push it to the ring buffer to be processed in FMC_Run() */
  for (uint16_t i = 0; i < *Len; i++)
  {
    ring_push(RMP_RXRingBuffer, (uint8_t *) &Buf[i]);
  }
  USBD_CDC_SetRxBuffer(&hUsbDeviceHS, &Buf[0]);
  USBD_CDC_ReceivePacket(&hUsbDeviceHS);
  return (USBD_OK);
  /* USER CODE END 11 */
}
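For context, this is roughly how the consumer side could look in the main loop. It is only a sketch: ring_pop(), MSG_MAX, and this FMC_Run() body are hypothetical and not the actual project code.
/* Hypothetical consumer sketch: ring_pop() and MSG_MAX are assumptions,
 * only meant to illustrate the ring-buffer hand-off described above. */
void FMC_Run(void)
{
  static uint8_t msg[MSG_MAX];
  static uint16_t len = 0;
  uint8_t byte;

  /* Drain whatever CDC_Receive_HS() has pushed since the last call */
  while (ring_pop(RMP_RXRingBuffer, &byte) == 0)
  {
    msg[len++] = byte;
    /* ... detect the end of a 686-byte or 14-byte frame here and process it ... */
    if (len >= MSG_MAX)
    {
      len = 0;  /* guard against overflow in this sketch */
    }
  }
}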
USB TX is also kept as simple as possible:
uint8_t CDC_Transmit_HS(uint8_t* Buf, uint16_t Len)
{
  uint8_t result = USBD_OK;
  /* USER CODE BEGIN 12 */
  USBD_CDC_HandleTypeDef *hcdc = (USBD_CDC_HandleTypeDef*)hUsbDeviceHS.pClassData;
  if (hcdc->TxState != 0)
  {
    ZF_LOGE("Tx failed, resource busy\n\r");
    return USBD_BUSY;
  }
  USBD_CDC_SetTxBuffer(&hUsbDeviceHS, Buf, Len);
  result = USBD_CDC_TransmitPacket(&hUsbDeviceHS);
  ZF_LOGD("TX Message Result:%d\n\r", result);
  /* USER CODE END 12 */
  return result;
}
I'm using the latest HAL drivers and software from CubeIDE (1.27.1).
I have tried increasing the minimum heap size from 0x200 to larger values, but the result is the same.
Also, the line coding is set according to the recommended values:
case CDC_SET_LINE_CODING:
  LineCoding.bitrate = (uint32_t) (pbuf[0] | (pbuf[1] << 8) | (pbuf[2] << 16) | (pbuf[3] << 24));
  LineCoding.format = pbuf[4];
  LineCoding.paritytype = pbuf[5];
  LineCoding.datatype = pbuf[6];
  ZF_LOGD("Line Coding Set\n\r");
  break;
case CDC_GET_LINE_CODING:
  pbuf[0] = (uint8_t) (LineCoding.bitrate);
  pbuf[1] = (uint8_t) (LineCoding.bitrate >> 8);
  pbuf[2] = (uint8_t) (LineCoding.bitrate >> 16);
  pbuf[3] = (uint8_t) (LineCoding.bitrate >> 24);
  pbuf[4] = LineCoding.format;
  pbuf[5] = LineCoding.paritytype;
  pbuf[6] = LineCoding.datatype;
  ZF_LOGD("Line Coding Get\n\r");
  break;
Thanks in advance, any support is appreciated.
I don't know enough about the STM32 libraries to really check your code, but I suspect you are forgetting to read the bytes transmitted by the STM32 on the PC side. Try opening a terminal program like PuTTY and connecting to the STM32's virtual serial port. Otherwise, the Windows USB-to-serial driver (usbser.sys) will eventually have its buffers filled with data from your device and will stop requesting more, at which point the buffers on your device will fill up as well.
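If you prefer to test without a terminal program, a minimal host-side reader is enough to keep the driver's buffers drained. Below is a sketch for a Linux host (the device node /dev/ttyACM0 is an assumption; on Windows you would open the corresponding COM port instead):
/* Minimal sketch: keep reading the CDC port so the host never stops
 * requesting data. Assumes a Linux host and /dev/ttyACM0. */
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/ttyACM0", O_RDONLY | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);               /* raw mode: no line editing, no echo */
    tcsetattr(fd, TCSANOW, &tio);

    unsigned char buf[1024];
    for (;;) {
        ssize_t n = read(fd, buf, sizeof buf);
        if (n <= 0) break;
        fwrite(buf, 1, (size_t)n, stdout);   /* or simply discard the data */
    }
    close(fd);
    return 0;
}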
Related
I am trying to get USB device code working on my stm32f401 microcontroller. So far, I'm able to read the packets from the host, send the first device descriptor, and receive and apply the assigned address. After that, the host requests the device descriptor again, and my device doesn't respond to that for some reason. I've been fighting this issue for weeks now, and it seems I need help with it.
In my code, after initializing the peripherals, I set up my endpoint 0 and set its Tx FIFO size to 16. When the device receives a packet, it reads the packet status and reads the packet content in function setup_packet_handler(). This function prints the content word by word:
uint32_t data = *fifo;
printf("Rx from FIFO: 0x%08x\n", data);
void rxflvl_handler() {
  uint32_t packet_status = USB_OTG_FS->GRXSTSP;
  uint8_t endpoint_number = (packet_status & USB_OTG_GRXSTSP_EPNUM_Msk) >> USB_OTG_GRXSTSP_EPNUM_Pos;
  uint16_t bcnt = (packet_status & USB_OTG_GRXSTSP_BCNT_Msk) >> USB_OTG_GRXSTSP_BCNT_Pos;
  uint16_t pktsts = (packet_status & USB_OTG_GRXSTSP_PKTSTS_Msk) >> USB_OTG_GRXSTSP_PKTSTS_Pos;
  printf("Received packet of status 0x%02x\n", pktsts);
  switch (pktsts) {
    case 0x06: // SETUP packet (includes data)
      setup_packet_handler(endpoint_number, bcnt);
      break;
    case 0x02: // OUT packet (includes data)
      break;
    case 0x04: // SETUP stage has completed
    case 0x03: // OUT transfer has completed
      // Re-enable the endpoint
      OUT_ENDPOINT(endpoint_number)->DOEPCTL |= USB_OTG_DOEPCTL_CNAK;
      OUT_ENDPOINT(endpoint_number)->DOEPCTL |= USB_OTG_DOEPCTL_EPENA;
      break;
  }
}
My main() function watches the current number of packets to send and free space of the Tx FIFO for endpoint 0:
int main(void)
{
  /* Pins initialization */
  /* USB initialization */
  in_packets = (in_endpoint->DIEPTSIZ & USB_OTG_DIEPTSIZ_PKTCNT_Msk) >> USB_OTG_DIEPTSIZ_PKTCNT_Pos;
  fifo_space = in_endpoint->DTXFSTS & USB_OTG_DTXFSTS_INEPTFSAV_Msk;
  printf("FIFO free space: %i\n", fifo_space);
  while (1)
  {
    // Checking for USB interrupts
    usb_poll();
    // Notify if the number of packets changed
    uint32_t npkts = (in_endpoint->DIEPTSIZ & USB_OTG_DIEPTSIZ_PKTCNT_Msk) >> USB_OTG_DIEPTSIZ_PKTCNT_Pos;
    if (npkts != in_packets) {
      printf("Packets number update: %i\n", npkts);
      in_packets = npkts;
    }
    // Notify if the FIFO space changed
    uint16_t new_space = in_endpoint->DTXFSTS & USB_OTG_DTXFSTS_INEPTFSAV_Msk;
    if (new_space != fifo_space) {
      fifo_space = new_space;
      printf("FIFO free space: %i\n", new_space);
    }
  }
}
Now the broken part: the code which sends the device descriptor. The function handle_descriptor_request() is called when a device descriptor request is received.
It sets the transfer size to 18 and the packet count to 3 (endpoint 0's max packet size is configured to 8), enables the endpoint, and fills the FIFO. After that, it prints the FIFO free space to make sure it has decreased.
void handle_descriptor_request(void) {
  printf("Received descriptor request\n");
  uint32_t* fifo = (uint32_t*)(USB_OTG_FS_PERIPH_BASE + USB_OTG_FIFO_BASE +
      0 * 0x1000);
  USB_OTG_INEndpointTypeDef* in_endpoint = (USB_OTG_INEndpointTypeDef*)(
      USB_OTG_FS_PERIPH_BASE +
      USB_OTG_IN_ENDPOINT_BASE +
      0
  );
  // flush_txfifo(0);
  // in_endpoint->DIEPTSIZ = 0;
  // in_endpoint->DIEPCTL |= USB_OTG_DIEPCTL_SNAK;
  // in_endpoint->DIEPCTL |= USB_OTG_DIEPCTL_EPDIS;
  printf("Manual PKTCNT to 3\n");
  // Set Tx size to 18
  in_endpoint->DIEPTSIZ &= ~USB_OTG_DIEPTSIZ_XFRSIZ_Msk;
  in_endpoint->DIEPTSIZ &= ~USB_OTG_DIEPTSIZ_PKTCNT_Msk;
  in_endpoint->DIEPTSIZ |= 18 << USB_OTG_DIEPTSIZ_XFRSIZ_Pos;
  // Send 3 packets (8 bytes max for each)
  in_endpoint->DIEPTSIZ |= 3 << USB_OTG_DIEPTSIZ_PKTCNT_Pos;
  // Enable transmission
  in_endpoint->DIEPCTL |= USB_OTG_DIEPCTL_CNAK;
  in_endpoint->DIEPCTL |= USB_OTG_DIEPCTL_EPENA;
  // Write a valid device descriptor to the Tx FIFO
  *fifo = 0x02000112;
  *fifo = 0x08000000;
  *fifo = 0x13aa6666;
  *fifo = 0x00000000;
  *fifo = 0x0000012c;
  fifo_space = in_endpoint->DTXFSTS & USB_OTG_DTXFSTS_INEPTFSAV_Msk;
  printf("FIFO free space: %i\n", fifo_space);
}
Below is the log I'm getting from the device. I see that the first descriptor request is handled successfully: the FIFO free space drops from 16 to 11, then back to 16. The number of packets to send decreases from 3 to 2 (although it should decrease to 0, right?). I also see the descriptor in Wireshark. Then the device address is received and answered with a zero-length packet; after that something goes wrong. The FIFO free space keeps decreasing with every descriptor request, and the packets don't seem to leave the device.
USB reset received
FIFO free space: 16
INEPNE Endpoint interrupt
TXFE Endpoint interrupt
// First descriptor request. Sent successfully
Received packet of status 0x06
// Device descriptor request in binary form
Rx from FIFO: 0x01000680
Rx from FIFO: 0x00400000
// Output from handle_descriptor_request()
Received descriptor request
Manual PKTCNT to 3
FIFO free space: 11
// Output from main()
Packets number update: 2
FIFO free space: 16
// Output from rxflvl_handler()
Received packet of status 0x04
Received packet of status 0x02
Received packet of status 0x03
INEPNE Endpoint interrupt
TXFE Endpoint interrupt
// Received address
Received packet of status 0x06
// Set address request in binary form
Rx from FIFO: 0x001d0500
Rx from FIFO: 0x00000000
Received address: 29
Writing zero-length packet
Manual PKTCNT to 1
Packets number update: 1
Received packet of status 0x04
Packets number update: 0
TXFRC Endpoint interrupt
TXFE Endpoint interrupt
// Another device descriptor request
Received packet of status 0x06
Rx from FIFO: 0x01000680
Rx from FIFO: 0x00120000
Received descriptor request
Manual PKTCNT to 3
FIFO free space: 11
Packets number update: 3
Received packet of status 0x04
// Looks unsuccessful. Try again...
Received packet of status 0x06
Rx from FIFO: 0x01000680
Rx from FIFO: 0x00120000
Received descriptor request
Manual PKTCNT to 3
FIFO free space: 6
// And so on. The FIFO free space decreases to 0
Here are the Wireshark views of the first and second descriptor responses. The second one fails with EOVERFLOW (-75). Sometimes I see a -71 error when I run 'dmesg' on my host. I can't figure out the reason for that.
If I flush the 0th FIFO before the descriptor transmission, i.e. uncomment
flush_txfifo(0);
things don't change much: the FIFO free space stops decreasing below 11, but the packets still don't seem to be sent.
If I recover the endpoint before the descriptor transmission by uncommenting
in_endpoint->DIEPTSIZ = 0;
in_endpoint->DIEPCTL |= USB_OTG_DIEPCTL_SNAK;
in_endpoint->DIEPCTL |= USB_OTG_DIEPCTL_EPDIS;
then, starting with the second try, the device seems to start sending something, but something is still wrong. The FIFO free space only comes back to 15 words instead of 16:
// ...
Rx from FIFO: 0x01000680
Rx from FIFO: 0x00120000
Received descriptor request
Manual PKTCNT to 3
FIFO free space: 11
FIFO free space: 15
Received packet of status 0x04
Received packet of status 0x02
Received packet of status 0x03
Also, I can't understand why the number of packets drops from 3 to 2 for the successful transmission. I can see the whole descriptor (18 bytes) in Wireshark, and the endpoint size is configured to 8.
Could you point me to things I may need to check?
I have connected two TI controllers via SPI. The TMS320F28055 is my master and the TMS320F28035 is my slave. I want to send complete data to the slave via SPI. The data always ends up successfully in the SPIDAT register, i.e. the shift register. The shift register should then move the data into the SPIRXBUF buffer. Sometimes the data makes it into the buffer and sometimes it doesn't; it looks random, and I have already tried a lot. I don't use a FIFO. Does anyone know how I can fix the problem?
I made a table showing the data in the master and slave registers. I also include the configuration of the master and slave below.
void spi_init(void)
{
  SpiaRegs.SPICTL.all = 0x000E;        // Normal SPI clocking scheme (data latched on rising edge), master, 4-pin option, no interrupt
  SpiaRegs.SPICTL.bit.CLK_PHASE = 1;
  SpiaRegs.SPIBRR = 0x0077;            // Baud rate 0.5 MHz
  SpiaRegs.SPICCR.all = 0x0087;        // SPI is ready to transmit or receive the next character
  SpiaRegs.SPICCR.bit.CLKPOLARITY = 0;
  SpiaRegs.SPIPRI.bit.FREE = 1;
}
The code above is from my master, the TMS320F28055.
void spi_init(void)
{
  SpiaRegs.SPICCR.bit.SPISWRESET = 0;
  SpiaRegs.SPICTL.all = 0x000A;        // Normal SPI clocking scheme (data latched on rising edge), slave, 4-pin option, no interrupt
  SpiaRegs.SPICTL.bit.CLK_PHASE = 1;
  SpiaRegs.SPIBRR = 0x0077;            // Baud rate 0.5 MHz; not actually needed for the slave
  SpiaRegs.SPICCR.all = 0x0087;        // SPI is ready to transmit or receive the next character
  SpiaRegs.SPICCR.bit.CLKPOLARITY = 0;
  SpiaRegs.SPICTL.bit.SPIINTENA = 1;
  SpiaRegs.SPICTL.bit.OVERRUNINTENA = 1;
  SpiaRegs.SPIPRI.bit.FREE = 1;
  SpiaRegs.SPICCR.bit.SPISWRESET = 1;
}
And the code above is from my slave, the TMS320F28035.
I'm using an interrupt here, but I've also tried it without one.
uint16_t pdata = 0x1234;
int dataH, dataL;
dataH = 0;
dataL = 0;
dataH = (pdata >> 8);
dataL = (pdata & 0x00FF);
spi_xmit(dataH);
spi_xmit(dataL);
With that I send example data, in this case 0x1234. When I send it once, it arrives successfully in the shift register and the buffer. But if I send it repeatedly, the shift register does not always move the data into the buffer. To check this, I debug both microcontrollers at the same time. By the way, I send 8 bits twice in a row; the buffer is 16 bits wide.
I am trying to create a UART bridge using an MSP430. I have a sensor sending strings to the MSP430, which I intend to forward to my PC. Additionally, the sensor responds to commands, which I intend to send from my PC through the MSP430 bridge. The commands I send to the sensor reach it without any flaw. However, the messages sent by the sensor reach the TXBUF of the UART connected to my PC but do not appear on the terminal. On checking the registers I see 0x000A in the TXBUF, and it appears to receive all the characters, but nothing is printed.
I am using the following code:
#include <msp430.h>
unsigned char *msg;
unsigned char i=0 , j=0;
int main(void)
{
WDTCTL = WDTPW | WDTHOLD; // stop watchdog timer
// Pin Initialization
P6SEL1 |= BIT1;
P6SEL0 &= ~BIT1;
P6SEL1 |= BIT0;
P6SEL0 &= ~BIT0;
P2SEL1 |= BIT5;
P2SEL0 &= ~BIT5;
P2SEL1 |= BIT6;
P2SEL0 &= ~BIT6;
PM5CTL0 &= ~LOCKLPM5;
// UART Initialization
UCA1CTLW0 |= UCSWRST;
UCA1CTLW0 |= UCSSEL__SMCLK; // Using 1 MHZ clock
UCA3CTLW0 |= UCSWRST;
UCA3CTLW0 |= UCSSEL__SMCLK;
UCA3BRW = 6; // Baud Rate set to 9600
UCA3MCTLW = UCOS16 | UCBRF_8 | 0x2000;
UCA1BRW = 6;
UCA1MCTLW = UCOS16 | UCBRF_8 | 0x2000;
UCA3CTLW0 &= ~UCSWRST;
UCA1CTLW0 &= ~UCSWRST;
UCA3IE |= UCRXIE;
UCA1IE |= UCRXIE;
__enable_interrupt(); // Interrupt enable
while (1)
{}
}
// UART A3 connected to the PC.
#if defined(__TI_COMPILER_VERSION__) || defined(__IAR_SYSTEMS_ICC__)
#pragma vector=EUSCI_A3_VECTOR
__interrupt void USCI_A3_ISR(void)
#elif defined(__GNUC__)
void __attribute__ ((interrupt(EUSCI_A3_VECTOR))) USCI_A3_ISR (void)
#else
#error Compiler not supported!
#endif
{
switch(__even_in_range(UCA3IV, USCI_UART_UCTXCPTIFG))
{
case USCI_NONE: break;
case USCI_UART_UCRXIFG:
while(!(UCA3IFG&UCTXIFG));
UCA1TXBUF = UCA3RXBUF;
__no_operation();
break;
case USCI_UART_UCTXIFG: break;
case USCI_UART_UCSTTIFG: break;
case USCI_UART_UCTXCPTIFG: break;
default: break;
}
}
// UART A1 connected to the sensor.
#if defined(__TI_COMPILER_VERSION__) || defined(__IAR_SYSTEMS_ICC__)
#pragma vector=EUSCI_A1_VECTOR
__interrupt void USCI_A1_ISR(void)
#elif defined(__GNUC__)
void __attribute__ ((interrupt(EUSCI_A1_VECTOR))) USCI_A1_ISR (void)
#else
#error Compiler not supported!
#endif
{
switch(__even_in_range(UCA1IV, USCI_UART_UCTXCPTIFG))
{
case USCI_NONE: break;
case USCI_UART_UCRXIFG:
while(!(UCA1IFG&UCTXIFG)); //Trying to read a string
{
*(msg+i) = UCA1RXBUF;
j = *(msg+i);
UCA3TXBUF = j;
i++;
}
break;
case USCI_UART_UCTXIFG: break;
case USCI_UART_UCSTTIFG: break;
case USCI_UART_UCTXCPTIFG: break;
default: break;
}
}
Please help.
Thanks in advance.
First, the problems that I see with your listing:
(p1) Even though the baud rates of both UARTs are the same, your design does not use proper buffering (see problem 3 below) for the case where the PC and the sensor are sending data at the same time. To make matters worse, both of your ISRs contain blocking while loops that don't buffer anything and only waste time until the interrupt flag clears.
(p2) Your source (shown below) is likely coded in error:
while(!(UCA1IFG&UCTXIFG)); //Trying to read a string
{
*(msg+i) = UCA1RXBUF;
j = *(msg+i);
UCA3TXBUF = j;
i++;
}
because the body of the while loop is actually empty due to the trailing ";", so the code within the braces that follow is not part of the while loop.
(p3) The pointer variable msg is never initialized. Most likely it points to random heap memory or an unused portion of the stack, so the program doesn't crash right away. Eventually it will, because the variable i is incremented but never decremented, so memory is only ever "used once" by the sensor ISR.
My suggestions:
(s1) Declare two buffers, one for data arriving from the PC and the other for data arriving from the sensor. Remove "unsigned char *msg" and replace it with something like this:
unsigned char pc_data[256];
unsigned char sensor_data[256];
The size 256 is chosen on purpose to create a poor man's circular buffer when used with an 8-bit index variable: when the index reaches 255 and is incremented, it simply rolls back to 0. In this case both i and j, as you already declared them, could be used, but names like pc_data_index and sensor_data_index are easier to understand. You also need two more variables for the amount of data in each buffer, for example pc_data_count and sensor_data_count. If your processor cannot afford this much buffer space, decrease it to a power-of-two size (e.g., BUFSIZE = 32) and use the modulo operator when updating the index, like this:
pc_data_index = (pc_data_index + 1) % BUFSIZE;
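Putting (s1) together, the shared state could look like the sketch below. The names are only suggestions and BUFSIZE is assumed to be a power of two; with one index and one count per buffer, the write position is (index + count) % BUFSIZE and the read position is index:
#define BUFSIZE 32  // must be a power of two (or use 256 with uint8_t indices)

// Data flowing sensor -> PC
volatile unsigned char sensor_data[BUFSIZE];
volatile unsigned char sensor_data_index = 0;  // read position: next byte to transmit
volatile unsigned char sensor_data_count = 0;  // number of bytes waiting in the buffer

// Data flowing PC -> sensor
volatile unsigned char pc_data[BUFSIZE];
volatile unsigned char pc_data_index = 0;
volatile unsigned char pc_data_count = 0;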
(s2) Change both ISR routines to process both the USCI_UART_UCRXIFG and USCI_UART_UCTXIFG interrupt events. The ISRs should not contain any loops; they should simply buffer incoming data or write out data from a buffer. Here is an example for the sensor UART (UCA1):
switch(__even_in_range(UCA1IV, USCI_UART_UCTXCPTIFG))
{
  case USCI_NONE: break;
  case USCI_UART_UCRXIFG:
    // Byte was received from the sensor, so buffer it at the write position
    sensor_data[(sensor_data_index + sensor_data_count) % BUFSIZE] = UCA1RXBUF;
    sensor_data_count++;
    // Enable the PC UART's TX interrupt so its ISR drains this buffer
    UCA3IE |= UCTXIE;
    break;
  case USCI_UART_UCTXIFG:
    // Sensor UART is ready to send the next byte buffered from the PC
    UCA1TXBUF = pc_data[pc_data_index];
    pc_data_index = (pc_data_index + 1) % BUFSIZE;
    pc_data_count--;
    // Disable the TX interrupt when there is no more data to send
    if (pc_data_count == 0) UCA1IE &= ~UCTXIE;
    break;
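The PC-side (UCA3) ISR mirrors this; here is a sketch under the same assumptions, using the buffer variables from (s1):
switch(__even_in_range(UCA3IV, USCI_UART_UCTXCPTIFG))
{
  case USCI_NONE: break;
  case USCI_UART_UCRXIFG:
    // Byte was received from the PC, so buffer it for the sensor UART
    pc_data[(pc_data_index + pc_data_count) % BUFSIZE] = UCA3RXBUF;
    pc_data_count++;
    // Enable the sensor UART's TX interrupt so its ISR drains this buffer
    UCA1IE |= UCTXIE;
    break;
  case USCI_UART_UCTXIFG:
    // PC UART is ready to send the next byte buffered from the sensor
    UCA3TXBUF = sensor_data[sensor_data_index];
    sensor_data_index = (sensor_data_index + 1) % BUFSIZE;
    sensor_data_count--;
    // Disable the TX interrupt when the sensor buffer is empty
    if (sensor_data_count == 0) UCA3IE &= ~UCTXIE;
    break;
  default: break;
}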
I am trying to get the WINC1500 MLA driver working with the ATMEGA2561 MCU. I have written my driver code and it gets stuck on the line "while((SPSR & (1 << SPIF)) == 0);" in the m2mStub_SpiTxRx function.
I have no idea why it's not progressing. I'm using the ImageCraft JumpStart IDE for this project.
Here's my implementation of it:
void m2mStub_SpiTxRx(uint8_t *p_txBuf,
uint16_t txLen,
uint8_t *p_rxBuf,
uint16_t rxLen)
{
uint16_t byteCount;
uint16_t i;
// Calculate the number of clock cycles necessary, this implies a full-duplex SPI.
byteCount = (txLen >= rxLen) ? txLen : rxLen;
DEBUGOUTF("Calculate the number of clock cycles\n");
DEBUGOUTF("byteCount %d", byteCount, "\n");
DEBUGOUTF("txLen %d", txLen, "\n");
DEBUGOUTF("rxLen %d", rxLen, "\n");
// Read / Transmit.
for (i = 0; i < byteCount; ++i)
{
// Wait for transmitter to be ready. (This is causing the entire thing to crash)
while((SPSR & (1 << SPIF)) == 0);
// Transmit.
if (txLen > 0)
{
// Send data from the transmit buffer.
SPDR = (*p_txBuf++);
--txLen;
}
else
{
// No more Tx data to send, just send something to keep clock active.
SPDR = 0x00U;
}
// Wait for transfer to finish.
while((SPSR & (1 << SPIF)) == 0);
// Send dummy data to slave, so we can read something from it.
SPDR = 0x00U;
// Wait for transfer to finish.
while((SPSR & (1 << SPIF)) == 0);
// Read or throw away data from the slave as required.
if (rxLen > 0)
{
*p_rxBuf++ = SPDR;
--rxLen;
}
else
{
// Clear the registers
volatile uint8_t reg_clear = 0U;
reg_clear = SPDR;
(void)reg_clear;
}
}
}
I don't have enough information to say for sure, but my assumption is that your SPI connection is not set up correctly.
In particular, I guess you forgot to configure /SS as an output, the same as in this problem or this one.
In the datasheet it says:
Master Mode

When the SPI is configured as a master (MSTR in SPCR is set), the user can determine the direction of the SS pin.

If SS is configured as an output, the pin is a general output pin which does not affect the SPI system. Typically, the pin will be driving the SS pin of the SPI slave.

If SS is configured as an input, it must be held high to ensure Master SPI operation. If the SS pin is driven low by peripheral circuitry when the SPI is configured as a master with the SS pin defined as an input, the SPI system interprets this as another master selecting the SPI as a slave and starting to send data to it. To avoid bus contention, the SPI system takes the following actions:

1. The MSTR bit in SPCR is cleared and the SPI system becomes a slave. As a result of the SPI becoming a slave, the MOSI and SCK pins become inputs.
2. The SPIF flag in SPSR is set, and if the SPI interrupt is enabled, and the I-bit in SREG is set, the interrupt routine will be executed.

Thus, when interrupt-driven SPI transmission is used in master mode, and there exists a possibility that SS is driven low, the interrupt should always check that the MSTR bit is still set. If the MSTR bit has been cleared by a slave select, it must be set by the user to re-enable SPI master mode.
So, you just need to configure the /SS pin as an output and set it high in your init code; this should solve your problem:
DDRB |= (1 << PB0); // Set /SS (PB0) as output
PORTB |= (1 << PB0); // Set /SS (PB0) high
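For reference, a minimal master-mode init sketch along those lines is shown below. The pin mapping (PB0 = /SS, PB1 = SCK, PB2 = MOSI, PB3 = MISO) matches the ATmega2561, but the F_CPU/16 clock divider is only an example and not taken from your project:
#include <avr/io.h>

void spi_master_init(void)
{
    // /SS, SCK and MOSI as outputs; MISO stays an input
    DDRB |= (1 << PB0) | (1 << PB1) | (1 << PB2);
    // Keep /SS high (deasserted) so the SPI never falls back into slave mode
    PORTB |= (1 << PB0);
    // Enable SPI, master mode, SCK = F_CPU/16, SPI mode 0
    SPCR = (1 << SPE) | (1 << MSTR) | (1 << SPR0);
}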
I am currently working on a logger that uses an MSP430F2618 MCU and a SanDisk 4 GB SDHC card.
Card initialization works as expected; I can also read the MBR and the FAT table.
The problem is that I can't write any data to it. I have checked whether it is write-protected by the notch, but it isn't. Windows 7 has no problem reading from and writing to it.
I have also used a tool called "HxD" and tried to alter some sectors (under Windows). When I try to save the content to the SD card, the tool pops up a window telling me "Access denied!".
Then I came back to my code for writing to the SD card:
uint8_t SdWriteBlock(uchar_t *blockData, const uint32_t address)
{
  uint8_t result = OP_ERROR;
  uint16_t count;
  uchar_t dataResp;
  uint8_t idx;
  for (idx = RWTIMEOUT; idx > 0; idx--)
  {
    CS_LOW();
    SdCommand(CMD24, address, 0xFF);
    dataResp = SdResponse();
    if (dataResp == 0x00)
    {
      break;
    }
    else
    {
      CS_HIGH();
      SdWrite(0xFF);
    }
  }
  if (0x00 == dataResp)
  {
    //send command success, now send data starting with DATA TOKEN = 0xFE
    SdWrite(0xFE);
    //send 512 bytes of data
    for (count = 0; count < 512; count++)
    {
      SdWrite(*blockData++);
    }
    //now send two CRC bytes; the CRC is not used in SPI mode
    //but it is still required by the transfer format
    SdWrite(0xFF);
    SdWrite(0xFF);
    //now read in the DATA RESPONSE TOKEN
    do
    {
      SdWrite(0xFF);
      dataResp = SdRead();
    }
    while (dataResp == 0x00);
    //following the DATA RESPONSE TOKEN are a number of BUSY bytes
    //a zero byte indicates the SD/MMC is busy programming,
    //a non-zero byte indicates the SD/MMC is not busy
    dataResp = dataResp & 0x0F;
    if (0x05 == dataResp)
    {
      idx = RWTIMEOUT;
      do
      {
        SdWrite(0xFF);
        dataResp = SdRead();
        if (0x0 == dataResp)
        {
          result = OP_OK;
          break;
        }
        idx--;
      }
      while (idx != 0);
      CS_HIGH();
      SdWrite(0xFF);
    }
    else
    {
      CS_HIGH();
      SdWrite(0xFF);
    }
  }
  return result;
}
The problem seems to be when I am waiting for card status:
do
{
  SdWrite(0xFF);
  dataResp = SdRead();
}
while (dataResp == 0x00);
Here I am waiting for a response of the form "X5" (hex), where X is undefined.
But in most cases the response is 0x00 and I never get out of the loop; in a few cases the response is 0xFF.
I can't figure out what the problem is.
Can anyone help me? Thanks!
4GB SDHC
We need to see much more of your code. Many µC SPI codebases only support SD cards <= 2 GB, so using a smaller card might work.
You might check it yourself: SDHC needs a CMD 8 and an ACMD 41 after the CMD 0 (GO_IDLE_STATE) command, otherwise you cannot read or write data to it.
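For illustration, the SPI-mode initialization of an SDHC card roughly looks like the sketch below. The CMD0/CMD8/CMD55/ACMD41/CMD58 constants are assumed command codes, SdCommand()/SdResponse() are assumed to behave like the helpers in the question, and error handling plus the extra R7/OCR response bytes are omitted:
// Rough sketch only; command codes and helper behavior are assumptions.
SdCommand(CMD0, 0x00000000, 0x95);       // GO_IDLE_STATE, expect R1 = 0x01
SdCommand(CMD8, 0x000001AA, 0x87);       // SEND_IF_COND: 2.7-3.6 V range, check pattern 0xAA
                                         // (read and verify the 4 extra R7 bytes here)
do
{
    SdCommand(CMD55, 0x00000000, 0xFF);  // APP_CMD prefix
    (void)SdResponse();
    SdCommand(ACMD41, 0x40000000, 0xFF); // SD_SEND_OP_COND with the HCS bit set
}
while (SdResponse() != 0x00);            // repeat until the card leaves the idle state
SdCommand(CMD58, 0x00000000, 0xFF);      // READ_OCR: the CCS bit confirms SDHC (block addressing)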
Thank you for your answers, but I solved my problem. It was a problem of timing. I had to put a delay at specific points.