I am trying to use a UP501 GPS module with an Arduino and get the raw information from the GPS, like longitude, latitude, altitude, etc.
I am running the code below on the Arduino to check whether serial data from the GPS is available and, if it is, print it out.
Arduino Code:
#include <SoftwareSerial.h>

SoftwareSerial mySerial(10, 11); // RX, TX

void setup()
{
  // Open serial communications and wait for port to open:
  Serial.begin(57600);
  Serial.println("Searching using GPS...");

  // Set the data rate for the SoftwareSerial port
  mySerial.begin(9600);
}

void loop() // run over and over
{
  if (mySerial.available())
    Serial.write(mySerial.read());
}
My problem: I don't get any data from the GPS module.
I thought it might be the wiring. See the picture below of my wiring:
The yellow wire from pin 2 is connected to D10, which will be the RX.
The black wire from pin 3 is connected to ground.
The red wire from pin 4 is connected to 3.3 V.
Note: I went outside to get a satellite fix, but still no information.
There is a tutorial on Adafruit for using this GPS.
I suggest you take a look at it, because your wiring seems to be missing a few things.
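Once raw NMEA sentences are coming through on the software serial port, a parsing library makes it easy to pull out latitude, longitude and altitude. This is only a sketch, assuming the widely used TinyGPS library rather than anything specific from the Adafruit tutorial:

#include <SoftwareSerial.h>
#include <TinyGPS.h>

SoftwareSerial mySerial(10, 11); // RX, TX
TinyGPS gps;

void setup()
{
  Serial.begin(57600);
  mySerial.begin(9600);
}

void loop()
{
  // Feed every incoming character to the NMEA parser.
  while (mySerial.available())
  {
    if (gps.encode(mySerial.read()))
    {
      float lat, lon;
      unsigned long age;
      gps.f_get_position(&lat, &lon, &age);
      Serial.print("Lat: ");  Serial.print(lat, 6);
      Serial.print(" Lon: "); Serial.print(lon, 6);
      Serial.print(" Alt: "); Serial.println(gps.f_altitude());
    }
  }
}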
I used an STM32F103 to send data to the PC via USB CDC at a frequency of 4000 Hz, 21 bytes per packet. The code was generated by STM32Cube. I found that only about 1000 packets per second are received on Windows 7 via a serial assistant, while Ubuntu receives the full 4000 packets per second via CuteCom. How do I achieve the same throughput on Windows 7? Thank you.
int main(void)
{
  HAL_Init();
  SystemClock_Config();
  MX_GPIO_Init();
  MX_USB_DEVICE_Init();

  while (1)
  {
    __WFI();
  }
}

void HAL_SYSTICK_Callback(void) // called at 4000 Hz
{
  CDC_Transmit_FS(buff, 21); // buff: 21-byte packet buffer defined elsewhere
}
I used Python to test, and the result is:
I made a setup consisting of 3 XBees: 2 routers (XBee S2C) and 1 coordinator (XBee S2). Each router is connected to an Arduino Nano that collects data from 2 FSRs and an IMU (frame type: ZigBee transmit request, packet size 46 bytes) and sends it to the coordinator, which is attached to an Arduino Uno. All the XBees are in API mode 2 and running at a baud rate of 115200. I am using the "Simple Zigbee Library" to send all the collected data to the coordinator. The collection and sending of data work fine, except that packets are lost along the way. The Nanos sample data at a frequency of around 25 Hz independently. The coordinator tries to read the data sent from the XBees (using the library, of course) in every loop, but unfortunately it seems to receive only around 40-45 samples per second (it should be 25 * 2 = 50 samples total from the 2 XBees). Can anybody suggest why this is happening? I need as little data loss as possible for my setup to achieve its purpose. Any kind of help is appreciated.
P.S.: It may be important to mention that the coordinator reads data from only one XBee in each loop.
As can be seen under the "Source" heading of this image of data received by the coordinator, "19" and "106" are the addresses of the routers, and data packets are dropped intermittently.
Thank you.
void setup()
{
  // Start the serial ports ...
  Serial.begin( 115200 );
  while( !Serial ){;} // Wait for serial port (for Leonardo only).
  xbeeSerial.begin( 115200 );

  // ... and set the serial port for the XBee radio.
  xbee.setSerial( xbeeSerial );
  // Set a non-zero frame id to receive Status packets.
  xbee.setAcknowledgement(true);
}

void loop()
{
  // While data is waiting in the XBee serial port ...
  while( xbee.available() )
  {
    // ... read the data.
    xbee.read();
    // If a complete message is available, display the contents.
    if( xbee.isComplete() ){
      Serial.print("\nIncoming Message: ");
      printPacket( xbee.getIncomingPacketObject() );
    }
  }
  delay(10); // Small delay for stability
  // That's it! The coordinator is ready to go.
}
// Function for printing the complete contents of a packet //
void printPacket(SimpleZigBeePacket & p)
{
  //Serial.print( START, HEX );
  //Serial.print(' ');
  //Serial.print( p.getLengthMSB(), HEX );
  //Serial.print(' ');
  //Serial.print( p.getLengthLSB(), HEX );
  //Serial.print(' ');

  // Frame Type and Frame ID are stored in Frame Data
  uint8_t checksum = 0;
  for( int i = 10; i < p.getFrameLength(); i++ ){
    Serial.print( p.getFrameData(i), HEX );
    Serial.print(' ');
    checksum += p.getFrameData(i);
  }

  // Calculate checksum based on summation of frame bytes
  checksum = 0xff - checksum;
  Serial.print( checksum, HEX );
  Serial.println();
}
Although you claim to be using 115,200 bps, double-check that every serial port in the chain is actually opened at that rate; 9600 baud would definitely not be fast enough for the 2,500 bytes/second (50 packets/second * 45 bytes/packet * 110% for overhead) received from the XBee and dumped by printPacket(). Remember that 802.15.4 is always 250 kbps over the air, and the XBee module's serial port configuration only applies to local communication with the host.
Make sure your routers are sending unicast (and not broadcast) packets to keep the radio traffic down.
You should verify that sending is working before troubleshooting code on the coordinator. Update the code on your routers to see if you get a successful Transmit Status packet for every packet sent. Aiming for 50Hz seems like a bit much -- you're trying to send 45 bytes (is that the full size of the API frame?) every 20ms.
Are you using a hardware serial port on the Arduino for both the XBee module and Serial.print()? How much time does each call to printPacket() take? If you reduce the code in printPacket() to a bare minimum (say, the last byte of the sender's address and a single data byte), do you see all packets come through? If so, that is an indication that you're spending too much time dumping the packets.
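A stripped-down version might look something like the sketch below. The byte offsets are assumptions on my part (based on the i = 10 starting index in your printPacket()), so adjust them to match the actual frame layout the library gives you:

// Minimal dump: print only a couple of bytes per packet instead of the whole frame.
// The indices below are assumptions -- verify them against SimpleZigBeePacket's layout.
void printPacketMinimal(SimpleZigBeePacket & p)
{
  Serial.print( p.getFrameData(8), HEX );   // assumed: last byte of the sender's address
  Serial.print(' ');
  Serial.println( p.getFrameData(p.getFrameLength() - 1), HEX ); // assumed: last data byte
}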
I'm also concerned about the code you're using in loop(). I don't know the deep internals of how the Arduino works, but does that 10 ms delay block other code from processing data? What if you simplify it:
void loop()
{
  xbee.read();

  // Process any complete frames.
  while (xbee.isComplete()){
    Serial.print("\nIncoming Message: ");
    printPacket( xbee.getIncomingPacketObject() );
  }
}
But before going too far, you should isolate the problem by connecting the coordinator to a terminal emulator on a PC and monitoring the frame rate. If all frames arrive, the issue is in the coordinator code. If they don't, work on your router code first.
I've found a large number of examples, but nothing on how to do it "properly" from STM32CubeMX.
How do I create skeleton code from STM32CubeMX for USB CDC virtual COM port communication (if possible, for the STM32F4 Discovery)?
An STM32CubeMX project for the Discovery F4 with CDC as a USB device should work out of the box. Assuming you use an up-to-date STM32CubeMX and library:
Start STM32CubeMX.
Select the Discovery F4 board.
Enable the USB_OTG_FS peripheral, device only (leave the other options unchecked).
Enable the USB_Device middleware, Communication Device Class (a.k.a. CDC).
In the clock tab, check that the clock source is HSE. It should give 168 MHz HCLK and 48 MHz on the 48 MHz (USB) clock. Check there is no red anywhere.
Save the project.
Generate the code (I used the SW4STM32 toolchain).
Build (you may need to switch to the internal CDT builder vs. GNU make).
Now add some code to send data over the COM port and voilà, it should work.
Actually, the tricky part is not to make any "CDC" access until the USB host has connected (there is no CDC setup until then).
Here is how I did it for a quick transmit test:
In file usbd_cdc_if.c
uint8_t CDC_Transmit_FS(uint8_t* Buf, uint16_t Len)
{
  uint8_t result = USBD_OK;
  /* USER CODE BEGIN 7 */
  if (hUsbDevice_0 == NULL)
    return -1;
  USBD_CDC_SetTxBuffer(hUsbDevice_0, Buf, Len);
  result = USBD_CDC_TransmitPacket(hUsbDevice_0);
  /* USER CODE END 7 */
  return result;
}

static int8_t CDC_DeInit_FS(void)
{
  /* USER CODE BEGIN 4 */
  hUsbDevice_0 = NULL;
  return (USBD_OK);
  /* USER CODE END 4 */
}
In file main.c
/* USER CODE BEGIN Includes */
#include "usbd_cdc_if.h"
/* USER CODE END Includes */
....
/* USER CODE BEGIN WHILE */
while (1)
{
  /* USER CODE END WHILE */

  /* USER CODE BEGIN 3 */
  uint8_t HiMsg[] = "hello\r\n";
  CDC_Transmit_FS(HiMsg, strlen((char *)HiMsg));
  HAL_Delay(200);
}
As soon as you plug in the micro USB (CN5), CDC data will start to show up on the host terminal.
That works: I can see "hello" on the terminal (you may need to install a driver, http://www.st.com/web/en/catalog/tools/PF257938).
For reception, the endpoint first needs to be armed, i.e. started by an initial call to USBD_CDC_ReceivePacket() in a suitable place; CDC_Init_FS is a good candidate.
Then you can handle the data as it arrives in CDC_Receive_FS, re-arming reception from there.
This works for me.
static int8_t CDC_Receive_FS (uint8_t* Buf, uint32_t *Len)
{
  /* USER CODE BEGIN 6 */
  USBD_CDC_ReceivePacket(hUsbDevice_0);
  return (USBD_OK);
  /* USER CODE END 6 */
}

static int8_t CDC_Init_FS(void)
{
  hUsbDevice_0 = &hUsbDeviceFS;
  /* USER CODE BEGIN 3 */
  /* Set Application Buffers */
  USBD_CDC_SetTxBuffer(hUsbDevice_0, UserTxBufferFS, 0);
  USBD_CDC_SetRxBuffer(hUsbDevice_0, UserRxBufferFS);
  USBD_CDC_ReceivePacket(hUsbDevice_0);
  return (USBD_OK);
  /* USER CODE END 3 */
}
There are a number of STM32F4 Discovery boards supported by the STM32Cube software, and you haven’t said which you’re using, but I’ve had exactly the same issue with the Discovery board with the F401VCT MCU.
After installing the STM virtual COM port driver, Windows Device Manager showed a STMicroelectronics virtual COM port, but with a yellow warning mark. The COM port was not accessible with a terminal application (PuTTY).
I eventually found that there is a problem with the source code output by the STM32Cube program, but there is a simple fix:
Open a new STM32Cube project, enable USB_OTG_FS as Device Only, and select CDC Virtual Port COM from the Middlewares USB_Device drop-down.
Generate the source code with no other changes needed to any USB settings.
In file usbd_cdc_if.c, change #define USB_HS_MAX_PACKET_SIZE from 512 to 256.
In file usbd_cdc.c, change #define CDC_DATA_HS_MAX_PACKET_SIZE from 512 to 256 (the edited lines are shown below).
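For clarity, the edited lines end up looking like this (same files and values as in the steps above):

/* usbd_cdc_if.c */
#define USB_HS_MAX_PACKET_SIZE       256   /* was 512 */

/* usbd_cdc.c */
#define CDC_DATA_HS_MAX_PACKET_SIZE  256   /* was 512 */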
After doing this, the yellow warning disappeared from Device Manager, and I could receive data in the CDC_Receive_FS function (in the usbd_cdc_if.c file) when using PuTTY. Be aware that these definitions revert to their incorrect values each time STM32Cube regenerates the code, and I haven't found a way around this yet.
I hope this helps.
iChal's fix worked to remove the yellow warning mark.
I would like to mention that USB_HS_MAX_PACKET_SIZE is now in usbd_def.h and CDC_DATA_HS_MAX_PACKET_SIZE is in usbd_cdc.h.
I am using STM32CubeMX v4.11.0, STM32Cube v1.0, and the STM32F401C-DISCO.
On further work, I found I only have to set the heap size to a larger value.
I am setting it to 0x600, as I also have FreeRTOS enabled. I am using IAR EWARM, so the change is made in the linker script stm32f401xc_flash.icf.
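In the IAR linker scripts generated for STM32 parts, the heap size is usually controlled by a symbol like the one below; this is only a sketch, and the exact symbol name in your stm32f401xc_flash.icf may differ:

/* stm32f401xc_flash.icf: enlarge the heap (0x600 as mentioned above) */
define symbol __ICFEDIT_size_heap__ = 0x600;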
I am new to the Stellaris LM3S8962. I want to ask how to write 4 separate ISRs to handle interrupts originating from 4 buttons: Up, Down, Left, Right. I have tried, but my code always falls into FaultISR.
Thanks!
https://www.dropbox.com/s/5j5il9kt324o943/Lab%202.rar?dl=0
The link above is my project.
It works fine, but when I add the following statements to the startup task it falls into FaultISR:
// Configure the UART.
SysCtlPeripheralEnable(SYSCTL_PERIPH_UART0);
SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOA);

// Enable processor interrupts.
IntMasterEnable();

// Set GPIO A0 and A1 as UART pins.
GPIOPinTypeUART(GPIO_PORTA_BASE, GPIO_PIN_0 | GPIO_PIN_1);

// Configure the UART for 9600, 8-N-1 operation.
UARTConfigSetExpClk(UART0_BASE, SysCtlClockGet(), 9600,
                    (UART_CONFIG_WLEN_8 | UART_CONFIG_STOP_ONE |
                     UART_CONFIG_PAR_NONE));

// Enable the UART interrupt.
IntEnable(INT_UART0);
UARTIntEnable(UART0_BASE, UART_INT_RX | UART_INT_RT);
I have an Atmel UC3-L0 and a compass sensor. I have installed Atmel Studio and downloaded some demo code onto the board, but I have no idea where the printf calls in the demo code send their data. What should I do to see the data?
The printf function outputs to stdout.
Usually, on a "naked" processor with no operating system, you need to define how a character is sent to or received from a physical interface (usually a USART, console port, USB port, LCD interface, etc.). So typically you may want to use the USART port of your processor board to connect to a PC running HyperTerminal, PuTTY or similar, using a serial cable.
In essence you will need to:
create FILE streams using the fdev_setup_stream() macro, and
provide pointers to get() and put() functions that tell printf() how exactly to read from and write to that stream (e.g. read/write to a USART, an LCD display, etc.).
Depending on your hardware, you may already have libraries that contain such functions (plus the correct port initialisation functions), e.g. uart.c/.h, lcd.c/.h, etc.
In the documentation of stdio.h (e.g. here), look for the following:
printf(), fdev_setup_stream()
If you have downloaded Atmel Studio, you may look into the stdiodemo.c code for further insight.
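As a rough sketch of the pattern described above, following the avr-libc stdio documentation referenced here (on the UC3 itself the ASF stdio_serial module used in the next answer does this wiring for you); board_uart_send_byte() is a hypothetical stand-in for whatever your USART or LCD driver provides:

#include <stdio.h>

/* Hypothetical low-level output routine -- replace with your USART/LCD driver call. */
extern void board_uart_send_byte(char c);

static int my_putchar(char c, FILE *stream)
{
  board_uart_send_byte(c);   /* send one character to the physical interface */
  return 0;
}

static FILE my_stdout;

int main(void)
{
  /* Bind stdout to our put() function so printf() knows where to write. */
  fdev_setup_stream(&my_stdout, my_putchar, NULL, _FDEV_SETUP_WRITE);
  stdout = &my_stdout;

  printf("hello from printf\r\n");
  for (;;) { }
}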
In order to use printf in Atmel Studio, you should check the following things:
Add and apply the Standard serial I/O module from Project -> ASF Wizard.
Also add the USART module from the ASF Wizard.
Include the following code snippet before the main function.
static struct usart_module usart_instance;

static void configure_console(void)
{
  struct usart_config usart_conf;

  usart_get_config_defaults(&usart_conf);
  usart_conf.mux_setting = EDBG_CDC_SERCOM_MUX_SETTING;
  usart_conf.pinmux_pad0 = EDBG_CDC_SERCOM_PINMUX_PAD0;
  usart_conf.pinmux_pad1 = EDBG_CDC_SERCOM_PINMUX_PAD1;
  usart_conf.pinmux_pad2 = EDBG_CDC_SERCOM_PINMUX_PAD2;
  usart_conf.pinmux_pad3 = EDBG_CDC_SERCOM_PINMUX_PAD3;
  usart_conf.baudrate = 115200;

  stdio_serial_init(&usart_instance, EDBG_CDC_MODULE, &usart_conf);
  usart_enable(&usart_instance);
}
Make sure you call configure_console() after system_init() in the main function.
Now go to Tools -> Extension Manager and add the Terminal Window extension.
Build and run your program, then open the terminal window from View -> Terminal Window. Select the COM port your device is on, set the baud rate to 115200, and hit Connect in the terminal window.
You should see the printf output now. (Floats don't get printed in Atmel Studio.)
I was recently puzzling over this myself. I had installed Atmel Studio 7.0 and was using the SAMD21 dev board with an example project in which a call to printf was made.
In the sample code I saw that there was a configuration section:
/*!
 * \brief Initialize USART to communicate with the on-board EDBG - SERCOM
 * with the following settings.
 * - 8-bit asynchronous USART
 * - No parity
 * - One stop bit
 * - 115200 baud
 */
static void configure_usart(void)
{
  struct usart_config config_usart;

  // Get the default USART configuration
  usart_get_config_defaults(&config_usart);

  // Configure the baudrate
  config_usart.baudrate = 115200;

  // Configure the pin multiplexing for USART
  config_usart.mux_setting = EDBG_CDC_SERCOM_MUX_SETTING;
  config_usart.pinmux_pad0 = EDBG_CDC_SERCOM_PINMUX_PAD0;
  config_usart.pinmux_pad1 = EDBG_CDC_SERCOM_PINMUX_PAD1;
  config_usart.pinmux_pad2 = EDBG_CDC_SERCOM_PINMUX_PAD2;
  config_usart.pinmux_pad3 = EDBG_CDC_SERCOM_PINMUX_PAD3;

  // Route the printf output to the USART
  stdio_serial_init(&usart_instance, EDBG_CDC_MODULE, &config_usart);

  // Enable USART
  usart_enable(&usart_instance);
}
In Windows Device Manager I saw an "Atmel Corp. EDBG USB Port (COM3)" listed under "Ports". However, one of the "Properties" of this port was listed as 9600 bits per second. I changed this from 9600 to 115200 to be consistent with the config section above.
Finally, I ran PuTTY.exe and set the Connection -> Serial settings to COM3 and 115200 baud. Then I went to Session, clicked the Serial connection type, and clicked the Open button. And, BAM, there's my printf output via PuTTY.