Hi, first time using the forum, so sorry if I did something wrong.
I'm trying to make a simple GPS NMEA sentence reader using a Raspberry Pi Pico and a GPS module I have hooked up to it. Here is the code I created:
from machine import Pin, UART

# UART1 on the Pico, using GP4 (TX) and GP5 (RX) at the GPS module's 9600 baud
gps_module = UART(1, baudrate=9600, tx=Pin(4), rx=Pin(5))

while True:
    NMEA = gps_module.readline()
    print(NMEA)
My problem is that all it outputs is "None". I am new to programming, so all knowledge is helpful. If anyone knows how to get it to output the NMEA sentences from the GPS module, please let me know. Thanks.
Related
I am a student, currently doing my internship and working with an Arduino Mega 2560 and a Dragino LoRa shield v1.3. I was searching the internet for the data sheet for the Dragino LoRa shield v1.3, but I have been unable to find one. I have the SX1276 chip data sheet (http://www.semtech.com/images/datasheet/sx1276.pdf) used in the Dragino LoRa shield v1.3 board, but there is no proper data sheet for the entire board. Can anyone help? I want to understand the pin configurations and pin mapping. There is another resource I found on GitHub, however it is quite hard to understand (https://github.com/dragino/Lora/tree/master/Lora%20Shield/hardware/v1.3). Any kind of help would be greatly appreciated. Thank you.
Here you can see pretty much everything you want:
https://github.com/dragino/Lora/blob/master/Lora%20Shield/hardware/v1.3/Lora%20Shield%20v1.3.sch.pdf
You have to understand: the square in the top left is your board. You can see that every connected pin has its name; for example, pin 14 is CLK. If you search for CLK on the RFM95, you will see it is pin 4.
Hope I helped you.
The correspondence between the SX1278 chip datasheet and the shield board is shown in the schematic.
In fact, you just need to know that the shield's pins are connected in the default way when it is inserted directly into the Mega 2560, but the SS pin (D10 on the shield board) may need to be connected to D53 of the Mega 2560; you can try it. For the default wiring, you can refer to its introduction on Wikipedia.
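To make the default mapping concrete, here is a minimal sketch (not from the original answer) using the RadioHead RH_RF95 driver and the pin assignments most commonly cited for the Dragino LoRa Shield v1.3: chip select on D10, DIO0 on D2, reset on D9. The pins, the library choice, and the 868 MHz frequency are assumptions, so verify them against the schematic linked above, and swap the chip-select pin if you rewire it to D53 on the Mega.

#include <SPI.h>
#include <RH_RF95.h>  // RadioHead library

// Assumed Dragino LoRa Shield v1.3 wiring - check the schematic:
//   chip select -> D10, DIO0 (interrupt) -> D2, reset -> D9
const uint8_t PIN_CS   = 10;   // change to 53 if you rewire SS on the Mega
const uint8_t PIN_DIO0 = 2;

RH_RF95 rf95(PIN_CS, PIN_DIO0);

void setup() {
  Serial.begin(9600);
  if (!rf95.init()) {
    Serial.println("LoRa init failed - check the pin mapping and wiring");
    while (true) {}
  }
  rf95.setFrequency(868.0);  // use 433.0 or 915.0 for other shield variants
  Serial.println("LoRa radio ready");
}

void loop() {
  uint8_t msg[] = "hello";
  rf95.send(msg, sizeof(msg));  // transmit a test packet
  rf95.waitPacketSent();
  delay(2000);
}

If init() fails, the chip-select or interrupt pin is the first thing to re-check against the schematic.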
I am using Mindstorms and built a robot with two motors and an IR sensor.
1) I made a program which lets the robot follow an IR signal and stop when reaching it.
2) I made a program to remote control the robot with the IR remote.
Both programs work, but when combining them, program 1 does not work anymore.
It gives erratic results from the IR sensor. It seems that detecting an IR button is not compatible with measuring the signal in the same program. Does anyone have similar experience or know how to deal with it?
This is the program that works:
Introducing another selection around it which senses an IR button does not work anymore:
The result is that the program gets to the right section, but the IR measurements of distance and direction give random results.
Does anyone have any ideas?
If you have already tried another sensor and there is still a problem, it could possibly be a bug in the software. I would post your example on the NI MINDSTORMS support board so that they can look into the bug.
http://forums.ni.com/t5/LabVIEW-for-LEGO-MINDSTORMS-and/bd-p/460
I am going to read the PC's music output and get basic information (beat/tone/...) about the song being played (and then flash the lights accordingly, etc.). Can NAudio be used for this purpose, and are there any samples? Sorry for this very general question at the moment.
TIA
-d
You can access your PC's output using WasapiLoopbackCapture. However, NAudio does not include a beat detection algorithm, so you'd need to find one yourself. There is an FFT class, though, which could be used to determine which frequencies are present.
Is it possible to capture all/any audio played by a PC into a System.IO.Stream, so that it can then be run through speech recognition (System.Speech.Recognition.SpeechRecognitionEngine)?
Essentially I'm looking to perform speech recognition on any audio on the client PC. Google seems to suggest that capturing a stream like this can be done using Microsoft.DirectX.DirectSound, however I cannot honestly determine how. Any suggestions would be greatly appreciated.
Take a look at this question for a solution on Vista/Win7, and take a look at this one for WinXP.
Summary:
You can use loopback recording with WASAPI on Vista/Win7, but there is no equivalent API on WinXP; however, a partial solution can be achieved with a virtual sound card driver.
I am planning on doing a small Arduino project and would like to know if what I'm thinking of would work with a regular Arduino board. I'm thinking of buying an Arduino Uno for my project, along with an IR LED and an IR sensor. So here's what I want to do with this:
I want to point the LED towards the sensor, so that the sensor is always detecting light. Then I'll start "cutting" that light (say, with my hand) several times. I want the Arduino program to time the intervals between the times the light is "cut" and send these times to my computer via USB, so I can process this data.
I've seen many people talk about serial communication between an Arduino board and a computer, but I'm not sure how that works. Will it use the same USB connector I use to upload programs to the board, or do I have to buy anything else?
EDIT: tl;dr: I guess my question, in the end, is twofold:
1) Am I able to "talk" to my computer using the built-in USB connector on the board, or is that used solely for uploading programs and I need to buy another one? and
2) Is this project feasible with an Arduino Uno board?
Thanks for the help!
Yes, your project is very feasible.
You use the built-in USB connector both to program the device and to communicate with it. Check out some examples on the Serial Reference Page.
For reading the sensor, you'll want to use either a digital or an analog input. With a digital input, you'll likely have to add external components to set the light threshold, but it will give you a simple yes or no for whether something is in front of the sensor. With an analog input, you can use a threshold in code to determine when your hand passes.
Timing can be done either on the device with the millis() function or on the connected computer.
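As a rough illustration of the approach described above, here is a minimal sketch; the analog pin (A0), the threshold value, and the exact sensor wiring are assumptions rather than part of the original answer. It detects each time the beam is "cut", measures the gap with millis(), and prints it over the same USB serial connection used for uploading.

// Assumed wiring: IR sensor with an analog output on A0, IR LED powered separately.
const int SENSOR_PIN = A0;
const int THRESHOLD  = 500;           // tune by watching raw analogRead() values

bool wasBlocked = false;              // sensor state on the previous loop pass
unsigned long lastCutMs = 0;          // time of the previous "cut", in ms

void setup() {
  Serial.begin(9600);                 // same USB cable you use to upload sketches
}

void loop() {
  int level = analogRead(SENSOR_PIN);     // 0..1023
  // Beam treated as "cut" when the reading drops below the threshold;
  // invert the comparison if your module's output works the other way.
  bool blocked = (level < THRESHOLD);

  if (blocked && !wasBlocked) {           // edge: the beam has just been cut
    unsigned long now = millis();
    if (lastCutMs != 0) {
      Serial.println(now - lastCutMs);    // interval between cuts, in milliseconds
    }
    lastCutMs = now;
  }
  wasBlocked = blocked;
}

On the computer side, read the printed intervals from the serial port at 9600 baud, for example with the Arduino Serial Monitor or any serial library.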