I have a gateway-node application using LoRa modules, but I don't know whether to choose a LoRa module with a UART interface or one with an SPI interface.
Can someone help me distinguish between the two? For example, when I have 5 nodes connected to the gateway, which one should I use? And what about when I have 50 nodes?
Thanks!
A UART provides RS-232-style serial signaling (but NOT RS-232 voltage levels; to hook up to a serial port on a computer you will need an additional adapter chip such as an FTDI FT232H). Speeds are usually limited to less than 400 kbit/s, varying with distance and devices.
If you are connecting multiple devices to the same microcontroller (e.g. an Arduino), use SPI. The connection speeds are not limited by a standard. It is a bus arrangement with four signals: clock (SCLK), input (MISO), output (MOSI), and slave select (SS). SCLK, MISO, and MOSI are connected to all devices; chaining an additional device only requires one additional SS pin per device.
SPI is going to be faster: several megabits per second (perhaps up to 5 Mbit/s) is not uncommon, depending on wire length (not greater than 0.3 m), wire quality, environmental noise, and device specifications, and it requires fewer discrete components.
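As a minimal sketch of that bus arrangement (Arduino C++; the pin numbers and the dummy 16-byte transfer are illustrative only, and a real LoRa driver would implement the module's actual register protocol on top of this):

```cpp
// Shared SPI bus: SCLK/MISO/MOSI are common to all radios,
// each radio gets its own slave-select (SS) pin.
#include <SPI.h>

const uint8_t SS_PINS[] = {10, 9, 8, 7, 6};   // one SS pin per radio (example pins)
const uint8_t NUM_RADIOS = sizeof(SS_PINS);

void setup() {
  SPI.begin();
  for (uint8_t i = 0; i < NUM_RADIOS; i++) {
    pinMode(SS_PINS[i], OUTPUT);
    digitalWrite(SS_PINS[i], HIGH);           // deselected; SS is active-low
  }
}

void loop() {
  for (uint8_t i = 0; i < NUM_RADIOS; i++) {
    SPI.beginTransaction(SPISettings(4000000, MSBFIRST, SPI_MODE0));
    digitalWrite(SS_PINS[i], LOW);            // address exactly one radio
    for (uint8_t b = 0; b < 16; b++) {
      SPI.transfer(0x00);                     // exchange one byte per call
    }
    digitalWrite(SS_PINS[i], HIGH);           // release the bus for the next radio
    SPI.endTransaction();
  }
}
```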
Since LoRa data rates max out around 300 kbit/s, a single SPI-connected gateway could theoretically handle 15 LoRa transceivers (15 × 300 kbit/s = 4.5 Mbit/s, within the ~5 Mbit/s SPI budget above).
Running 15 devices may violate local RF duty-cycle restrictions, resulting in fines and/or imprisonment.
Please check with your regulatory authorities before implementing any solution.
I would suggest using four transceivers per gateway, each with an external antenna pointing in a different cardinal direction (possibly offset). This configuration should permit 400+ client devices per gateway, depending on usage patterns.
I am working on an embedded hardware/software project involving the USB protocol. My plan is to use a USB 3.0 hub that clusters many USB 2.0 isochronous devices, each requiring around 33 Mbit/s of bandwidth.
Now, I know that USB 3.0 is able to enumerate up to 127 devices [1]. What I'd like to know is whether this is still true if I connect USB 2.0 devices to the USB 3.0 hub. Can I, in theory, use the hub with 97 USB 2.0 devices?
Also, could I utilize the entire bandwidth of a USB 3.0 port this way (yes, only around 70% is feasible)? The figures roughly add up: 97 × 33 Mbit/s ≈ 3.2 Gbit/s, which is within 5 Gbit/s × 70% = 3.5 Gbit/s.
Any help is much appreciated.
I believe there is some confusion in your understanding.
A USB 3.0 hub contains two logical hub partitions:
1 - a USB 3.0 hub
2 - a USB 2.0 hub
USB 2.0 devices communicate through the USB 2.0 hub data path, which has nothing to do with the USB 3.0 data path. So your bandwidth assumption is incorrect, as those figures apply to USB 3.0 devices connected via the USB 3.0 data path.
Also, USB 2.0 devices will be connected via the USB 2.0 lines and NOT the USB 3.0 lines.
There is another misconception in your question.
The speed you mention, i.e. 5 Gbit/s, is the USB 3.0 link speed.
What that means is that two devices on a USB 3.0 link can send or receive data at 5 Gbit/s. Since every hub downstream port is physically a separate link, the link speed will be the same for all of them, i.e. 5 Gbit/s. So it is the speed between two link partners, not the end-to-end speed.
The end-to-end data transfer rate, on the other hand, will depend on your host controller and driver architecture, your OS performance, your hub, etc.
For isochronous endpoints, the hub will start data transfers at the service interval of each endpoint on each port, as per the USB spec, but you cannot be sure that the end-to-end data transfer rate will be divided equally.
PS - You will get a lot less end-to-end bandwidth anyway, as the link speed for USB 2.0 is only 480 Mbit/s. :)
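To put a rough number on that, here is a back-of-the-envelope check (a sketch assuming the USB 2.0 high-speed rule that periodic transfers may claim at most 80% of each microframe; protocol overhead will eat further into this):

```cpp
// How many 33 Mbit/s isochronous streams fit in the shared
// USB 2.0 data path behind the hub?
#include <cstdio>

int main() {
    const double link_mbps      = 480.0;  // USB 2.0 high-speed link rate
    const double periodic_share = 0.80;   // spec cap for periodic transfers
    const double device_mbps    = 33.0;   // per-device requirement

    const double budget = link_mbps * periodic_share;  // 384 Mbit/s
    printf("Periodic budget: %.0f Mbit/s -> at most %d devices\n",
           budget, static_cast<int>(budget / device_mbps));  // prints 11
    return 0;
}
```

So even in theory the USB 2.0 path tops out at about 11 of your devices, nowhere near 97.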
I've got a few video converter boxes (Marshall VAC-11SU3, Marshall VAC-11HU3, Magewell USB Capture SDI, Blackmagic UltraStudio Express) and no cameras.
They all have an incoming video signal plugged into their respective SDI or HDMI ports.
The issue is that GetNativeMediaType always returns the same format as GetMediaTypeByIndex does for index 0 regardless of the actual video format that is coming into the SDI/HDMI port.
Every Media Foundation example I've seen so far has a UI to pick the "correct" native format. This menu is populated from GetMediaTypeCount and GetMediaTypeByIndex for the device.
My users will not know what to pick!
We've been using Blackmagic's DeckLink APIs and our users see the incoming video signal format in the UI.
We'd like to expand support for multiple device manufacturers but this one has me stumped.
Media Foundation does not have the concept of signal format detection that you get with recent Blackmagic hardware (earlier Blackmagic products, by the way, did not offer detection either).
A video source driver could indeed enumerate the media type it currently sees on the wire as the first GetNativeMediaType output, and/or offer a dynamic format change to that format during a streaming session. However, Media Foundation video sources mostly assume webcam-like devices and have a fixed media type enumeration order.
I would not expect the Blackmagic driver to be different, because it mostly mimics a webcam so that, through the WDM driver, Blackmagic device inputs can be consumed using the standard APIs. If you need extended functionality such as signal detection, Blackmagic suggests using their DeckLink SDK (which is good, by the way).
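For reference, a minimal sketch of how signal detection looks with the DeckLink SDK (names taken from DeckLinkAPI.h; the IUnknown plumbing and error checks are trimmed, and the initial mode/pixel format are arbitrary placeholders):

```cpp
#include "DeckLinkAPI.h"

class InputCallback : public IDeckLinkInputCallback {
    IDeckLinkInput* input_;
public:
    explicit InputCallback(IDeckLinkInput* input) : input_(input) {}

    // Invoked by the driver when the detected input signal changes.
    HRESULT STDMETHODCALLTYPE VideoInputFormatChanged(
            BMDVideoInputFormatChangedEvents events,
            IDeckLinkDisplayMode* newMode,
            BMDDetectedVideoInputFormatFlags) override {
        input_->PauseStreams();
        // Restart capture with the display mode the hardware just detected.
        input_->EnableVideoInput(newMode->GetDisplayMode(), bmdFormat8BitYUV,
                                 bmdVideoInputEnableFormatDetection);
        input_->FlushStreams();
        input_->StartStreams();
        return S_OK;
    }

    HRESULT STDMETHODCALLTYPE VideoInputFrameArrived(
            IDeckLinkVideoInputFrame*, IDeckLinkAudioInputPacket*) override {
        return S_OK;  // consume frames here
    }

    // IUnknown boilerplate omitted for brevity.
    HRESULT STDMETHODCALLTYPE QueryInterface(REFIID, void**) override { return E_NOINTERFACE; }
    ULONG STDMETHODCALLTYPE AddRef() override { return 1; }
    ULONG STDMETHODCALLTYPE Release() override { return 1; }
};

// Usage: enable format detection, then start with any initial mode;
// VideoInputFormatChanged fires once the real signal is known.
void StartDetectedCapture(IDeckLinkInput* input, InputCallback* cb) {
    input->SetCallback(cb);
    input->EnableVideoInput(bmdModeNTSC, bmdFormat8BitYUV,
                            bmdVideoInputEnableFormatDetection);
    input->StartStreams();
}
```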
I'm trying to reverse-engineer a BLE device that uses HID over GATT to communicate with the host. I can capture the traffic using USBPcap, but when I load the results into Wireshark, the packets contain the bytes representing the data that is going over the air (e.g. the device descriptor), yet they are not decoded according to the USB HID protocol. Everything is decoded as plain USB and only contains URB_INTERRUPT_IN, URB_BULK in/out, and URB_CONTROL_OUT, while I'm looking for things like GET DESCRIPTOR Request/Response DEVICE. Is there an extra step I can take to get the packets formatted and parsed correctly?
There are a few characteristics in use. One characteristic contains the Report Map; this is usually read only once, when the device is paired. The map describes the layout/specification of the data that is later sent through the Report notifications. This format is mostly copy-pasted from the USB HID specification into BLE.
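To make that concrete, here is a minimal sketch that walks the short items of a Report Map (the tag/type/size packing is the same as in USB HID report descriptors; long items, prefix 0xFE, are ignored, and the sample bytes are the start of a typical mouse descriptor):

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

void dumpReportMap(const std::vector<uint8_t>& map) {
    static const char* kTypes[] = {"Main", "Global", "Local", "Reserved"};
    for (size_t i = 0; i < map.size(); ) {
        const uint8_t prefix = map[i++];
        unsigned size = prefix & 0x03;     // 0, 1, 2 data bytes; 3 means 4
        if (size == 3) size = 4;
        uint32_t value = 0;
        for (unsigned b = 0; b < size && i < map.size(); ++b)
            value |= static_cast<uint32_t>(map[i++]) << (8 * b);  // little-endian
        printf("%-8s tag=0x%X data=0x%X\n",
               kTypes[(prefix >> 2) & 0x03], (prefix >> 4) & 0x0F, value);
    }
}

int main() {
    // Usage Page (Generic Desktop), Usage (Mouse), Collection (Application)
    dumpReportMap({0x05, 0x01, 0x09, 0x02, 0xA1, 0x01});
}
```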
Now, when you run HID over GATT and your Bluetooth controller talks to the host over USB, what you will see in USBPcap is the ACL data, which contains L2CAP data, which contains GATT data, which in turn contains the Report data for HID. The Bluetooth stack on the host will then decode this and feed it into the kernel's HID parser.
I would suggest that you instead connect your HID-over-GATT device to an Android phone and look at the HCI snoop log to see what happens; that log is decodable in Wireshark (though it won't parse your HID data).
I'm trying to build a device to read the current GPS coordinates. The device will include a small computer inside running Windows 7.
I'm looking for a USB GPS to connect to the computer so that I can read the GPS coordinates from my VB.NET 2010 program. Of course, the most important thing here is what hardware I need to accomplish that. Please suggest some GPS models.
Do I need only a GPS receiver, or do I need more hardware?
In addition to jcibar's answer:
For a Bluetooth GPS, or most USB devices, you probably don't even have to set the classic RS-232 communication settings like baud rate; it will just work, whatever baud rate you set.
Look at the "Ports (COM & LPT)" list in the Windows Device Manager: one of the COM devices listed should be the GPS receiver, and its description often indicates what it is.
E.g. on my Win7 x64 notebook I have a "Sierra Wireless Gobi 2000 HS-USB NMEA 9001 (COM8)" port, which is the notebook's built-in GPS. It will just start communicating the moment I open the port:
17.09.2013 10:12:01.890 [RX] - $GPGSA,A,1,,,,,,,,,,,,,,,*1E<CR><LF>
$GPGSV,4,1,16,10,,,,21,,,,20,,,,32,,,*7A<CR><LF>
$GPGSV,4,2,16,31,,,,30,,,,29,,,,28,,,*78<CR><LF>
$GPGSV,4,3,16,27,,,,26,,,,25,,,,24,,,*79<CR><LF>
$GPGSV,4,4,16,23,,,,22,,,,19,,,,18,,,*7E<CR><LF>
$GPGGA,,,,,,0,,,,,,,,*66<CR><LF>
$PQXFI,,,,,,,,,,*56<CR><LF>
$GPVTG,,T,,M,,N,,K,N*2C<CR><LF>
$GPRMC,,V,,,,,,,,,,N*53<CR><LF>
You can use any serial COM port / RS232 logger to test this.
You can use any GPS (USB, Bluetooth) that provides an RS-232-level serial interface (serial port). The GPS will create a virtual serial port (e.g. COM13) that you can use in your VB.NET program to read serial data (NMEA frames, typically at 4800 baud).
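As a minimal sketch of reading that virtual port (shown here in C++/Win32; a VB.NET program would do the same through System.IO.Ports.SerialPort, and the COM port name and baud rate below are just examples):

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    // "\\\\.\\COM13" is the Win32 device-path form; it also works for ports above COM9.
    HANDLE port = CreateFileA("\\\\.\\COM13", GENERIC_READ, 0, nullptr,
                              OPEN_EXISTING, 0, nullptr);
    if (port == INVALID_HANDLE_VALUE) return 1;

    DCB dcb = { sizeof(dcb) };
    GetCommState(port, &dcb);
    dcb.BaudRate = CBR_4800;   // typical NMEA rate; often ignored by USB GPS units
    dcb.ByteSize = 8;
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    SetCommState(port, &dcb);

    // NMEA sentences arrive as plain text lines such as "$GPRMC,...";
    // a real program would buffer up to <CR><LF> and parse the fields.
    char buf[256];
    DWORD bytesRead = 0;
    while (ReadFile(port, buf, sizeof(buf) - 1, &bytesRead, nullptr) && bytesRead > 0) {
        buf[bytesRead] = '\0';
        fputs(buf, stdout);
    }
    CloseHandle(port);
    return 0;
}
```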
For instance, for USB you could use something like this: Haicom HI-206USB.