What happens if we connect peripheral circuits directly to a microprocessor? - hardware

If I connect peripheral circuits directly to a microprocessor, what effect will it have? I don't understand this.

I translate your question as: What happens if I connect peripheral circuits directly to a microprocessor?
Going on that basis, my answer is: it depends. It depends on the microprocessor, it depends on the peripheral, and above all it depends on whether you know how to connect the two properly.
This has been done many times in the past and is sometimes still done today; there are many examples of small systems with direct connections between the two parts you mention. Nowadays, though, you are more likely to see this with a microcontroller.

Related

Is it possible to have CAN on Arduino without extra hardware?

I would like to have an Arduino operating in a CAN network. Does software that provides the OSI-model network layer exist for Arduino? I would imagine detecting the HI/LOW levels with GPIO/ADC and sending the signal to the network with a DAC. It would be nice to have that without any extra hardware attached. I don't mind having the terminating resistor required by the CAN network, though.
By Arduino I mean any of them. My intention is to keep the development environment the same.
If such software does not exist, is there any technical obstacle to it, such as limited flash size (again, I don't mean a particular board with a certain ATmega chip)?
You can write a bit-banging CAN driver, but it has many limitations.
First there is the timing: it is hard to achieve the bit timing, and also the arbitration.
You will be able to get 10 kbit/s or perhaps even 50 kbit/s, but that consumes a huge amount of your CPU time.
And the code itself is a pain.
You have to calculate the CRC on the fly (easy; see the sketch after this answer), but implementing the collision detection and all the timing parameters is not easy.
I once did this for a company, and it was a really bad idea.
Better to buy a chip for 1 Euro and be happy.
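For reference, the CRC mentioned above is the CAN CRC-15 (generator polynomial x^15 + x^14 + x^10 + x^8 + x^7 + x^4 + x^3 + 1). A minimal C sketch of the bitwise computation, under the assumption that you feed it the frame bits exactly as they appear on the wire (with stuff bits excluded):

```c
#include <stdint.h>

/* CAN CRC-15, generator polynomial 0x4599
 * (x^15 + x^14 + x^10 + x^8 + x^7 + x^4 + x^3 + 1).
 * Feed the frame bits MSB-first as they are shifted onto the bus;
 * stuff bits are not included in the CRC. */
static uint16_t can_crc15_bit(uint16_t crc, uint8_t bit)
{
    uint8_t crc_nxt = (uint8_t)(bit ^ ((crc >> 14) & 1u));
    crc = (uint16_t)((crc << 1) & 0x7FFFu);
    if (crc_nxt)
        crc ^= 0x4599u;
    return crc;
}

/* Convenience wrapper: run a buffer of whole bytes through the CRC. */
uint16_t can_crc15(const uint8_t *data, int nbytes)
{
    uint16_t crc = 0;
    for (int i = 0; i < nbytes; i++)
        for (int b = 7; b >= 0; b--)
            crc = can_crc15_bit(crc, (uint8_t)((data[i] >> b) & 1u));
    return crc;
}
```

The CRC itself really is the easy part; in a bit-banged driver you would call the per-bit function from the same tight loop that samples and drives the bus, which is exactly where the timing pain comes from.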
There are several CAN bus shield boards available (e.g. this, and this), and that would be a far better solution. It is not just a matter of the controller chip; the bus interface, line drivers, and power all need to be considered. If you have the resources and skills, you can of course create your own board or breadboard it for less.
Even if you bit-bang it via GPIO, you would need some hardware mods, I believe, to handle bus-contention detection, and it would be very slow and might not interoperate well with "real" CAN controllers on the bus.
If your aim is to communicate between devices of your own design rather than off-the-shelf CAN devices, then you don't need CAN at all; something proprietary will suffice, and a UART will perform faster than a bit-banged CAN implementation.
I don't think that such software exists. The CAN bus is more complex than, for example, I2C. Basically you would have to implement the functionality of both a CAN controller and a CAN transceiver. See this thread for more details (in German).
Alternatively you could use one of the CAN shields. Another option would be to use a BeagleBone with a suitable CAN cape.
Also take a look at AVR-CAN.

Details on USB - no luck so far

I've been looking for a detailed description of how the USB protocol and cabling work for a long time, with no luck. I am looking for a detailed yet not overcomplicated explanation of how things work on the software and hardware sides of USB. Links and explanations would be appreciated. I've really run out of ideas, so it would be great if you could help me out.
This is what I do know:
- USB hardware carries four lines: 5 V power, ground, and a differential data pair.
- When connecting, the device can ask for a specified amount of current.
- USB transfer speeds are quite fast compared to traditional serial connections.
- When connecting, a device sends descriptors to the host describing itself. These descriptors are also used for data.
What I don't know:
- How does a program in C/C++ write directly to a USB port? Does it write to an address in the port?
- How do some devices describe themselves as HID?
- How do drivers work?
- Everything else...
Thank you!
Identification
Every device has a (unique) Vendor and Product ID. These are provided (sold) by usb.org to identify a device. You can use a library like libusbx to enumerate all connected devices and select the one with the Vendor and Product ID you are looking for.
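As a hedged illustration of that enumeration step, using the libusb-1.0-style API (which libusbx also exposes); the vendor/product IDs here are placeholders, not real ones:

```c
#include <stdint.h>
#include <stdio.h>
#include <libusb-1.0/libusb.h>

int main(void)
{
    /* Placeholder IDs: substitute your device's Vendor/Product ID. */
    const uint16_t VID = 0x1234, PID = 0x5678;

    libusb_context *ctx = NULL;
    if (libusb_init(&ctx) != 0) {
        fprintf(stderr, "libusb_init failed\n");
        return 1;
    }

    /* Convenience helper that scans the bus and opens the first
     * device matching the given Vendor/Product ID. */
    libusb_device_handle *h = libusb_open_device_with_vid_pid(ctx, VID, PID);
    if (h == NULL) {
        fprintf(stderr, "device %04x:%04x not found\n", VID, PID);
        libusb_exit(ctx);
        return 1;
    }

    printf("device opened\n");
    libusb_close(h);
    libusb_exit(ctx);
    return 0;
}
```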
HID Descriptors
The point of HID descriptors is actually to do away with custom drivers. HID descriptors are a universal way of describing your device so you don't need to waste time on a driver for every system/architecture/etc. (Same concept as the JVM.)
Reports
You will use input, output, or feature reports to read from or write to your device. You send data to the device in an output or feature report and receive data in an input report. A report is typically 8 bytes, I believe, only one of which is the single character you wish to write. The HID descriptor contains all the information you need to put a report together, although I'm struggling to find a link that clarifies this; the sketch below shows the general shape.
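To make the report mechanism a little more concrete, here is a sketch of sending an output report over the control pipe using the standard HID SET_REPORT class request. The report ID, interface number, and buffer contents are assumptions about your device; many devices also (or instead) accept output reports on an interrupt OUT endpoint:

```c
#include <stdint.h>
#include <libusb-1.0/libusb.h>

/* Send a HID output report via the control pipe using the
 * HID class request SET_REPORT (bRequest = 0x09).
 * wValue = (report type << 8) | report ID; output reports are type 2. */
int send_output_report(libusb_device_handle *h, int interface_num,
                       uint8_t report_id, unsigned char *buf, uint16_t len)
{
    const uint8_t bmRequestType = 0x21; /* host-to-device | class | interface */
    const uint8_t SET_REPORT = 0x09;
    const uint16_t wValue = (uint16_t)((2 << 8) | report_id);

    return libusb_control_transfer(h, bmRequestType, SET_REPORT,
                                   wValue, (uint16_t)interface_num,
                                   buf, len, 1000 /* ms timeout */);
}
```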
Potential Libraries
In an effort to be open-minded here are all the libraries I am familiar with and some info about them.
libusb-0.1
First off is libusb-0.1. This used to be the go-to library and shipped with many Linux distributions (and on Windows too, I believe). It is very easy to use and there is a lot of documentation. However, it went unmaintained for many years. It supports only synchronous transfers. (If an error occurs, the program can wait indefinitely while it expects a transfer.)
libusbx
Next is libusbx. This is what most people would suggest today, and I agree. It was published by developers frustrated with the maintainer of libusb-0.1. The code is much more lightweight and up-to-date, and importantly it does not require root privileges like libusb-0.1 and libusb-1.0 (discussed in a moment). It supports both synchronous and asynchronous transfers.
libusb-1.0
Then there is libusb-1.0. This was the first update to libusb-0.1 in years, and it is not compatible with libusb-0.1. It was published the same day as libusbx, as retaliation (I assume) and an attempt to make up for the lack of updates and retain a user base. It supports both synchronous and asynchronous transfers.
hid.h
Finally, there is the HID library (hid.h). It was built on top of libusb as another layer of abstraction. But honestly, I find it confusing, and it adds more overhead than necessary.
Some Good Resources
Understanding HID Descriptors
Control Message Transfer Documentation (Very Good Link IMO)
Rolling Your Own HID Descriptor
Good Visual of HID Reports for Transfers
Great List of bmRequestType constants (You will need this or similar)
A simple terminal app for speaking with DigiSpark using libusbx and libusb-0.1
I know this isn't exactly what you are looking for, but maybe it will get you started!
This website has a general overview of how USB devices work:
https://www.beyondlogic.org/usbnutshell/usb1.shtml
Particular sections answer items from your list of things you don't yet know about USB.
E.g. to find out how USB devices identify themselves, read about USB descriptors:
https://www.beyondlogic.org/usbnutshell/usb5.shtml#DeviceDescriptors
To learn how a C/C++ program can talk to a USB device, see examples on using the libusb library:
https://github.com/libusb/libusb/tree/master/examples
To learn how USB drivers work, see a tutorial from Bootlin:
https://bootlin.com/blog/usb-slides/

How do I control a motor wirelessly?

I am an ME undergrad designing an implant device that requires programming knowledge. I honestly have no idea how to get started and am looking for advice. Basically, what I need is a way to control a stepper motor. Stepper motors use steps (pulses) to rotate the gearhead, and the motor I'm using needs 20 steps to revolve once. I need to be able to control the number of steps I want in a given day. The motor I'm purchasing comes with an encoder, which I'm guessing connects to the circuit board. What I want is an external control (like a remote control for a toy) that can set these rates. I don't know anything about radio transmitters, or how to program the circuit board to do this for me. Any help would be appreciated: books I can look into, websites, or tutorials. Thanks.
There are many ways of solving this problem, but it is more of a systems engineering question than a programming question; until you know what the system looks like, there is no way of determining what parts will be implemented in software. More details would be required to provide a specific answer.
For example, what are the security/safety considerations?
What wireless technology do you need to use, e.g. RF or IR? If RF, then licensing may be an issue, and that may vary from country to country. You could use Bluetooth, ZigBee, or even WiFi, but these technologies are probably more expensive and complex than necessary for such a simple application. If IR, is immunity from interference from TV remotes, PC IrDA ports, or the like required?
If the commands/signals from the remote are complex, you will probably need both the remote and the motor driver to incorporate a microcontroller and software. On the other hand, if you just need increase/decrease functions, then it would be entirely possible to implement the remote functionality you describe without any processing at all (depending on the communication technology you choose).
What is the motor encoder for? Stepper motors do not normally need an encoder, since the controller can simply count the steps executed in either direction to determine position. Is the encoder incremental or absolute? If it is incremental, then it is certainly not needed; if it is absolute, then it may be useful if you need to know the exact position of the motor on power-up without having to perform an initialisation or requiring end-stop switches.
You mentioned a "circuit board"; what hardware do you already have? What does it do? Do you have documentation for it? If it is commercially available, can you provide a link so we can see the documentation?
As you can see, you have more system-level design issues to solve before you even consider software implementation, so the question is not yet ready to be answered here on SO. I suggest you seek out your university's EE department and team up with someone with electronics expertise to design a complete system, then consider the software aspects.
Well worth taking a look at the Microchip site:
http://www.microchip.com/forums/f170.aspx
They produce microcontrollers that can be programmed to do exactly what you require (and a lot more).
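To give a flavour of the software side once the system design is settled: stepping the motor is just pulsing the driver's step input at a controlled rate. A hardware-agnostic C sketch, where gpio_write() and delay_us() are hypothetical stand-ins for whatever your microcontroller vendor provides:

```c
/* Hypothetical hardware-abstraction functions: substitute your
 * microcontroller vendor's GPIO and delay routines. */
void gpio_write(int pin, int level);
void delay_us(unsigned int microseconds);

#define STEP_PIN 1   /* pin driving the stepper driver's STEP input     */
#define DIR_PIN  2   /* pin driving the stepper driver's DIRECTION input */

/* The motor in the question needs 20 steps per revolution, so one
 * call with steps = 20 turns the gearhead exactly once. */
void step_motor(int steps, int direction, unsigned int step_period_us)
{
    gpio_write(DIR_PIN, direction);
    for (int i = 0; i < steps; i++) {
        gpio_write(STEP_PIN, 1);
        delay_us(step_period_us / 2);
        gpio_write(STEP_PIN, 0);
        delay_us(step_period_us / 2);
    }
}
```

The "steps per day" requirement then reduces to deciding how often, and with what arguments, something calls step_motor(); that scheduling is where the remote control and any radio link would plug in.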

Is hardware impossible to debug without software?

Disclaimer: I am (mostly) hardware ignorant. This is probably my problem. However, I find it hard to accept that it is not possible to debug hardware, so I just wanted to get some second opinions.
We have an issue where certain actions (swapping USB devices in and out at run-time) can blow either the USB hub or a chip on our USB board (it's custom hardware). It's a fuzzy problem (the degree of "blownness" can vary a bit), and it manifests itself in intermittent fashion with various symptoms that are very difficult to reliably reproduce (typically random corruption of packets).
This makes it difficult to ascertain whether a newly reported problem is due to this hardware fault or is actually a bug in the software. We have since implemented protection on these devices, but if an unprotected device is used with a protected one, it may then taint the (now protected) device. One of the ports is also not protected, meaning that someone could still "kill off" a unit that should be safe by accidentally using the wrong port.
The upshot of this is that it is impossible to tell which of our devices suffer this issue without completely replacing ALL the hardware (we've bitten the bullet for most of our production hardware but there is still a lot of dev and QA hardware out there with this issue).
I would imagine that, given a piece of hardware, one could use some kind of hardware diagnostic tool to determine whether the kit is faulty or not. Am I living in a dream world? My hardware department tells me that the only tests that can prove the fault are software tests, but as I have stated, the symptoms are very difficult to reproduce. As I'm not that experienced with hardware, I don't know if this is the only answer or not. I therefore ask the world.
Built In Test Equipment is used for performing a Built In Test
BITE for BIT
(No bytes involved.)
It is completely, utterly normal for military/aerospace equipment to have extra hardware to test itself with.
The original IBM PC had a surprising quantity of test hardware built in.
In the case of your equipment, a test device and some statistical analysis would do the trick.
This could be done in hardware in a dongle, but frankly it would be easier to do with some software.
Use two back-to-back USB-to-RS232 serial converters to make a USB loopback device.
Send lots of data, checksum the packets, and measure error rates (a sketch follows below).
I'm assuming your errors occur on the in->out as well as the out->in side.
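Here is a sketch of the software half of that loopback test on a POSIX host. The device path, baud rate, and block size are assumptions; the point is just to push known data through the suspect hardware and count mismatches:

```c
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <termios.h>
#include <unistd.h>

int main(void)
{
    /* Assumed device node for the looped-back converter pair. */
    int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                 /* raw mode: no line translation */
    cfsetispeed(&tio, B115200);
    cfsetospeed(&tio, B115200);
    tcsetattr(fd, TCSANOW, &tio);

    unsigned char tx[64], rx[64];
    long errors = 0, rounds = 10000;

    srand(42);                       /* reproducible pseudo-random data */
    for (long r = 0; r < rounds; r++) {
        for (size_t i = 0; i < sizeof tx; i++)
            tx[i] = (unsigned char)rand();
        if (write(fd, tx, sizeof tx) != (ssize_t)sizeof tx) { perror("write"); break; }

        size_t got = 0;              /* read may return short counts */
        while (got < sizeof rx) {
            ssize_t n = read(fd, rx + got, sizeof rx - got);
            if (n <= 0) { perror("read"); goto done; }
            got += (size_t)n;
        }
        if (memcmp(tx, rx, sizeof tx) != 0)
            errors++;
    }
done:
    printf("corrupted blocks: %ld / %ld\n", errors, rounds);
    close(fd);
    return 0;
}
```

A real tester would add a read timeout (VMIN/VTIME or select()) so a dropped byte doesn't hang the loop, and would log error rates over time for the statistical analysis mentioned above.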
Really, your hardware guys need to look at some application notes; USB IS hotplug-safe IF done according to the book.
There is a cool example out on the net of opto-coupling a USB chip's connection to the board it's on to prevent this sort of thing. The USB chip is connected to the host and powered from the host, and the interface to the USB chip is SPI, which is opto-coupled back to the rest of the board.
In your case, the chips are failing partially. Injured devices may work fine for months and then die. An electrostatic discharge ("a static zap") can do the same thing you describe. A device can be injured by shocks too small for you to feel.
The wires and features in semiconductors are microscopic, and easily damaged by stray electricity.
If the hardware design is mostly right, the likely cause of the problems you've been experiencing is ESD when the devices are handled to plug/unplug them. Your device has its own power supply, and its ground voltage floats relative to the other end of the USB cable until it is connected.
Hope this helps.
No it's not.
A lot of hardware manufacturers begin with hardware testing. Inputs and outputs (I/O) are just a matter of evaluating where the circuit flow is going. Consider the abstraction that both software and hardware deal in Boolean operations.
Hardware is just a little less human readable!
When it comes down to it, hardware's line of communication is (at its most basic) HIGH and LOW through various pins.
I have a brother (in the automobile tech industry) who has used an electrometer to measure voltage on pins to isolate where the problem is (I'm not really smart enough in that field to go into more detail on how he does it).
Your problem is that the only known symptom is so hard to detect (packet corruption in USB stream), that you're going to need software (at some level) to detect it.
If you can work out why packets are getting corrupted (bad voltages?) then maybe you could detect that with hardware?
Otherwise you need some kind of robust testing kit, and software to send/receive lots of packets to look for corruption?
No. That's what oscilloscopes and logic analyzers are for. Also there is more specialized equipment such as USB testers.
The simpler the hardware is, and the more access you have to the signals, the more likely you are to be able to diagnose it in a 'purely hardware' kind of way. For example if you had a simple parallel port card plugged into a PCI slot, it would be relatively straightforward to put a bus analyzer on the PCI bus, and the adapter's output, and see if the outputs did the right thing when the card was addressed. But note you'd still need to attempt to access that card from the PCI bus, which would mean either (A) some kind of PCI bus simulation, which would be one heck of a big pile of test hardware, or (B) a cheap off-the-shelf PC with a few lines of test code.
But then at the other end of the spectrum, suppose you're dealing with a large FPGA. You can get one heck of a lot of logic into an FPGA, and you won't necessarily have access to all the test points you'd like. I've personally encountered a bug with a serial port embedded in an FPGA, where a race condition with the shift register preload register would occasionally corrupt a byte. Hypothetically the VHDL could have been reworked to bring out test points, and a pile of scopes and analyzers gathered, but from a management standpoint it was much more cost effective to try to tease the problem out with software. Under normal usage, the bug in question would have turned up once every blue moon. We iterated through speculation about the conditions that would elicit the bug, and refining the test code, until we had test software that could reproduce the bug 2-3 times a minute. At that point we could actually provide clues to the VHDL guys that helped them fix the problem quickly.
Long story short, inside of a week a hardware bug was smoked out via software, whereas starting with the same information and going 'hardware only' would likely have not been any faster, and would have required a lot of expensive test equipment. So, yeah, you probably can do it without software, but as usual it's a trade-off, and you have to find the right balance point between the amount of software vs hardware for the job.

What microcontroller (and other components) would I need to create a timer device?

As a hobby project to keep myself out of trouble, I'd like to build a little programmable timer device. It will basically accept a program, which is a list of times, and then count down from each time.
I'd like to use a microcontroller programmable in C or Java. I have used BASIC in the past to make a little autonomous robot, so this time around I'd like something different.
What microcontroller and display would you recommend? I am looking to keep it simple, so the program would be loaded into memory via a computer (serial is OK, but USB would make it easier).
Just use a PIC like the 16F84 or 16F877 for this. It is more than enough.
For the display, use a 16x2 character LCD. It is easy to use and will give a nice look to your project.
The language doesn't matter. You can use PIC C, Micro C, or anything you like. The LCD's interface is really easy to drive (see the sketch at the end of this answer).
For other components you will just need a crystal and two capacitors for the oscillator, plus a pull-up resistor. The rest of the components depend on the input method you are going to use to set the times.
If you are using a computer to load the list, then you will need an additional circuit to convert the voltage levels; use a MAX232 to do that. If you want to use USB, you need to go ahead and use a PIC with USB support (the 18F series).
(source: sodoityourself.com)
That site has a set of nice tutorials you can use, and you can purchase the parts from them as well; I have purchased from them once.
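As an illustration of how simple that interface is, here is a hedged sketch of driving an HD44780-compatible 16x2 LCD in 4-bit mode. gpio_write() and delay_us() are hypothetical stand-ins for your compiler's pin and delay routines, and the pin assignments are placeholders:

```c
/* Hypothetical pin/delay helpers: map these to your PIC compiler's routines. */
void gpio_write(int pin, int level);
void delay_us(unsigned int us);

enum { LCD_RS, LCD_E, LCD_D4, LCD_D5, LCD_D6, LCD_D7 }; /* placeholder pin ids */

/* Put one nibble on D4..D7 and strobe the E line. */
static void lcd_nibble(unsigned char n)
{
    gpio_write(LCD_D4, n & 1);        gpio_write(LCD_D5, (n >> 1) & 1);
    gpio_write(LCD_D6, (n >> 2) & 1); gpio_write(LCD_D7, (n >> 3) & 1);
    gpio_write(LCD_E, 1); delay_us(1); gpio_write(LCD_E, 0);
    delay_us(50);   /* most HD44780 commands need ~37 us to complete */
}

static void lcd_cmd(unsigned char c)  { gpio_write(LCD_RS, 0); lcd_nibble(c >> 4); lcd_nibble(c & 0x0F); }
static void lcd_data(unsigned char d) { gpio_write(LCD_RS, 1); lcd_nibble(d >> 4); lcd_nibble(d & 0x0F); }

void lcd_init(void)
{
    delay_us(50000);     /* wait for LCD power-up */
    lcd_cmd(0x33);       /* standard 4-bit initialisation sequence */
    lcd_cmd(0x32);
    lcd_cmd(0x28);       /* 4-bit interface, 2 lines, 5x8 font */
    lcd_cmd(0x0C);       /* display on, cursor off */
    lcd_cmd(0x06);       /* increment cursor, no display shift */
    lcd_cmd(0x01);       /* clear display */
    delay_us(2000);      /* clear needs ~1.5 ms */
}

void lcd_puts(const char *s) { while (*s) lcd_data((unsigned char)*s++); }
```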
I would go with the MSP430. An eZ430 is $20, and you can get them at Digi-Key or from TI directly, then sets of 3 microcontroller boards for $10 after that. There is LLVM and GCC (and binutils) compiler support. It is super simple to program, extremely small, and extremely low-power.
There are many ways to do this, and a number of people have already given pretty good suggestions. AVR or PIC are good starting points for a microcontroller that doesn't require too much in the way of complicated setup (hardware and software) or expense (these micros are very cheap). Honestly, I'm somewhat surprised that nobody has mentioned Arduino here yet, which happens to have the advantage of being pretty easy to get started with, provides a USB connection (USB->serial, really), and if you don't like the board the ATmega MCU is plugged into, you can later plug the chip in wherever you might want it. Also, while the provided programming environment offers some high-level tools to easily prototype things, you're still free to tweak the registers on the device and write any C code you might want to run on it.
As for an LCD display, I would recommend looking for anything that's either based on an HD44780 or emulates the behavior of one. These will typically use a set of parallel lines for talking to the display, but there are tons of code examples for interfacing with them. In Arduino's case, you can find examples for this type of display, and many others, on the Arduino Playground here: http://www.arduino.cc/playground/Code/LCD
As far as a clock is concerned, you can use the built-in clock that many 8-bit micros these days provide, although they're not always ideal in terms of precision. You can find an example for Arduino on doing this sort of thing here: http://www.arduino.cc/playground/Code/DateTime. If you want something that might be a little more precise you can get a DS1307 (Arduino example: http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1191209057/0).
I don't necessarily mean to push you towards an Arduino, since there are a huge number of ways to do this sort of thing. Lately I've been working with 32-bit ARM micros (don't go that route first; the learning curve is much steeper, though they have many benefits), and I might use something in that ecosystem these days. But the Arduino is easy to recommend because it's relatively inexpensive, there's a large community of people using it, and chances are you can find a code example for at least part of what you're trying to do. When you need something with more horsepower, configuration options, or RAM, there are options out there.
Here are a few places where you can find some neat hardware (Arduino-related and otherwise) for projects like the one you're describing:
SparkFun Electronics
Adafruit Industries
DigiKey (this is a general electronics supplier, they have a bit of everything)
There are certainly tons more, though :-)
I agree with the other answers about using a PIC.
The PIC16F family does have C compilers available, though it is not ideally suited for C code. If performance is an issue, the 18F family would be better.
Note also that some PICs have internal RC oscillators. These aren't as precise as external crystals, but if that doesn't matter, then it's one less component (or three with its capacitors) to put on your board.
Microchip's ICD PIC programmer (for downloading and debugging your PIC software) plugs into the PC's USB port, and connects to the microcontroller via an RJ-11 connector.
Separately, if you want the software on the microcontroller to send data to the PC (e.g. to print messages in HyperTerminal), you can use a USB to RS232/TTL converter. One end goes into your PC's USB socket, and appears as a normal serial port; the other comes out to 5 V or 3.3 V signals that can be connected directly to your processor's UART, with no level-shifting required.
We've used the TTL-232R-3V3 from FTDI Chip, which works perfectly for this kind of application.
There are several ways to do this, and there is a lot of information on the net. If you are going to use micro controllers then you might need to invest in some programming equipment for them. This won't cost you much though.
The simplest way is to use the sine wave from the power grid. In Europe, AC power has a frequency of 50 Hz, and you can use that as the basis for your clock signal (a sketch of the idea follows below).
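A sketch of that idea, assuming a (safely isolated!) zero-crossing detector raises an interrupt once per mains cycle; mains_cycle_isr() is a hypothetical handler name to be hooked to your microcontroller's interrupt mechanism:

```c
#include <stdint.h>

#define MAINS_HZ 50            /* European grid frequency */

static volatile uint8_t  cycles = 0;
static volatile uint32_t seconds = 0;

/* Hypothetical ISR: hook this to a safely isolated zero-crossing
 * detector input on your microcontroller. */
void mains_cycle_isr(void)
{
    if (++cycles >= MAINS_HZ) { /* 50 cycles = 1 second */
        cycles = 0;
        seconds++;             /* 1 Hz timebase for the timer logic */
    }
}
```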
I've used Atmel's ATtiny and ATmega, which are great for programming simple and advanced projects. You can program it with C or Assembly, there are lots of great projects for it on the net, and the programmers available are very cheap.
Here is a project I found by Googling AVR 7 segment clock.
A second vote for PIC. Also, I recommend the magazine Circuit Cellar Ink. Some technical bookstores carry it, or you can subscribe: http://www.circellar.com/
The PIC series will be good. Since you are creating a timer, I recommend C or assembly (assembly is good), with MPLAB as the development environment; you can check how accurate your timer is with the 'Stopwatch' feature in MPLAB. Both the PIC16F877 and the PIC16F628 have a built-in hardware serial port, but the PIC16F877 has more ports. For more accurate timers, using higher-frequency oscillators is recommended.
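Whichever part you choose, the program logic itself is tiny. A hardware-agnostic C sketch of the countdown behaviour described in the question, where wait_one_second() is a hypothetical stand-in for a 1 Hz timer tick and the example time list is made up:

```c
#include <stdio.h>

/* Hypothetical: block until the next 1 Hz tick from your timer
 * (e.g. a flag set by a timer interrupt, polled in a loop). */
void wait_one_second(void);

/* The "program": a list of countdown times in seconds.
 * In a real build this would be loaded over the serial link. */
static unsigned int program[] = { 30, 120, 10 };

void run_program(void)
{
    for (unsigned int i = 0; i < sizeof program / sizeof program[0]; i++) {
        for (unsigned int t = program[i]; t > 0; t--) {
            /* on real hardware you would update the LCD here instead */
            printf("timer %u: %u s remaining\n", i, t);
            wait_one_second();
        }
        /* on real hardware you might pulse a buzzer between timers */
    }
}
```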