How do I control a motor wirelessly? - embedded

I am an ME undergrad designing an implant device that requires programming knowledge. I honestly have no idea how to get started and am looking for advice. Basically what I need is a way to control a stepper motor. Stepper motors use steps (pulses) to rotate the gear head, and the motor I'm using needs 20 steps to revolve once. I need to be able to control the number of steps I want in a day, say. The motor I'm purchasing comes with an encoder, which I'm guessing connects to the circuit board. What I want is an external control (like a remote control for a toy) that can set these rates. I don't know anything about radio transmitters, or how to program the circuit board to do this for me. Any help would be appreciated, or books I can look into, websites, or tutorials. Thanks.

There are many ways of solving this problem, but it is more of a systems engineering question than a programming question; until you know what the system looks like, there is no way of determining what parts will be implemented in software. More details would be required to provide a specific answer.
For example, what are the security/safety considerations?
What wireless technology do you need to use, e.g. RF or IR? If RF, then licensing may be an issue, and that may vary from country to country. You could use Bluetooth, ZigBee, or even WiFi, but these technologies are probably more expensive and complex than necessary for such a simple application. If IR, then is immunity from interference from TV remotes, PC IrDA ports, or similar required?
If the commands/signals from the remote are complex, you will probably need both the remote and the motor driver to incorporate a microcontroller and software. On the other hand, if you just need increase/decrease functions, then it would be entirely possible to implement the remote functionality you describe without any processing at all (depending on the communication technology you choose).
What is the motor encoder for? Stepper motors do not normally need an encoder, since the controller can simply count steps executed in either direction to determine position. Is the encoder incremental or absolute? If it is incremental, then it is certainly not needed; if it is absolute, then it may be useful if you need to know the exact position of the motor on power-up without having to perform an initialisation or requiring end-stop switches.
You mentioned a "circuit board"; what hardware do you already have? What does it do? Do you have documentation for it? If it is commercially available, can you provide a link so we can see the documentation?
As you can see, you have more system-level design issues to solve before you even consider software implementation, so the question is not yet ready to be answered here on SO. I suggest you seek out your university's EE department and team up with someone with electronics expertise to design a complete system, then consider the software aspects.

Well worth taking a look at the Microchip site:
http://www.microchip.com/forums/f170.aspx
They produce microcontrollers that can be programmed to do exactly what you require (and a lot more).
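To make the programming side concrete: at its core, driving a stepper through a driver chip is just pulsing one output pin the right number of times. Below is a minimal, hedged C sketch of that idea; step_pin_write() and delay_ms() are placeholder names for whatever GPIO and timer routines your particular microcontroller provides, and the pulse timing must come from your driver's datasheet.

    /* Minimal sketch: one revolution = 20 step pulses. */
    #define STEPS_PER_REV 20

    void step_pin_write(int level);     /* assumed GPIO write routine   */
    void delay_ms(unsigned int ms);     /* assumed platform delay       */

    void step_n(unsigned int steps)
    {
        while (steps--) {
            step_pin_write(1);          /* rising edge = one step       */
            delay_ms(2);                /* pulse width: check datasheet */
            step_pin_write(0);
            delay_ms(2);
        }
    }

    /* e.g. three revolutions per dispensing event:
     *     step_n(3 * STEPS_PER_REV);
     * The wireless link then only has to deliver the step count. */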

Related

How to demo examples of embedded systems?

It seems that a lot of small business people have a need for some customized embedded systems, but don't really know too much about the possibilities and cannot quite envisage them.
I had the same problem when trying to explain what Android could do; I was generally met with glazed eyes - and then I made a few demos. Somehow, when people can see something - touch it and play around with it - they have that cartoon lightbulb moment.
Even if it is not directly applicable to them, a demo starts them thinking about what could be useful to them.
The sort of person I am talking about may or may not be technical, but is certainly intelligent, having built from scratch a business which turns over millions.
Their needs are varied, from RFID or GPS asset & people tracking, to simple stock control systems, displays, communications, sometimes satellite, sometimes VPN or LAN (wifi or RJ45). A lot of it needs a good back-end database with a web-site to display, query, data-mine …
So, to get to the question, I am looking for a simple project, or projects, which will cause that cartoon lightbulb moment. It need not be too complicated, as those who need complicated solutions are generally tech-savvy; just something straightforward, showing what could be done to streamline their business and make it more profitable.
It would be nice if it could include some wifi/RJ45 comms, communicate across the internet (e.g. not just a micro-controller attached to a single PC – that should then communicate with a server/web-site), an RFID reader would be nice, something actually happening (LEDs, sounds, etc.), plus some database work and database analysis/data-mining – something end-to-end, preferably in both directions.
A friend was suggesting a Rube Goldberg-like contraption with a Lego Mindstorms set attached to a local PC, but also controllable from a remote PC (representing head office) or a web site. That would show remote control of devices. Maybe it could pick up some RFID tags and move them around (at random, or on command), representing stock control (or maybe employee/asset movement within a factory or warehouse (Location Based Services/GIS)), which could then be shown on the web site, with some nice charts & graphs etc.
Any other ideas?
How best to implement it? One of those micro-controller starter kits like http://www.nerdkits.com/ ? Maybe some Lego, or a similar robot kit, a few cheap RFID readers … anything else?
And – the $409,600 question – what's a good, representative demo which demonstrates as many functionalities as possible, as impressively as possible, with the least effort? (Keeping it modular and allowing for easy addition of features, since there is such a wide area to cover.)
P.S. A tie-in with an Android slate PC would be welcome too.
Your customers might respond better to a solid looking R/C truck which seeks RFID tags than to a Lego robot. Lego is cool, but it has a bit of a slapped-together 'kiddie' feel.
What if you:
scatter some RFID tags across the conference room.
add a GPS & wifi transmitter to your truck.
drive the truck to a tag (manually - unless you want to invest a lot of time in steering algorithms).
have a PC drawing a real-time track of the truck's path.
every time the truck gets within range of a tag, add it to an inventory list on the screen, showing item id, location, time recorded, and total units so far.
indicate the position of the item on the map.
I'd be impressed.
Is it 'least effort'? I don't know, but I'd hope that if this is the type of solution you are pitching, that you already have a good handle on how to read GPS and RFID devices, how to establish a TCP or UDP connection with wifi, how to send and decode packets. Add some simple graphics and database lookup, and you are set.
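For a sense of scale, the "send and decode packets" part really is small. Here is a hedged POSIX C sketch that pushes one tag sighting to a server over UDP; the message format, server address, and port are all invented for illustration.

    #include <stdio.h>
    #include <string.h>
    #include <arpa/inet.h>
    #include <sys/socket.h>
    #include <unistd.h>

    /* Send one "tag sighted" record as a UDP datagram. */
    int report_sighting(const char *tag_id, double lat, double lon)
    {
        char msg[128];
        struct sockaddr_in dst;
        int fd = socket(AF_INET, SOCK_DGRAM, 0);
        if (fd < 0) return -1;

        memset(&dst, 0, sizeof dst);
        dst.sin_family = AF_INET;
        dst.sin_port = htons(5000);                         /* assumed port   */
        inet_pton(AF_INET, "192.168.1.10", &dst.sin_addr);  /* assumed server */

        snprintf(msg, sizeof msg, "TAG %s %.6f %.6f", tag_id, lat, lon);
        sendto(fd, msg, strlen(msg), 0, (struct sockaddr *)&dst, sizeof dst);
        close(fd);
        return 0;
    }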
Regarding hardware, I don't have any first-hand experience with any of these, but the GadgetPC Wi-Fi G Kit + a USB RFID reader + a USB GPS receiver looks like a nice platform for experimenting with this.
Many chip manufacturers have off-the-shelf demo boards. Microchip has some great demo boards for TCP/IP communications on an embedded system. I haven't seen one yet for RFID. Showing potential customers some of these demos could get them thinking about what is possible.

LabVIEW + National Instruments hardware or ???

I'm in the process of buying a new data acquisition system for my company to use for various projects. At first, its primary purpose will be to monitor up to 20 thermocouples and control the temperature of a composites oven. However, I also plan on using it to monitor accelerometers and strain gauges, and to act as a signal generator.
I probably won't be the only one to use it, but I have a good bit of programming experience with Atmel microcontrollers (C). I've used LabVIEW before, but ~5 years ago. LabVIEW would be good because it is easy to pick up for both me and my coworkers. On the flip side, it's expensive. Right now I have an NI CompactDAQ system with two voltage cards and one thermocouple card + LabVIEW specced out, and it's going to cost $5779!
I'm going to try to spec out the same I/O capabilities with different NI hardware + LabVIEW, to see if I can get it for less money. I'd also like to see if anyone has any suggestions other than LabVIEW for me.
Thanks in advance!
Welcome to test and measurement. It's pretty expensive for pre-built stuff, but you trade money for time.
You might check out the somewhat less expensive Agilent 34970A (and associated cards). It's a great workhorse for different kinds of sensing, and, if I recall correctly, it comes with some basic software.
For simple temperature control, you might consider a PID controller (Watlow or Omega used to be the brands, but it's been a few years since I've looked at them).
You also might look into the low-cost USB solutions from NI. The channel count is lower, but they're fairly inexpensive. They do still require software of some type, though.
There are also a fair number of good smaller companies (like Hytek Automation) that produce some types of measurement and control devices or sub-assemblies, but YMMV.
There's a lot of misconception about what will and will not work with LabView and what you do and do not need to build a decent system with it.
First off, as others have said, test and measurement is expensive. Regardless of what you end up doing, the system you describe IS going to cost thousands to build.
Second, you don't NEED to use NI hardware with LabView. For thermocouples your best bet is to look into multichannel or multiple single-channel thermocouple units - something that reads from a thermocouple and outputs to something like RS-232, etc. The OMEGABUS Digital Transmitters are an example, but many others exist.
In this way, you need only a breakout card with lots of RS-232 ports, and you can grow the system as you need to. You can still use LabVIEW to acquire the data via RS-232 and then display, log, and process it however you like.
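If you skip LabVIEW entirely, the acquisition side of that scheme is still simple. Here is a hedged POSIX C sketch that opens a serial port and reads ASCII lines from such a transmitter; the device path and 9600-8-N-1 settings are assumptions, so match them to the instrument's manual.

    #include <stdio.h>
    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>

    /* Open a serial port in line-oriented (canonical) mode. */
    int open_transmitter(const char *path)
    {
        struct termios tio;
        int fd = open(path, O_RDONLY | O_NOCTTY);
        if (fd < 0) return -1;

        tcgetattr(fd, &tio);
        cfsetispeed(&tio, B9600);                 /* assumed baud rate */
        tio.c_cflag = (tio.c_cflag & ~(CSIZE | PARENB)) | CS8 | CLOCAL | CREAD;
        tio.c_lflag = ICANON;                     /* line-at-a-time reads */
        tcsetattr(fd, TCSANOW, &tio);
        return fd;
    }

    int main(void)
    {
        char line[80];
        int fd = open_transmitter("/dev/ttyS0"); /* assumed port */
        if (fd < 0) return 1;
        for (;;) {
            ssize_t n = read(fd, line, sizeof line - 1);
            if (n > 0) { line[n] = '\0'; printf("reading: %s", line); }
        }
    }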
Third-party signal generators would also work, for example. You can pick up good ones (with a GPIB connection) reasonably cheaply, and with a GPIB board you can integrate one into LabView as well. That is, if you want something like a function generator (duty-cycled pulses, standard sine/triangle/ramp functions, etc.). If you're talking about arbitrary signal generation, then that remains a reasonably expensive thing to do (if $5000 is our goalpost for "expensive").
This also hinges on what you need the signal generation for - if it's for control signals then, again, there may be cheaper and more robust options available. For temperature control, for example, separate hardware PID controllers are probably the best bet. This also takes care of your thermocouple problem, since PID controllers will typically accept thermocouple inputs as well. In this way you only need one interface (RS-232, for example) to the external PID controller, and you have full access in LabView to temperature readings as well as the ability to control setpoints and PID parameters in one unit.
Perhaps if you could elaborate on not just the system components as you've planned them at present, but the ultimate system functionality, it may be easier to suggest alternatives - not simply alternative hardware, but alternative system design altogether.
Edit:
Have a look at Omega CNi8C22-C24 and CNiS8C24-C24 units -> these are temperature and strain DIN PID units which will take inputs from your thermocouples and strain gauges, process the inputs into proper measurements, and communicate with LabView (or anything else) via RS-232.
This isn't necessarily a software answer, but if you want low-cost data acquisition, you might want to look at the LabJack. It's basically a microcontroller & USB interface wrapped in a nice box (like an Arduino (Atmel AVR + USB-serial converter), but closed source) with a lot of drivers and functions for various languages, including LabVIEW.
Reading a thermocouple can be tough because microvolts are significant, so you either need a high resolution A/D or an amplifier on the input. I think NI may sell a specialized digitizer for thermocouple readings, but again you'll pay.
As far as the software goes, LabVIEW will work nicely with almost any hardware you choose -- e.g. I built my own temperature controller based on an Arduino (with an AD7780), wrote a little interface using serial commands, and then talked with it using LabVIEW. But if you're willing to pay a premium for a guaranteed-to-work-out-of-the-box solution, you can't go wrong with LabVIEW and an NI part.
LabWindows/CVI is NI's C IDE, with good integration with their instrument libraries and drivers. If you're willing to write C code, maybe you could get by with the base version of LabWindows/CVI, versus having to buy a higher-end LabView version that has the functionality you need. LabWindows/CVI and LabView are priced identically for the base versions, though, so that may not be much of an advantage.
Given the range of measurement types you plan to make and the fact that you want colleagues to be able to use this, I would suggest LabVIEW is a good choice - it will support everything you want to do and make it straightforward to put a decent GUI on it. Assuming you're on Windows then the base package should be adequate and if you want to build stand-alone applications, either to deploy on other PCs or to make a particular setup as simple as possible for your colleagues, you can buy the application builder separately later.
As for the DAQ hardware, you can certainly save money - e.g. Measurement Computing have a low cost 8-channel USB thermocouple input device - but that may cost you in setup time or be less robust to repeated changes in your hardware configuration for different tests.
I've got a bit of experience with LabView stuff, and if you can afford it, it's awesome (and useful for a lot of different applications).
However, if your applications are simple, you might actually be able to hack something together with one or two Arduinos; the platform is OSS and has some good cheap hardware boards.
LabView really comes into its own with real-time applications or RAD (because GUI dev is super easy), so if all you're doing is monitoring a couple of thermocouples, I'd find something cheaper.
A few thousand dollars is not a lot of money for process monitoring and control systems. If you do a cost/benefit analysis, you will very quickly recover your development costs if the scope of the system is right and if it does the job it is intended to do.
Another tool to consider is National Instruments Measurement Studio with VB .NET. This way you can still use the NI hardware if you want, and can still build nice GUIs quickly.
Alternatively, as others have said, it is perfectly viable to get industrial serial based instruments and talk to them with LabVIEW, VB .NET, c# or whatever you like.
If you go down the route of serial instruments, another piece of hardware that might be useful is a serial terminal (example). These allow you to connect arbitrary numbers of devices to your network. You computers can then use them as though they were physical COM ports.
Have you looked at MATLAB? It has a toolbox called Data Acquisition, and CompactDAQ is supported hardware.
LabVIEW is a great visual programming environment if you want to drag, drop, and visualize your system. NI hardware also comes with the NI-DAQmx library, which can be accessed from your own code. A feasible solution for you would be to import the libraries into another programming language and write code for all the activities you would otherwise perform using LabVIEW. Though overheads like code optimization become the user's responsibility, you are free to tweak the normal method flow by introducing your own improvements at suitable junctures in the DAQ process.
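As a rough illustration of that approach, the sketch below reads one thermocouple sample through the NI-DAQmx C API. It is written from memory of the examples NI ships with the driver, so verify every function name, constant, and signature against those examples; the channel string "cDAQ1Mod1/ai0" is an assumption.

    #include <NIDAQmx.h>
    #include <stdio.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64 temperatureC = 0.0;
        int32 nRead = 0;

        DAQmxCreateTask("", &task);
        /* J-type thermocouple, built-in cold-junction compensation */
        DAQmxCreateAIThrmcplChan(task, "cDAQ1Mod1/ai0", "",
                                 0.0, 200.0, DAQmx_Val_DegC,
                                 DAQmx_Val_J_Type_TC, DAQmx_Val_BuiltIn,
                                 25.0, "");
        DAQmxStartTask(task);
        DAQmxReadAnalogF64(task, 1, 10.0, DAQmx_Val_GroupByChannel,
                           &temperatureC, 1, &nRead, NULL);
        printf("%.2f degC\n", temperatureC);
        DAQmxClearTask(task);
        return 0;
    }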

What microcontroller (and other components) would I need to create a timer device?

As a hobby project to keep myself out of trouble, I'd like to build a little programmable timer device. It will basically accept a program, which is a list of times, and then count down from each time.
I'd like to use a microcontroller programmable in C or Java. I have used BASIC in the past to make a little autonomous robot, so this time around I'd like something different.
What microcontroller and display would you recommend? I am looking to keep it simple, so the program would be loaded into memory via a computer (serial is OK, but USB would make it easier).
Just use a PIC like the 16F84 or 16F877 for this. It is more than enough.
For the display, use a 16x2 character LCD. It is easy to use and will give a nice look to your project.
The language doesn't really matter. You can use PIC C, Micro C, or anything you like. The LCD's interface is really easy to drive.
For other components, you will just need a crystal and two capacitors for the oscillator, plus a pull-up resistor. The rest of the components depend on the input method you are going to use to set the times.
If you are using a computer to load the list, then you will need an additional circuit to convert the signal levels; use a MAX232 to do that. If you want to use USB, you need to go ahead and use a PIC with USB support (the 18F series).
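A hedged sketch of the countdown core on such a PIC, assuming an XC8-style compiler and a 4 MHz crystal. Register and bit names vary between PIC families and compiler versions, so treat them as illustrative rather than exact.

    #include <xc.h>

    volatile unsigned int ms;          /* incremented by the ISR */

    void __interrupt() isr(void)
    {
        if (T0IF) {                    /* Timer0 overflow? */
            T0IF = 0;
            TMR0 = 6;                  /* 250 counts at 250 kHz = 1 ms */
            ms++;
        }
    }

    void main(void)
    {
        OPTION_REG = 0x01;             /* Timer0 from Fosc/4, 1:4 prescale */
        TMR0 = 6;
        T0IE = 1;                      /* enable Timer0 interrupt */
        GIE = 1;                       /* global interrupt enable */

        unsigned int remaining_s = 60; /* one entry from the program list */
        while (remaining_s) {
            while (ms < 1000);         /* wait out one second (note: this
                                          16-bit read is not atomic on an
                                          8-bit core; fine for a sketch) */
            ms = 0;
            remaining_s--;             /* update the LCD here */
        }
        /* interval elapsed: fire the output/alarm here */
    }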
sodoityourself.com has a set of nice tutorials you can use, and you can purchase the products from them as well; I have purchased from them once.
I would go with the MSP430. An eZ430 is $20, and you can get them at Digi-Key or from TI directly, with sets of 3 microcontroller boards for $10 after that. There is llvm and gcc (and binutils) compiler support. It is super simple to program, extremely small, and extremely low power.
There are many ways to do this, and a number of people have already given pretty good suggestions; AVR or PIC are good starting points for a microcontroller that doesn't require too much in the way of complicated setup (hardware & software) or expense (these micros are very cheap). Honestly, I'm somewhat surprised that nobody has mentioned Arduino here yet, which happens to have the advantage of being pretty easy to get started with, provides a USB connection (USB->Serial, really), and if you don't like the board that the ATmega MCU is plugged into, you can later plug the chip in wherever you might want it. Also, while the provided programming environment offers some high-level tools to easily prototype things, you're still free to tweak the registers on the device and write any C code you might want to run on it.
As for an LCD display to use, I would recommend looking for anything that's either based on an HD44780 or emulates the behavior of one. These will typically use a set of parallel lines for talking to the display, but there are tons of code examples for interfacing with them. In Arduino's case, you can find examples for this type of display, and many others, on the Arduino Playground here: http://www.arduino.cc/playground/Code/LCD
As far as a clock is concerned, you can use the built-in clock that many 8-bit micros these days provide, although they're not always ideal in terms of precision. You can find an example for Arduino on doing this sort of thing here: http://www.arduino.cc/playground/Code/DateTime. If you want something that might be a little more precise you can get a DS1307 (Arduino example: http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1191209057/0).
I don't necessarily mean to push you towards an Arduino, since there are a huge number of ways to do this sort of thing. Lately I've been working with 32-bit ARM micros (don't go that route first - much steeper learning curve - but they have many benefits) and I might use something in that ecosystem these days, but the Arduino is easy to recommend because it's relatively inexpensive, there's a large community of people using it, and chances are you can find a code example for at least part of what you're trying to do. When you need something with more horsepower, configuration options, or RAM, there are options out there.
Here are a few places where you can find some neat hardware (Arduino-related and otherwise) for projects like the one you're describing:
SparkFun Electronics
Adafruit Industries
DigiKey (this is a general electronics supplier, they have a bit of everything)
There are certainly tons more, though :-)
I agree with the other answers about using a PIC.
The PIC16F family does have C compilers available, though it is not ideally suited for C code. If performance is an issue, the 18F family would be better.
Note also that some PICs have internal RC oscillators. These aren't as precise as external crystals, but if that doesn't matter, then it's one less component (or three with its capacitors) to put on your board.
Microchip's ICD PIC programmer (for downloading and debugging your PIC software) plugs into the PC's USB port, and connects to the microcontroller via an RJ-11 connector.
Separately, if you want the software on the microcontroller to send data to the PC (e.g. to print messages in HyperTerminal), you can use a USB to RS232/TTL converter. One end goes into your PC's USB socket, and appears as a normal serial port; the other comes out to 5 V or 3.3 V signals that can be connected directly to your processor's UART, with no level-shifting required.
We've used the TTL-232R-3V3 from FTDI Chip, which works perfectly for this kind of application.
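On the microcontroller side, the transmit code for that arrangement is tiny. A hedged sketch for a PIC16F877 USART; the register names are from the 16F877 datasheet, and the SPBRG value assumes a 4 MHz clock with BRGH = 1.

    #include <xc.h>

    void uart_init(void)
    {
        SPBRG = 25;        /* ~9600 baud at 4 MHz with BRGH = 1 */
        TXSTA = 0x24;      /* TXEN = 1, BRGH = 1 */
        RCSTA = 0x80;      /* SPEN = 1: enable the serial port */
    }

    void uart_putc(char c)
    {
        while (!TXIF);     /* wait until the transmit register is free */
        TXREG = c;
    }

    void uart_puts(const char *s)
    {
        while (*s) uart_putc(*s++);   /* e.g. uart_puts("tick\r\n"); */
    }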
There are several ways to do this, and there is a lot of information on the net. If you are going to use micro controllers then you might need to invest in some programming equipment for them. This won't cost you much though.
The simplest way is to use the sine wave from the power grid. In Europe the AC power has a frequency of 50 Hz, and you can use that as the basis for your clock signal.
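A tiny hedged sketch of the idea: condition the mains signal (never connect it directly to a micro) so that an external interrupt fires once per cycle, then count cycles.

    volatile unsigned char cycles;
    volatile unsigned long seconds;

    /* Hook this to the external-interrupt vector; it is assumed to be
     * triggered once per mains cycle by the conditioning circuit. */
    void mains_tick_isr(void)
    {
        if (++cycles >= 50) {      /* 50 Hz grid; use 60 on a 60 Hz grid */
            cycles = 0;
            seconds++;             /* one second of wall-clock time */
        }
    }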
I've used Atmel's ATtiny and ATmega, which are great for programming simple and advanced projects. You can program it with C or Assembly, there are lots of great projects for it on the net, and the programmers available are very cheap.
Here is a project I found by Googling AVR 7 segment clock.
A second vote for PIC. Also, I recommend the magazine Circuit Cellar Ink. Some technical bookstores carry it, or you can subscribe: http://www.circellar.com/
The PIC series will be good. Since you are creating a timer, I recommend C or assembly (assembly is good), using MPLAB as the development environment. You can check how accurate your timer is with the 'Stopwatch' in MPLAB. Both the PIC16F877 and the PIC16F628 have a built-in hardware serial port, but the PIC16F877 has more ports. For more accurate timers, using a higher-frequency oscillator is recommended.

What skill set should a low-level programmer possess?

I am an embedded SW Engineer, with less than 3 yrs of experience. I aim to "sharpen the saw" continuously. I was wondering if there was anything specific to low level programming that C/C++ coders should be proficient with.
What comes to my mind is familiarity with the hardware's architecture and instruction set. Knowing how to fiddle with bits is also important, and resource management and performance have been part of my job. Is there anything else?
EDIT: I work with an in-house customized RTOS, not embedded Linux.
I see a lot of high-level operating system answers here, but you specifically said low-level.
Some scattered thoughts:
Design for test. As you work through a problem, only change one thing at a time per test.
You need to understand buses and interfaces: SPI, I2C, USB, Ethernet, etc. The number one interface, today, yesterday, and tomorrow: the UART, serial.
The steps involved in programming a flash.
Tricks to avoid making the product easily brickable.
Bootloaders in general.
Bit-banging the above interfaces on various families of parts (different chip vendors have different ideas about I/O pins, pull-ups, direction controls, etc.).
Board and chip bring-up: you certainly never want to boot a program of many tens of thousands of lines of code on the first power-up (think LED on, LED off).
How to debug a product without using too much test equipment (logic analyzers and scopes); at the same time, you have to learn to use a scope for debugging. You are far more valuable if you don't HAVE TO have a tech or engineer in the lab with you.
How would you reprogram the unit in the field? What would you do to minimize human error when allowing the user to field-upgrade the unit? Remember field downgrades as well.
What would you do to discourage hacking (binaries, etc.)?
Efficient use of the flash/ROM (don't wear out one bank or section; spread the wear around, or see if the flash is doing it for you).
How and when to use a watchdog timer.
State machines, very useful with byte streams (serial and Ethernet): design packet structures that stream well and are tailored to a state machine, and that have a header and checksum or other structure that ensures you do not interpret partial packets or random data as a good packet (see the sketch just below).
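Here is a hedged C sketch of such a state machine. The frame format -- [0xAA 0x55][length][payload][8-bit additive checksum] -- is invented for illustration; feed the function one byte at a time from wherever your bytes arrive, and partial or corrupt frames simply resynchronize.

    #include <stdint.h>

    #define MAX_PAYLOAD 32

    enum state { SYNC1, SYNC2, LEN, PAYLOAD, CSUM };

    void handle_packet(const uint8_t *p, uint8_t len);  /* your handler */

    void feed_byte(uint8_t b)
    {
        static enum state st = SYNC1;
        static uint8_t buf[MAX_PAYLOAD], len, pos, sum;

        switch (st) {
        case SYNC1: st = (b == 0xAA) ? SYNC2 : SYNC1;          break;
        case SYNC2: st = (b == 0x55) ? LEN   : SYNC1;          break;
        case LEN:
            if (b == 0 || b > MAX_PAYLOAD) { st = SYNC1; break; }
            len = b; pos = 0; sum = b; st = PAYLOAD;           break;
        case PAYLOAD:
            buf[pos++] = b; sum += b;
            if (pos == len) st = CSUM;                         break;
        case CSUM:
            if (b == sum) handle_packet(buf, len);  /* good frame */
            st = SYNC1;            /* either way, hunt for the next header */
            break;
        }
    }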
Specific concepts like:
Endianness (this link is to an old but good linuxjournal article)
Effective use of multithreading architectures (the Embedded site is good in general)
Debugging embedded and multithreaded systems
Understand, Learn and Follow good programming techniques (the link is very old and the point very generic and subjective, but think about it)
Other things (this IBM page on embedded linux sums up most of the other points I want to make)
One more thing -- never underestimate testing, or planning test cases!
Use the reference links I give as concepts, and please follow up further for deeper knowledge.
I'd study the electronics of the actual chips. Learn how they work internally (such as architecture), interface with peripherals, electrical and timing characteristics, etc.
Basically, read the data sheet start to finish a few times and dig into anything you've not seen/used before.
By the way, what chips do you work with?
Similar to what Brian said, learn how to create unit tests and automated builds.
These skills are good for software engineers at all levels to be proficient in. They will help improve the quality of your code while also making it easier to refactor and improve the code base.
If you haven't yet I think every Software Engineer should read The Pragmatic Programmer and Code Complete. I know these are not specific to low level programming, but have a large wealth of knowledge in them that applies to all sub disciplines.
Great familiarity with pointers, awareness of the checks these languages don't do for you (buffer overflows and the like), and digital electronics. Operating system internals might also help.
Get to know how things are represented internally, especially ready-made data structures (supposing you won't build your own).
Above all, practice a lot. Doing it brings much more to you than just reading about it ;)
bit operations
processor architectures (caches, etc)
WCET analysis
scheduling
Edit: What I forgot to mention is model based development.
Today, the control algorithms are often implemented as some kind of automaton from which C code is generated afterwards.
Commercially available tools are, for example, MATLAB/Simulink, ASCET, or SCADE.
Get yourself a copy of the MISRA-C book. It was originally written by members of the automotive industry, and attempts to make software written in C more robust by applying a number (quite a large number!) of rules and guidelines.
Then, buy PC-Lint (or another static analysis tool) to check your code for MISRA and other rules.
These are particularly relevant to low-level and embedded C, as between them they deal with the causes of a lot of bugs in such software, such as issues relating to pointers, memory leaks, integer promotion (there's a whole chapter on that in the MISRA book), endianness, and undefined behaviour.
Good question. Some that haven't been mentioned...
Learn your various options for achieving low-level multitasking, from a basic round-robin (non-preemptive) scheduler with timing ticks from a hardware timer, up to a preemptive RTOS. Learn why you might need an RTOS, and why you might not. If you use an RTOS, be aware that beginners with a PC background tend to want to create too many tasks.
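A hedged sketch of that basic non-preemptive round-robin, assuming a hardware timer ISR (not shown) that increments a tick counter; the task names are placeholders. Each task must return quickly, since nothing preempts it.

    #include <stdint.h>

    volatile uint32_t ticks;           /* incremented in the timer ISR */

    struct task {
        void (*run)(void);
        uint32_t period;               /* in ticks */
        uint32_t next;                 /* next release time */
    };

    void poll_sensors(void);           /* assumed task bodies */
    void update_display(void);

    struct task tasks[] = {
        { poll_sensors,   10, 0 },
        { update_display, 50, 0 },
    };

    void scheduler_loop(void)
    {
        const int n = sizeof tasks / sizeof tasks[0];
        for (;;) {
            for (int i = 0; i < n; i++) {
                if ((int32_t)(ticks - tasks[i].next) >= 0) {  /* wrap-safe */
                    tasks[i].next = ticks + tasks[i].period;
                    tasks[i].run();
                }
            }
        }
    }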
Getting visibility into the internals for debugging can be a challenge. There's no screen typically, so no throwing in "printf" calls wherever you want. An emulator or JTAG interface is ideal--you can set breakpoints and step through your program (as long as halting the micro doesn't make hardware go crazy, like swinging a robot arm around at full speed!). If emulator/JTAG is not available, learn how to use a spare serial port (or maybe even bit-bash a pin to make a serial port) for a debug channel, with some simple memory peek/poke commands.

Is low-level / embedded systems programming hard for software developers? [closed]

Given my background as a generalist, I can cover much of the area from analog electronics to writing simple applications that interface to a RDBMS backend.
I currently work in a company that develops hardware to solve industry-specific problems. We have an experienced programmer who has written business apps, video games, and a whole bunch of other stuff for PCs. But when I talk to him about doing low-level programming, he simultaneously expresses interest and also doubt/uncertainty about joining the project.
Even when talking about PCs, he seems more comfortable operating at the language level than with the lower-level stuff (instruction sets, ISRs). Still, he's a smart guy, and I think he'd enjoy the work once he got over the initial learning hump. But maybe that's my own enthusiasm for low-level stuff talking... If he were truly interested, maybe he would already have started learning in that direction?
Do you have experience in making that software-to-hardware (or low-level software) transition? Or, better yet, of taking a software only guy, and transitioning him to the low-level stuff?
Edit:
P.S. I'd love to hear from the responders what their own background is -- EE, CS, both?
At the end of the day, everything is an API.
Need to write code for an SPI peripheral inside a microcontroller? Well, get the datasheet or hardware manual, and look at the SPI peripheral. It's one, big, complex API.
The problem is that you have to understand the hardware and some basic EE fundamentals in order to comprehend what the API means. The datasheet isn't written by or for SW developers; it was written for hardware engineers, and maybe software engineers.
So it's all from the perspective of the hardware (face it: the microcontroller company is a hardware company filled with hardware/ASIC engineers).
Which means the transition is by no means simple and straightforward.
But it's not difficult - it's just a slightly different domain. If you can implement a study program, start off with Rabbit Semiconductor's kits. There's enough software there so a SW guy can really dig in with little effort, and the HW is easy to deal with because everything is wrapped in nice little libraries. When they want to do something complex they can dig into the direct hardware access and fiddle at the lower level, but at the same time they can do some pretty cool things such as build little webservers or pan/tilt network cameras. There are other companies with similar offerings, but Rabbit is really focused on making hardware easy for software engineers.
Alternately, get them into the Android platform. It looks like a unix system to them, until they want to do something interesting, and then they'll have the desire to attack that little issue and they'll learn about the hardware.
If you really want to jump in at the deep end, go with an Arduino kit - cheap, free compilers and libraries, pretty easy to start off with, but you have to hook wires up to do something interesting, which might be too big of a hurdle for a reluctant software engineer. But a little help and a few nudges in the right direction and they will be absolutely thrilled to have a little LED display that wibbles* like the Knight Rider lights...
-Adam
*Yes, that's a technical engineering term.
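For what it's worth, the wibble itself is about a dozen lines. A hedged C sketch; LED_PORT and delay_ms() stand in for whatever output register and delay routine the kit actually provides.

    #include <stdint.h>

    extern volatile uint8_t LED_PORT;  /* placeholder for the real GPIO register */
    void delay_ms(unsigned int ms);    /* assumed platform delay */

    /* Sweep one lit LED back and forth across eight outputs. */
    void wibble_forever(void)
    {
        uint8_t pos = 0;
        int8_t dir = 1;
        for (;;) {
            LED_PORT = (uint8_t)(1u << pos);
            delay_ms(80);
            if (pos == 7) dir = -1;    /* bounce at the ends */
            if (pos == 0) dir = 1;
            pos += dir;
        }
    }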
The best embedded programmers I've worked with are EE trained and learned SW on the job. The worst embedded developers are recent CS graduates who think SW is the only way to solve a problem. I like to think of embedded programming as the bottom of the SW pyramid. It's a stable abstraction layer/foundation that makes life easy for the app developers.
"Hard" is an extremely relative term. If you're used to thinking in the tight, sometimes convoluted way you need to for small embedded code (for example, you're a driver developer), then certainly it's not "hard".
Not to "bash" (no pun intended) shell scripters, but if you write perl and shell scripts all day, then it might very well be "hard".
Likewise if you're a UI guy for Windows. It's a different kind of thinking.
Why embedded development is "hard":
1) The context may switch to an interrupt between any two machine instructions. Since high-level language constructs may map to multiple assembly instructions, this might even happen within a single line of code, e.g. long var = 0xAAAA5555. If var is accessed in an interrupt service routine, on a 16-bit processor it might be only half set (see the sketch after this list).
2) Visibility into the system is limited. You may not even have output to Hyperterm unless you write it yourself. Emulators don't always work that well or consistently (though they are way better than they used to be). You will have to know how to use oscilloscopes and logic analyzers.
3) Operations take time. For example, say your serial transmitter uses an interrupt to signal when it is time to send another byte. You could write 16 bytes to a transmit buffer, then clear interrupts and wonder why your message is never sent. Timing in general is a tricky part of embedded programming.
4) You are subject to subtle race conditions that occur only rarely and are very difficult to debug.
5) You have to read the manual. A lot. You can't make it work by fooling around. Sometimes 20 things have to be set up correctly to get what you are after.
6) The hardware doesn't always work or is easy to damage, and it takes a while to figure out that you broke it.
7) Software repairs in embedded systems are usually very expensive. You can't just update a web page. A recall can erase any profit you made on the device.
There are probably more but I've got this race condition to solve...
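To make point 1 concrete, here is a hedged C sketch; ENTER_CRITICAL()/EXIT_CRITICAL() are placeholders for your platform's interrupt-disable/restore idiom.

    #include <stdint.h>

    void ENTER_CRITICAL(void);   /* e.g. disable interrupts, saving state */
    void EXIT_CRITICAL(void);    /* restore the previous interrupt state  */

    volatile uint32_t var;       /* shared with an interrupt service routine */

    void set_var_unsafe(void)
    {
        /* On a 16-bit CPU this is two 16-bit stores; an interrupt between
         * them sees one old half and one new half. */
        var = 0xAAAA5555UL;
    }

    void set_var_safe(void)
    {
        ENTER_CRITICAL();
        var = 0xAAAA5555UL;      /* ISR now sees the old or new value, never a mix */
        EXIT_CRITICAL();
    }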
This is very subjective I guess, his reasons could be many. But if he's like me, I know where he's coming from. Let me explain.
In my career I've dedicated 6 years to the telecom industry, working a lot on embedding SDK middleware into low-end mobile phones, etc.
Most embedded environments I've experienced are like harsh weather for a programmer: you constantly have to overcome limitations in resources, etc. Some might find this a challenge and enjoy it for the challenge itself; some might feel close to "the real stuff", the hardware; some might feel it limits their creativity.
I'm the kind who feels it limits my creativity.
I enjoy being back in the Windows desktop environment, where I can flap my wings with elaborate class designs, stretch my legs a few clock cycles extra, and use unnecessary amounts of memory for diagnostics, etc.
On certain embedded units in the past, I hardly had support for fseek() (an ANSI C standard file function). If I was lucky, a "watchdog" could give clues to where something crashed. Not to mention the pain of communicating with the user from single-threaded preemptive swamps.
Well, you know what I'm getting at. In my opinion it's not necessarily hard, but it's quite a leap, with potentially little reuse of your current experience.
Regards
Robert
There is a very real difference in mindset from user-level application development (ie, general purpose PC or Web applications) to hard deadline, real-time response application development (ie, the hardware/software interface).
Interrupts, instruction sets, context switching and hard resource constraints are relatively unknown to your average developer. I'm assuming here that your 'average developer' is not an Electrical/Electronic or other Engineer by training.
The transition for this developer you mention may be well outside his comfort zone. Some of us like stretching like that. Others of us may have decided the view isn't worth the climb.
Likewise, folks who've been in the hardware area (ie, Engineers) often have difficulty with the assumptions and language of software development.
These are gross generalities, of course, but hopefully give some insight.
He needs to be comfortable with the low-level stuff, but mostly for debugging and field issues. There is a serious learning curve depending on the architecture, but it's not impossible. On the other hand, low-level code takes (in general) more time and debugging than higher-level code, so if you need to be going back to the low level all the time, then perhaps something isn't right in the design. Even for the embedded controls I've built, I spend the vast majority of time in high-level code, although when you have issues, it is extremely advantageous to have very good low-level knowledge.
I am an EE turned Software Engineer. I prefer programming low level. Most classically trained software developers that I know do not want to operate at this level; they want APIs to call. So for me it is a win-win: I create the low-level driver and API for them to use. There is a "new" degree, at least new since I went to college, called Computer Engineering. Hmm, it might be an electrical engineering degree rather than computer science, but it is a nice mix of software and digital hardware basics. The individuals I have worked with from this field are much more comfortable with low-level work.
If the individual is not comfortable or willing, then place them somewhere they are comfortable. Let them do documentation or work on the user interface. If all of the work at the company requires low-level work, then this individual needs to do it or find another job. Don't sugar-coat it.
I also think they will enjoy it once they get over the hump: the freedom you have at that level, not hindered by operating systems, etc. Recently I witnessed a few co-workers seeing their software run under simulation for the first time, with every net within the processor and other on-chip peripherals visible. No, you don't have a table in a GUI (debugger) showing the current state of the memory; you have to look at the memory bus, find the address you are interested in, and look for a read or write signal and the data bus. I worry about the day the silicon arrives and they no longer have this level of visibility. It will be like an addict in detox.
Well, I cut my teeth on hardware when I started reading Popular Electronics at age 14 – this was BEFORE personal computers, in case you were wondering, and if you weren't, well, you know anyway. lol
I've done the low-level bit-bang stuff on the 8048/51 microprocessors, done PICs and some other single-chip variations, and of course Rabbit Semiconductor (great if you're into C). That's great (and fun) stuff. Yes, there is a different way of looking at things; not harder, but some of that information is a bit harder to come by, as it isn't discussed as much as the software issues. (Of course, this depends on the circle of friends with which you associate, eh?)
But, having said all of this, I want to remind you of a technology that started to bridge the gap for programmers into the world of hardware and has since become a very MAJOR player, and that is the .NET Micro Framework. You can find information on this technology at the following:
http://msdn.microsoft.com/en-us/embedded/bb267253.aspx
It addresses some of the same issues that .NET web development addressed in that you can use some (quite a bit, actually) of your existing PC based knowledge in the new environments – Some caution, of course, as your target machine doesn’t have 4 GIG of RAM – it may only have 64K (or less)
Starting in version 2.5 of the .NET micro framework, you have access to networking and web services – way kewl, eh? It doesn’t stop there … Want to control the lights in your house? How about a temp recording station? All with the skills you already have. Well, mostly -- Check out the link.
The SDK plugs into your Visual Studio IDE. There are a number of "Development Kits" available for a very reasonable amount of cash. Now, what would normally take a big learning curve in components, building a circuit board, and wiring up "stuff" can be done reasonably easily with a dev kit and some pretty simple code. Of course, you may need to do the occasional bit-bang operation, but more and more sensor folks are providing .NET Micro Framework drivers, so the hardware development may be closer than you think…
Hope it helps...
I like both. Embedded challenges me and really gets me going in a visceral way. Making something that affects the macro physical world is very satisfying. But I've had to do a lot of catch-up on the electrical/electronics end, since my bachelor's is in computer science. I've a pretty generalist background, where I studied AI, graphics, compilers, natural language, etc. Now I'm doing graduate work in embedded systems. The really tough part is adjusting to the lack of runtime facilities like an operating system.
Low-level embedded programming also tends to include low-level debugging. Which (in my experience) usually involves (at least) the use of an oscilloscope. Unless your colleague is going to be happy spending at least some of the time in physical contact with the hardware and thinking in terms of microseconds and volts, I'd be tempted to leave them be.
Agreed that "hard" is quite a relative term. I would say "different", as you need to employ development patterns that you won't use in other kinds of environments. The timing constraints, for instance, could require a learning curve. However, being curious would be a quality for a developer, wouldn't it?
You are right in that anyone with enough knowledge not to feel completely lost in an area (over the hump?) will enjoy the challenges of learning something new.
I myself would feel quite nervous moving to the level of instruction sets etc., as there is a huge amount of background knowledge needed to feel comfortable in that environment.
It may make a difference if you are able to support the developer in learning how to do this. Having someone there you can ask and talk through issues with is a huge help in that sort of domain change.
It may be worth having the developer assigned to a smaller project with others as a first step and see how that goes. If he expresses enthusiasm to try another project, things should flow on from there.
I would say it is not any harder, it just requires a different knowledge set, different considerations.
I think that it depends on the way that they program in their chosen environment, and the type of embedded work that you're talking about.
Working on an embedded linux platform, say, is a far smaller jump than trying to write code on an 8 bit platform with no operating system at all.
If they are the type of person that has an understanding of what is going on underneath the api and environment that they are used to, then it won't be too much of a stretch to move into embedded development.
However, if their world view stops at the high level api that they've been using, and they have no concept of anything beneath that, they are going to have a really hard time.
As a (very) general statement, if they are comfortable working on multithreaded applications they will probably be OK, as those share some of the same data-volatility issues you have when working on embedded projects.
With all of that said, I've seen more embedded programmers successfully working in PC development than I have the reverse. (of course I might not have seen a fair cross section)
"But when I talk to him about doing low-level programming, he simultaneously express interest and also doubt/uncertainty about joining the project." -- That means you let him try and you prepare to hire someone else in case he doesn't pass the learning curve.
I began as a SW engineer; I'm now a HW one! The important thing is to understand how it works and to be motivated!