What skill set should a low-level programmer possess? - embedded

I am an embedded SW Engineer, with less than 3 yrs of experience. I aim to "sharpen the saw" continuously. I was wondering if there was anything specific to low level programming that C/C++ coders should be proficient with.
What comes to my mind is familiarity with the hardware's architecture and instruction set. Knowing how to fiddle with bits is also important; resource management and performance have been part of my job. Is there anything else?
EDIT: I work with an in-house customized RTOS, not embedded Linux.

I see a lot of high-level operating system answers here, but you specifically said low-level.
Some scattered thoughts:
Design for test. As you work through a problem, only change one thing at a time per test.
You need to understand buses and interfaces: SPI, I2C, USB, Ethernet, etc. The number one interface, today, yesterday, and tomorrow: the UART, serial.
The steps involved in programming a flash.
Tricks to avoid making the product easily brickable.
Bootloaders in general.
Bit-banging the above-said interfaces on various families of parts (different chip vendors have different ideas about IO pins, pull-ups, direction controls, etc.).
Board and chip bring-up: you certainly never want to boot a many-tens-of-thousands-of-lines program on the first power-up (think LED on, LED off).
How to debug a product without using too much test equipment (logic analyzers and scopes); at the same time, you have to learn to use a scope for debugging. You are far more valuable if you don't HAVE TO have a tech or engineer in the lab with you.
How would you reprogram the unit in the field? What would you do to minimize human error when allowing the user to field-upgrade the unit? Remember field downgrades as well.
What would you do to discourage hacking (binaries, etc.)?
Efficient use of the flash/ROM (don't wear out one bank or section; spread the wear around, or see if the flash is doing it for you).
How and when to use a watchdog timer.
State machines, very useful with byte streams (serial and Ethernet). Design packet structures that stream well and are tailored to a state machine, with a header and checksum or other structure that ensures you do not interpret partial packets or random data as a good packet.
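To make that last point concrete, here is a minimal sketch of such a byte-at-a-time parser. The packet format (0xA5 sync byte, one length byte, an additive 8-bit checksum) is invented for illustration, not taken from any particular product:

    #include <stdint.h>

    /* Hypothetical packet: [0xA5 sync][len][payload...][csum],
       where csum is the 8-bit sum of len plus the payload bytes. */

    #define MAX_PAYLOAD 32

    enum state { WAIT_SYNC, WAIT_LEN, WAIT_PAYLOAD, WAIT_CSUM };

    static enum state st = WAIT_SYNC;
    static uint8_t len, idx, sum;
    static uint8_t payload[MAX_PAYLOAD];

    /* Feed every received byte (from the UART ISR or a polling loop).
       Returns 1 when a complete, verified packet sits in payload[]. */
    int parse_byte(uint8_t b)
    {
        switch (st) {
        case WAIT_SYNC:
            if (b == 0xA5) st = WAIT_LEN;
            break;
        case WAIT_LEN:
            if (b == 0 || b > MAX_PAYLOAD) { st = WAIT_SYNC; break; }
            len = b; idx = 0; sum = b;
            st = WAIT_PAYLOAD;
            break;
        case WAIT_PAYLOAD:
            payload[idx++] = b;
            sum += b;
            if (idx == len) st = WAIT_CSUM;
            break;
        case WAIT_CSUM:
            st = WAIT_SYNC;            /* resync either way        */
            return b == sum;           /* 1 = good packet received */
        }
        return 0;
    }

Because the parser consumes one byte at a time and drops back to hunting for the sync byte on any mismatch, partial packets and line noise are never mistaken for good data.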

Specific concepts like:
Endianness (this link is to an old but good linuxjournal article)
Effective use of multithreading architectures (the Embedded site is good in general)
Debugging embedded and multithreaded systems
Understand, Learn and Follow good programming techniques (the link is very old and the point very generic and subjective, but think about it)
Other things (this IBM page on embedded linux sums up most of the other points I want to make)
One more thing -- never underestimate testing, or planning test cases!
Use the reference links I give for the concepts; please follow up further for deeper knowledge.

I'd study the electronics of the actual chips. Learn how they work internally (such as architecture), interface with peripherals, electrical and timing characteristics, etc.
Basically, read the data sheet start to finish a few times and dig into anything you've not seen/used before.
By the way, what chips do you work with?

Similar to what Brian said, learn how to create unit tests and automated builds.
These skills are good for all levels of software engineers to be proficient in. They will help improve the quality of your code while also making it easier to refactor and improve the code base.

If you haven't yet I think every Software Engineer should read The Pragmatic Programmer and Code Complete. I know these are not specific to low level programming, but have a large wealth of knowledge in them that applies to all sub disciplines.

Great familiarity with pointers, with the checks these languages don't do for you (like buffer overflows and the like), and with digital electronics. Operating system internals might also help.
Get to know how things are represented internally, especially ready-made data structures (assuming you won't build your own).
Above all, practice a lot. Doing it brings much more to you than just reading about it ;)
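For instance, here is a minimal sketch of the kind of check C won't do for you (the function and the test string are made up for illustration):

    #include <stdio.h>

    /* Minimal sketch: C performs no bounds checking on array writes. */
    static void copy_name(char *dst, size_t dstlen, const char *src)
    {
        /* strcpy(dst, src); would silently overflow dst when src is
           longer than dstlen - 1 chars: no exception, no warning,
           just corrupted neighbouring memory.                        */

        /* A bounded copy truncates instead of overflowing. */
        snprintf(dst, dstlen, "%s", src);
    }

    int main(void)
    {
        char buf[8];
        copy_name(buf, sizeof buf, "this string is far too long");
        printf("%s\n", buf);   /* prints "this st" */
        return 0;
    }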

bit operations (see the sketch after this list)
processor architectures (caches, etc.)
WCET (worst-case execution time) analysis
scheduling
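As a small sketch of those bit operations on a memory-mapped register (the register address and bit positions are invented for illustration):

    #include <stdint.h>

    /* Hypothetical memory-mapped GPIO output register (made-up address). */
    #define GPIO_OUT (*(volatile uint32_t *)0x40020014u)

    #define BIT(n) (1u << (n))

    static inline void led_on(void)     { GPIO_OUT |=  BIT(5); }  /* set    */
    static inline void led_off(void)    { GPIO_OUT &= ~BIT(5); }  /* clear  */
    static inline void led_toggle(void) { GPIO_OUT ^=  BIT(5); }  /* toggle */
    static inline int  led_is_on(void)  { return (GPIO_OUT & BIT(5)) != 0; }

    /* Read-modify-write a multi-bit field: here, bits 10..8 = mode. */
    static inline void set_mode(uint32_t mode)
    {
        GPIO_OUT = (GPIO_OUT & ~(0x7u << 8)) | ((mode & 0x7u) << 8);
    }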
Edit: What I forgot to mention is model based development.
Today, the control algorithms are often implemented as some kind of automaton from which C code is generated afterwards.
Commercially available tools are, for example, MATLAB/Simulink, ASCET or SCADE.

Get yourself a copy of the MISRA-C book. It was originally written by members of the automotive industry, and attempts to make software written in C more robust by applying a number (quite a large number!) of rules and guidelines.
Then, buy PC-Lint (or another static analysis tool) to check your code for MISRA and other rules.
These are particularly relevant to low-level and embedded C, as between them they deal with the causes of a lot of bugs in such software, such as issues relating to pointers, memory leaks, integer promotion (there's a whole chapter on that in the MISRA book), endianness, and undefined behaviour.
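As one small illustration of the integer-promotion trap (this example is mine, not from the MISRA book):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint8_t a = 0xFF;

        /* 'a' is promoted to int before ~ is applied, so ~a is
           0xFFFFFF00 (i.e. -256 with a 32-bit int), not 0x00.   */
        if (~a == 0)
            puts("never printed");

        /* Casting back to the narrow type gives the intended result. */
        if ((uint8_t)~a == 0)
            puts("printed: ~0xFF is 0x00 in 8 bits");

        return 0;
    }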

Good question. Some that haven't been mentioned...
Learn your various options for achieving low-level multitasking, from basic round-robin (non-preemptive) schedulers with timing ticks from a hardware timer, up to a preemptive RTOS. Learn why you might need an RTOS, and why you might not. If you use an RTOS, be aware that beginners with a PC background tend to want to create too many tasks.
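A minimal sketch of such a basic round-robin (non-preemptive) scheduler, assuming a hardware timer ISR increments a millisecond tick; the tasks here are placeholders:

    #include <stdint.h>

    volatile uint32_t tick_ms;          /* incremented by a timer ISR */

    struct task {
        void   (*run)(void);            /* task body: must not block  */
        uint32_t period_ms;
        uint32_t next_due;
    };

    static void poll_buttons(void)   { /* ... */ }
    static void update_display(void) { /* ... */ }
    static void kick_watchdog(void)  { /* ... */ }

    static struct task tasks[] = {
        { poll_buttons,    10, 0 },
        { update_display, 100, 0 },
        { kick_watchdog,  500, 0 },
    };

    int main(void)
    {
        for (;;) {
            for (unsigned i = 0; i < sizeof tasks / sizeof tasks[0]; i++) {
                /* signed difference handles tick wrap-around */
                if ((int32_t)(tick_ms - tasks[i].next_due) >= 0) {
                    tasks[i].next_due = tick_ms + tasks[i].period_ms;
                    tasks[i].run();    /* runs to completion, no preemption */
                }
            }
        }
    }

Each task runs to completion, so the worst-case latency of the loop is the longest task body; that is exactly the trade-off a preemptive RTOS removes.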
Getting visibility into the internals for debugging can be a challenge. There's no screen typically, so no throwing in "printf" calls wherever you want. An emulator or JTAG interface is ideal--you can set breakpoints and step through your program (as long as halting the micro doesn't make hardware go crazy, like swinging a robot arm around at full speed!). If emulator/JTAG is not available, learn how to use a spare serial port (or maybe even bit-bash a pin to make a serial port) for a debug channel, with some simple memory peek/poke commands.
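And a sketch of that kind of serial peek/poke debug channel; uart_getc/uart_puts stand in for whatever UART driver your part provides, and the one-letter command protocol is invented:

    #include <stdint.h>

    /* Assumed to be provided by your UART driver. */
    extern int  uart_getc(void);
    extern void uart_puts(const char *s);

    /* Read lowercase hex digits until any other character arrives. */
    static uint32_t read_hex(void)
    {
        uint32_t v = 0;
        for (;;) {
            int c = uart_getc();
            if (c >= '0' && c <= '9')      v = (v << 4) | (uint32_t)(c - '0');
            else if (c >= 'a' && c <= 'f') v = (v << 4) | (uint32_t)(c - 'a' + 10);
            else return v;
        }
    }

    static void print_hex(uint32_t v)
    {
        char s[11] = "0x";
        for (int i = 0; i < 8; i++)
            s[2 + i] = "0123456789abcdef"[(v >> (28 - 4 * i)) & 0xF];
        s[10] = '\0';
        uart_puts(s);
    }

    void debug_monitor(void)
    {
        for (;;) {
            switch (uart_getc()) {
            case 'r':   /* r <addr> : peek one 32-bit word */
                print_hex(*(volatile uint32_t *)(uintptr_t)read_hex());
                uart_puts("\r\n");
                break;
            case 'w': { /* w <addr> <val> : poke one 32-bit word */
                volatile uint32_t *p =
                    (volatile uint32_t *)(uintptr_t)read_hex();
                *p = read_hex();
                break;
            }
            }
        }
    }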

What's the motivation in using Verilog or VHDL over C?

I come from a programming background and have not messed around much with hardware or firmware (at most a bit of electronics and an Arduino).
What is the motivation for using hardware description languages (HDLs) such as Verilog and VHDL over programming languages like C or some assembly?
Is this a matter of choice at all?
I read that hardware whose firmware is written in an HDL has a clear advantage in running instructions in parallel. However, I was surprised to see discussions expressing doubts over whether to write firmware in C or assembly (how is assembly appropriate if you don't necessarily have a CPU?), but I concluded it's also an option.
Therefore, I have a few questions (don't hesitate to explain anything):
Can firmware really be written either in an HDL or in a software programming language, or are these just different ways to perform the same mission? I'd love to see real-world examples. What constraints result from each option?
I know that a common use of firmware over software is in hardware accelerators (such as GPUs, network adapters, SSL accelerators, etc.). As I understand it, this acceleration is not always necessary, but only recommended (for example, in the case of SSL and of accelerating complex algorithms). Can one choose between firmware and software in all cases? If not, I'd be happy to see cases in which firmware is clearly and unequivocally appropriate.
I've read that firmware is mostly burned onto ROM or flash. How is it represented there? In bits, like software? If so, what's the profound difference? Is it the availability of adapted circuits in the case of firmware?
I guess I made a mistake here and there in some assumptions, please forgive me. Thank you!
The term "firmware" is at best ill defined, and I believe that is probably the cause of your confusion.
Historically - before the availability of programmable logic devices - the term "firmware" was used to refer to code stored in and executed from read-only memory (ROM). At a time when the only available ROM technology was mask ROM, where the code was burned into the device at manufacture of the silicon and was therefore unchangeable without replacing the chip, that was pretty "firm". Even with later programmable read-only memory (PROM), which could be programmed post-manufacture, the term still applied because the device was one-time programmable (OTP).
With the introduction of UV-erasable EPROM, firmware became perhaps less "firm", but the lack of in-circuit programmability and the need to expose the device to UV to erase it still made replacement of the embedded software a chore - normally requiring removal of the chip, placing it in the eraser for an hour or so, then programming it in a dedicated programmer.
With the advent of NOR flash memory, where code can be stored and executed directly from the device but also readily changed in-circuit, the term firmware in this context has become less common. However, it is still used (perhaps mainly by older practitioners) to refer to embedded software stored in and executed from a random-access, read-only memory device, as opposed to loaded into RAM from a file system.
The use of the term firmware to refer to programmable logic configuration is newer, and has probably come about simply because it is hardware whose configuration is written much like software, using a high-level language.
The upshot of this is that you do not choose
"Verilog and VHDL over programming languages like C or some Assembly"
because in each context the term firmware simply refers to a different concept.
It would be best to avoid the term firmware altogether as it means different things to different people or in different contexts.
There is perhaps some further confusion from the fact that some hardware description languages are based on software development languages - such as Handel-C, which is a C-like hardware description language.
This question would not have arisen much some time ago, but with current platforms you can now translate between C and HDL languages (instead of using the Handel-C extension of C from the '90s), mainly between C and behavioral VHDL. There are also many newer tools from vendors, such as the Xilinx Electronic System Level Design ecosystem, or Impulse-C (http://www.impulseaccelerated.com/products_universal.htm).
It is important to know, though, that C is a middle-level language, while VHDL is, as stated, a hardware description language. C can only express sequential instructions, while VHDL allows both sequential and concurrent execution.
And even though a C program can be successfully written with pure logical or algorithmic thinking, a successful VHDL programmer needs a thorough working knowledge of the hardware circuits, being able to predict how a given piece of code will be implemented in hardware.
In both languages you care about resource usage, but in different ways (unless you are programming for resource-constrained devices). When it comes to VHDL, apart from memory, the other logic elements of the FPGA (where you normally run the VHDL design) are limited as well.

LabVIEW + National Instruments hardware or ???

I'm in the process of buying a new data acquisition system for my company to use for various projects. At first, its primary purpose will be to monitor up to 20 thermocouples and control the temperature of a composites oven. However, I also plan on using it to monitor accelerometers, strain gauges, and to act as a signal generator.
I probably won't be the only one to use it, but I have a good bit of programming experience with Atmel microcontrollers (C). I've used LabVIEW before, but ~5 years ago. LabVIEW would be good because it is easy to pick up on for both me and my coworkers. On the flip side, it's expensive. Right now I have a NI CompactDAQ system with 2 voltage and one thermocouple cards + LabVIEW speced out and it's going to cost $5779!
I'm going to try to spec the same I/O capabilities with different NI hardware + LabVIEW to see if I can get it for less money. I'd also like to hear if anyone has suggestions other than LabVIEW.
Thanks in advance!
Welcome to test and measurement. It's pretty expensive for pre-built stuff, but you trade money for time.
You might check out the somewhat less expensive Agilent 34970A (and associated cards). It's a great workhorse for different kinds of sensing, and, if I recall correctly, it comes with some basic software.
For simple temperature control, you might consider a PID controller (Watlow or Omega used to be the brands, but it's been a few years since I've looked at them).
You also might look into the low-cost usb solutions from NI. The channel count is lower, but they're fairly inexpensive. They do still require software of some type, though.
There are also a fair number of good smaller companies (like Hytek Automation) that produce some types of measurement and control devices or sub-assemblies, but YMMV.
There's a lot of misconception about what will and will not work with LabView and what you do and do not need to build a decent system with it.
First off, as others have said, test and measurement is expensive. Regardless of what you end up doing, the system you describe IS going to cost thousands to build.
Second, you don't NEED to use NI hardware with LabView. For thermocouples your best bet is to look into multichannel or multiple single-channel thermocouple units - something that reads from a thermocouple and outputs to something like RS-232, etc. The OMEGABUS Digital Transmitters are an example, but many others exist.
In this way, you need only a breakout card with lots of RS-232 ports and you can grow your system as it needs it. You can still use labview to acquire the data via RS-232 and then display, log, process, etc, it however you like.
Third-party signal generators would also work, for example. You can pick up good ones (with a GPIB connection) reasonably cheaply, and with a GPIB board you can integrate one into LabView as well. That is, if you want something like a function generator (duty-cycled pulses, standard sine/triangle/ramp functions, etc.). If you're talking about arbitrary signal generation, then that remains a reasonably expensive thing to do (if $5000 is our goalpost for "expensive").
This also hinges on what you need the signal generation for - if you're thinking of control signals then, again, there may be cheaper and more robust options available. For temperature control, for example, separate hardware PID controllers are probably the best bet. This also takes care of your thermocouple problem, since PID controllers will typically accept thermocouple inputs as well. In this way you only need one interface (RS-232, for example) to the external PID controller, and you have total access in LabView to temperature readings as well as the ability to control setpoints and PID parameters in one unit.
Perhaps if you could elaborate on not just the system components as you've planned them at present, but the ultimate system functionality, it may be easier to suggest alternatives - not simply alternative hardware, but alternative system design altogether.
edit :
Have a look at Omega CNi8C22-C24 and CNiS8C24-C24 units -> these are temperature and strain DIN PID units which will take inputs from your thermocouples and strain gauges, process the inputs into proper measurements, and communicate with LabView (or anything else) via RS-232.
This isn't necessarily a software answer, but if you want low-cost data acquisition, you might want to look at the LabJack. It's basically a microcontroller + USB interface wrapped in a nice box (like an Arduino (Atmel AVR + USB-serial converter) but closed source) with a lot of drivers and functions for various languages, including LabVIEW.
Reading a thermocouple can be tough because microvolts are significant, so you either need a high resolution A/D or an amplifier on the input. I think NI may sell a specialized digitizer for thermocouple readings, but again you'll pay.
As far as the software answer, LabVIEW will work nicely with almost any hardware you choose -- e.g. I built my own temperature controller based on an Arduino (with an AD7780), wrote a little interface using serial commands, and then talked to it using LabVIEW. But if you're willing to pay a premium for a guaranteed-to-work-out-of-the-box solution, you can't go wrong with LabVIEW and an NI part.
LabWindows CVI is NI's C IDE, with good integration with their instrument libraries and drivers. If you're willing to write C code, maybe you could get by with the base version of LabWindows CVI, versus having to buy a higher-end LabView version that has the functionality you need. LabWindows CVI and LabView are priced identically for the base versions, so that may not be much of an advantage.
Given the range of measurement types you plan to make and the fact that you want colleagues to be able to use this, I would suggest LabVIEW is a good choice - it will support everything you want to do and make it straightforward to put a decent GUI on it. Assuming you're on Windows then the base package should be adequate and if you want to build stand-alone applications, either to deploy on other PCs or to make a particular setup as simple as possible for your colleagues, you can buy the application builder separately later.
As for the DAQ hardware, you can certainly save money - e.g. Measurement Computing have a low cost 8-channel USB thermocouple input device - but that may cost you in setup time or be less robust to repeated changes in your hardware configuration for different tests.
I've got a bit of experience with LabView stuff, and if you can afford it, it's awesome (and useful for a lot of different applications).
However, if your applications are simple you might actually be able to hack something together with one or two Arduinos; the platform is OSS, and has some good cheap hardware boards.
LabView really comes into its own with real time applications or RAD (because GUI dev is super easy), so if all you're doing is running a couple of thermopiles I'd find something cheaper.
A few thousand dollars is not a lot of money for process monitoring and control systems. If you do a cost/benefit analysis, you will very quickly recover your development costs if the scope of the system is right and if it does the job it is intended to do.
Another tool to consider is National Instruments Measurement Studio with VB .NET. This way you can still use the NI hardware if you want, and can still build nice GUIs quickly.
Alternatively, as others have said, it is perfectly viable to get industrial serial based instruments and talk to them with LabVIEW, VB .NET, c# or whatever you like.
If you go down the route of serial instruments, another piece of hardware that might be useful is a serial terminal (example). These allow you to connect arbitrary numbers of devices to your network. Your computers can then use them as though they were physical COM ports.
Have you looked at MATLAB? They have a toolbox called Data Acquisition, and CompactDAQ is supported hardware.
LabVIEW is a great visual programming environment in terms of dragging, dropping and visualizing our system. NI hardware also comes with the NI-DAQmx library, which can be accessed from our own code. A feasible solution for you would probably be to import the libraries into another programming language and write code for all the activities which you would otherwise perform using LabVIEW. Though other overheads like code optimization become the user's responsibility, you are free to tweak the normal method flow by introducing your own improvements at suitable junctures in the DAQ process.
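For example, a rough, untested sketch of reading a single voltage sample through the NI-DAQmx C API from your own C code; the channel string "cDAQ1Mod1/ai0" is a placeholder for your actual device and module, and error checking is omitted for brevity:

    /* Untested sketch; link against the NI-DAQmx C library. */
    #include <stdio.h>
    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64 sample = 0.0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "cDAQ1Mod1/ai0", "",
                                 DAQmx_Val_Cfg_Default, -10.0, 10.0,
                                 DAQmx_Val_Volts, NULL);
        DAQmxStartTask(task);
        DAQmxReadAnalogF64(task, 1, 10.0, DAQmx_Val_GroupByChannel,
                           &sample, 1, NULL, NULL);
        printf("read %f V\n", sample);

        DAQmxStopTask(task);
        DAQmxClearTask(task);
        return 0;
    }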

What are the prerequisites for learning embedded systems programming? [closed]

I have completed my degree in Computer Engineering. We had some basic electronics courses in Digital Signal Processing, Information Theory, etc. but my primary field is Programming.
However, I was looking to get into Embedded Systems Programming, and I have NO knowledge of how it is done, though I am very keen on going into this field.
My questions :
What are the languages used to program embedded systems?
Will I be able to learn without having any basics in electronics?
Any other prerequisites that I should know?
Without a doubt, experience or at least a significant understanding of digital electronics and low level computer engineering is required. You'll need to be able to read device datasheets and understand them. Scopes, multimeters, logic analyzers, etc... are tools of the trade.
C is used mostly, but higher level languages are sneaking in slowly.
Getting started in Embedded Systems is a complex task in itself, because it is a very vast field with numerous options in hardware and software.
What are the languages used to program embedded system programs?
Assembly Language, C, C++, Python, C# and others.
Will I be able to learn without having any basics in electronics?
Learning embedded systems without a basic knowledge of electronics would not be a good idea. Embedded systems are a mix of hardware and software. You can follow a learning-by-doing approach instead of going through lengthy and detailed textbooks. You can refer to this blog to learn embedded systems by doing practicals, step by step. It will help you get started from scratch.
Any other prerequisites that I should know?
Basic electronics, digital electronics, knowledge of microcontrollers and C programming. Since you are from a computer science background, you will need a development board for any 8-bit microcontroller to get started (students of EE and ECE have enough knowledge and background to build one on a breadboard or PCB). (Don't rely on simulators at the start; you might get your concepts wrong!)
You have to accept constraints and be able to work with them:
CPU speed
scarce memory
lack of networking facilities
custom compilers and OSes
custom motherboards and drivers
debugging with a logic analyzer
weird coding and testing practices
...
The reward is a deep understanding of what is going on.
VHDL, Verilog, and FPGAs are serious players in this arena as well. With a good background in CS, plenty of commitment, and maybe some MIT OpenCourseWare, you'll be able to pull off something good. A good knowledge of CPU architectures and some ASM will go a long way too.
I went into that field with no knowledge of how it was done, as a fresh graduate, and quit after 1.5 years. So what I say may be a little bit rusty, and definitely not comprehensive.
The language we were using was C. But at that time, the disc space was 4MB and memory was 8MB on the devices we were developing for, and I know that C was used due to its libraries' tiny footprint. Apparently, performance was a criterion as well.
As to basic electronics, for an entry level position almost none is necessary. You will gain the required information and experience with time.
Not prerequisites, but having experience in the operating system internals and system development is definitely a plus.
Embedded systems are generally programmed in C, although there are systems at the ends of the range which use assembler when code space or timing is really tight (or there is no decent C compiler available), and at the other end, C++ up to .NET compact. It depends on what you mean by embedded systems, they go from really small microcontrollers with a few hundred bytes of RAM and program space, up to the smartphone type of device running a full multitasking operating system and user interface.
You'll get further in the higher end of this range without a background in electronics, because it's less tied to the hardware and more similar to desktop development. As you go down the range of applications, knowledge of electronics, analogue and digital, plus power supplies, noise issues, compliance issues, heat issues and others all combine to make a really challenging design environment.
Start by reading some of the links here and embedded.com
The one thing that I have not seen mentioned in the answers so far is that up until now you have probably done most of your coding in the context of an operating system. In many (perhaps most?) cases, with firmware as opposed to software, you will not have the convenience and benefits of coding on top of an operating system. This is why so many of the other answers indicated that a good knowledge of electronics was critical.
As others mentioned, embedded can mean many things. In my world (Aerospace and Defense), we work with real-time operating systems (VxWorks and Integrity are the biggest players) and occasionally Linux. We program in C primarily, although C++ is also used as well if the project has decided to use Object Oriented Programming and Modeling.
So, as for the prerequisites, C for sure. You really need to learn all about C, including bitwise operations, dealing with hex values, pointers, all the low-level stuff. Assembly as well, but nowadays I only use it for debugging the hardest stuff. You need to know enough to read and understand it.
I think An Embedded Software Primer is a great start to change your thinking towards embedded. Handling interrupts, real-time issues, etc...
As Mickey mentioned, sometimes you don't even have an OS. In these cases, you usually write your own task manager of some sort, but that usually wouldn't be something for the newbie to start with. Good luck.
Languages: C, Assembler, Processing, Basic and a whole variety of others, it depends on what platform you're using as to what's available.
You might get more specific information if you ask the same question at ChipHacker or Electronics Exchange which are both stack exchange style sites (like this is) but geared to electronics and "physical computing".
You'll want to get pretty comfortable with C and build a solid understanding of assembly. In systems / embedded, usually you're working with small amounts of memory and slower processors, so you need to understand how to use limited resources wisely.
If you're getting into this as a hobby, pick up a Gumstix board or an Arduino; these dev boards will give you hundreds of hours of learning and fun.
If you're trying to make a career of this, find a job where the projects sound interesting and get your hands dirty. Take every task that comes your way and ask yourself how you can do better and learn from this task.
Either way, have fun and happy coding!
Learn C. Learn to apply C to all problems. Other languages can wait. Eventually assembly will help, and no programmer should be without the use of a scripting language.
Depending on what embedded targets you use there could be very little difference between a PC and your target. With little electronics background this would be your easiest entry.
Small processors will give you the steepest learning curve, but you will learn the most about embedded programming. However, with no electronics background this can present extra problems you might not have the skills to solve yet.
Eventually you will have to learn electronics if you want to make further progress beyond the basics.

How can I make my own microcontroller?

How can I make my own microcontroller? I've done some work using GAL chips and programmed a chip to do simple commands such as add, load, move, xor, and output, but I'd like to do something more like a real microcontroller.
How can I go about doing this? I've read a little bit about FPGA and CPLD, but not very much, and so was looking for some advice on what to get and how to start developing on it.
Look here for a good wiki book. I had some coursework I wrote when I was teaching Electronic Eng, but I couldn't find it around. When I was teaching, most of the students were happy to use the schematic capture tools in the Xilinx Foundation package. They've moved onto ISE and WebPACK now. You can download the WebPack for free, which is useful, and it has schematic capture and simulation in it.
If you really want to shine, learn VHDL or Verilog (VHDL seems to be more common where I've worked, but that is only a small smattering of places) and code the design rather than enter it through the GUI.
If you know ANYTHING at all about digital logic design (and some HDL), I reckon you can have a somewhat functional 8-bit microprocessor simulating in VHDL in about 2 days. You're not going to build anything blazingly fast or enormously powerful in that time, but it's a good starting point to grow from. If you have to learn about digital design, factor in a couple of days to learn how the tools work and simulate some basic logic circuits before moving onto the uP design.
Start learning the basics of digital systems, and how to build a binary adder. Move on to building an ALU to handle addition, subtraction, and, or, xor, etc and then a sequencer to read opcodes from RAM and supply them to the execution unit.
You can get fancy with instruction set design, but I'd recommend starting out REALLY simple until you have your head around what's going on, then throwing it out and starting again with something more complex.
Once you have the design simulating nicely you can gauge its complexity and purchase a device to suit. You should look at a development system for the device family you've chosen. Pick a device bigger than what you need for development because it's nice to be able to add extra instrumentation to debug it when it's running, and you almost certainly won't have optimized your design in the early stages of getting it on the device.
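It can also help to prototype the fetch/decode/execute idea as a plain C simulation before writing any HDL; here is a minimal sketch with a made-up accumulator instruction set:

    #include <stdint.h>
    #include <stdio.h>

    enum { OP_HALT, OP_LOAD, OP_ADD, OP_XOR, OP_OUT };

    int main(void)
    {
        /* program: load 5, add 7, output, halt */
        const uint8_t rom[] = { OP_LOAD, 5, OP_ADD, 7, OP_OUT, OP_HALT };
        uint8_t pc = 0, acc = 0;

        for (;;) {
            uint8_t op = rom[pc++];           /* fetch            */
            switch (op) {                     /* decode + execute */
            case OP_LOAD: acc  = rom[pc++]; break;
            case OP_ADD:  acc += rom[pc++]; break;
            case OP_XOR:  acc ^= rom[pc++]; break;
            case OP_OUT:  printf("%u\n", acc); break;
            case OP_HALT: return 0;
            }
        }
    }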
EDIT: Colin Mackenzie has a good tutorial about uC design and some FPGA boards as well as a bit of other stuff.
You may want to have a look around OpenCores.org, a "forge" site for open source IP core development. Also, consider getting yourself a development board like one of these to play around with.
Much of the tools ecosystem revolves around VHDL, although Avalda is working on tools to compile F# for FPGAs.
I saw a textbook once that stepped through building a machine from TTL chips. This had the same instruction set as a PDP-8, which is very - and I mean very - simple, so the actual machine architecture is easy to implement in this way.
The PDP-8 FAQ mentions a book: "The Art of Digital Design," second edition, by Franklin Prosser and David Winkel (Prentice-Hall, 1987, ISBN 0-13-046780-4). It also mentions people implementing it in FPGAs.
Given the extreme simplicity of this CPU architecture and availability of PDP-8 code or reference implementations it might be a good starting point to warm up with.
Alternatively, an acquaintance of mine implemented a Thumb (cut-down ARM) on an FPGA as a university project run by one Steve Furber (a prominent Acorn alumnus). Given that this could be compressed into a format small enough for a university project, it might also be a good start.
To play with soft-core microprocessors, I like the Spartan 3 Starter Board from Digilent just because it has 1M of static RAM. SDRAM and DDR RAM are harder to get going, you know.
The leds, switches and a simple serial interface are a plus to debug and communicate.
As someone already pointed out, OpenCores.org is a good place to find working examples. I used the Plasma uC to write some papers while on university.
A microcontroller can be as simple as a ROM (instruction * 2^x + (clock phase) is the address, the outputs are the control signals, and you're good to go). Or it can be a complex hairy beast with three arms and branch-prediction support hardware.
Can you give more details about your aspirations?
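Here is a small C sketch of that ROM-as-sequencer idea; the opcodes, control-signal bits and two-phase clock are all invented for illustration:

    #include <stdint.h>

    /* Control word bits (hypothetical). */
    #define CTRL_MEM_RD  0x01u
    #define CTRL_ALU_ADD 0x02u
    #define CTRL_REG_WR  0x04u

    #define PHASES 2u   /* two clock phases per instruction */

    /* address = opcode * PHASES + phase; each entry is one control word */
    static const uint8_t control_rom[] = {
        /* opcode 0 (LOAD), phases 0/1: */ CTRL_MEM_RD, CTRL_REG_WR,
        /* opcode 1 (ADD),  phases 0/1: */ CTRL_MEM_RD, CTRL_ALU_ADD | CTRL_REG_WR,
    };

    uint8_t control_signals(uint8_t opcode, uint8_t phase)
    {
        return control_rom[opcode * PHASES + phase];
    }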
After searching some very helpful links by all of you, I came across this Wikiversity course.
One of the first sentences is, "Have you ever thought to build your own microprocessor?"
Xilinx has a MicroBlaze and a PicoBlaze soft controller for its FPGAs. The latter is free, while, IIRC, the Microblaze is to be paid for.
As its name suggests the PicoBlaze is a small processor, which has its limitations, but OTOH is compact enough to run on a CPLD. Anyway a nice processor to get you started.
Pablo Bleyer has a PicoBlaze-compatible PacoBlaze. PacoBlaze was written in Verilog (which, as Adam said, is less common than VHDL).
You need a big FPGA for a little MCU.
You need an FPGA with the correct hardware blocks if you need things like A/D.
You need a soft core to put into the FPGA.
But how about just playing around with a normal MCU before this project, so you kind of know where you are going? How about some AVRs from Atmel?
You can get free samples of PIC microcontrollers at this site. Last I knew, you didn't even have to pay shipping.
http://www.microchip.com/stellent/idcplg?IdcService=SS_GET_PAGE&nodeId=64

Is low-level / embedded systems programming hard for software developers? [closed]

Given my background as a generalist, I can cover much of the area from analog electronics to writing simple applications that interface to a RDBMS backend.
I currently work at a company that develops hardware to solve industry-specific problems. We have an experienced programmer who has written business apps, video games, and a whole bunch of other stuff for PCs. But when I talk to him about doing low-level programming, he simultaneously expresses interest and also doubt/uncertainty about joining the project.
Even when talking about PCs, he seems to be more comfortable operating at the language level than with the lower-level stuff (instruction sets, ISRs). Still, he's a smart guy, and I think he'd enjoy the work once he is over the initial learning hump. But maybe that's my own enthusiasm for low-level stuff talking... If he were truly interested, maybe he would already have started learning in that direction?
Do you have experience in making that software-to-hardware (or low-level software) transition? Or, better yet, of taking a software only guy, and transitioning him to the low-level stuff?
Edit:
P.S. I'd love to hear from the responders what their own background is -- EE, CS, both?
At the end of the day, everything is an API.
Need to write code for an SPI peripheral inside a microcontroller? Well, get the datasheet or hardware manual, and look at the SPI peripheral. It's one, big, complex API.
The problem is that you have to understand the hardware and some basic EE fundamentals in order to comprehend what the API means. The datasheet isn't written by or for SW developers; it was written by hardware engineers, for hardware engineers, and maybe software engineers.
So it's all from the perspective of the hardware (face it - the microcontroller company is a hardware company filled with hardware/asic engineers).
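For example, here is a hedged sketch of what "the datasheet is the API" looks like in code. Every register address and bit position below is invented; a real part's datasheet defines its own:

    #include <stdint.h>

    /* Invented register map - a real datasheet defines the actual one. */
    #define SPI_CR (*(volatile uint32_t *)0x40013000u)  /* control  */
    #define SPI_SR (*(volatile uint32_t *)0x40013008u)  /* status   */
    #define SPI_DR (*(volatile uint32_t *)0x4001300Cu)  /* data     */

    #define SPI_CR_ENABLE (1u << 6)
    #define SPI_CR_MASTER (1u << 2)
    #define SPI_SR_TXE    (1u << 1)   /* transmit buffer empty    */
    #define SPI_SR_RXNE   (1u << 0)   /* receive buffer not empty */

    void spi_init(void)
    {
        SPI_CR = SPI_CR_MASTER | SPI_CR_ENABLE;
    }

    uint8_t spi_transfer(uint8_t out)
    {
        while (!(SPI_SR & SPI_SR_TXE))   /* wait for room in TX buffer  */
            ;
        SPI_DR = out;                    /* clocking starts now         */
        while (!(SPI_SR & SPI_SR_RXNE))  /* full duplex: a byte returns */
            ;
        return (uint8_t)SPI_DR;
    }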
Which means the transition is by no means simple and straightforward.
But it's not difficult - it's just a slightly different domain. If you can implement a study program, start off with Rabbit Semiconductor's kits. There's enough software there so a SW guy can really dig in with little effort, and the HW is easy to deal with because everything is wrapped in nice little libraries. When they want to do something complex they can dig into the direct hardware access and fiddle at the lower level, but at the same time they can do some pretty cool things such as build little webservers or pan/tilt network cameras. There are other companies with similar offerings, but Rabbit is really focused on making hardware easy for software engineers.
Alternately, get them into the Android platform. It looks like a unix system to them, until they want to do something interesting, and then they'll have the desire to attack that little issue and they'll learn about the hardware.
If you really want to jump in the deep end, go with an Arduino kit - cheap, free compilers and libraries, pretty easy to start off with, but you have to hook wires up to do something interesting, which might be too big of a hurdle for a reluctant software engineer. But a little help and a few nudges in the right direction and they will be absolutely thrilled to have a little LED display that wibbles* like the Knight Rider lights...
-Adam
*Yes, that's a technical engineering term.
The best embedded programmers I've worked with are EE trained and learned SW on the job. The worst embedded developers are recent CS graduates who think SW is the only way to solve a problem. I like to think of embedded programming as the bottom of the SW pyramid. It's a stable abstraction layer/foundation that makes life easy for the app developers.
"Hard" is an extremely relative term. If you're used to thinking in the tight, sometimes convoluted way you need to for small embedded code (for example, you're a driver developer), then certainly it's not "hard".
Not to "bash" (no pun intended) shell scripters, but if you write perl and shell scripts all day, then it might very well be "hard".
Likewise if you're a UI guy for Windows. It's a different kind of thinking.
Why embedded development is "hard":
1) The context may switch to an interrupt between each machine instruction. Since high-level language constructs may map to multiple assembly instructions, this might even happen within a line of code, e.g. long var = 0xAAAA5555. If var is accessed in an interrupt service routine, on a 16-bit processor it might only be half set (see the sketch at the end of this answer).
2) Visibility into the system is limited. You may not even have output to Hyperterm unless you write it yourself. Emulators don't always work that well or consistently (though they are way better than they used to be). You will have to know how to use oscilloscopes and logic analyzers.
3) Operations take time. For example, say your serial transmitter uses an interrupt to signal when it is time to send another byte. You could write 16 bytes to a transmit buffer, then clear interrupts and wonder why your message is never sent. Timing in general is a tricky part of embedded programming.
4) You are subject to subtle race conditions that occur only rarely and are very difficult to debug.
5) You have to read the manual. A lot. You can't make it work by fooling around. Sometimes 20 things have to be set up correctly to get what you are after.
6) The hardware doesn't always work or is easy to damage, and it takes a while to figure out that you broke it.
7) Software repairs in embedded systems are usually very expensive. You can't just update a web page. A recall can erase any profit you made on the device.
There are probably more but I've got this race condition to solve...
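To put point 1 in code: on a 16-bit part a 32-bit store takes two instructions, so an ISR can observe it half-written. The interrupt enable/disable calls below are placeholders for whatever intrinsics your toolchain provides:

    #include <stdint.h>

    /* Placeholders: substitute your toolchain's interrupt intrinsics. */
    static void disable_interrupts(void) { /* e.g. __disable_irq() */ }
    static void enable_interrupts(void)  { /* e.g. __enable_irq()  */ }

    volatile uint32_t shared;   /* written in main code, read in an ISR */

    void some_isr(void)         /* may fire between the two halves      */
    {
        uint32_t snapshot = shared;   /* could see 0xAAAA0000, etc.     */
        (void)snapshot;
    }

    void update(void)
    {
        disable_interrupts();   /* make the two-instruction store atomic */
        shared = 0xAAAA5555u;
        enable_interrupts();
    }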
This is very subjective I guess, his reasons could be many. But if he's like me, I know where he's coming from. Let me explain.
In my career I've dedicated 6 years to the telecom industry, working a lot with embedding SDK middleware into low-end mobile phones etc.
Most embedded environments I've experienced are like harsh weather for a programmer, you constantly have to overcome limitations in resources etc. Some might find this a challenge and enjoy it for the challenge itself, some might feel close to "the real stuff" - the hardware, some might feel it limits their creativity.
I'm the kind who feels it limits my creativity.
I enjoy being back in the Windows desktop environment, where I can flap my wings with elaborate class designs, stretch my legs a few clock cycles extra, use unnecessary amounts of memory for diagnostics, etc.
On certain embedded units in the past, I hardly had support for fseek() (an ANSI C standard file function). If lucky, a "watchdog" could give clues to where something crashed. Not to mention the pain of communicating with the user in single-threaded preemptive swamps.
Well, you know what I'm getting at. In my opinion it's not necessarily hard, but it's quite a leap, with potentially little reuse of your current experience.
Regards
Robert
There is a very real difference in mindset from user-level application development (ie, general purpose PC or Web applications) to hard deadline, real-time response application development (ie, the hardware/software interface).
Interrupts, instruction sets, context switching and hard resource constraints are relatively unknown to your average developer. I'm assuming here that your 'average developer' is not an Electrical/Electronic or other Engineer by training.
The transition for this developer you mention may be well outside his comfort zone. Some of us like stretching like that. Others of us may have decided the view isn't worth the climb.
Likewise, folks who've been in the hardware area (ie, Engineers) often have difficulty with the assumptions and language of software development.
These are gross generalities, of course, but hopefully give some insight.
He needs to be comfortable with the low-level stuff, but mostly for debugging and field issues. There is a serious learning curve depending on the architecture, but not impossible. On the other hand, the low-level code takes (in general) more time and debugging than higher-level code. So if you need to be going back to low-level all the time, then perhaps something isn't right in the design. Even for the embedded controls I've built, I spend the vast majority of time in high-level code. Although when you have issues, it is extremely advantageous to have a very good low-level knowledge.
I am an EE turned software engineer. I prefer programming low level. Most classically trained software developers that I know do not want to operate at this level; they want APIs to call. So for me it is a win-win: I create the low-level driver and the API for them to use. There is a "new" degree, at least new since I went to college, called Computer Engineering. Hmm, it might be an electrical engineering degree rather than computer science, but it is a nice mix of software and digital hardware basics. The individuals that I have worked with from this field are much more comfortable with low level.
If the individual is not comfortable or willing, then place them somewhere they are comfortable. Let them do documentation or work on the user interface. If all of the work at the company requires low-level work, then this individual needs to do it or find another job. Don't sugar-coat it.
I also think they will enjoy it once they get over the hump - the freedom you have at that level, not hindered by operating systems, etc. Recently I witnessed a few co-workers' experience of seeing their software run under simulation for the first time: every net within the processor and other on-chip peripherals. No, you don't have a table in a GUI (debugger) showing the current state of the memory; you have to look at the memory bus, look for the address you are interested in, and look for a read or write signal and the data bus. I worry about the day the silicon arrives and they no longer have this level of visibility. It will be like an addict in detox.
Well, I cut my teeth on hardware when I started reading Popular Electronics at age 14 – this was BEFORE personal computers, in case you were wondering and if you weren’t well, you know anyway. lol
I've done the low-level bit-bang stuff on the 8048/51 microprocessors, done PICs and some other single-chip variations, and of course Rabbit Semiconductor (great if you're into C). That's great (and fun) stuff; yes, there is a different way of looking at things - not harder, but some of that information is a bit harder to come by, as it isn't as discussed as the software issues. (Of course, this depends on the circle of friends with which you associate, eh?)
But, having said all of this, I want to remind you of a technology that started to bridge the gap for programmers into the world of hardware and has since become a very MAJOR player and that is the .NET micro framework. You can find information on this technology at the following;
http://msdn.microsoft.com/en-us/embedded/bb267253.aspx
It addresses some of the same issues that .NET web development addressed in that you can use some (quite a bit, actually) of your existing PC based knowledge in the new environments – Some caution, of course, as your target machine doesn’t have 4 GIG of RAM – it may only have 64K (or less)
Starting in version 2.5 of the .NET micro framework, you have access to networking and web services – way kewl, eh? It doesn’t stop there … Want to control the lights in your house? How about a temp recording station? All with the skills you already have. Well, mostly -- Check out the link.
The SDK plugs into your VisualStudio IDE. There are a number of “Development Kits” available for a very reasonable amount of cash – Now, what would normally take a big learning curve in components, building a circuit board and wiring up “stuff” can be done reasonably easy with a dev kit and some pretty simple code – Of course, you may need to do the occasional bit bang operation, but more and more sensor folks are providing .NET micro framework drivers – so, the hardware development may be closer than you think…
Hope it helps...
I like both. Embedded challenges me and really gets me going in a visceral way. Making something that affects the macro physical world is very satisfying. But I've had to do a lot of catch-up on the electrical/electronics end, since my bachelor's is in computer science. I have a pretty generalist background, where I studied AI, graphics, compilers, natural language, etc. Now I'm doing graduate work in embedded systems. The really tough part is adjusting to the lack of runtime facilities like an operating system.
Low-level embedded programming also tends to include low-level debugging. Which (in my experience) usually involves (at least) the use of an oscilloscope. Unless your colleague is going to be happy spending at least some of the time in physical contact with the hardware and thinking in terms of microseconds and volts, I'd be tempted to leave them be.
Agreed that the term "hard" is quite relative.
I would say it is different, as you need to employ development patterns that you won't use in other kinds of environments.
The time constraints, for instance, could require a learning curve.
However, being curious would be a quality for a developer, wouldn't it?
You are right in that anyone with enough knowledge not to feel completely lost in an area (over the hump?) will enjoy the challenges of learning something new.
I myself would feel quite nervous moving to the level of instruction sets etc., as there is a huge amount of background knowledge needed to feel comfortable in that environment.
It may make a difference if you are able to support the developer in learning how to do this. Having someone there you can ask and talk through issues with is a huge help in that sort of domain change.
It may be worth having the developer assigned to a smaller project with others as a first step and see how that goes. If he expresses enthusiasm to try another project, things should flow on from there.
I would say it is not any harder, it just requires a different knowledge set, different considerations.
I think that it depends on the way that they program in their chosen environment, and the type of embedded work that you're talking about.
Working on an embedded Linux platform, say, is a far smaller jump than trying to write code on an 8-bit platform with no operating system at all.
If they are the type of person that has an understanding of what is going on underneath the api and environment that they are used to, then it won't be too much of a stretch to move into embedded development.
However, if their world view stops at the high level api that they've been using, and they have no concept of anything beneath that, they are going to have a really hard time.
As a (very) general statement if they are comfortable working on multithreaded applications they will probably be ok, as that shares some of the same issues of data volatility that you have when working on embedded projects.
With all of that said, I've seen more embedded programmers successfully working in PC development than I have the reverse. (of course I might not have seen a fair cross section)
"But when I talk to him about doing low-level programming, he simultaneously express interest and also doubt/uncertainty about joining the project." -- That means you let him try and you prepare to hire someone else in case he doesn't pass the learning curve.
I began as a SW engineer; I'm now a HW one!
The important thing is to understand how it works and to be motivated!