Skills needed in 8-bit, 16-bit, 32-bit - embedded

Are there any specific skillsets required with 8-bit, 16-bit and 32-bit processing for embedded developers?

Yes, there are specific skills expected, and there are real differences between 8-bit and 32-bit processors. (Ignoring 16-bit, since there are so few of them available.)
8-bit processors and tools are vastly different from the 32-bit variants (even excluding Linux-based systems):
Processor architecture
Memory availability
Peripheral complexity
An 8051 is a strange beast, and plopping your average CS graduate in front of one and asking them to make a product is asking for something that only mostly works. Its multiple memory spaces, lack of a usable stack, constrained register file, and constrained memory really make "modern" computer science difficult.
Even an AVR, which is less of a strange beast, still has constraints that a 32-bit processor just doesn't have, particularly memory.
And all of these are very different from writing code on an embedded Linux platform.
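To make the 8051's fragmented memory model concrete, here is a minimal, hypothetical sketch using SDCC-style address-space keywords (Keil C51 uses similar keywords without the leading underscores); the exact qualifiers are an assumption about the toolchain:

#include <stdint.h>

/* Each object must be placed in one of several distinct address spaces,
   and the compiler emits different instructions for each. */
__data  uint8_t fast_counter;        /* directly addressable internal RAM (128 bytes) */
__idata uint8_t scratch[32];         /* indirectly addressable internal RAM */
__xdata uint8_t frame_buffer[512];   /* external RAM, reached via slower MOVX */
__code  const uint8_t squares[16] =  /* constants stored in program ROM/flash */
    { 0, 1, 4, 9, 16, 25, 36, 49, 64, 81, 100, 121, 144, 169, 196, 225 };

uint8_t read_square(uint8_t i)
{
    /* A pointer that can reach *any* space needs a run-time space tag
       (a "generic" pointer), which costs code size and cycles. */
    return squares[i & 0x0F];
}

Nothing like this exists on a flat 32-bit memory map, which is much of what makes "modern" computer science awkward on the part.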

In general, processors and microcontrollers using a 32-bit architecture tend to be more complex and are used in more complex applications. As such, someone with only 8-bit device experience may not possess the skills or experience necessary for more complex projects.
So it is not specifically the bit-width that is the issue; it is used simply as a shorthand or proxy for the complexity of systems. It is a very crude measure in any event, since architectures differ widely even within a bit-width classification; AVR, PIC and x51, for example, are very different, as are 68K, ARM and x86. Even within the ARM family, a Cortex-M device is very different from an A-class device.
Beware of any job spec that uses such broad skill classifications - something for you to challenge perhaps in the interview.

Related

How do embedded board developers decide on what endianness they want to support?

The ARM core supports both little endian and big endian. However, when manufacturers design microcontrollers, i.e. when they take an ARM processor and add peripherals to it, they support either big endian or little endian. So my question is: how do manufacturers like ST (of the STM32) and TI decide whether they want to support little or big endian? As far as I understand, the ARM processor itself supports both little endian and big endian.
It's completely subjective.
The terms Big Endian and Little Endian are taken from the book Gulliver's Travels, where two nations fight a fierce, bloody war over the disagreement about whether one should break an egg at the "big" end or the "little" end. That is, they were fighting over something completely pointless.
In the computer world in the 1970s-80s, the Big Endian camp consisted prominently of Motorola and IBM, and the Little Endian camp consisted prominently of Intel. All other manufacturers had to pick either side.
So mostly it is picked by tradition.
Regarding ARM specifically, all ARM Cortex parts are in practice little endian. Even Freescale, formerly Motorola, picked little endian for their Kinetis family. There are, however, various other 32-bit architectures that use big endian, including, I believe, some pre-Cortex ARMs.
Importantly, "network endianess" is almost always Big Endian, also out of tradition. But this has an actual objective and practical reason though, namely CRC calculations. In order to create a CRC calculator in pure digital logic with XOR gates, the data must be transmitted MS byte first. It is nowadays rare to implement CRC using digital gates, but that's the historical reason.
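To illustrate the shift-register-and-XOR structure the hardware uses, here is a minimal C sketch of a bitwise CRC; CRC-16-CCITT (polynomial 0x1021, initial value 0xFFFF) is my assumed choice of parameters, and the data is consumed most significant bit first, matching the transmission order described above:

#include <stddef.h>
#include <stdint.h>

uint16_t crc16_ccitt(const uint8_t *data, size_t len)
{
    uint16_t crc = 0xFFFF;                 /* common initial value */
    while (len--) {
        crc ^= (uint16_t)(*data++) << 8;   /* feed next byte, MSB-aligned */
        for (int i = 0; i < 8; i++)        /* one shift/XOR per bit, like gates */
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                 : (uint16_t)(crc << 1);
    }
    return crc;
}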
The software ecosystem for ARM Cortex MCUs is more or less exclusively little endian. It would be unlikely that anyone would choose to use big endian exclusively, or even mixed endian, without a very good application-specific reason. So the choice is normally simple, and I doubt the designers think about it at all.
Unlike other ARM implementations, Cortex-M MCUs do not support "on-the-fly" changing of the endianness; the choice of endianness is fixed by the silicon vendor. All popular (and maybe even non-popular) Cortex-M MCUs implement little endian, so that's the practical answer.
Little endian makes it easier to map a stream of bytes/ASCII characters into memory. So maybe that's one reason why it is more popular.
The choice of endianness is pretty immaterial. If you are writing standalone code in a high level language that does not exchange data with anyone else, then it really does not matter which endianness you choose. Even if you have to exchange data, then usually you would encapsulate the data access in some low level functions so by and large the endianness is still not too impactful.
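As a concrete illustration, here is a minimal C sketch (hosted, for demonstration; the helper names are my own invention) that detects the target's byte order at run time and shows the kind of small serialization helper behind which data-exchange code usually hides endianness:

#include <stdint.h>
#include <string.h>
#include <stdio.h>

int is_little_endian(void)
{
    const uint32_t probe = 0x01020304u;
    uint8_t first;
    memcpy(&first, &probe, 1);     /* read the byte at the lowest address */
    return first == 0x04;          /* 0x04 is the least significant byte */
}

/* Serialize MS byte first ("network order") so the on-wire layout is
   the same whatever the host's endianness. */
void put_be32(uint8_t out[4], uint32_t v)
{
    out[0] = (uint8_t)(v >> 24);
    out[1] = (uint8_t)(v >> 16);
    out[2] = (uint8_t)(v >> 8);
    out[3] = (uint8_t)(v);
}

int main(void)
{
    printf("%s endian\n", is_little_endian() ? "little" : "big");
    return 0;
}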

What's the motivation in using Verilog or VHDL over C?

I come from a programming background and have not messed around much with hardware or firmware (at most a bit of electronics and Arduino).
What is the motivation in using hardware description languages (HDL) such as Verilog and VHDL over programming languages like C or some Assembly?
Is this issue at all a matter of choice?
I read that hardware whose firmware is written in an HDL has a clear advantage in running instructions in parallel. However, I was surprised to see discussions expressing doubts over whether to write firmware in C or assembly (how is assembly appropriate if you don't necessarily have a CPU?), but I concluded it's also an option.
Therefore, I have a few questions (don't hesitate to explain anything):
Can firmware really be written either in an HDL or in a software programming language, or are these just different ways to perform the same mission? I'd love real-world examples. What constraints result from each option?
I know that a common use of firmware over software is in hardware accelerators (such as GPUs, network adapters, SSL accelerators, etc). As I understand it, this acceleration is not always necessary, but only recommended (for example, in the case of SSL and acceleration of complex algorithms). Can one choose between firmware and software in all cases? If not, I'd be happy to hear of cases in which firmware is clearly and unequivocally appropriate.
I've read that firmware is mostly burned on ROM or flash. How is it represented there? In bits, like software? If so, what's the profound difference? Is it the availability of adapted circuits in the case of firmware?
I guess I made a mistake here and there in some assumptions, please forgive me. Thank you!
The term "firmware" is at best ill defined, and I believe that is probably the cause of your confusion.
Historically - before the availability of programmable logic devices - the term "firmware" was used to refer to code stored in and executed from read-only memory (ROM). At a time when the only available ROM technology was mask ROM, where the code was burned into the device at manufacture of the silicon and was therefore unchangeable without replacing the chip, that was pretty "firm". Even with later programmable read-only memory (PROM), which could be programmed post-manufacture, because it was one-time programmable (OTP), the term still applied.
With the introduction of UV-erasable EPROM, firmware became perhaps less "firm", but the lack of in-circuit programmability and the need to expose the device to UV to erase it still made replacement of the embedded software a chore - normally requiring removal of the chip, placing it in the eraser for an hour or so, then programming it in a dedicated programmer.
With the advent of NOR flash memory, where code can be stored and executed directly from the device but also readily changed in-circuit, the term "firmware" in this context has become less common. However, it is still used (perhaps mainly by older practitioners) to refer to embedded software stored in and executed from a random-access, read-only memory device, as opposed to loaded into RAM from a file system.
The use of the term firmware to refer to programmable logic configuration is newer and has probably come about simply because it is hardware, but the configuration is written much like software, using a high-level language.
The upshot of this is that you do not choose
"Verilog and VHDL over programming languages like C or some Assembly"
because in each context the term firmware simply refers to a different concept.
It would be best to avoid the term firmware altogether as it means different things to different people or in different contexts.
There is perhaps some further confusion from the fact that some hardware description languages are based on software development languages - such as Handel-C, which is a C-like hardware description language.
This question would not have arisen much some time ago, but with current platforms you can now translate between C and HDL languages (instead of using the Handel-C extension of C, from the '90s), mainly between C and behavioral VHDL, and there are a lot of newer tools provided by companies, like the Xilinx Electronic System Level Design Ecosystem, or Impulse-C (http://www.impulseaccelerated.com/products_universal.htm).
It is important to know, though, that C is a middle-level language, while VHDL is, as stated, a hardware description language. C can only express sequential instructions, while VHDL allows both sequential and concurrent execution.
And even though a C program can be successfully written with pure logical or algorithmic thinking, a successful VHDL programmer needs a thorough working knowledge of hardware circuits, being able to predict how a given piece of code will be implemented in hardware.
In both languages you care about resource usage, but in different ways (in C, mainly when programming for resource-constrained devices). When it comes to VHDL, apart from memory, the other logic elements are also limited in an FPGA (which is where VHDL code normally ends up).
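Since the sequential-versus-concurrent contrast is easier to see in code, here is a small C sketch (my own illustration, not from any tool): the four register updates below must be serialized through saved copies to mimic what an HDL's non-blocking assignments all do at once on a clock edge.

#include <stdint.h>

typedef struct { uint8_t a, b, c, d; } shift_regs_t;

/* Model one clock tick of a 4-stage shift register. In an HDL, four
   concurrent assignments would each read the OLD register values in
   parallel; in C we must save them explicitly, because statements run
   strictly one after another. */
void clock_tick(shift_regs_t *r, uint8_t in)
{
    uint8_t old_a = r->a, old_b = r->b, old_c = r->c;
    r->a = in;
    r->b = old_a;
    r->c = old_b;
    r->d = old_c;
}

In synthesized hardware all four stages exist and update simultaneously; on a CPU the same behavior costs a sequence of instructions.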

Advantages of atmega32 [closed]

What are the advantages of using ATmega32 than other microcontrollers?
Is it better than PIC, ARM, and 8051?
Advantages
Still runs on 5 V, so legacy 5 V stuff interfaces cleaner
Even though it's 5 V capable, newer parts can run down to 1.8 V. This wide range is very rare.
Nice instruction set, very good instruction throughput compared to other processors (HCS08, PIC12/16/18).
High quality GCC port (no proprietary crappy compilers!)
"PA" variants have good sleep mode capabilities, in micro-amperes.
Well rounded peripheral set
QTouch capability
Disadvantages
Still 8-bit. An ARM is a 16/32-bit workhorse and will push a good deal more data around, at much higher clock speeds, than any 8-bit part.
Cost. Can be expensive compared to HCS08 or other bargain 8-bit processors.
GCC toolchain has quirks, like the split memory model and limited 16-bit pointers (see the sketch below).
Atmel is not the best supplier on the planet (at least they're not Maxim...)
In short, it is a very clean and easy-to-work-with 8-bit microcontroller.
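As an example of the split-memory-model quirk flagged under the disadvantages, here is a minimal avr-gcc sketch (the table contents are arbitrary): flash and RAM are separate address spaces, so constants kept in flash need PROGMEM plus an explicit accessor.

#include <stdint.h>
#include <avr/pgmspace.h>

static const uint8_t squares[8] PROGMEM = { 0, 1, 4, 9, 16, 25, 36, 49 };

uint8_t read_square(uint8_t i)
{
    /* A plain squares[i] would read RAM at the same numeric address;
       pgm_read_byte emits the LPM instruction to read flash instead. */
    return pgm_read_byte(&squares[i & 0x07]);
}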
An 8051 is legacy: the tools are passable, the architecture is bizarre (idata? xdata? non-reentrant functions in most compilers by default?).
PIC before PIC24 is also bizarre (register banking) and poor clock->instruction throughput. There is no first-class open source C compiler either.
PIC32 competes with the ARM7TDMI and ARM Cortex-M3; it is based on an adapted MIPS core and has a GCC port (not mainlined).
AVR32 is competing with Cortex-M3, and offers a pretty good value, especially in the low power area.
MSP430 is the king for ultra low-power devices, and has a passable GCC port (if you're not targeting 430X).
HCS08 is very inexpensive, but poor instruction throughput. Peripherals vary quite a bit.
ARM used to be a higher cost entry point, but with the introduction of the Cortex-M3 architecture, the price has been dropping compared to an 8-bit. For example, the LPC13xx series is comparable to a ATmega32 in many ways. Luminary (TI) has quite an impressive peripheral set.
It depends. First, you have to know what you want from the microcontroller.
In general:
PIC:
Old architecture. This means it's either expensive or slow
Targets only the low-end market (< a few MHz)
There's a lot of code written for it
ARM
Scalable
Fast/cheap
Atmega is somewhere in between
I find the PIC family (before the MIPS version) to have the most painful instruction set of all, which means assembler is the language of choice if you want to conserve space, get performance, have control, etc.
The 8051 is a little less painful - more registers - but it still takes a handful of instructions to do anything useful (meaning you cannot compare these to other chips from a MHz perspective). I like AVR in many ways: they embrace the homebrew and developer community, or if not directly, there is a much better community of developers out there compared to the competition. I don't like the instruction set, but it is decades ahead of the PIC and 8051. I like the MSP430 instruction set quite a bit - it is one of the best instruction sets for teaching assembler - but TI is not as developer friendly; it can be a struggle. The eZ430 was on the right path, but the GoodFET is better, as you don't have it failing to work every other kernel version.
MSP430 and ARM have the best instruction sets as far as I am concerned, which leads to both good assemblers and good compiler tools. You can find commercial tools for all of the above, and free tools certainly for the 8051, MSP430, and ARM (MSP430 and ARM can use GCC; the 8051 cannot - look for SDCC). For now, mspgcc4.sf.net and CodeSourcery are the places for GCC-based tools for MSP430 and ARM. LLVM supports both; I was able to get LLVM 2.7 to beat the latest GCC in a Dhrystone test, but that is one test - LLVM generally trails in performance but is improving.
As far as finding and creating free cross-compilers goes, I see LLVM as already the easiest to get and use, and going forward it will hopefully only get better. Sadly, the MSP430 port for LLVM was a "look what I could do in an afternoon" PowerPoint presentation and not a serious port.
My answer is that it depends on what you are doing, and I recommend you try all of them. These days the evaluation boards are in the sub-US$50 range, and some in the sub-US$30 range. Even within the ARM family (ST, Atmel, Stellaris, LPC, etc.) there is a wide variety of features and quirks that you will only find if you try them. Avoid the LPCXpresso, mbed2, and STM32 Primers. Avoid LPC in general, and avoid Cortex-M3 in general until you have cut your teeth on an ARM7. Look at SparkFun for Olimex and other boards. Although it is probably LPC-based, the ARMmite PRO, like the Arduino Pro, is a good choice. The eZ430 is a good MSP430 start. I don't remember who is making 8051 stuff - Renesas (sp?) - and 8051s are not all created equal; the register space varies from one to another and you have to prepare for that. I would probably look for an 8051 simulator if you want to play with the 8051.
I see AVR, and definitely ARM, continuing to dominate; I would like to see the MSP430 used for things other than just super low power. With ARM, AVR, and MSP430 you can use and get used to GCC tools now and in the future, which has a lot of benefits: even if GCC isn't the best compiler in the world, it is by far the best-supported compiler. I would avoid proprietary compilers and tools. I would look for devices that have non-proprietary programming interfaces that are field programmable - JTAG is good, but, for example, the new SWD debug interface on the Cortex-M3 is bad. TI's MSP was hurt by this, but some hacking has resolved it, at least for now. I really don't have much good to say about PIC and won't try. A big thing to look for is glue logic: does the part or family have the SPI or I2C or whatever bus you want to use, and do you need an internal pull-up or wired-OR input?
Some chips just don't have that option and you have to add external hardware. Do you need an interrupt, with conditioning? ARM tends to win here because the core is used by many vendors, each putting its own I/O around it, so you can stay in the ARM world and still have many choices; AVR and MSP are going to be very limited by comparison. With ARM, the tools are going to be state of the art - ARM is the most-used processor right now. AVR and MSP tools feel like special-project add-ons: less widely supported and more fragile. Although ARM is low power compared to Intel on an SBC or computer platform, it is likely not as low power as an AVR or MSP430. You really need to look at your project and pick the right processor for the job; I wouldn't, and don't, limit myself to one family. With evaluation boards as cheap as they are, and almost all usable with free tools, it is just a matter of putting in a few nights or weekends to learn each. I suggest learning more than one AVR, and learning more than one microprocessor.
At this end of the spectrum, there are only really two factors that make much difference. First, in smaller quantities, the only thing that matters at all is which architecture suits your development needs best. If you are already familiar with PIC, there's not much point in learning AVR, or vice versa. Pick an architecture that you like, then sort through the options on that architecture to see which model is up to your particular needs.
In quantity (say, 20 or more units), you might benefit by choosing just the right platform that precisely matches your devices' needs, to keep costs as low as possible.
In general, PIC and AVR platforms are good for simple, single-function devices, whereas ARM is used in cases where you need a full OS stack like QNX or Linux for things like TCP, or real time with OS services.
If you want the widest choice of peripherals, performance, price-point, software and tool support, and suppliers it would be hard to beat an ARM Cortex-M3 based part.
But addressing your question directly: the whole AVR range has a consistent architecture and a common peripheral set, from the Tiny to the Mega (not the AVR32, however, which is entirely different). This is the significant difference from PIC, where, when moving up the range (PIC10, 12, 16, 18, 24, 32), you get different peripheral designs, different instruction sets, and need to invest in different compilers and debug hardware.
The instruction set for AVR was designed for efficient C code compilation (again unlike PIC).
8051 is an architecture originally introduced by Intel decades ago, but now used as the core for 8-bit devices from a number of vendors. It has some clever tricks, such as efficient multitasking context switches via its four duplicated register banks and a block of bit-addressable memory, but it has a quirky memory architecture and limited address range (like most 8-bit devices). Great for small, well-targeted devices, but not truly general purpose.
ARM Cortex-M3 essentially replaces the ARM7TDMI, and is a cleaner design with a well-thought-out architecture. It requires minimal assembler start-up code, and even ISRs and vector tables can be coded directly in C without any weird compiler extensions or assembler entry/exit code. Its bit-banding technique allows memory and peripherals in the bit-band regions to be atomically bit addressable, which is useful for fast I/O and safe multithreading. Basically it is designed to allow C or C++ code at the system level without non-standard compiler extensions. It is of course a 32-bit architecture, so it does not have the resource or arithmetic limitations of 8-bit devices. Prices for low-end parts compete with higher-performance 8-bit devices, and blow most 16-bit devices out of the water (making 16-bit almost obsolete).
One other key thing to remember is that PIC and AVR are from single vendors, while 8051 and ARM are licensed cores. Each licensee adds their own peripheral set, so there is no commonality between vendors on peripherals, so device driver code needs porting when switching vendors, and you need to ensure the part has the peripherals you need. If you design your device layer well, this is seldom much of a problem.
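To make the bit-banding point above concrete, here is a minimal Cortex-M3 sketch; the flag variable is hypothetical, and the macro assumes it lands in the first 1 MB of SRAM, which the bit-band alias region mirrors. Each bit is aliased by a full word, so a single word write touches exactly one bit atomically:

#include <stdint.h>

/* alias address = alias_base + (byte_offset * 32) + (bit_number * 4) */
#define BITBAND_SRAM(addr, bit)                                   \
    (*(volatile uint32_t *)(0x22000000u +                         \
        (((uint32_t)(uintptr_t)(addr) - 0x20000000u) * 32u) +     \
        ((uint32_t)(bit) * 4u)))

volatile uint32_t flags;   /* hypothetical shared flag word in SRAM */

void set_flag3(void)
{
    BITBAND_SRAM(&flags, 3) = 1;   /* sets bit 3 without a read-modify-write */
}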
Well, it isn't easy to answer. It mostly depends on what you have used before. If you are already an AVR user, then it's good to use. On the other hand, you can find PICs with similar capabilities, so I'd say it's mostly personal preference. I think that most ARMs are more capable than the ATmega32 series. If you want good advice, tell us what you plan to use it for.
AVRs have a flat memory model and free development tools, and cheap development hardware is available for them.
I don't know enough about 8051 to comment.
Oh, and if you're thinking about the original ATmega32, I'd say it's a bad idea. It's going to be deprecated soon, so you may want to consider newer models from the ATmega32 series.

Does it matter which microcontroller to use for a first-time embedded systems programmer?

I have a few years' experience in desktop and web programming. I would like to move on to doing some embedded systems programming. After asking the initial question, I wonder which hardware / software IDE I should start on...
Arduino + Arduino IDE?
Atmel AVR + AVR Studio 4?
Freescale HCS12 or Coldfire + CodeWarrior?
Microchip PIC+ MPLAB?
ARM Cortex-M3 + ARM RealView / WinARM
Or... doesn't matter?
Which development platform is the easiest to learn and program in (taking IDE usability into consideration)?
Which one is the easiest to debug if something goes wrong?
My goal is to learn about "how IO ports work, memory limitations/requirements incl. possibly paging, interrupt service routines." Is it better to learn on one that I'll use later on, or do the high-level concepts carry over to most microcontrollers?
Thanks!
Update: how is this dev kit for a start? Comments? Suggestions?
Personally, I'd recommend an ARM Cortex-M3 based microcontroller. The higher-power ARM cores are extremely popular, and these low-power versions could very well take off in a space that is still littered with proprietary 8/16-bit cores. Here is a recent article on the subject: The ARM Cortex-M3 and the convergence of the MCU market.
The Arduino is very popular with hobbyists. Atmel's peripheral library is fairly common across its processor types, so it would smooth a later transition from an AVR to an ARM.
I don't mean to claim that an ARM is better than an AVR or any other core. Choosing an MCU for a commercial product usually comes down to peripherals and price, followed by existing code base and development tools. Besides, microcontrollers are generally much, much simpler than a desktop PC, so it's really not that hard to move from one to another after you get the hang of it.
Also, look into FreeRTOS if you are interested in real-time operating system (RTOS) development. It's open source and contains a nice walkthrough of what an RTOS is and how they have implemented one. In fact, their walkthrough example even targets an AVR.
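For a taste of what that looks like, here is a minimal FreeRTOS-style sketch; the LED-toggle function is a board-specific placeholder I made up, and the stack size and priority are arbitrary. One task blinks an LED while the scheduler runs anything else you create:

#include "FreeRTOS.h"
#include "task.h"

static void toggle_led(void) { /* board-specific GPIO toggle goes here */ }

static void blink_task(void *params)
{
    (void)params;
    for (;;) {
        toggle_led();
        vTaskDelay(pdMS_TO_TICKS(500));   /* block, letting other tasks run */
    }
}

int main(void)
{
    xTaskCreate(blink_task, "blink", configMINIMAL_STACK_SIZE,
                NULL, tskIDLE_PRIORITY + 1, NULL);
    vTaskStartScheduler();   /* never returns unless startup fails */
    for (;;) {}
}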
Development tools for embedded systems can be very expensive. However, there are often open source alternatives for the more open cores like ARM and AVR. For example, see the WinARM and WinAVR projects.
Those toolchains are based on GCC and are thus also available (and easier to use, IMHO) on non-Windows platforms. If you are familiar with GCC, then you know that there is an abundance of "IDEs" to suit your taste, from Emacs and vi (my favorite) to Eclipse.
The commercial offerings can save you a lot of headaches in getting set up. However, the choice of one will very much depend on your target hardware and budget. Also, some hardware supports direct USB debugging while other hardware may require a pricey JTAG adapter.
Other Links:
Selection Guide of Low Cost Tools for Cortex-M3
Low-Cost Cortex-M3 Boards:
BlueBoard-LPC1768-H ($32.78)
ET-STM32 Stamp Module ($24.90)
New Arduino to utilize an ARM Cortex-M3 instead of an AVR microcontroller.
Given that you already have programming experience, you might want to consider getting an Arduino and wiping out the firmware to do your own stuff with AVR Studio + WinAVR. The Arduino gives you a good starting point in understanding the electronics side of it. Taking out the Arduino bootloader would give you better access to the Atmel's innards.
To get at the goals you're setting out, I would also recommend exploring desktop computers more deeply through x86 programming. You might build an x86 operating system kernel, for instance.
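If you do go the wiped-Arduino route, the goals quoted in the question (I/O ports, interrupts) boil down to code like this minimal avr-gcc sketch - it assumes an ATmega328P at 16 MHz, the common Arduino configuration - which blinks the board LED from a timer interrupt instead of a delay loop:

#include <avr/io.h>
#include <avr/interrupt.h>

ISR(TIMER1_COMPA_vect)          /* runs on every timer 1 compare match */
{
    PORTB ^= _BV(PB5);          /* toggle PB5 = Arduino pin 13 LED */
}

int main(void)
{
    DDRB  |= _BV(PB5);                            /* LED pin as output */
    TCCR1B = _BV(WGM12) | _BV(CS12) | _BV(CS10);  /* CTC mode, clk/1024 */
    OCR1A  = 15624;                               /* 16 MHz / 1024 / 15625 = 1 Hz */
    TIMSK1 = _BV(OCIE1A);                         /* enable the compare interrupt */
    sei();                                        /* global interrupt enable */
    for (;;) { /* main loop free for other work */ }
}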
ARM is the most widely used embedded architecture and covers an enormous range of devices from multiple vendors and a wide range of costs. That said, there are significant differences between ARM7, 9, 11, and Cortex devices - especially Cortex. However, if getting into embedded systems professionally is your aim, ARM experience will serve you well.
8-bit architectures are generally easier to use, but often very limited in both memory capacity and core speed. Also, because they are simple to use, 8-bit skills are relatively easy to acquire, so they are a less attractive skill set for a potential employer - easy to fulfil internally or with less experienced (and therefore less expensive) staff.
However, if this is a hobby rather than a career, the low cost of parts, boards, and tools, and the ease of use, may make 8-bit attractive. I would suggest AVR, simply because it is supported by the free avr-gcc toolchain. Some 8-bit targets are supported by SDCC, another open-source C compiler. I believe Zilog make their Z8 compiler available for free, but you may need to pay for the debug hardware (although this is relatively inexpensive). Many commercial tool vendors provide code-size-limited versions of their tools for evaluation and non-commercial use, but beware: most debuggers require specialist hardware which may be expensive, although in some cases you can build it yourself if you only need basic functionality and low speeds.
Whatever you do, do take a look at www.embedded.com. If you choose ARM, I have used WinARM successfully on commercial projects, although it is not built for comfort! A good list of ARM resources is available here. For AVR, definitely check out www.avrfreaks.net.
I would only recommend Microchip PIC parts (at least the low-end ones) for highly cost-sensitive projects where the peripheral mix is a good fit to the application; not for learning embedded systems. PIC is more of a brand than an architecture; the various ranges PIC12, 16, 18, 24, and PIC32 are very different from each other, so learning on one does not necessarily stand you in good stead for using another - often you even need to purchase new tools! That said, the dsPIC, which is based on the PIC24 architecture, may be a good choice if you want to get some simple DSP experience at the same time.
In all cases, check out compiler availability (especially if C++ support is a requirement) and cost, and debugger hardware requirements, since these will often be the most expensive parts of your dev kit; the boards and parts are often the least expensive part.
This is kind of a hard question to answer, as the ideal answer very much depends on what it is you're interested in learning.
If your goal is just to dive a little deeper into the inner workings of computing systems, I would almost recommend you forgo the embedded route and pick up a book on writing a Linux kernel module. Write something simple that reads a temperature sensor off the SMBus, or something like that.
If you're looking at getting into high-level (phones, etc.) embedded application development, download the Android SDK; you can code in Java under Eclipse, and it even has a nice emulator.
If you're looking at getting into the "real" microcontroller space and really taking a look at low-level systems programming, I would recommend you start on a very simple architecture such as an AVR or PIC - something without an MMU.
Diving into the middle ground - for example, an ARM with an MMU and some sort of OS, be it Linux or otherwise - is going to be a bit of a shock. Without a background in both systems programming and hardware interfacing, I think the transition will be very rough if you plan to do much more than write very simple apps that count button presses or similar.
Texas Instruments has released a very interesting development kit at a very low price: the eZ430-Chronos Development Tool contains an MSP430 with display and various sensors in a sports watch, including a USB debug programmer and a USB radio access point, for $50.
There is also a wiki containing lots and lots of information.
I have already created a Stack Exchange proposal for the eZ430-Chronos Kit.
No, it doesn't matter, if you want to learn how to program an embedded device. But you need to know the flow of where to start and where to go next, because there are many microcontrollers out there and you don't know which one to choose. So it is better to have a road map before starting.
In my view you should start with any AVR board (ATmega328P - Arduino boards or plain AVR boards),
then move on to ARM microcontrollers - first an ARM7TDMI,
then an ARM Cortex-M3 board. This will give you an overall view, after which you can choose any board depending on what kind of project you are working on and what your requirements are.
Whatever you do, make sure you get a good development environment. I am not a fan of Microchip's development tools, even though I like their microcontrollers (I have been burned too many times by MPLAB + ICD; too much hassle and dysfunction). TI's 2800 series DSPs are pretty good and have an Eclipse-based C++ development environment which you can get into for < US$100 (get one of the "controlCARD"-based experimenter's kits, like the one for the 28335) - the debugger communications link is really solid; the IDE is good, although I do occasionally crash it.
Somewhere out there are ICs and boards that are better; I'm not that familiar with the embedded microcontroller landscape, but I don't have much patience for poor IDEs with yet another software toolchain whose bugs I have to figure out how to work around.
Some recommend the ARM. I'd recommend it, not as a first platform to learn, but as a second platform. ARM is a bit complex as a platform to learn the low-level details of embedded, because its start-up code and initialisation requirements are more complicated than many other micros. But ARM is a big player in the embedded market, so well worth learning. So I'd recommend it as a second platform to learn.
The Atmel AVR would be good for learning many embedded essentials, for 3 main reasons:
Architecture is reasonably straightforward
Good development kits available, with tutorials
Fan forum with many resources
Other micros with development kits could also be good—such as MSP430—although they may not have such a fan forum. Using a development kit is a good way to go, since they are geared towards quickly getting up-and-running with the micro, and foster effective learning. They are likely to have tutorials oriented towards quickly getting started.
Well, I suppose the development kits and their tutorials are likely to gloss over such things as bootloaders and start-up code, in favour of getting your code to blink the LED as soon as possible. But that could be a good way to get started, and you can explore the chain of events from "power-on" to "code running" at your own pace.
I'm no fan of the PICs, at least the PIC16s, due to their architecture. It's not very C-friendly. And memory banks are painful.
It does matter: you need to gradually acquire experience, starting with simpler systems. Note that by simpler I don't mean less powerful; I mean ease of use, ease of setup, etc. In that vein I would recommend the following (I have no vested interest in any of the products, I just found them the best):
I've started using one of these (mbed developer board). The big selling points for me were that I could code in C or C++, the straightforward connection via USB, and a slick online development environment (no local tool installation required at all!).
http://mbed.org/
Five minutes after opening the box I had the sample blinky program (the "hello world" of the embedded world) running, as follows:
#include "mbed.h"
DigitalOut myled(LED1);
int main()
{
while(1)
{
myled = 1;
wait(0.2);
myled = 0;
wait(0.2);
}
}
That's it! Above is the complete program!
It's based on an ARM Cortex-M3, fast and with plenty of memory for embedded projects (100 MHz, 256 KB flash & 32 KB RAM). The online dev tools have a very good library and plenty of examples, and there's a very active forum. Plenty of help on connecting devices to the mbed, etc.
Even though I have plenty of experience with embedded systems (ARM 7/9, Renesas M8/16/32, Coldfire, Zilog, PIC, etc.), I still found this a refreshingly easy system to get to grips with while having serious capability.
After initially playing with it on a basic breadboard, I bought a base board from these guys: http://www.embeddedartists.com/products/lpcxpresso/xpr_base.php?PHPSESSID=lj20urpsh9isa0c8ddcfmmn207. This has a pile of I/O devices (including a miniature OLED and a 3-axis accelerometer). From the same site I also bought one of the LPCXpresso processor boards, which is cheap, with less power/memory than the mbed, but perfect for smaller jobs (it still hammers the crap out of PIC/ATmega processors). The base board supports both the LPCXpresso and the mbed. Purchasing the LPCXpresso processor board also got me an attached JTAG debugger and an offline dev environment (Code Red's GCC/Eclipse-based dev kit). This is much more complex than the online mbed dev environment, but it is a logical progression after you've gained experience with the mbed.
With reference to my original point, note that the mbed controller is much more capable than the LPCXpresso controller BUT is much simpler to use and learn with.
I use Microchip PICs; it's what I started on. I mainly got going on them thanks to the "123 Microcontroller Projects for the Evil Genius" book. I took a microprocessors class at school for my degree and learned a bit about interrupts and timing and such, which helped a ton with my microcontrollers. I suppose some of the other programmers etc. may be better/easier, but at $36 for the PICkit 1, I'm too cheap to go buy another one... and frankly, without using them, I don't know if they are easier or better. I like mine and recommend it every chance I get. It took me forever to really dig into it, but I was finally able to program another chip off-board with ICSP. I don't know how other programmers do it, but for me that's the nicest thing: a 5-wire interface and you're programmed. Can't beat that with a stick...
I've only used one of those.
The Freescale is a fine chip. I've used HC-something chips for years for little projects. The only caveat is that I wouldn't touch CodeWarrior embedded with a 10-foot pole. You can find little free C compilers and assemblers (I don't remember the name of the last one I used) that do the job just fine. CodeWarrior was big and confusing, and regardless of how much I knew about the chip architecture and C programming, it always seemed to only make things harder. If you've used CodeWarrior on the Mac back in the old days and think CW is pretty neat - well, it's not at all like that. CW embedded looks vaguely similar, but it works very differently, and not very well.
A command-line compiler is generally fine. Professionals who can shell out the big bucks get expensive development environments, and I'm sure those make things better, but even without them it's still far better than writing assembly code for a desktop PC in 1990, and somehow we managed to do that just fine. :-)
You might consider a RoBoard. Now, this board may not be what you are looking for in terms of a microcontroller, but it does have the advantage of being able to run Windows or DOS, so you could use the Microsoft .NET or even C/C++ development tools to fiddle around with things like servos or sensors or even, what the heck, build a robot! It's actually kinda fun.
There's also the Axon II, which has the ATmega640 processor.
Either way, both boards should help you achieve your goal.
Sorry for the robotics focus, just something I'm interested in and thought it may help you too.
I use PICs, but would consider Arduino if I were choosing today. But from your goals:
how IO ports work
memory limitations/requirements
interrupt service routines
I wonder if your best bet is just to hack on the Linux kernel?
BBC Micro Bit
https://en.wikipedia.org/wiki/Micro_Bit
This cheap little board (~20 pounds) was created by ARM Holdings as an educational device, and 1M units were given out for free to UK students.
It contains an ARM Cortex-M0, the smallest ARM core of all.
I recommend it as a first microcontroller board due to its wide availability, low cost, and simplicity, and because it introduces you to the ARM architecture, for which many more advanced boards are also available for more serious applications.

What's the best description for "embedded hardware system"?

When I hear that, I always think about a mobile device. But why is the hardware "embedded" there? Isn't the whole device the hardware? Why is a personal computer not an embedded hardware system?
In today's world embedded simply refers to a system with one or more of the following traits:
Single purpose (ie, not a general purpose computer, like your desktop)
Firmware rather than software - still software, but not as easily updated (flash, etc)
Hardware and software are designed together as a unit
Different, perhaps more rigorous testing as software updates are not desired
Real time computing
Memory integrated on the CPU
Microcontroller rather than microprocessor
Expected high reliability (you shouldn't have to reboot your dishwasher or microwave)
If it runs a program, but doesn't look like a computer, it's an embedded system.
That's my standard answer for friends and family. There's too many different types of embedded systems to get more specific.
I worked in the "embedded" area for a while, and we considered anything where we had to write custom code for the hardware to be embedded.
If you have to work around the memory structure or write custom device drivers - anything that sits "directly on the metal" - it is generally "embedded".
If you're debugging it via a serial port - it's embedded.
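"Directly on the metal" usually means touching memory-mapped registers yourself. A minimal C sketch, with made-up register addresses and bit layout for illustration only:

#include <stdint.h>

/* Hypothetical UART registers -- real addresses come from the datasheet. */
#define UART_STATUS (*(volatile uint32_t *)0x40001000u)
#define UART_DATA   (*(volatile uint32_t *)0x40001004u)
#define TX_READY    (1u << 5)

void uart_putc(char c)
{
    while (!(UART_STATUS & TX_READY))
        ;                          /* spin until the transmitter is free */
    UART_DATA = (uint32_t)c;       /* volatile write goes straight to hardware */
}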
It is called "embedded" because the computer is embedded as part of a larger device.
There is a very wide range of embedded systems.
At the low end are 8-pin PICs; for example, there is a 12F629 in these diode lights. These cost cents and have very little memory.
The NXT by LEGO contains two controllers: a relatively big AT91SAM7S256 with a 32-bit ARM core, 256 KB of flash ROM and 64 KB of RAM, and a smaller 8-bit ATmega48 with 4 KB of flash.
Currently I'm working on embedded systems for trains, these typically have a PowerPC with some hundreds of MHz clock, on the order of a hundred MB of RAM, run VxWorks or Linux and are connected by Ethernet.
I think there are still more powerful embedded systems for telecommunications, but I haven't worked on these.
As per Wikipedia:

An embedded system is a special-purpose computer system designed to perform one or a few dedicated functions, often with real-time computing constraints. It is usually embedded as part of a complete device including hardware and mechanical parts. In contrast, a general-purpose computer, such as a personal computer, can do many different tasks depending on programming.

Embedded systems are designed to do some specific task, rather than be a general-purpose computer for multiple tasks. Some also have real-time performance constraints that must be met, for reasons such as safety and usability; others may have low or no performance requirements, allowing the system hardware to be simplified to reduce costs.

Embedded systems are not always standalone devices. Many embedded systems consist of small, computerized parts within a larger device that serves a more general purpose. For example, the Gibson Robot Guitar features an embedded system for tuning the strings, but the overall purpose of the Robot Guitar is, of course, to play music.[2] Similarly, an embedded system in an automobile provides a specific function as a subsystem of the car itself.

The program instructions written for embedded systems are referred to as firmware, and are stored in read-only memory or flash memory chips. They run with limited computer hardware resources: little memory, small or non-existent keyboard and/or screen.
From personal experience: if it's "headless" (i.e. it doesn't have an output device like a VDU and relies on something like LEDs), if there is a serial port used mainly for debugging and logging, and if you often use a logic analyser for debugging, it's embedded.
"Embedded" has become a very diverse term.
I've seen and worked on designs that:
Simply toggled discrete I/O (including LEDs) at fixed intervals
Drivers for hardware solutions (e.g. webcams, wireless com)
Acted as communications translators for board-level I/O (SPI <-> I2C <-> RS-232 <-> USB)
[ insert multitude of appliances here ]
Human-controlled electronics (calculator-esque, phone-esque)
System level devices to coordinate actions of other devices.
I also like Dour-High-Arch's comment above:
"Another important difference is that embedded apps may run for years without intervention..."
"Embedded system" is a very broad term and I don't think that it is easy to have a single definition. The word "embedded" actually refers to an industry and not to a "hardware system". The description of embedded systems has changed over the years and it is definitely going to change in the future too.
In the early days one would say that embedded systems were only programmed in assembly, but now C is commonplace, and perhaps in the future other languages will be used as well. CPUs are getting bigger and bigger, external memories are used all the time, and there are many devices considered to be embedded that are not dedicated to a single task: applications can be added to them and the software is easily updated. Watches, gadgets, house appliances, automotive devices, PLCs, motor controllers, weather stations, and system monitoring devices are all considered embedded. It is difficult to find a single definition that covers them all.