Per-Process Network Monitor in Obj-C?

I want to write a program that monitors packets in and packets out, or in other words, network statistics. Is there any useful class in Cocoa? I know some tools already exist, but this will be a subroutine of my program, so I must implement it on my own.
Thanks a lot!
P.S. I only want to calculate statistics for a single program/process rather than for the whole system.

You won't find much in Cocoa; you'll probably have to delve deeper into the Core stuff. In the recently released pool of WWDC videos, there is one on Core OS Networking that covers both the Mac and iPhone platforms and gives a pretty good overview of some of the APIs you might need to look at.
Might not hurt to give the Core Networking stuff a glance, since whatever you're monitoring is most likely using it:
http://developer.apple.com/mac/library/documentation/Networking/Conceptual/CFNetwork/Introduction/Introduction.html
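Cocoa itself won't give you per-process counters, so if you roll your own, one approach is to capture traffic with libpcap (which ships with Mac OS X) and attribute packets to your target process by matching the ports it has open. A minimal sketch of the counting half only; mapping ports to a process (e.g. via lsof or libproc) is the hard part and is not shown:

```c
#include <pcap.h>
#include <stdio.h>

static unsigned long g_bytes = 0;

/* Called once per captured packet; hdr->len is the on-the-wire size,
   so a small snaplen is fine for counting. */
static void count_packet(u_char *user, const struct pcap_pkthdr *hdr,
                         const u_char *packet)
{
    (void)user; (void)packet;
    g_bytes += hdr->len;
}

int main(void)
{
    char errbuf[PCAP_ERRBUF_SIZE];
    /* "en0" is the usual primary interface on a Mac; adjust as needed. */
    pcap_t *cap = pcap_open_live("en0", 64, 0, 1000, errbuf);
    if (cap == NULL) {
        fprintf(stderr, "pcap_open_live: %s\n", errbuf);
        return 1;
    }
    /* A real per-process monitor would install a BPF filter here
       (pcap_compile/pcap_setfilter) restricted to the ports owned by
       the target process. */
    pcap_loop(cap, 100, count_packet, NULL);   /* count 100 packets */
    printf("saw %lu bytes\n", g_bytes);
    pcap_close(cap);
    return 0;
}
```

Build with -lpcap; live capture normally requires root privileges.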

Related

What skill set should a low-level programmer possess?

I am an embedded SW engineer with less than 3 years of experience. I aim to "sharpen the saw" continuously. I was wondering if there is anything specific to low-level programming that C/C++ coders should be proficient with.
What comes to my mind is familiarity with the hardware's architecture and instruction set. Knowing how to fiddle with bits is also important. Resource management and performance have been part of my job; is there anything else?
EDIT: I work with an in-house customized RTOS, not embedded Linux.
I see a lot of high-level operating system answers here, but you specifically said low-level.
Some scattered thoughts:
Design for test. As you work through a problem, change only one thing at a time per test.
You need to understand buses and interfaces: SPI, I2C, USB, Ethernet, etc. The number one interface, today, yesterday, and tomorrow, is the UART (serial).
The steps involved in programming a flash.
Tricks to avoid making the product easily brickable.
Bootloaders in general.
Bit-banging the above-said interfaces on various families of parts (different chip vendors have different ideas about I/O pins, pull-ups, direction controls, etc.).
Board and chip bring-up: you certainly never want to boot a program of many tens of thousands of lines of code on the first power-up (think LED on, LED off).
How to debug a product without using too much test equipment (logic analyzers and scopes). At the same time, you have to learn to use a scope for debugging; you are far more valuable if you don't HAVE TO have a tech or engineer in the lab with you.
How would you reprogram the unit in the field? What would you do to minimize human error when allowing the user to field-upgrade the unit? Remember field downgrades as well.
What would you do to discourage hacking (binaries, etc).
Efficient use of the flash/rom (don't wear out one bank or section, spread the wear around, or see if the flash is doing it for you).
How and when to use a watchdog timer.
State machines: very useful with byte streams (serial and Ethernet). Design packet structures that stream well and are tailored to a state machine, with a header and checksum or other structure that ensures you do not interpret partial packets or random data as a good packet (see the sketch after this list).
Specific concepts like:
Endianness (this link is to an old but good Linux Journal article)
Effective use of multithreading architectures (the Embedded site is good in general)
Debugging embedded and multithreaded systems
Understand, Learn and Follow good programming techniques (the link is very old and the point very generic and subjective, but think about it)
Other things (this IBM page on embedded linux sums up most of the other points I want to make)
One more thing -- never underestimate testing, or planning test cases!
Use the reference links I give as concepts; please follow up further for deeper knowledge.
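To make the state-machine point concrete, here is a minimal byte-at-a-time packet parser. The frame layout (sync byte, length, payload, 8-bit additive checksum) is invented for the example:

```c
#include <stdint.h>

#define SYNC_BYTE   0x7E
#define MAX_PAYLOAD 32

enum state { WAIT_SYNC, GET_LEN, GET_PAYLOAD, GET_CSUM };

/* Feed one received byte at a time; buf must hold MAX_PAYLOAD bytes.
   Returns 1 when a whole valid packet has been assembled, 0 otherwise. */
int parse_byte(uint8_t b, uint8_t *buf)
{
    static enum state st = WAIT_SYNC;
    static uint8_t len, idx, csum;

    switch (st) {
    case WAIT_SYNC:
        if (b == SYNC_BYTE) { st = GET_LEN; }
        break;
    case GET_LEN:
        if (b == 0 || b > MAX_PAYLOAD) { st = WAIT_SYNC; } /* junk: resync */
        else { len = b; idx = 0; csum = 0; st = GET_PAYLOAD; }
        break;
    case GET_PAYLOAD:
        buf[idx++] = b;
        csum += b;
        if (idx == len) { st = GET_CSUM; }
        break;
    case GET_CSUM:
        st = WAIT_SYNC;                 /* resync either way */
        if (b == csum) { return 1; }    /* good packet */
        break;
    }
    return 0;
}
```

Random bytes or a partial packet fall out at the sync, length or checksum check instead of being acted on.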
I'd study the electronics of the actual chips. Learn how they work internally (such as architecture), interface with peripherals, electrical and timing characteristics, etc.
Basically, read the data sheet start to finish a few times and dig into anything you've not seen/used before.
By the way, what chips do you work with?
Similar to what Brian said, learn how to create unit tests and automated builds.
These skills are good for all levels of software engineers to be proficient in. They will help improve the quality of your code while also making it easier to refactor and improve the code base.
If you haven't yet I think every Software Engineer should read The Pragmatic Programmer and Code Complete. I know these are not specific to low level programming, but have a large wealth of knowledge in them that applies to all sub disciplines.
Great familiarity with pointers, with the checks these languages don't do for you (buffer overflows and the like), and with digital electronics. Operating system internals might also help.
Get to know how things are represented internally, especially ready-made data structures (supposing you won't build your own).
Above all, practice a lot. Doing it brings much more to you than just reading about it ;)
Bit operations
Processor architectures (caches, etc.)
WCET (worst-case execution time) analysis
Scheduling
Edit: What I forgot to mention is model based development.
Today, the control algorithms are often implemented as some kind of automaton from which C code is generated afterwards.
Commercially available tools include MATLAB/Simulink, ASCET and SCADE.
Get yourself a copy of the MISRA-C book. It was originally written by members of the automotive industry, and attempts to make software written in C more robust by applying a number (quite a large number!) of rules and guidelines.
Then, buy PC-Lint (or another static analysis tool) to check your code for MISRA and other rules.
These are particularly relevant to low-level and embedded C, as between them they deal with the causes of a lot of bugs in such software, such as issues relating to pointers, memory leaks, integer promotion (there's a whole chapter on that in the MISRA book), endianness, and undefined behaviour.
Good question. Some that haven't been mentioned...
Learn your various options for achieving low-level multitasking. From basic round-robin (non-preemptive) schedulers, with timing ticks from a hardware timer, up to a preemptive RTOS. Learn why you might need an RTOS, and why you might not. If you use an RTOS, learn that beginners with a PC background probably tend to want to create too many tasks.
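As a minimal sketch of that basic non-preemptive round-robin idea, assume a hardware timer ISR elsewhere increments the tick counter:

```c
#include <stdint.h>

volatile uint32_t g_ticks;     /* incremented by the timer ISR */

typedef struct {
    void (*run)(void);         /* task must return quickly */
    uint32_t period;           /* in ticks */
    uint32_t next;             /* tick at which to run again */
} task_t;

static void poll_sensors(void)   { /* ... */ }
static void update_display(void) { /* ... */ }

static task_t tasks[] = {
    { poll_sensors,   10, 0 },
    { update_display, 50, 0 },
};

int main(void)
{
    for (;;) {                 /* round-robin forever */
        for (unsigned i = 0; i < sizeof tasks / sizeof tasks[0]; i++) {
            /* Signed difference makes the test safe across wraparound. */
            if ((int32_t)(g_ticks - tasks[i].next) >= 0) {
                tasks[i].next = g_ticks + tasks[i].period;
                tasks[i].run();
            }
        }
    }
}
```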
Getting visibility into the internals for debugging can be a challenge. There's no screen typically, so no throwing in "printf" calls wherever you want. An emulator or JTAG interface is ideal--you can set breakpoints and step through your program (as long as halting the micro doesn't make hardware go crazy, like swinging a robot arm around at full speed!). If emulator/JTAG is not available, learn how to use a spare serial port (or maybe even bit-bash a pin to make a serial port) for a debug channel, with some simple memory peek/poke commands.
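And here is a sketch of that kind of peek/poke debug channel; uart_putc/uart_getc are hypothetical stand-ins for whatever your target provides, and 32-bit addresses are assumed:

```c
#include <stdint.h>

/* Hypothetical blocking UART primitives supplied by the target. */
extern void    uart_putc(char c);
extern uint8_t uart_getc(void);

static uint32_t read_hex_word(void)    /* 8 hex chars from the host */
{
    uint32_t v = 0;
    for (int i = 0; i < 8; i++) {
        uint8_t c = uart_getc();       /* no error checking: debug only */
        v = (v << 4) | (uint32_t)(c <= '9' ? c - '0' : (c | 0x20) - 'a' + 10);
    }
    return v;
}

static void write_hex_word(uint32_t v)
{
    for (int i = 28; i >= 0; i -= 4)
        uart_putc("0123456789ABCDEF"[(v >> i) & 0xF]);
}

void debug_monitor(void)               /* call from the main loop */
{
    for (;;) {
        switch (uart_getc()) {
        case 'r':                      /* r<addr> : peek one word */
            write_hex_word(*(volatile uint32_t *)read_hex_word());
            break;
        case 'w': {                    /* w<addr><data> : poke one word */
            volatile uint32_t *p = (volatile uint32_t *)read_hex_word();
            *p = read_hex_word();
            break;
        }
        }
    }
}
```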

embedded application

In the last two months I've worked on a simple application using a computer vision library (OpenCV).
I wish to run that application directly from the webcam without the need for an OS. I'm curious to know whether my application can be burned onto a chip so that no OS is needed to run it.
Of course the process can be expensive, but I'm just curious. Do you have any links about that?
ps: the application is written in C.
I'd use something bigger than a PIC, for example a small 32 bit ARM processor.
Yes. It is theoretically possible to port your app to PIC chips.
But...
There are C compilers for the PIC chip, however, due to the limitations of a microcontroller, you might find that the compiler, and the microcontroller itself is far too limited for computer vision work, especially if your initial implementation of the app was done on a full-blown PC:
You'll only have integer math available to you in most cases, if not all (don't quote me on that, but our devs at work don't have floating-point math for their PIC apps and it causes many foul words to emanate from their cubes). Either that, or you'll need to hook up an external math coprocessor. (A fixed-point sketch follows below.)
You'll have to figure out how to get the PIC chip to talk USB to the camera. I know this is possible, but it will require additional hardware, and R&D time.
If you need strict timing control, you might even have to program the app in assembler.
You'd have to port portions of OpenCV to the PIC chip, if it hasn't been already. My guess is not.
If you're not already familiar with microcontroller programming, you'll need some time to get up to speed on the differences between desktop PC programming and microcontroller programming, and you'll have to gain some experience in that. This may not be an issue for you.
Basically, it would probably be best to re-write the whole program from scratch given a PIC chip constraint. Good thing is though, you've done a lot of design work already. It would mainly be hardware/porting work.
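On the integer-math point above: the usual workaround is fixed-point arithmetic. A purely illustrative sketch in 16.16 format:

```c
#include <stdint.h>

typedef int32_t fix16;                 /* 16.16 fixed point */

#define FIX16_ONE      (1L << 16)
#define INT_TO_FIX(x)  ((fix16)(x) << 16)
#define FIX_TO_INT(x)  ((int32_t)((x) >> 16))

/* Multiply two 16.16 values; the intermediate needs 64 bits. */
static fix16 fix16_mul(fix16 a, fix16 b)
{
    return (fix16)(((int64_t)a * b) >> 16);
}

/* Example: scale a pixel by 0.75 with no floating point at runtime
   (the constant is folded by the compiler). */
static uint8_t scale_pixel(uint8_t p)
{
    const fix16 k = (fix16)(0.75 * FIX16_ONE);
    return (uint8_t)FIX_TO_INT(fix16_mul(INT_TO_FIX(p), k));
}
```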
OR...
You could try using a small embedded x86 single-board PC, perhaps in the PC/104 form factor, with your OS/app on a CF card. It's a real bona fide PC; you just add your software. The good thing is, you probably wouldn't have to rewrite your app, unless it has a ridiculous memory footprint. Embedded PC vendors are starting to ship boards based on 1 GHz Intel Atoms, and if you needed more capability you could perhaps hook a daughterboard onto the PC/104 bus. You'll work around all of the limitations listed above, as you're using a platform equivalent to the PC you developed your app on. And it has USB ports! If you do a thorough cost analysis and you're cool with a larger form factor, you might find it cheaper/quicker to use a system based on an SBC than rolling a solution using PIC chips/microcontrollers.
A quick Google search for PC/104 will reveal many vendors of SBCs.
OR...
And this would be really cheap: just get an off-the-shelf cheap netbook, overwrite the OEM OS, and run the code on there. Hackish, but cheap and really easy; your hardware issues would be resolved within a week.
Just some ideas.
I think you'll find this might grow into a pretty large project.
It's obviously possible to implement a stand-alone hardware solution to do something like this. Off the top of my head, Rabbit's solutions might get you to the finish line faster. But you might be able to find some home-grown BeagleBoard or Gumstix projects as well.
Two Google links I wanted to emphasize:
Rabbit: "Camera Interface Application Kit"
Gumstix: "Connecting a CMOS camera to a Gumstix Connex motherboard"
I would second Nate's recommendation to take a look at Rabbit's core modules.
Also, GHI Electronics has a product called the Embedded Master that runs the .NET Micro Framework and has USB host/device capabilities built in, as well as a rich library that is a subset of the .NET Framework. It runs on an ARM processor and is fairly inexpensive (around $85). Though not nearly as cheap as a single PIC chip, it does come with a lot of glue logic pre-built onto the module.
CMUCam
I think you should have a look at the CMUcam project, which offers affordable hardware and an image processing library which runs on their hardware.

starting a microcontroller simulator/emulator

I would like to create/start a simulator for the following microcontroller board: http://www.sparkfun.com/commerce/product_info.php?products_id=707#
The firmware is written in assembly, so I'm looking for some pointers on how one would go about simulating the inputs the hardware would receive, with the simulator then responding to the outputs from the firmware (which would also require running the firmware in the simulated environment).
Any pointers on how to start?
Thanks
Chris
Writing a whole emulator is going to be a real challenge. I've attempted to write an ARM emulator before, and let me tell you, it's not a small project. You're going to either have to emulate the entire CPU core, or find one that's already written.
You'll also need to figure out how all the IO works. There may be docs from SparkFun about that board, but you'll need to write a memory manager if it uses MMIO, etc.
The concept of an emulator isn't that far away from an interpreter, really. You need to interpret the firmware code, and basically follow along with the instructions.
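To make the interpreter comparison concrete: the heart of any CPU emulator is a fetch/decode/execute loop. This toy sketch uses an invented instruction format, not real PIC encodings (those are in the datasheet's instruction-set chapter):

```c
#include <stdint.h>

static uint16_t program_mem[4096];     /* the firmware image */
static uint8_t  regs[16];
static uint16_t pc;

/* Invented 16-bit format: top 4 bits select the opcode; the remaining
   fields are opcode-specific. */
void step(void)
{
    uint16_t insn = program_mem[pc++ & 0x0FFF];   /* fetch */
    uint8_t  op = insn >> 12;                     /* decode */
    uint8_t  rd = (insn >> 8) & 0xF;
    uint8_t  rs = (insn >> 4) & 0xF;

    switch (op) {                                 /* execute */
    case 0x1: regs[rd] = insn & 0xFF; break;      /* load immediate */
    case 0x2: regs[rd] += regs[rs];   break;      /* add registers */
    /* ... every other opcode, plus timers, interrupts and IO,
       which is where the real work is ... */
    }
}
```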
I would recommend a good interactive debugger instead of tackling an emulator. The chances of destroying the hardware are low, but really, would you rather buy a new board or spend 9 months writing something that won't implement the entire system?
It's likely that the PIC 18F2520 already has an emulator core written for it, but you'll still need to delve into all the hardware specs to see how all the IO is mapped. If you're feeling up to it, it would be a good project, but I would consider just using a remote debugger instead.
You'll have to write a PIC simulator and then emulate the IO functionality of the ports.
To be honest, it looks like it's designed as a dev kit; I wouldn't worry about your code destroying the device if you take care. Unless this is a runner-up for an enterprise package, I would seriously question the ROI on writing a sim.
Is there a particular reason to make an emulator/simulator, vs. just using the real thing?
The board is inexpensive; Microchip now has the RealICE debugger which is quite a bit more responsive than the old ICD2 "hockey puck".
Microchip's MPLAB already has a built-in simulator. It won't simulate the whole board for you, but it will handle the 18F2520. You can sort of use input test vectors & log output files, I've done this before with a different Microchip IC and it was doable but kinda cumbersome. I would suggest you take the unit-testing approach and modularize the way you do things; figure out your test inputs and expected outputs for a manageable piece of the system.
It's likely that the PIC 18F2520 already has an emulator core written for it,
An open source, cross-platform simulator for microchip/PICs is available under the name of "gpsim".
It's extremely unlikely that a bug in your code could damage the physical circuitry. If that's possible, then it is either a bug in the board design or it should be very clearly documented.
If I may offer you a suggestion from many years of experience working with these devices: don't program them in assembly. You will go insane. Use C or BASIC or some higher-level language. Microchip produces a C compiler for most of their chips (dunno about this one), and other companies produce them as well.
If you insist on using an emulator, I'm pretty sure Microchip makes an emulator for nearly every one of their microcontrollers (at least one from each product line, which would probably be good enough). These emulators are not always cheap, and I'm unsure of their ability to accept complex external input.
If you still want to try writing your own, I think you'll find that emulating the PIC itself will be fairly straightforward -- the format of all the opcodes is well documented, as is the memory architecture, etc. It's going to be emulating the other devices on the board and the interconnections between them that will kill you. You might want to look into coding the interconnections between the components using a VHDL tool that will allow you to create custom simulations for the different components.
Isn't this a hardware-in-the-loop simulator problem? (e.g. http://www.embedded.com/15201692 )

Is low-level / embedded systems programming hard for software developers? [closed]

Given my background as a generalist, I can cover much of the area from analog electronics to writing simple applications that interface with an RDBMS backend.
I currently work in a company that develops hardware to solve industry-specific problems. We have an experienced programmer who has written business apps, video games, and a whole bunch of other stuff for PCs. But when I talk to him about doing low-level programming, he simultaneously expresses interest and also doubt/uncertainty about joining the project.
Even when talking about PCs, he seems more comfortable operating at the language level than with the lower-level stuff (instruction sets, ISRs). Still, he's a smart guy, and I think he'd enjoy the work once he gets over the initial learning hump. But maybe that's my own enthusiasm for low-level stuff talking... If he were truly interested, maybe he would already have started learning stuff in that direction?
Do you have experience in making that software-to-hardware (or low-level software) transition? Or, better yet, of taking a software only guy, and transitioning him to the low-level stuff?
Edit:
P.S. I'd love to hear from the responders what their own background is -- EE, CS, both?
At the end of the day, everything is an API.
Need to write code for an SPI peripheral inside a microcontroller? Well, get the datasheet or hardware manual, and look at the SPI peripheral. It's one, big, complex API.
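For instance, a polled byte transfer often boils down to something like this at register level. The register addresses and bit names here are invented for illustration; the datasheet's SPI chapter gives you the real ones:

```c
#include <stdint.h>

/* Invented memory-mapped SPI registers for a hypothetical part. */
#define SPI_CTRL   (*(volatile uint8_t *)0x4000)
#define SPI_STATUS (*(volatile uint8_t *)0x4001)
#define SPI_DATA   (*(volatile uint8_t *)0x4002)

#define SPI_ENABLE   0x01   /* CTRL: peripheral on, master mode */
#define SPI_TX_READY 0x02   /* STATUS: transmit buffer empty */
#define SPI_RX_READY 0x04   /* STATUS: received byte waiting */

uint8_t spi_transfer(uint8_t out)
{
    SPI_CTRL |= SPI_ENABLE;
    while (!(SPI_STATUS & SPI_TX_READY)) { }   /* wait to transmit */
    SPI_DATA = out;                            /* clocks out 8 bits */
    while (!(SPI_STATUS & SPI_RX_READY)) { }   /* wait for the reply */
    return SPI_DATA;                           /* byte clocked in */
}
```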
The problem is that you have to understand the hardware and some basic EE fundamentals in order to comprehend what the API means. The datasheet isn't written by or for SW developers; it was written by hardware engineers, for hardware engineers, and maybe software engineers.
So it's all from the perspective of the hardware (face it - the microcontroller company is a hardware company filled with hardware/asic engineers).
Which means the transition is by no means simple and straightforward.
But it's not difficult - it's just a slightly different domain. If you can implement a study program, start off with Rabbit Semiconductor's kits. There's enough software there so a SW guy can really dig in with little effort, and the HW is easy to deal with because everything is wrapped in nice little libraries. When they want to do something complex they can dig into the direct hardware access and fiddle at the lower level, but at the same time they can do some pretty cool things such as build little webservers or pan/tilt network cameras. There are other companies with similar offerings, but Rabbit is really focused on making hardware easy for software engineers.
Alternately, get them into the Android platform. It looks like a unix system to them, until they want to do something interesting, and then they'll have the desire to attack that little issue and they'll learn about the hardware.
If you really want to jump in the deep end, go with an Arduino kit: cheap, free compilers and libraries, pretty easy to start off with, but you have to hook wires up to do something interesting, which might be too big a hurdle for a reluctant software engineer. But a little help and a few nudges in the right direction and they will be absolutely thrilled to have a little LED display that wibbles* like the Knight Rider lights...
-Adam
*Yes, that's a technical engineering term.
The best embedded programmers I've worked with are EE trained and learned SW on the job. The worst embedded developers are recent CS graduates who think SW is the only way to solve a problem. I like to think of embedded programming as the bottom of the SW pyramid. It's a stable abstraction layer/foundation that makes life easy for the app developers.
"Hard" is an extremely relative term. If you're used to thinking in the tight, sometimes convoluted way you need to for small embedded code (for example, you're a driver developer), then certainly it's not "hard".
Not to "bash" (no pun intended) shell scripters, but if you write perl and shell scripts all day, then it might very well be "hard".
Likewise if you're a UI guy for Windows. It's a different kind of thinking.
Why embedded development is "hard":
1) The context may switch to an interrupt between any two machine instructions. Since high-level language constructs may map to multiple assembly instructions, this might even happen within a line of code, e.g. long var = 0xAAAA5555. If var is accessed in an interrupt service routine, on a 16-bit processor it might be only half set (a sketch of the fix follows this list).
2) Visibility into the system is limited. You may not even have output to Hyperterm unless you write it yourself. Emulators don't always work that well or consistently (though they are way better than they used to be). You will have to know how to use oscilloscopes and logic analyzers.
3) Operations take time. For example, say your serial transmitter uses an interrupt to signal when it is time to send another byte. You could write 16 bytes to a transmit buffer, then clear interrupts and wonder why your message is never sent. Timing in general is a tricky part of embedded programming.
4) You are subject to subtle race conditions that occur only rarely and are very difficult to debug.
5) You have to read the manual. A lot. You can't make it work by fooling around. Sometimes 20 things have to be set up correctly to get what you are after.
6) The hardware doesn't always work or is easy to damage, and it takes a while to figure out that you broke it.
7) Software repairs in embedded systems are usually very expensive. You can't just update a web page. A recall can erase any profit you made on the device.
There are probably more but I've got this race condition to solve...
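For what it's worth, a minimal sketch of the fix for point 1: make the multi-byte access atomic by briefly disabling interrupts (disable_interrupts/enable_interrupts stand in for whatever intrinsic your compiler provides):

```c
#include <stdint.h>

/* Stand-ins for compiler/port-specific interrupt intrinsics. */
extern void disable_interrupts(void);
extern void enable_interrupts(void);

/* Written by the ISR, read by main-line code. On a 16-bit CPU this
   32-bit store takes two instructions, so it can tear. */
static volatile uint32_t g_sample;

void timer_isr(void)
{
    g_sample = 0xAAAA5555u;        /* two 16-bit writes on a 16-bitter */
}

uint32_t read_sample(void)
{
    uint32_t copy;
    disable_interrupts();          /* make the two reads atomic */
    copy = g_sample;
    enable_interrupts();
    return copy;
}
```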
This is very subjective, I guess; his reasons could be many. But if he's like me, I know where he's coming from. Let me explain.
In my career I've dedicated 6 years to the telecom industry, working a lot with embedding SDK middleware into low-end mobile phones etc.
Most embedded environments I've experienced are like harsh weather for a programmer: you constantly have to overcome limitations in resources, etc. Some might find this a challenge and enjoy it for the challenge itself; some might feel close to "the real stuff", the hardware; some might feel it limits their creativity.
I'm the kind who feels it limits my creativity.
I enjoy being back in the Windows desktop environment, where I can flap my wings with elaborate class designs, stretch my legs a few clock cycles extra, and use unnecessary amounts of memory for diagnostics.
On certain embedded units in the past, I hardly had support for fseek() (an ANSI C standard file function). If lucky, a "watchdog" could give clues to where something crashed. Not to mention the pain of communicating with the user in single-threaded preemptive swamps.
Well, you know what I'm getting at. In my opinion it's not necessarily hard, but it's quite a leap, with potentially little reuse of your current experience.
Regards
Robert
There is a very real difference in mindset from user-level application development (ie, general purpose PC or Web applications) to hard deadline, real-time response application development (ie, the hardware/software interface).
Interrupts, instruction sets, context switching and hard resource constraints are relatively unknown to your average developer. I'm assuming here that your 'average developer' is not an Electrical/Electronic or other Engineer by training.
The transition for this developer you mention may be well outside his comfort zone. Some of us like stretching like that. Others of us may have decided the view isn't worth the climb.
Likewise, folks who've been in the hardware area (ie, Engineers) often have difficulty with the assumptions and language of software development.
These are gross generalities, of course, but hopefully give some insight.
He needs to be comfortable with the low-level stuff, but mostly for debugging and field issues. There is a serious learning curve depending on the architecture, but not impossible. On the other hand, the low-level code takes (in general) more time and debugging than higher-level code. So if you need to be going back to low-level all the time, then perhaps something isn't right in the design. Even for the embedded controls I've built, I spend the vast majority of time in high-level code. Although when you have issues, it is extremely advantageous to have a very good low-level knowledge.
I am an EE turned software engineer. I prefer programming low level. Most classically trained software developers I know do not want to operate at this level; they want APIs to call. So for me it is a win-win: I create the low-level driver and an API for them to use. There is a "new" degree, at least new since I went to college, called Computer Engineering. Hmm, it might be an electrical engineering degree rather than computer science, but it is a nice mix of software and digital hardware basics. The individuals I have worked with from this field are much more comfortable with low level.
If the individual is not comfortable or willing, then place them somewhere they are comfortable. Let them do documentation or work on the user interface. If all of the work at the company requires low-level work, then this individual needs to do it or find another job. Don't sugar-coat it.
I also think they will enjoy it once they get over the hump: the freedom you have at that level, not hindered by operating systems, etc. Recently I witnessed a few co-workers experience for the first time seeing their software run under simulation: every net within the processor and other on-chip peripherals. No, you don't have a table in a GUI (debugger) showing the current state of the memory; you have to look at the memory bus, look for the address you are interested in, and look for a read or write signal and the data bus. I worry about the day the silicon arrives and they no longer have this level of visibility. It will be like an addict in detox.
Well, I cut my teeth on hardware when I started reading Popular Electronics at age 14. This was BEFORE personal computers, in case you were wondering; and if you weren't, well, you know anyway. lol
I’ve done the low level bit-bang stuff on the 8048/51 microprocessor, done PIC’s and some other single chip variations and of course Rabbit Semiconductor. (great if you're into C). That’s great (and fun) stuff; Yes, there is a different way of looking at things – not harder, but some of that information is a bit harder to come by as it isn’t as discussed as the software issues. (Of course, this depends on the circle of friends with which you associate, eh).
But, having said all of this, I want to remind you of a technology that started to bridge the gap for programmers into the world of hardware, and has since become a very MAJOR player: the .NET Micro Framework. You can find information on this technology at the following:
http://msdn.microsoft.com/en-us/embedded/bb267253.aspx
It addresses some of the same issues that .NET web development addressed in that you can use some (quite a bit, actually) of your existing PC based knowledge in the new environments – Some caution, of course, as your target machine doesn’t have 4 GIG of RAM – it may only have 64K (or less)
Starting in version 2.5 of the .NET micro framework, you have access to networking and web services – way kewl, eh? It doesn’t stop there … Want to control the lights in your house? How about a temp recording station? All with the skills you already have. Well, mostly -- Check out the link.
The SDK plugs into your VisualStudio IDE. There are a number of “Development Kits” available for a very reasonable amount of cash – Now, what would normally take a big learning curve in components, building a circuit board and wiring up “stuff” can be done reasonably easy with a dev kit and some pretty simple code – Of course, you may need to do the occasional bit bang operation, but more and more sensor folks are providing .NET micro framework drivers – so, the hardware development may be closer than you think…
Hope it helps...
I like both. Embedded challenges me and really gets me going in a visceral way. Making something that affects the macro physical world is very satisfactory. But I've had to do a lot of catch up on the electrical/electronics end, since my bachelor's is in computer science. I've a pretty generalist background, where I studied ai, graphics, compilers, natural language, etc. Now I'm doing graduate work in embedded systems. The really tough part is adjusting to the lack of runtime facilities like an operating system.
Low-level embedded programming also tends to include low-level debugging. Which (in my experience) usually involves (at least) the use of an oscilloscope. Unless your colleague is going to be happy spending at least some of the time in physical contact with the hardware and thinking in terms of microseconds and volts, I'd be tempted to leave them be.
Agreed that the term "hard" is quite relative.
I would say "different", as you would need to employ development patterns that you won't use in other kinds of environments.
The timing constraints, for instance, could require a learning curve.
However, being curious would be a good quality for a developer, wouldn't it?
You are right in that anyone with enough knowledge not to feel completely lost in an area (over the hump?) will enjoy the challenges of learning something new.
I myself would feel quite nervous moving to the level of instruction sets etc., as there is a huge amount of background knowledge needed to feel comfortable in the environment.
It may make a difference if you are able to support the developer in learning how to do this. Having someone there you can ask and talk through issues with is a huge help in that sort of domain change.
It may be worth having the developer assigned to a smaller project with others as a first step and see how that goes. If he expresses enthusiasm to try another project, things should flow on from there.
I would say it is not any harder, it just requires a different knowledge set, different considerations.
I think that it depends on the way that they program in their chosen environment, and the type of embedded work that you're talking about.
Working on an embedded linux platform, say, is a far smaller jump than trying to write code on an 8 bit platform with no operating system at all.
If they are the type of person that has an understanding of what is going on underneath the api and environment that they are used to, then it won't be too much of a stretch to move into embedded development.
However, if their world view stops at the high level api that they've been using, and they have no concept of anything beneath that, they are going to have a really hard time.
As a (very) general statement if they are comfortable working on multithreaded applications they will probably be ok, as that shares some of the same issues of data volatility that you have when working on embedded projects.
With all of that said, I've seen more embedded programmers successfully working in PC development than I have the reverse. (of course I might not have seen a fair cross section)
"But when I talk to him about doing low-level programming, he simultaneously express interest and also doubt/uncertainty about joining the project." -- That means you let him try and you prepare to hire someone else in case he doesn't pass the learning curve.
I began as a SW engineer; I'm now a HW one!
The important thing is to understand how it works and to be motivated!

Best platform for learning embedded programming? [closed]

I'm looking to learn about embedded programming (in C mainly, but I hope to brush up on my ASM as well) and I was wondering what the best platform would be. I have some experience in using Atmel AVRs and programming them with the STK500 and found that to be relatively easy. I especially like AVR Studio and the debugger that lets you view the state of registers.
However, If I was to take the time to learn, I would rather learn about something that is prevalent in industry. I am thinking ARM, that is unless someone has a better suggestion.
I would also be looking for some reference material, I have found the books section on the ARM website and if one is a technically better book than another I would appreciate a heads up.
The last thing I would be looking for is a prototyping/programming board like the STK500 that has some buttons and so forth.
Thanks =]
"embedded programming" is a very broad term. AVR is pretty well in that category, but it's a step below ARM, in that it's both simpler to use, as well as less powerful.
If you just want to play around with ARM, buy a Nintendo DS or a Game Boy Advance. These are very cheap compared to the hardware inside (wonders of mass production), and they both have free development toolchains based on GCC that can compile for them.
If you want to play around with embedded linux, BeagleBoard is looking to be a good option, only $150 and it has a ton of features.
Personally I think AVR is best for the smaller-sized 8-bit platforms, and ARM is best for the larger, more powerful 32-bit based platforms. Like many AVR fans, I don't like PIC. It just seems worse in pretty much every way. Also avoid anything that requires you to write any type of BASIC.
If you just want to play around with it, I'd suggest the Arduino platform (http://www.arduino.cc). It's based on the ATmega168 or ATmega8, depending on the version. It uses a C-like language and has its own IDE.
I've worked in embedded programming for 9 years now; I have experience with the TI MSP430 and Atmel AVR (a couple of flavours) and will be using an ARM soon.
My suggestion is to pick up something that has some extra features in the processor, like an Ethernet controller or a CAN controller; even get two or three boards if you can. Embedded devices are nice to work with, but once they can talk to other similar devices via CAN or get onto a network, they become much more fun to play with.
ADI's Blackfin is another option, since it's quite a straightforward architecture to program, yet it can also do some fairly hefty DSP stuff should you choose to go down that route. It helps that the assembly language is quite sane too.
The Blackfin STAMP boards are an inexpensive (~$100 last I checked) way in, and they support the free GCC tools and uClinux.
Whatever architecture you choose, I'd definitely recommend first downloading the toolchain/SDK and looking through the sample projects and tutorials; generally have a bit of a play about. You can often get quite acquainted with the architecture through simulation without even touching any hardware.
ARM has the nicest instruction set of the widely used embedded platforms, leaving you free to pick up the general principles of writing software for embedded platforms without getting bogged down in weird details like non-orthogonal registers or branch delay slots. There are plenty of emulators - ARM's own, while not free, is cycle-accurate; and a huge variety of programmable ARM-based hardware is cheap and easy to come by as well.
The TI MSP430 is a great platform for learning how to program microcontrollers. TI has a variety of FREE Tools and some cheap evaluation boards (starting at $20). Plus, it's a low-power, modern microcontroller.
A nice choice would be the PIC18 by Microchip.
It has quite a lot of material, documentation, tutorials and projects on the internet, plus a free IDE and compiler.
You can put together your own flash writer in a few minutes (although for a debugger to work you'll need to work harder).
If you're a student (or have a student email address), Microchip will send you free sample chips. So basically you can have a full development environment for close to nothing.
PICs are quite prevalent in industry, specifically as controllers for robots for some reason, although they can do so much more.
Arduino seems to be the platform of choice these days for beginners although there are lots of others. I like the Olimex boards personally but they are not really for beginners.
Microchip's PIC range of CPUs are also excellent for beginners, especially if you want to program in assembler.
BTW, assembler is not used as much as it used to be. The general rule with embedded is: if you've got 4k of memory or more, use C. You get portability and you can develop code faster.
I suppose it depends on your skill level and what you want to do with the chip. I usually choose which embedded chip to use by the available peripherals. If you want a USB port, find one with USB built in, if you want analogue-to-digital, find one with an ADC etc. If you've got a simple application, use an 8-bit but if you need serious number crunching, go 32 bits.
I'd like to suggest the BeagleBoard from TI. It has an OMAP3 on it: a Cortex-A8 ARM CPU, a C64x+ DSP and a video accelerator as well.
The board does not need an expensive JTAG device. A serial cable and an SD card are all you need to get started. The board costs only $150 and there is a very active community.
www.beagleboard.org
Your question has sort of been answered in this question.
To add to that, the embedded processor industry is very segmented; it doesn't have a major player like Intel/x86 is for the "desktop" processor industry. The ARM processor does have a large share, as does MIPS I believe, and there are many smaller, more specific microcontroller-like chips available (like the MSP430 etc. from TI).
As for documentation, I do embedded development for a day job, and the documentation we have access to (as software developers) is rather sparse. Your best bet is to use the documentation available on the processor manufacturers site.
Take a look at Processing and the associated Arduino and Wiring boards.
If you just want to have fun, then try the Parallax Propeller chip. The HYDRA game platform looks like a blast. There's a $100 C compiler for it now.
I started on BASIC stamps, moved up through SX chips and PICs into 8051s, then 68332s, various DSPs, FPGA soft processors, etc.
8051s are more useful in the real world... the things won't go away. There are TONS of derivatives and crazy stuff for them. (Just stay away from the DS80C400.) The energy industry is absolutely full of them.
Start with something tiny. If you have external RAM and plenty of registers... what's the difference between that and an SBC?
Many moons ago I worked with 8-bitters like the 68HC05 and Z80, later the AVR and MSP430 (16-bit). However, the most recent projects were on ARM7. Several manufacturers offer ARM controllers, in all colors and sizes (well, not really colors).
ARM(7) is replacing 8-bit architecture: it's more performant (32-bit RISC at faster instruction cycles than most 8-bitters), has more memory and is available with several IO-configurations.
I worked with NXP LPC2000 controllers, which are also inexpensive (< 1 USD for a 32-bitter!).
If you're in Europe http://www.olimex.com/dev/index.html has some nice low-cost development boards. Works in the rest of the world too :-)
For a fun project to test, have a look at the XGameStation.
But for a more industrial single-chip solution to program, look at PIC.
For my Computer Architecture course I had to work with both a PIC and an AVR; in my opinion the PIC was easier to work with, but maybe that's because it's what we worked with the most and had the most time to get used to. We used the AVR only a couple of times, so I couldn't get the hang of it perfectly, but it also was nothing overly complicated, or at least not more frustrating than the other.
I think you can also order microprocessor samples from Microchip's website, so you could get started with that.
Second that:
Arduino platform http://www.arduino.cc
HTH
For learning, you can't go past the AVR. The chips are cheap and they'll run with zero external components - they also supply enough current to drive an LED straight from the port.
You can start with a cheap programmer such as Lady Ada's USBtinyISP (USD $22 for a kit), which can power your board with 5V from the USB port. Get the free tools WinAVR (GCC based) and AVR Studio and get a small project working in no time.
Yes, the AVRs have limitations, but developing software for microcontrollers is largely about managing resources and coping with those problems. It's unlikely that you'll experience problems such as running out of stack space, RAM or ROM when you're making hobbyist projects for powerful ARM platforms.
That said, ARM is also a great platform which is widely used in the industry, however, for learning I highly recommend AVRs.
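For a sense of how small a first AVR project can be, here is the classic LED blink in avr-gcc/WinAVR style; it assumes an LED on pin B0 and the factory-default 1 MHz clock:

```c
#define F_CPU 1000000UL        /* must be defined before util/delay.h */
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= _BV(PB0);          /* make pin B0 an output */
    for (;;) {
        PORTB ^= _BV(PB0);     /* toggle the LED */
        _delay_ms(500);
    }
}
```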
I would suggest Microchip's PIC18F series. I just started developing for them with the RealICE in-circuit emulator, but the PICkit 2 is a decent debugger for the price. You could say this about the AVRs also, but there is a large following for the device all over the web. I was able to have a (buggy, yet functional) embedded USB device running within days due to all the PIC-related chatter.
The only thing I don't like about the PICs is that a lot of the sample code is VERY entwined with the demo boards. That can make it hard to tear out the sections you need and still have an application that will build and run.
Texas Instruments has released a very interesting development kit at a very low price: the eZ430-Chronos development tool contains an MSP430 with a display and various sensors in a sports watch, including a USB debug programmer and a USB radio access point, for $50.
There is also a wiki containing lots and lots of information.
I have already created a Stack Exchange proposal for the eZ430-Chronos kit.
You should try to learn from the developer kits provided by Embedded Artists. After you get a kit, check their instructional videos and the videos provided by NXP, which are not as detailed as they could be, but they cover a lot of things. The problems with learning ARM as your first architecture while trying to do something practical are:
You need to buy a dev kit.
You need a good book to learn ARM assembly, because sooner or later you will come across ARM startup code, which is quite a big deal for a beginner. The book I mentioned also covers some C programming.
Combine the book mentioned above with a user guide for your specific processor, like this one. Make sure you get this, as studying it in combination with the above book is the only way to learn your ARM processor in detail.
If you want to make the transfer from ARM assembly to C programming, you will need to read this book, which covers a different ARM processor but is easier for a C beginner. The downside is that it doesn't explain any ARM assembly, but this is why you need the first book.
There is no easy way.
mikroElektronika has nice ARM boards and C, Pascal and Basic compilers that might suit your needs.