I really like my Mac, and I'm thinking about developing software for it some time in the future. What are your reasons for making Mac software? Is it because you think the Mac is so cool, or is the market that interesting? I think many of us would like to know.
Well, for me, as I said, it's the coolness of the Mac. I don't know anything about the market.
I like Objective-C as a language. I use Macs in my daily life. I like the flexibility of having a Unix-ish operating system underneath. In short, it serves my needs and it helps me learn.
I attended a talk by Wil Shipley of Delicious Monster fame. His argument for developing on the Mac was quite simple:
Mac users buy more software on average
Average prices are higher
These seemed to me to be quite compelling reasons; couple them with a less crowded marketplace.
This was in the days before the iPhone; you have a much broader customer base these days.
In addition, the 'enforced' MVC pattern for developing Mac and iPhone apps makes porting between platforms very easy; see the example of Tweetie for the iPhone and now the Mac.
In short:
The Objective-C language, Cocoa and the applications that Mac OS X provides developers with (not just Xcode and Interface Builder).
Slightly longer:
I love the Objective-C syntax and the countless features that Cocoa provides. You can do so much when writing apps for Mac OS X without needing to write a single line of code, and you can do even more when you write just really easy "glue code". And when you need to write "real" code to implement something that isn't already available in one of the frameworks, it doesn't feel so tiresome, because in my opinion writing Objective-C code is fun.
The market is interesting too. There are, for example, many areas where software is simply missing for the Mac, so you can be the first one to make an application for a certain task, which can be a great opening for a new company. The market is also frustrating, though, because I think nowadays people expect too much from Mac software; at least I do. Mac software not only has to be absolutely mind-blowingly good in terms of technical implementation and usability/intuitiveness, it also has to look beautiful or cool or whatever. This makes it really hard for one-man software shops, for example: you need to be both a coder and a designer, and that's said to be impossible, although there are exceptions of course. Compare that to Linux, where everyone is happy with the standard GUI tools that the GUI system of choice provides. Maybe it's not as bad as I think it is, though; this is just my impression and I could be dead wrong.
I haven't done much Mac programming for about three years now but every time I develop an app in Java, C#, Ruby, C or whatever else I'm using, I catch myself dreaming about writing the app in Objective-C with Xcode instead. But then I also start thinking about the upsides of developing cross platform software instead of software that will only run on Mac OS X. I use Windows, Mac OS X, Linux and some lesser known operating systems and being able to use the apps that I write on every one of these platforms instead of just on the Mac is too nice to give up. That's why I mainly develop cross platform software these days although I do like developing for Mac OS X better.
Users.
In some circles, Mac is the only option for some reason. You have whole companies without a single Windows machine.
Simple - because it's the OS I use. If I primarily used Windows, I'd probably be learning C#/.NET instead of ObjC/Cocoa.
Also, I would say Mac software has a better reputation than Windows software ("free Windows software" could be described as a synonym for "spyware"). There's a really good community around OS X software (sites like TUAW, iusethis.com, etc.), which I guess is because OS X has become popular quite recently, at the same time as all the blogging/social-media stuff has taken off ("good timing", more or less).
I program for Mac OS X because it's the environment I use, and I decided that a good way to ensure the quality of the environment I use was to have a hand in constructing that environment. It helps that there are good communities of Mac users, admins and developers so that understanding the environment, what's hot or not and so on is both easy and quick. That community feeling gives me a chance to both learn from others and show off a bit, and I enjoy both of those things.
I actually moved to OS X from NeXTSTEP, and the reason I was on NeXT was exactly the same - it was what I used. The computing lab at Uni was NeXT-based, it was a good UNIX platform which was easy to use and to grok and I realised that if I was developing for this then I'd be contributing to the quality of my own user experience, and there'd probably be a job in it once I'd graduated (there was) ;-).
Like some of the other posters I also like the Objective-C language, but it's not a deal-clincher for me. I've written (and still do write) tools in C and Python, and have previously maintained WebObjects Java code. There's nothing wrong with any of these things.
I've been a Mac user since 1984 and an Apple user for even longer...
Nonetheless, I am still always inspired by how many amazing resources are available to developers right out of the box... (Sometimes it takes a few years for those resources to get properly documented, but they are there and they are amazing.) Examples: Core Audio, Core Graphics, Core Video, the FireWire SDK, etc., etc.
I am not an evangelist and I am not paid by Apple; just my $0.02.
|K<
The Mac market is growing at an incredible rate. It's awesome to have your market grow without any work on your part ;)
There are a few good reasons to develop for Mac:
1. Apple provides good tools such as Xcode, and the language is well documented, with a large community for support.
2. It is very easy to port applications from OS X to iPhone/iPad.
3. Both OS X and iOS (iPhone and iPad) have app stores in which people buy lots of software, providing a large potential audience and therefore an easy way for developers to make money from their applications.
My company is planning to build a simulation tool for (beverage) processing, and we're currently looking at a half-baked system written in Xojo. I had personally never heard of this language and would appreciate it if anyone could give a quick assessment.
We have no in-house Xojo competence at all and are of course reluctant to bring in a system that would require a big investment in know-how for just one system.
So, we're now looking at our options: Port it to a language we're good at (C# or Java) or continue development in Xojo while building internal skills for the language.
So, what are the big pros and cons of Xojo?
Cheers
Xojo has been around since the late 1990s, originally under the name REALbasic. Its strength lies in its ability to make native-looking and -behaving apps for many platforms, mainly OS X but also Windows and even Linux. The dev community is fairly small, though. But the company has managed to stay in business all this time and isn't looking to close up shop any time soon.
The language is fairly simple and easy to learn, using long known concepts (its design was based on Visual Basic).
Knowing Java, it should be easy to grasp the language. The bigger hurdle is probably getting familiar with its library. Many things are much simpler to accomplish in Xojo than in Java, though.
Call me lazy, but that's what I like about Xojo. I also program in ObjC with Xcode, but for those little tools that just need to work quickly, Xojo is superior for whipping up a program that has a decent UI and works on many platforms with little to no tweaking.
If you need cross-platform support, give it a try, for sure. If you only need the app to run on a single platform, and if you have skills with other dev systems, I'd advise against starting out with Xojo, to avoid the risks of going with such a small company offering closed-source software.
In your particular case, where you already have a half-working solution, I suggest you take a few days to familiarize yourself with it and get a feeling for it (you can use Xojo for free as long as you don't build standalone apps with it). It's overall fairly stable, and I'm still using a three-year-old version most of the time to develop and build my apps. So even if Xojo should suddenly go out of business, I wouldn't be too worried. As long as you stick to the simple functionality (e.g., don't use unique features such as XojoScript), you can still convert the app to another language later, but there's also a fair chance you'll never have to.
If you are looking for someone to take a look at your Xojo project, I'd recommend posting on the Xojo Find A Developer page at http://www.xojo.com/support/consultants.php, where all Pro developers are listed. The consultants who want to talk to you about it will then contact you. (Full disclosure: we, BKeeney Software, are on the list and would be happy to help you figure it all out.)
The recent article by Steve Jobs (link) made me think about the future of Flash. I'm learning ActionScript 3.0 in my studies, but is it still the right decision to go with it? I was pretty sure that I would be able to build applications in AS3 for iPhones/iPads in the near future. It seems to me that if I stay with Flash, the market will be polarized between Apple and Adobe, and you will always work double for both clienteles, or just lose half of them.
Which decision would you make as a designer, if you were still at university and intended to become a freelancer?
This question has come up a lot of times. For my opinion on Flash's future, please look at this answer: Should I Abandon Adobe Flash for HTML5 and <canvas>?
If you are a designer, you will probably feel good working with Adobe's Creative Suite, including Flash CS3/CS4/CS5. CS5 will be able to export HTML5 in the near future: http://www.9to5mac.com/Flash-html5-canvas-35409730 . You shouldn't be too worried. On the other hand, you should consider that whatever CS5 exports is likely to perform worse in HTML5 than in the Flash Player.
From my perspective as a developer, I think there is no harm in learning any language, although ActionScript 3 is relatively boring and easy to grasp. However, this makes it a good language for learning programming, including many best practices. The most important things you learn as a programmer transcend languages. The more languages you try to really fully understand and exploit, the better you become by understanding the approaches they promote.
My personal advice to web developers is to have a look at Haxe. It is a much more powerful, elegant and expressive language than ActionScript, and it allows you to target many platforms; enough to build a whole web app on 'classic' platforms with only one language. Haxe's C++ backend allows building native iPhone apps using an SDL-based port of the Flash Player API, although currently it's not very clear whether Apple's policy will allow distribution. Nonetheless, it is an open-source language with an enthusiastic community that moves really fast and adapts to changes rapidly (e.g., unlike ActionScript, Haxe can leverage Flash Player 10's Alchemy opcodes for fast memory access), making you independent as a developer.
Edit: I have personally dropped any plans of targeting the platform until Apple is willing to ease its very restrictive policies, since I find this kind of behaviour intolerable. Nonetheless, I think Objective-C is a great and inspiring language, so you may actually want to have a look at it.
I think that reports of the death of flash have been greatly exaggerated. Flash has always been "the bad guy" - self-proclaimed experts have always loudly declared that Flash sucks and is on its way out, but oddly enough I've never had any trouble finding lots of Flash work. There are things that you can do quickly and easily in Flash that are either much harder or flat-out impossible without it. It's an amazing tool and it's going to be in use at least in some capacity for the foreseeable future.
That said, even if Flash on the web goes the way of the dodo in two years (which won't happen) it's still a valuable tool. It's a wonderful way to learn Object Oriented Programming, and its uses go far beyond shiny websites. You can use something like Flash Builder in Eclipse to get accustomed to working in a code-oriented IDE, you can build AIR apps to deploy across platforms, you will soon be able to publish saleable apps to every major phone including the iPhone, etc. I've been having a lot of fun with it lately getting it to work with Arduino - it's just a hobby project but I'm trying to build a little helicopter that I can control from an AIR app. I'd be curious to see someone do that in HTML5. ;)
Flash is amazingly powerful - your abilities are in many respects limited only by your imagination and your willingness to figure out how to make it work. It's really bizarre to read all of this stuff about how (some) browsers can now play (certain types of) video on their own, ergo Flash is dead. How unimaginative. :)
This is a tough call. Flash is a fairly dominant technology at this point when it comes to media-intensive web sites, and it is also very popular for delivering mini-games. I do think that Flash video, which is currently also dominant, will gradually be replaced by HTML5 technologies. I'm not so sure that Flash can be replaced easily when it comes to those media-intensive sites; there is a large number of very talented people comfortable with Flash who might be reluctant to adopt other technologies. I would probably hedge my bets and get comfortable with JavaScript and other HTML5 technologies.
The Apple vs. Adobe controversy reveals two opposite views of mobile computing.
Apple wants its developers to make the best of its devices by excluding middleware. The aim is to deliver the best possible user experience.
Adobe wants its developers to publish their work on as many platforms as possible. The aim is to reach the widest audience.
Nobody knows which view will win in the future. The mobile war is just beginning...
I think it depends on how far into the future you want to look, and what you think is most important. Flash on the desktop will not die for a long time, if ever. If that is good enough, keep going where you're going. If not having Flash on the iPhone/iPad is a deal breaker, you only really have two choices: Objective-C or HTML5.
HTML5 is definitely gaining momentum, but it can't be used directly in all browsers yet, and likely for a while. However, in the mobile space, there is pretty excellent support in the major smart phones.
There isn't a single platform/technology/language that can hit everything. If I were going to bet money on the future, though, I would say HTML5 is going to win for the most reach across platforms. And given it is on the rise, I would bet that in the next few years, there will be a lot of demand for good developers in this area, but don't expect the path to be fully paved for you. You'll have to get your hands a little dirty. If you're looking for a decent editor, I use Netbeans, but I also do Java development, so that makes sense for me. Search around, though, and you'll probably find a decent set of tools that work well for you. It is a very active space.
As far as I'm concerned, ActionScript is a pretty good language for learning OOP. JavaScript is a bit shit. Either way, I would expect you'd learn a certain set of (frontend/2D graphics) skills which will come in handy regardless of the vehicle you eventually use to deploy your work.
Personally, I like the ActionScript language used by Flash; it's more structured and object-oriented than JavaScript. Also, it has real inheritance, not the prototype-based crap, and compiles to bytecode. For the artist, Flash is easier to use in many ways due to the available tools.
I do hope for better integration into browsers. The current Flash plugin is clunky and causes crashes for many users; the plugin system also makes it integrate badly into the flow of pages.
With HTML5, I think the browser plugin idiom is dying in general. Everything from video playback to fancy vector animations can be done with just HTML + Javascript. Even a standard for 3D graphics in webpages is on the way (O3D).
Also, I wonder how Adobe will cope with the current explosion of platforms/operating systems/browsers, especially in the mobile realm. At the moment, Flash support for systems other than Windows on the PC isn't much good.
Just as projects like SVGWeb bring SVG capabilities to browsers that don't have native SVG, I would expect that if/when HTML5 gains traction against Flash, there will be conversion capabilities from existing Flash to browsers without Flash. In fact, Adobe already has a conversion from Flash to iPhone using Flash Professional CS5. IMHO, there's too much Flash content in the wild for this not to happen eventually, and there are too many people for whom ActionScript is their primary (or only) language for some conversion not to happen.
Career-wise, the clear long-term trend is away from Flash, and I agree with Tom that hedging your bets is wise. However, HTML5 is still fairly new, and you might do yourself a disservice to ignore Flash at this point. With conversion technologies, a Flash skillset will likely be usable for at least several years.
I'm a software developer. I've been programming in high level languages for a few years.
I would like to know how to take my first step into programming hardware. Not something crazy complicated, but maybe some ordinary CE device? Assume I don't need to put the PCB together with various components, but just have to program the tiny CPU.
How low-level do I have to go? ASM? C? Manipulating registers? Or are the dev kits quite high-level now? Is Java even in the picture? OO coding on hardware: is that a dream or a reality? I need a reality check.
I also tend to learn better with books or sites that are written in a tutorial format. Something that guides the way for me from something simple to something more complex. Any recommendations? Maybe something that will introduce me to the popular hardware (microprocessor/micro-controller) available today?
Much appreciated, thank you everyone.
The actual programming isn't a big deal. The frustrating, annoying part is getting your development environment set up and getting the tools working. Once you've done that, you're half done.
I'd suggest buying a development kit ('dev kit') that has USB built in and works with your chosen OS. Get that working, and you're halfway done.
If you're missing the knowledge, it's also important to know the basics of how a processor works. You'll be programming at a much lower level than any other programming, so the fundamentals are a bit more important.
If you know C, then it's only a matter of learning the toolchain steps to download the code.
An easy place to start (cheap hardware and software): http://www.arduino.cc/en/Guide/HomePage
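To give a feeling for how gentle the entry is, here is the classic first Arduino program, a blinking LED, in the C-style dialect Arduino sketches use. The pin number is an assumption: on most classic boards the on-board LED sits on digital pin 13.

    // Classic "blink" sketch: toggles the on-board LED once per second.
    // Assumes the LED is on digital pin 13 (true for most classic boards).
    void setup() {
        pinMode(13, OUTPUT);        // configure the LED pin as an output
    }

    void loop() {
        digitalWrite(13, HIGH);     // LED on
        delay(1000);                // wait one second
        digitalWrite(13, LOW);      // LED off
        delay(1000);
    }

The IDE hides the toolchain steps (compile, link, flash over USB) behind a single upload button, which is exactly the part that tends to frustrate beginners on other platforms.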
I have been coding in C both as a hobby and professionally for about 16 years now, but always for userland code (i.e., programs, not kernel or drivers). Most of my jobs involved high level languages (I have done a lot of Perl and Ruby programming, with the occasional Java, Python and shell scripting in between). I did develop a lot for MS-DOS (which was probably as close to bare-metal programming as you would get on a x86 machine), but my last job involved 5 years of Perl and Ruby on Rails web development.
That being said, I am now a senior engineer for embedded Linux development, developing drivers (including an emulator for a legacy simple microprocessor inside a kernel module) for uClinux on the Blackfin platform. There are times when my inexperience with hardware-related issues (e.g., floating signal levels due to the lack of a pull-up/pull-down on a pin) did get in the way, but it has been mostly a highly enjoyable and thrilling experience. As stated by others, understanding your tools is essential -- for uClinux, that meant the GNU Toolchain, which fortunately I was already familiar with due to my background in FOSS technologies.
The Blackfin is hardly an entry-level microprocessor (in particular, it does not have an MMU, which has some relevant effects on Linux development), but as already stated, you can buy a Beagleboard for around US$200 with all required accessories and start messing around with it in just a few days. If you want something simpler, there are many Arduino options out there, though if you have some real development experience under your belt, I believe you will find their development environment a little limiting (I know I did).
After you get comfortable with your tools you might want to spend some money on an in-circuit emulator (or ICE). These are usually highly platform specific (both in terms of target architecture and development environment), but are highly recommended for anything beyond the usual blink-LEDs-after-button-press examples I am sure you will quickly outgrow.
In a few months you will find yourself building custom images for hackable consumer devices using Buildroot and having a lot of fun. All I can say is: go for it. It's highly addictive and not particularly expensive to do nowadays.
Something else to look into is Microsoft Robotics Studio. It supports quite a lot of hardware boards (including CE), and with it, it is fairly easy to get a small robot up and running. And what cooler project is there for learning embedded programming?
It all integrates nicely in Visual Studio (express) and their devkit also comes with a free express edition.
Get a beagleboard. Cheap, lots of users (community support will be key), many OS options. http://beagleboard.org/
Well, if you want to know what you're doing, you need to understand the assembly language of the processor and the processor's architecture.
You will need to learn C to be competent in microcontrollers. There is no way around that.
There are some VM-level languages on embedded systems. I see the Java out-of-memory exception from time to time on my cell phone (which also helps give me a strong opinion on VM-level embedded languages).
The ARM has some support for hardware-level Java bytecodes (Jazelle).
Your best bet is to pick up something like the PIC or the Atmel chips and begin hacking with them.
If you want to do it with your existing hardware, get a hypervisor for your PC and begin writing a basic kernel.
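If you do go the PIC/Atmel route, here is a hedged taste of what "manipulating registers" means in practice: a minimal bare-metal blink for an Atmel AVR in plain C. The register and macro names come from avr-libc; the LED pin and the 1 MHz clock are assumptions.

    #include <avr/io.h>          // memory-mapped register definitions (DDRB, PORTB)

    #define F_CPU 1000000UL      // assumed CPU clock: 1 MHz (factory default on many AVRs)
    #include <util/delay.h>      // busy-wait delay helpers; need F_CPU defined first

    int main(void)
    {
        DDRB |= (1 << DDB0);             // configure port B, pin 0 as an output
        for (;;) {
            PORTB ^= (1 << PORTB0);      // toggle the pin: LED on/off
            _delay_ms(500);              // wait half a second
        }
    }

Note the difference from desktop code: no operating system, no standard output, just reading and writing memory-mapped I/O registers in an endless loop.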
I'm thinking about switching my path "slightly" by going into desktop development (VC++, MFC, C#, etc) after about 8 years within embedded telecom systems development (C, MAKE, Symbian, 100 compilers etc, etc).
My concern, however, is that my experience within embedded systems may not give me much value when going into desktop development; for example, that the domain-specific problems and environments I've worked with for so long don't give me much to negotiate salaries with, since they carry little weight on the desktop.
I think this place might be good for input on this.
So, the Q:
If you disregard the obvious generic experience on programming language level, give an example of something you have learned working with embedded systems that you could reuse when working in a desktop environment.
PS:
I should note that I'm no beginner in the desktop area; for many years now, all my hobby projects have been focused on desktop development.
Embedded engineers in general tend to be more disciplined when it comes to validating operations and dealing with finite resources.
This can also translate into coming up with an exception handling strategy earlier on.
The quintessential example is checking the return value of malloc. I have seen very little desktop software consistently check it, but it's commonplace in embedded environments.
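A minimal sketch of that habit in C; the helper function is hypothetical, but the shape of the check is the point:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical helper: duplicate a string, checking the allocation
       the way embedded code routinely does. */
    char *duplicate_message(const char *src)
    {
        char *copy = malloc(strlen(src) + 1);
        if (copy == NULL) {                  /* the check desktop code often skips */
            fputs("out of memory\n", stderr);
            return NULL;                     /* let the caller decide how to degrade */
        }
        strcpy(copy, src);
        return copy;
    }

On a desktop, malloc almost never fails, so the habit atrophies; on a device with 32K of RAM, the failure path is part of the design.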
The discipline of keeping a clean, well-organized set of source code is the key skill that translates well to the "desktop experience". I've noticed that the embedded projects I've written and picked up are often WAY cleaner than their desktop counterparts.
Many desktop-only developers could benefit from the experience of making a program fit in 128K of FLASH and 32K of SRAM, not to mention communicating meaningfully with a user through only an LED or two and a couple of buttons. Making that a requirement might reduce some of the endemic code bloat in the applications industry. :-)
Even if you don't switch tracks to straight application development, the embedded experience translates well to driver development, as well as to low level utilities and to long running services. All of these are also domains where the disciplines that are nearly second-nature to a successful embedded developer remain valuable.
I was a desktop developer for almost 5 years before switching to an embedded environment.
I find working in an embedded environment more challenging, as we have to deal with memory limitations, slow CPU speed, cross-compilation issues, etc.
Having learned a lot of patience, discipline and low-level intricacies, I expect desktop development to be as easy as a walk in the park.
State machines/event driven programming on embedded systems is not that different from event driven programming on the desktop. The depth of experience you have of these coding techniques on embedded systems, especially telecoms embedded systems, should make you a great desktop programmer.
Similarly, your experience with communications protocols should transfer nicely to the desktop. Most desktop applications have some involvement with the network.
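To make the overlap concrete, here is a tiny hedged sketch in C of the switch-on-state event loop common to both worlds. The events and states are hypothetical; on a telecom board the events might come from an ISR, on the desktop from a GUI toolkit's queue, but the shape is the same.

    #include <stdio.h>

    /* Hypothetical events and states, just to show the shared idiom. */
    typedef enum { EV_CONNECT, EV_DATA, EV_DISCONNECT } event_t;
    typedef enum { ST_IDLE, ST_ACTIVE } state_t;

    /* Dispatch one event against the current state; return the next state. */
    static state_t handle_event(state_t state, event_t ev)
    {
        switch (state) {
        case ST_IDLE:
            if (ev == EV_CONNECT) { puts("link up"); return ST_ACTIVE; }
            break;
        case ST_ACTIVE:
            if (ev == EV_DATA) puts("processing data");
            else if (ev == EV_DISCONNECT) { puts("link down"); return ST_IDLE; }
            break;
        }
        return state;                 /* unhandled events leave the state alone */
    }

    int main(void)
    {
        state_t s = ST_IDLE;
        event_t script[] = { EV_CONNECT, EV_DATA, EV_DISCONNECT };
        for (int i = 0; i < 3; i++)   /* stand-in for the real event source */
            s = handle_event(s, script[i]);
        return 0;
    }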
I'm curious to see how popular the alternatives to C are in the embedded developer world, e.g. Ada...
I've only ever used C (with a little bit of assembler), but then my targets have very limited resources. Is there a move elsewhere in this space to something else? What is winning the war in set-top boxes?
If !C, what was the underlying reason?
Compiler support for target
Trace / static analysis tools
other...
Thanks.
Forth is quite popular for embedded development.
Also, while Smalltalk is probably not popular in the embedded community, embedded development is definitely popular in the Smalltalk community.
When you say "embedded development", keep in mind that you have to consider the scale of the project.
When programming something on the scale of a microcontroller or the firmware for an ASIC, you tend to see C and assembly dominate the scene. Embedded developers tend to "specialize" in these languages since compilers for them are available for nearly every embedded target platform. If your project migrates from, say, a chip with a PowerPC core to a chip with an ARM core, you can be fairly confident that your C code will not be overly difficult to port over. Some chips do have compilers available for other languages, but typically they do not match the C compiler in terms of efficiency of the resulting binary. Since embedded systems are often low on resources, system designers want to make their code as efficient as possible (also one reason why you see a lot of assembly language code). I have seen development tools available for languages such as C++, Pascal, Basic, and others, but they are typically niche tools that are not mature enough to match the efficiency of the available C compilers. Debugging tools for these languages also tend to be harder to find than what is available for C/assembly.
You also mentioned set-top boxes. Embedded systems on this scale can pack the equivalent power of a desktop computer from 7-8 years ago. Their available RAM, storage space, and processing power allow them to run full-featured operating systems and interpreters for higher-level languages. On these more powerful systems you will still see C and assembly language being used (for driver code, if nothing else), but other languages (such as Java, Lua, Tcl, Ruby, etc.) are becoming more and more common. Using interpreted languages makes porting code from one platform to another even easier, as long as the platform has sufficient resources to handle the overhead of the language interpreter. Any low-level code that interfaces directly with hardware (drivers) will still typically be written in assembly or C, since high-level languages don't always have the capability to do this sort of thing. Anything running as an application on top of the embedded operating system can usually be developed and tested inside an emulator or virtual machine, and so you will see a lot of code being developed in whatever language the developer happens to be comfortable with.
TLDR version: C is popular because it is a versatile language that nearly all developers are familiar with. Assembly is popular because it allows for low-level hardware access in ways that would otherwise be difficult or impossible. Interpreted/scripted languages such as Java are becoming more popular, but the resource requirements of their interpreters may be too much for some embedded systems to handle. The quality and variety of development/debugging tools available for C and assembly also make these options attractive.
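One concrete habit behind the "C ports easily" point above: using the fixed-width types from <stdint.h> instead of assuming the sizes of int or long. A hedged sketch, with a hypothetical peripheral layout:

    #include <stdint.h>

    /* Hypothetical memory-mapped UART register block. Fixed-width types
       keep the layout identical whether the core is PowerPC, ARM, or
       anything else the C compiler targets. */
    typedef struct {
        volatile uint32_t control;    /* exactly 32 bits on every target */
        volatile uint32_t status;
        volatile uint16_t divider;
        volatile uint16_t reserved;   /* explicit padding, no surprises */
    } uart_regs_t;

    /* Endianness is the one thing stdint does not hide; an explicit
       byte-swap keeps wire formats stable across cores. */
    static inline uint32_t swap32(uint32_t v)
    {
        return (v >> 24) | ((v >> 8) & 0x0000ff00u)
             | ((v << 8) & 0x00ff0000u) | (v << 24);
    }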
Perhaps not quite the large step from C you're looking for, but C++ is also reasonably popular for embedded projects.
I haven't used it myself, but Bascom is quite popular for AVR microcontrollers. It is a Basic IDE that lets you interact with the peripherals very easily. I've met hardware people who use it successfully.
Yes. Java is becoming more popular - many processors have added instructions that pertain primarily to Java and similar languages (.NET). Also, uClinux runs on microcontrollers, so you can use practically any language for some of the larger micros.
Basic is still common, as is assembly.
You'll see Ada in certain gov't projects.
And some engineers are even putting Lua and other interpreters on their micros so their customers can extend the functionality.
But C is still dominant.
-Adam
In the early '90s I did a lot of embedded development on the 8051 using Intel PL/M-51 and the DCX51 operating system.
PL/M is a very simple language, but very powerful.
We now use C.
If you work in the smartcard space, you get to use Java Card. Yep, Java, on an 8-bit micro. It's kinda fun, actually. I get to develop in Eclipse, test ( & debug!) on the PC simulator, and can be confident that it'll run the same on the card. It's just such a pity Java is a terrible language for embedded apps :)
I've used EC++ (Embedded C++) quite extensively.
Also, PICBasic has been popular with the PIC'ers for eons now.
I have used Ada in an embedded project for military avionics because of customer requirements. There are lots of Ada tools for embedded development, but most of them are very expensive. Personally, I would just use C.
There is a Pascal compiler for 8051
JAL
There is a group of folks working to make Lua a viable option for embedded work. They are targeting primarily 32-bit ARMs with 256K FLASH and 64K RAM or better, and seem happy with their work so far.
They are partly inspired by the classic BASIC-Stamp, a BASIC interpreter running in a moderately powerful PIC with the program itself stored in a serial EEPROM device.
At work, I am still maintaining a customer's embedded system that is written in a compiled flavor of BASIC running on a Zilog Z180 CPU. 1980s technology all around, with most of the system still built out of 24-pin DIP packages in sockets. The compiler runs under CP/M-80 running in a Z80 simulator, which itself runs in the MS-DOS simulator built into Windows. Aside from the sheer amazement that anything productive can be done this way (and that you can still buy 27C256 UV-erasable EPROMs, and that my nearly 20-year-old Data I/O PROM programmer still works), I really wish the customer could afford to move to a new hardware design so the system could be rewritten in a maintainable language.
It depends on the microcontroller: many of them have C compilers, but the compilers are horrible; assembler is usually easy and the best-performing, most efficient option. Ones like the MSP430, AVR, and ARM have good C compilers, and for those I would (and do) use C, depending on the problem.
I would stick to C or assembler; you are wasting memory, performance, and resources using anything else.
Pascal and Modula-2 work fine too. Essentially they are pretty much equivalent to C, except for the inability to do alloca (though some have that as an extension).
But the core problem will be the same with any !C compiler: which do you prefer, a better compiler/toolchain or your language of preference?
Although I like the Wirthian languages most, I simply use C and live with the consequences, simply because the toolchain is better.
There have been examples in the past (Pascals, or even tightly compiled Basics), but C is mostly the norm. I never understood why.
I worked on a device which ran some incredibly old version of Python (1.4 or something). There was no way to debug it (other than printing debug messages), so when your code hit an exception, everything would just stop and you'd scratch your head for an hour. Whenever you made a change and upgraded the code it was running, it took about 10 minutes to interpret and compile it.
Needless to say we scrapped that and replaced the microcontroller with one that ran C.
See this related question:
What languages are used for real-time systems programming?
In response to your "why" question, from the standpoint of government/military acquisition, there is a perception that Java (language, platform, etc...) is the lingua franca these days and that economies of scale in the language will reduce acquisition and maintenance cost. There's also a hope that one can efficiently train a competent Java programmer to be a reasonable RT/embedded programmer in Java faster than if they are required to learn a new language. This rationale is suspect, in my opinion, but it does answer the "why" question.
If you include the iPhone as an embedded platform, then Objective-C.
Considering how many times I've had a Java out-of-memory exception on my phone (most of the time I do anything remotely interesting), I'd run away from Java like a bat out of a hot place.
I've heard that Erlang was designed for use in cell phones. I think Lisp is a good architecture for remote device support, if the device can handle the runtime.
A lot of home-brew users and small companies needing a cheap solution have found that the Tiny Tiger and the BASIC Stamp (using BASIC) meet their needs.