Which platform should I choose for scientific computing? - GPU

What are the pros and cons of choosing the PS3 as a platform for scientific computing instead of GPUs? Is it the better choice?

Stick with a PC; you will have a far easier life at the end of the day. I also wouldn't be surprised if you got more horsepower out of GPUs.
P.S. From what I know, dispatching work to the Cell's SPEs is not an enjoyable task :D

I'd go for GPU, for three reasons:
(a) GPU code can be developed, tested, and run on pretty much any PC you may want to use, with the only dependency being a $150 video card, whereas CELL/PS3 is a much more custom development environment and won't run natively on your laptop, etc.;
(b) I'm willing to bet a lot that GPUs and Cuda will be alive and well in 5 years, but I wouldn't put money on PS3 being around that long -- what are you going to do if PS4 has a totally different architecture and CELL effectively dies?
(c) There's a more vibrant research and development community around GPU than there is around PS3/Cell (outside of strict game development), so you're likely to be in more good company, have example code and tools to work with, etc.

There is no broad "better" choice; it all depends on the situation and what you're doing. Probably the biggest pro for the PS3 is that it's cheap by comparison. A PC can scale up further, though (for a price), when you look into things like CUDA.

CUDA is pretty slick. I was recently shown a presentation demonstrating how easy it is to get at the power of the GPU's many cores using a C++-based syntax. If I were starting a parallel computing project now, I would probably take the PC/GPU-based route.
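For a flavour of that C++-based syntax, here is the canonical vector-add that most CUDA introductions open with; a minimal sketch (not from the presentation), with block/thread counts chosen arbitrarily for illustration:

#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread computes one element of the result.
__global__ void vectorAdd(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host-side data.
    float* ha = new float[n];
    float* hb = new float[n];
    float* hc = new float[n];
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device-side buffers, and copies host -> device.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(da, db, dc, n);

    // Copy the result back and spot-check it.
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    std::printf("hc[0] = %f\n", hc[0]); // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    delete[] ha; delete[] hb; delete[] hc;
    return 0;
}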

A major objection to the PS3 (which is already quite a wacky choice unless you're under some pretty extreme price/performance constraints) has to be that Sony are dropping support for installation of other OS. In future, PS3s without the disabling firmware update may become harder and harder to get hold of.

Related

Which SoC for making a Rubik's Cube Solver?

Is it wiser to use a Raspberry Pi or an Intel Galileo for making a Rubik's Cube solver? Programming language isn't a major issue, although Python would be slightly preferred.
The major constraint is that there is only one PWM pin on the Raspberry Pi, and we're thinking about using servo motors to rotate the cube. What do you think?
Major differences:
PWM pins
Processor
RAM
While this is hardly the kind of question to ask on this forum, I will attempt NOT to confuse you with my answer.
Before trying to answer which is best, there are a few other things you need to ask yourself:
What does your design involve other than the processor? Do you want to use six servos, one for each face of the cube? Do you have a more cost-efficient design that involves fewer servos? How many I/O pins do you actually need?
RAM and processor type are factors to consider when it comes to how fast your algorithm will run. Are you trying to make the fastest Rubik's Cube solver in the world, or just one that can actually solve the problem?
Is cost a factor? Both platforms are decently priced, but there is a difference between them which may matter when you are on a budget.
The Galileo platform is newer than the Pi; you are more likely to find answers to your questions when going with the more popular platform.
Is the programming language important? This comes back to how fast you want the algorithm to run. A C implementation will run faster than a Python implementation, but ultimately I think it's better to stick with what you are more comfortable with.
On a personal note, I would probably go with the Pi because there's a huge community built around it and you can find plug-in expansion boards for almost anything you can think of, which will allow you to focus on software without worrying too much about the hardware side of things.

Does it matter which microcontroller to use for a first-time embedded systems programmer?

I've been doing desktop and web programming for a few years. I would like to move on to doing some embedded systems programming. After asking the initial question, I wonder which hardware / software IDE I should start on...
Arduino + Arduino IDE?
Atmel AVR + AVR Studio 4?
Freescale HCS12 or Coldfire + CodeWarrior?
Microchip PIC+ MPLAB?
ARM Cortex-M3 + ARM RealView / WinARM
Or... doesn't matter?
Which development platform is the easiest to learn and program in (taking IDE usability into consideration)?
Which one is the easiest to debug if something goes wrong?
My goal is to learn about "how IO ports work, memory limitations/requirements incl. possibly paging, interrupt service routines." Is it better to learn one that I'll use later on, or the high level concept should carry over to most micro-controllers?
Thanks!
Update: how is this dev kit for a start? Comments? Suggestions?
Personally, I'd recommend an ARM Cortex-M3 based microcontroller. The higher-power ARM cores are extremely popular, and these low-power versions could very well take off in a space that is still littered with proprietary 8/16-bit cores. Here is a recent article on the subject: The ARM Cortex-M3 and the convergence of the MCU market.
The Arduino is very popular with hobbyists. Atmel's peripheral library is fairly common across its processor types, so it would smooth a later transition from an AVR to an ARM.
I don't mean to claim that an ARM is better than an AVR or any other core. Choosing an MCU for a commercial product usually comes down to peripherals and price, followed by existing code base and development tools. Besides, microcontrollers are generally much, much simpler than a desktop PC, so it's really not that hard to move from one to another after you get the hang of it.
Also, look into FreeRTOS if you are interested in real-time operating system (RTOS) development. It's open source and contains a nice walk through of what an RTOS is and how they have implemented one. In fact, their walk-through example even targets an AVR.
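To give a taste of it, the core of a FreeRTOS program is just tasks plus the scheduler. Below is a minimal sketch; toggle_led() is a placeholder for your board's LED routine, and the stack size and priority are illustrative:

#include "FreeRTOS.h"
#include "task.h"

// Placeholder for whatever board-specific LED call you have.
extern void toggle_led(void);

// A trivial task: toggle an LED every 500 ms, yielding in between.
static void vBlinkTask(void* pvParameters)
{
    (void)pvParameters;
    for (;;)
    {
        toggle_led();
        vTaskDelay(pdMS_TO_TICKS(500));
    }
}

int main(void)
{
    xTaskCreate(vBlinkTask,               // task entry point
                "blink",                  // human-readable name
                configMINIMAL_STACK_SIZE, // stack depth (illustrative)
                NULL,                     // no parameter
                tskIDLE_PRIORITY + 1,     // just above the idle task
                NULL);                    // no handle needed
    vTaskStartScheduler();                // never returns on success
    for (;;);                             // reached only if the scheduler failed
}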
Development tools for embedded systems can be very expensive. However, there are often open source alternatives for the more open cores like ARM and AVR. For example, see the WinARM and WinAVR projects.
Those toolchains are based on GCC and are thus also available (and easier to use, IMHO) on non-Windows platforms. If you are familiar with using GCC, then you know that there is an abundance of "IDEs" to suit your taste, from EMACS and vi (my favorite) to Eclipse.
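For a taste of what those toolchains look like in use, here is the canonical avr-libc LED blinker, assuming an ATmega328P with an LED on PB5 (Arduino pin 13) and a build line along the lines of avr-gcc -mmcu=atmega328p -DF_CPU=16000000UL:

#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= _BV(DDB5);          // configure PB5 as an output
    for (;;)
    {
        PORTB ^= _BV(PORTB5);   // toggle the pin, and thus the LED
        _delay_ms(500);         // busy-wait half a second
    }
}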
The commercial offerings can save you a lot of headaches getting set up. However, the choice of one will very much depend on your target hardware and budget. Also, some hardware supports direct USB debugging, while other parts may require a pricey JTAG adapter.
Other Links:
Selection Guide of Low Cost Tools for Cortex-M3
Low-Cost Cortex-M3 Boards:
BlueBoard-LPC1768-H ($32.78)
ET-STM32 Stamp Module ($24.90)
New Arduino to utilize an ARM Cortex-M3 instead of an AVR microcontroller.
Given that you already have programming experience, you might want to consider getting an Arduino and wiping out the firmware to do your own stuff with AVR Studio + WinAVR. The Arduino gives you a good starting point in understanding the electronics side of it. Taking out the Arduino bootloader would give you better access to the Atmel's innards.
To get at the goals you're setting out, I would also recommend exploring desktop computers more deeply through x86 programming. You might build an x86 operating system kernel, for instance.
ARM is the most widely used embedded architecture and covers an enormous range of devices from multiple vendors at a wide range of costs. That said, there are significant differences between ARM7, ARM9, ARM11, and Cortex devices - especially Cortex. However, if getting into embedded systems professionally is your aim, ARM experience will serve you well.
8-bit architectures are generally easier to use, but often very limited in both memory capacity and core speed. Also, because they are simple to use, 8-bit skills are relatively easy to acquire, so they are a less attractive skill to a potential employer, being easy to fulfil internally or with less experienced (and therefore less expensive) staff.
However, if this is a hobby rather than a career, the low cost of parts, boards, and tools, and the ease of use may make 8-bit attractive. I would suggest AVR, simply because it is supported by the free avr-gcc toolchain. Some 8-bit targets are supported by SDCC, another open-source C compiler. I believe Zilog make their Z8 compiler available for free, but you may need to pay for the debug hardware (although this is relatively inexpensive). Many commercial tool vendors provide code-size-limited versions of their tools for evaluation and non-commercial use, but beware: most debuggers require specialist hardware which may be expensive, although in some cases you can build it yourself if you only need basic functionality and low speeds.
Whatever you do, do take a look at www.embedded.com. If you choose ARM, I have used WinARM successfully on commercial projects, although it is not built-for-comfort! A good list of ARM resources is available here. For AVR, definitely check out www.avrfreaks.net.
I would only recommend Microchip PIC parts (at least the low-end ones) for highly cost-sensitive projects where the peripheral mix is a good fit for the application, not for learning embedded systems. PIC is more of a branding than an architecture: the various ranges PIC12, 16, 18, 24, and PIC32 are very different from each other, so learning on one does not necessarily stand you in good stead for using another - often you even need to purchase new tools! That said, the dsPIC, which is based on the PIC24 architecture, may be a good choice if you want to get some simple DSP experience at the same time.
In all cases, check out compiler availability (especially if C++ support is a requirement) and cost, and debugger hardware requirements, since these will often be the most expensive parts of your dev kit; the boards and parts are often the least expensive part.
This is kind of a hard question to answer, as the ideal answer very much depends on what it is you're interested in learning.
If your goal is just to dive a little deeper into the inner workings of computing systems, I would almost recommend you forgo the embedded route and pick up a book on writing a Linux kernel module. Write something simple that reads a temperature sensor off the SMBus or something like that.
If you're looking at getting into high-level (phones, etc.) embedded application development, download the Android SDK; you can code in Java under Eclipse, and it even has a nice emulator.
If you're looking at getting into the "real" microcontroller space and really taking a look at low-level systems programming, I would recommend you start on a very simple architecture such as an AVR or PIC, something without an MMU.
Diving into the middle ground, for example an ARM with an MMU and some sort of OS, be it Linux or otherwise, is going to be a bit of a shock: without a background in both systems programming and hardware interfacing, I think the transition will be very rough if you plan to do much more than write very simple apps that count button presses or the like.
Texas Instruments has released a very interesting development kit at a very low price: the eZ430-Chronos Development Tool contains an MSP430 with a display and various sensors in a sports watch, including a USB debug programmer and a USB radio access point, for $50.
There is also a wiki containing lots and lots of information.
I have already created a Stack Exchange proposal for the eZ430-Chronos kit.
No, it doesn't matter, if you want to learn how to program an embedded device. But you need to know the flow of where to start and where to go next, because there are many microcontrollers out there and you won't know which one to choose. So it's better to have a road map before starting.
In my view you should start with any AVR board (an ATmega328P-based Arduino board or a plain AVR board), then move to ARM microcontrollers: first an ARM7TDMI, then an ARM Cortex-M3 board. This will give you an overall view, after which you can choose any board depending on what kind of project you are working on and what your requirements are.
Whatever you do, make sure you get a good development environment. I am not a fan of Microchip's development tools even though I like their microcontrollers (I have been burned too many times by MPLAB + ICD, too much hassle and dysfunction). TI's 2800 series DSPs are pretty good and have an Eclipse-based C++ development environment which you can get into for < US$100 (get one of the "controlCARD"-based experimenter's kits like the one for the 28335) -- the debugger communications link is really solid; the IDE is good although I do occasionally crash it.
Somewhere out there are ICs and boards that are better; I'm not that familiar with the embedded microcontroller landscape, but I don't have much patience for poor IDEs with yet another software tool chain that I have to figure out how to get around all the bugs.
Some recommend the ARM. I'd recommend it, not as a first platform to learn, but as a second platform. ARM is a bit complex as a platform to learn the low-level details of embedded, because its start-up code and initialisation requirements are more complicated than many other micros. But ARM is a big player in the embedded market, so well worth learning. So I'd recommend it as a second platform to learn.
The Atmel AVR would be good for learning many embedded essentials, for 3 main reasons:
Architecture is reasonably straight-forward
Good development kits available, with tutorials
Fan forum with many resources
Other micros with development kits could also be good—such as MSP430—although they may not have such a fan forum. Using a development kit is a good way to go, since they are geared towards quickly getting up-and-running with the micro, and foster effective learning. They are likely to have tutorials oriented towards quickly getting started.
Well, I suppose the development kits and their tutorials are likely to gloss over such things as bootloaders and start-up code, in favour of getting your code to blink the LED as soon as possible. But that could be a good way to get started, and you can explore the chain of events from "power-on" to "code running" at your pace.
I'm no fan of the PICs, at least the PIC16s, due to their architecture. It's not very C-friendly. And memory banks are painful.
It does matter; you need to gradually acquire experience, starting with simpler systems. Note that by simpler I don't mean less powerful: I mean ease of use, ease of setup, etc. In that vein I would recommend the following (I have no vested interest in any of the products; I just found them the best):
I've started using one of these (MBED developer board). The big selling points for me were that I could code in C or C++, the straightforward connection via USB, and the slick online development environment (no local tool installation required at all!).
http://mbed.org/
Five minutes after opening the box I had a sample blinky program (the 'hello world' of the embedded world) running, as follows:
#include "mbed.h"
DigitalOut myled(LED1);
int main()
{
while(1)
{
myled = 1;
wait(0.2);
myled = 0;
wait(0.2);
}
}
That's it! Above is the complete program!
It's based on an ARM Cortex-M3, fast and with plenty of memory for embedded projects (100 MHz, 256K flash & 32K RAM). The online dev tools have a very good library and plenty of examples, and there's a very active forum with plenty of help on connecting devices to the MBED, etc.
Even though I have plenty of experience with embedded systems (ARM7/9, Renesas M8/16/32, Coldfire, Zilog, PIC, etc.) I still found this a refreshingly easy system to get to grips with while having serious capability.
After initially playing with it on a basic breadboard, I bought a base board from these guys: http://www.embeddedartists.com/products/lpcxpresso/xpr_base.php?PHPSESSID=lj20urpsh9isa0c8ddcfmmn207. This has a pile of I/O devices (including a miniature OLED and a 3-axis accelerometer). From the same site I also bought one of the LPCXpresso processor boards, which is cheap, with less power/memory than the MBED but perfect for smaller jobs (it still hammers the crap out of PIC/ATmega processors). The base board supports both the LPCXpresso and the MBED. Purchasing the LPCXpresso processor board also got me an attached JTAG debugger and an offline dev environment (Code Red's GCC/Eclipse-based dev kit). This is much more complex than the online MBED dev environment, but is a logical progression after you've gained experience with the MBED.
With reference to my original point, note that the MBED controller is much more capable than the LPCXpresso controller BUT is much simpler to use and learn with.
I use Microchip PICs; it's what I started on. I mainly got going on them thanks to the "123 Microcontroller Projects for the Evil Genius" book. I took a microprocessors class at school for my degree and learned a bit about interrupts and timing, and this helped a ton with my microcontrollers. I suppose some of the other programmers etc. may be better/easier, but at $36 for the PICkit 1, I'm too cheap to go buy another one... and frankly, without using them, I don't know if they are easier/better. I like mine and recommend it every chance I get. It took me forever to really actually look at it, but I was finally able to program another chip off-board with ICSP. I don't know what other programmers do it, but for me that's the nicest thing: a 5-wire interface and you're programmed. Can't beat that with a stick...
I've only used one of those.
The Freescale is a fine chip. I've used HC-something chips for years for little projects. The only caveat is that I wouldn't touch CodeWarrior Embedded with a 10-foot pole. You can find little free C compilers and assemblers (I don't remember the name of the last one I used) that do the job just fine. CodeWarrior was big and confusing, and regardless of how much I knew about the chip architecture and C programming, it always seemed to only make things harder. If you've used CodeWarrior on the Mac back in the old days and think CW is pretty neat, well, it's not at all like that. CW Embedded looks vaguely similar, but it works very differently, and not very well.
A command-line compiler is generally fine. Professionals who can shell out the big bucks get expensive development environments, and I'm sure they make things better, but without that it's still far better than writing assembly code for a desktop PC in 1990, and somehow we managed to do that just fine. :-)
You might consider a RoBoard. Now, this board may not be what you are looking for in terms of a microcontroller, but it does have the advantage of being able to run Windows or DOS and thus you could use the Microsoft .NET or even C/C++ development tools to fiddle around with things like servos or sensors or even, what the heck, build a robot! It's actually kinda fun.
There's also the Axon II, which has the ATmega640 processor.
Either way, both boards should help you achieve your goal.
Sorry for the robotics focus, just something I'm interested in and thought it may help you too.
I use PICs, but would consider Arduino if I chose today. But from your goals:
how IO ports work
memory limitations/requirements
interrupt service routines
I wonder if your best bet is just to hack on the Linux kernel?
BBC Micro Bit
https://en.wikipedia.org/wiki/Micro_Bit
This cheap little board (~20 pounds) was created by ARM Holdings as an educational device, and 1M units were given out for free to UK students.
It contains an ARM Cortex-M0, the smallest ARM core of all.
I recommend it as a first micro-controller board due to its wide availability, low cost, simplicity, and the fact that it introduces you to the ARM architecture, which has many more advanced boards also available for more serious applications.

embedded application

Over the last two months I've worked on a simple application using a computer vision library (OpenCV).
I wish to run that application directly from the webcam without the need for an OS. I'm curious to know whether my application can be burned onto a chip so that no OS is needed to run it.
Of course the process can be expensive, but I'm just curious. Do you have any links about that?
P.S.: the application is written in C.
I'd use something bigger than a PIC, for example a small 32 bit ARM processor.
Yes. It is theoretically possible to port your app to PIC chips.
But...
There are C compilers for the PIC chip; however, due to the limitations of a microcontroller, you might find that the compiler and the microcontroller itself are far too limited for computer vision work, especially if your initial implementation of the app was done on a full-blown PC:
You'll only have integer math available to you, in most cases, if not all (don't quote me on that, but our devs at work don't have floating-point math for their PIC apps and it causes many foul words to emanate from their cubes). Either that, or you'll need to hook up an external math coprocessor. (A fixed-point sketch follows this list.)
You'll have to figure out how to get the PIC chip to talk USB to the camera. I know this is possible, but it will require additional hardware and R&D time.
If you need strict timing control, you might even have to program the app in assembler.
You'd have to port portions of OpenCV to the PIC chip, if it hasn't been done already. My guess is not.
If you're not already familiar with microcontroller programming, you'll need some time to get up to speed on the differences between desktop PC programming and microcontroller programming, and you'll have to gain some experience in that. This may not be an issue for you.
Basically, it would probably be best to re-write the whole program from scratch given a PIC chip's constraints. The good thing is, though, that you've done a lot of design work already. It would mainly be hardware/porting work.
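On the integer-math point: the usual workaround is fixed-point arithmetic, where an integer carries an implied binary point. A minimal sketch; the Q16.16 format here is an arbitrary choice for illustration:

#include <cstdint>
#include <cstdio>

typedef int32_t fix16;          // Q16.16: 16 integer bits, 16 fractional bits
#define FIX16_ONE (1 << 16)     // the value 1.0 in Q16.16

static fix16 fix16_from_int(int x) { return (fix16)x << 16; }

static fix16 fix16_mul(fix16 a, fix16 b)
{
    // Widen to 64 bits so the intermediate product can't overflow,
    // then shift back down to Q16.16.
    return (fix16)(((int64_t)a * b) >> 16);
}

int main()
{
    fix16 half = FIX16_ONE / 2;                 // 0.5
    fix16 ten  = fix16_from_int(10);            // 10.0
    fix16 r    = fix16_mul(half, ten);          // 5.0
    std::printf("%f\n", r / (double)FIX16_ONE); // prints 5.000000
    return 0;
}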
OR...
You could try using a small embedded x86 single-board PC, perhaps in the PC/104 form factor, with your OS/app on a CF card. It's a real bona fide PC; you just add your software. The good thing is, you probably wouldn't have to re-write your app unless it had a ridiculous memory footprint. Embedded PC vendors are starting to ship boards based on 1 GHz Intel Atoms, and if you needed more help you could perhaps hook a daughterboard onto the PC/104 bus. You'll work around all of the limitations listed above, as you're using an equivalent platform to the PC you developed your app on. And it has USB ports! If you do a thorough cost analysis, and if you're OK with a larger form factor, you might find it cheaper/quicker to use a system based on an SBC than rolling a solution using PIC chips/microcontrollers.
A quick search for PC/104 on Google will reveal many vendors of SBCs.
OR...
And this would be really cheap: just get an off-the-shelf cheap netbook, overwrite the OEM OS, and run the code on there. Hackish, but cheap and really easy; your hardware issues would be resolved within a week.
Just some ideas.
I think you'll find this might grow into a pretty large project.
It's obviously possible to implement a stand-alone hardware solution to do something like this. Off the top of my head, Rabbit's solutions might get you to the finish-line faster. But you might be able to find some home-grown Beagle Board or Gumstix projects as well.
Two Google links I wanted to emphasize:
Rabbit: "Camera Interface Application Kit"
Gumstix: "Connecting a CMOS camera to a Gumstix Connex motherboard"
I would second Nate's recommendation to take a look at Rabbit's core modules.
Also, GHI Electronics has a product called the Embedded Master that runs the .NET Micro Framework and has USB host/device capabilities built in, as well as a rich library that is a subset of the .NET Framework. It runs on an ARM processor and is fairly inexpensive (over $85). Though not nearly as cheap as a single PIC chip, it does come with a lot of glue logic pre-built onto the module.
CMUCam
I think you should have a look at the CMUcam project, which offers affordable hardware and an image processing library which runs on their hardware.

Multi core programming

I want to get into multi-core programming (not language-specific) and wondered what hardware could be recommended for exploring this field.
My aim is to upgrade my existing desktop.
If at all possible, I would suggest getting a dual-socket machine, preferably with quad-core chips. You can certainly get a single-socket machine, but dual-socket would let you start seeing some of the effects of NUMA memory that are going to be exacerbated as the core counts get higher and higher.
Why do you care? There are two huge problems facing multi-core developers right now:
The programming model. Parallel programming is hard, and there is (currently) no getting around this. A quad-core system will let you start playing around with real concurrency and all of the popular paradigms (threads, UPC, MPI, OpenMP, etc.); see the OpenMP sketch after this list.
Memory. Whenever you have multiple threads, there is going to be contention for resources, and the memory wall is growing larger and larger. A recent article at Ars Technica outlines some (very preliminary) research at Sandia that shows just how bad this might become if current trends continue. Multicore machines are going to have to keep everything fed, and this will require that people be intimately familiar with their memory system. Dual-socket adds NUMA to the mix (at least on AMD machines), which should get you started down this difficult road.
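Of the paradigms above, OpenMP is probably the gentlest on-ramp. A minimal sketch of a parallel reduction (compile with something like g++ -fopenmp):

#include <cstdio>
#include <omp.h>

int main()
{
    const int n = 1000000;
    double sum = 0.0;

    // Fork a team of threads; each gets a chunk of the loop iterations.
    // reduction(+:sum) gives every thread a private partial sum and
    // combines them at the end, avoiding contention on 'sum'.
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < n; ++i)
        sum += 1.0 / (i + 1);

    std::printf("harmonic(%d) = %f, max threads = %d\n",
                n, sum, omp_get_max_threads());
    return 0;
}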
If you're interested in more info on performance inconsistencies with multi-socket machines, you might also check out this technical report on the subject.
Also, others have suggested getting a system with a CUDA-capable GPU, which I think is also a great way to get into multithreaded programming. It's lower level than the stuff I mentioned above, but throw one of those on your machine if you can. The new Portland Group compilers have provisional support for optimizing loops with CUDA, so you could play around with your GPU even if you don't want to learn CUDA yourself.
Quad-core, because it'll permit you to work on problems where the number of concurrent processes is > 2, which is often what makes a problem non-trivial.
I would also, for sheer geek squee, pick up a nice NVidia card and use the CUDA API. If you have the bucks, there's a stand-alone CUDA workstation that plugs into your main computer via a cable and an expansion slot.
It depends what you want to do.
If you want to learn the basics of multithreaded programming, then you can do that on your existing single-core PC. (If you have 2 threads, then the OS will switch between them on a single-core PC. Then when you move to a dual-core PC they should automatically run in parallel on separate cores, for a 2x speedup). This has the advantage of being free! The disadvantages are that you won't see a speedup (in fact a parallel implementation is probably slightly slower due to overheads), and that buggy code has a slightly higher chance of working.
However, although you can learn multithreaded programming on a single-core box, a dual-core (or even HyperThreading) CPU would be a great help.
If you want to really stress-test the code you're writing, then as "blue tuxedo" says, you should go for as many cores as you can easily afford, and if possible get hyperthreading too.
If you want to learn about algorithms for running on graphics cards - which is a very different area to x86 multicore - then get CUDA and buy a normal nVidia graphics card that supports it.
I'd recommend at least a quad-core processor.
You could try tinkering with CUDA. It's free, not that hard to use and will run on any recent NVIDIA card.
Alternatively, you could get a PlayStation 3 and the Linux SDK and work out how to program a Cell processor. Note that the next cheapest option for Cell BE development is an order of magnitude more expensive than a PS3.
Finally, any modern motherboard that will take a Core Quad or quad-core Opteron (get a good one from Asus or some other reputable manufacturer) will let you experiment with a multi-core PC system for a reasonable sum of money.
The difficult thing with multithreaded/multicore programming is that it opens a whole new can of worms. The bugs you'll be faced with are usually not the ones you're used to. Race conditions can remain dormant for ages until they bite, and your mainstream language compiler won't assist you in any way. You'll get random data and/or crashes that only happen once a day/week/month/year, usually under the most mysterious conditions...
One thing remains true, fortunately: the higher the concurrency exhibited by a computer, the more race conditions you'll unveil.
So if you're serious about multithreaded/multicore programming, then go for as many CPU cores as possible. Keep in mind that neither hyperthreading nor SMT allows for the level of concurrency that multiple cores provide.
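A minimal sketch of the kind of latent race described above: two threads doing an unsynchronized increment. On a single core it may appear to work for ages; on a multi-core box it fails almost every run:

#include <iostream>
#include <thread>

int counter = 0; // shared and deliberately unsynchronized

void hammer()
{
    for (int i = 0; i < 1000000; ++i)
        ++counter; // read-modify-write: not atomic
}

int main()
{
    std::thread t1(hammer);
    std::thread t2(hammer);
    t1.join();
    t2.join();

    // Expected 2000000; with two cores the interleaved updates lose
    // increments, and the result varies from run to run.
    std::cout << counter << '\n';
    return 0;
}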
I would agree that, depending on what you ultimately want to do, you can probably get by with just your current single-core system. Multi-core programming is basically multi-threaded programming, and you can certainly do that on a single-core chip.
When I was a student, one of our projects was to build a thread-safe implementation of the malloc library for C. Even on a single-core processor, that was more than enough to cure me of my desire to get into multi-threaded programming. I would try something small like that before you start thinking about spending lots of money.
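Reduced to its essence, that exercise is about putting a lock around the allocator's shared bookkeeping. A minimal sketch (not the actual course assignment), with std::malloc standing in for a custom allocator's internals:

#include <cstdlib>
#include <mutex>

static std::mutex heap_lock; // serializes access to the allocator's state

void* ts_malloc(std::size_t size)
{
    // A real implementation would lock around its own free-list
    // bookkeeping; one global lock is the simplest correct scheme.
    std::lock_guard<std::mutex> guard(heap_lock);
    return std::malloc(size);
}

void ts_free(void* p)
{
    std::lock_guard<std::mutex> guard(heap_lock);
    std::free(p);
}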
I agree with the others where I would upgrade to a quad-core processor. I am also a BIG FAN of ASUS Motherboards (the P5Q Pro is excellent for Core2Quad and Core2Duo processors)!
The draw for multi-core programming is that you have more resources to get things done faster. If you are serious about multi-core programming, then I would absolutely get a quad-core processor. I don't believe that you should get the new i7 architecture from Intel to take advantage of multi-core processing because anything written to take advantage of the Core2Duo or Core2Quad will just run better on the newer architecture.
If you are going to dabble in multi-core programming, then I would get a good Core2Duo processor. Remember, it's not just how many cores you have, but also how FAST the cores are to process the jobs. My Core2Duo running at 4GHz routinely completes jobs faster than my Core2Quad running at 2.4GHz even with a multi-core program.
Let me know if this helps!
JFV

Is low-level / embedded systems programming hard for software developers? [closed]

Given my background as a generalist, I can cover much of the area from analog electronics to writing simple applications that interface to an RDBMS backend.
I currently work at a company that develops hardware to solve industry-specific problems. We have an experienced programmer who has written business apps, video games, and a whole bunch of other stuff for PCs. But when I talk to him about doing low-level programming, he simultaneously expresses interest and also doubt/uncertainty about joining the project.
Even when talking about PCs, he seems more comfortable operating at the language level than with the lower-level stuff (instruction sets, ISRs). Still, he's a smart guy, and I think he'd enjoy the work once he got over the initial learning hump. But maybe that's my own enthusiasm for low-level stuff talking... If he were truly interested, maybe he would already have started learning things in that direction?
Do you have experience in making that software-to-hardware (or low-level software) transition? Or, better yet, of taking a software-only guy and transitioning him to the low-level stuff?
Edit:
P.S. I'd love to hear from the responders what their own background is -- EE, CS, both?
At the end of the day, everything is an API.
Need to write code for an SPI peripheral inside a microcontroller? Well, get the datasheet or hardware manual, and look at the SPI peripheral. It's one, big, complex API.
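For illustration, "calling" that API looks something like the sketch below; every register address and bit name here is invented, since the real ones come from the part's datasheet:

#include <cstdint>

// Hypothetical memory-mapped SPI registers. The actual addresses and
// bit layouts are defined by the datasheet, not by a software library.
#define SPI_CTRL   (*(volatile uint32_t*)0x40003000) // control register
#define SPI_STATUS (*(volatile uint32_t*)0x40003004) // status flags
#define SPI_DATA   (*(volatile uint32_t*)0x40003008) // shift-register window

#define SPI_CTRL_ENABLE   (1u << 0) // hypothetical peripheral-enable bit
#define SPI_STATUS_TXDONE (1u << 1) // hypothetical transfer-complete flag

// Exchange one byte with the slave device.
uint8_t spi_transfer(uint8_t out)
{
    SPI_CTRL |= SPI_CTRL_ENABLE;              // switch the peripheral on
    SPI_DATA = out;                           // writing data starts the clock
    while (!(SPI_STATUS & SPI_STATUS_TXDONE))
        ;                                     // poll until the hardware is done
    return (uint8_t)SPI_DATA;                 // byte clocked in from the slave
}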
The problem is that you have to understand the hardware and some basic EE fundamentals in order to comprehend what the API means. The datasheet isn't written by or for SW developers; it was written by hardware engineers, for hardware engineers, and maybe for software engineers.
So it's all from the perspective of the hardware (face it: the microcontroller company is a hardware company filled with hardware/ASIC engineers).
Which means the transition is by no means simple and straightforward.
But it's not difficult - it's just a slightly different domain. If you can implement a study program, start off with Rabbit Semiconductor's kits. There's enough software there so a SW guy can really dig in with little effort, and the HW is easy to deal with because everything is wrapped in nice little libraries. When they want to do something complex they can dig into the direct hardware access and fiddle at the lower level, but at the same time they can do some pretty cool things such as build little webservers or pan/tilt network cameras. There are other companies with similar offerings, but Rabbit is really focused on making hardware easy for software engineers.
Alternatively, get them onto the Android platform. It looks like a Unix system to them until they want to do something interesting, and then they'll have the desire to attack that little issue and they'll learn about the hardware.
If you really want to jump in at the deep end, go with an Arduino kit: cheap, free compilers and libraries, pretty easy to start off with, but you have to hook wires up to do something interesting, which might be too big a hurdle for a reluctant software engineer. But with a little help and a few nudges in the right direction, they will be absolutely thrilled to have a little LED display that wibbles* like the Knight Rider lights...
-Adam
*Yes, that's a technical engineering term.
The best embedded programmers I've worked with are EE trained and learned SW on the job. The worst embedded developers are recent CS graduates who think SW is the only way to solve a problem. I like to think of embedded programming as the bottom of the SW pyramid. It's a stable abstraction layer/foundation that makes life easy for the app developers.
"Hard" is an extremely relative term. If you're used to thinking in the tight, sometimes convoluted way you need to for small embedded code (for example, you're a driver developer), then certainly it's not "hard".
Not to "bash" (no pun intended) shell scripters, but if you write perl and shell scripts all day, then it might very well be "hard".
Likewise if you're a UI guy for Windows. It's a different kind of thinking.
Why embedded development is "hard":
1) The context may switch to an interrupt between any two machine instructions. Since high-level language constructs may map to multiple assembly instructions, this might even happen within a single line of code, e.g. long var = 0xAAAA5555. If var is accessed in an interrupt service routine, on a 16-bit processor it might be only half set. (See the sketch after this list.)
2) Visibility into the system is limited. You may not even have output to Hyperterm unless you write it yourself. Emulators don't always work that well or consistently (though they are way better than they used to be). You will have to know how to use oscilloscopes and logic analyzers.
3) Operations take time. For example, say your serial transmitter uses an interrupt to signal when it is time to send another byte. You could write 16 bytes to a transmit buffer, then clear interrupts and wonder why your message is never sent. Timing in general is a tricky part of embedded programming.
4) You are subject to subtle race conditions that occur only rarely and are very difficult to debug.
5) You have to read the manual. A lot. You can't make it work by fooling around. Sometimes 20 things have to be set up correctly to get what you are after.
6) The hardware doesn't always work or is easy to damage, and it takes a while to figure out that you broke it.
7) Software repairs in embedded systems are usually very expensive. You can't just update a web page. A recall can erase any profit you made on the device.
There are probably more but I've got this race condition to solve...
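A sketch of point 1 and its standard fix. The interrupt-masking calls below are stubs standing in for whatever intrinsic your toolchain provides (e.g. cli()/sei() on AVR, __disable_irq()/__enable_irq() on ARM):

#include <cstdint>

volatile uint32_t shared = 0; // also read from an interrupt service routine

// Stubs: replace with your toolchain's real interrupt-masking intrinsics.
static void disable_interrupts() { /* e.g. cli() or __disable_irq() */ }
static void enable_interrupts()  { /* e.g. sei() or __enable_irq()  */ }

void set_shared(uint32_t value)
{
    // On a 16-bit CPU, "shared = 0xAAAA5555;" compiles to two 16-bit
    // stores; an interrupt arriving between them would see a torn value
    // such as 0xAAAA0000. Masking interrupts makes the update atomic
    // with respect to the ISR.
    disable_interrupts();
    shared = value;
    enable_interrupts();
}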
This is very subjective, I guess; his reasons could be many. But if he's like me, I know where he's coming from. Let me explain.
In my career I've dedicated six years to the telecom industry, working a lot on embedding SDK middleware into low-end mobile phones, etc.
Most embedded environments I've experienced are like harsh weather for a programmer, you constantly have to overcome limitations in resources etc. Some might find this a challenge and enjoy it for the challenge itself, some might feel close to "the real stuff" - the hardware, some might feel it limits their creativity.
I'm the kind who feels it limits my creativity.
I enjoy being back in the Windows desktop environment, flapping my wings with elaborate class designs, stretching my legs a few clock cycles extra, using unnecessary amounts of memory for diagnostics, etc.
On certain embedded units in the past, I hardly had support for fseek() (an ANSI C standard file function). If lucky, a "watchdog" could give clues to where something crashed. Not to mention the pain of communicating with the user in single-threaded preemptive swamps.
Well, you know what I'm getting at. In my opinion it's not necessarily hard, but it's quite a leap, with potentially little reuse of your current experience.
Regards
Robert
There is a very real difference in mindset from user-level application development (ie, general purpose PC or Web applications) to hard deadline, real-time response application development (ie, the hardware/software interface).
Interrupts, instruction sets, context switching and hard resource constraints are relatively unknown to your average developer. I'm assuming here that your 'average developer' is not an Electrical/Electronic or other Engineer by training.
The transition for this developer you mention may be well outside his comfort zone. Some of us like stretching like that. Others of us may have decided the view isn't worth the climb.
Likewise, folks who've been in the hardware area (ie, Engineers) often have difficulty with the assumptions and language of software development.
These are gross generalities, of course, but hopefully give some insight.
He needs to be comfortable with the low-level stuff, but mostly for debugging and field issues. There is a serious learning curve depending on the architecture, but not impossible. On the other hand, the low-level code takes (in general) more time and debugging than higher-level code. So if you need to be going back to low-level all the time, then perhaps something isn't right in the design. Even for the embedded controls I've built, I spend the vast majority of time in high-level code. Although when you have issues, it is extremely advantageous to have a very good low-level knowledge.
I am an EE turned software engineer. I prefer programming low-level. Most classically trained software developers that I know do not want to operate at this level; they want APIs to call. So for me it is a win-win: I create the low-level driver and API for them to use. There is a "new" degree, at least new since I went to college, called Computer Engineering. Hmm, it might be an electrical engineering degree rather than computer science, but it is a nice mix of software and digital hardware basics. The individuals I have worked with from this field are much more comfortable with low-level work.
If the individual is not comfortable or willing, then place them somewhere they are comfortable. Let them do documentation or work on the user interface. If all of the work at the company requires low-level work, then this individual needs to do it or find another job. Don't sugar-coat it.
I also think they will enjoy it once they get over the hump: the freedom you have at that level, not hindered by operating systems, etc. Recently I witnessed a few co-workers experience for the first time seeing their software run under simulation - every net within the processor and the other on-chip peripherals. No, you don't have a table in a GUI (debugger) showing the current state of the memory; you have to look at the memory bus, look for the address you are interested in, and watch for a read or write signal and the data bus. I worry about the day the silicon arrives and they no longer have this level of visibility. It will be like an addict in detox.
Well, I cut my teeth on hardware when I started reading Popular Electronics at age 14 - this was BEFORE personal computers, in case you were wondering; and if you weren't, well, now you know anyway. lol
I've done the low-level bit-bang stuff on the 8048/51 microprocessors, done PICs and some other single-chip variations, and of course Rabbit Semiconductor (great if you're into C). That's great (and fun) stuff; yes, there is a different way of looking at things - not harder, but some of that information is a bit harder to come by, as it isn't as widely discussed as the software issues. (Of course, this depends on the circle of friends with which you associate, eh.)
But, having said all of this, I want to remind you of a technology that started to bridge the gap for programmers into the world of hardware and has since become a very MAJOR player, and that is the .NET Micro Framework. You can find information on this technology at the following:
http://msdn.microsoft.com/en-us/embedded/bb267253.aspx
It addresses some of the same issues that .NET web development addressed in that you can use some (quite a bit, actually) of your existing PC based knowledge in the new environments – Some caution, of course, as your target machine doesn’t have 4 GIG of RAM – it may only have 64K (or less)
Starting in version 2.5 of the .NET micro framework, you have access to networking and web services – way kewl, eh? It doesn’t stop there … Want to control the lights in your house? How about a temp recording station? All with the skills you already have. Well, mostly -- Check out the link.
The SDK plugs into your VisualStudio IDE. There are a number of “Development Kits” available for a very reasonable amount of cash – Now, what would normally take a big learning curve in components, building a circuit board and wiring up “stuff” can be done reasonably easy with a dev kit and some pretty simple code – Of course, you may need to do the occasional bit bang operation, but more and more sensor folks are providing .NET micro framework drivers – so, the hardware development may be closer than you think…
Hope it helps...
I like both. Embedded challenges me and really gets me going in a visceral way. Making something that affects the macro physical world is very satisfying. But I've had to do a lot of catching up on the electrical/electronics end, since my bachelor's is in computer science. I have a pretty generalist background, having studied AI, graphics, compilers, natural language, etc. Now I'm doing graduate work in embedded systems. The really tough part is adjusting to the lack of runtime facilities like an operating system.
Low-level embedded programming also tends to include low-level debugging. Which (in my experience) usually involves (at least) the use of an oscilloscope. Unless your colleague is going to be happy spending at least some of the time in physical contact with the hardware and thinking in terms of microseconds and volts, I'd be tempted to leave them be.
Agreed on the "hard" term is quite relative.
I would say different, as you would need to employ different development patterns that you won't use in other kind of environment.
The time constraint for instance could requires a learning curve.
However being curious, would be a quality for a developer, wouldn't be?
You are right in that anyone with enough knowledge not to feel completely lost in an area (over the hump?) will enjoy the challenges of learning something new.
I myself would feel quite nervous moving to the level of instruction sets etc., as there is a huge amount of background knowledge needed to feel comfortable in that environment.
It may make a difference if you are able to support the developer in learning how to do this. Having someone there you can ask and talk through issues with is a huge help in that sort of domain change.
It may be worth assigning the developer to a smaller project with others as a first step and seeing how that goes. If he expresses enthusiasm to try another project, things should flow on from there.
I would say it is not any harder, it just requires a different knowledge set, different considerations.
I think it depends on the way they program in their chosen environment, and on the type of embedded work you're talking about.
Working on an embedded Linux platform, say, is a far smaller jump than trying to write code on an 8-bit platform with no operating system at all.
If they are the type of person that has an understanding of what is going on underneath the api and environment that they are used to, then it won't be too much of a stretch to move into embedded development.
However, if their world view stops at the high level api that they've been using, and they have no concept of anything beneath that, they are going to have a really hard time.
As a (very) general statement if they are comfortable working on multithreaded applications they will probably be ok, as that shares some of the same issues of data volatility that you have when working on embedded projects.
With all of that said, I've seen more embedded programmers successfully working in PC development than the reverse. (Of course, I might not have seen a fair cross-section.)
"But when I talk to him about doing low-level programming, he simultaneously express interest and also doubt/uncertainty about joining the project." -- That means you let him try and you prepare to hire someone else in case he doesn't pass the learning curve.
I began as a SW engineer; I'm now a HW one!
The important thing is to understand how it works and to be motivated!