Suggestions for the most appropriate (best) language for programming an embedded system? [closed] - embedded

I'm looking at doing embedded coding for a device with a 32-bit ARM processor running at roughly 20 MHz with 6 MB of RAM. Can anyone suggest the best / most appropriate language for programming an embedded system? I'm considering:
Lua
TinyPy
C
Java ME
C#
someone has suggested JavaScript
Any suggestions? Thanks
Edit - looks like C and Lua are the winners. Cheers all!
Edit - Real time is not an issue; it's more the limited RAM/CPU dictating things.

If you're bringing the device up from scratch or interfacing directly with non-standard peripherals, C is really the only way to go.
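For a sense of what that means in practice, register-level peripheral access in C looks something like this minimal sketch (the UART base address, register offsets and flag bit are invented for the example; real values come from the chip's datasheet):

    #include <stdint.h>

    /* Hypothetical memory-mapped UART registers -- addresses and
       offsets are made up for illustration. */
    #define UART0_BASE   0x40001000u
    #define UART_DR      (*(volatile uint32_t *)(UART0_BASE + 0x00u))
    #define UART_FLAGS   (*(volatile uint32_t *)(UART0_BASE + 0x04u))
    #define UART_TX_FULL 0x01u

    static void uart_putc(char c)
    {
        while ((UART_FLAGS & UART_TX_FULL) != 0u) {
            /* spin until the transmit FIFO has room */
        }
        UART_DR = (uint32_t)c;
    }

The volatile qualifier is the whole trick: it forces every read and write to actually hit the hardware register instead of being optimized away.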
If you've already got an embedded OS or can port one without difficulty, you might have more flexibility in adding one of the more script-y languages. C# is out of the question unless you're on WinCE, and then you'll be restricted to .NET Micro.
Beyond that, "best" has little meaning without describing what your device is going to be used for. Some languages have better support for certain tasks than others.

C is probably your best bet for such limited CPU resources.

I've used Lua on an ARM OMAP processor. Lua's tight integration with C allows going to the metal whenever you need to, and its small size makes it suitable for a wide range of platforms. I developed the UI for my firmware in Lua on my Mac and then brought it over to the embedded platform with no changes.
While the OMAP processor was beefy enough to run other languages like Java or Python, I didn't know what hardware I was targeting when I started the code. Lua was a safe bet.
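To give a rough idea of how tight that C integration is, here is a minimal embedding sketch using the standard Lua 5.x C API (link against liblua; error handling trimmed, and led_set is an invented example function):

    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    /* A C function exposed to Lua scripts -- name and behaviour
       invented for the example. */
    static int c_led_set(lua_State *L)
    {
        int on = lua_toboolean(L, 1);
        (void)on; /* ...poke the hardware here... */
        return 0; /* number of values returned to Lua */
    }

    int main(void)
    {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);                   /* standard libraries */
        lua_register(L, "led_set", c_led_set);
        luaL_dostring(L, "led_set(true)");  /* run a script */
        lua_close(L);
        return 0;
    }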

I'd be tempted to go with straight C, but then I've been writing C for nearly 30 years. Lua and TinyPy seem too new and experimental to me; embedded devices need to be very robust.
Java ME has good points. I don't know about C# in an embedded world.

It's important to specify what you expect this device to do. Is it some sort of control application? Does it have to implement algorithms? What about floating-point support? GUIs? Is performance critical? Are you planning on using an OS?
Answering these questions is a crucial prerequisite to picking a programming language.
That said, embedded systems have to be reliable, so I'd go for a tested solution. C is probably the most solid and best-supported option for ARM chips, but YMMV depending on your specific needs.

C is certainly the most used language in embedded systems.
It also seems to be the most talked-about language in general: http://www.langpop.com/

Edit: hmm. I just noticed that the 'embedded' you seem to be describing is not about adding an automation language to an application, but about squeezing an application into an embedded platform. As others suggest, unless you really need one, skip embeddable languages and program your application in C. There is nearly no runtime overhead that way, beyond what you actually use.
In no particular order, Lua, JavaScript and TCL are all quite well suited to embedding. Lua has been the easiest for me to embed. JavaScript might be the fastest. All three have good handling for untrusted code, but TCL's is the most robust; for example, untrusted code can itself run untrusted code (if it's trusted to do that much).
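For what it's worth, that TCL sandboxing looks roughly like this from the C side (a sketch against the Tcl 8.x C API; check the docs for your version):

    #include <stdio.h>
    #include <tcl.h>

    int run_untrusted(const char *script)
    {
        Tcl_Interp *interp = Tcl_CreateInterp();
        Tcl_MakeSafe(interp);  /* strips file, exec, socket access, etc. */
        int rc = Tcl_Eval(interp, script);
        printf("%s\n", Tcl_GetStringResult(interp)); /* result or error */
        Tcl_DeleteInterp(interp);
        return rc;
    }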

Unless you have an RTOS available that supports a variety of alternate languages, C or C++ (depending on your compiler chain) is the way to go.

Your decision is most likely to be determined by the tools available for this processor.
C is by far the most supported language for embedded processors, so you can't go far wrong with that, and it will be good experience if you have to write software for other chips in the future.
C++ is becoming more popular for embedded systems. Beyond that, it depends on your priorities (time to market, resource usage, speed), and the quality of the tools you use.

C is the best.

Related

Embedded Developer, what skills are important [closed]

I want to make a list of things I need to learn that will be valuable for my career. What skills do you think are vital for an embedded developer, now and in the distant future?
I have become quite proficient with C and ARM assembler through working with the embedded Linux kernel, and I'm about to dive into Linux drivers. However, I can't help thinking that I may be narrowing my skill set too much. I want to keep working with embedded systems in the future, but you never know the job market (I'm paranoid that my job is going to be outsourced to China or India).
I feel that I'm currently quite weak with C++ and Java, and I would also like to learn the Android kernel in the future. I also don't know any scripting languages.
Can anyone who has worked with embedded systems for a while give some input on what skills/languages they think are vital for an embedded developer? Should I continue to hone only my C skills, or should I learn new things?
Here's my list:
C essentials
OOP/C++ - classes, encapsulation, polymorphism, overloading/overriding, templates
Algorithms - search, sort, B-trees
Design patterns - factory, observer, singleton, etc.
Real-time operating systems - primitives (semaphore, mutex), scheduling techniques, user/kernel space
Linux fundamentals, driver writing, shell
Microprocessor fundamentals - interrupt processing, registers, assembly code, etc.
Microcontroller fundamentals - ADC, DAC, timers, PWM, DMA, watchdog, etc.
Memory - NOR, NAND, SRAM, DRAM, wear levelling
Basic protocols - I2C, SPI, UART, LIN
Advanced protocols - SATA, PCIe, USB, CAN, MOST
Concurrent/parallel programming - MPI for SMP, etc.
UML - class diagram, component diagram, state diagram, sequence diagram
Perl or Python for scripting, e.g. to modify simple text files
Java and Android
Basic electronics - reading schematics, using an oscilloscope, multimeter, soldering iron
Specialized techniques for embedded programming, e.g. debouncing of switches (see the sketch after this list), resistive ladder switches, rotary encoders, etc.
Software engineering - SDLC, CMMI, agile methods e.g. SCRUM, version control (ClearCase, git, svn), bug tracking (JIRA?), static code checking, Lint, unit testing, continuous integration
Build environments - makefiles, CMake
Basic FPGA/ASIC design, basic DSP
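To illustrate the last group of techniques, a switch debounce in C is often just a shift register sampled on a timer tick; this sketch assumes a hypothetical read_switch_raw() returning the raw pin level:

    #include <stdint.h>
    #include <stdbool.h>

    /* Hypothetical raw pin read -- replace with your GPIO access. */
    extern bool read_switch_raw(void);

    /* Call at a fixed rate (e.g. every 5 ms from a timer interrupt).
       The state only changes after 8 identical consecutive samples. */
    bool switch_debounced(void)
    {
        static uint8_t history = 0u;
        static bool state = false;

        history = (uint8_t)((history << 1) | (read_switch_raw() ? 1u : 0u));
        if (history == 0xFFu) { state = true;  }  /* 8 highs in a row */
        if (history == 0x00u) { state = false; }  /* 8 lows in a row  */
        return state;
    }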
As mentioned by Lundin, this question is open to many different answers. Embedded systems range from small battery-powered, memory-constrained bare-metal devices to more complex systems running Linux.
First of all, it's very important to be a flexible developer. You need to be able to adapt to changes as quickly as possible. You may need to build a proof-of-concept prototype in just a couple of weeks in a language you've never used before, or start working in a legacy project to fix a bug very quickly.
It's very important to know about software architecture concepts, RTOSes, event-driven systems (embedded systems are reactive by nature) and modeling too (UML). Maybe test-driven development (TDD). These are language-agnostic, and will help you develop good firmware from the ground up.
Regarding languages, I think C is used in both small and big systems, so having a good background in C is a must. I'm not talking about novice-level C programming here; I'm talking about knowing what the processor and the compiler do behind the scenes. From what you mentioned, you probably have these skills already. This is very helpful in the case of small systems, where every byte of RAM and ROM counts.
Knowing something about the MISRA-C rules will help you develop safer C code.
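As a flavour of what MISRA-style code looks like (an illustrative sketch, not an official MISRA example): fixed-width types, visible narrowing casts, braces on every branch and a single point of return:

    #include <stdint.h>

    /* Clamp a 16-bit value into 8 bits, MISRA-leaning style. */
    static uint8_t clamp_u8(uint16_t value)
    {
        uint8_t result;

        if (value > 255u) {
            result = 255u;
        } else {
            result = (uint8_t)value; /* narrowing made explicit */
        }
        return result;
    }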
You will probably need some scripting to perform automated testing, data processing, code-generating tools, etc. I use Python for all of this, along with some Linux shell scripting.
Being able to design PC-based applications is useful for creating test fixtures that exercise the embedded devices on the production line, or because the embedded device simply needs PC software to work, like a pocket USB-based oscilloscope. In this case I use Qt, since it's cross-platform, but you can use Visual Studio with C# if you only want to target Windows.
In the case of embedded systems, it's better if you have a solid hardware background. Also, you need to be able to use an oscilloscope, a logic analyzer, a signal generator, etc. Sometimes you will need to fix hardware problems with software. :)
Here is a small list of books I find very useful:
Practical UML Statecharts in C/C++.
UML Distilled.
Making Embedded Systems.
Computers as Components.
An Embedded Software Primer.
Better Embedded System Software.
I hope it helps.
Fernando
Whatever domain you select, you should know not only C programming but also be well acquainted with the hardware.
It doesn't matter which domain you work in (Linux, VLSI, ARM...); what matters is how efficiently your code runs on the hardware.
If you really want to work in the embedded world, you will find your way.

pragmatic cross-platform (and very fast to make it - actually - work) "throwaway" code: which language/tools? [closed]

My development style leads me to write a lot of throwaway "assisting" code,
whether for automatic generation of code parts, semi-automated testing, or generally to build dummies, prototypes or temporary "sparring partners" for the main development; I know I'm not the only one...
Since I frequently work under both Windows and Unices, I'd like to (non-exclusively) focus on a single "swiss army knife" tool that works in both environments with limited differences, and that would let me do the usual stuff: text parsing, DB access, sockets, nontrivial filesystem and process manipulation.
Until now, under Unix, I've used a bit of Perl and massive amounts of shell scripts, but the latter are a bit limited, and Perl... despite being very capable and having modules for an incredible array of duties, I honestly find it too "hostile" for anything beyond 100 lines of code.
What would you suggest?
Scripting is not a requirement; it would be fine to use more statically-typed languages IF that makes development faster (getting programs to actually do their work, possibly in a human-readable state) and if it doesn't become nightmarish to handle errors/exceptions and to adapt to dynamic environments (e.g. I don't like to hardwire data/DB table structure into my code, especially by hand).
I've been intrigued by Python and Ruby, but maybe Groovy (with its ability to access the huge Java class library and its compact syntax) or something else is better suited.
Thanks a lot in advance!
(Meanwhile, on a completely different note, Scala looks really tempting just for its cleanliness, but that's - probably - a completely different story, unless you tell me otherwise...?)
Python is arguably one of the best choices. Its biggest benefit is that it has a huge built-in library for doing all sorts of stuff. It is also mature, very cross-platform, actively developed, and has many support options (mailing lists, newsgroups, etc).
In addition, it has a built-in GUI toolkit (tkinter) for those times when you need to write a quick GUI to get input from a user or display output from a running process. And if you don't like tkinter, there are other cross-platform GUI toolkits available.
I suggest Python.
For me it has a sweet spot of good libraries, documentation, community, cross-platform functionality, and ease of writing/reading.
It fills a similar niche to Perl's, but if you find Perl to be 'hostile' for longer scripts, you will probably like Python, especially when compared to Ruby, which feels more Perl-y, IMHO.
As an aside, all of these are quite easy to just try out - why not do that?
Then you can decide for yourself instead of trusting the questionable wisdom of an online forum (:
I think that Python and Ruby are your best bets, depending on exactly how you think and code.
I personally find Python EXTREMELY readable and its syntax is highly intuitive. I've heard Python described as "pseudo-code plus colons."
On the other hand, once you get past its slightly bizarre syntax, Ruby makes for high-speed development. It's built around DRY principles and convention over configuration, which is great for rapid prototyping.
There are other languages--especially Haskell and the Lisp dialects--that can make for super-rapid prototyping, but they don't have as large a supportive community, so libraries and discussion are in shorter supply.

fastest scripting programming language? [closed]

I have a web application project where performance counts more than anything else, and I have the choice of the technologies to use.
The language shootout benchmarks are not really related to web applications.
What would you recommend as the most suitable candidates?
Thanks!
A friend suggested the G-WAN server on IRC. It looks to be what I was searching for, but I had never heard of it before. Does anybody have prior experience with this package? Ease of use, reliability?
Before I leave Apache, I would like to get your thoughts.
G-WAN is a neat webserver: it's based around the "C scripts" concept:
A C script is simply C source code that is compiled by the webserver and then loaded into protected memory. It gets called by the webserver when a request to the servlet is made. Because the servlet is compiled by a C compiler, it is as fast as a normally compiled C program. The advantage of C scripts over, for instance, CGI or FastCGI is that the compiled program lives in the same memory space as the webserver. This removes the communication overhead (creating a process per request in the case of CGI, or going through a socket for FastCGI).
The webserver uses the select/poll technique: non-blocking I/O. However, there's a neat twist: every program can be written as if it used blocking I/O. Since the webserver itself compiles each C script, it can transform the program to use non-blocking I/O. Because of this, it can link to third-party libraries (like database access) and still exploit the non-blocking I/O nature: no thread/process context switching.
The tools provided for programming the C scripts include, for instance, caching and safe buffers. The next version (not yet released as of this writing) will also include a key-value store.
Performance-wise: there are some benchmarks available showing it outperforming every other webserver; however, I don't trust those. Try writing a small CPU-intensive program in C and in, for instance, PHP. Let the C script run on G-WAN and the PHP script on Apache, and do a benchmark yourself.
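A minimal timing harness for that kind of test could look like this in C (the workload is arbitrary, just something CPU-bound; port the same loop to PHP for the comparison):

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        double acc = 0.0;
        clock_t start = clock();

        /* Arbitrary CPU-bound workload, kept live by printing acc. */
        for (long i = 0; i < 10000000L; i++) {
            acc += (double)(i % 97);
        }

        double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
        printf("acc=%f elapsed=%.3fs\n", acc, secs);
        return 0;
    }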
There is more to it, but that's out of scope for this question.
Some downsides of G-WAN is that it is developed by only one person. There is a forum, however, where you can ask questions.
Ease of use is limited by your skill in C. The API provided, however, is simple. It still has some inconsistencies and (in my opinion) ugly parts, but that's not a problem. A more serious problem is that each version is not guaranteed to be backwards-compatible and you may have to rewrite.
If you want to be safe, make use of C's platform independence: allow your code to be compiled into (Fast)CGI programs as well as being used by G-WAN. Should G-WAN fail, you can always fall back to Apache's (Fast)CGI (see http://www.fastcgi.com/ for the APIs).
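A plain-CGI fallback in C can be as small as this sketch; the same core code can then be adapted to whichever server API you end up on:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        /* CGI contract: headers, blank line, then the body. */
        const char *query = getenv("QUERY_STRING"); /* may be NULL */

        printf("Content-Type: text/plain\r\n\r\n");
        printf("Hello from C. Query: %s\n",
               (query != NULL) ? query : "(none)");
        return 0;
    }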
If performance counts more than anything else, don't use a scripting language. Especially since you have full control over the technology stack. Compiled languages will perform better for CPU intensive operations.
LuaJIT (Lua) is the fastest scripting language with JIT technology.
If you want the fastest option for server-side web applications (which is not always a scripting language), that would be G-WAN; you can use C, C++ or Java.
ASP.NET is also fast enough for almost anything, but quite pricey.
PHP with HipHop would be the easiest to learn and is also fast enough.
It depends on how many requests you need to handle, and how fast you can learn the language ^^
Don't forget to cache static data (using memcached or a NoSQL store).
Begin by identifying whether your application's performance really depends on the language or on some other factor (database requests, for instance). The ability to cache results can also be a very important factor.
For performance, the language used comes quite far down the list of important points to check, and the use case also influences which language is better. For example, if you have many regexes to evaluate, you should check regex support in the candidate languages, etc.
For image processing, the most important point will probably be the underlying image library you use, usually written in C. I have the case of ImageMagick in mind, because I'm currently using it. It's available as a library for most languages, and the scripting-language layer is only needed to call functions of the library, so the language used at that level won't change much (but caching precomputed result images could change performance by a large margin). This use case would probably be similar for calling a cryptographic lib.
If performance is really such an issue, for image processing you could also consider a library that works with GPU accelerator cards (libs with CUDA or openGPU support).
Javascript is constantly being scrutinized and optimized for use on mobile devices, so on actual full-size servers it runs EXTREMELY fast. Check out Node.JS, a project for implementing server side javascript to serve webpages: http://nodejs.org/
Well, if you use a database with a large volume of data, you will spend more time there than in running a PHP or ASP or (insert other flavours here) script.
If you can, build a mockup of your app (or at least a segment of the more database- or processor-intensive parts) and try to benchmark those.
Update: It seems that Java 7 using NIO.2 has managed to outperform G-WAN's C by almost 2x in timing. It is incredible, but you can run a few simple tests yourself.
The only downside of Java is that it cannot integrate shared libraries built in C. I'm ready to challenge anyone to prove me wrong and show that Java NIO.2 is slower than C.
I recommend the Java programming language; it's not a scripting language, but it's probably the fastest programming language that can be used for programming web applications. I also recommend using a framework like Spring for a better programming experience (versus "raw" Java Servlet Programming).
The fastest scripting language is ASP, followed by PHP, but if you want applications that scale to unlimited speeds, use C++ or Java.
Google Search uses C++.
Gmail uses Java.
YouTube = Python
Twitter used to use Ruby; now they have shifted to Java.
Facebook = PHP at the front end and some Java at the back end.
But I recommend PHP at the front end and C++ at the back end.

Why is Objective-C not very popular outside of the Apple community? [closed]

I know that the Apple community – including Mac and iPhone developers – mainly uses Objective-C as its development language. But it seems that not many people use Objective-C outside of the Apple community, such as in the Windows or Linux worlds.
What are the possible reasons that Objective-C is not particularly popular outside of the Apple community?
Another way of thinking about this question might be: why did C++, rather than Objective-C, become the "Object-Oriented C"?
I learned C++ in 1991, and remember that C++ seemed like the hot thing while Objective-C was this weird little language that no one (other than NeXT) wanted to use. I've been trying to remember why, and I think it boils down to four things (five, if you include C++ having AT&T behind it):
Features: C++ had, even then, a much richer set of features than Objective-C.
Syntax: Objective-C's syntax is a much bigger change from C than is C++.
Performance: Stroustrup focused on making C++ features easily mappable to C, so that (in theory!) there was no performance penalty in using C++. And with judicious use of the "inline" keyword, you could even get better performance with C++ than with C. Even now, there is no way I would use Objective-C in a project where performance was critical.
Style: Relatively strong, static typing was the fashion (for good reasons).
So compared to Objective-C, C++ in the early '90s gave you more features with less of a performance penalty, with a syntax that was both fashionable and more familiar than Objective-C's.
This is a complicated question, but in short, I think the answer most likely lies in the age of the operating systems and their roots.
UNIX is C, so that's that.
Linux is envisioned as a straight-up clone of Unix (fine, this is slightly inaccurate, but close enough for this discussion), and as such it is more or less written in C.
Windows is an old operating system, one built by stacking hack upon hack going back all the way to Windows 3.1. C++ is heavily favored, and in .NET, C#.
This new influx is of course based on whatever agenda Microsoft has with that platform.
Mac OS X, on the other hand, is a (comparatively) young operating system, and its new parts (while still quite old, being inherited from NeXT and whatnot) are all based on Objective-C because, "Hey! Why not?".
As backwards compatibility was not among the priorities for Mac OS X 10.0, the C/C++-based Toolbox and Carbon got the short end of the stick, and the entire operating system was more or less made as a reskinned version of NeXTStep.
The issue with Obj-C is that the power of the language comes mostly from the sizable frameworks, the generally high level of integration into the system, and so on. It's almost impossible to get a good jive like that going without a clean break from backwards compatibility and, as such, it would never really stand a chance on any platform that didn't dare to do this. Apple, with a small (at the time) and devoted user base, dared do this, and struck gold.
Microsoft is now trying, but are, in my humble opinion, failing. ("Failing?! .NET!? HOW DARE YOU!?": With 4 major revisions in about 8 years, they are doing more growing than maturation; which may be a good thing, if they can turn it around.)
Edit: There are some projects attempting to port OpenStep to Linux, but they are a bit clunky and hard to use; there are also smaller projects on NS/OS-likes with smaller problem domains, but it's uphill work.
I was recently standing in a bookshop reading Masterminds of Programming, where the creators of programming languages talk about their creations. There was one chapter about Objective-C in which Tom Love (one of the creators of Objective-C, along with Brad Cox) was asked why C++ had gone so far while Objective-C hadn't:
Why do you think that C++ was used more frequently than Objective-C?
Tom: It had the AT&T moniker behind it.
Just that?
Tom: I think so.
What do you think about Objective-C today?
Tom: It still exists. How about that?
Objective-C is nothing but a thin layer (a bit thicker with 2.0) of syntactic sugar for message passing on top of standard C. Even the most basic object orientation is provided by the runtime library, which was proprietary for a long time. Inertia is an important factor in language use.
It shines especially in GUIs, but the only toolkits that support it are Apple's and the mostly unknown, catchup-playing GNUstep.
There is some value to Objective-C outside of GUIs, and I think people would use its extensions were they imported into C, even in system code; but there is little reason to choose it over the alternatives when little of your system is meant to work with it.
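To make the "thin layer" point concrete: an Objective-C message send compiles down to an ordinary C call into the runtime, roughly like this sketch (using the Apple/GNUstep objc runtime headers; the cast is needed because objc_msgSend is declared without a fixed prototype):

    #include <objc/runtime.h>
    #include <objc/message.h>

    /* Roughly what the compiler emits for:  [obj description];
       (a sketch -- real codegen differs in detail) */
    id call_description(id obj)
    {
        SEL sel = sel_registerName("description");
        id (*send)(id, SEL) = (id (*)(id, SEL))objc_msgSend;
        return send(obj, sel);
    }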
Off the top of my head, I believe C++ is older than Objective-C, and not only for this reason has a much bigger user base. Everywhere that OC may have come, C++ was already there :)
Also, C++ has more features. Many people are impressed by lots of features. And it's had more research and development poured into it... and so forth. Essentially, momentum.

What technology should I use to write my game? [closed]

I have a great idea for a 3D network game, and I've concluded that it is possible to write it in Java as an applet that lives in the web browser, just like full software written in C++. And it will look and feel the same.
The main advantage of Java over C++ is that with Java you can play without downloading any software. I have already thought about the download of the graphics, sound, etc., but I found a solution for it. RuneScape proves that it is possible.
So my first question is: should my game live in a web browser or on the operating system? I think that in a web browser it is much more portable, although you need to install Java and such. But the fact is that most MMO games are currently not on the web. If you suggest standalone software, please also suggest a language: C++, or something more productive like Python or C#?
So after choosing a language, I need a graphics solution. Should I write directly with OpenGL/DirectX or use a game engine? What game engine should I use? Ogre? jMonkeyEngine?
What's your opinion?
Thank you!
P.S: Please don't use answers like "Use what you know".
Despite your last point, use whatever you can, and what will provide the biggest user base possible.
Applets are old and no longer used as extensively as they once were. Flash and Silverlight are the "standard" for web games now. It may be worth checking out JavaFX, given your interest in using Java; it's supposedly a replacement for what applets should have been. I've not actually used JavaFX, nor do I hear much about it; take that as you wish. The biggest benefit of deploying to the web is, as you've said, that the user base is larger and people are more likely to give your game a play. The downside is that you end up using the likes of Flash or an equivalent for the development process.
If you go down the route of building a standalone application, you can use whatever you want: C++, Java, C#, Python and so on are all viable options. You can make games in most languages. C++ is the industry standard, but ignore this fact: you can make amazing-looking and amazing-performing games in any language if you are a hobbyist developer. What I'm trying to say is that unless you are building the next big hit, C++ can be avoided. In contrast to web applications, your users will need whatever framework/API you use; for example, they'll need OpenGL/DirectX/XNA and so forth. As for XNA vs DirectX vs OpenGL? It matters not; your language choice will most likely dictate your choice of graphics API/framework. So I'll leave this point up to you to research.
As for should you use an engine? It depends.
Are you making a game which is complex enough to warrant an engine?
Do you wish to just focus on the game, rather than the engine?
Do you feel comfortable learning an existing engine?
Do you feel comfortable producing the required components (collision etc..) on your own?
Other factors come into this, but it may be worth just focusing on the game at hand. You can easily write a simple enough engine for what you require. By doing this, you'll avoid licensing and deployment issues.
One option to consider is the Unity 3D game engine - in addition to being a fairly powerful development tool, it has several cross-platform deployment options. You can build both a stand-alone executable (for Windows and Mac, not yet Linux), and a web-browser version, which answers your first question about deploying on the web versus OS. You can do both.
It also uses both JavaScript and C# (and Boo, a Python-inspired language) as scripting languages. These are based on Mono, the open-source version of .NET, so it's not just a gaming platform but has access to all of .NET's abilities (well, those implemented in Mono, anyway).
See the Licensing page for a long list of Unity's features (the Basic version is free). And check out the list of Unity-based games, of which the first is Tiger Woods PGA Tour Online, by Electronic Arts.
A game that just runs as an applet will not be perceived as a real game by most hardcore gamers.
If you want a game that is played only by noobs, then Java might be an option; otherwise drop it and stick to a language that can actually produce executables.
As for the library: there are not so many that you can't try them all and choose the one you like most, so... do just that.