As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 9 years ago.
I have a great idea for a 3D network game, and I've concluded that it's possible to write it in Java as an applet that runs in the web browser, just like full desktop software written in C++, and that it will look and feel the same.
The main advantage of Java over C++ is that with Java you can play without downloading any software. I have already thought about downloading the graphics, sound, etc., but I found a solution for that. RuneScape proves that it is possible.
So my first question is: should my game live in a web browser or as standalone software on the operating system? A web browser is much more portable, although you still need to install Java and so on. But the fact is that most MMO games are currently not web-based. If you suggest standalone software, please also suggest a language: C++, or something more productive like Python or C#?
After choosing a language, I need a graphics solution. Should I write directly against OpenGL/DirectX or use a game engine? Which game engine should I use? Ogre? jMonkeyEngine?
What's your opinion?
Thank you!
P.S.: Please don't give answers like "use what you know".
Despite your last point: use whatever you can, and whatever will give you the biggest possible user base.
Applets are old and no longer used as extensively as they once were. Flash and Silverlight are the "standard" for web games now. It may be worth checking out JavaFX given your interest in Java; it's supposedly a replacement for what applets should have been. I've not actually used JavaFX, nor do I hear much about it, so take that as you will. The biggest benefit of deploying to the web is, as you've said, that the user base is larger and people are more likely to give your game a try. The downside is that you end up using the likes of Flash or an equivalent for the development process.
If you go down the route of building a standalone application, you can use whatever you want. C++, Java, C#, Python and so on are all viable options; you can make games in most languages. C++ is the industry standard, but ignore that fact: as a hobbyist developer you can make amazing-looking, well-performing games in any language. What I'm trying to say is that unless you are building the next big hit, C++ can be avoided. Unlike a web application, though, your users will need whatever framework/API you build against, e.g. the OpenGL/DirectX/XNA runtimes. As for XNA vs DirectX vs OpenGL? It matters little; your language choice will most likely dictate your choice of graphics API/framework, so I'll leave that point to your own research.
As for should you use an engine? It depends.
Are you making a game which is complex enough to warrant an engine?
Do you wish to just focus on the game, rather than the engine?
Do you feel comfortable learning an existing engine?
Do you feel comfortable producing the required components (collision detection, etc.) on your own?
Other factors come into this, but it may be worth just focusing on the game at hand. You can easily write a simple enough engine for what you require. By doing this, you'll avoid licensing and deployment issues.
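For a sense of scale, the core of such a home-grown engine is often little more than a fixed-timestep main loop. Here is a minimal, hedged sketch (shown in C for brevity; the platform hooks time_ms, poll_input, update_world, render_frame and should_quit are invented placeholders for whatever windowing/input layer you pick):

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical platform hooks the engine would sit on. */
    extern uint64_t time_ms(void);
    extern void poll_input(void);
    extern void update_world(double dt_seconds);
    extern void render_frame(void);
    extern bool should_quit(void);

    /* Fixed-timestep loop: the simulation advances in constant steps while
       rendering runs as fast as it can, so game logic stays deterministic. */
    void run_game(void)
    {
        const double step = 1.0 / 60.0;   /* 60 simulation updates per second */
        uint64_t previous = time_ms();
        double accumulator = 0.0;

        while (!should_quit()) {
            uint64_t now = time_ms();
            accumulator += (double)(now - previous) / 1000.0;
            previous = now;

            poll_input();
            while (accumulator >= step) {
                update_world(step);
                accumulator -= step;
            }
            render_frame();
        }
    }

Collision detection, networking and asset loading then plug into update_world as the game demands them.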
One option to consider is the Unity 3D game engine - in addition to being a fairly powerful development tool, it has several cross-platform deployment options. You can build both a stand-alone executable (for Windows and Mac, not yet Linux), and a web-browser version, which answers your first question about deploying on the web versus OS. You can do both.
It also uses JavaScript and C# (and Boo, a Python-inspired language) as scripting languages. These run on Mono, the open-source implementation of .NET, so it's not just a gaming platform but has access to all of .NET's abilities (well, those implemented in Mono, anyway).
See the Licensing page for a long list of Unity's features (the Basic version is free). And check out the list of Unity-based games, of which the first is Tiger Woods PGA Tour Online, by Electronic Arts.
A game that just runs as an applet will not be perceived as a real game by most hardcore gamers.
If you want a game that is played only by noobs, then Java might be an option; otherwise drop it and stick to a language that can actually produce executables.
As for the library: there are not so many that you can't try them all and choose the one you like most, so... do just that.
Closed 9 years ago.
I want to make a list of things I need to learn that will be valuable for my career. What skills do you think are vital for an embedded developer, now and in the distant future?
I have become quite proficient with C and ARM assembler through working with the embedded Linux kernel, and I'm about to dive into Linux drivers. However, I can't help thinking that I may be narrowing my skill set too much. I want to keep working with embedded systems in the future, but you never know the job market (I'm paranoid that my work will be outsourced to China or India).
I feel that I'm currently quite weak in C++ and Java, and I would also like to learn the Android kernel in the future. I also don't know any scripting languages.
Can anyone who has worked with embedded systems for a while give some input on what skills/languages they think are vital for an embedded developer? Should I continue to hone only my C skills, or should I learn new things?
Here's my list:
C essentials
OOP/ C++ - classes, encapsulation, polymorphism, overloading/ overriding, templates
Algorithms - search, sort, b-trees
Design Patterns - factory, observer, singleton etc.
Real Time Operating Systems - primitives (semaphore, mutex), scheduling techniques, user/ kernel space
Linux fundamentals, driver writing, shell
Microprocessor fundamentals - interrupt processing, registers, assembly code, etc.
Microcontroller fundamentals - ADC, DAC, timers, PWM, DMA, watchdog, etc.
Memory - NOR, NAND, SRAM, DRAM, wear levelling
Basic protocols - I2C, SPI, UART, LIN
Advanced protocols - SATA, PCIE, USB, CAN, MOST
Concurrent/ parallel programming - MPI for SMP etc.
UML - class diagram, component diagram, state diagram, sequence diagram
Perl or Python for scripting, e.g. to modify simple text files.
Java and Android
Basic electronics - schematic reading, using an oscilloscope, multimeter, soldering iron
Specialized techniques for embedded programming, e.g. debouncing of switches (a small C sketch follows this list), resistive ladder switches, rotary encoders, etc.
Software engineering - SDLC, CMMI, agile methods e.g. Scrum, version control (ClearCase, git, svn), bug tracking (JIRA?), static code checking, Lint, unit testing, continuous integration
Build environments - makefile, cmake
Basic FPGA/ ASIC design, basic DSP
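One item above, switch debouncing, is compact enough to sketch here (as promised in the list). A minimal, hedged C example, assuming a hypothetical read_button_raw() hardware hook called from a periodic timer tick; the raw pin level is only accepted once it has stayed stable for several samples:

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical hardware hook: returns the raw (bouncy) pin level. */
    extern bool read_button_raw(void);

    #define DEBOUNCE_SAMPLES 5u   /* pin must be stable for 5 ticks, e.g. 5 x 10 ms */

    /* Call from a periodic timer tick; returns the debounced button state. */
    bool button_debounced(void)
    {
        static bool stable_state = false;
        static uint8_t counter = 0u;

        bool raw = read_button_raw();
        if (raw == stable_state) {
            counter = 0u;                      /* no change pending, reset */
        } else if (++counter >= DEBOUNCE_SAMPLES) {
            stable_state = raw;                /* change persisted long enough */
            counter = 0u;
        }
        return stable_state;
    }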
As mentioned by Lundin, this question is open to many different answers. Embedded ranges from small, battery-powered, memory-constrained bare-metal devices to more complex systems running Linux.
First of all, it's very important to be a flexible developer. You need to be able to adapt to changes as quickly as possible. You may need to do a proof-of-concept prototype in just a couple of weeks in a language you've never used before, or to start working on a legacy project to fix a bug very quickly.
It's very important to know about software architecture concepts, RTOSes, event-driven systems (embedded systems are reactive by nature), and modeling (UML) too. Maybe test-driven development (TDD) as well. These are language-agnostic and will help you develop good firmware from the ground up.
Regarding languages, I think that C is used in both small and big systems, so having a good background in C is a must. I'm not talking about C programming at a novice level; I'm talking about knowing what the processor and the compiler do behind the scenes. From what you mentioned, you probably already have these skills. This is very helpful in the case of small systems, where every byte of RAM and ROM counts.
Knowing something about the MISRA C rules will help you develop safer C code.
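To give a flavour of what that means in practice, here is a hedged sketch (not an official MISRA example) of the style the guidelines push you toward: named constants instead of magic numbers, every switch case explicitly terminated, and a default branch so unexpected values fail safe:

    #include <stdint.h>

    #define MODE_IDLE  0u
    #define MODE_RUN   1u
    #define MODE_FAULT 2u

    /* Next state of a small mode machine, written defensively:
       no implicit fall-through, and unknown modes collapse to a safe value. */
    uint8_t next_mode(uint8_t mode, uint8_t fault_flag)
    {
        uint8_t next = MODE_FAULT;             /* safe default */

        switch (mode) {
        case MODE_IDLE:
            next = (fault_flag != 0u) ? MODE_FAULT : MODE_RUN;
            break;
        case MODE_RUN:
            next = (fault_flag != 0u) ? MODE_FAULT : MODE_RUN;
            break;
        case MODE_FAULT:
            next = MODE_FAULT;
            break;
        default:
            next = MODE_FAULT;                 /* unknown mode: fail safe */
            break;
        }
        return next;
    }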
You will probably need some scripting to perform automated testing, data processing, code-generation tools, etc. I use Python for all of this, plus some Linux shell scripting.
Being able to design PC-based applications is useful for creating test fixtures to test the embedded devices on the production line, or because the embedded device simply needs PC software to work, like a pocket USB-based oscilloscope. In this case I use Qt, since it's cross-platform, but you can use Visual Studio with C# if you only want to target Windows.
In the case of embedded systems, it's better if you have a solid hardware background. Also, you need to be able to use an oscilloscope, a logic analyzer, a signal generator, etc. Sometimes you will need to fix hardware problems with software. :)
Here is a small list of books I find very useful:
Practical UML Statecharts in C/C++.
UML Distilled.
Making Embedded Systems.
Computers as components.
Embedded Software Primer.
Better Embedded Systems Software.
I hope it helps.
Fernando
Whatever domain you select, it's not only C programming you should know; you should also be well acquainted with the hardware.
It doesn't matter which domain you are working in (Linux, VLSI, ARM, ...); what matters is how efficiently your code runs on the hardware.
If you really want to work in the embedded world, you will find your way.
Closed 11 years ago.
My development style leads me to write a lot of throw-away "assisting" code, whether for automatic generation of code parts, semi-automated testing, or generally to build dummies, prototypes, or temporary "sparring partners" for the main development; I know I'm not the only one...
Since I frequently work under both Windows and Unices, I'd like to focus (non-exclusively) on a single "Swiss army knife" tool that works in both environments with limited differences and lets me do the usual stuff: text parsing, DB access, sockets, and nontrivial filesystem and process manipulation.
Until now, under Unix, I've used a bit of Perl and massive amounts of shell scripts, but the latter are a bit limited, and Perl... despite being very capable and having modules for an incredible array of duties, I honestly find it too "hostile" for anything that goes beyond 100 lines of code.
What would you suggest?
Scripting is not a requirement; it would be OK to use a more statically typed language IF it makes development faster (getting programs to actually do their work, possibly in a human-readable state) and if it doesn't become nightmarish to handle errors/exceptions and to adapt to dynamic environments (e.g. I don't like to hardwire data/DB table structure into my code, especially by hand).
I've been intrigued by Python and Ruby, but maybe Groovy (with its access to the huge Java class library and its compact syntax) or something else is better suited.
Thanks a lot in advance!
(Meanwhile, on a completely different note, Scala looks really tempting just for its cleanliness, but that's probably a completely different story, unless you tell me otherwise...?)
Python is arguably one of the best choices. Its biggest benefit is that it has a huge built-in library for doing all sorts of stuff. It is also mature, very cross-platform, actively developed, and has many support options (mailing lists, newsgroups, etc).
In addition, it has a built-in GUI toolkit (tkinter) for those times when you need to write a quick GUI to get input from a user or display output from a running process. And if you don't like tkinter, there are other cross-platform GUI toolkits available.
I suggest Python.
For me it has a sweet spot of good libraries, documentation, community, cross-platform functionality, and ease of writing/reading.
It fills a similar niche to Perl's, but if you find Perl to be 'hostile' for longer scripts, you will probably like Python, especially when compared to Ruby, which feels more Perl-y, IMHO.
As an aside, all of these are quite easy to just try out - why not do that?
Then you can decide for yourself instead of trusting the questionable wisdom of an online forum (:
I think that Python and Ruby are your best bets, depending on exactly how you think and code.
I personally find Python EXTREMELY readable and its syntax is highly intuitive. I've heard Python described as "pseudo-code plus colons."
On the other hand, once you get around its slightly bizarre syntax, Ruby makes for high-speed development. It's built around DRY principles and convention over configuration, which is great for rapid prototyping.
There are other languages--especially Haskell and the Lisp dialects--that can make for super-rapid prototyping, but they don't have as large a supportive community, so libraries and discussion are in shorter supply.
Closed 11 years ago.
If you've worked on a cross platform development project what advice do you have for someone (like myself) considering starting one? Examples:
What worked?
What didn't?
What problems did you run into, and how did you solve them?
Did you aim for consistent appearance and functionality across all platforms, or did you try to take advantage of each platform's strengths?
What language and cross-platform libraries did you use, and did they live up to their claims?
Was it a desktop, mobile, or web-based application?
I worked on a small cross-platform project. The app was a desktop app, distributed directly as an executable on USB.
I did some research initially - you might be able to trace my Stack Overflow questions - and settled on Mozilla's XULRunner.
It worked really well in terms of being cross-platform: you're basically running on a well cross-tested platform, the one that runs Firefox and Thunderbird, so the "crossability" (is that a word?) is pretty bulletproof.
Its first flaw, I thought, was the documentation: there is a lot out there, but it's a bit obscure at first.
On the other hand, the Mozilla forums community is awesome, really friendly and helpful.
In terms of development, it's all JavaScript, a bit of an HTML-like markup language called XUL, and styling with CSS (you can also build extensions in C++ if you need to, but the core is pretty powerful as it is), so it's pretty familiar right away and pretty flexible too.
XUL gives the apps a good platform-specific native feel (IMO), or you can entirely redo your chrome elements and make a custom UI.
One major gotcha with XUL is that, because of the way XULRunner works, your source code will be more or less viewable by anyone. If you have proprietary code this may not be ideal for you, or you will need to isolate it in a separate library written in C++ (XPCOM components, in XUL terms).
I'd highly recommend it. I had a pretty short turn around time frame to build the project and it really saved my ass. I did the development on a Mac, and it worked right away on PC without any major change.
Additionally, there were other platforms that I reviewed for this project.
I've seen some strong support for cross-platform UI kits and languages like Qt (with PyQt, for instance) or Tcl/Tk and such. I wasn't too taken with them, but you might want to look into them.
Some people suggested AIR, JavaFX and .NET/Mono. For me this wasn't viable, as each is too much of a dependency on some systems (XP doesn't come with Java, Mac doesn't come with .NET/Mono, and neither comes with AIR). If you want something that will just work out of the box, that doesn't fit the bill. But your requirements may be different.
Titanium (from Appcelerator) is probably the closest contender to XULRunner. It seemed great and quite powerful, but the desktop development edition has pretty awful debugging features compared to XULRunner. Their agenda leans more toward the mobile edition, which I can understand, but I wish they'd care more for the desktop edition. However, if cross-platform mobile development is what you're after, this may be good - though I haven't tested it for that purpose.
Hope this helps.
Closed 10 years ago.
I know that the Apple community – including Mac and iPhone developers – mainly uses Objective-C as its development language. But it seems that not many people use Objective-C outside of the Apple community, such as in the Windows or Linux worlds.
What are the possible reasons that Objective-C is not particularly popular outside of the Apple community?
Another way of thinking about this question might be: why did C++, rather than Objective-C, become the "Object-Oriented C"?
I learned C++ in 1991, and remember that C++ seemed like the hot thing while Objective-C was this weird little language that no one (other than NeXT) wanted to use. I've been trying to remember why, and I think it boils down to four things (five, if you include C++ having AT&T behind it):
Features: C++ had, even then, a much richer set of features than Objective-C.
Syntax: Objective-C's syntax is a much bigger change from C than is C++.
Performance: Stroustrup focused on making C++ features easily mappable to C, so that (in theory!) there was no performance penalty in using C++. And with judicious use of the "inline" keyword, you could even get better performance with C++ than with C. Even now, there is no way I would use Objective-C in a project where performance was critical.
Style: Relatively strong, static typing was the fashion (for good reasons).
So compared to Objective-C, C++ in the early '90s gave you more features with less of a performance penalty, with a syntax that was both fashionable and more familiar than Objective-C's.
This is a complicated question, but in short, I think the answer most likely lies in the age of the operating systems and their roots.
UNIX is C, so that's that.
Linux is envisioned as a straight-up clone of Unix (fine, this is slightly inaccurate, but close enough for this discussion), and as such it is more or less written in C.
Windows is an old operating system, one built by stacking hack upon hack going all the way back to Windows 3.1. C++ is heavily favored there, and in .NET, C#.
This new influx is of course based on whatever agenda Microsoft has with that platform.
Mac OS X, on the other hand, is a (comparatively) young operating system, and its newer parts (while still quite old, being inherited from NeXT and whatnot) are all based on Objective-C because, "Hey! Why not?"
As backwards compatibility was not high on the list of priorities for Mac OS X 10.0, the C/C++-based Toolbox and Carbon got the short end of the stick, and the entire operating system was more or less made as a reskinned version of NeXTSTEP.
The issue with Obj-C is that the power of the language comes mostly from the sizable frameworks, the generally high level of integration into the system, and so on. It's almost impossible to get a good jive like that going without a clean break from backwards compatibility and, as such, it would never really stand a chance on any platform that didn't dare to do this. Apple, with a small (at the time) and devoted user base, dared do this, and struck gold.
Microsoft is now trying but is, in my humble opinion, failing. ("Failing?! .NET!? HOW DARE YOU!?": with four major revisions in about eight years, they are doing more growing than maturing, which may be a good thing, if they can turn it around.)
Edit: There are some projects attempting to port OpenStep to Linux, but they are a bit clunky and hard to use; there are also smaller projects on NS/OS-likes with smaller problem domains, but it's uphill work.
I was recently standing in a bookshop reading Masterminds of Programming, where the creators of programming languages talk about their creations. There was one chapter about Objective-C in which Tom Love (one of the creators of Objective-C, along with Brad Cox) was asked why C++ had gone so far while Objective-C hadn't:
Why do you think that C++ was used more frequently than Objective-C?
Tom: It had the AT&T moniker behind it.
Just that?
Tom: I think so.
What do you think about Objective-C today?
Tom: It still exists. How about that?
Objective-C is nothing but a thin layer (a bit thicker with 2.0) of syntactic sugar for message passing on top of standard C. Even the most basic object orientation is provided by the runtime library, which was proprietary for a long time. Inertia is an important factor in language use.
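To see how thin that layer is, here is a hedged C sketch (assuming a toolchain with the Objective-C runtime available, e.g. cc msg.c -lobjc -framework Foundation on a Mac): the bracketed message-send syntax compiles down to ordinary C calls into the runtime, roughly like this:

    /* Roughly what [[NSObject alloc] init] becomes once the syntax is stripped away. */
    #include <objc/runtime.h>
    #include <objc/message.h>
    #include <stdio.h>

    int main(void)
    {
        typedef id (*send_fn)(id, SEL);       /* cast objc_msgSend to a usable type */
        send_fn send = (send_fn)objc_msgSend;

        Class cls = objc_getClass("NSObject");
        id obj = send(send((id)cls, sel_registerName("alloc")),
                      sel_registerName("init"));

        printf("created %s instance at %p\n", class_getName(cls), (void *)obj);
        return 0;
    }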
It shines especially for GUIs, but the only toolkits that support it are Apple's and the mostly unknown, catch-up-playing GNUstep.
While there is some value in Objective-C outside of GUIs, and I think people would use its extensions were they imported into C, even in system code, there is little reason to choose it over the alternatives when little of your system is meant to work with it.
Off the top of my head, I believe C++ is older than Objective-C, and not only for this reason does it have a much bigger user base. Everywhere Objective-C might have gone, C++ was already there :)
Also, C++ has more features. Many people are impressed by lots of features. And it's had more research and development poured into it... and so forth. Essentially, momentum.
Closed 10 years ago.
I'm looking at doing embedded coding for a device with a 32-bit ARM processor that runs at approximately 20 MHz and has 6 MB of RAM. Can anyone suggest the best / most appropriate language for programming an embedded system? I'm considering:
Lua
TinyPy
C
Java ME
C#
someone has suggested JavaScript
Any suggestions? Thanks
Edit - looks like C and Lua are the winners. Cheers all!
Edit - Real time is not an issue; it's more the limited RAM/CPU dictating things.
If you're bringing the device up from scratch or interfacing directly with non-standard peripherals, C is really the only way to go.
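To illustrate the point, here is a minimal bare-metal sketch; the UART base address and bit layout are invented for the example, but poking memory-mapped peripheral registers through volatile pointers like this is exactly the kind of bring-up work where C is hard to avoid:

    #include <stdint.h>

    /* Hypothetical memory-mapped UART; a real part's datasheet defines the map. */
    #define UART0_BASE   0x4000C000u
    #define UART0_DATA   (*(volatile uint32_t *)(UART0_BASE + 0x00u))
    #define UART0_FLAGS  (*(volatile uint32_t *)(UART0_BASE + 0x18u))
    #define UART_TX_FULL (1u << 5)

    static void uart0_putc(char c)
    {
        while ((UART0_FLAGS & UART_TX_FULL) != 0u) {
            /* spin until the transmit FIFO has room */
        }
        UART0_DATA = (uint32_t)(uint8_t)c;
    }

    void uart0_puts(const char *s)
    {
        while (*s != '\0') {
            uart0_putc(*s++);
        }
    }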
If you've already got an embedded OS or can port one without difficulty, you might have more flexibility in adding one of the more script-y languages. C# is out of the question unless you're on WinCE, and then you'll be restricted to .NET Micro.
Beyond that, "best" has little meaning without describing what your device is going to be used for. Some languages have better support for certain tasks than others.
C is probably your best bet given such limited CPU resources.
I've used Lua on an ARM OMAP processor. Lua's tight integration with C allows going to the metal whenever you need, and its small size makes it suitable for a wide range of platforms. I developed the UI for my firmware in Lua on my mac and then brought it over to the embedded platform with no changes.
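As a concrete illustration of that C integration, here is a minimal, hedged sketch of embedding Lua in a C host (standard Lua 5.3+ C API; set_led is a made-up stand-in for whatever the firmware actually exposes):

    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>
    #include <stdio.h>

    /* C function exported to scripts as set_led(on). */
    static int l_set_led(lua_State *L)
    {
        int on = lua_toboolean(L, 1);
        printf("LED -> %s\n", on ? "on" : "off");   /* real code would hit a GPIO */
        return 0;                                   /* number of Lua return values */
    }

    int main(void)
    {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);                           /* load the standard libraries */
        lua_register(L, "set_led", l_set_led);      /* expose the C function */

        if (luaL_dostring(L, "for i = 1, 3 do set_led(i % 2 == 1) end") != LUA_OK) {
            fprintf(stderr, "lua error: %s\n", lua_tostring(L, -1));
        }
        lua_close(L);
        return 0;
    }

The UI logic can then live entirely in Lua scripts and be developed on a desktop, much as described above.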
While the OMAP processor was beefy enough to run other languages like Java or Python, I didn't know what hardware I was targeting when I started the code. Lua was a safe bet.
I'd be tempted to go with straight C, but then I've been writing C for nearly 30 years. Lua and TinyPy seem too new and experimental to me; embedded devices need to be very robust.
Java ME has good points. I don't know about C# in an embedded world.
It's important to specify what you expect this device to do. Is it some sort of control application? Does it have to implement algorithms? What about floating-point support? GUIs? Is performance critical? Are you planning on using an OS?
Answering these questions is a crucial prerequisite to picking a programming language.
That said, embedded systems have to be reliable, so I'd go for some tested solution. C is probably the most solid and best-supported option for ARM chips, but YMMV depending on your specific needs.
C is certainly the most used language in embedded systems.
It also seems to be the most talked-about language in general: http://www.langpop.com/
Edit: hmm. I just noticed that the 'embedded' you seem to be describing is not about adding an automation language to an application, but squeezing an application into an embedded platform. As others suggest, unless you really need it, skip embeddable languages and program your application in C. There is nearly no runtime overhead for that, except for what you actually use.
In no particular order, Lua, JavaScript and TCL are all quite well suited to embedding. Lua has been the easiest for me to embed. Javascript might be the fastest. All three have good handling for untrusted code, but TCL's is most robust, for example, untrusted code can run untrusted code (if it's trusted to do that much).
Unless you have an RTOS available that supports a variety of alternate languages, C or C++ (depending on your compiler chain) is the way to go.
Your decision is most likely to be determined by the tools available for this processor.
C is by far the most supported language for embedded processors, so you can't go far wrong with that, and it will be good experience if you have to write software for other chips in the future.
C++ is becoming more popular for embedded systems. Beyond that, it depends on your priorities (time to market, resource usage, speed), and the quality of the tools you use.
C is the best.