Java processors - what to choose? - jvm

Any suggestions on what kind of Java processor to choose for studying and learning purposes?
I have read something about picoJava and its speed. Do you think it can be used for learning and practicing Java for processors?
Thank you

It's not clear to me if you are asking about learning the Java programming language, i.e., software, or learning about the processor, i.e., hardware, which executes instructions built in a programming language.
The Java programming language is available on many different platforms (including native processors like picoJava) via the Java virtual machine (JVM). If you're interested in learning the Java programming language, choose a JVM for your favorite OS; most are free, they perform well, and have lots of good documentation and samples. For example, try downloading the Java 7 SDK from Oracle - you'll be able to write simple programs, compile them, and run them on a JVM, on your existing platform.
If you are interested in learning about processors, there are a number of simple processors available, including some specifically tailored for students at a modest cost. You can also study the processor on your own platform - some make documentation generally available, though it's generally not as accessible as programming language documentation, unless it's tailored for students, e.g., A Simple and Affordable TTL Processor for the Classroom, or a programmable interrupt controller.
You can also study the Java byte-code instruction set, i.e., the instruction set of the Java virtual machine.
You can, of course, learn the Java programming language, Java byte-code, and general processor principles at the same time, but you may find it easier to tackle these topics separately; each is vast.

What's the motivation in using Verilog or VHDL over C?

I come from a programming background and haven't messed around much with hardware or firmware (at most a bit of electronics and Arduino).
What is the motivation in using hardware description languages (HDL) such as Verilog and VHDL over programming languages like C or some Assembly?
Is this even a matter of choice?
I read that hardware whose firmware is written in an HDL has a clear advantage in running instructions in parallel. However, I was surprised to see discussions expressing doubts about whether to write firmware in C or assembly (how is assembly appropriate if you don't necessarily have a CPU?), but I concluded it's also an option.
Therefore, I have a few questions (don't hesitate to explain anything):
Can firmware really be written either in an HDL or in a software programming language, or are these just different ways to accomplish the same mission? I'd love real-world examples. What constraints result from each option?
I know that a common use of firmware over software is in hardware accelerators (such as GPUs, network adapters, SSL accelerators, etc.). As I understand it, this acceleration is not always necessary, but only recommended (for example, in the case of SSL and the acceleration of complex algorithms). Can one choose between firmware and software in all cases? If not, I'd be happy to see cases in which firmware is clearly and unequivocally appropriate.
I've read that firmware is mostly burned onto ROM or flash. How is it represented there? In bits, like software? If so, what's the profound difference? Is it the availability of adapted circuits in the case of firmware?
I guess I made a mistake here and there in some assumptions, please forgive me. Thank you!
The term "firmware" is at best ill defined, and I believe that is probably the cause of your confusion.
Historicallym - before the availability of programmable logic devices - the term "firmware" has been used to refer to code stored-in and executed-from read-only memory (ROM). At a time when the only available ROM technology was mask-ROM where the code was burned into the device at manufacture of the silicon and therefore unchangeable without replacing the chip - that was pretty "firm". Even with later programmable read-only memory (PROM), which could be programmed post-manufacture, because it was one-time programmable (OTP), the term still applied.
With the introduction of UV erasable EEPROM, firmware became perhaps less "firm", but the lack of in-circuit programmability and the need to expose the device to UV to erase it still made replacement of the embedded software a chore - normally requiring removal of the chip, placing it in the eraser for an hour or so, then programming it in a dedicated programmer.
The advent of NOR Flash memory, where code could be stored and executed directly from the device, but also readily changed in-circuit, the term firmware in this context has become less common. However it is still used (perhaps mainly by older practitioners) to refer to embedded software stored and executed from a random-access, read-only memory device as opposed to loaded into RAM from a file system.
The use for the term firmware to refer to programmable logic configuration is newer and has probably come about simply because it is hardware, but the configuration is written much like software using a high-level language.
The upshot of this is that you do not choose
"Verilog and VHDL over programming languages like C or some Assembly"
because in each context the term firmware simply refers to a different concept.
It would be best to avoid the term firmware altogether as it means different things to different people or in different contexts.
There is perhaps some further confusion from the fact that some hardware description languages are based on software development languages - such as Handel-C, which is a C-like hardware description language.
This question would not have arisen much some time ago, but with current platforms you can now translate between C and HDLs (instead of using the Handel-C extension of C from the '90s), mainly between C and behavioral VHDL. There are also a lot of newer tools provided by vendors, like the Xilinx Electronic System Level Design Ecosystem, or Impulse-C (http://www.impulseaccelerated.com/products_universal.htm).
It is important to know, though, that C is a mid-level programming language, while VHDL is, as stated, a hardware description language. C can only express sequential instructions, while VHDL allows both sequential and concurrent execution.
And even though a C program can be written successfully with purely logical or algorithmic thinking, a successful VHDL programmer needs a thorough working knowledge of the hardware circuits, being able to predict how a given piece of code will be implemented in hardware.
In both languages you care about resource usage, but in different ways (unless you are programming for resource-constrained devices). When it comes to VHDL, apart from memory, the other logic elements of an FPGA (which is where you normally run VHDL code) are also limited.
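To make the sequential-versus-concurrent point concrete, here is a minimal C sketch (an illustration only, not the output of any C-to-HDL tool) of how concurrent signal updates have to be emulated in C with an explicit evaluate/commit loop; the "signals" are made up:
```c
#include <stdio.h>
#include <stdbool.h>

/* Three made-up "signals" of a tiny combinational circuit. */
typedef struct { bool a, b, x, y; } signals_t;

/* Evaluate phase: compute all next values from the CURRENT values only.
 * In VHDL these would just be two concurrent signal assignments:
 *   x <= a and b;
 *   y <= a or x;
 * and the simulator would handle the "all at once" semantics for us. */
static signals_t eval(signals_t cur) {
    signals_t next = cur;
    next.x = cur.a && cur.b;
    next.y = cur.a || cur.x;   /* deliberately uses the OLD value of x */
    return next;
}

int main(void) {
    signals_t s = { .a = true, .b = true, .x = false, .y = false };
    for (int cycle = 0; cycle < 3; cycle++) {
        s = eval(s);           /* commit phase: all updates take effect together */
        printf("cycle %d: x=%d y=%d\n", cycle, s.x, s.y);
    }
    return 0;
}
```
In C the order of the two assignments matters and you have to manage the evaluate/commit split yourself; in an HDL every such assignment is its own little piece of hardware running in parallel, which is exactly the mental shift the previous paragraph is about.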

Are there any FreeRTOS interpreted language libraries available?

I work for a company that created firmware for several devices using FreeRTOS. Lately our requests for new features have outpaced what our firmware engineers can deliver, but we can't afford to hire anyone new right now either. Making even tiny changes requires the firmware people to go in and modify things at a very low level.
I've been looking for some sort of interpreted-language project for FreeRTOS that would let us implement new features at a higher level. Ideally I would like to eventually get to the point where the devices are closer to generic computers, with us writing drivers rather than implementing every feature ourselves.
Are there any FreeRTOS projects that interpret Java, Python, or similar bytecode?
I've looked on google, but since I'm not a firmware engineer myself I'm not sure if I'm looking for the right keywords.
Thanks everyone
I don't think the RTOS, or even the OS, matters too much here if the code is portable. Depending on your input & output scheme, you'll probably need to do a little porting.
Regarding embeddable scripting languages, the two I'm familiar with are Lua and Pawn.
I think there are versions of Python & other such languages ported to embedded systems, but they tend to be the embedded Linux variety. Depending on your platform (no idea if it's a little MCU with 8K ROM or an embedded PC) that might be an option.
There are no interpreted languages out there that are "made" to use FreeRTOS, or any other microcontroller threading library (loosely called an 'RTOS' within the e2e community).
However, languages that I have first-hand experience using in embedded systems, and that are (a) written in C and (b) small enough to embed in a microcontroller, include:
Lua (suitable for almost anything, even some PICs)
Python (suitable for most ARM architectures with more than 1 MB of RAM, anyway)
I do not have first-hand experience with it, but Ruby may be as easy to embed as Python.
Instead of looking for FreeRTOS-specific interpreters, you might try looking for any interpreters for your particular microcontroller, or microcontroller in general. It might be possible to interface them with FreeRTOS or turn the interpreter into a task.
There seems to be someone trying to go for Lua on FreeRTOS (pic32).
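To illustrate the "turn the interpreter into a task" idea, here is a minimal sketch assuming the standard Lua C API and the FreeRTOS task API are available on your port; the toggle_led hook, the inline script, and the stack/priority numbers are all placeholders:
```c
#include "FreeRTOS.h"
#include "task.h"

#include "lua.h"
#include "lauxlib.h"
#include "lualib.h"

/* Hypothetical C function exposed to scripts, e.g. toggling a board LED. */
static int l_toggle_led(lua_State *L) {
    (void)L;
    /* board_toggle_led();  <- replace with your real driver call */
    return 0;                /* number of values returned to the script */
}

/* One FreeRTOS task owns the interpreter and runs the script. */
static void lua_task(void *params) {
    (void)params;
    lua_State *L = luaL_newstate();               /* create an interpreter instance */
    luaL_openlibs(L);                             /* load the standard Lua libraries */
    lua_register(L, "toggle_led", l_toggle_led);  /* expose the driver hook to scripts */

    /* In a real device the script would come from flash or a file system. */
    luaL_dostring(L, "for i = 1, 10 do toggle_led() end");

    lua_close(L);
    vTaskDelete(NULL);                            /* this task is done */
}

void start_scripting(void) {
    /* Stack depth and priority are placeholders - size them for your port. */
    xTaskCreate(lua_task, "lua", 4096, NULL, tskIDLE_PRIORITY + 1, NULL);
}
```
A real system would pull the script from storage and expose a richer set of driver hooks, but the structure - one task that owns the interpreter state - stays the same.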
I guess your question boils down ultimately to finding ways of increasing the level of abstraction above the low-level RTOS mechanisms. While it is perhaps true that interpreted languages work at somewhat higher level of abstraction than C, you can do much better than that by applying methods based on event-driven frameworks and state machines. Such event-driven frameworks have been around for decades and have been proven in countless embedded systems in all sorts of domains. Today, virtually every modeling tool for embedded systems capable of code-generation (e.g., Rational-Rose RT, Rhapsody, etc.) contains a variant of such a state-machine framework.
But event-driven, state-machine frameworks can be used also without big tools. The QP state machine frameworks (state-machine.com), for example, do everything that a conventional RTOS can do, only more efficiently, plus many things that an RTOS can't.
When you start using the modern event-driven programming paradigm with state machines, your problems will change. You will no longer struggle with 15 levels of convoluted if-else statements, and you will stop worrying about semaphores or other such low-level RTOS mechanisms. Instead, you'll start thinking at a higher level of abstraction about state machines and the events exchanged among them. After you experience this quantum leap, you will never want to go back to the raw RTOS and the spaghetti code.
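For a flavour of the pattern, here is a hand-rolled sketch (not the QP framework itself): one task owns a state machine and blocks in exactly one place, an event queue, assuming a FreeRTOS port with queues enabled; the events, states, and sizes are invented for the example:
```c
#include "FreeRTOS.h"
#include "queue.h"
#include "task.h"

/* Invented events and states for a simple blinking device. */
typedef enum { EVT_BUTTON, EVT_TIMEOUT } event_t;
typedef enum { ST_IDLE, ST_BLINKING } state_t;

static QueueHandle_t event_queue;

/* The task is the state machine: it waits for the next event and dispatches
 * it against the current state - no semaphores, no nested if-else ladders. */
static void blinker_task(void *params) {
    (void)params;
    state_t state = ST_IDLE;
    event_t evt;

    for (;;) {
        if (xQueueReceive(event_queue, &evt, portMAX_DELAY) != pdPASS)
            continue;

        switch (state) {
        case ST_IDLE:
            if (evt == EVT_BUTTON) {
                /* start_blinking();  <- hardware action goes here */
                state = ST_BLINKING;
            }
            break;
        case ST_BLINKING:
            if (evt == EVT_BUTTON || evt == EVT_TIMEOUT) {
                /* stop_blinking(); */
                state = ST_IDLE;
            }
            break;
        }
    }
}

void start_blinker(void) {
    event_queue = xQueueCreate(8, sizeof(event_t));
    xTaskCreate(blinker_task, "blinker", 256, NULL, tskIDLE_PRIORITY + 1, NULL);
}
```
Other tasks or ISRs post events with xQueueSend() or xQueueSendFromISR(), so all the blocking lives in one place and the logic reads as a state/event table.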

How to move from microcontrollers to embedded linux?

As a kind of opposite to this question: "Is low-level embedded systems programming hard for software developers" I would like to ask for advice on moving from the low level embedded systems to programming for more advanced systems with OS, especially embedded Linux.
I have mostly worked with small microcontroller hardware and software, but am now doing software only. My education was also mainly in hardware and embedded topics. I haven't had many programming courses and don't know much about software design or OO coding.
Now I have a big project on my hands that is going to be done in embedded Linux. I have major problems with designing things and keeping them manageable, because I haven't really needed to do that before. Also, making use of multitasking and blocking calls instead of running "parallel" tasks from the main function is like another world.
What kind of experiences do you have on moving from low-level programming to bigger systems with OS (Linux)? What was hard and how did you solve it? What kind of mindset is needed?
Would it be worthwhile to learn C++ from zero or continue using plain C?
The main problem with using the Linux kernel to replace microcontroller systems is driving the devices you are interfacing with. For this you may have to write drivers. I would say stick with C as the language, because you are going to want to keep the user space as clean as possible. Look into the uClibc library for a leaner C standard library.
http://www.uclibc.org/
You may also find busybox useful. This provides many userspace utilities as a single binary.
http://www.busybox.net/
Then it is simply a matter of booting from some storage to a live system and running some controlling logic through init that interfaces with your hardware. If need be you can access the live system and run the busybox utilities. Really, the only difference is that the userspace is much leaner than in a normal distribution and you will be working 'closer' to the kernel in terms of objectives.
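As a small taste of how lean that controlling logic can be, here is a minimal user-space sketch that blinks a GPIO through the sysfs interface (assuming your kernel exposes /sys/class/gpio; pin 18 is just a placeholder for whatever your board uses):
```c
#include <stdio.h>
#include <unistd.h>

/* Write a short string to a sysfs file; returns 0 on success. */
static int sysfs_write(const char *path, const char *value) {
    FILE *f = fopen(path, "w");
    if (!f) { perror(path); return -1; }
    fputs(value, f);
    fclose(f);
    return 0;
}

int main(void) {
    /* Export the pin and configure it as an output (the pin number is a placeholder). */
    sysfs_write("/sys/class/gpio/export", "18");
    sysfs_write("/sys/class/gpio/gpio18/direction", "out");

    /* The "controlling logic" started from init could look much like this. */
    for (int i = 0; i < 10; i++) {
        sysfs_write("/sys/class/gpio/gpio18/value", (i % 2) ? "1" : "0");
        sleep(1);
    }
    return 0;
}
```
On a busybox-based system a program like this could be started straight from an init script.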
Also look into real-time Linux if you need some formal promise of task completion:
http://www.realtimelinuxfoundation.org/
I suspect the hardest bit will be booting/persistent storage and interfacing with your hardware if it is exotic. If you are unfamiliar with Linux booting, then
http://www.cromwell-intl.com/unix/linux-boot.html
might help.
In short, if you have not developed at a deep level for Linux, built your own distro, or have kernel experience then you might find the programming hard-going.
http://www.linuxdevices.com/ might also help.
Good Luck
In order to work with Unix/Linux you should get into the Unix philosophy: http://www.faqs.org/docs/artu/ch01s06.html
I consider the whole book a quite interesting read: http://www.faqs.org/docs/artu/index.html
Here you can find a free Linux distro for embedded targets plus bootloader to get you started: http://www.denx.de/wiki/DULG/WebHome
I was in a very similar predicament not too long ago. I bought and read Embedded Linux Primer, and it was a very helpful way to make the mental transition to a high-level OS (from a microcontroller perspective).
If you have the time to "take your time," you could obviously make the transition on your own. But if you need to get up to speed quickly, you may want to strongly consider getting a technical mentor to help guide you.
You may also find it useful to work your way into Linux by starting out with uClinux. It's basically Linux on a microcontroller. You could get a feel for the kernel, without the virtual-memory aspect of it, as a transition. See if uClinux supports a microcontroller that you are already familiar with, and see how the kernel interacts with that architecture.
I agree that the Embedded Linux Primer book is great for getting your brain wrapped around embedded Linux. You're better off sticking with C for now. C++ can wait, and it's more useful for applications than for driver code.
When you're comfortable with how uClinux operates, you can then move on to a normal Linux kernel on a microprocessor architecture, such as ARM, that has an MMU and virtual memory.
Just my two cents!

Mono for embedded

I'm a C# developer interested in embedded development for chips like the MSP430. Please suggest some tools and tutorials.
The Mono framework is very powerful and customizable; Mono-specific examples would be especially helpful.
Mono requires a 32-bit system; it is not going to work on 16-bit systems.
There is currently no full Mono support for the MSP430.
Mono doesn't run in a vacuum - you will need to make a program that exposes the microcontroller functionality to Mono, then link it with Mono and program the entire thing onto the microcontroller. This program will have to provide some functionality to Mono that is normally provided by an operating system.
The page igorgue linked to gives you a good starting point for this process: http://www.mono-project.com/Embedding%5FMono
I don't know what the requirements of the Mono VM are, though. It may be easy to compile and use, or you may have to write a lot of supporting code, or dig deep into Mono to disable code you won't be using or can't support on the chosen microcontroller.
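In outline, embedding the runtime from C looks something like the sketch below (these are standard Mono embedding API calls, but the assembly name and the Device.Hal::ReadPin internal call are invented for illustration, and on a bare microcontroller you would also have to supply the OS-like services mentioned above):
```c
#include <mono/jit/jit.h>
#include <mono/metadata/assembly.h>
#include <mono/metadata/loader.h>

/* Hypothetical native function exposed to managed code, e.g. reading a pin. */
static int read_pin(int pin) {
    (void)pin;
    /* return hal_read_pin(pin);  <- replace with your hardware access */
    return 0;
}

int main(int argc, char *argv[]) {
    /* Initialise the runtime and load the managed application
     * (the assembly name "app.exe" is made up). */
    MonoDomain *domain = mono_jit_init("app.exe");
    MonoAssembly *assembly = mono_domain_assembly_open(domain, "app.exe");
    if (!assembly)
        return 1;

    /* Expose hardware functionality as an internal call. On the C# side:
     *   [MethodImpl(MethodImplOptions.InternalCall)]
     *   static extern int ReadPin(int pin);   // in class Device.Hal */
    mono_add_internal_call("Device.Hal::ReadPin", read_pin);

    /* Run the assembly's Main() and shut the runtime down. */
    int ret = mono_jit_exec(domain, assembly, argc, argv);
    mono_jit_cleanup(domain);
    return ret;
}
```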
Further, Mono isn't gargantuan, but it's complex and designed with larger 32-bit processors in mind. It may or may not fit onto the relatively limited 16-bit MSP430.
However, the MSP430 does have a GCC port, so you don't have to port the Mono code to a new compiler, which should make your job easier.
Good luck, and please let us know what you decide to do, and how it works out!
-Adam
The tools to use Mono on an MSP430 just aren't available. Drop all the C# and use C/C++ instead.
MSP430 devices usually have 8 to 256 KB of flash and 256 bytes (!) to 16 KB of RAM.
Using C# or even C++ is really not an option. Complex frameworks are also a no-go.
If you really want to start with MSP430 (which are powerful, fast and extremely low-power processors for their area of use), you should look for the MSPGCC toolchain.
http://mspgcc.sourceforge.net/
It contains a compiler (GCC 3.22-based) along with all the necessary tools (make, JTAG programmer, etc.). Most MSP processors are supported, with code optimisation and support for internal hardware such as the hardware multiplier.
All you need is an editor (you can use Eclipse, UltraEdit, or even plain Notepad) and some knowledge of how to write a simple makefile.
And you should be prepared to write tight code (especially in terms of RAM usage).
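For a flavour of what code on these parts looks like, here is the classic LED-blink sketch in C (the msp430.h header and the P1.0 pin are typical of MSP430 toolchains and eval boards, but treat them as placeholders for your particular part; build with the MSP430 gcc and the -mmcu option for your device):
```c
#include <msp430.h>

int main(void) {
    volatile unsigned int i;

    WDTCTL = WDTPW | WDTHOLD;        /* stop the watchdog timer */
    P1DIR |= BIT0;                   /* P1.0 as an output (LED on many eval boards) */

    for (;;) {
        P1OUT ^= BIT0;               /* toggle the LED */
        for (i = 0; i < 50000; i++)  /* crude busy-wait delay */
            ;
    }
}
```
Everything here maps directly onto a handful of registers and a few hundred bytes of flash - which is exactly why C#, or any sizeable framework, is a poor fit for this class of device.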
I think that Netduino may be of some interest to you.
Visit their web site at http://netduino.com/.
It's opensource hardware (like Arduino, http://www.arduino.cc/).
It runs the .NET Micro Framework (http://www.microsoft.com/en-us/netmf/default.aspx), the variant oriented toward embedded development.
Regards,
Giacomo

Where should I begin with HDLs?

I am a self-taught embedded developer. I mostly use AVRs programmed in C and ASM, but I have dabbled with other systems. I am looking to move onto more complex devices like CPLDs and FPGAs, but I have no idea where to start. So my one and a half questions are:
Do you prefer VHDL or Verilog and why?
What is a good way for someone with no prior experience in HDLs to get started learning such a beast?
Buy a cheap starter kit from Xilinx or Altera (the two big FPGA players). A Xilinx Spartan3 starter kit is $200.
I personally prefer VHDL. It is strongly typed and has more advanced features than Verilog. VHDL is more popular in Europe, while Verilog dominates in the US.
Buy a book (e.g., Peter Ashenden's The Designer's Guide to VHDL) and start simulating your designs in a free simulator. ModelSim from Mentor Graphics is a good one, and there are free versions available (with crippled simulation speed).
Make up some interesting project (mini CPU, VGA graphics, synthesizer) and start designing. Always simulate and make sure your design works before putting it into the hardware...
If you have no background in digital electronics buy a book in that subject as well.
Back in the day when I worked on ASIC design, it was in verilog. In many cases as a designer you don't get to choose: the ASIC synthesis tools for an HDL cost a substantial amount of money, and companies only purchase the full toolchain for one "blessed" language. My employer had standardized on verilog, so that is what we used.
FPGA synthesis tools are substantially cheaper, so you have more freedom as an FPGA designer to pick your favored language and tools.
There are also free verilog simulators available at verilog.net.
As #kris mentioned, an FPGA starter board is also a good way to go. Having your verilog code light up an LED on a board is infinitely more satisfying than a simulator waveform on the screen.
Also check out opencores.org - There are some articles and a lot of open source code in both Verilog and VHDL you can learn from.
As far as I can tell, VHDL vs Verilog gets just as religious as Ruby vs Python or Java vs C#. Different people have their own favourites.
Check out this site:
http://www.fpga4fun.com/
Nice simple projects using simple tools. I used one of these boards a few years ago to build a small VGA display system for use as a notice board.
Looking at the site again I'm thinking of getting a Xylo-LM board as it has an ARM processor as well as SDRAM and a Xilinx Spartan 3e.
Another board I used before was the XPort 2 from Charmed Labs. This plugs into a Gameboy Advance which is well supported with open source development tools.
Check out:
http://www.charmedlabs.com/index.php?option=com_virtuemart&page=shop.browse&category_id=6&Itemid=43
One additional thing to think about is whether you should start by learning an HDL, or by learning boolean logic, Karnaugh maps, DeMorgan's theorem, gates, implementing arithmetic in gates, etc. It's easy to write non-synthesizable HDL if you don't have an accurate mental model of what the underlying hardware will look like.
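If the boolean-logic side is new to you, even a few lines of C (familiar ground for an AVR programmer) can make identities like DeMorgan's theorem concrete before you meet them again as gates; this little check is purely illustrative:
```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* DeMorgan's theorem:  ~(a & b) == ~a | ~b   and   ~(a | b) == ~a & ~b.
     * Check it exhaustively for all pairs of 8-bit values. */
    for (unsigned a = 0; a < 256; a++) {
        for (unsigned b = 0; b < 256; b++) {
            uint8_t x = (uint8_t)a, y = (uint8_t)b;
            if ((uint8_t)~(x & y) != (uint8_t)(~x | ~y) ||
                (uint8_t)~(x | y) != (uint8_t)(~x & ~y)) {
                printf("counterexample: a=%u b=%u\n", a, b);
                return 1;
            }
        }
    }
    printf("DeMorgan's theorem holds for all 8-bit values\n");
    return 0;
}
```
The same identity is what lets a synthesizer swap an AND-plus-inverters structure for a single NOR gate, so the mental model carries over directly once you start writing HDL.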
This book is the Verilog version of the one I used in undergrad, and it did a pretty good job in my opinion. It starts you out with the material mentioned above, as well as some basic, basic info on the transistor-level implementation of gates, then introduces you to an HDL, and has you build progressively more complex structural and behavioral hardware blocks. Yes, I know it's ungodly expensive, as are most college textbooks, but this is one of those things for which the information I've been able to find online, at least, has been woefully inadequate.
Once you're ready to choose an HDL, I heartily recommend Verilog (having learned VHDL first). Yes, VHDL was once much more feature-rich than Verilog, but later revisions of the language (Verilog 2001, Verilog 2005, SystemVerilog, etc.) have cherry-picked most of the interesting features, and there is far more robust toolchain support for Verilog and its variants these days, in addition to it being the dominant language in use in the US (in my experience, VHDL is only used here when dealing with extreme legacy blocks and in academic contexts, partially due to the tool support mentioned previously). Finally, once you've learned the HDL, you have a hardware verification language (HVL) in SystemVerilog with strict-superset syntax, saving you a good bit of the learning curve. Not so for VHDL, to my knowledge.
Altera and Xilinx have simulators built into their free tool sets. They are limited versions of the very popular Mentor ModelSim tools. They will handle the size of designs you are likely to fit into a < $500 (US) board.
As for the HDL choice, Verilog is to C as VHDL is to Ada. So Verilog is easier to get started with, but you can also make mistakes more easily. Check your simulation and compilation warnings to avoid those problems.
For real-world Verilog to study, check out http://www.opensparc.net/
HTH
Verilog is much easier to learn and has simpler syntax. It's also a newer language. Secondly, most people use Verilog. VHDL has many datatypes, which give it a learning curve. Once you know Verilog, it will be easier to bridge the gap to VHDL. Oh, and there are also macros in Verilog, which are very neat - I invented a language with them. Finally, you will eventually be able to do mixed-language HW design. I started out with VHDL, then learned Verilog, and am now pro-Verilog.
I was in the same boat as you are now a semester ago. My preferred book was this one, since it talked about FPGAs by reviewing digital logic. It also shows side-by-side comparisons of VHDL and Verilog code so that, instead of choosing one that people may push you to, you can learn the one that you like stylistically.
As for the FPGA itself, use Xilinx's ISE WebPack to do your programming (it's free), and start off with something like the Basys2 FPGA board. It's a very small FPGA that should get you started for a small price, but it has the added advantage that you learn resource and memory management very early. You can use Digilent's Adept (also free) to make life easy when uploading your "compiled" code to the board.
Good luck!
Before plunging into Verilog/VHDL or buying an FPGA dev kit, I'd recommend taking an introductory class on digital design. There are good MIT OpenCourseWare classes available online.
Good luck.