What is MAGIC programming language? Which other language is closest in syntax? [closed] - magic-unipaas

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I have recently heard about Magic programming language from several sources and didn't recall ever hearing about it before. It was mentioned that it is a programming language from Israel.
I did some googling and couldn't find much information about it. I couldn't find any code examples, and Wikipedia didn't have any information on it either.
I think this is the site for it: http://www.magicsoftware.com/en/products/?catID=70 though I am not sure, as it mentions uniPaaS instead of Magic. However, other material on the site indicates that this is the new name for it.
I was interested in learning more about it from its practitioners, rather than the company. I saw several claims on the internet that it provided really fast application development, similar to claims made by RoR proponents when it came out.
How does it compare to VB?
Is it still a better RAD tool than current .NET or MVC frameworks like Django, RoR, etc.?
How hard is it to learn?
If you can post some sample code it would be most helpful as well.
Could this site be it? Though it links back to the page above.

You're right, my friend: Magic is the original name of the "programming language"; nowadays it is called uniPaaS (uni Platform as a Service). I use it to develop some business applications. It is maybe the fastest way to create applications (data manipulation); you can create apps in just a few days. But like everything in life, it has its own drawbacks:
It's very weird, and that makes it difficult to learn.
You do not have all the control over what's happening in the background.
You have to pay a lot for licensing (servers, clients, etc.).
If you are interested in learning this, you can download a "free" version of the software, called uniPaaS Jet, that only works with SQLite databases.

The Magic language is, as it's called today, uniPaaS; it used to be Magic, then eDeveloper, and now uniPaaS, as PachinSV mentioned before.
uniPaaS is an application platform enabling enterprises, independent software vendors (ISVs) and system integrators (SIs) to more successfully build and deploy business applications.
You can download the free version of uniPaaS Jet here: http://web.magicsoftware.com/unipaas-jet-download,
try it yourself and see how easy it is to use.
Magic technology, as you described, is a Magic Software Enterprises tool (uniPaaS); you can find more information at:
official website: www.magicsoftware.com/en/products/?catID=70&pageID=55
uniPaaS Jet developer group on facebook: https://www.facebook.com/groups/unipaasJet/
Magic developer zone: devnet.magicsoftware.com/en/unipaas
Let me know if you find the information helpful.
Bob

As PachinSV explained, there is a RAD tool once called Magic, then eDeveloper, now uniPaaS. This RAD is dedicated to database applications. Programming in it does not look like anything else I know; you mostly don't write code as with usual languages, but it is nearly impossible to explain just with words. The applications are interpreted, not compiled.
As PachinSV said, when developing, you must follow uniPaaS's way of doing things. This is probably why so many people never manage to use Magic properly: if you thought like Magic before learning about it, then you will adapt to it easily; but if you have a long and successful history with other database development tools, then often the Magic paradigm will never become natural to you. The learning curve is quite steep: you must learn a lot of things before being able to write a little application.
Previous versions stored the "code" inside a database table. The latest version, uniPaaS, stores the code in XML files. I could send you an example if PachinSV does not answer you before. But the files are pretty big: the smallest XML file I have in a test app is 4,000 bytes, any application is made of at least 11 files, and an empty application is 7,600 bytes. You must also understand that developers never use those files directly (they are undocumented, AFAIK); they are only the storage format used internally by uniPaaS. The only way to use them is to set them up as a uniPaaS application.

I'm still an active Magic developer... This is the old name used, and it's a completely different paradigm, as some of you mentioned. I've been developing with it from Magic version 8.x to eDeveloper 9.x to 10.x, then renamed to uniPaaS.
The newer version is much easier to use, and it is still very RAD in the sense that there is little or no code you write... A lot of the common programming tasks, like I/O, SQL commands, etc., are handled by the tool and are transparent (so even less code to write, since we use it in almost all types of applications)... It's mostly an enterprise tool; you wouldn't use it for small applications...
You can download the free version to learn the paradigm... but the enterprise licenses are expensive. You need both the development tool and the runtime license if you want to deploy, so it can be costly for small-scale projects...
I enjoy it personally, especially when you have to do quick proofs of concept, or a quick data migration, or porting onto any DB platform and bridging any existing system through the wide range of gateways they provide with the licensed version. It is up to date with the commonly used web technology out there... like SOAP, RIA...
It's more popular in Europe... The HQ in the States is in Irvine... We used to have 2 branches in Canada, but they closed down in 2001... Visit the Magic User Group on Yahoo; it's a very active forum with lots of cool people who will help you out in your quest:
http://tech.groups.yahoo.com/group/magicu-l/

I programmed with Magic for 6 years and found it to be an amazingly fast tool, easy to understand if you are a competent database programmer, because all operations are really about data manipulation. It is certainly a niche area to develop in, and because of this, jobs are few and far between. As it is interpreted, there are really no low-level coding bugs to make. It will work with many databases/connections simultaneously, but there is a big memory and processing hit.
Drawbacks :
Little control over communications between machines and devices
No mobile API as yet
Niche area so few skilled practitioners or companies willing to invest.
Good Points :
You can say you are a Magician; you can impress people with uber-fast app development (really).
It is easy to understand if you don't have a PhD in maths.
Zero programming "bugs" can creep in. What you do is what you get.

I developed in the original Magic PC referred to by several of the folks above.
It is exactly this: FAST, FAST, but expensive and rigid in what it will allow you to do. It works on a tic-tac-toe-like matrix. Dropping commands into the various sections determines when they are run. The middle column is run indefinitely until you break the cycle; it is like a Do Until loop. If you have to do an item once, you put it into this infinite loop and end it after one cycle.
The first column's operations are run first, ONCE, before the infinite middle column is run. The third column of commands is run once, after the infinite cycle. It is very efficient and logical once you get over the idea of an infinite loop.
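For readers who think in conventional code, here is a rough Python sketch of that three-column execution model. The structure (run-once prefix, repeating main loop, run-once suffix) is as described above; the function and variable names are made up for illustration, since real Magic tasks are configured in the tool's matrix rather than written as code:

```python
# Rough Python analogy of a Magic task's three-column matrix.
# Names are hypothetical; real Magic tasks are configured, not coded.

def run_task(prefix_ops, main_ops, suffix_ops, more_records):
    """Prefix runs once, the main column repeats per record, suffix runs once."""
    for op in prefix_ops:        # first column: once, before the loop
        op()
    while more_records():        # middle column: repeats until the cycle breaks
        for op in main_ops:
            op()
    for op in suffix_ops:        # third column: once, after the loop
        op()

# A "run once" job is the same loop, ended after a single cycle:
records = iter([{"id": 1}])
run_task(
    prefix_ops=[lambda: print("open table")],
    main_ops=[lambda: print("process record")],
    suffix_ops=[lambda: print("close table")],
    more_records=lambda: next(records, None) is not None,
)
```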
Types can be specified, each with an associated program to present the type. Then everywhere the type is used, all the settings automatically kick in. I especially like that one can write the program and, 5 months later, change the name of a variable and have it carried throughout the program. In fact, the program does not use your name for anything: the internal name of any and all variables is hidden from the end user, so of course it is not a problem to change a name. It takes a minute to write an input program for any table. It takes a minute to write an export/import program for all the data files in the database.
Attaching to a type of database, like Btrieve or SQL, is independent of the program itself.
I stopped using the language because they demand more for the runtime engine than I could charge for the programs I wished to run with it. Bill Gates went the opposite direction. VB is superior in control and in being able to drop 10 DataGridViews onto the same screen, but development is 10 times slower.
Its niche, then, is proof of concept for a program in a big company, or conversion/importing/exporting for a development company. It is good for $25k programs that are database-heavy and not going mobile.
uniPaaS, Magic PC

I did some Magic work around 1993. It was a DOS-based 4GL that came from Israel. Haven't seen it since.

How does it compare to VB?
It doesn't.
Is it still a better RAD tool than current .net or mvc frameworks like django, ror ...etc?
If you mean "is it more Rapid", then yes, otherwise no.
How hard is it to learn?
About as hard as learning MS Access.
Coincidentally, if you want to get an idea of what it is and how it works, I've found that comparing it to MS Access is handy. It works in much the same way from a user's or developer's perspective. Obviously what happens in the background is vastly different, but if you've ever developed a form in design view in Access, Magic will seem very familiar.

Google tells me there's also MAGIC/L. All I could find about it was this blurb:
A procedural language written in Forth. Originally ran on Z80s under CP/M and later available for IBM PCs and Sun 3s.

The only Magic programming language that I know about is one used by a company called Meditech. It's a proprietary language derived from MUMPS.
The language is truly miserable - here's a sample.

Related

How much time do I need to learn LabVIEW [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 1 year ago.
I know that this question is too abstract. But. How much time do I need to learn LabVIEW to become an average LabVIEW developer? For example, if I buy a good book about LabVIEW and have 8 hours per day (at my work) dedicated to learning LabVIEW, how many days will I spend on it? Could you please provide an example from your own experience? Some information about me that may be helpful: I'm a developer and know C/C++/Python and a little bit of Java.
Like Swinders said, it might depend a lot on your sensibilities. I have seen people who had a really hard time migrating to the data flow concept. It's a different paradigm from the classic text-based languages and some people can't easily think in these concepts.
If you get past that hurdle, you'll find that the IDE handles a lot of the annoying things you used to take care of for you (things such as syntax and memory allocation). This allows you to become productive very quickly.
It doesn't mean, however, that your level would be high. One potential pit you should try hard to avoid is casting your existing experience onto LV. The most common example is probably local variables. This may be shocking to people coming from a text-based world, but LV does not have variables per se. Unfortunately, it does have elements called variables, and people migrating from C who find them jump on them and use them as they would use variables in C, leading to LV code which looks like C code and is bad code (at least in LV).
If you do manage to work around this, I would guess you would become better than the global average in less than a month and better than most professional developers after creating three projects you would later look at and say "what the hell was I thinking?".
I never took any of the NI courses (although I understand some of the advanced architecture ones are pretty good), but I would suggest you also spend some time in some of the online communities (such as LAVA or the NI forums) and look at some of the examples and discussions there. There's a lot of material about best practices, design patterns, etc., which would allow you to become a more professional developer.
Above all, do not abandon your current professional conduct. If you have a structured process for designing and developing software, you already have a leg up on the majority of LV programmers. Just make sure you adapt and keep using such a process.
I started with no commercial programming experience (I have always programmed for fun) and followed an on-line tutorial to pick up the basics of LabVIEW. Within a week I was able to understand existing code and could develop a small application.
It is hard to give an estimate on how long it would take to become an 'average' LabVIEW developer as this depends on what you mean by 'average'. One thing to consider is how easy you are able to think in terms of data flow rather than procedural languages. If you can pick up new programming languages quickly then this will help.
Would you be the only person using LabVIEW or are there others at your place of work that could mentor you? You may also find that there are user groups operating near you which I would recommend (check the NI website or contact your local NI office).
There is then the experience that you will need to gain to allow you to produce good LabVIEW code. I was lucky to be able to attend the National Instruments training courses a few years ago which I think helped me but only by using it have I become an 'average' LabVIEW developer.
I'd say a few weeks at most if you devote the majority of your work time to it. I had a similar background to yours when I started to develop in LabVIEW. The hardest part was adapting to the lack of variables. There are local variables, but they're not what you're used to at all. Additionally, their functions, called Virtual Instruments (VIs), can have multiple inputs and outputs, similar to how Python can handle n-tuples.
I will warn you, its array-handling features are terrible. A lot of general concepts you might be used to are difficult to implement. My mantra when working with the language is that it makes hard things easy and easy things hard. There are also a lot of "gotchas" in the language set, especially with the DAQmx functions. I'm not sure what you're planning on developing, and the Real-Time module has its own issues as well, different issues from the main language set.
I would definitely spend some time on NI's website and read as many whitepapers as you can, especially about good design practices, here and here. Learn their State Machine (here or here) and Producer/Consumer patterns well; that's the backbone of many applications you'll be writing.
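LabVIEW itself is graphical, so the pattern can't really be shown as text, but conceptually the Producer/Consumer pattern is just two loops sharing a queue, so that slow processing never blocks acquisition. A rough Python sketch of the idea (illustrative only, not LabVIEW code):

```python
# Conceptual sketch of LabVIEW's Producer/Consumer pattern: one loop
# enqueues data, another dequeues and processes it, so a slow consumer
# never blocks the producer (e.g., a DAQ/event loop).
import queue
import threading

q = queue.Queue()

def producer():
    for sample in range(5):      # stands in for an acquisition loop
        q.put(sample)
    q.put(None)                  # sentinel: tell the consumer to stop

def consumer():
    while True:
        item = q.get()
        if item is None:
            break
        print(f"processing sample {item}")

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
```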
Good luck, it will make your head spin for a while.
There are some excellent resources to help you get started. If your employer can afford training, you can get started pretty quickly by taking a week of training run by National Instruments. The NI website also has an outstanding developer community that is highly responsive to questions even from novice developers. But I would say that the key to being comfortable with the idioms and style of the language is just plain old practice that you get by solving problems using LabVIEW on a regular basis.
You will find eventually that there is the question of hardware and instruments. LabVIEW is really all about data acquisition -- either through NI's DAQ hardware, or through traditional GPIB instruments, or through third-party APIs (ActiveX, .NET assemblies). If you're using LabVIEW, you're probably interfacing to hardware of some type. This can get really challenging with complex instruments and measurements. If you're getting started, I would recommend making sure that you have unlimited access to at least some of the hardware you'll be working with. In other words, make sure that your manager understands that you need a lot of access to the hardware in order to get good at developing with it.
We are using LabVIEW to create test software for our factory test systems. In the past years I have trained several beginners in LabVIEW. I would say it depends on how good you are at learning new concepts. I have trained some who were able to produce standalone applications using the queued message handler concept, building dynamic GUIs and using hardware drivers within about 3 weeks. Unfortunately, there were others as well who were only able to learn half of that within half a year.
The most important thing, in my opinion, is the learning source. Having an experienced LabVIEW user who can guide you is the best option. If no one is available, I would recommend YouTube tutorials combined with the shipped LabVIEW examples.
The LabVIEW Core tutorials are not very handy, in my opinion. They are quite boring and far from what you really need to get started.

How to learn "the way of ABAP"? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I have never worked with SAP solutions. I have a reasonable understanding of business, but no accounting background. How can I learn ABAP through examples that will simultaneously enlighten me about the "way of SAP"?
It doesn't have to be deep knowledge, just something to start with for somebody who was in the world of Python and C# but needs to understand how the SAP world works.
(this is not a duplicate of "Learning SAP-ABAP")
Learning ABAP is not particularly difficult if you know other programming languages.
Let's first distinguish between ABAP and ABAP OO. ABAP is the old, procedural language and ABAP OO is its extension with classes.
ABAP has the usual control structures, like if-then-else or loops. Its syntax takes a bit of getting used to (I found the rules about putting or not putting spaces before parentheses especially annoying), but it is definitely doable.
There are some structures you don't find in C++ and C#, for example the grouping of functions in function groups, which have their own local variables, so if you call something that is in a different function group, things can get messy.
But generally, if you understand scope and namespaces, it shouldn't be a problem.
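Since the asker comes from Python, a loose analogy may help: a function group behaves a bit like a Python module whose functions share module-level state that callers never touch directly. A hypothetical sketch (not real SAP code; all names are made up):

```python
# Loose Python analogy of an ABAP function group: the "function modules"
# in one group share group-local state invisible to callers. Hypothetical.

# file: material_group.py -- plays the role of one function group
_buffer = {}                 # group-local state shared by the functions below

def read_material(mat_id):
    """Like a function module: may silently cache into the group's state."""
    if mat_id not in _buffer:
        _buffer[mat_id] = {"id": mat_id, "loaded": True}
    return _buffer[mat_id]

def clear_buffer():
    """Another function module in the same group, touching the same state."""
    _buffer.clear()
```

Calling across groups means crossing these hidden state boundaries, which is where things can get messy, as noted above.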
I found ABAP OO pretty straightforward compared to ABAP, because it basically only added the classes / packages that I knew from C++ / C# before.
How to learn them: I would propose the following for someone who is new to ABAP and wants a DEEP knowledge of it (see later for the more functional aspects):
- buy yourself a proper ABAP book, e.g. something from SAP Press
- don't read it just yet
- start with a web course or a simple book, along the lines of "learn ABAP in 24 hours"
- start coding
- as you are coding, you will inevitably ask yourself: "How does this and that work? Is the PERFORM using pass-by-reference or pass-by-value for passing the arguments?" Look those questions up in your proper ABAP book
- probably after a few months, you will be familiar enough with the language that you can read through the book without falling asleep
Just a caveat: it IS a useful skill to know ABAP programming, but even if you don't consider the other technologies SAP consists of (like workflow or PDF forms, which don't have anything to do with ABAP), there are still a lot of frameworks that differ in their logic. Just like knowing C++ and the Win32 framework does not mean you can start banging out code that runs under UNIX, knowing ABAP does not mean that you can work productively in a specific module right away. Unfortunately, SAP modules tend to use different frameworks, some of them more reused than others.
If you do not want a deep knowledge of ABAP, but want to understand the SAP modules functionally, you should consider using the products themselves in addition to programming and learning about the functional aspects.
I'm afraid there is really no quick way to learn how the "SAP World" functions; you need to have a bit of technical, functional and also architectural knowledge for that and since the modules are so vastly different from all those aspects, it takes a lot of time until you can have a vast overview of everything. But even with technical and some functional knowledge you would be well on your way; as they say, "in SAP, nobody expects you to know everything".
There are at least two different sets of issues you should be looking for:
Knowledge of ABAP as a programming language
Knowledge of the "Business Domain" that you're writing your software for and its implementation in SAP - tables, forms, programs, reports, etc. (and each of the modules, such as FI, HR, etc., is more than a normal person can usually be proficient in)
(1) gives you general knowledge on how to write a program, read and update the database, and maybe write a GUI. But the programs that you write will almost always be in the context of (2), so you will need to know that as well.
If you want to get started, it is best to have some general knowledge of the ABAP language, the business domain can't really be learned from a book. Actual project work is much more helpful.
Start by downloading one of the NetWeaver trial systems from sdn.sap.com (choose one of the ABAP trial systems).
For reference you have the ABAP manuals online here (the reference documents also have a lot of small example programs). For more example code you can enter transaction SE38 (report editor) and search for programs starting with BC or starting with DEMO (put BC* in the name box and press F4).
Since you asked me to respond to this question as well: I was hired as an SAP Java developer because there are very, very few on the market, even though I didn't know anything about SAP Java before I started. I got advice from my co-workers and learned as fast as possible to become productive. It wasn't such a big deal in the end.
I'm one year into the business now, but I'm still in a newbie state. The SAP technology environment is huge. SDN (the SAP Developer Network) is my best friend whenever I'm stuck.
It definitely helps when you end up in a company with SAP knowledge, because you don't have to build up all the systems from scratch and you have the licenses for the various SAP products at hand. Most trial versions from SAP just won't do the trick in the long run.

Getting out of CRUD [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
Definition:
CRUD - Create, Read, Update, Delete; The four basic functions of persistent storage. In the context of this question, specifically related to business applications.
I'll be honest, my goal when I began programming did not include being a lifetime CRUD programmer. Financial data is only so interesting for so long. And to me, that seems like the majority of programming gigs.
I'm still fairly fresh out of school, so any experience is still very beneficial, but eventually I want to move to something "less CRUD like." Currently, I have my eye on some machine control type work. However, I'm just not sure how to go in that direction.
So I want to get a feel for what other developers think about the topic.
Do you enjoy CRUD and why?
What have your experiences in CRUD been like?
How did you move from CRUD to non-CRUD work?
If you've moved, what do you like and hate?
If you've moved, what skills benefited/hastened the transition?
Edit:
I'm approaching CRUD with the attitude that I want to solve problems, not re-create the same form with different fields for a dozen different tables.
I don't think that there's really anyone who enjoys doing CRUD (well, at least anyone sane). It's the most tedious part of web programming. My advice is to find or write a framework to automate this for you, for example:
Evolutility
the Django admin panel and Django forms (see the sketch below)
However, if that's the majority of your work, you definitely should consider changing jobs.
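To make the Django suggestion concrete, here is a minimal sketch of how little code the Django admin needs to give you full CRUD screens over a table (the model and field names are made up for illustration):

```python
# models.py -- a hypothetical invoice table
from django.db import models

class Invoice(models.Model):
    number = models.CharField(max_length=20)
    amount = models.DecimalField(max_digits=10, decimal_places=2)
    paid = models.BooleanField(default=False)

# admin.py -- one registration line buys create/read/update/delete screens
from django.contrib import admin
from .models import Invoice

admin.site.register(Invoice)
```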
Get a different job. Seriously, not all software development is developing business applications. Developing shipping software would make you much happier, I think. Try to find a job at a software company, and write some stuff that's going to ship to customers. Also, if you want to get into some of the low level hardware-style stuff, just start hacking away on some basic microcontrollers so you have at least some background with that.
Develop a framework to make CRUD creation easy in your line of work. Once you have done that, use the free time to improve it in terms of Usability, Security, Performance etc. That should keep your work interesting for a while.
I agree that CRUD's pretty boring. But I don't think it's the fact that it's financial data that makes it so. Perhaps you'd find that financial data a lot more interesting if, for example, it was streaming into a neural net based expert system you'd written to work out how best to invest it?
There's definitely an awful lot more to programming than CRUD. Find an aspect that interests you, and pursue it.
I am curious that no one has mentioned task-based UI and CQRS here.
In fact, to answer your questions:
I don't enjoy CRUD... why? See the following answers to your question.
My experience is that CRUD is a pain to write (full-stack frameworks are a workaround at best, I think), and often a pain for users as well.
I moved to non-CRUD work when I understood that building software is about giving users a powerful tool, not a database editor with some business rules.
I like to build software less coupled to a full-stack framework (Symfony2, ASP.NET & co.), more fully object-oriented, but I am more and more annoyed by the CRUD orientation of RDBMSes, and more and more attracted by EventStore (Event Sourcing).
Let's get inspired by task-based UI, CQRS and Event Sourcing, all together (search Google; I do not have enough reputation to add more links...).
However, I would like to be less opinionated to finish: there are some points that will not let you get out of CRUD. Some users love CRUD; they feel like they're in Excel... and there are probably some applications for which CRUD fulfills all the needs...
CRUD - yes, in the end we are storing, reading and updating data. But so what? That is just one part of the equation, at least in my world.
In business, data is essential, but it is the business logic and the decisions made from that data that are important. I have found it very rewarding to take raw data and use it to help businesses make decisions. We do that with business logic in our code, not to mention the endless ways of presenting that data in the presentation layer.
Yes in the end CRUD is involved, but it is much more than that, no?
Just my opinion.
Having a wide range of experience, my solution is to create my perfect product and start a business around it. I'm facing all sorts of interesting challenges, such as how to stream realtime data from an embedded device to a browser. This stuff gets my programming juices flowing and I have a list of important, fun features to add.
Dream up your perfect product domain, find some people who could benefit and ask them what problems they have. Once you pick up a common theme that interests you (mine was automation and power monitoring) start hacking. Of course for me it helps that my father has run the electronics company Technman for the last 30 years, and wants to create this product with me.
First, have you gone through most of what there is to know about persistent storage? It's worth figuring out how to practically apply database theory, etc. in your current job. Once you've been doing it for a few years and have it all figured out you should definitely think about expanding your horizons. I'd agree with you - unless you're building the DBMS itself - I find that the persistent storage part of the job gets to be fairly boring.
One of the best ways to get a job in a new area is to take a prototype of something relevant to the job to demonstrate at an interview. This is an incredibly powerful statement to make.
Embedded software is really my thing, but the market for this is slowly shrinking in North America and moving to the developing world, and it's a fairly specialized area to get started in.
It seems to me that the application space is still growing. Consider iPhone, J2ME, or Windows Mobile development for example. You can learn to do these on your own with a relatively small investment in equipment.
If you're not already doing this, there's also the web application space. Application server platforms like JBoss and GlassFish are free and fairly easy to learn. Plus, they provide a link back to the CRUD which you already know.
Yes, a lot of business software is CRUD. I used to work on that.
In machine control, part of that can be CRUD too. For example, logging sensor data and reporting it somewhere. Basically CRUD.
But I will admit - in machine control, it's mostly non-CRUD. You would probably enjoy doing something that actually makes an assembly line move, or builds cars, or makes motors spin at a certain speed. I know I do. At a financial institution, it's literally just numbers. Nothing "real" like a motor or a car.
Just about every program is going to have to create, read, update and delete some sort of data. In some systems this presents its own challenges.
However, most of the time reading and writing to databases is fairly easy (which is why they make databases). It is what you do with the data once you have it that is interesting, generally unique to a business, and what keeps you employed.
I agree with this article: basically, a lot of programming is boring.
However, if you are good and determined enough, you will eventually get to do something interesting.
Find or write a way to do the CRUD portions of the applications faster. Do so, tell your manager you are done with your assigned tasks (make sure they ARE done; tested, documented, etc.), and ask what you should do next.
Just take a look at Django and move on to the interesting coding!
(Or RoR, or Grails, or whichever suits you best, but CRUD shouldn't still be coded by hand from scratch.)
Modern frameworks can do all the CRUD for you. Check out the standalone GORM from the Grails project.
When I was an undergraduate, I changed my major from Electrical Engineering to Computer Science because I wanted to write video games. Later on, when I started working on business applications for real money, I learned that I simply enjoy solving problems with code.
You may be in the wrong profession.
In this economy, it might be hard for you to just get another job, but that doesn't mean you shouldn't try. Find some type of work you think you would enjoy, go learn it and look for job opportunities. It doesn't hurt to make some phone calls and go on a few interviews even if you think you're unlikely to get the job. Even better, you could figure out a way to start your own company.
Get into web dev? Seriously, the level of basic CRUD I have to do building web apps is pretty low, even when there's a DB.
For CRUD in Windows Forms applications developed in C# .NET, RocketFramework is the answer.

Practices for programming in a scientific environment? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
Background
Last year, I did an internship in a physics research group at a university. In this group, we mostly used LabVIEW to write programs for controlling our setups, doing data acquisition and analyzing our data. For the first two purposes, that works quite OK, but for data analysis, it's a real pain. On top of that, everyone was mostly self-taught, so code that was written was generally quite a mess (no wonder that every PhD quickly decided to rewrite everything from scratch). Version control was unknown, and impossible to set up because of strict software and network regulations from the IT department.
Now, things actually worked out surprisingly OK, but how do people in the natural sciences do their software development?
Questions
Some concrete questions:
What languages/environments have you used for developing scientific software, especially data analysis? What libraries? (for example, what do you use for plotting?)
Was there any training for people without any significant background in programming?
Did you have anything like version control, and bug tracking?
How would you go about trying to create a decent environment for programming, without getting too much in the way of the individual scientists (especially physicists are stubborn people!)
Summary of answers thus far
The answers (or my interpretation of them) thus far: (2008-10-11)
Languages/packages that seem to be the most widely used:
LabVIEW
Python
with SciPy, NumPy, PyLab, etc. (See also Brandon's reply for downloads and links)
C/C++
MATLAB
Version control is used by nearly all respondents; bug tracking and other processes are much less common.
The Software Carpentry course is a good way to teach programming and development techniques to scientists.
How to improve things?
Don't force people to follow strict protocols.
Set up an environment yourself, and show the benefits to others. Help them to start working with version control, bug tracking, etc. themselves.
Reviewing other people's code can help, but be aware that not everyone may appreciate that.
What languages/environments have you used for developing scientific software, esp. data analysis? What libraries? (E.g., what do you use for plotting?)
I used to work for Enthought, the primary corporate sponsor of SciPy. We collaborated with scientists from the companies that contracted Enthought for custom software development. Python/SciPy seemed to be a comfortable environment for scientists. It's much less intimidating to get started with than say C++ or Java if you're a scientist without a software background.
The Enthought Python Distribution comes with all the scientific computing libraries, including analysis, plotting, 3D visualization, etc.
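As a small taste of why that stack feels comfortable: a minimal analysis-and-plot sketch using NumPy and Matplotlib (the pylab-style interface), with synthetic data made up for illustration:

```python
# Minimal NumPy + Matplotlib sketch: fit and plot noisy synthetic data.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 10, 100)
y = 2.5 * x + np.random.normal(scale=2.0, size=x.size)   # fake measurements

slope, intercept = np.polyfit(x, y, 1)                    # least-squares line

plt.plot(x, y, ".", label="data")
plt.plot(x, slope * x + intercept,
         label=f"fit: y = {slope:.2f}x + {intercept:.2f}")
plt.xlabel("x")
plt.ylabel("y")
plt.legend()
plt.show()
```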
Was there any training for people without any significant background in programming?
Enthought does offer SciPy training and the SciPy community is pretty good about answering questions on the mailing lists.
Did you have anything like version control, bug tracking?
Yes, and yes (Subversion and Trac). Since we were working collaboratively with the scientists (and typically remotely from them), version control and bug tracking were essential. It took some coaching to get some scientists to internalize the benefits of version control.
How would you go about trying to create a decent environment for programming, without getting too much in the way of the individual scientists (esp. physicists are stubborn people!)
Make sure they are familiarized with the tool chain. It takes an investment up front, but it will make them feel less inclined to reject it in favor of something more familiar (Excel). When the tools fail them (and they will), make sure they have a place to go for help — mailing lists, user groups, other scientists and software developers in the organization. The more help there is to get them back to doing physics the better.
The course Software Carpentry is aimed specifically at people doing scientific computing and aims to teach the basics and lessons of software engineering, and how best to apply them to projects.
It covers topics like version control, debugging, testing, scripting and various other issues.
I've listened to about 8 or 9 of the lectures and think they are highly recommended.
Edit: The MP3s of the lectures are available as well.
Nuclear/particle physics here.
Major programming work used to be done mostly in Fortran using CERNLIB (PAW, MINUIT, ...) and GEANT3; recently it has mostly been done in C++ with ROOT and Geant4. There are a number of other libraries and tools in specialized use, and LabVIEW sees some use here and there.
Data acquisition in my end of this business has often meant fairly low level work. Often in C, sometimes even in assembly, but this is dying out as the hardware gets more capable. On the other hand, many of the boards are now built with FPGAs which need gate twiddling...
One-offs, graphical interfaces, etc. use almost anything (Tcl/Tk used to be big, and I've been seeing more Perl/Tk and Python/Tk lately) including a number of packages that exist mostly inside the particle physics community.
Many people writing code have little or no formal training, and process is transmitted very unevenly by oral tradition, but most of the software group leaders take process seriously and read as much as necessary to make up their deficiencies in this area.
Version control for the main tools is ubiquitous. But many individual programmers neglect it for their smaller tasks. Formal bug tracking tools are less common, as are nightly builds, unit testing, and regression tests.
To improve things:
Get on the good side of the local software leaders
Implement the process you want to use in your own area, and encourage those you let in to use it too.
Wait. Physicists are empirical people. If it helps, they will (eventually!) notice.
One more suggestion for improving things.
Put a little time into helping anyone you work directly with. Review their code. Tell them about algorithmic complexity/code generation/DRY or whatever basic thing they never learned because some professor threw a Fortran book at them once and said "make it work". Indoctrinate them on process issues. They are smart people, and they will learn if you give them a chance.
This might be slightly tangential, but hopefully relevant.
I used to work for National Instruments, R&D, where I wrote software for NI RF & Communication toolkits. We used LabVIEW quite a bit, and here are the practices we followed:
Source control. NI uses Perforce. We did the regular thing - dev/trunk branches, continuous integration, the works.
We wrote automated test suites.
We had a few people who came in with a background in signal processing and communication. We used to have regular code reviews, and best practices documents to make sure their code was up to the mark.
Despite the code reviews, there were a few occasions when "software guys" like me had to rewrite some of this code for efficiency.
I know exactly what you mean about stubborn people! We had folks who used to think that pointing out a potential performance improvement in their code was a direct personal insult! It goes without saying that this calls for good management. I thought the best way to deal with these folks is to go slowly, not press too hard for changes and, if necessary, be prepared to do the dirty work. [Example: write a test suite for their code.]
I'm not exactly a 'natural' scientist (I study transportation) but am an academic who writes a lot of my own software for data analysis. I try to write as much as I can in Python, but sometimes I'm forced to use other languages when I'm working on extending or customizing an existing software tool. There is very little programming training in my field. Most folks are either self-taught, or learned their programming skills from classes taken previously or outside the discipline.
I'm a big fan of version control. I used Vault running on my home server for all the code for my dissertation. Right now I'm trying to get the department to set up a Subversion server, but my guess is I will be the only one who uses it, at least at first. I've played around a bit with FogBugz, but unlike version control, I don't think that's nearly as useful for a one-man team.
As for encouraging others to use version control and the like, that's really the problem I'm facing now. I'm planning on forcing my grad students to use it on research projects they're doing for me, and encouraging them to use it for their own research. If I teach a class involving programming, I'll probably force the students to use version control there too (grading them on what's in the repository). As far as my colleagues and their grad students go, all I can really do is make a server available and rely on gentle persuasion and setting a good example. Frankly, at this point I think it's more important to get them doing regular backups than get them on source control (some folks are carrying around the only copy of their research data on USB flash drives).
1.) Scripting languages are popular these days for most things due to better hardware. Perl/Python/Lisp are prevalent for lightweight applications (automation, light computation); I see a lot of Perl at my work (computational EM) since we like Unix/Linux. For performance stuff, C/C++/Fortran are typically used. For parallel computing, we usually manually parallelize runs in EM as opposed to having a program implicitly do it (i.e., split up the jobs by look angle when computing radar cross sections).
2.) We just kind of throw people into the mix here. A lot of the code we have is very messy, but scientists are typically a scatterbrained bunch that don't mind that sort of thing. Not ideal, but we have things to deliver and we're severely understaffed. We're slowly getting better.
3.) We use SVN; however, we do not have bug tracking software. About as good as it gets for us is a txt file that tells you where specific bugs are.
4.) My suggestion for implementing best practices for scientists: do it slowly. As scientists, we typically don't ship products. No one in science makes a name for himself by having clean, maintainable code. They get recognition from the results of that code, typically. They need to see justification for spending time on learning software practices. Slowly introduce new concepts and try to get them to follow; they're scientists, so after their own empirical evidence confirms the usefulness of things like version control, they will begin to use it all the time!
I'd highly recommend reading "What Every Computer Scientist Should Know About Floating-Point Arithmetic". A lot of problems I encounter on a regular basis come from issues with floating point programming.
I am a physicist working in the field of condensed matter physics, building classical and quantum models.
Languages:
C++ -- very versatile: can be used for anything, good speed, but it can be a bit inconvenient when it comes to MPI
Octave -- good for some supplementary calculations, very convenient and productive
Libraries:
Armadillo/Blitz++ -- fast array/matrix/cube abstractions for C++
Eigen/Armadillo -- linear algebra
GSL -- to use with C
LAPACK/BLAS/ATLAS -- extremely big and fast, but less convenient (and written in FORTRAN)
Graphics:
GNUPlot -- it has very clean and neat output, but not that productive sometimes
Origin -- very convenient for plotting
Development tools:
Vim + plugins -- it works great for me
GDB -- a great debugging tool when working with C/C++
Code::Blocks -- I used it for some time and found it quite comfortable, but Vim is still better in my opinion.
I work as a physicist in a UK university.
Perhaps I should emphasise that different areas of research have different emphasis on programming. Particle physicists (like dmckee) do computational modelling almost exclusively and may collaborate on large software projects, whereas people in fields like my own (condensed matter) write code relatively infrequently. I suspect most scientists fall into the latter camp. I would say coding skills are usually seen as useful in physics, but not essential, much like physics/maths skills are seen as useful for programmers but not essential. With this in mind...
What languages/environments have you used for developing scientific software, esp. data analysis? What libraries? (E.g., what do you use for plotting?)
Commonly, data analysis and plotting are done using generic data analysis packages such as IGOR Pro, Origin or KaleidaGraph, which can be thought of as 'Excel plus'. These packages typically have a scripting language that can be used to automate things. More specialist analysis may have a dedicated utility for the job that generally was written a long time ago, that no one has the source for, and that is pretty buggy. Some more techie types might use the languages that have been mentioned (Python, R, MATLAB, with gnuplot for plotting).
Control software is commonly done in LabVIEW, although we actually use Delphi, which is somewhat unusual.
Was there any training for people without any significant background in programming?
I've been to seminars on grid computing, 3D visualisation, learning Boost, etc., given by both universities I've been at. As an undergraduate we were taught VBA for Excel and MATLAB, but C/MATLAB/LabVIEW is more common.
Did you have anything like version control, bug tracking?
No, although people do have personal development setups. Our code base is in a shared folder on a 'server', which is kept current with a syncing tool.
How would you go about trying to create a decent environment for programming, without getting too much in the way of the individual scientists (esp. physicists are stubborn people!)
One step at a time! I am trying to replace the shared folder with something a bit more solid; perhaps finding an SVN client which mimics the current syncing tool's behaviour would help.
I'd say though on the whole, for most natural science projects, time is generally better spent doing research!
Ex-academic physicist and now industrial physicist in the UK here:
What languages/environments have you used for developing scientific software, esp. data analysis? What libraries? (E.g., what do you use for plotting?)
I mainly use MATLAB these days (easy access to visualisation functions and maths). I used to use Fortran a lot, and IDL. I have used C (but I'm more a reader than a writer of C) and Excel macros (ugly and confusing). I currently need to be able to read Java and C++ (but I can't really program in them), and I've hacked Python as well. For my own entertainment, I'm now doing some programming in C# (mainly to get portability / low cost / pretty interfaces). I can write Fortran in pretty much any language I'm presented with ;-)
Was there any training for people without any significant background in programming?
Most (all?) undergraduate physics courses will have a small programming course, usually in C, Fortran or MATLAB, but it's the real basics. I'd really like to have had some training in software engineering at some point (revision control / testing / designing medium-scale systems).
Did you have anything like version control, bug tracking?
I started using Subversion / TortoiseSVN relatively recently. Groups I've worked with in the past have used revision control. I don't know any academic group which uses formal bug tracking software. I still don't use any sort of systematic testing.
How would you go about trying to create a decent environment for programming, without getting too much in the way of the individual scientists (esp. physicists are stubborn people!)
I would try to introduce some software engineering ideas at undergraduate level and then reinforce them by practice at graduate level, also provide pointers to resources like the Software Carpentry course mentioned above.
I'd expect that a significant fraction of academic physicists will be writing software (not necessarily all though) and they are in dire need of at least an introduction to ideas in software engineering.
What languages/environments have you used for developing scientific software, esp. data analysis? What libraries? (E.g., what do you use for plotting?)
Python, NumPy and pylab (plotting).
Was there any training for people without any significant background in programming?
No, but I was working in a multimedia research lab, so almost everybody had a computer science background.
Did you have anything like version control, bug tracking?
Yes, Subversion for version control, Trac for bug tracking and a wiki. You can get free bug tracker/version control hosting from http://www.assembla.com/ if their TOS fits your project.
How would you go about trying to create a decent environment for programming, without getting too much in the way of the individual scientists (esp. physicists are stubborn people!).
Make sure the infrastructure is set up and well maintained and try to sell the benefits of source control.
I'm a statistician at a university in the UK. Generally people here use R for data analysis; it's fairly easy to learn if you know C/Perl. Its real power is in the way you can import and modify data interactively. It's very easy to take a number of, say, CSV (or Excel) files and merge them, create new columns based on others, and then throw that into a GLM, GAM or some other model. Plotting is trivial too and doesn't require knowledge of a whole new language (like PGPLOT or GNUPLOT). Of course, you also have the advantage of a bunch of built-in features (from simple things like mean and standard deviation all the way to neural networks, splines and GL plotting).
Having said this, there are a couple of issues. With very large datasets R can become very slow (I've only really seen this with >50,000x30 datasets), and since it's interpreted, you don't get the advantage of Fortran/C in this respect. But you can (very easily) get R to call C and Fortran shared libraries (either from something like Netlib or ones you've written yourself). So, a usual workflow would be to:
Work out what to do.
Prototype the code in R.
Run some preliminary analyses.
Re-write the slow code into C or Fortran and call that from R.
Which works very well for me.
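For readers coming from Python rather than R, a rough equivalent of the merge-derive-model steps described above might look like this with pandas and statsmodels (the data and column names here are synthetic, made up purely for illustration):

```python
# Rough Python analogue of the R workflow above: merge two tables,
# derive a column, fit a GLM. Data and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

subjects = pd.DataFrame({"id": [1, 2, 3, 4], "age": [23, 35, 41, 29]})
scores = pd.DataFrame({"id": [1, 2, 3, 4], "score": [2.1, 3.4, 4.0, 2.8]})

df = subjects.merge(scores, on="id")     # like merging two CSV files in R
df["age_decades"] = df["age"] / 10       # new column based on another

model = smf.glm("score ~ age_decades", data=df).fit()   # Gaussian by default
print(model.summary())
```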
I'm one of the only people in my department (of >100 people) using version control (in my case, using Git with github.com). This is rather worrying, but they just don't seem to be keen on trying it out and are content with passing zip files around (yuck).
My suggestion would be to continue using LabVIEW for the acquisition (and perhaps trying to get your co-workers to agree on a toolset for acquisition and making it available for all) and then move to exporting the data into a CSV (or similar) and doing the analysis in R. There's really very little point in re-inventing the wheel in this respect.
What languages/environments have you used for developing scientific software, esp. data analysis? What libraries? (E.g., what do you use for plotting?)
My undergraduate physics department taught LabVIEW classes and used it extensively in its research projects.
The other alternative is MATLAB, in which I have no experience. There are camps for either product; each has its own advantages/disadvantages. Depending on what kind of problems you need to solve, one package may be preferable to the other.
Regarding data analysis, you can use whatever kind of number cruncher you want. Ideally, you can do the hard calculations in language X and format the output to plot nicely in Excel, Mathcad, Mathematica, or whatever the flavor-du-jour plotting system is. Don't expect standardization here.
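For instance, that "crunch in language X, plot elsewhere" handoff can be as simple as writing a CSV file that Excel or any plotting package opens directly. A sketch (the file and column names are made up):

```python
# Sketch of the "crunch numbers, plot elsewhere" handoff: write results
# to a CSV that Excel/Mathcad/etc. can open. Names are hypothetical.
import csv
import math

with open("results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["t", "amplitude"])              # header row for the plot
    for i in range(100):
        t = i * 0.1
        writer.writerow([t, math.sin(t) * math.exp(-0.1 * t)])
```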
Did you have anything like version control, bug tracking?
Looking back, we didn't, and it would have been easier for us all if we had. Nothing like breaking everything and struggling for hours to fix it!
Definitely use source control for any common code. Encourage individuals to write their code in a manner that could be made more generic. This is really just coding best practices. Really, you should have them teaching (or taking) a computer science class so they can get the basics.
How would you go about trying to create a decent environment for programming, without getting too much in the way of the individual scientists (esp. physicists are stubborn people!)
There is a clear split between data acquisition (DAQ) and data analysis. Meaning, it's possible to standardize on the DAQ side and then allow the scientists to play with the data in the program of their choice.
Another good option is Scilab. It has graphic modules à la LabVIEW, it has its own programming language and you can also embed Fortran and C code, for example. It's being used in public and private sectors, including big industrial companies. And it's free.
About versioning, some prefer Mercurial, as it gives you more freedom in managing and defining the repositories. I have no experience with it, however.
For plotting I use Matplotlib. I will soon have to make animations, and I've seen good results using MEncoder. Here is an example including an audio track.
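As a side note on those animations: Matplotlib's own animation module is another route (instead of stitching frames with an external encoder), though which works best depends on your setup. A minimal sketch with synthetic data:

```python
# Minimal Matplotlib animation sketch (synthetic data): a moving sine wave.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

fig, ax = plt.subplots()
x = np.linspace(0, 2 * np.pi, 200)
line, = ax.plot(x, np.sin(x))

def update(frame):
    line.set_ydata(np.sin(x + 0.1 * frame))   # shift the phase each frame
    return line,

anim = FuncAnimation(fig, update, frames=100, interval=50, blit=True)
plt.show()
```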
Finally, I suggest going modular, that is, trying to keep main pieces of code in different files, so code revision, understanding, maintenance and improvement will be easier. I have written, for example, a Python module for file integrity testing, another for image processing sequences, etc.
You should also consider developing with the use of a debugger that allows you to check variable contents at settable breakpoints in the code, instead of using print lines.
I have used Eclipse for Python and Fortran development (although I got a false bug when compiling a short Fortran program with it, but it may have been a bad configuration), and I'm starting to use the Eric IDE for Python. It allows you to debug, manage versioning with SVN, it has an embedded console, it can do refactoring with Bicycle Repair Man (it can use another tool, too), you have unittest, etc. A lighter alternative for Python is IDLE, included with Python since version 2.3.
As a few hints, I also suggest:
Not using single-character variables. When you want to search for appearances, you will get results everywhere. Some argue that a decent IDE makes this easier, but then you will depend on having permanent access to the IDE. Even using ii, jj and kk can be enough, although this choice will depend on your language. (Double vowels would be less useful if code comments are made in Estonian, for instance.)
Commenting the code from the very beginning.
For critical applications, sometimes it's better to rely on older language/compiler versions (major releases), which are more stable and better debugged.
Of course, you can have more optimized code in later versions, fixed bugs, etc., but I'm talking about using Fortran 95 instead of 2003, Python 2.5.4 instead of 3.0, and so on. (Especially when a new version breaks backwards compatibility.) Lots of improvements usually introduce lots of bugs. Still, this will depend on the specific application!
Note that this is a personal choice, many people could argue against this.
Use redundant and automated backups! (With version control.)
Definitely, use Subversion to keep current, work-in-progress, and stable snapshot copies of source code. This includes C++, Java etc. for homegrown software tools, and quickie scripts for one-off processing.
With the strong leaning in science and applied engineering toward a "lone cowboy" development methodology, the usual practice of organizing the repository into trunk, tags, branches and whatever else it was - don't bother! Scientists and their lab technicians like to twirl knobs, wiggle electrodes and chase vacuum leaks. It's enough of a job to get everyone to agree to, say, Python/NumPy, or to follow some naming convention; forget trying to make them follow arcane software developer practices and conventions.
For source code management, centralized systems such as Subversion are superior for scientific use due to the clear single point of truth (SPOT). Logging of changes and the ability to recall versions of any file, without having to chase down where to find something, has huge record-keeping advantages. Tools like Git and Monotone: oh my gosh, the chaos I can imagine would follow! Having clear-cut records of just what version of hack-job scripts were used while toying with the new sensor when that Higgs boson went by, or that supernova blew up, will lead to happiness.
What languages/environments have you used for developing scientific software, esp. data analysis? What libraries? (E.g., what do you use for plotting?)
Languages I have used for numerics and scientific-related stuff:
C (slow development, too much debugging, almost impossible to write reusable code)
C++ (and I learned to hate it -- development isn't as slow as with C, but it can be a pain. Templates and classes were cool initially, but after a while I realized that I was fighting them all the time and finding workarounds for language design problems)
Common Lisp, which was OK, but not widely used for scientific computing. Not easy to integrate with C (compared to other languages), but it works
Scheme. This one became my personal choice.
My editor is Emacs, although I do use vim for quick stuff like editing configuration files.
For plotting, I usually generate a text file and feed it into gnuplot.
For data analysis, I usually generate a text file and use GNU R.
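(If it helps to see that workflow concretely: here's a rough Python sketch of the same text-file-plus-gnuplot approach -- the author uses Scheme, so this is just an illustration. It assumes gnuplot is installed and on the PATH, and the file names are placeholders.)

    # Minimal sketch of the "write a text file, feed it to gnuplot" workflow.
    import math
    import subprocess

    # Step 1: dump the data as plain whitespace-separated columns.
    with open("data.txt", "w") as f:
        for i in range(100):
            x = i / 10.0
            f.write(f"{x} {math.sin(x)}\n")

    # Step 2: pipe a short script into gnuplot to render the plot.
    script = b"""
    set terminal png
    set output 'sine.png'
    plot 'data.txt' using 1:2 with lines title 'sin(x)'
    """
    subprocess.run(["gnuplot"], input=script, check=True)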
I see lots of people here using FORTRAN (mostly 77, but some 90), lots of Java and some Python. I don't like those, so I don't use them.
Was there any training for people without any significant background in programming?
I think this doesn't apply to me, since I graduated in CS -- but where I work there is no formal training; people (engineers, physicists, mathematicians) just help each other.
Did you have anything like version control, bug tracking?
Version control is absolutely important! I keep my code and data on three different machines, on two different sides of the world, in Git repositories. I sync them all the time (so I have version control and backups!). I don't do bug tracking, although I may start doing that.
But my colleagues don't use a bug tracker or version control at all.
How would you go about trying to create a decent environment for programming, without getting too much in the way of the individual scientists (esp. physicists are stubborn people!)?
First, I'd give them as much freedom as possible. (At the university where I work I could choose between having someone install Ubuntu or Windows, or installing my own OS -- I chose to install my own. I don't have support from them and I'm responsible for anything that happens with my machine, including security issues, but I do whatever I want with it.)
Second, I'd see what they are used to, and make it work (need FORTRAN? We'll set it up. Need C++? No problem. Mathematica? OK, we'll buy a license). Then I'd see how many of them would like to learn "additional tools" to help them be more productive (don't say "different" tools; say "additional", so it won't seem like anyone will "lose" or have to "let go" of anything). Start with editors, then see if there are groups who would like to use a VCS to sync their work (hey, you can stay home and send your code through SVN or Git -- wouldn't that be great?), and so on.
Don't impose -- show examples of how cool these tools are. Do some data analysis using R, and show them how easy it was. Show nice graphics, and explain how you created them (but start with simple examples, so you can explain them quickly).
I would suggest F# as a potential candidate for performing science-related manipulations, given its strong semantic ties to mathematical constructs.
Also, its support for units of measure, as written about here, makes a lot of sense for ensuring proper translation between the mathematical model and the implementation source code.
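(To illustrate what units of measure buy you -- this is not F# syntax, just a toy runtime analogue in Python; F# does the equivalent check at compile time, and the Quantity class below is entirely hypothetical:)

    # Toy runtime analogue of units of measure. Dimensions are tracked as
    # (length, time) exponents; mismatched units raise an error instead of
    # silently producing nonsense.
    class Quantity:
        def __init__(self, value, dims):
            self.value = value
            self.dims = dims   # e.g. (1, 0) = metres, (0, 1) = seconds

        def __add__(self, other):
            if self.dims != other.dims:
                raise TypeError(f"unit mismatch: {self.dims} vs {other.dims}")
            return Quantity(self.value + other.value, self.dims)

        def __truediv__(self, other):
            dims = tuple(a - b for a, b in zip(self.dims, other.dims))
            return Quantity(self.value / other.value, dims)

        def __repr__(self):
            return f"Quantity({self.value}, dims={self.dims})"

    distance = Quantity(100.0, (1, 0))   # 100 m
    time = Quantity(9.58, (0, 1))        # 9.58 s
    speed = distance / time              # dims become (1, -1), i.e. m/s
    # distance + time                    # would raise TypeError: unit mismatch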
First of all, I would definitely go with a scripting language, to avoid having to explain a lot of extra things (for example, manual memory management is -- mostly -- OK if you are writing low-level, performance-sensitive stuff, but for somebody who just wants to use the computer as an upgraded scientific calculator it's definitely overkill). Also, look around for something specific to your domain (as R is for statistics). This has the advantage of working with concepts the users are already familiar with, and of having specialized code for specific situations (for example calculating standard deviations, applying statistical tests, etc. in the case of R).
If you wish to use a more generic scripting language, I would go with Python. Two things it has going for it are:
The interactive shell where you can experiment
Its clear (although sometimes lengthy) syntax
As an added advantage, it has libraries for most of the things you would want to do with it.
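(For instance, the "upgraded scientific calculator" use looks like this in the interactive shell, using only the standard library; the numbers are made up and the output is shown approximately:)

    >>> from statistics import mean, stdev
    >>> samples = [12.1, 11.9, 12.4, 12.0, 12.2]   # made-up measurements
    >>> mean(samples)
    12.12
    >>> round(stdev(samples), 3)                   # sample standard deviation
    0.192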
I'm no expert in this area, but I've always understood that this is what MATLAB was created for. There is a way to integrate MATLAB with SVN for source control as well.

Coming up to speed on the programming environment [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Want to improve this question? Update the question so it's on-topic for Stack Overflow.
Closed 10 years ago.
Improve this question
I'm not a full-time software guy. In fact, in the last ten years, 90% of my work has been either on the hardware or in low-level (embedded) code.
But the other 10% involves writing shell scripts for development tools, making kernel changes to add special features, and writing GUI applications for end-users.
The problem is that I find myself facing significant holes in my knowledge, often because it's been years since I've done "X", and I've either forgotten, or the environment has changed.
Every so often, there are threads on TheDailyWTF.com along the lines of "WTF: the guy spent all day writing tons of code, when he could have called foobar() in library baz". I've been there myself, because I don't remember much beyond the #include <stdio.h> stuff (for example), and my quick search somehow missed the right library.
What methods have you found effective to crash-learn and/or crash-refresh yourself in programming environments that you rarely touch?
Ask developers you know that work in the environment that you are interested in.
Search the web a lot.
Ask specific questions in relevant IRC channels (Freenode is great).
Ask specific questions on StackOverflow and other sites.
There really isn't any substitute for being "in the daily flow" of the programming environment in question. Having a good feel for the current state of the art is something you only get from experience, as I'm sure you can verify in your own areas of expertise.
I try to keep up with general news about languages I'm interested in but am not necessarily using at the moment. Being able to follow the general changes helps a lot when you have to pick a language up again.
Beyond that, I personally find it easiest to grab an up-to-date reference book and code a few basic things to get used to the environment again; e.g., as a web programmer I'd make a simple CRUD app, or a quick web service/client.
For frameworks/APIs (such as a JavaScript framework or a widget library):
Quickly scan through the entire API documentation; get a glimpse of all that's out there instead of picking the first method that seems to fit your needs.
If available, glance at the source code of the framework to see how the API was intended to be used. Seeing what's behind the curtain helps. Also, some of the methods will have been used internally, showcasing their true intent.
Don't necessarily always trust existing code (Googled, from co-workers, from books), since not everyone does the due diligence to find out the most proper way to use an API. Sometimes even the samples in API documentation are out of date.
In newer full-featured environments like Java, .NET, and Python, there are library solutions to almost every common problem. Don't think "how can I program this in plain C", but "which library solves this problem for me?" It's an attitude shift. As far as resources, the library documentation for the three environments I mentioned are all good.
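(A small Python illustration of that attitude shift -- counting word frequencies by hand versus letting the standard library do it:)

    # The "plain C" instinct: roll the data structure by hand.
    counts = {}
    for word in "the cat sat on the mat".split():
        counts[word] = counts.get(word, 0) + 1

    # The "which library solves this?" instinct: the standard library
    # already has a dedicated type for exactly this job.
    from collections import Counter
    counts = Counter("the cat sat on the mat".split())
    print(counts.most_common(2))   # [('the', 2), ('cat', 1)]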
The best solution, I think, is to get a book on the topic/environment you need to catch up on.
Ask questions of developers you know who have experience in that area.
You can also check out newsgroups (Google Groups makes this easy) and forums. You can ask questions, but even reading 10 minutes of the latest popular questions for a particular topic/environment will keep you a little bit "in the know".
The same goes for blogs, too, if you can find a focused one. These are pretty rare, though, and I personally don't look to blogs to keep me "in the know" on a particular environment. (I find blogs most interesting in the "here's something neat", "here's how I failed and you can avoid it", or "general practice" areas.)
In addition to the answers above, I think what you are asking for will take a significant amount of your time, and you must be willing to spend that time to achieve your goals. My method would be pretty much the same as Owen's answer: get a reference book or tutorial and work through the examples, hacking in changes as you go to experiment with how any given thing works. I'd say, as a bare minimum, allocate an hour to do this every other day, at a time when you know you won't be interrupted. Any less, and you'll probably continue to struggle.
The best way to crash-learn is simple: just do it. Open your favorite browser and search for an "X tutorial", then start typing away. Once you've reached a certain level of comfort with X, look at other people's code; there is lots of open source out there, and somebody must have used X before. Look at how they solved certain problems and learn from it -- this is an easy way to verify that you are on the right track, and that you're doing things or thinking in patterns that other people would consider common sense.
Crash-refreshing something is much easier, since you have already climbed the learning curve once. The way I do this is to keep some of the examples and projects I wrote while learning; then I can easily refresh using my own material.
As for the library issue you mention: only improving your search skills will help with that one (although looking at how others solved the same problem helps as well).
Don't try to pick up every environment.
Focus on the one that's useful and/or interesting, and then pick a few quality blogs to regularly read or podcasts to listen to. You'll pick up the current state of the environment fairly quickly.
Concrete example: I've been out of the Java world for a long time, but I've been put on a Java project in the last few months. Since then I've listened to the Java Posse podcast and read a few blogs, and although I'm far from a Java guru I've got back up to speed without too much trouble.
Just a thought. While working on our code we know we need to work very hard to optimize the critical path, but on the non-critical path we usually don't spend much effort on optimization.
From your description you are working 90% on embedded and 10% on the rest; let's assume that in 50% of the rest you are spending more time than needed. So, by my calculation, you are optimizing only about 5% of your workflow...
Of course the usual Google/SO/forum search is useful before you do something new, but investing more than that is, in my opinion, a waste of time -- unless you want to spend some time just for fun or general education... :) But that is another story.
By the way, I'm in the same position. Last time I needed a GUI I used MFC (because I had used it some 10 years ago :) ), and I perfectly understand that I would probably get better results with C# and friends, but the learning curve just doesn't justify it, especially knowing that I need to mix the C code with the GUI.