VM with SIS installed - virtual-machine

I am looking for a virtual machine with Berkeley's SIS tool installed.
After a major effort, I managed to compile the tool from source in my own VM; however, even though it works most of the time, many functions fail with segmentation faults.
For this reason, I started looking for a VM with the tool already installed. So far, Google has not turned up anything useful.
Please let me know if you are aware of such a VM, or if you know how to correctly compile the tool from source - perhaps by advising on the best Linux distribution (and version) that works well with the tool.
PS: I could not find any suitable tags for this question except for virtual-machine. The tag SIS refers to something completely different. Electronic Design Automation (EDA) may be a more suitable tag than "logical-operators," but it also refers to something else.

Related

Needed: Program for easily switching developer environment

This might be the wrong forum, but it's filled with so many smart people that someone might know a solution.
One of my customers has given me several assignments that require quite similar yet very different development setups (different versions of class libraries and such). The problem is that each time I switch between the projects, I have to do a lot of configuring to get the compilation to work without pulling in the wrong versions of the tools used.
If I don't get it right, there can be a lot of cleanup afterwards.
It takes me at least an hour to switch projects. Often several.
Now, I realize that the customer has an issue with a lot of branching in their setup, and they are working on that, but that's a long process.
So... my question is: is there some tool that allows me to take "snapshots" of working project environments and switch between them?
I'm working on Windows.
There are many options. One of them is virtualization: running an entire operating system on top of your operating system. It will emulate a fake hard drive, fake network card, etc. This will run an entire desktop 'in a box'.
In recent years containerization has become very popular: running a separate environment (file system, OS configuration) while leaving the heavy lifting - networking, hardware, etc. - to the host (in your case, Windows). This is geared towards running a single application (though this is flexible) 'in a box'.
In your case, since you need to encapsulate an entire development environment, a virtual machine seems the best fit.
A no-cost solution is VirtualBox. Paid alternatives such as VMware may offer better usability, performance or features.
If your class libraries are in a specific location, you could containerize the compiler: you work on your PC, then build inside of a 'box' with its own environment. Docker is often used.
Depending on what language you are programming in, there may be a specific solution. Python has VirtualEnv, for example.

Are applications dependent on the environment where they were compiled?

We are getting a System.BadImageFormatException from our MSI installers. I have already read about target frameworks, and we have checked that we are targeting the correct framework (.NET Framework 4.5, the same as our QA machines).
We have exactly the same source code, but the MSI installer compiled by our build team fails, while the MSI installer compiled by us devs works. The question is: does the environment where an application is built and compiled affect the output (for example, MSI installers)?
There are basically two reasons for this error:
A cross-architecture call from 32-bit code to 64-bit (or vice versa). Different architectures require different MSI setups (see Heath Stewart's blog), so everything in a 32-bit setup (especially managed custom action code) should be explicitly 32-bit, and everything in a 64-bit install explicitly 64-bit. For example, when an x64 system encounters AnyCPU code it may load the x64 runtime, and a subsequent reference to a 32-bit assembly will then fail with this error.
A .NET framework runtime attempting to load the "wrong" framework version. The .NET 4 runtime is somewhat backwards compatible, so you are most likely to get this error when code expecting the .NET 2 runtime encounters a .NET 4 engine. The devil is in the details here, but again, this is much like the architecture issue: if anything loads the .NET 2 runtime and the calling sequence tries to make a .NET 4 assembly run in the 2.0 framework, it will fail with this message. A quick diagnostic for both causes is sketched below.
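To see which of these two causes you are hitting, it can help to log the bitness and CLR version from the failing process. A minimal, purely illustrative C# sketch (the class and names are mine, not anything from your installer):

    using System;

    // Minimal diagnostic: an AnyCPU assembly loads as a 64-bit process on
    // x64 Windows; if it then touches a 32-bit-only dependency, the loader
    // throws System.BadImageFormatException.
    class BitnessCheck
    {
        static void Main()
        {
            Console.WriteLine("64-bit OS:      " + Environment.Is64BitOperatingSystem);
            Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
            Console.WriteLine("CLR version:    " + Environment.Version);
        }
    }

If the process and the failing assembly disagree on architecture, you have found cause 1; if the CLR version is not what the assembly expects, cause 2.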
Having said that, it's not clear exactly how you are calling the managed code - whether through DTF or something else (such as the Visual Studio InstallUtilLib mechanism). And finally, the machine you build on makes no difference to the eventual runtime environment. It's no different from a code file that works on one machine but fails on another because (for example) it can't find the C++ runtime. The issue isn't the build machine; it's the environment of the target machine.
Some Suggested Debugging Steps
So is it the actual MSI file that triggers these errors, or the application after installation?
Below are some thoughts and questions to consider when trying to debug issues such as these (in no particular order). My bet is on issue 3 in this first list:
1. Does this exception occur as you run the MSI itself (or is it a setup.exe?), or as you try to launch the application after installation? Just to verify - I assume the MSI.
2. Do you have custom actions in the setup? If you have managed-code custom actions in your MSI, what platforms do you target in your build? AnyCPU, I presume? Please verify. I think there are some issues with COM interop here, but I am fuzzy on the details. Sometimes you may have to pick a specific platform, and in this case you can get such error messages (bad image). See the section "Managed Code" below for a whole "rant" about managed code and deployment - and some problems that may result.
3. Regardless of the above, in your WiX source file, what is the value of the Platform attribute in the Product element? Possible values: x86, x64, intel64, intel, arm, ia64. Please report (and try to test with other values as well - x86 and x64, for example). This affects the MSI's platform setting. If you don't use WiX, open the compiled MSI file and check the summary stream for the Platform setting. Using Orca this is View => Summary Information... - look for Platform. (A programmatic way to read this setting is sketched after this list.)
4. Do you have any malware scanners, security software or other "potential blockers" for MSI compilation and / or installation on your build computer? Or on the test system where you try to install? (We must always mention these issues - people can waste days if we don't - even if it rarely seems to be the only cause.)
5. Is this a localized MSI using Asian characters (or Arabic, or any other complex character set)? I just mention this - frankly I don't see how it is 100% relevant, but I want to clarify this "variable" for your scenario (i.e., can we eliminate it as a potential error source). It would generally cause runtime errors, not System.BadImageFormatException issues - I believe.
6. I assume a compare of the different MSI files may not work because one of the files is a "bad image"? Did you try? Maybe it is still a valid COM structured storage file - just one the msiexec.exe engine can't handle. If so, tools may be able to read the content inside the file just fine - I don't know, give it a try.
For your scenario: I initially thought a single, compiled MSI behaved differently in different locations (environments) - and hence suggested checking for any damage in transit (network issues, samba issues, storage issues, malware issues, etc.) by doing a binary diff on the copies in the different locations (bit-level comparison). Since you seem to compile two (or more) MSI files from the same sources, such a binary compare is obviously meaningless: differences are certain.
However, a "content compare" could tell you something - this compares the actual content in the tables / streams inside the MSIs. I think I will add a Q/A on how to compare MSI files that I can link to from here (added: How can I compare the content of two (or more) MSI files?). This presumes that the MSI is readable, even if it is not runnable. The only way to know is to try - for example with the sketch below.
I hope and believe that the above list should help you sort out your problem.
I wrote myself off a cliff below on the subject of managed code issues. The idea was to describe a couple of issues to check, but it became a long discussion. I may delete all the stuff below and perhaps resurrect it elsewhere. It may not be relevant for you at all. The overall topic is managed code and how it can crash in new and "interesting" ways:
Managed Code
This is another one of those sprawling answers that got out of hand. I think it still has value, so I am leaving it in.
A couple of further issues with regard to .NET custom actions (managed code). I am far from an expert on this topic, since I shun them like the plague (for now - this may change over time).
Some of this veers quite a bit off topic for your purpose, but I will leave it in as general comments on managed code for MSI use.
MSI expert Chris Painter is the man for this topic - he has taken on this potential "world of pain" and seems to benefit from such custom actions too, but managed custom actions are generally accepted to be problematic - if you approach them in a naive way. Be pragmatic and weigh the benefits against the potential problems listed below.
A friendly piece of advice: for worldwide distribution I would never use managed code as of now - though it is "getting safer", we have to admit that. There are too many potential error sources for a large-scale distribution MSI package using such custom actions (home users may uninstall .NET, corporate users may see versions of .NET disabled, the whole list of problems below applies, and I fear "catch 22" uninstall problems more than anything - there is a whole section on this below, etc.).
As I said, I am not an expert, but there are many serious problems. Maybe Chris can correct me if they are "sorted" by now. The DTF framework (distributed along with WiX) supports embedding a managed custom action dll inside a regular win32 dll wrapper, which helps reliability. I will dig up a few links here for reference. Chris has been a pioneer and early adopter here.
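For reference, the DTF shape looks roughly like this - a hedged skeleton, not a recommendation (the method name is mine); the WiX toolchain then wraps the compiled assembly in the win32 dll mentioned above:

    using System;
    using Microsoft.Deployment.WindowsInstaller;

    public class CustomActions
    {
        [CustomAction]
        public static ActionResult LogEnvironment(Session session)
        {
            // Logging the runtime environment up front makes the CLR-version
            // and bitness problems discussed here much easier to diagnose.
            session.Log("CLR version: " + Environment.Version);
            session.Log("64-bit process: " + Environment.Is64BitProcess);
            return ActionResult.Success;
        }
    }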
Partial list of managed code problems for custom action use:
1. The .NET framework may be missing, disabled or corrupted (entirely, or in the version requested / needed by your code). Now, what if all your 3000 corporate packages have a .NET dll with managed code embedded in them? They can't even uninstall in this case - much less upgrade. More on this in the "pet peeve" paragraph below.
2. When targeting different versions of the .NET runtime with different custom actions, all will load the same CLR version. So they tell me (I could not believe it whilst reading it - please read it!). Enough for me to run for the hills :-). "This can blow up in any number of ways" is what I hear myself think. Apply suitable paranoia accordingly! The resident evil of all things rears its ugly head again, etc. Seriously, don't listen to paranoia, but be on the alert for serious problems. Is this problem manageable? I would have to say yes, but it is not a problem to ignore. Serious UAT / QA is needed on many different OS and .NET versions. Would a native dll do better? I think so.
3. Components installed to the GAC cannot be used as dependencies for your managed custom actions in the setup (chicken or the egg, I suppose). This has to do with the differing commit models of Fusion and MSI.
Bob Arnson has commented on this - check it out (he is on the core WiX team). I don't know if this still is his top issue with managed code - along with rollback.
Small digression: I have read Arnson stating that VBScript actions are worse than managed code (Painter certainly agrees, and so, definitely, does the WiX boss himself, Rob Mensching - blog). I think this is true in just about all cases, but not for corporate application packaging scenarios (which I have experience with) - or for ad-hoc testing that will never be used in production (quick and easy).
I describe the reasoning behind this pragmatic issue here: Windows Installer fails on Win 10 but not Win 7 using WIX. Essentially, anything compiled adds a source-control problem in the real, chaotic world - corporate packagers have to pick up each other's work on the fly, and a fully embedded, transparent source file in the MSI saves the day, every time. There is a skill-set issue as well, and there is more...
I do not recommend VBScript for anything but corporate use in controlled environments (standardized workstations). VBScript is not good enough for public, worldwide MSI releases in any shape or form. Such scripts can work for read-only custom actions that return no error codes and are set to ignore all runtime errors, but no - there are better ways.
UPDATE: I can add that in a pinch I would use VBScript in read-only custom actions in the GUI sequence (just a property-setter script) in order to get rid of the .NET framework as a dependency altogether. The time will come when the .NET framework is on all target machines, but it is not quite there yet (and even if it is there, it could be broken). Windows now actively repairs ActiveScript so it is always available - and MSI hosts its own ActiveScripting runtime - so the scripts will run, though you can still easily mess up the code yourself and make the custom actions horrendous.
I should add that my recommendation to use Javascript over VBScript in the link above will be removed soon. Javascript has proved just as bad as VBScript in practical use, with some added snags that are too detailed to go into. The enhanced exception handling offered by Javascript does not make up for the fact that the MSI API seems to have been tested with VBScript during its development. Javascript probably was not, and hence has a few clunky issues when working with the MSI API that are not immediately apparent. I have wasted costly time on this - I recommend you don't waste yours.
I also use scripting for testing, prototyping and debugging my MSI packages (to debug property settings, app searches, override command lines for testing, etc...). I find this the quickest way (who wants to compile something ad-hoc for this?). Just don't roll with your script test code for release! If using Installshield I use Installscript for such "scripting".
And for the future: one good use for managed code would be embedding it directly in the MSI in inspectable (and reusable) form - making custom actions white box - with full source embedded and full code access security too, making them unable to run with unrestrained elevated rights. Just thinking about what could come - "let us see what you are doing in this custom action of yours".
To elaborate on issue 1: managed code may hard-code a certain .NET runtime version that is not available. I guess this is probably the easier problem to deal with? Correct me if I am wrong, Chris - I am just a dabbler with this. Setting "latest version" could still cause issues, though...
Let me add a pet peeve of mine as well: if a managed custom action fails during uninstall due to a corrupt .NET framework (or for any other reason related to managed code - for example a design / security change in Windows itself delivered via Windows Update), you can't uninstall, and thereby can't (major-)upgrade, your existing installation. A serious catch 22 in my opinion. Try this when you have 3000 live packages and thousands of desktops to manage and the dll is embedded in each MSI...
Creating custom action code of any kind that triggers errors on uninstall / upgrade was my big fear when making a C++ custom action dll as well - so it is not unique to managed code. A classic error is to set custom actions after InstallFinalize or in the UI sequence to "check exit code" - a trivial error returned then causes full rollback of a major upgrade. A classic "catch 22": now you cannot upgrade without fixing the problem in the old product's uninstall sequence. One defensive pattern is sketched below.
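For illustration only (a C# / DTF sketch; names and structure are mine, and this is a trade-off, not a general recommendation): a non-essential cleanup action that logs, rather than fails on, errors during uninstall, so a trivial error cannot trigger the rollback catch-22 described above:

    using System;
    using Microsoft.Deployment.WindowsInstaller;

    public class CleanupActions
    {
        [CustomAction]
        public static ActionResult BestEffortCleanup(Session session)
        {
            try
            {
                // ... non-essential cleanup work here ...
                return ActionResult.Success;
            }
            catch (Exception ex)
            {
                session.Log("Cleanup failed, continuing uninstall: " + ex);
                // Returning Success keeps a cosmetic failure from rolling
                // back an uninstall or a major upgrade.
                return ActionResult.Success;
            }
        }
    }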
Despite this being a general problem for all custom action code, I feel the risk is heightened quite a bit with managed code. What if some weird policy change to the .NET framework makes all packages in a large corporation un-uninstallable and un-upgradeable because they all embed the same problematic custom action dll? Or worse yet, what if it is a Windows design change that you can't roll back?
A contingency should be available in such cases. This is the core reason I stay away from managed code entirely - I prefer being closer to the wire - fewer layers to depend upon. Minimal dependencies, minimal entanglements (no imperial entanglements). If a minimal-dependency C++ dll does not run, then the core of Windows is generally broken and the system needs a rebuild in most cases anyway. For .NET custom actions you would minimally have to fix the .NET framework (which might be easier than I think, for all I know - I don't think so, though).
I was looking at ways to make the DLL external to all corporate packages via a prerequisite package (ideally with a minimal, baseline DLL embedded in the setup itself as well, used if the external DLL is missing / not found). The idea was that the external DLL is preferred once available, and upgradable for all packages by a single, updated "prerequisite package". All 3000 packages fixed - all at once?
I never got around to determining the technical feasibility of this. Bear with me - I am getting off topic for your purpose. If the WiX guys are reading: what are the technical possibilities here, off the top of your head? Essentially I expect to hear "impossible" - and then I am done with it. When thinking about this I was preparing for potential problems with the embedded DLL in Asian and Arabic locations (potentially serious, unexpected, fatal runtime failures due to Unicode / code page issues), and also for any unexpected security changes in Windows - which we keep seeing: Windows 10 ransomware protection, which currently triggers intermittent runtime failures for files installed to user-profile folders, or the sudden need for admin rights for MSI repair (kb2918614), which appeared out of the blue on Vista, and whatever else they keep changing unexpectedly. I did not want to sit with thousands of un-upgradable, un-uninstallable packages already deployed to tens of thousands of machines.
My "last resort" contingency for corporate use was to "hack patch" all cached MSI files in the local, super-hidden MSI cache folder using a "home grown" patcher EXE deployed by a hotfix package. Generally insane in every way, but it looked technically possible (until digital signatures shuts off the possibility?). And for me the only acceptable "last resort" I could think of if tens of thousands of trading floor machines were hit by disaster suddenly.
I can think of at least two other options - one of which is to minor upgrade affected packages (lots of work, cleaner, guaranteed to work). The last option will not be mentioned :-) - (Voldemort, "those we do not speak of", etc...).
An auto generation feature for minor upgrade patches to patch the embedded custom action dll's was also on my list of contingencies - the minor upgrade would only patch the dll - no other changes. Then problems could be handled on a package-by-package basis. This patch should be available at the click of a button when pointing to a live MSI package in need of patching. An "embedded custom action dll hotfixer". A thing that should not ever be used if at all possible. Contingency "solutions" are rarely pretty.
My two cents: I can think of few scenarios where minimal dependencies are more important than for an embedded custom action in an MSI. It must work on any machine, in any state, in any language, in any location, in any installation mode (and uninstall is the catch 22 here) - ideally without any non-standard dependencies at all. I statically link C++ code for this very reason. For worldwide distribution I feel this is the only thing that is currently good enough: statically linked C++ code (with the possible exception of Installscript, from Installshield, which apparently now runs without dependencies - an embedded runtime? I don't know how they do it. In the olden days there were legendary problems with the required runtime prerequisite for the Installscript language; it should be fixed since version 12 of Installshield).
This is not a complete list. It is my "run for the hills list" :-).
No fear though - just be aware of it all - and use the benefits of managed code if they are substantial enough for you, but don't expect entirely smooth sailing is my take on it. I would be upfront with my manager about these potential bear traps, without sounding like a total, paranoid lunatic. A good manager will be able to "sell" any contingency plans as necessities, that you can get time to work on and even demonstrate quickly (believe me, attention span here is short - it has to be the quickest demo ever). The big question is whether you have one package to deal with, or thousands like we do in corporate deployment. Things change a lot for the latter. Risk must be minimized for all features that are embedded in all deployed packages.
If I am 100% honest, it is not as bad with managed code as it used to be. Using DTF and other frameworks has helped. But the potential runtime issues causing uninstall problems are worrying. A global change to the .NET framework in the company - and all your packages can no longer uninstall? Or a newer version of the .NET framework reveals bugs in the custom action not found when it was deployed? It may suddenly "manifest itself" on an attempted uninstall / upgrade. Manageable, but you will curse yourself...
I would prepare your support guys for the above managed code issues - they should know about the issues and really understand what .NET is about.
"We have never seen any problems with our managed code custom actions" - famous last words - to be honest.
If your target computers are uniform and standardized (SOE environment) - which is normal for corporations - then your packages may appear better than they really are (now this is true for packages with scripts too). Just wait for the next SOE version based on a new operating system... I would pilot test early with all packages in the package estate.
You could still face the irony that all target computers start failing in exactly the same way (Windows design changes in Windows Updates, security software updates that trigger interference, SOE updates that fail for some locations, etc...).
For worldwide distribution things are quite different, and things tend to fail in any number of ways that are hard to debug and fix, or even to work out at all. You normally have no access to the problem system - for starters. Maybe read some further comments in the "Complexity of Deployment" section here: Windows Installer and the creation of WiX.
So I would never use managed code for global distribution of a complex package - unless you are delivering a very specific product and know the nature of your target machines in more detail than normal. Cost / benefit.
I would have a contingency for what to do if many machines are affected by unforeseen triggers of "deadlocks" such as not being able to uninstall / upgrade. Some paranoia in this scenario, but not impossible. Silly "war games". Risk is for your manager to manage, and for you to handle technically.
Adding a link to an aging, but still valid FAQ entry from installsite.org on the topic of managed custom actions and their problems: How can I create Custom Actions in Managed Languages, like C#?.
And be skeptical of any custom actions in the first place!
Managed code just adds to custom action volatility. Custom actions are complex and difficult to get right in the first place. They run impersonated or in the wrong context unintentionally, they run twice unexpectedly, they don't run at all when expected to, they run in the wrong installation mode, they crash due to missing dependencies, they throw exceptions from bad coding that fail upgrades and uninstalls alike by triggering rollback, they hard-code references to localized folders so the setup crashes on non-English machines, you name it...
Built-in constructs in MSI itself, or pre-written custom actions (with rollback support) in frameworks such as WiX or commercial tools such as Installshield and Advanced Installer have been tested by thousands, millions or even billions (!) of users - and they are written by the best deployment experts available. Even for these components, bugs are still found - which says it all. Do you think you could do it better on your own? Always prefer ready-made, tested and maintained solutions - if available.
A whole rant about the problems with custom actions in general: Why is it a good idea to limit the use of custom actions in my WiX / MSI setups?
"Sources"
Some further links (some of this content may be showing its age by now, but these are trustworthy sources - not to be ignored - Mensching is the WiX benevolent dictator):
Don’t use managed code to write your custom actions!
Link to more details about the dangers of managed code custom actions in an MSI.
Managed Code CustomActions, no support on the way and here's why.

What is a native build environment?

I am simply reading information off the interwebs - currently the CMake about page - and I need this information to fill in the gaps; it helps to see the big picture.
Surely the answer is straightforward, I hope: what is a native build environment?
Context: I need to know how to build software on my machine (CodeBlocks, etc.), why I need to do this, the advantages of doing this, and so on. But first, I need to understand every piece of jargon I come across, and I could not find any explanation of exactly what a "native build environment" is, although I can speculate to some degree.
"Native" as in "runs directly in the host operating system" and not "runs in a virtual machine or emulator."
The particular point that CMake's about page is trying to convey is the manner in which CMake achieves cross-platform functionality: specifically not by virtualizing, but by cooperating/collaborating directly with the host system, and the "normal" ways the host system is used to doing things.
Is the build environment then just the directory holding all the garbage needed for a compiler to build the software?
That's an oversimplification - there's nothing to say that it's a single directory - but more or less, yes. The term is not jargon, it literally means "the state of the world" (aka, environment) needed for the build.
What would you call the other thing then, Non-native?
Sure, or virtualized, or emulated, or whatever other intermediate layer has been added.
Why do we need the distinction as well?
Why not? It's useful to have a concise, clear, simple term so we can communicate precisely and with minimal confusion and ambiguity.
Why 'non-native'? If you haven't already figured this out - there is something called cross-compilation.
Simply put: if I don't have access to the target hardware (or an equivalent virtual machine) on which the software needs to run, how do I develop this software on my host and package it to run on that target?
Cross-compilation addresses this by providing the necessary tools that perform the translation (and other important low-level work) to produce the final software. Such an environment for developing software is called non-native.
Well, I believe we need the term to name the technique.
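To make "native" concrete: a process can report the environment it actually runs in. A small illustrative C# sketch (assuming .NET Framework 4.7.1+ or .NET Core for RuntimeInformation); a cross-compiled binary would print the target machine's values when run there, never the build machine's:

    using System;
    using System.Runtime.InteropServices;

    class NativeInfo
    {
        static void Main()
        {
            // The "native" environment: what the host really is.
            Console.WriteLine("OS:           " + RuntimeInformation.OSDescription);
            Console.WriteLine("OS arch:      " + RuntimeInformation.OSArchitecture);
            Console.WriteLine("Process arch: " + RuntimeInformation.ProcessArchitecture);
        }
    }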

updating IDE old to new C++ Builder

I'm currently trying to compile an old program (made with C++ Builder 2 or 3) with the "current" Embarcadero RAD Studio XE2.
So I was wondering whether there is an easy way to use the old code, as Borland once claimed full backward compatibility... however, I couldn't find a project file, only source code (.cpp, .h, .res, etc.).
I tried to "add to project" the main .cpp, but there seem to be some wrong include paths... the code also seems to use the OWL package and includes its important source files...
I'm a bit confused about which type of main project to open first, since you need to create a new project before adding the source to it. As the resulting .exe has a GUI, I tried a form window first, but it may be better to use a console or service project, as the real form is produced within the code, as far as I understand.
So, after installing OWL and correcting the include paths, do you think it should run fine? Or is there something else to take care of?
If your old project was using OWL, you're probably well outside of the supported upgrade path.
That being said, valid C++ code should still compile and work, and I've heard of people using OWL with recent versions of C++Builder (via OWLNext).
Regarding your confusion as to which type of project to use, I believe a console application would be your best bet. A forms application is completely wrong: it will bring in the VCL and give you no end of problems trying to reconcile the different windowing systems. A service application is a completely different beast as well, and isn't meant for GUI applications. A console application should work, but you'll need more than that. The OWLNext project has a wiki that should help quite a bit.

Building Cross Platform app - recommendation

I need to build a fairly simple app but it needs to work on both PC and Mac.
It also needs to be redistributable on a disc or usb drive as a standalone desktop app.
Initially I thought AIR would be perfect for this (it ticks all the API requirements), but the difficulty is making it distributable, as the app would require the AIR runtime to be installed to run.
I came across Shu Player as an option as it seems to be able to package the AIR runtime with the app and do a (silent?) install.
However this seems to break the T&C from Adobe (as outlined here) so I'm not sure about the legality.
Another option could be Zinc but I haven't tested it so I'm not sure how well it'll fit the bill.
What would you recommend or suggest I check out?
Any suggestions much appreciated.
EDIT:
There are a few more discussions on Mono usage (though no real conclusion):
Here and Here
EDIT2:
Titanium could also fit the bill maybe, will check it out.
Any more comments from anyone?
EDIT3 (one year on): It's been almost a year since I posted this question, but it seems some people still come across it every now and then, and even contribute an answer.
Thought I'd update the question a bit. I did not get around to trying the Tcl/Tk option in the end; time constraints and uncertainty about compatibility with different OS versions led me to discard it as an option.
I did try Titanium for a bit, but though first impressions were OK, they are really pushing the mobile platform more than anything, and IMHO the desktop implementation suffers from that lack of attention. There are also some reports of problems with certain Visual Studio runtimes on some OSes (can't remember the details now, though)... So I discarded that too.
I ended up going with XULRunner. The two major appeals were:
Firefox seems to work out of the box on most OS versions, so I took it on good faith that a XULRunner app would likely be compatible with most systems. This saved me a lot of testing, and it turned out to run really well on all platforms; there hasn't been a single report of the app failing to start.
It's JavaScript, baby! The language learning curve was minimal. The main thing to work out is what the additional XPCOM interfaces are and how to query them.
On the down side:
I found troubleshooting errors a sometimes difficult task; the Venkman debugger is kinda clunky, and I ended up using the console more than anything.
The SQLite interface is a great asset for a desktop app, but I often struggled to find relevant error info when something didn't work - maybe I was doing it wrong.
It took a little while to work out how to package the app as a standalone app for both PC and Mac. The final approach was to have a "shell" Mac app and a shell PC app, plus a couple of "compile" scripts that copy the shells and add the custom source code in the correct location.
One last potential issue for some: due to the nature of XULRunner apps, your source code is deployed with the app. You can use obfuscation if you want, but that's something to keep in mind if you want to protect your intellectual property.
All in all, great platform for a cross-platform app. I'd highly recommend it.
Tcl/Tk has one of the best packaging solutions out there. You can easily wrap a cross-platform application (implemented in a fully working virtual filesystem) with a platform-specific binary to get a single file executable for just about any modern desktop system. Search google for the terms starkit, starpack and tclkit. Such wrapped binaries are tiny in comparison to many executables these days.
Many deride Tk as being "old" or "immature" but it's one of the oldest, most stable toolkits out there. It uses native widgets when such widgets exist.
One significant drawback of Tcl/Tk, however, is that it lacks any sort of printing support. If your application needs to print you'll have to be a bit creative. There are platform-specific solutions, and the ability to generate postscript documents, and libraries to create pdfs, but it takes a little extra effort.
Java is probably your best bet, although not all Windows PCs will necessarily have Java (most should). JavaFX is new enough that you can't count on it - you'll probably find a lot of machines running Java 1.5 or (shudder) 1.4. I believe recent Mac OS still ships with 1.5 (the latest version may have changed to 1.6).
Consider JavaFX
It would run everywhere with a modern JRE!
AIR could be an option, but only if you don't mind distributing two different files (the offline runtime installer and your app), and expecting the user to run one and then the other. You do have to submit an online form at Adobe's site saying you agree to distribute the offline installer as-is, rather than digging out individual DLLs or whatever, before they give you the installer.
Unfortunately there's currently no way to get both an AIR app and the runtime to install from one file though. I'm not sure what the deal with Shu is, or whether it's doing anything that isn't kosher.
I would recommend Zinc. It has all the functionality you require for desktop use; however, the last time I used it, it was a bit glitchy.
I got hung up trying to write a 6 MB file to disk; I thought it through and changed the code to write 512 KB chunks at a time (3 minutes of work, fast).
It probably still has some annoying little glitches, like making you think at root level, but the ease of use and the features are just way too sweet to ignore.