Portable implementation of XSLT node-set function - xslt-1.0

I am currently working on refactoring a large XSLT 1.0 library that includes several thousand XSLT files. The library was designed to run on MSXML and consequently has ms:node-set() calls littered throughout. It strikes me that if we ever need to port the library to a different XSLT engine, it is going to be a mission to go through and update all the references to the node-set function.
Is it possible to implement the node-set function in a more portable fashion, so there is a single point of change when it comes time to port the library? For example, define a single custom function, say my:node-set(), that wraps/overrides ms:node-set(), and point all the existing ms:node-set() references in the library at my:node-set() instead.
I am not interested in solutions that involve moving to XSLT 2.0

User-defined functions are not a feature of XSLT 1.0 (though some processors support them as an extension, e.g. EXSLT's func:function), so the wrapper you have in mind will not work. (Sigh.) You will probably get better portability if you use the EXSLT flavor, exsl:node-set(), which most non-Microsoft XSLT 1.0 processors support.
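Since a pure XSLT 1.0 my:node-set() wrapper is not possible, the usual compromise is to guard each call site with function-available() so the same stylesheet runs under both MSXML and EXSLT-capable processors. A minimal sketch (the $fragment variable and the item elements are just for illustration; msxsl is the conventional prefix for MSXML's urn:schemas-microsoft-com:xslt namespace):

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:exsl="http://exslt.org/common"
    xmlns:msxsl="urn:schemas-microsoft-com:xslt"
    exclude-result-prefixes="exsl msxsl">

  <!-- A result tree fragment that needs to be treated as a node-set. -->
  <xsl:variable name="fragment">
    <item>a</item>
    <item>b</item>
  </xsl:variable>

  <xsl:template match="/">
    <xsl:choose>
      <!-- Prefer the EXSLT form, supported by most non-Microsoft processors. -->
      <xsl:when test="function-available('exsl:node-set')">
        <xsl:value-of select="count(exsl:node-set($fragment)/item)"/>
      </xsl:when>
      <!-- Fall back to MSXML's extension. -->
      <xsl:when test="function-available('msxsl:node-set')">
        <xsl:value-of select="count(msxsl:node-set($fragment)/item)"/>
      </xsl:when>
      <xsl:otherwise>
        <xsl:message terminate="yes">No node-set() extension available</xsl:message>
      </xsl:otherwise>
    </xsl:choose>
  </xsl:template>

</xsl:stylesheet>
```

That still means touching every call site once, but after that the stylesheets are processor-neutral rather than tied to MSXML.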

Related

In what way does GObject facilitate binding?

On the official website of gobject, we can read:
GObject, and its lower-level type system, GType, are used by GTK+ and most GNOME libraries to provide:
object-oriented C-based APIs and
automatic transparent API bindings to other compiled or interpreted languages
The first part seems clear to me but not the second one.
Indeed, when talking about GObject and binding, the concept usually introduced is gobject-introspection, but as far as I understand, gobject-introspection can be used to create .gir and .typelib files for any documented C library, not only for GObject-based libraries.
Therefore I wonder what makes gobject particularly binding-friendly.
as far as I understand, gobject-introspection can be used to create .gir and .typelib for any documented C library, not only for gobject-based library.
That's not really true in practice. You can do some very basic stuff, but you have to write the GIR by hand (instead of just running a program which scans the source code). The only hand-written ones I'm aware of are those distributed with gobject-introspection (the *.gir files; the *.c files there exist only to avoid cyclical dependencies), and even those generally cover only a fairly small subset of the C API.
As for other features, almost everything in GObject helps… the basic idea is that bindings often need RTTI. There are types like GValue (a simple box to store a value + type information), GClosure (for callbacks), properties and signals describe themselves with GTypes, etc. If you use GObjects (instead of creating a new fundamental type) you get run-time data about inheritance and interfaces, and GObject's odd construction scheme even allows other languages to subclass types declared in C.
The reason g-ir-scanner can't really do much on non-GObject libraries is that all that information is missing. After scanning the source code for annotations, g-ir-scanner will actually load the compiled module and use GObject's API to grab this information (which makes cross-compiling painful). In other words, GObject-Introspection is a much smaller project than you think… a huge percentage of the data it needs comes straight from the GObject API.
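To give a feel for that run-time data, here is a small C sketch using only GLib's GObject library (nothing GTK-specific): a GValue carries its own type tag, and inheritance can be queried at run time, which is exactly the kind of information a binding cannot get from plain C headers.

```c
/* A taste of the run-time type information bindings rely on.
 * Build (illustrative): gcc demo.c $(pkg-config --cflags --libs gobject-2.0)
 * (With GLib older than 2.36 you would call g_type_init() first.) */
#include <glib-object.h>

int main(void)
{
    /* GValue is a tagged box: the value travels with its GType, so a
     * binding can marshal it without knowing the C type at compile time. */
    GValue v = G_VALUE_INIT;
    g_value_init(&v, G_TYPE_STRING);
    g_value_set_string(&v, "hello");
    g_print("GValue holds a %s: %s\n",
            G_VALUE_TYPE_NAME(&v), g_value_get_string(&v));
    g_value_unset(&v);

    /* Inheritance is queryable at run time as well. */
    g_print("GInitiallyUnowned is a GObject? %s\n",
            g_type_is_a(G_TYPE_INITIALLY_UNOWNED, G_TYPE_OBJECT) ? "yes" : "no");
    return 0;
}
```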

Kotlin runtime jar vs kotlin stdlib jar

What's the difference between kotlin-runtime.jar (225.1K) and kotlin-stdlib.jar (727.3K) (sizes are for 1.0.0-beta-1103 version)? Which one should I distribute with my application? For now I live with kotlin-stdlib.jar, because that's what Android Studio generated, but I wonder if I can use kotlin-runtime.jar since it's smaller.
The runtime library only contains the base Kotlin language types required to execute compiled code. It is the minimal set of classes required.
The standard library contains the utility functions you need for comfortable development: functions for manipulating collections, files, streams, and so on.
In theory you can use just the runtime, but you generally shouldn't: it contains no standard library, so you would lose many utility functions (such as map, filter, toList and so on) that make development comfortable.
So in practice you need both. If you need to make the resulting package smaller, you can process your app with ProGuard.
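To make the distinction concrete, the everyday helpers in the sketch below (filter, map, toList) are shipped in kotlin-stdlib; with only the runtime on the classpath, code like this would not compile:

```kotlin
// Everyday code that leans on kotlin-stdlib extension functions.
fun main() {
    val squaresOfEven = (1..10)
        .filter { it % 2 == 0 }   // stdlib extension on Iterable
        .map { it * it }          // stdlib extension on Iterable
        .toList()                 // stdlib extension

    println(squaresOfEven)        // [4, 16, 36, 64, 100]
}
```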
Update
Starting from Kotlin 1.2, kotlin-runtime and kotlin-stdlib have been merged into a single artifact, kotlin-stdlib:
We merge kotlin-runtime and kotlin-stdlib into the single artifact kotlin-stdlib. Also we’re going to rename kotlin-runtime.jar, shipped in the compiler distribution, to kotlin-stdlib.jar, to reduce the amount of confusion caused by having differently named standard library in different build systems.
That rename will happen in two stages: in 1.1 there will be both kotlin-runtime.jar and kotlin-stdlib.jar with the same content in the compiler distribution, and in 1.2 the former will be removed.
Refer to Kotlin 1.1: What’s coming in the standard library for details.
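So with Kotlin 1.2 or later, a single dependency on the merged artifact is all you need to declare; a Gradle Kotlin DSL sketch (the version number is illustrative):

```kotlin
// build.gradle.kts
dependencies {
    implementation("org.jetbrains.kotlin:kotlin-stdlib:1.2.0")
}
```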

Can you load shared objects (libraries) and call their functions (FFI) from ABAP?

Is there a possibility to load a dynamic shared object/library from a file on the application server and call its functions (i.e. a foreign function interface) from ABAP?
I am aware that you can call kernel functions with the CALL statement, but perhaps there are functions in the kernel that support loading libraries and calling their functions?
I'm not aware of a kernel function that would let you do that. There may be one, but kernel functions are certainly not publicly documented, so you'd need to do your own exploration of the disp+work executable to see if one exists. And if you find one, you'd then need to determine what its parameters are. Not an easy task. If you're up for exploring, I'd probably do it on a Linux system and use objdump and elfsh as my starting toolset.
If I were trying to implement something like what you describe, I'd write a generic "library loader" RFC server in C using the NetWeaver RFC SDK. I'd use C because it gives the most flexibility when loading the external library. You'd need to handle the OS-specific portions of loading the library (e.g. dlopen() on a Unix system, LoadLibrary() / LoadLibraryEx() on Windows), but you could then wrap the library functions in generic function module calls (in the style of RFC_READ_TABLE) and call them dynamically.
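As a rough illustration of the OS-specific half of such a loader (the RFC SDK plumbing is deliberately left out, and the library and function names below are invented):

```c
/* Minimal sketch of dynamic library loading, the part an RFC server
 * written in C would wrap.  "libdemo.so" and "demo_add" are hypothetical. */
#include <stdio.h>

#ifdef _WIN32
#include <windows.h>
typedef HMODULE lib_handle;
#define LOAD_LIB(path)      LoadLibraryA(path)
#define LOAD_SYM(h, name)   ((void *)GetProcAddress((h), (name)))
#define CLOSE_LIB(h)        FreeLibrary(h)
#else
#include <dlfcn.h>
typedef void *lib_handle;
#define LOAD_LIB(path)      dlopen((path), RTLD_NOW)
#define LOAD_SYM(h, name)   dlsym((h), (name))
#define CLOSE_LIB(h)        dlclose(h)
#endif

int main(void)
{
    lib_handle h = LOAD_LIB("libdemo.so");
    if (!h) {
        fprintf(stderr, "could not load library\n");
        return 1;
    }

    /* Look up a symbol and call it through a function pointer. */
    int (*demo_add)(int, int) = (int (*)(int, int))LOAD_SYM(h, "demo_add");
    if (demo_add)
        printf("demo_add(2, 3) = %d\n", demo_add(2, 3));

    CLOSE_LIB(h);
    return 0;
}
```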

How to provide specific GWT implementations

Suppose I am working on exposing some of my server-side classes to a GWT application, but certain parts could be done much better using GWT-specific components (like JSNI, for instance).
What are some techniques for doing so without being too hacky?
For instance, I am aware of using a subpackage and the <super-source/> tag, but this requires the package names to be different, which causes Eclipse to complain. The general solution in the community is to tell Eclipse to use that as a source folder, but then Eclipse complains about there being two classes with the same name.
Ideally, there would just be a way to keep everything in a single source tree, and actually have different classes which apply the alternate implementations. This would feel like a more OO approach.
I would like to add a suffix to a class like _gwt which accomplishes this automatically, and I know I could write a script to do this kind of transformation, but that is a kludge for sure.
I've been considering using Google's GIN/GUICE libraries for my projects in general, and I think there might be some kind of a solution there, but I am not sure as I have not thoroughly investigated it.
What are some solutions you have tried in the past on GWT projects?
The easiest way to have split implementations is to use super-source, but keep the super-sourced code minimal: just enough to instantiate a uniquely-named class or dispatch to a different method. Ideally, the super-source implementation is just a few lines long, and not so bad that you can't roll it by hand.
To work around the Eclipse / javac double-mapping and package name issues, the GWT source uses two top-level roots for user code: user/src and user/super. For example, the AutoBeans package has a split-implementation of JSON quoting and evaluation, one for the JVM and one for the browser.
There's really no non-kludgy way to implement super-source, as this is a feature way outside what you can specify in the language. There's nothing that lets you say "use this implementation in this environment" without the use of some external tool.
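For reference, wiring super-source up is just a module entry plus a parallel source root; a minimal sketch, with made-up package and path names:

```xml
<!-- com/example/MyModule.gwt.xml -->
<module>
  <inherits name="com.google.gwt.user.User"/>
  <!-- Files under com/example/super/ are re-rooted, so
       com/example/super/com/example/shared/JsonCodec.java shadows
       com.example.shared.JsonCodec when the GWT compiler runs, while
       the JVM keeps using the original class. -->
  <super-source path="super"/>
</module>
```

Keeping the super folder off Eclipse's build path avoids the duplicate-class complaint, since only the GWT compiler needs to see it.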

Alternatives to NTidy and other ports, need to format html with custom tokens

I'm looking to format (automated, in-application) some HTML / NVelocity templates. Tidy seems to be the answer for this; however, all the .NET ports seem to be problematic and not very well maintained. Most rely on unmanaged code under the covers, and that starts imposing other restrictions on the project.
For example, to use the code associated with http://www.codeproject.com/KB/mcpp/eftidynet.aspx, the project now has to be an x86 build.
Is there a new preferred solution for doing this? Or is there a completely managed port of TidyHtml that understands nvelocity or allows custom token definition?
Let's list them:
EfTidy
ZetaHtmlTidy (mixed-mode, so it needs different assemblies for x86/x64)
tidyfornet (managed assembly but depends on external HTMLTidy native dll)
TidyATL (ATL wrapper, old, unmaintained, I think it's also mixed-mode and it even requires COM registration?)
TidyNet (fully-managed DLL, no external dependencies)
Even though it's old and unmaintained, I'm using TidyNet because it's fully managed. Does the job just fine.
BTW: Tidy and NVelocity are completely unrelated. I'd never process NVelocity templates with Tidy, as it will probably break them... However, you might want to run Tidy on the resulting HTML after processing the NVelocity template.
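For completeness, a minimal C# sketch of that post-processing step, assuming TidyNet's JTidy-derived API (Tidy, TidyOptions, TidyMessageCollection); the option property names here are from memory, so check them against the assembly you actually use:

```csharp
using System;
using System.IO;
using System.Text;
using TidyNet;   // fully-managed TidyNet assembly

class TidyDemo
{
    static string CleanHtml(string html)
    {
        var tidy = new Tidy();
        tidy.Options.Xhtml = true;       // assumed option: emit XHTML
        tidy.Options.TidyMark = false;   // assumed option: no generator meta tag

        var messages = new TidyMessageCollection();
        using (var input = new MemoryStream(Encoding.UTF8.GetBytes(html)))
        using (var output = new MemoryStream())
        {
            tidy.Parse(input, output, messages);
            return Encoding.UTF8.GetString(output.ToArray());
        }
    }

    static void Main()
    {
        // Run on the rendered output, not on the raw NVelocity template.
        Console.WriteLine(CleanHtml("<p>unclosed paragraph"));
    }
}
```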