TestNG XSLT (Saxon) Report Generator Jars - selenium

I am trying to generate an XSLT report using TestNG + Selenium + Java.
I found out that to do so, we need these three jar files:
saxon-8.7.jar
SaxonLiaison.jar
testng-xslt-maven-plugin-test-0.0.jar
I got those files from separate sources, mostly from someone's personal repository.
Is there an official webpage to download the latest version of those three jar files?
I went to the Saxon page, and apparently we need to pay to use the Saxon product? Not sure if I am on the right page, though.
Are the three jar files mentioned above free of charge? Thanks.

I can answer part of the question.
I don't know whether TestNG/Selenium really requires Saxon 8.7 specifically (that's a very old version) or whether it will also run with more recent versions. Sometimes when people say the requirement is for version X, they simply mean that's the one it was tested with. Either way, I think it almost certainly requires only the open source version of Saxon (which was called Saxon-B up to version 9.1, and Saxon-HE from version 9.2 to the current 9.7). For all the open-source versions, the official distribution is the Saxon project at SourceForge (https://sourceforge.net/projects/saxon/files/). Obviously, for these versions no license (or payment) is required.
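For what it's worth, once an open-source Saxon jar is on the classpath you can drive it through the standard JAXP API rather than relying on plugin-specific wrapper jars. This is only a minimal sketch, assuming Saxon-HE (or Saxon-B) is on the classpath; the file names are placeholders for TestNG's results XML and whichever stylesheet you use:

    // Minimal sketch: run an XSLT transformation with Saxon via JAXP.
    // Assumes a Saxon-HE (or Saxon-B) jar on the classpath; file names are placeholders.
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class SaxonTransformSketch {
        public static void main(String[] args) throws Exception {
            // Ask JAXP for Saxon's factory explicitly instead of the JDK default
            TransformerFactory factory =
                    TransformerFactory.newInstance("net.sf.saxon.TransformerFactoryImpl", null);
            Transformer transformer =
                    factory.newTransformer(new StreamSource("testng-results.xsl"));
            transformer.transform(new StreamSource("testng-results.xml"),
                                  new StreamResult("emailable-report.html"));
        }
    }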
The page you cite from www.saxonica.com is only relevant to the commercial versions of Saxon. I guess we should correct it, because we should allow for users landing there from a Google search and therefore not assume any context.
I can't help you on the other two JAR files.

Use java 8 features (newer janino version) in pentaho data integration

Pentaho Data Integration 8.0.x uses Janino 2.5.16, released in 2010, to compile the User Defined Java Class step. There is a JIRA issue in Pentaho for updating this to a newer Janino version, which would bring Java 8 related features in Pentaho v8.2.0 GA, but there is no information on when this will be released.
Is there any other way I can use a newer Janino version (janino-3.0.8.jar) with the existing Pentaho for UDJC? I tried copying the updated jar into the lib directory and also added commons-compiler-3.0.8.jar to satisfy the dependency. Now when I open Spoon, I get the following error:
Please advise on how this can be achieved. I understand that just replacing the jar may not be enough, but I want to know if something else can be done.
This is not easy. As the ClassNotFound error shows, the public API of Janino has changed: some classes were removed and some were changed. What do you actually need the update for?
If you really need complicated business logic, then create a custom plugin. Documentation and tutorials are available, and you can look into the sources of the current built-in plugins (the sources are on GitHub).
What does the new version of Janino have that the old one doesn't (besides Java 8 support)? Check out the Kettle engine, look into the sources of the User Defined Java Class step, change the code to support the new Janino version, test it, make your own build of PDI Kettle, and try to send a pull request to the maintainers of the repository.
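For orientation only, this is roughly what compiling and evaluating an expression at runtime looks like against the Janino 3.x API (org.codehaus.janino.ExpressionEvaluator); it is a sketch of the library usage in isolation, not of Pentaho's actual UDJC integration:

    // Sketch of runtime expression compilation with Janino 3.x (not the actual UDJC code).
    import org.codehaus.janino.ExpressionEvaluator;

    public class JaninoSketch {
        public static void main(String[] args) throws Exception {
            ExpressionEvaluator ee = new ExpressionEvaluator();
            ee.setParameters(new String[] {"a", "b"}, new Class[] {int.class, int.class});
            ee.setExpressionType(int.class);
            ee.cook("a + b");                                   // compile the expression at runtime
            Object result = ee.evaluate(new Object[] {19, 23}); // invoke the compiled expression
            System.out.println(result);                         // prints 42
        }
    }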
Any of this is quite complicated. This plugin is built into the engine, so you have to make your own build, and your own build means you have to support it yourself. This is non-trivial: the project is huge, has only grown since, and keeps evolving. I spent several days making my first custom build (around version 4, when it still used Ivy) just to understand it better and debug complicated cases, and it was never used in production.
The maintainers of the repository must have a good reason to include your changes in the main stream; the changes must be well tested, it is a long procedure, and it is most probably not worth it. A lot has changed since 2010; as far as I remember from the release notes, newer versions of Java already have the ability to compile code at runtime.
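On that last point, the compiler API bundled with the JDK since Java 6 (javax.tools) can be used for runtime compilation without any third-party library. A minimal sketch, assuming the code runs on a JDK rather than a bare JRE, and with the source file name as a placeholder:

    // Minimal sketch of the JDK's built-in runtime compilation API (javax.tools).
    import javax.tools.JavaCompiler;
    import javax.tools.ToolProvider;

    public class RuntimeCompileSketch {
        public static void main(String[] args) {
            JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
            if (compiler == null) {
                // Happens when running on a JRE instead of a JDK
                throw new IllegalStateException("No system compiler available; run on a JDK");
            }
            // Compile Hello.java (placeholder name) in the working directory;
            // diagnostics go to stderr, exit code 0 means success.
            int result = compiler.run(null, null, null, "Hello.java");
            System.out.println(result == 0 ? "compiled OK" : "compilation failed");
        }
    }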
My advice is to make your own plugin.

How long will a version of a Python package be available on PyPI?

I would like to distribute an application that depends on several PyPI-packaged libraries. I have carefully selected certain versions of some of these libraries as newer versions (in some cases) are incompatible. My installer downloads them (with pip) at install-time and sets up the environment for the application. But how long are those versions going to be available? 6 hours? 2 years? Anything in between?
I'm basically looking for some sort of policy that tells me how long those versions of libraries are going to be hosted on PyPI (and who makes that decision).
In-before-"distribute them yourself": That is an answer to a different question.
This is really about how PyPI works, not how I distribute my application.
The friendly people in #python tell me that authors can delete any version of their packages at any time.
The only way to protect yourself against a version of something being nuked is to ship it yourself (assuming its license allows it).
There is an argument for continuous integration against the latest versions on PyPI, but that assumes there will be a new version and that the author doesn't just delete the whole thing. CI is just a good practice here, not a panacea.

What's the recommended way to get the latest sakai code to test against?

My standard route has been to go to Confluence, find the docs section, then navigate through to the install docs for the version, e.g. Sakai 10:
https://confluence.sakaiproject.org/x/iYGLBQ
By one means or another I happened across the source route to this too, starting here:
http://source.sakaiproject.org/release/
You get redirected to the latest docs, and appending a version number to that URL gives you the docs for other versions, e.g. adding 2.8.2 or 10 to the end of the URL.
But the links to what I should download are quite often not there. At the time of writing, the version 10 tarball and zip in the Confluence links are dead, and the source.sakaiproject.org link doesn't have the version 10 docs yet (it redirects to 2.9.3), presumably because v10 is not released yet.
So, I'd like to evaluate a new version of a Sakai source install; what's the best way to do this (considering the official install documentation is still being formed)?
Do I download the latest SVN, the latest RC, the latest beta, or something else? How do I know what's best to test against without being "too" bleeding edge? Is there a recommended tarball/zip link to test against? Is there a "latest good" SVN branch?
The latest code is always in the Sakai trunk (currently svn):
https://source.sakaiproject.org/svn/sakai/trunk/
That code may very well not be stable, though, as it is where things are being actively developed. If you are not actively developing, then you should stick to the releases as indicated on the project website here:
http://sakaiproject.org/current-release
If you want to use something in between (say an upcoming release) then you can grab the most recent tag or maybe use a recent branch (both currently in svn, latest shown below at the time I write this):
https://source.sakaiproject.org/svn/sakai/branches/sakai-10.x/
https://source.sakaiproject.org/svn/sakai/tags/sakai-10-rc02/
The reality of the situation is that if you want to use something other than the release then you should really participate in the dev community for Sakai. Joining the mailing lists and the weekly calls will provide the information you are asking about and much more.

What is the difference between Lazarus and CodeTyphon

Firstly, I saw some topics about these two, but they didn't answer my question.
I'm looking for a good FPC (Free Pascal Compiler) IDE on GNU/Linux.
There are some IDEs, like Lazarus and CodeTyphon, and I need a suggestion for choosing one of them.
I've tried Lazarus once, but all the windows were separate. It looked messy and uninviting.
I would like to know what the differences between these two are.
I would like to know the advantages/disadvantages of each of them. Thank you.
CodeTyphon is a distro of Lazarus, like Ubuntu and Debian are distros of Linux.
CodeTyphon comes with a large package of components and plugins that you would otherwise have to google, download, and install yourself.
CodeTyphon has its own idea of which versions are stable and which are not yet, for both FPC (the compiler) and Lazarus (the IDE). Whether their assessment is better or worse than the upstream Lazarus team's, I don't know.
As for the single-window plugin, it is a work in progress, and it doesn't seem to me to be ready for production use, whether you get it as part of CT or download and add it to vanilla Lazarus. However, maybe it works better on Linux than on Windows; I don't know.
There were, however, issues with code legality in the CT grande bundle. It is widely believed that Orca (if I remember the name correctly) violates the copyrights of glScene/vgScene, which also happened in early Delphi FMX releases but was fixed by EMBA later. There were also disputes in the FPC forums/wiki about CodeTyphon pirating some open-source components. See the answer by Peter Dunne below.
Your question is akin to asking about the difference between Linux and Ubuntu. Lazarus is an IDE/component library based on Free Pascal (FPC), and CodeTyphon is a distribution of Lazarus and FPC. So CodeTyphon is just one way to get a functioning installation of Lazarus.
Lazarus uses the same floating window design as older versions of Delphi. Installing from CodeTyphon won't change that.
Several friends and I highlighted several licensing issues with CodeTyphon, most of which could have been corrected by sourcing the included files from a known good source and ensuring the correct license headers were included. PirateLogic refused to correct the issues, which means they are using code in direct violation of the original license terms. The fact that it is open source code does not change the fact that they are pirating the code by not including the correct license, even after the issue was highlighted. I also found several instances of copyrighted code included which appears to be proprietary and not FOSS at all.
They also changed the paths and file names of some libraries, so that source is no longer compatible with standard Lazarus/component installs. This, in my view, is totally illogical.
These two factors heavily undermine what was potentially the best FPC/Lazarus distro. Hardly professional.
Lazarus can be a daunting installation process due to its nature as a cross-compiling environment. You don't just download an installer and click OK. A typical "installation" is actually a bootstrap FPC compiler doing a three-pass compilation of an "install". There are plenty of good installation scripts/methods from the official Lazarus/FPC team and in the community. But, understandably, the installation process is a skill in itself.
CodeTyphon is a different/separate branch of an installer system, which is more of a utility suite/tools/third-party code compilation library. If you want the simplest installation experience, go with CodeTyphon. It has a nice graphical front end for managing the compiler. You can conveniently do the fancy stuff, like building cross-compilers for almost every target operating system out there. It also comes packed with hundreds of the best components/libraries pre-installed. It is a very actively maintained project and very professional. A whole lot of work is done for you.
Even if you want to learn the low-level compiler capabilities, CodeTyphon is a good place to start. It is written in FPC/Lazarus and is open source, so you can simply study it as a working demo app alongside the other information on the compiler details. If you crash it, at least you don't have to learn by climbing the hill: you get to start from the top, lose control on the way down, and start from scratch (and a three-hour reinstallation). Hahaha
Lazarus also has a package, AnchorDock, which allows you to dock all the windows into one. Either install the anchor dock design package after installing Lazarus, or install Lazarus using the script at getlazarus.org, which will do it for you.

Should we store JRE in CVS/SVN?

I want to bundle JRE 6.0 together with my Java application. All my source code resides in CVS. My client will check out the code and build it themselves. Should I store the JRE in CVS?
I normally advocate putting almost everything in source control, but this seems a little excessive. Why?
the JRE is readily available from http://java.sun.com
it doesn't change that often. I'd expect you to specify a minimum version for your code to run against (e.g. 1.5, 1.6 etc.)
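As a hedged illustration of the "specify a minimum version" approach (the required version string here is just an example), an application can fail fast at startup if the running JRE is too old:

    // Illustrative only: refuse to start on a JRE older than the (example) minimum.
    public class MinimumJreCheck {
        public static void main(String[] args) {
            // "1.6", "1.8", "11", ... depending on the runtime
            String version = System.getProperty("java.specification.version");
            if (Double.parseDouble(version) < 1.6) {
                System.err.println("This application requires Java 1.6 or later, found " + version);
                System.exit(1);
            }
            System.out.println("JRE " + version + " is sufficient, starting up...");
        }
    }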
I would not put a JDK or JRE into a source code repository:
It is bad practice to put externally versioned things into your version control because it usually leads to over-constraining, obscuring and/or hard-wiring your app's external dependencies. (Maven or Ivy are good solutions for dealing with external dependencies, though not in this case.)
Putting binaries into version control is a bad idea for some version control systems.
But I think your real problem (actually, your user's organization's problem) is the IT folks who refuse to contemplate upgrading the JRE:
They need to be made aware of the fact that they can install multiple JRE versions on the one machine, and configure apps to launch with the JRE version they require. (It is trivial on Linux ...)
They need to be made aware of the fact that their policy is an impediment to progress.
They need to be made aware of the fact that their policy is a potential security issue. If they force users to deploy their own copies of JDKs / JREs in random places, it will be difficult to ensure that JRE security patches get applied. (Besides, 1.4.2 is due to be end-of-life'd soonish, and security patches for it will cease.)
EDIT: and there is also the legal question of whether "redistributing" a JRE out of your source code repository is a violation of Sun's click-through JRE/JDK download license. (I don't know ...)
As a best practice, you shouldn't keep any binary files in the source control system. For Java developers there is Maven, which does a better job of versioning jar files. The reason is that we want to keep the source repository as small as possible, so it is faster for those who check out the code for the first time.
But if you still want to keep binary files in source control, it would be best to avoid using CVS, because CVS is bad at versioning binary files. You can search Google for why it is bad. If you use SVN, then it is still okay, because SVN handles binary files much better than CVS.
I see nothing wrong with storing the JRE in CVS.
However, it's not so important whether you do or not as long as your script can pull it as part of the build. For example, if you want to host a downloadable jre.zip on an HTTP server, or point to it in a Maven repo, that's just as good.
Well, won't your client already have the JRE if you expect them to compile the code before running it? The JDK contains the JRE.
Depends a lot on what you use to handle dependencies. If you use Maven, then create a maven package with the stuff you need, and host it on a local repository.
If you just have CVS (like we do) then it is fine to create big binary packages (since you will need them) which you can then put in CVS. Just be aware that they should be static for best CVS performance.
Also note that the jsmooth package can create an EXE file from your jar with a JRE embedded in it. This might solve your deployment problem.
For remote compilation, Eclipse can work with a plain JRE. You just need to tell Eclipse where the JRE you have already prepared (as above) is located on disk. There is also a folder inside the Eclipse distribution where the launcher looks automatically.
I'm wondering about the client building the application themselves. It will require some kind of Java compiler, most probably javac, which is part of the JDK. So your client will not only need a JRE, but a JDK as well (unless they will be using Jikes or another alternative compiler).
javac is capable of generating bytecode for previous versions of Java, so using a newer compiler should not pose any problems.
Personally, I would not include large binaries like a JRE as part of my own repository. The JRE can be considered very stable and just listing the minimum version required should be enough. Installing a JRE is also something quite different than installing a single Java application. The two activities should not be mixed.