XPages: Load jar dynamically

Does anyone have experience with loading jars dynamically for an XPages application?
We would like to keep some calculation code, which is going to change quite often, in external jar files and load them dynamically when they are needed. Does anyone know if that's possible with Domino?

You could create a tasklet that you roll out as an OSGi plugin. That way you can run the calculations in the tasklet and update it independently of your application: you only need to update your update site, and all applications that use the code will have the latest version installed.
You can find more info about it here: http://xpag.es/?1926
Another solution would be to put the jar file in the java/ext/lib directory on your server. Every time a new release is created you can update that file on the server. A server or HTTP task restart would be necessary, of course.
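Outside of Domino's own class loading, the general technique looks something like this in plain Java (a minimal sketch only; the jar path, class name and method are hypothetical, and the XPages runtime normally manages class loaders itself, so treat this purely as an illustration of the idea):

import java.io.File;
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;

public class DynamicJarLoader {
    public static void main(String[] args) throws Exception {
        // Point a URLClassLoader at the external jar (hypothetical path).
        URL jarUrl = new File("/opt/ext/calculation.jar").toURI().toURL();
        try (URLClassLoader loader = new URLClassLoader(new URL[] { jarUrl },
                DynamicJarLoader.class.getClassLoader())) {
            // Load the calculation class by name (hypothetical) and invoke it reflectively.
            Class<?> clazz = loader.loadClass("com.example.Calculator");
            Object calculator = clazz.getDeclaredConstructor().newInstance();
            Method calculate = clazz.getMethod("calculate", double.class);
            System.out.println(calculate.invoke(calculator, 42.0));
        }
    }
}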

Related

Run sql script automatically when building Spring boot project

I have a simple Spring Boot REST API with a MySQL database. It currently has only one table, so to create the table if it doesn't exist, I just have some Java code that does the job when the server is initialized.
But I don't think that's the right way to go. I believe a better way would be to have an external SQL file which Spring would run each time I start my project.
So, let's say I have a file called TABLES.sql with all the db tables. How can I configure Spring to run this automatically each time it boots?
Thanks!
EDIT:
Just for further clarification, I have configured my project to run against a Docker database in a "dev" environment and against a real instance in a "prod" environment. The DB user, password, etc. are all configurable. I'm basically just messing around to learn stuff. :)
What you need is a schema migration tool. We use Flyway in our project and it works great.
With it you write incremental SQL scripts that are applied in order to produce the final version of the database. To actually run the migrations you use Flyway's migrate goal:
mvn -P<profile> flyway:migrate
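The migration scripts themselves are just versioned SQL files picked up from Flyway's default classpath location; for example (a sketch only, the table is made up):

src/main/resources/db/migration/V1__create_users_table.sql:
CREATE TABLE users (
    id BIGINT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(100) NOT NULL
);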
If you use Hibernate, a file named import.sql in the root of the classpath will be executed on startup. This can be useful for demos and for testing if you are careful, but probably not something you want to be on the classpath in production. It is a Hibernate feature (nothing to do with Spring).
Get more info here: Database initialization.
OK, I think I've found what I was looking for. I can have a schema-mysql.sql file with all my CREATE TABLE statements etc., according to this Spring guide:
Initialize a database using Spring JDBC
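As a rough sketch of what that guide describes (assuming Spring Boot's default auto-configuration; the exact property names vary between Boot versions, so check the docs for yours, and the table is made up):

src/main/resources/application.properties:
spring.datasource.platform=mysql
spring.datasource.initialization-mode=always

src/main/resources/schema-mysql.sql:
CREATE TABLE IF NOT EXISTS users (
    id BIGINT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(100) NOT NULL
);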
This is basically what I want
Thanks for your help though!

Load package dynamically

Is it possible to load a specific package during runtime?
I want to have a kind of plugin system where each plugin has the same functions as the others but with different behaviour and, depending on the configuration file, to load one or the other.
No, Go doesn't support dynamically loaded libraries.
Your best bet is to start the plugin as its own executable and communicate with it through sockets or via stdin/stdout.
2017 update
This answer is no longer true; Go now supports plugins (Linux and macOS only, as of June 2021).
There is support for this now as of Go 1.8:
https://golang.org/pkg/plugin/
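A minimal sketch of how that package is used (it assumes a plugin built separately with go build -buildmode=plugin, and the file name calc.so and symbol name Calculate are made up):

package main

import (
	"fmt"
	"log"
	"plugin"
)

func main() {
	// Open the shared object produced by `go build -buildmode=plugin -o calc.so` (hypothetical file).
	p, err := plugin.Open("calc.so")
	if err != nil {
		log.Fatal(err)
	}

	// Look up the exported symbol and assert its type.
	sym, err := p.Lookup("Calculate")
	if err != nil {
		log.Fatal(err)
	}
	calc, ok := sym.(func(float64) float64)
	if !ok {
		log.Fatal("Calculate has an unexpected type")
	}

	fmt.Println(calc(2.5))
}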
You might consider executing the ‘plugin’ packages at runtime by writing out a new program (say, to a temp directory) and executing it via exec.Command, something along the lines of exec.Command("go", "run", files…).Run().
You’ll see some similar code here.
You could create a code generator that reads the configuration, generates a basic Go file with the packages loaded in the right order, and then executes it. Compiled languages generally don't provide dynamic loading (even Dart struggles with this). Simply read your configuration file, then create a temporary file with the necessary code and communicate with it over sockets or HTTP.
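A rough sketch of that approach (the generated source here is a trivial hard-coded placeholder; a real version would be produced by your code generator from the configuration):

package main

import (
	"log"
	"os"
	"os/exec"
	"path/filepath"
)

func main() {
	// Source that would normally come from the code generator (placeholder).
	generated := `package main

import "fmt"

func main() { fmt.Println("hello from the generated plugin") }
`
	// Write it to a temporary file and run it with `go run`.
	tmp := filepath.Join(os.TempDir(), "generated_plugin.go")
	if err := os.WriteFile(tmp, []byte(generated), 0o644); err != nil {
		log.Fatal(err)
	}
	cmd := exec.Command("go", "run", tmp)
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	if err := cmd.Run(); err != nil {
		log.Fatal(err)
	}
}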
I think what you are looking for is the special function init.
If you add a
func init() {
}
inside a package, it will run the first time the package is imported.
This happens only within the same binary. As others have already said, Go does not support dynamically loaded libraries.

Deploy multiple configurations from command line without changing project files

Please don't be too harsh, because I still don't grasp this entirely, but MSBuild/MSDeploy has been giving me some headaches lately.
Hopefully someone can provide a textual aspirin of some kind? So here is what I want to do:
I have a web application project, that has multiple configurations, thus multiple web.config-transforms.
I would like to deploy this project from command line.
I would rather not modify its project file. (I want to be able to do this for several web applications, so as little editing as possible is much appreciated.)
I would like to be able to build it only once and then deploy the different configurations from it.
So far I deployed from command line using something like this:
msbuild D:\pathToFile\DeployVariation01.csproj
/p:Configuration=Debug;
Platform=AnyCpu;
DeployOnBuild=true;
DeployTarget=MSDeployPublish;
MSDeployServiceURL="localhost";
DeployIisAppPath="DeployApp/DeployThis01";
MSDeployPublishMethod=InProc
This does just what I want, except it only deploys the "Debug" configuration.
How can I, with minimal adjustments, make it deploy my other configurations as well?
I was thinking maybe I could build a package that includes all my configurations and then deploy from that and decide "while deploying" which configuration to deploy?
Unfortunately I am pretty much stuck here; the approaches I have read about all seem to require some modification of the project files. Is there a way around that?
UPDATE:
I am still not really where I want to be here :).
But I looked into this PackageWeb-approach (also interesting video about that here) and it seems pretty nice; I can now build a package that includes all my transforms and then deploy from that as often as I want into multiple configurations.
One thing I dislike about this is that I have to store my password in plain text in the generated parameters file for the PowerShell script. Does someone know a way around this? I would really rather have an encrypted password there.
Also other approaches to solve my original problem are still appreciated.
I am working on the same problem and am taking two paths using Microsoft Web Deploy (MSDeploy), which is now at version 3.0.
I first compile the project with MSBuild using the Package target, passing in system.configuration and system.packagelocation. The Package target generates a set of package files, including a {PackageName}.SetParameters.xml file. By default, the SetParameters.xml file allows connection strings to be changed at publish time, without recompiling, when msdeploy.exe is used to publish the package. The publish transformation process can also be customized by adding a parameters.xml file to the project, defining additional parameterized web.config settings that can be changed at deploy time.
After the initial build I use the {PackageName}.deploy.cmd file generated by MSBuild during the Package process to deploy the package to the target website. The Package process essentially duplicates what you are currently doing from MSBuild, in that I can publish one build-configuration web.config transform from one compile. It provides a consistent deployment process that can target remote servers from a central CI environment, which is great from a pure deployment standpoint. The package/deploy process is parameterized within TeamCity, requiring changes to only a few parameters to set up a new deployment.
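For reference, the two steps look roughly like this from the command line (a sketch only; the project name, package location and server name are placeholders, and the exact properties depend on your setup):

msbuild MyWebApp.csproj /t:Package /p:Configuration=Release /p:PackageLocation="C:\drops\MyWebApp.zip"

C:\drops\MyWebApp.deploy.cmd /Y /M:targetserver

The generated {PackageName}.SetParameters.xml sitting next to the package is picked up by the deploy.cmd at publish time, which is where connection strings (and any parameters.xml entries) can be changed without a rebuild.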
Like you, I cannot, however, compile a single version of code and deploy to multiple servers using the process as it exists today - which is my current focus. I want to parameterize the transform in a Continuous Deployment, build-once-deploy-many pattern to Dev, QA, User Testing, Staging, and Production.
I anticipate using one of two methods:
Create a Parameters.xml file for each project defining the variable deployment parameters along with a custom {ServerName}.SetParameters.xml for each target deployment, both to be used in conjunction with msdeploy.exe.
a. I am not sure that defining a parameters.xml file is a flexible enough process for my needs, as the current project inserts and removes a variable number of web.config settings. Implementing a parameters file incorporating all of the variables could be too complex for my taste. I would also end up creating all of the target transformations myself, instead of the current developer-initiated process. Not ideal.
I am following up on very recent updates to VS2012 Web Tools 2012.2 which allow tying a web.config transform to the publish profiles (profile.pubxml) now stored under SolutionName/Properties/PublishProfiles in VS2012.
VS2012 release 2012.2 adds the capability to create a second transform tied to the publish profile. The resulting transform process first runs the build configuration transformation, followed by the publish transformation, i.e. Release Transform followed by TargetServer Transform. Sayed Hashimi has a great YouTube video demonstrating the entire process using MSBUILD.
What is not entirely clear is whether the second transform is supported separately from the build using MSDeploy in a Continuous Deployment, build-once-deploy-many Pattern, or if the publish transformation is only supported during a separate Package/Build for each target transformation.
Option 1 will definitely work for some environments and was my first plan for tackling a Continuous Deployment process. I would much rather use Web Transforms to accomplish the process if possible.
An outside third possibility is using one of several CodePlex commandline projects that are capable of transforming web.config using the XDT transform engine. Unfortunately, using these tools would mean splicing the results into the Build/Package MSBUILD process in order to get the resulting web.config transformation into the deployment package - something I've not yet been successful in accomplishing. Sayed Hashimi also has a PackageWeb project from 2012 that might work as well. I am hoping his more recent work replaces the need for the extra steps involved in the packageweb solution.
Let me know if you decide on a solution - as I am definitely interested.

Is there a maven plugin that can diff files and output the result to file?

I've been working on integration tests for a Java web service. The integration tests send SOAP requests to the server, which are asserted via the SoapUI plugin, and for each of the SOAP requests an XML file is produced and saved (as part of the integration-test phase).
Is there a plugin that allows me to diff the XML files that have been output and saved against a similar set of XML files produced in an earlier run? The idea is to diff the XML files output by the previous release version against those from the current version, to make sure the expected changes have been made.
I hope my question is clear enough. Thanks in advance
EDIT: The XML files that I would like to compare against will be copied into a directory (let's say target/compare_against) by the person running the test. They are not under SCM.
The only plugin that will do diffs “natively” is the scm plugin, and that only if you've got the other version of the file committed to a repository. (I say “natively” because it probably just runs the diff in a subprocess internally anyway.) I mention this because your question wasn't really clear about how you were keeping around the data from the previous runs.
If that doesn't fit, you'll find the antrun plugin easiest.
I haven't seen any Maven plugins that will do it. You might be able to find an Ant task (maybe this one?) to do it and use the antrun plugin to run the task. I did see some stuff about xmldiff and Maven/Ant integration, but it's kind of bare.
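A rough sketch of that antrun route, assuming a diff executable is available on the PATH (the directories, phase and output file are placeholders):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <id>diff-xml-output</id>
      <phase>verify</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- Recursively diff the previous run's files against the current ones -->
          <exec executable="diff" output="${project.build.directory}/xml-diff.txt">
            <arg value="-ru"/>
            <arg value="${project.build.directory}/compare_against"/>
            <arg value="${project.build.directory}/soapui-output"/>
          </exec>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>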

QuickBuild: How can I create a builder to open a tarball package (tar.gz) whose name will change with each version?

I'm using PMEase QuickBuild to perform automated builds of our Maven2 projects and a nightly sanity test to ensure nothing is broken.
The test needs to untar packages which are created by the automated Maven2 projects. The problem is that the package names change frequently due to project versions being incremented all the time.
Does anyone know how I can configure QuickBuild to pick up the version (ideally from the POM file of the individual components), if this is possible at all?
I don't know if this is an option for you but it looks like you can do it the other way around. Quoting Build with Maven:
Control build version
If you want to control the build version from the QuickBuild side, please follow these steps:
1. Change the POM file and define the project version as ${buildVersion}. Do not forget to commit the file into your SCM after the change.
2. Define a build property like below when defining the Maven build step:
buildVersion=${build.version}
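For illustration, step 1 amounts to something like this in the POM (the group and artifact are made up):

<groupId>com.example</groupId>
<artifactId>my-component</artifactId>
<version>${buildVersion}</version>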
There may be other options, but I must admit that my knowledge (zero) of QuickBuild is very limited.
I created a workaround for this issue by having QuickBuild execute a shell script that did the untarring using wildcards, similar to the following (to avoid computing the exact version):
tar xzf filename-*.tar.gz
I couldn't figure out how to do this in QuickBuild, so I offloaded the work to the shell script.