Is it possible to create a NetBeans project to run Payara Micro? - netbeans-8

I use NetBeans to create regular Java Web projects that run on Payara Server.
But now I would like to run a new Java Web project using Payara Micro.
Today I'm using a "main class": I have to open this class and press
Shift+F6 to run it, but it would be great if a plain F6 worked.
Here is my "main":
import fish.payara.micro.BootstrapException;
import fish.payara.micro.PayaraMicro;

public class Run {
    public static void main(String[] args) {
        try {
            // Bootstrap an embedded Payara Micro instance and deploy the exploded web app
            PayaraMicro.getInstance()
                    .addDeployment("/sistemas/sitesat2mod/build/web/")
                    .setHttpPort(8080)
                    .setHttpAutoBind(true)
                    .bootstrap();
        } catch (BootstrapException e) {
            e.printStackTrace();
        }
    }
}

There's no direct support in NetBeans for running web applications on Payara Micro yet.
The simplest solution is to open the build.xml configuration file and insert the following snippet right below the line with the import statement:
<target name="-run-deploy-nb"/>
<target name="run" depends="run-deploy">
<java jar="/path/to/payara-micro.jar">
<jvmarg value="-Xmx256m">
<arg value="--deploy"/>
<arg value="${dist.war}"/>
<arg value="--port"/>
<arg value="8080"/>
<arg value="--autobindhttp"/>
</java>
</target>
Instead of /path/to/payara-micro.jar, specify the absolute path to your payara-micro.jar. Or, if payara-micro.jar sits in a lib directory inside your project, you may specify a relative path using the basedir variable, like this:
<java jar="${basedir}/lib/payara-micro.jar">
After you save the build.xml file, you can press F6 and your application will be deployed with Payara Micro. From then on you configure the command-line parameters in build.xml instead of in your Run Java class (you should delete the Run class because it won't be used).
Edit:
If you want to restart (redeploy) your application, you have to press Ctrl + Shift + Del to stop the running application before pressing F6 to run the new version. So each redeploy is Ctrl + Shift + Del followed by F6.

An alternative approach is to restructure your project to use the Maven build system, which is directly supported by NetBeans without any plugins.
There's a Payara Micro Maven plugin that can be added to the build configuration; it can start and stop Payara Micro. If you configure it to first stop a running instance and then start a new one, it will restart Payara Micro in a single action.
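For illustration, here is a minimal sketch of what such a pom.xml configuration could look like. The fish.payara.maven.plugins coordinates and the start/stop goals come from the plugin's documentation, but the version number and configuration values below are assumptions you should check against the current plugin docs:
<!-- Hypothetical pom.xml snippet; verify the plugin version and options before use -->
<build>
    <plugins>
        <plugin>
            <groupId>fish.payara.maven.plugins</groupId>
            <artifactId>payara-micro-maven-plugin</artifactId>
            <version>1.4.0</version>
            <configuration>
                <!-- deploy the WAR produced by this Maven build to the started instance -->
                <deployWar>true</deployWar>
            </configuration>
        </plugin>
    </plugins>
</build>
With a setup along these lines, mvn package payara-micro:start launches the application on Payara Micro and mvn payara-micro:stop shuts it down, and both goals can be mapped to NetBeans project actions.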
Most new projects these days are based on Maven because it's a standard way to structure and build projects, and it's supported by many IDEs as well as from the command line, whereas traditional Ant-based NetBeans projects aren't automatically supported by other IDEs.

Related

IBM MobileFirst: using external jar files during command line build

We are trying to use the org.JSON.JSONObject library for some intensive JSON processing on the adapter side. We have Java classes that process the data received from HTTP adapters.
We are on MobileFirst 6.3.0 and using the CLI 20150701 build
(the most recent one).
Referencing this JSON library causes no issues when building from the Eclipse MobileFirst Studio environment.
We are building on Ubuntu Linux 14.04. There is an error referencing the org.JSON.JSONObject..x.jar file when we execute
mfp start or mfp build or mfp deploy
Is there a way to reference this JAR file on the classpath during mfp start, mfp build, or mfp deploy?
We need something like
mfp -classpath "path/to/json.jar" build
Please help.
As it turns out, the CLI does not yet recognize JARs placed into the server/lib folder of your project. In order to make this work, you can make a simple edit to the following file:
[cli install folder]/mobilefirst-cli/node_modules/generator-worklight-server/lib/build.xml
At or about line 123, add the third fileset element shown below:
<!-- Classpath for server runtime libraries used when building the WAR -->
<path id="server-classpath">
    <fileset dir="${worklight.jars.dir}" includes="worklight-jee-library.jar" />
    <fileset dir="${worklight.server.install.dir}/wlp/dev" includes="**/*.jar" />
    <!-- add server/lib folder to classpath -->
    <fileset dir="${worklight.app.dir}/../server/lib" includes="**/*.jar" />
</path>
After that, running 'mfp start' (or 'mfp restart' if your server is already running) will compile your custom Java code with any jars that you add to the server/lib folder included in the classpath.
JARs for use by your adapters should be added to your project's server directory, in the lib folder. They will be included in your project's WAR file when the project is built (in Studio or by the Ant tasks), and when you deploy that WAR they will be visible to your adapters.
I agree with @bjustin_ibm, thanks for that. While the above approach works, there's also another way of doing this.
Alternative hack
Just add your required .jars to the following location; they get added to the classpath during mfp start:
/home/instanceubuntu/.ibm/mobilefirst/6.3.0/server/wlp/dev/spi/third-party
This solution is simpler and doesn't require maintaining the build.xml file.
Hope this helps.

Where is the "jrebel JAR-file" in the my WAR-file?

I want to use JRebel with IntelliJ IDEA and JBoss AS7
(I have a web application, so I have a WAR).
I build my WAR with Apache Ant and deploy it under JBoss AS.
I know how to make JRebel pick up any changes to classes or resources in my WAR (if I'm not mistaken), as follows:
(In rebel.xml)
<classpath>
    <dir name="D:/project/myProject/out/production/myProject"/>
</classpath>
<web>
    <link target="/">
        <dir name="D:/project/myProject/resources"/>
        <dir name="D:/project/myProject/view"/>
    </link>
</web>
But where, really, is the jrebel JAR file in my WAR?
In warFile > WEB-INF > lib? I did not see it there.
Please help me.
jrebel.jar is packaged inside the JRebel plugin for IntelliJ IDEA. It doesn't have to be deployed with the WAR itself. The JRebel plugin will set the correct JVM parameters on the command line when you start via "Run with JRebel" or "Debug with JRebel":
-javaagent:/path/to/jrebel.jar
UPDATE: newer versions of JRebel are configured using the -agentpath JVM option instead:
-agentpath:${JREBEL_HOME}/lib/<platform-specific-binary>
See the documentation for the correct settings.
You don't have to do this yourself if you start the server from the IDE. If you start the server from the command line, then you have to add the JVM argument yourself, with the correct path to jrebel.jar, as described here: http://manuals.zeroturnaround.com/jrebel/standalone/launch-from-command-line.html#jboss-7-x
Java agents intercept class loading and thus have to be loaded before other classes. As you might have guessed, jrebel.jar is a Java agent, and therefore nothing requires it to be packaged inside a WAR.
Instead, rebel.xml, the configuration file, has to be packaged in the WAR, in the WEB-INF/classes directory. JRebel uses rebel.xml to detect where the compiled classes and resources are. So when the application is deployed, JRebel finds the rebel.xml configuration file and won't load the application classes from the WAR itself; instead it uses the paths specified in rebel.xml. This is why rebel.xml has to be inside the WAR: you may also start the server from the command line instead of the IDE.
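Since the WAR in this question is built with Ant, one way to get rebel.xml into WEB-INF/classes is to add it via the <classes> element of the <war> task. This is only a sketch under assumed property names (${dist.war}, ${build.classes.dir} and ${web.dir} are placeholders, not taken from the original build file):
<!-- Sketch: package rebel.xml into WEB-INF/classes alongside the compiled classes -->
<war destfile="${dist.war}" webxml="${web.dir}/WEB-INF/web.xml">
    <classes dir="${build.classes.dir}"/>
    <!-- rebel.xml ends up in WEB-INF/classes inside the WAR -->
    <classes file="rebel.xml"/>
</war>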

MSBuild - trying to run xUnit (.net) tests

I'm trying to set up a C# project that'll run xUnit tests when I build, so I can use them in continuous integration. I have a regular project, a class library test project using xUnit, and my test runner project. From everything I've read, it appears that I should be able to get this working by doing this in the test runner project:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Test"
         xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  [auto-created project stuff]

  <UsingTask AssemblyFile="xunit.runner.msbuild.dll"
             TaskName="Xunit.Runner.MSBuild.xunit" />

  <Target Name="Test">
    <xunit Assembly="$(MSBuildProjectDirectory)\..\OnePageOneDb.Tests\bin\Debug\OnePageOneDb.Tests.dll" />
  </Target>
</Project>
When I build my solution after a change (usually editing the .csproj file), I get this:
The "Xunit.Runner.MSBuild.xunit" task
could not be loaded from the assembly
C:\Users[myusername]\Code\OnePageOneDb\OnePageOneDb.TestRunner\xunit.runner.msbuild.dll.
Could not load file or assembly
'file:///C:\Users[myusername]\Code\OnePageOneDb\OnePageOneDb.TestRunner\xunit.runner.msbuild.dll'
or one of its dependencies. The system
cannot find the file specified.
Confirm that the
declaration is correct, that the
assembly and all its dependencies are
available, and that the task contains
a public class that implements
Microsoft.Build.Framework.ITask.
Even if I add xunit.runner.msbuild.dll and xunit.runner.utility.dll to the project in the location it refers to, I get this message. But if I build again with no changes, I consistently get this:
The "xunit" task was not found. Check
the following: 1.) The name of the
task in the project file is the same
as the name of the task class. 2.) The
task class is "public" and implements
the Microsoft.Build.Framework.ITask
interface. 3.) The task is correctly
declared with in the
project file, or in the *.tasks files
located in the
"C:\Windows\Microsoft.NET\Framework\v4.0.30319"
directory.
But I've checked all these things:
The task class in xunit.runner.msbuild.dll is Xunit.Runner.MSBuild.xunit (and xunit is lowercase in the class name).
The task class inherits from Task, which implements ITask.
So maybe there's a problem in UsingTask, but I don't know what it is.
(I also thought the problem might be that xunit.runner.msbuild.dll is targeted at .NET 2.0, and I'm using VS 2010, but I recreated the test runner project in .NET 2.0 and the problem persisted.)
Can anyone help?
You need to specify the correct path to xunit.runner.msbuild.dll.
First of all, you can just set the full path and verify that xunit works as you want.
But for a real environment you should specify a relative path to the DLL:
<UsingTask AssemblyFile="$(MSBuildProjectDirectory)\..\..\lib\xunit\xunit.runner.msbuild.dll"
TaskName="Xunit.Runner.MSBuild.xunit" />
MSBuildProjectDirectory is a reserved property and contains "the absolute path of the directory where the project file is located".
EDIT:
Try referring to the task by its full name, Xunit.Runner.MSBuild.xunit:
<Target Name="Test">
    <Xunit.Runner.MSBuild.xunit Assembly="$(MSBuildProjectDirectory)\..\OnePageOneDb.Tests\bin\Debug\OnePageOneDb.Tests.dll" />
</Target>
I get exactly the same error message if I have Pex and Moles installed. Everything works fine after uninstalling them.
By default, building in the "release" configuration triggers running the xUnit tests.
If you are trying to disable running xUnit tests in a TFS build, pass the following build parameter.
This can be useful in the new cross-platform builds, where running unit tests is a separate step:
/p:RunXunitTests=false
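For context, a parameter like that typically works by gating the test target on a property condition. A hypothetical sketch of what such a target could look like (the RunXunitTests property name comes from the answer above; the target name and assembly path echo the question, and the rest is illustrative):
<!-- Hypothetical: the target is skipped when the build is invoked with /p:RunXunitTests=false -->
<Target Name="Test" Condition="'$(RunXunitTests)' != 'false'">
    <xunit Assembly="$(MSBuildProjectDirectory)\..\OnePageOneDb.Tests\bin\Debug\OnePageOneDb.Tests.dll" />
</Target>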

Why does headless PDE Build omit directories I've specified in build.properties's bin.includes?

One of my Eclipse plug-ins (OSGi bundles) is supposed to contain a directory (Database Elements) of .sql files. My build.properties shows:
bin.includes = META-INF/,\
.,\
Database Elements/
(...which looks right to me.)
When I build and run from within my interactive Eclipse IDE, everything works fine: calls to Bundle.getEntry(String) and Bundle.findEntries(String, String, boolean) return valid URL objects; my tests are happy; my code is happy.
When I build via headless ant script (using PDE Build), those same calls end up returning null. My tests break; my code breaks. I find that Database Elements is quietly but simply missing from my plug-in's JAR package. (META-INF and the built classes still make it in there fine.) I scoured the build log (even eventually invoking ant -verbose on the relevant portion of the build script) but saw no mention of anything helpful.
What gives?
It appears there was a bug (though I was unable to search up a Bugzilla citation) in the PDE Build ant-script generation process as of 3.2 that produced an ant build.xml script fragment like this from the bin.includes:
<copy todir="${destination.temp.folder}/my_plugin" failonerror="true" overwrite="false">
    <fileset dir="${basedir}" includes="META-INF/,Database Elements/" />
</copy>
The relevant Ant documentation says that includes contains a "comma- or space-separated list of patterns". Thus (since my directory name contains a space and was copied literally into the includes attribute value) I think the copy task was trying to include a file named Database and a directory named Elements/. Neither existed, so they were quietly ignored. I suspect the same problem would have bitten if I had a comma in my directory name, but I did not test this.
Since I use Eclipse 3.5 interactively, I decided to finally decouple my headless build's Eclipse instance from my target platform (which remains at 3.2 for the moment) and to update my headless PDE Build to 3.5 (by attempting to produce a minimal PDE Build configuration from my interactive instance's plug-ins). Now, the generated build.xml contains this instead:
<copy todir="${destination.temp.folder}/my_plugin" failonerror="true" overwrite="true">
    <fileset dir="${basedir}">
        <include name="META-INF/"/>
        <include name="Database Elements/"/>
    </fileset>
</copy>
The relevant Ant documentation this time indicates that the only special characters in an individual include are * and ?. Indeed, the bug seems to have been fixed sometime between 3.2 and 3.5: my 3.5-based headless PDE Build now produces a plugin that contains Database Elements; my tests are happy; my code is happy; I'm happy.

Ant scripts cccheckin/cccheckout using the CCRC plugin to eclipse?

Is it possible to use Ant scripts to checkin/checkout source code elements while using the CCRC plugin to eclipse? I am getting an error message saying that the element the script is attempting to check out is not part of the VOB, but of course it is there and I can check it out manually.
It should be possible to use those Ant ClearCase tasks with CCRC views ("web views", which are analogous to snapshot views).
A script like this one should work:
<project name="Testing ClearCase" default="CC" basedir=".">
    <target name="CC">
        <property name="FileSrc" value="MyView/MyVob/MyDir"/>
        <property name="dist" value="dist"/>
        <cccheckout viewpath="${FileSrc}/myFile"
                    reserved="false"
                    nowarn="true"
                    comment="Auto Build from script"
                    failonerr="false"/>
        <copy file="${dist}/myFile" tofile="${FileSrc}/myFile"/>
        <cccheckin viewpath="${FileSrc}/myFile"
                   comment="Checked in by myFile.xml ANT script"
                   nowarn="false"
                   failonerr="false"
                   identical="true"/>
    </target>
</project>
But you need to make sure your current directory is (in this script) just above where you update your web CCRC view "myView".
The only issues I know of are:
if CCRC tries to check out a file from a replicated VOB;
if the parent directory of a file to be checked in was renamed from another view.
The Ant ClearCase tasks in VonC's answer use the cleartool command (getClearToolCommand() in org.apache.tools.ant.taskdefs.optional.clearcase.ClearCase.java). When I invoke a cleartool operation, even from within or above the CCRC view, I get the error message from the question.
Now (as some years have passed since VonC's answer) there is a CCRC CLI that can be used instead (http://www-01.ibm.com/support/docview.wss?uid=swg24021929, setting CCSHARED to your top-level \eclipse directory). The commands are similar to those provided by cleartool, although it appears not to support UCM, so to solve your problem of doing a checkout I first had to set an activity on the stream using the CCRC Eclipse plugin.
To get the CCRC CLI to work with the Ant ClearCase tasks would require changing the task to call rcleartool rather than cleartool.
Since cleartool points to an .exe and rcleartool is a .bat that loads a JAR, ProcessBuilder won't be able to process the new command (I tested with rcleartool.bat and cmd /c rcleartool.bat) unless you convert the JAR to an .exe.