I am trying to run an analysis that captures the values of locators when running Selenium tests. Since I have a large number of projects and versions to test, I want to avoid modifying the project files directly, so I resorted to a javaagent (built with Byte Buddy). The idea is to instrument the test classes so they report the information I need. Hence, in the pom.xml of the project under analysis I had to set up the Surefire plugin as follows:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>3.0.0-M1</version>
  <configuration>
    <argLine>@{argLine} -javaagent:\path\to\my\agent.jar</argLine>
    <forkCount>0</forkCount>
  </configuration>
</plugin>
The argLine field lets me attach the agent to the Surefire process, and forkCount set to 0 forces Surefire to run the tests in a single JVM (the Maven process itself), which should give me visibility on all the classes that are loaded (I think).
As for the agent, I have the premain method as follows:
public static void premain(final String agentArgs, final Instrumentation inst) {
    System.out.println("Starting to collect metrics");
    new AgentBuilder.Default()
        .with(new AgentBuilder.InitializationStrategy.SelfInjection.Eager())
        .type(ElementMatchers.any())
        .transform(new LocatorReporterTransformer())
        .with(AgentBuilder.TypeStrategy.Default.REDEFINE)
        .installOn(inst);
}
I used ElementMatchers.any() to be sure I could see all the classes intercepted by my agent. In the LocatorReporterTransformer class, I have the transform(...) method as follows:
@Override
public DynamicType.Builder<?> transform(DynamicType.Builder<?> builder,
                                        TypeDescription typeDescription,
                                        ClassLoader classLoader,
                                        JavaModule javaModule) {
    System.out.println(typeDescription.getName());
    return builder;
}
My goal was to capture some of the classes from my tests, but it seems they are never loaded into the JVM, which is weird to me. Thus, my question is: how can I safely add a javaagent to Surefire and make sure it can access all the classes? Is there a way for the javaagent to capture all the subprocesses of a target?
The classes were not all visible because more than one process was executing. It seems that forkCount=0 did not do the trick. One solution is to attach the agent to the Maven process itself using MAVEN_OPTS and then use forkMode=never. Here is the invocation that works in my case:
set "MAVEN_OPTS=-javaagent:\path\to\agent.jar"
mvn test -DforkMode=never
Related
I am creating a managed object inside App.java (the main class of my module). I am using the Guice library with the Dropwizard framework, and I get this exception only when running from IntelliJ; if I run the same code with mvn it works perfectly fine, which is weird and beyond my understanding. If someone has experienced something like this or has a theory about it, please share. Feel free to ask for any details.
environment.lifecycle().manage(new Managed() {
    @Override
    public void start() throws Exception {
    }

    @Override
    public void stop() throws Exception {
    }
});
Exception stack trace:
Exception in thread "main" com.google.inject.CreationException: Unable to create injector, see the following errors:
1) Injecting into inner classes is not supported. Please use a 'static' class (top-level or nested) instead of com.phonepe.growth.App$4.
at ru.vyarus.dropwizard.guice.module.installer.InstallerModule.bindExtension(InstallerModule.java:191) (via modules: ru.vyarus.dropwizard.guice.module.GuiceSupportModule -> ru.vyarus.dropwizard.guice.module.installer.InstallerModule)
1 error
at com.google.inject.internal.Errors.throwCreationExceptionIfErrorsExist(Errors.java:470)
at com.google.inject.internal.InternalInjectorCreator.initializeStatically(InternalInjectorCreator.java:155)
at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:107)
at com.google.inject.Guice.createInjector(Guice.java:99)
at ru.vyarus.dropwizard.guice.injector.DefaultInjectorFactory.createInjector(DefaultInjectorFactory.java:20)
at ru.vyarus.dropwizard.guice.GuiceBundle.createInjector(GuiceBundle.java:191)
at ru.vyarus.dropwizard.guice.GuiceBundle.run(GuiceBundle.java:138)
at ru.vyarus.dropwizard.guice.GuiceBundle.run(GuiceBundle.java:93)
at io.dropwizard.setup.Bootstrap.run(Bootstrap.java:200)
at io.dropwizard.cli.EnvironmentCommand.run(EnvironmentCommand.java:42)
at io.dropwizard.cli.ConfiguredCommand.run(ConfiguredCommand.java:85)
at io.dropwizard.cli.Cli.run(Cli.java:75)
at io.dropwizard.Application.run(Application.java:79)
at com.phonepe.growth.App.main(App.java:48)
This is actually a bug in guicey classpath scan (.enableAutoConfig).
Your application class is covered by the scan, so inner classes are also visible: when you start the app from IDEA, the anonymous Managed inner class (App$4, created by the compiler for new Managed() {...}) is detected and registered as an extension, which also means registration in the Guice context. But Guice can't instantiate the inner class and throws an error.
You can enable extra diagnostic messages with .printLifecyclePhasesDetailed() (on the Guice bundle) and see that the additional extension does indeed appear when running from IDEA.
Most likely, when you run the app from Maven it is built into a jar first and then launched. In that case the classpath scan works a bit differently and doesn't see the inner classes (inside the jar), so everything works.
Please note that you don't need to instantiate and register the managed object (and other common objects) manually: you can simply declare the Managed implementation as a separate top-level class and guicey will find it and register it properly (both in Guice and Dropwizard). This is the expected way of registering extensions, especially together with classpath scan.
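For illustration, such a standalone extension could look like the sketch below. The class name and package are assumptions; the point is only that it is a top-level class implementing Dropwizard's Managed interface, placed somewhere guicey's classpath scan covers:

```java
package com.phonepe.growth; // assumed package, covered by guicey's classpath scan

import io.dropwizard.lifecycle.Managed;

// A top-level Managed implementation: guicey's classpath scan can detect it and
// register it in both the Guice context and the Dropwizard lifecycle, so no
// manual environment.lifecycle().manage(...) call is needed.
public class AppManaged implements Managed {

    @Override
    public void start() throws Exception {
        // acquire resources here
    }

    @Override
    public void stop() throws Exception {
        // release resources here
    }
}
```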
I have an MSF4J application in package com.a.sample1 and I want to scan some components in com.a.sample2. Is there a way to do that in MSF4J? I am using:
public static void main(String[] args) {
MSF4JSpringApplication
.run(Application.class, args);
}
I can't put my application in the com.a package to scan both sample1 and sample2 automatically; one reason is that com.a.sample2 comes from an external library.
In Spring Boot, if the components, JPA repositories, or entities are not in subpackages of Application.java's package, we need to specify them explicitly. Is this at all possible in MSF4J?
Though I am still waiting for an answer on how to scan a package other than the application's package, there is a workaround. I created an annotation and imported a configuration class through that annotation.
So, when you add the annotation (created in sample1) to a class in sample2, it imports the configuration from sample1 and loads those beans into sample2.
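A sketch of that workaround, assuming the annotation lives in sample1 (the annotation and configuration class names here are made up; @Import is Spring's standard mechanism for pulling a configuration class into a context):

```java
package com.a.sample1;

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.springframework.context.annotation.Import;

// Placing this annotation on a class in com.a.sample2 imports sample1's
// configuration (and the beans it declares) into sample2's context.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@Import(Sample1Configuration.class) // hypothetical @Configuration class in sample1
public @interface EnableSample1 {
}
```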
I checked the MSF4J sources and found that the scan starts only from the package of the Application class passed as the first argument to the run method: https://github.com/wso2/msf4j/blob/release-2.1.0/spring/src/main/java/org/wso2/msf4j/spring/MSF4JSpringApplication.java#L165
Unfortunately it is a private method, so you cannot change this behavior.
On the other hand, the "source" argument (the first argument of the run method) is used only to determine the autoscan package, so you can simply place any DummyClass into the com.a package and run it via:
MSF4JSpringApplication
.run(DummyClass.class, args);
I have a Maven web application that has the database EJB jar as a dependency.
The database EJB module is the one that holds all the JPA entities and the persistence.xml file, so it is responsible for all database operations.
I just read http://arquillian.org/guides/testing_java_persistence/ and it explains how to test persistence using arquillian.
The tutorial considers that the persistence.xml file is in the webapp path, so it adds META-INF/persistence.xml as a resource.
So I'd like to know: how do I add the database module's persistence.xml when running the Arquillian tests from my webapp? Is that possible?
Maybe the answer comes a little late, but anyhow I hope it's still of value for you:
You have two options: either read the archive from a file (maybe generated by mvn package) or create the archive yourself using ShrinkWrap.
Option (1), called from somewhere annotated with @Deployment:
/** Maven did it for us .. we just have to read the file */
private static Archive<?> getArchiveFromFile() {
    JavaArchive artifact = ShrinkWrap.create(ZipImporter.class, ARCHIVE_NAME)
            .importFrom(ARCHIVE_FILE)
            .as(JavaArchive.class);
    return artifact;
}
Option (2), I found it useful to inspect the file from time to time so there's an option to write it to the file system:
/** Doing it the hard way ... guess you won't like it, as EVERY class plus related stuff needs to be specified */
private static Archive<?> getArchiveManually() {
    // creating the archive manually
    JavaArchive artifact = ShrinkWrap.create(JavaArchive.class, ARCHIVE_NAME)
            .addPackage(FooServiceBean.class.getPackage())
            .addPackage(Foo.class.getPackage())
            .addPackage(FooService.class.getPackage())
            .addAsResource("META-INF/persistence.xml")
            .addAsResource("META-INF/beans.xml");
    // so we might write it out for further inspection
    if (WRITE_ARCHIVE) {
        artifact.as(ZipExporter.class).exportTo(new File("D:/abc/xyz/tmp/" + ARCHIVE_NAME), true);
    }
    return artifact;
}
So your answer is included within the second option ;-)
I've got an outstanding issue in jasmine-maven-plugin and I can't figure it out.
You're welcome to try this out yourself, but the gist is that when one runs:
mvn jasmine:test
The properties configured in the pom.xml for the plugin are not set on the Mojo bean.
Upon inspection it's pretty clear that each property on the bean is falling back on its default value. However, when you run the test phase itself (which jasmine:test is bound to), like:
mvn test
It works fine.
Any ideas? The preamble at the top of the TestMojo looks like:
/**
 * @component
 * @goal test
 * @phase test
 * @execute lifecycle="jasmine-lifecycle" phase="process-test-resources"
 */
Update: Now I'm even more confused. Upon further reading, it seems this behavior is really unexpected, since the configuration that I'm seeing as missing is done in a <configuration> element right under the plugin, not under an <execution/>, per this document:
Note: Configurations inside the <executions> tag differ from those that are outside it in that they cannot be used from a direct command line invocation. Instead they are only applied when the lifecycle phase they are bound to is invoked. Alternatively, if you move a configuration section outside of the executions section, it will apply globally to all invocations of the plugin.
And of course I'm an idiot. I was looking at the wrong POM, and sure enough the configuration was inside an <execution> block.
So I'll try to feed Google by answering my own question in big bold letters:
When you invoke a Maven goal from the command line, it will only pick up your pom.xml's configuration element if that configuration was made directly under the <plugin/> element, and not under any <execution/> element.
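In pom.xml terms, the distinction looks like this (plugin coordinates as used by jasmine-maven-plugin; the two configuration parameters are illustrative):

```xml
<plugin>
  <groupId>com.github.searls</groupId>
  <artifactId>jasmine-maven-plugin</artifactId>
  <!-- picked up by "mvn jasmine:test" AND by bound lifecycle phases -->
  <configuration>
    <jsSrcDir>src/main/js</jsSrcDir>
  </configuration>
  <executions>
    <execution>
      <goals><goal>test</goal></goals>
      <!-- picked up ONLY when the bound phase (test) runs, e.g. "mvn test" -->
      <configuration>
        <jsTestSrcDir>src/test/js</jsTestSrcDir>
      </configuration>
    </execution>
  </executions>
</plugin>
```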
I work on a complex, multi-module maven project. One of the modules is an executable jar that implements a command-line application. I need to integration test this application. I need to run it several times, with several different command-lines and validate the exit status and stdout/err. However, I can't find a plugin for maven that claims to support this, and also can't track down a JUnit library that supports testing command-line applications.
Before you say "don't test the main method; do X instead": in this case I really do mean to test the main method, not some subsidiary functionality. The whole point is to run the application as a user would, in its own VM and environment, and validate that it is behaving itself: parsing command-line options correctly, exiting with the right status, and hot-loading the right classes from the right plugin jars.
My current hack is to use apache-exec from within a junit test method. It appears to be working, but is quite fiddly to set up.
public void testCommandlineApp()
    throws IOException
{
    // resolveScriptNameForOS appends the platform-specific suffix (.sh/.bat)
    CommandLine cl = new CommandLine(resolveScriptNameForOS("./runme"));
    cl.addArgument("inputFile.xml");
    DefaultExecutor exec = new DefaultExecutor(); // commons-exec executor (was missing above)
    exec.setWorkingDirectory(workingDir);
    exec.setExitValues(new int[] { 0, 1, 2 }); // accept these exit codes without throwing
    int exitCode = exec.execute(cl);
    assertEquals("Exit code should be zero", 0, exitCode);
}
Why not simply use a shell script, using the maven-exec-plugin to build your classpath?
STDOUT=$(mvn exec:java -Dexec.mainClass=yourMainClass -Dexec.args="--arg1 --arg2=value2")
RETURN_CODE=$?
# validate STDOUT
# validate RETURN_CODE
You can even use something like shunit2 if you prefer a more structured approach.
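The validation pattern itself can be sketched as below, using `echo` as a stand-in for the real `mvn exec:java` invocation (the command and expected output are placeholders):

```shell
#!/bin/sh
# Run the command, capturing stdout and the exit status.
# `echo` here stands in for the real application invocation.
STDOUT=$(echo "expected output")
RETURN_CODE=$?

# validate STDOUT
[ "$STDOUT" = "expected output" ] || { echo "unexpected stdout: $STDOUT"; exit 1; }
# validate RETURN_CODE
[ "$RETURN_CODE" -eq 0 ] || { echo "unexpected exit code: $RETURN_CODE"; exit 1; }
echo "all checks passed"
```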