Arquillian Suite Extension - Should we add all test classes in Suite - jboss-arquillian

When using the Arquillian Suite Extension, should we add the other test classes to the deployment ourselves using ShrinkWrap, or should Arquillian do that?
Currently I am trying to use Arquillian Suite, but it only adds the class annotated with @RunWith; all the other classes that use the same deployment are not added, and a ClassNotFoundException is thrown.
Please help.

You didn't mention for which class the ClassNotFoundException is being thrown, so please update your original question.
Btw., it seems that you simply missed some classes in your @Deployment method. Consider using the addPackages method to add all relevant classes at once.
@Deployment
public static JavaArchive createDeployment() {
    JavaArchive jar = ShrinkWrap.create(JavaArchive.class)
            .addPackages(true, "my.package")
            .addAsManifestResource(EmptyAsset.INSTANCE, "beans.xml");
    return jar;
}
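If you would rather list the classes explicitly than pull in whole packages, ShrinkWrap also offers addClass/addClasses. A minimal sketch (FirstDatabaseTest and SecondDatabaseTest are placeholder names for your own test classes that share the deployment):

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.asset.EmptyAsset;
import org.jboss.shrinkwrap.api.spec.JavaArchive;

@Deployment
public static JavaArchive createDeployment() {
    // add each test class (and any helpers) that relies on this deployment
    return ShrinkWrap.create(JavaArchive.class)
            .addClasses(FirstDatabaseTest.class, SecondDatabaseTest.class)
            .addAsManifestResource(EmptyAsset.INSTANCE, "beans.xml");
}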

Related

How to execute Cucumber step definitions with TestNG annotations

I am supposed to migrate to Cucumber. I have a project framework with Selenium, TestNG (data-driven) and Maven, and I am exploring the feasibility of Cucumber with TestNG annotations.
My question is: how can we create a connection between a @Test method and a Cucumber step definition? Say, for example, our code is written in @BeforeClass, @Test and @AfterClass methods. How can we migrate that to step definitions?
Feature File:
Feature: Is it Friday yet?
  Everybody wants to know when it's Friday

  Scenario: Sunday isn't Friday
    Given today is Sunday
    When I ask whether it's Friday yet
Step Definition:
@Given("^today is Sunday$")
public void today_is_Sunday() {
    // Write code here that turns the phrase above into concrete actions
    System.out.println("this is demo1");
}

@When("^I ask whether it's Friday yet$")
public void i_ask_whether_is_s_Friday_yet() {
    // Write code here that turns the phrase above into concrete actions
    System.out.println("this is demo2");
}
Class Execution:
@CucumberOptions(features = "cfeature/firstDemo.feature", glue = { "mytest/Stepd" })
public class demo01 extends AbstractTestNGCucumberTests {

    private TestNGCucumberRunner tcr;

    @BeforeClass(alwaysRun = true)
    public void beforeClass() throws Exception {
        tcr = new TestNGCucumberRunner(this.getClass());
    }

    @Test(groups = "cucumber", description = "Runs CucumberFeature")
    public void testdemo() {
        System.out.println("Hello");
    }

    @AfterClass(alwaysRun = true)
    public void afterClass() {
        tcr.finish();
    }
}
Console:
Hello
Undefined scenarios:
cfeature/firstDemo.feature:4 # Sunday isn't Friday
1 Scenarios (1 undefined)
5 Steps (5 undefined)
0m0.073s
You can implement missing steps with the snippets below:
As of now, only the @Test annotation is being invoked. But how do I replace it with the step definitions? Please assist.
Not sure what the confusion is here. Here's how you can relate TestNG and Cucumber terminology:
A <test> tag in TestNG can be visualized as a feature file in Cucumber.
A @Test method in TestNG can be visualized as a scenario in Cucumber.
A step definition in Cucumber has no direct equivalent in TestNG, because it is part of a scenario. But for the sake of understanding, you can visualize it as one line of code that performs a logical operation in TestNG.
The default implementation of AbstractTestNGCucumberTests does the following (a simplified sketch is shown right after this list):
It contains an internal data provider which supplies one feature file at a time.
It contains a @Test method bound to that data provider, which retrieves all the scenarios in the feature file and then runs them one after the other.
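Roughly like this; note that this is only a simplified sketch of how the 1.2.x Cucumber JVM bindings implement it, and the method names, parameter types and data provider change between versions, so check the source of the version you actually use:

import cucumber.api.testng.CucumberFeatureWrapper;
import cucumber.api.testng.TestNGCucumberRunner;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public abstract class AbstractTestNGCucumberTests {

    private TestNGCucumberRunner testNGCucumberRunner;

    @BeforeClass(alwaysRun = true)
    public void setUpClass() {
        testNGCucumberRunner = new TestNGCucumberRunner(this.getClass());
    }

    // Bound to the data provider below: invoked once per feature file,
    // and runs all scenarios of that feature one after the other.
    @Test(groups = "cucumber", dataProvider = "features")
    public void feature(CucumberFeatureWrapper cucumberFeature) {
        testNGCucumberRunner.runCucumber(cucumberFeature.getCucumberFeature());
    }

    // Supplies one feature file at a time to the @Test method above.
    @DataProvider
    public Object[][] features() {
        return testNGCucumberRunner.provideFeatures();
    }

    @AfterClass(alwaysRun = true)
    public void tearDownClass() {
        testNGCucumberRunner.finish();
    }
}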
You can build your own variant of AbstractTestNGCucumberTests to do different things (such as supporting concurrent scenario execution, which is currently not available in the Cucumber JVM bindings).
As an example, you can take a look at the Cucumber-roadrunner library that I built, which uses the above concept to support parallel scenario execution and also provides thread-safe reports.
With respect to the message you are facing, viz. "You can implement missing steps with the snippets below:", it basically means that the Cucumber JVM bindings were not able to bind your feature file to any glue code (which is what you provide via the @CucumberOptions annotation). You should take a closer look at the Cucumber JVM bindings documentation to understand how to provide the correct values.
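For example, something along these lines usually works; the paths and names here are assumptions (feature files under src/test/resources/cfeature, step definitions in a package called mytest.stepd, a runner class named RunCucumberTest), so adjust them to your actual layout:

import cucumber.api.CucumberOptions;
import cucumber.api.testng.AbstractTestNGCucumberTests;

@CucumberOptions(
        features = "src/test/resources/cfeature/firstDemo.feature", // where the .feature files live
        glue = { "mytest.stepd" }                                    // package(s) containing the step definitions
)
public class RunCucumberTest extends AbstractTestNGCucumberTests {
    // no @Test needed here: the inherited runner executes the scenarios
}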
You can also take a look at Gherkin with QAF, which is a pure TestNG implementation of Gherkin. It uses TestNG (NOT the Cucumber runner) and gives you all the features of TestNG, including parallel execution, listeners, grouping, priority, etc.
Each scenario is converted into a TestNG test, and you can run scenarios in parallel. Furthermore, you can also use built-in or custom data providers while authoring BDD. No additional runner is needed; just configure as usual, using the appropriate factory class for the BDD syntax you are using.

Using jdk with javac instead of ant

I wrote a sample app while trying out Java.
calculator/Calc.java
package calculator;

public abstract class Calc {
    public static float Add(float f1, float f2) {
        return f1 + f2;
    }

    public static float Subtract(float f1, float f2) {
        return f1 - f2;
    }
}
Main.java
import calculator.Calc;

public class Main {
    public static void main(String[] args) {
        System.out.println("1+1=" + Calc.Add(1f, 1f));
        System.out.println("1-0.5=" + Calc.Subtract(1f, 0.5f));
    }
}
I managed to compile these classes into their class files using the JDK.
Browsing the web, I found a tool called Ant which can compile files too. Compiling with Ant also gives me the Java class files.
What I want to know is: what's the need for Ant when I can compile my classes directly using the JDK, without writing any configuration file? Is investing some time into Ant any good when I can build my code using the JDK straight away?
Ant is a build automation tool. It can be used not only to compile classes but also to run them, test them and generate Javadocs. It is mainly used for independent deployment and/or testing of an application.
What's the need for Ant when I can compile my classes directly using the JDK without writing any configuration file?
Check this out.
Is investing some time into Ant any good when I can build my code using the JDK straight away?
It depends on what you want to do with your code. Look up Maven and Gradle as well.
Gradle is the newest among them and arguably better as well.
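To give a feel for what a build file buys you over calling javac by hand, here is a minimal Ant build.xml sketch for the sample above; the directory names are assumptions, so adjust them to your layout:

<project name="calculator-sample" default="compile" basedir=".">
    <!-- where the compiled .class files go -->
    <property name="build.dir" value="build/classes"/>

    <target name="compile">
        <mkdir dir="${build.dir}"/>
        <!-- compiles Main.java and calculator/Calc.java in one step -->
        <javac srcdir="." destdir="${build.dir}" includeantruntime="false"/>
    </target>

    <target name="run" depends="compile">
        <java classname="Main" classpath="${build.dir}"/>
    </target>

    <target name="clean">
        <delete dir="build"/>
    </target>
</project>

With that in place, ant compile, ant run and ant clean replace the individual javac/java invocations, and further targets (tests, Javadoc, packaging) can be added incrementally.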

Define a missing method through AOP?

I'm in a situation where the version of a library we are using is newer than the version one of our dependencies was coded against. E.g. the dependency uses MyLibrary-1.0 and we're using MyLibrary-2.0.
In the newer version a deprecated method has been removed, which causes run-time errors for us.
I'm trying to use AOP (Spring AOP to be specific) to intercept calls made to the missing method and proxy them to an existing method... but I can't seem to get the aspect right.
It feels like Java raises the java.lang.NoSuchMethodError before my aspect has an opportunity to intercept the call. Is there some trick I'm missing, or is this just not feasible (i.e. must the method exist in order to aspect it)?
#Before("execution(* com.mylibrary.SomeClass.*(..))")
Fails with java.lang.NoSuchMethodError
#Around("target(com.mylibrary.SomeClass) && execution(* missingMethod(..))")
Fails with java.lang.NoSuchMethodError
Assuming that you are talking about a 3rd-party library which is independent of Spring, you cannot use Spring AOP with its proxy-based "AOP lite" approach, which only works for public, non-static methods of Spring components. Please use the more powerful AspectJ instead. The Spring manual explains how to integrate full AspectJ with load-time weaving (LTW) into Spring applications. If your application is not based on Spring so far and you just wanted to use the framework because of Spring AOP, you can skip the whole Spring stuff altogether and use plain AspectJ.
The feature you want to use is an inter-type declaration (ITD), more specifically AspectJ's ability to declare methods for existing classes. Here is some sample code:
3rd party library:
package org.library;

public class Utility {
    public String toHex(int number) {
        return Integer.toHexString(number);
    }

    // Let us assume that this method was removed from the new library version
    /*
    @Deprecated
    public String toOct(int number) {
        return Integer.toOctalString(number);
    }
    */
}
Let us assume that the method I commented out was just removed from the latest version your own project depends on, but you know how to re-implement it.
Project dependency depending on old version of 3rd party library:
package com.dependency;

import org.library.Utility;

public class MyDependency {
    public void doSomethingWith(int number) {
        System.out.println(number + " in octal = " + new Utility().toOct(number));
    }
}
Because the previously deprecated method Utility.toOct does not exist anymore in the version used by your own project, you will get NoSuchMethodError during runtime when calling MyDependency.doSomethingWith.
Your own application:
package de.scrum_master.app;

import org.library.Utility;
import com.dependency.MyDependency;

public class Application {
    public static void main(String[] args) {
        System.out.println("3333 in hexadecimal = " + new Utility().toHex(3333));
        new MyDependency().doSomethingWith(987);
    }
}
As you can see, the application also uses the same library, but a different method which still exists in the current version. Unfortunately, it also uses the dependency which relies on the existence of the removed method. So how should we repair this?
Aspect using ITD:
AspectJ to the rescue! We just add the missing method to the 3rd party library.
package de.scrum_master.aspect;

import org.library.Utility;

public aspect DeprecatedMethodProvider {
    public String Utility.toOct(int number) {
        return Integer.toOctalString(number);
    }
}
If you compile this project with the AspectJ compiler Ajc, it just works. In your real-life scenario, compile the aspect into its own aspect library, put the weaving agent aspectjweaver.jar on the JVM command line in order to activate LTW, and enjoy how it weaves the method into the library class via bytecode instrumentation during class loading.
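For LTW, the weaver also needs to be told which aspects to apply. A minimal sketch of the configuration, assuming the aspect above is compiled into a JAR on the classpath (the file goes into META-INF/aop.xml of that JAR or anywhere else on the classpath):

<!-- META-INF/aop.xml -->
<aspectj>
    <aspects>
        <aspect name="de.scrum_master.aspect.DeprecatedMethodProvider"/>
    </aspects>
    <weaver>
        <!-- restrict weaving to the library package that needs the extra method -->
        <include within="org.library..*"/>
    </weaver>
</aspectj>

The application is then started with something like java -javaagent:/path/to/aspectjweaver.jar -cp ... de.scrum_master.app.Application (the agent path here is an assumption; point it at your local AspectJ weaver JAR).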
Log output:
3333 in hexadecimal = d05
987 in octal = 1733
Et voilà! Enjoy. :-)
When the JVM loads a class, it resolves all dependencies in a "linking" phase: external classes, fields and methods. You can't get past this phase in your case, because the method is missing.
There are two modes in (Spring) AOP: proxying and weaving.
Proxying creates... a proxy around a class: the target class must exist and be loaded.
Weaving can happen before a class is loaded: when a classloader loads a class, a byte[] array is passed to the weaver, which can manipulate the class bytecode before the class is actually reified. This type of AOP can work in your case. However, it will not be an easy task.

Use different structure for grails unit tests

I have a very simple package structure, only one level deep for all my Grails artifacts (the name of the application, "estra"), because the Grails application structure already provides the separating folders. But when writing unit tests, all the classes end up inside the same estra.* package, and I want to keep them separated like this: estra.domain, estra.controllers, etc.
Right now everything works fine, but the tests are pretty simple. Will I face any problems in the future with dependency injection or something else?
No, the package name doesn't influence your tests, since in your test class you "say" which class is being tested using the @TestFor annotation. But remember that in unit tests you need to set your dependencies manually.
class ServiceOne {
    def serviceTwo
}

@TestFor(ServiceOne)
class ServiceOneTests {

    @Before
    public void setup() {
        service.serviceTwo = new ServiceTwo() //or mocked instance...
    }
}

Scripting Eclipse with Rhino: classloader belongs to the plugin providing Rhino, not the plugin using it

I am using Rhino to script an Eclipse (RCP) application. The problem is that from JavaScript I only have access to the classes available to the plug-in that provides Rhino, and not to all the classes available to the plug-in that runs the scripts.
The obvious answer would be to put Rhino in the scripting plug-in, but this doesn't work because it's already provided by one of the application's own plug-ins (which also provides things I need to script) and Eclipse always uses this version instead of the version closer to hand.
Is there a way to change the classloader used by Rhino, or is it possible to ensure that Eclipse loads the Rhino classes from one plug-in rather than another?
Thanks to Thilo's answer I used this:
import net.weissmann.tom.rhino.Activator; // Plugin activator class
import org.mozilla.javascript.tools.shell.Main;

public class JSServer extends Thread {
    //[...]
    public void run() {
        // recent versions of the Main class kindly export
        // the context factory
        Main.shellContextFactory.initApplicationClassLoader(
                Activator.class.getClassLoader()
        );
        //[...]
    }
}
Is there a way to change the classloader used by Rhino?
Rhino should be using the current Thread's ContextClassLoader. Try Thread.setContextClassLoader (don't forget to restore it).
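A minimal sketch of that idea, assuming you run the scripts from your own plug-in and that Activator (as in the snippet above) is a class whose classloader can see the types your scripts need:

import net.weissmann.tom.rhino.Activator; // your plug-in's activator, as in the snippet above
import org.mozilla.javascript.Context;
import org.mozilla.javascript.Scriptable;

public final class ScriptRunner {
    private ScriptRunner() {}

    public static Object evalWithPluginClassLoader(String source) {
        Thread thread = Thread.currentThread();
        ClassLoader previous = thread.getContextClassLoader();
        try {
            // let Rhino resolve Java classes through this plug-in's classloader
            thread.setContextClassLoader(Activator.class.getClassLoader());
            Context cx = Context.enter();
            try {
                Scriptable scope = cx.initStandardObjects();
                return cx.evaluateString(scope, source, "<script>", 1, null);
            } finally {
                Context.exit();
            }
        } finally {
            // restore the original loader so the rest of the application is unaffected
            thread.setContextClassLoader(previous);
        }
    }
}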
If that does not work, maybe you can create your own Rhino ContextFactory:
public final void initApplicationClassLoader(java.lang.ClassLoader loader)
Set explicit class loader to use when searching for Java classes.
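Sketched, that would look roughly like this (again, Activator stands for whichever class in your plug-in can see the classes your scripts need):

// uses org.mozilla.javascript.Context and org.mozilla.javascript.ContextFactory
ContextFactory factory = new ContextFactory();
factory.initApplicationClassLoader(Activator.class.getClassLoader());
Context cx = factory.enterContext(); // contexts from this factory search Java classes via that loader
try {
    // ... evaluate your scripts with cx ...
} finally {
    Context.exit();
}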
I don't know Rhino specifics, but you could consider using Eclipse "buddy classloading" with the "registered" policy.
Rhino's plug-in (net.weissmann.tom.rhino, say) would declare itself "open to extension" by specifying Eclipse-BuddyPolicy: registered in its MANIFEST.MF. Plug-ins with classes that Rhino should be able to see would specify Eclipse-RegisterBuddy: net.weissmann.tom.rhino and would need a bundle-level dependency on net.weissmann.tom.rhino.
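In MANIFEST.MF terms that would look something like this (the bundle ID net.weissmann.tom.rhino is taken from the snippets above; substitute your own):

In the plug-in that provides Rhino:
Eclipse-BuddyPolicy: registered

In each plug-in whose classes the scripts must be able to see:
Eclipse-RegisterBuddy: net.weissmann.tom.rhino
Require-Bundle: net.weissmann.tom.rhino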
http://www.eclipsezone.com/articles/eclipse-vms/
http://wiki.eclipse.org/index.php/Context_Class_Loader_Enhancements#Technical_Solution