Parallel Execution of Selenium Script on One Physical Machine - selenium

I have a hybrid framework based on Selenium WebDriver. It takes around 2-3 hours to run the test suite I have right now. What is the best way to start running the tests in parallel on the same machine? (Even if I use Selenium Grid, how many nodes can I have at most on one machine, given that the same machine also has to act as the Hub?) I have the constraint of using only one physical machine, and I am not using TestNG.

Run it with Maven using the Failsafe plugin, configured to run in parallel with multiple threads.
Below is a sample setup for running JUnit tests in 5 threads, with test classes (placed in the default location src/test/java and matching the default class inclusion rules) running in parallel. The WebDriver instances are created in the @BeforeClass method and destroyed in the @AfterClass method; they are stored in a static ThreadLocal variable to keep them separate per thread.
pom.xml -
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<version>2.21.0</version>
<configuration>
<parallel>classes</parallel>
<threadCount>5</threadCount>
</configuration>
<executions>
<execution>
<id>integration-test</id>
<phase>integration-test</phase>
<goals>
<goal>integration-test</goal>
</goals>
</execution>
<execution>
<id>verify</id>
<phase>verify</phase>
<goals>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
BaseClass -
public class BaseTestIT {

    @BeforeClass
    public static void setup() {
        System.setProperty("webdriver.chrome.driver", "");
        WebDriver driver = new ChromeDriver();
        ThreadWebDriver.set(driver);
    }

    @AfterClass
    public static void teardown() {
        ThreadWebDriver.get().quit();
        ThreadWebDriver.remove();
    }
}
ThreadLocal -
public class ThreadWebDriver {

    private static final ThreadLocal<WebDriver> threadWebDriver = new InheritableThreadLocal<WebDriver>();

    private ThreadWebDriver() { }

    public static WebDriver get() {
        return threadWebDriver.get();
    }

    public static void set(WebDriver driver) {
        threadWebDriver.set(driver);
    }

    public static void remove() {
        threadWebDriver.remove();
    }
}
TestOneClass - both of its methods will run in the same thread.
public class FirstTestIT extends BaseTestIT {

    @Test
    public void testOne() {
        System.out.println(Thread.currentThread().getId());
        WebDriver driver = ThreadWebDriver.get();
        driver.get("https://junit.org/junit4/");
        driver.manage().window().maximize();
    }

    @Test
    public void testTwo() {
        System.out.println(Thread.currentThread().getId());
        WebDriver driver = ThreadWebDriver.get();
        driver.get("https://junit.org/junit5/");
        driver.manage().window().maximize();
    }
}
TestTwoClass - its single method will run in a separate thread.
public class ThirdTestIT extends BaseTestIT {

    @Test
    public void test() {
        System.out.println("third one");
        WebDriver driver = ThreadWebDriver.get();
        driver.get("https://stackoverflow.com/questions/tagged/selenium");
        driver.manage().window().maximize();
    }
}
You can also look at a shared WebDriver concept, in which the driver is only shut down when the JVM exits. The linked example uses Cucumber, but it is easy to adapt for plain Selenium use: remove the Cucumber Before and After hooks and handle that part in JUnit hooks.
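For example, a minimal sketch of that shared-driver idea (the class name and the shutdown-hook approach here are my own illustration, not taken from the linked post) could look like this:
public class SharedWebDriver {

    private static WebDriver driver;

    private SharedWebDriver() { }

    // Lazily create one driver for the whole JVM and quit it only when the JVM exits.
    public static synchronized WebDriver get() {
        if (driver == null) {
            driver = new ChromeDriver();
            Runtime.getRuntime().addShutdownHook(new Thread(() -> driver.quit()));
        }
        return driver;
    }
}
Tests would then call SharedWebDriver.get() instead of creating their own instance, and no @AfterClass teardown is needed.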

Related

Unable to use an AspectJ interceptor in a non-Spring project

I am trying to use AspectJ to execute some code after certain method executions. I cannot use Spring AOP, as the project is not a Spring project and at this point I cannot convert it into one. I have tried a very simple implementation, shown below, but it is not working at all:
POM of my project
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>test.aspectj</groupId>
<artifactId>HelloAspectj</artifactId>
<version>0.0.1-SNAPSHOT</version>
<properties>
<maven.compiler.plugin.version>3.5.1</maven.compiler.plugin.version>
</properties>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>1.9.6</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjweaver</artifactId>
<version>1.9.6</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven.compiler.plugin.version}</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
A normal class and method after which the aspect methods will run:
package tester;
public class HelloWorld {

    private String name;

    public void setName(String name) {
        this.name = name;
    }

    public void printHello() {
        System.out.println("Spring 3 : Hello ! " + name);
    }
}
Aspect class
package tester;
import org.aspectj.lang.annotation.After;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
@Aspect
public class TestAspect {

    @Before(" call(void java.io.PrintStream.println(String)) " +
            "&& !within(net.andrewewhite.aspects..*)")
    public void beforePrintlnCall() {
        System.out.println("About to make call to print Hello World");
    }

    @After(" call(void java.io.PrintStream.println(String)) " +
           "&& !within(net.andrewewhite.aspects..*)")
    public void afterPrintlnCall() {
        System.out.println("Just made call to print Hello World");
    }
}
Main class
package tester;
public class MainApp {

    public static void main(String[] args) {
        System.out.println("Hello World");
    }
}
aop.xml
<aspectj>
<aspects>
<aspect name="tester.TestAspect"/>
</aspects>
</aspectj>
Project Structure:
Now I am expecting that it will print "About to make call to print Hello World" and "Just made call to print Hello World", but it is only printing "Hello World".
Can someone help here?
If you want to use compile-time weaving, use the Mojohaus AspectJ Maven plugin (for up to Java 8 and AspectJ 1.8.x) or Nick Wongdev's AspectJ Maven fork for Java 9+. Compiling with Javac via the Maven Compiler plugin is not enough.
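For example, a minimal compile-time weaving setup with the Mojohaus plugin could look roughly like this (the plugin and compliance versions below are assumptions; pick ones matching your JDK and your aspectjrt version):
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.11</version>
<configuration>
<complianceLevel>1.8</complianceLevel>
<source>1.8</source>
<target>1.8</target>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
</plugin>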
If you wish to use load-time weaving (LTW), it is okay to compile your aspects with Javac via the Maven Compiler plugin, as long as you only use the annotation-driven @AspectJ syntax. For native AspectJ syntax you always need the AspectJ compiler (Ajc) via the AspectJ Maven plugin, no matter what. For LTW you also need the weaving agent on the Java command line, via -javaagent:/path/to/aspectjweaver.jar. There is also a hot-attachment option, but that is advanced stuff: you need to know what you are doing, and the application must know that it wants to attach the weaver, so let's not go into that here, as you are clearly a beginner.
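Assuming you go the LTW route, your aop.xml also has to be found by the weaver: it must be on the classpath as META-INF/aop.xml (for example under src/main/resources/META-INF), and the application has to be started with the agent, roughly like this (paths and version are placeholders):
java -javaagent:/path/to/aspectjweaver-1.9.6.jar -cp target/classes tester.MainApp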
All of this is documented on the AspectJ website and the AspectJ Maven website. I have also answered numerous questions about AspectJ + Maven here; you should easily find some. Please search before asking questions; this website does not replace manuals and tutorials.

Not able to find my object created for DesiredCapabilities class

I have started building my first Appium test for Android, and for that I started writing my code.
I instantiated my DesiredCapabilities object, but when I try to use that reference, I cannot see it.
Link to Image for the issue:
Below are the dependencies added for my project:
<dependencies>
<!-- https://mvnrepository.com/artifact/org.seleniumhq.selenium/selenium-java -->
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>3.13.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.appium/java-client -->
<dependency>
<groupId>io.appium</groupId>
<artifactId>java-client</artifactId>
<version>6.1.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.testng/testng -->
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.14.3</version>
<scope>test</scope>
</dependency>
</dependencies>
Please help me figure out what is missing so I can proceed with coding.
Is there any jar missing that I need to add?
You cannot call a method on a reference directly in the class body, because there is no execution entry point there. You have to write that code inside a method, a constructor, or an initializer block (static or non-static). Refer to the examples below:
Way I
DesiredCapabilities capabilities = DesiredCapabilities.android();

// constructor
public FirstDemoClass() {
    capabilities.setCapability("deviceName", "emulator-5554");
}
Way II
DesiredCapabilities capabilities = DesiredCapabilities.android();

// method
public void setCapabilities() {
    capabilities.setCapability("deviceName", "emulator-5554");
}
Way III
static DesiredCapabilities capabilities = DesiredCapabilities.android();

// block
static {
    capabilities.setCapability("deviceName", "emulator-5554");
}

public static void main(String[] args) {
}
This should work. Let me know if anything is unclear.
Initialize your device capabilities like this:
public class FirstDemoClass {

    public static void main(String[] args) throws MalformedURLException {
        DesiredCapabilities caps = new DesiredCapabilities();
        caps.setCapability("deviceName", "Android phone");
        caps.setCapability("udid", "your device unique id");
        caps.setCapability("platformName", "Android");
        caps.setCapability("platformVersion", "phone version");
        caps.setCapability("appPackage", "your app package");
        caps.setCapability("appActivity", "your app activity");

        AppiumDriver<WebElement> driver = new AndroidDriver<WebElement>(
                new URL("http://127.0.0.1:4723/wd/hub"), caps);
    }
}

Read data from Redis to Flink

I have been trying to find a connector to read data from Redis into Flink. Flink's documentation describes a connector for writing to Redis, but I need to read data from Redis in my Flink job. In Using Apache Flink for data streaming, Fabian mentions that it is possible to read data from Redis. Which connector can be used for this purpose?
We are running one in production that looks roughly like this
class RedisSource extends RichSourceFunction[SomeDataType] {

  var client: RedisClient = _

  override def open(parameters: Configuration) = {
    client = RedisClient() // init connection etc
  }

  @volatile var isRunning = true

  override def cancel(): Unit = {
    isRunning = false
    client.close()
  }

  override def run(ctx: SourceContext[SomeDataType]): Unit = while (isRunning) {
    for {
      data <- ??? // get some data from the redis client
    } yield ctx.collect(SomeDataType(data))
  }
}
I think it really depends on what you need to fetch from Redis. The above could be used to fetch a message from a list/queue, transform/push it, and then delete it from the queue.
Redis also supports Pub/Sub, so it would be possible to subscribe, grab the SourceContext and push messages downstream.
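A rough sketch of that Pub/Sub variant, written here in Java with the Jedis client (the host, port and channel name are placeholders, and this is my own illustration rather than an official connector):
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.source.RichSourceFunction;
import redis.clients.jedis.Jedis;
import redis.clients.jedis.JedisPubSub;

public class RedisPubSubSource extends RichSourceFunction<String> {

    private transient Jedis jedis;
    private transient JedisPubSub subscriber;

    @Override
    public void open(Configuration parameters) {
        jedis = new Jedis("localhost", 6379); // init connection
    }

    @Override
    public void run(SourceContext<String> ctx) {
        subscriber = new JedisPubSub() {
            @Override
            public void onMessage(String channel, String message) {
                ctx.collect(message); // push each published message downstream
            }
        };
        jedis.subscribe(subscriber, "my-channel"); // blocks until unsubscribe() is called
    }

    @Override
    public void cancel() {
        if (subscriber != null) {
            subscriber.unsubscribe(); // unblocks the subscribe() call in run()
        }
        if (jedis != null) {
            jedis.close();
        }
    }
}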
Currently, a Flink Redis connector is not available, but one can be implemented by extending the RichSinkFunction/SinkFunction class.
public class RedisSink extends RichSinkFunction<String> {

    @Override
    public void open(Configuration parameters) throws Exception {
        // open redis connection
    }

    @Override
    public void invoke(String map) throws Exception {
        // sink data to redis
    }

    @Override
    public void close() throws Exception {
        super.close();
    }
}
There's been a bit of discussion about having a streaming redis source connector for Apache Flink (see FLINK-3033), but there isn't one available. It shouldn't be difficult to implement one, however.
One of the challenges in getting your Flink program to use Jedis to talk to Redis is getting the appropriate libraries into the JAR file you submit to Flink. Without them, you will get stack traces indicating that certain classes are undefined. Here is a snippet of a Maven pom.xml I created to move Jedis and its dependency, Apache commons-pool2, into my JAR.
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.9</version>
<executions>
<execution>
<id>unpack</id>
<!-- executed just before the package phase -->
<!-- https://ci.apache.org/projects/flink/flink-docs-release-1.3/dev/linking.html -->
<phase>prepare-package</phase>
<goals>
<goal>unpack</goal>
</goals>
<configuration>
<artifactItems>
<artifactItem>
<groupId>org.apache.commons</groupId>
<artifactId>commons-pool2</artifactId>
<version>2.4.2</version>
<type>jar</type>
<overWrite>false</overWrite>
<outputDirectory>${project.build.directory}/classes</outputDirectory>
<includes>org/apache/commons/**</includes>
</artifactItem>
<artifactItem>
<groupId>redis.clients</groupId>
<artifactId>jedis</artifactId>
<version>2.9.0</version>
<type>jar</type>
<overWrite>false</overWrite>
<outputDirectory>${project.build.directory}/classes</outputDirectory>
<includes>redis/clients/**</includes>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>

Attaching attachments in Allure reports using cucumber jvm after hooks

With the Allure report framework, when a step fails we can attach a screenshot or logs by calling a method annotated with @Attachment.
@Attachment(value = "Message", type = "text/plain")
public String attachLog() {
    return "Hello, Test failed!";
}
But this means I have to explicitly call this method (attachLog()) in every step before the assertions, which seems unreasonable.
In Cucumber-JVM the After hooks are a great way to attach screenshots or logs: we check the scenario status and attach screenshots/logs based on the outcome.
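A typical hook of that kind (a sketch; the class name is hypothetical and the imports assume cucumber-jvm's cucumber.api packages and Allure 1.x) looks roughly like this:
import cucumber.api.Scenario;
import cucumber.api.java.After;
import ru.yandex.qatools.allure.annotations.Attachment;

public class Hooks {

    @After
    public void afterScenario(Scenario scenario) {
        if (scenario.isFailed()) {
            attachLog(); // attach only for failed scenarios
        }
    }

    @Attachment(value = "Message", type = "text/plain")
    public String attachLog() {
        return "Hello, Test failed!";
    }
}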
I tried invoking the above method (attachLog()) from the Cucumber-JVM After hook, but unfortunately it did not work.
Is there a solution to make this work?
Cheers
Vinod
You can override the test failure method from ru.yandex.qatools.allure.cucumberjvm.AllureRunListener:
public class CustomAllureListener extends AllureRunListener {

    @Override
    public void testFailure(Failure failure) {
        super.testFailure(failure);
        if (!failure.getDescription().isSuite()) { // check is needed to avoid double attaching
            attachFailed();
        }
    }

    @Attachment(value = "Message", type = "text/plain")
    public String attachFailed() {
        return "Test failed!";
    }
}
And don't forget to register the listener in the pom.xml file:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.18.1</version>
<configuration>
<testFailureIgnore>false</testFailureIgnore>
<argLine>
-javaagent:${settings.localRepository}/org/aspectj/aspectjweaver/${aspectj.version}/aspectjweaver-${aspectj.version}.jar
</argLine>
<properties>
<property>
<name>listener</name>
<value>com.mycompany.testing.CustomAllureListener</value>
</property>
</properties>
</configuration>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjweaver</artifactId>
<version>${aspectj.version}</version>
</dependency>
</dependencies>
</plugin>

maven2 - why is the failsafe plugin ignoring my JUnit annotations?

I have set up a java/maven project in order to perform tests this way:
unit tests are executed with the surefire plugin
integration tests are executed with the failsafe plugin
here is the POM (ugly compact formatting):
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.sample</groupId>
<artifactId>sample-service</artifactId>
<version>0.0.0</version>
<name>sdp-sample-service</name>
<build> <plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration><debug>true</debug><source>1.6</source><target>1.6</target></configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId><artifactId>maven-surefire-plugin</artifactId><version>2.6</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId><artifactId>maven-failsafe-plugin</artifactId><version>2.6</version>
<executions><execution>
<id>integration-test</id>
<phase>integration-test</phase>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
<configuration>
<junitArtifactName>none:none</junitArtifactName>
<failIfNoTests>false</failIfNoTests>
<testFailureIgnore>true</testFailureIgnore>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<dependency><groupId>junit</groupId><artifactId>junit</artifactId><version>4.8.2</version><scope>test</scope></dependency>
</dependencies>
</project>
I have a sample unit test class looking like this (again in ugly compact format):
package org.sample;import java.util.logging.Logger;import org.junit.*;
public class SampleUnitTest {
private static final Logger LOG = Logger.getLogger("SampleUnitTest");
@BeforeClass public static void beforeClass() {LOG.info("@BeforeClass");}
@Before public void before() {LOG.info("@Before");}
@AfterClass public static void afterClass() { LOG.info("@AfterClass");}
@After public void after() { LOG.info("@After"); }
@Test public void test1() { LOG.info("test1");}
@Test public void test2() { LOG.info("test2");}
}
I have the exact same Integration test:
package org.sample;import java.util.logging.Logger;import org.junit.*;
public class SampleIT {
private static final Logger LOG = Logger.getLogger("SampleIT");
@BeforeClass public static void beforeClass() {LOG.info("@BeforeClass");}
@Before public void before() {LOG.info("@Before");}
@AfterClass public static void afterClass() { LOG.info("@AfterClass");}
@After public void after() { LOG.info("@After"); }
@Test public void test1() { LOG.info("test1");}
@Test public void test2() { LOG.info("test2");}
}
And maven output is:
$ mvn clean install
...
[INFO] [surefire:test {execution: default-test}]
...
-------------------------------------------------------
T E S T S
-------------------------------------------------------
Running org.sample.SampleUnitTest
6 janv. 2011 14:38:38 org.sample.SampleUnitTest beforeClass
INFO: @BeforeClass
6 janv. 2011 14:38:38 org.sample.SampleUnitTest before
INFO: @Before
6 janv. 2011 14:38:38 org.sample.SampleUnitTest test1
INFO: test1
6 janv. 2011 14:38:38 org.sample.SampleUnitTest after
INFO: @After
6 janv. 2011 14:38:38 org.sample.SampleUnitTest before
INFO: @Before
6 janv. 2011 14:38:38 org.sample.SampleUnitTest test2
INFO: test2
6 janv. 2011 14:38:38 org.sample.SampleUnitTest after
INFO: @After
6 janv. 2011 14:38:38 org.sample.SampleUnitTest afterClass
INFO: @AfterClass
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.062 sec
Results :
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0
...
[INFO] [failsafe:integration-test {execution: integration-test}]
...
-------------------------------------------------------
T E S T S
-------------------------------------------------------
Running org.sample.SampleIT
6 janv. 2011 14:38:38 org.sample.SampleIT test1
INFO: test1
6 janv. 2011 14:38:38 org.sample.SampleIT test2
INFO: test2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.015 sec
Results :
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0
...
Question: why do the failsafe integration tests totally ignore my JUnit annotations?
Remove
<junitArtifactName>none:none</junitArtifactName>
from the configuration. It forces Failsafe (which is built on Surefire) to run in JUnit 3 mode, so your JUnit 4 annotations are ignored.
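With that line removed, and JUnit 4.8.2 already on the test classpath, Failsafe should pick up the JUnit 4 provider automatically; the execution block from the POM above then reduces to something like this (same compact style, only the junitArtifactName element dropped):
<plugin>
<groupId>org.apache.maven.plugins</groupId><artifactId>maven-failsafe-plugin</artifactId><version>2.6</version>
<executions><execution>
<id>integration-test</id>
<phase>integration-test</phase>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
<configuration>
<failIfNoTests>false</failIfNoTests>
<testFailureIgnore>true</testFailureIgnore>
</configuration>
</execution></executions>
</plugin>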