Pass dynamic value to test method parameter using TestNG class - selenium

I am automating a web page that runs in a multi-threaded environment, so I am exporting every test result to the file system, and I want to keep every test result uniquely named for future reference. Is there a way to pass a file name as a parameter to a test method dynamically while calling it from a TestNG class?
I know we can pass parameters from the .xml file, but if I do that the values are effectively static and are visible to all the threads running in parallel.
The test class will be called from the main method as below:
public class Test {
    public static void main(String[] args) throws ParseException {
        try {
            TestNG testng = new TestNG();
            testng.setTestClasses(new Class[] { Testing.class });
            testng.run();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Below is my test method:
public class Testing {
    @Test
    @Parameters("filename")
    public void testMethod(String fileName) {
        System.out.println("filename is: " + fileName);
        // ---- remaining test logic -----
    }
}
Or can we use the TestListenerAdapter onStart() method to inject the parameter values?

If you want a unique file name you can just append a timestamp:
Date date = new Date();
// colons are not allowed in Windows file names, so use '-' in the time part
Format formatter = new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss");
String timeStamp = formatter.format(date);
String fileName = "TestResults-" + timeStamp;
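In a parallel run two threads can still produce the same timestamp, so as an extra safeguard (an addition beyond the original answer, not part of it) you could also append the thread ID:
String fileName = "TestResults-" + timeStamp + "-" + Thread.currentThread().getId();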

You can store your values in the ITestContext, which is available to all tests.
You can set up the values in a configuration method (@BeforeSuite, for example) or in a listener.
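A minimal sketch of that idea, assuming the dynamic value is a file name stored under a "filename" suite attribute (the attribute key and the timestamp-based value are illustrative, not from the original answer):

import org.testng.ITestContext;
import org.testng.annotations.BeforeSuite;
import org.testng.annotations.Test;

public class ContextParameterTest {

    @BeforeSuite
    public void setUpFileName(ITestContext context) {
        // compute the value dynamically and store it on the suite
        context.getSuite().setAttribute("filename", "TestResults-" + System.currentTimeMillis());
    }

    @Test
    public void testMethod(ITestContext context) {
        // TestNG injects the ITestContext when it is declared as a test method parameter
        String fileName = (String) context.getSuite().getAttribute("filename");
        System.out.println("filename is: " + fileName);
        // ---- remaining test logic -----
    }
}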

Pass Dynamic Parameters to TestNG suite during runtime
What the code below does:
I want to add a list of parameters to each test at runtime. These parameters are passed as Maven runtime arguments and are read using the System.getProperty() method, as shown below. Then these parameters are added to each test inside the suite and TestNG runs successfully. This can be really useful in other scenarios as well.
The code below reads the testng.xml file and adds the parameters to every test:
// assumes org.testng.TestNG, org.testng.xml.* and commons-io's FileUtils are on the classpath
List<String> parameters = Arrays.asList(System.getProperty("parameters").split(","));
TestNG tng = new TestNG();
File initialFile = new File("testng.xml");
InputStream inputStream = FileUtils.openInputStream(initialFile);
Parser p = new Parser(inputStream);
List<XmlSuite> suites = p.parseToList();
for (XmlSuite suite : suites) {
    List<XmlTest> tests = suite.getTests();
    for (XmlTest test : tests) {
        // build the map once per test: the original created a fresh map and called
        // setParameters() on every iteration, which overwrote all but the last value;
        // distinct keys ("parameter0", "parameter1", ...) are assumed here
        HashMap<String, String> parametersMap = new HashMap<>();
        for (int i = 0; i < parameters.size(); i++) {
            parametersMap.put("parameter" + i, parameters.get(i));
        }
        test.setParameters(parametersMap);
    }
}
tng.setXmlSuites(suites);
tng.run();
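With this in place the values can be supplied at runtime, for example (the property values here are purely illustrative):
mvn test -Dparameters=env1,env2,env3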

Related

Creating common test data for multiple feature files

My requirement is as follows:
I have a couple of .feature files. I want to create test data that would be common to all of these feature files. Once the test data is created the scenarios will be run from the feature files.
I also want some info back after the test data is created, e.g. the IDs of the data I have created, so I can use this info to call the APIs and add it to payloads in my scenarios.
I think we could do this by:
1. Creating a JUnit Java file. I define a static method with @BeforeClass in there and use Karate's Runner to run my create-test-data.feature file (I can use Karate to hit the application API to create some data). I define a property of type Object in my Java class and set it with the result of Runner.runFeature().
Then I create a separate feature file, test-data-details.feature, where I define my Java interop code, e.g.:
def test_data =
"""
var JavaOutput = Java.type('com.mycompany.JavaFile');
var testData = JavaOutput.propertyName;
"""
Now that I have my test data object in test-data-details.feature, I call this feature file (with callonce) in the Background section of the feature files that contain my test scenarios, so I can retrieve test data details like id, name, etc. that I can then use in the API request paths and payloads.
I am not sure the above design is the right way to go. I tried it, but I am getting an issue in my Java file, where getClass() below complains that it cannot be used in a static method.
@RunWith(Karate.class)
public class AccountRunner {
    public static Object job = null;

    @BeforeClass
    public static void create_job() {
        // this is the reported compile error: getClass() cannot be called from a static context
        Map<String, Object> result = Runner.runFeature(getClass(), "test-data.feature", null, true);
        job = result.get("job");
    }
}
Now, all of the above could be totally wrong; I need help on how to tackle this scenario in Karate.
Thanks
From your question I understand you have a common test data feature file which you want to run before all the tests, holding the response in a variable that can be used in all of your test features.
You can achieve this in karate-config.js using karate.callSingle().
In your karate-config.js:
config["testdata"] = karate.callSingle("test-data.feature")
Your test-data.feature will be executed once before all the tests, and the response is stored in testdata; you can use this variable directly in your features.
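For reference, a minimal karate-config.js sketch of this pattern (the classpath: prefix and the testdata key are assumptions to adapt to your project layout):

function fn() {
  var config = {};
  // karate.callSingle() caches the result, so test-data.feature runs only once per test run
  config.testdata = karate.callSingle('classpath:test-data.feature');
  return config;
}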
So I have implemented the following design:
I created two methods, one with the @BeforeClass and the other with the @AfterClass annotation, in my TestRunner.java file. In these methods I am able to call the specific data-creation and clean-up feature files and pass args as a JSON object.
@RunWith(Karate.class)
@KarateOptions(tags = {"~@ignore"})
public class AccountRunner {
    public static Map<String, Object> result = null;

    @BeforeClass
    public static void create_job() throws IOException, ParseException {
        Class<?> clazz = AccountRunner.class;
        URL file_loc = clazz.getResource("create-test-data-1.json");
        File file = new File(file_loc.getFile());
        JSONParser parser = new JSONParser();
        Object obj = parser.parse(new FileReader(file));
        JSONObject jsonObject = (JSONObject) obj;
        Map<String, Object> args = new HashMap<>();
        args.put("payload", jsonObject);
        result = Runner.runFeature(CommonFeatures.class, "create-data.feature", args, true);
    }

    @AfterClass
    public static void delete_investigation() {
        Map<String, Object> args = new HashMap<>();
        args.put("payload", result);
        Runner.runFeature(CommonFeatures.class, "delete-job.feature", args, true);
    }
}
To run these tests from the command line with "mvn test", I made the following changes in pom.xml:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>3.0.0-M3</version>
    <configuration>
        <includes>
            <include>**/*Runner.java</include>
        </includes>
    </configuration>
</plugin>
With this solution I am able to run my tests in the IDE by executing the runner directly, or from the command line. However, I have not found a way to run all of my tests with the Karate-suggested approach, where I have a *Test.java file at the test suite level and use the default Maven configuration with "mvn test". The features fail to run because the .feature files are called before the Runner file, which has the method to create the test data, is executed.
Maybe someone can suggest what else I could do to use the Karate approach of running a *Test.java file instead of each *Runner.java file.

How do I replace one test in an XML-defined suite with three others in TestNG?

Our team uses TestNG to run some tests in Selenium. We need to be able to run a given test on 3 different browsers (Chrome, Firefox, and [sadly] IE). We have a browser parameter on our base test class, and we could just declare three tests, one for each browser; however, we'd really like to be able to specify the browser value as "Standard 3" and have the test run on each browser automatically.
So, I've built a class that implements ISuiteListener and attempts to create the new tests on the fly. However, any way I try to add tests fails. That is, no new tests I try to add will be executed by the suite. It's like nothing I did actually changed anything.
Here's my code:
public class Standard3BrowserSuiteListener implements ISuiteListener {

    @Override
    public void onStart(final ISuite suite) {
        final XmlSuite xmlSuite = suite.getXmlSuite();
        final Map<String, String> suiteParameters = xmlSuite.getParameters();
        final List<XmlTest> currentTests = new ArrayList<XmlTest>(xmlSuite.getTests());
        final ArrayList<XmlTest> testsToRun = new ArrayList<XmlTest>(currentTests.size());
        for (final XmlTest test : currentTests) {
            final Browser browser;
            final Map<String, String> testParameters = test.getAllParameters();
            {
                String browserParameter = testParameters.get("browser");
                if (browserParameter == null) {
                    browserParameter = suiteParameters.get("browser");
                }
                browser = Util.Enums.getEnumValueByName(browserParameter, Browser.class);
            }
            if (browser == Browser.STANDARD_3) {
                XmlTest nextTest = cloneTestAndSetNameAndBrowser(xmlSuite, test, testParameters, "Chrome");
                xmlSuite.addTest(nextTest);
                testsToRun.add(nextTest); // alternate I've tried to no avail
                nextTest = cloneTestAndSetNameAndBrowser(xmlSuite, test, testParameters, "Firefox");
                xmlSuite.addTest(nextTest);
                testsToRun.add(nextTest); // alternate I've tried to no avail
                nextTest = cloneTestAndSetNameAndBrowser(xmlSuite, test, testParameters, "IE");
                xmlSuite.addTest(nextTest);
                testsToRun.add(nextTest); // alternate I've tried to no avail
            } else {
                testsToRun.add(test);
            }
        }
        // alternate to xmlSuite.addTest I've tried to no avail
        testsToRun.trimToSize();
        final List<XmlTest> suiteTests = xmlSuite.getTests();
        suiteTests.clear();
        suiteTests.addAll(testsToRun);
    }

    private XmlTest cloneTestAndSetNameAndBrowser(final XmlSuite xmlSuite, final XmlTest test,
            final Map<String, String> testParameters, final String browserName) {
        final XmlTest nextTest = (XmlTest) test.clone();
        final Map<String, String> nextParameters = new TreeMap<String, String>(testParameters);
        nextParameters.put("browser", browserName.toUpperCase());
        nextTest.setParameters(nextParameters);
        nextTest.setName(browserName);
        final List<XmlClass> testClasses = new ArrayList<XmlClass>(test.getClasses());
        nextTest.setClasses(testClasses);
        return nextTest;
    }

    @Override
    public void onFinish(final ISuite suite) {}
}
How can I replace the test with the browser value "Standard 3" with 3 tests and have it run properly? Thanks!
Here's what you need to do:
1. Upgrade to the latest released version of TestNG.
2. Build an implementation of org.testng.IAlterSuiteListener.
3. Move the implementation that you created in ISuiteListener into this listener implementation.
4. Wire in this listener via the <listeners> tag in your suite XML file (or) via ServiceLoaders (as described in the javadocs of this interface).
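As a rough sketch of steps 2-4 (the STANDARD_3 handling is simplified to a string comparison, and the class name is illustrative; a real implementation would reuse the cloneTestAndSetNameAndBrowser logic from the question):

import java.util.ArrayList;
import java.util.List;
import org.testng.IAlterSuiteListener;
import org.testng.xml.XmlSuite;
import org.testng.xml.XmlTest;

public class Standard3BrowserAlterSuiteListener implements IAlterSuiteListener {

    @Override
    public void alter(List<XmlSuite> suites) {
        // alter() runs before TestNG builds its runners, so changes to the
        // XmlSuite actually take effect (unlike ISuiteListener.onStart)
        for (XmlSuite suite : suites) {
            for (XmlTest test : new ArrayList<>(suite.getTests())) {
                if ("STANDARD_3".equalsIgnoreCase(test.getParameter("browser"))) {
                    suite.getTests().remove(test);
                    for (String browser : new String[] { "Chrome", "Firefox", "IE" }) {
                        XmlTest clone = (XmlTest) test.clone();
                        clone.setName(test.getName() + " - " + browser);
                        clone.addParameter("browser", browser.toUpperCase());
                        suite.getTests().add(clone);
                    }
                }
            }
        }
    }
}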

How to read excel file in cucumber project?

I am creating a test automation framework using Java, but I am not able to read an Excel file in Cucumber.
Is there any way to use the @DataProvider functionality of TestNG?
I do not want to use the data table of the feature file.
If you use Cucumber-JVM, I don't think you can make use of TestNG Data Providers without major hacks, or at least this is not a "Cucumber way" of doing things. The Data Table is the Cucumber equivalent of a TestNG Data Provider:
https://cucumber.io/docs/reference#data-tables
This is how you parametrise tests in Cucumber. I'm not saying the solution you are looking for can't be implemented; I'm saying you are most likely looking for the wrong thing. Cucumber-JVM makes use of DataProviders internally to handle features this way:
https://github.com/cucumber/cucumber-jvm/blob/master/testng/src/main/java/cucumber/api/testng/AbstractTestNGCucumberTests.java
In case it helps others:
here is a description of how to link a Cucumber Scenario Outline to data read from an Excel file:
https://startingwithseleniumwebdriver.blogspot.com/2017/04/getting-data-from-external-file-using.html
and here is a description of how to load data from an Excel file in a Cucumber feature file before executing the Scenario steps:
https://startingwithseleniumwebdriver.blogspot.com/2017/04/loading-data-from-external-file-to.html
In my case this was very useful, as for each Scenario step I had to load Excel data (data from multiple rows having the same group ID) in order to perform further validations. This way the Cucumber feature file was a bit cleaner, while the Excel file held all the details under the hood.
The ExcelBDD Java edition can solve this problem gracefully. Code example:
static Stream<Map<String, String>> provideExampleList() throws IOException {
    String filePath = TestWizard.getExcelBDDStartPath("excelbdd-test")
            + "excelbdd-test\\src\\test\\resources\\excel.xlsx";
    return Behavior.getExampleStream(filePath, "Expected1", "Scenario1");
}

@ParameterizedTest(name = "Test{index}:{0}")
@MethodSource("provideExampleList")
void testgetExampleWithExpected(Map<String, String> parameterMap) {
    assertNotNull(parameterMap.get("Header"));
    System.out.println(String.format("=======Header: %s=======", parameterMap.get("Header")));
    for (Map.Entry<String, String> param : parameterMap.entrySet()) {
        System.out.println(String.format("%s --- %s", param.getKey(), param.getValue()));
    }
}
More detail is available in the ExcelBDD Guideline By Java Example.
Here is an example of how to read test data from Excel:
public class Framework {
    static String TestDataPath = System.getProperty("user.dir") + "\\ExcelFiles\\TestData.xlsx";
    public static HashMap<String, HashMap<String, String>> hm1 = new HashMap<>();

    public static void ReadTestData() throws IOException {
        FileInputStream file = new FileInputStream(TestDataPath);
        XSSFWorkbook workbook = new XSSFWorkbook(file);
        XSSFSheet sheet = workbook.getSheet("Sheet1");
        Row headerRow = sheet.getRow(0);
        for (int i = 1; i < sheet.getPhysicalNumberOfRows(); i++) {
            Row currentRow = sheet.getRow(i);
            // the first cell of each row is the key (e.g. "Case1") under which the row data is stored
            Cell keyCell = currentRow.getCell(0);
            String key;
            switch (keyCell.getCellType()) {
            case Cell.CELL_TYPE_NUMERIC:
                key = String.valueOf(keyCell.getNumericCellValue());
                break;
            default:
                key = keyCell.getStringCellValue();
            }
            HashMap<String, String> currentHash = new HashMap<String, String>();
            for (int j = 0; j < currentRow.getPhysicalNumberOfCells(); j++) {
                Cell currentCell = currentRow.getCell(j);
                switch (currentCell.getCellType()) {
                case Cell.CELL_TYPE_STRING:
                    currentHash.put(headerRow.getCell(j).getStringCellValue(),
                            currentCell.getStringCellValue());
                    break;
                case Cell.CELL_TYPE_NUMERIC:
                    currentHash.put(headerRow.getCell(j).getStringCellValue(),
                            String.valueOf(currentCell.getNumericCellValue()));
                    break;
                }
            }
            hm1.put(key, currentHash);
        }
        workbook.close();
    }
}
Here is the model Cucumber feature file and test data:
Scenario Outline: Successful Login with Valid Credentials
    Given User is on Home Page
    When User Navigate to LogIn Page
    And User enters mandatory details of "<TextCase>"
    Then Message displayed Login Successfully

    Examples:
      | TextCase |
      | Case1    |
      | Case2    |
Test data image: https://i.stack.imgur.com/IjOap.png
Here is the model step definition file:
#When("^User enters mandatory details of \"([^\"]*)\"$")
public void user_enters_mandatory_details_of(String arg1) throws Throwable {
// Write code here that turns the phrase above into concrete actions
driver.FindElement("UserName").sendKeys(Framework.hm1.get(arg1).get("UserName"));
Framework.FindElement("Password").sendKeys(Framework.hm1.get(arg1).get("Password"));
}
Follow the above three steps in Cucumber and you will be able to read test data.

jbehave run only specific story

I have JBehave integrated with Selenium. I am running my tests through the command line as below:
C:\eclipse_workspace\MySeleniumTests>mvn clean test -Dwebdriver.firefox.bin="C:\Program Files\Mozilla\Firefox\firefox.exe"
I have used the jbehave-maven-plugin. Maven picks up all the Embedder implementations (JUnitStories in my case) from the source directory and executes them one by one. The configuration for that is <include>**/*Stories.java</include> in pom.xml.
It then looks for the relevant .story files in the specified directory and executes them. Say I have two story files, one.story and two.story; both of them are executed.
Over time the number of story files is going to increase, and I only want to execute specific story files. Is there a way to do this? I am thinking of passing specific story file names as runtime parameters but don't know what is required to make that happen.
I got it working with the code below:
mvn clean test -Dwebdriver.firefox.bin="C:\Program Files\Mozilla\Firefox\firefox.exe" -Dstory=myStory.story
Override the storyPaths() method in the embedder class as below:
public class MyTestStories extends JUnitStories /* InjectableEmbedder */ {

    @Override
    protected List<String> storyPaths() {
        List<String> storiesToRun = new ArrayList<String>();
        String storyProperty = System.getProperty("story");
        if (storyProperty == null || storyProperty.isEmpty()) {
            throw new RuntimeException("Please specify which stories to run");
        }
        String[] storyNames = storyProperty.split(",");
        StoryFinder sf = new StoryFinder();
        URL baseUrl = CodeLocations.codeLocationFromClass(this.getClass());
        for (String storyName : storyNames) {
            storiesToRun.addAll(sf.findPaths(baseUrl, storyName, ""));
        }
        return storiesToRun;
    }
}
Try the following:
mvn clean test -Dwebdriver.firefox.bin="C:\Program Files\Mozilla\Firefox\firefox.exe" -Djbehave.story.name=<story filename without extension (wildcards are supported)>
You should also use a custom test suite implementation:
public abstract class JBehaveTestSuite extends ThucydidesJUnitStories {

    private static final String STORY_NAME_PATTERN = "**/${jbehave.story.name:*}.story";

    public JBehaveTestSuite() {
        findStoriesCalled(storyNamesFromEnvironmentVariable());
    }

    @Override
    public void run() throws Throwable {
        super.run();
    }

    private String storyNamesFromEnvironmentVariable() {
        return SystemPropertyUtils.resolvePlaceholders(STORY_NAME_PATTERN);
    }
}

Running selenium against multiple browsers with MSTEST

I'm using Selenium and MSTest to drive it. My problem is that I want my entire suite to run against 3 different browsers (IE, Firefox and Chrome).
What I can't figure out is how to data-drive my tests at the suite level, or how to rerun the suite with different parameters.
I know I can add a data source to all my tests and have each individual test run against multiple browsers, but then I would have to duplicate the 2 data-source lines for every single test, which I don't think is a very good solution.
So does anybody know how I can data-drive my assembly initialize, or whether there's another solution?
This is what I did. The benefit of this approach is that it will work for any test framework (MSTest, NUnit, etc.) and the tests themselves don't need to be concerned with or know anything about browsers. You just need to make sure that the method name only occurs once in the inheritance hierarchy. I have used this approach for hundreds of tests and it works for me.
1. Have all tests inherit from a base test class (e.g. BaseTest).
2. BaseTest keeps all driver objects (IE, Firefox, Chrome) in an array (multiDriverList in my example below).
3. Have the following methods in BaseTest:
public void RunBrowserTest( [CallerMemberName] string methodName = null )
{
    foreach( IDriverWrapper driverWrapper in multiDriverList ) //list of browser drivers - Firefox, Chrome, etc. You will need to implement this.
    {
        var testMethods = GetAllPrivateMethods( this.GetType() );
        // declaringType is a string field on the base class naming the test class;
        // note: the original predicate compared pm.GetType() and checked Where(...) != null,
        // which is always true - Any(...) with pm.ParameterType is what was intended
        MethodInfo dynMethod = testMethods.Where(
            tm => ( FormatReflectionName( tm.Name ) == methodName ) &&
            ( FormatReflectionName( tm.DeclaringType.Name ) == declaringType ) &&
            ( tm.GetParameters().Any( pm => pm.ParameterType == typeof( IWebDriver ) ) ) ).Single();
        //runs the private method that has the same name, but taking a single IWebDriver argument
        dynMethod.Invoke( this, new object[] { driverWrapper.WebDriver } );
    }
}

//helper method to get all private methods in hierarchy, used in above method
private MethodInfo[] GetAllPrivateMethods( Type t )
{
    var testMethods = t.GetMethods( BindingFlags.NonPublic | BindingFlags.Instance );
    if( t.BaseType != null )
    {
        var baseTestMethods = GetAllPrivateMethods( t.BaseType );
        testMethods = testMethods.Concat( baseTestMethods ).ToArray();
    }
    return testMethods;
}

//Remove formatting from Generic methods
string FormatReflectionName( string nameIn )
{
    return Regex.Replace( nameIn, "(`.+)", match => "" );
}
Use as follows:
[TestMethod]
public void RunSomeKindOfTest()
{
    RunBrowserTest(); //calls method in step 3 above in the base class
}

private void RunSomeKindOfTest( IWebDriver driver )
{
    //The test. This will be called for each browser, passing in the appropriate driver in each case
    ...
}
To do this, we wrote a wrapper around WebDriver and we use a switch statement based on a property to select the browser type.
Here's a snippet. Using the DesiredCapabilities, you can tell the grid which browsers to execute against.
switch (Controller.Instance.Browser)
{
    case BrowserType.Explorer:
        var capabilities = DesiredCapabilities.InternetExplorer();
        capabilities.SetCapability("ignoreProtectedModeSettings", true);
        Driver = new ScreenShotRemoteWebDriver(new Uri(uri), capabilities, _commandTimeout);
        break;
    case BrowserType.Chrome:
        Driver = new ScreenShotRemoteWebDriver(new Uri(uri), DesiredCapabilities.Chrome(), _commandTimeout);
        break;
}
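Note that in recent Selenium releases DesiredCapabilities is deprecated; the browser-specific Options classes (ChromeOptions, InternetExplorerOptions, etc.) serve the same purpose and can be passed to RemoteWebDriver instead.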
This idea is better suited to an automated CI scenario than to interactive UI runs, but you can use a runsettings file and declare a parameter in it:
<?xml version='1.0' encoding='utf-8'?>
<RunSettings>
  <TestRunParameters>
    <Parameter name="SELENIUM_BROWSER" value="Firefox" />
  </TestRunParameters>
</RunSettings>
You'll need a TestContext property on your test class:
public TestContext TestContext { get; set; }
Then in your MSTest code, when you initialise the driver, you can check which browser you want to run:
switch (TestContext.Properties["SELENIUM_BROWSER"]?.ToString())
{
    case BrowserType.Chrome:
        return new ChromeDriver();
    case BrowserType.Edge:
        return new EdgeDriver();
    case BrowserType.Firefox:
        return new FirefoxDriver();
}
You would then run the suite of tests n times, once for each runsettings file
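For example (the file names are illustrative), each run points at a different settings file:
vstest.console.exe MyUiTests.dll /Settings:firefox.runsettings
or, with the dotnet CLI: dotnet test -s firefox.runsettings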