How to achieve data-driven testing using Postman and Newman?

I have a requirement to automate APIs and also integrate them with an Azure DevOps pipeline. I am currently using Cypress and have been successful in doing so.
My client wants to use Postman for automation.
I have to run a single API with multiple combinations, such as different sets of query parameters with different request bodies.
I know we can achieve data-driven testing with Cypress fixtures, but can we do the same with Postman? If yes, how can we integrate it with an Azure Pipeline to run different combinations of data?

Data-driven testing in Postman is straightforward: each CSV header is a variable name.
Create a CSV file called data.csv:
age
1
2
3
Now reference the variable in any part of the request as {{age}}.
E.g., let the request body be:
{
  "age": "{{age}}"
}
Now run the collection using Newman or the Collection Runner. For CI/CD integration we should use Newman. Export the collection as JSON and run:
newman run collection.json -d data.csv
That's it. Only one CSV file can be used per run, but you can run the same collection against different data sets by re-running the command with a different file passed to -d.
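For the Azure Pipelines part, here is a minimal sketch of the shell commands a pipeline script step could run; the data file names and the JUnit report paths are assumptions, and the agent needs Node.js available:

# install newman on the build agent
npm install -g newman
# run the same collection once per data file, emitting JUnit results for the pipeline
newman run collection.json -d data1.csv --reporters cli,junit --reporter-junit-export results-data1.xml
newman run collection.json -d data2.csv --reporters cli,junit --reporter-junit-export results-data2.xml

The JUnit files can then be picked up by a Publish Test Results task so each data combination shows up in the pipeline's test report.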

Related

Executing a Pentaho transformation (ktr) using Node.js with Pentaho CE

I am able to successfully execute the .ktr files using a browser as well as the Postman tool, using the URL below:
http://localhost:8089/kettle/executeTrans/?trans=D:\Pentaho\ktr\MyJson_to_Database.ktr
But I want to automate the process, and the ktr needs to accept a JSON file as input (right now the JSON data is inside the ktr file itself). As I am using Node.js to automate the ktr execution, I am trying to use wreck and the POST method to execute it (I am new to wreck). I am facing difficulties identifying whether the error is due to wreck or the Kettle transformation itself.
In the meantime, I am trying to execute it without passing the path as a query string in the URL; instead I want to pass it in the body. I have searched Google with no success so far.
EDIT 1
I am able to reach the ktr file from the Node.js microservice, and now the challenge is to read the file path inside the Docker image.
Could it work to store the JSON data in a file, and modify the transformation to read that file and pass along the information in it?
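A minimal shell sketch of that idea; the input file path and the INPUT_JSON variable are placeholders, and the transformation would have to be modified to read the file:

# write the incoming JSON to a file the transformation is configured to read
echo "$INPUT_JSON" > /data/input/payload.json
# then trigger the transformation via Carte, as the question already does
curl "http://localhost:8089/kettle/executeTrans/?trans=D:\Pentaho\ktr\MyJson_to_Database.ktr"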

How to get the count of projects, repositories and teams created in Azure DevOps?

How can I get the list of projects, repositories and teams created in Azure DevOps, as well as the lists of administrators, contributors and pull request approvers?
I have seen the API mentioned in the Azure docs, but it returns JSON containing a lot of information, and it is really difficult to pull out the project and repository names.
How can I get that data into an Excel or Word document?
The best option looks like the REST APIs, which return JSON, but I am afraid there is no easy way to export that to Excel or Word.
For people searching for the exact REST API endpoints to fetch them, here are the details:
How to get git repositories?
You can follow these steps to see the count of repositories:
Login to Azure DevOps
Open one of the following links, substituting your organization and project name:
https://dev.azure.com/{ORGANIZATION_NAME}/{PROJECT_NAME}/_apis/git/repositories/
https://dev.azure.com/{ORGANIZATION_NAME}/{PROJECT_ID}/_apis/git/repositories/
Scroll to the end of the JSON response and you should see the count at the end of the page.
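To do the same from a script instead of the browser, here is a hedged sketch using curl with a personal access token and jq; the AZURE_DEVOPS_PAT variable and the api-version value are assumptions:

# list the repository count and names for a project
curl -s -u ":$AZURE_DEVOPS_PAT" \
  "https://dev.azure.com/{ORGANIZATION_NAME}/{PROJECT_NAME}/_apis/git/repositories?api-version=6.0" \
  | jq -r '.count, .value[].name'

The same pattern works for the projects and teams endpoints below.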
How to get projects?
Follow step 1 and then open the link below
https://dev.azure.com/{ORGANIZATION_NAME}/_apis/projects
How to get teams?
Follow step 1 and then open the link below
https://dev.azure.com/{ORGANIZATION_NAME}/_apis/teams
How to find the project Id?
In Chrome, open the developer tools, open Azure DevOps and navigate to your project; in the Network tab you should then see calls that include the project ID.
I am afraid there is no out-of-the-box method to get that data into an Excel or Word document at this moment. We cannot get the list of projects and repositories via queries and export it to an Excel or Word document.
To achieve this, you can follow the advice of Shayki Abramczyk and use the REST API.
After getting the response in JSON format, parse it with PowerShell or another script, for example:
PS Script Get All Team Projects Listed as HTML–TFS/VSTS
Besides, when we parse the JSON, we can even export the data to a CSV file:
Ticket: How to export data to CSV in PowerShell?
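As an alternative to the PowerShell route, the same export can be sketched in shell with curl and jq; the organization name, PAT variable and chosen columns are assumptions:

# fetch all projects and write name/id pairs to a CSV file
curl -s -u ":$AZURE_DEVOPS_PAT" \
  "https://dev.azure.com/{ORGANIZATION_NAME}/_apis/projects?api-version=6.0" \
  | jq -r '.value[] | [.name, .id] | @csv' > projects.csv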
Hope this helps.
List all repos and count the output lines, assuming you have access to them:
# get repo details
$repo_list = az devops project list --query 'value[].name' -o tsv | % { az repos list --project $_ -o tsv }
# count the output lines
az devops project list --query 'value[].name' -o tsv | % { az repos list --project $_ -o tsv } | wc -l
Note: this was run in PowerShell on a machine with Git Bash installed; wc is not normally a PowerShell command (Measure-Object -Line is the native equivalent).
The best way to retrieve the list of projects and teams is via the OData feed. You can use Excel or Power BI to retrieve the data. Here is how to retrieve the list of teams and projects from Azure DevOps: https://learn.microsoft.com/en-us/azure/devops/report/powerbi/access-analytics-power-bi?view=azure-devops

How to run multiple test classes with a single command?

I've started digging into how to run integration tests with Flutter and the pros and cons involved.
Aside from the test output sometimes not being helpful at all, I'd like to know if there's a way to run multiple Dart test targets with a single command.
For example, say I have 4 files in my test_driver folder:
login.dart
login_test.dart
register.dart
register_test.dart
In order to execute the tests on one of them, I'd do:
flutter drive --target=test_driver/login.dart
Is there any way I can avoid doing:
flutter drive --target=test_driver/login.dart
flutter drive --target=test_driver/register.dart
A possible workaround I have in mind is to create a bash script that collects all the files that don't end with "_test.dart" and runs that command for each one (see the sketch below).
Thanks!
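A minimal bash sketch of that workaround, assuming all driver entry points live directly under test_driver/ and the *_test.dart files are their matching test files:

#!/usr/bin/env bash
# run flutter drive once per driver file, skipping the *_test.dart companions
for target in test_driver/*.dart; do
  case "$target" in
    *_test.dart) continue ;;   # skip instrumented test files
  esac
  flutter drive --target="$target" || exit 1
done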

AWS Data pipeline postStepCommand unable to access INPUT1_STAGING_DIR

In the EmrActivity of a Data Pipeline, I am trying to use postStepCommand (as documented here) to invoke a shell script. As part of it, I am trying to access the standard directory paths ${INPUT1_STAGING_DIR} and ${OUTPUT1_STAGING_DIR}.
But it seems it's not able to access their values. Is this by design?

Execute a script in Pentaho DI using an URL

I am new to this tool.
I am trying to load or execute a PHP script (http:\.....\re_certification/re_certification.php) in Pentaho. I don't know what tool I can use to do it.
Any idea or example?
Use the HTTP Client step; it will fetch a URL using the given parameters: http://wiki.pentaho.com/display/EAI/HTTP+Client.
Do not forget that it needs to be triggered; use a Generate Rows or any other input step before it.
There is also a sample at samples/transformations/HTTP Client - simple retrieval example.ktr.