How to read test data from an external JSON file and compare it with a Postman response?

I have an array of test data in a JSON file on my local machine. I am writing a test in Postman and trying to compare the response against this test data. Instead of doing it in the Collection Runner, is it possible to pass the test data into the Tests tab and compare it with the response?

This is not something that is currently supported in Postman:
https://github.com/postmanlabs/postman-app-support/issues/7210
Alternatively, what you could do:
Copy/paste the content of the file (assuming it's not huge) into a variable, which you could then do a string comparison against.
Host the content of the file at an endpoint you can request in the pre-request script, save it to a variable, then do the string comparison.
Neither solution is pretty, in my opinion, but they're probably the best you can do.

Blue Prism web service call and CSV download

I am trying to configure Blue Prism to make an API call, where the response of this API is a CSV file.
Currently I have configured the web service with a "GET" command on the base URL.
But I am not sure what needs to be done in order to download/save the CSV file that the API sends. I am assuming that it needs to be told explicitly.
Please help!
In your process, make the HTTP request to the API using the Utility - HTTP VBO's HTTP Request action. Store the Result output parameter in a Text-typed Data Item.
If you need to process the data in a tabular format...
Once the CSV data is there, you can use the Get CSV As Collection action in the Utility - Strings VBO to parse the CSV content into a collection.
If you need to simply save the file...
... use the Utility - File Management VBO's Write Text File action and point it to the location you need to save the CSV to.

Karate UI automation: How can I read the content of a generated PDF/Word/Excel file using Karate UI automation? [duplicate]

I have an export-to-Excel feature in our application.
For this I have one scenario:
Perform the export to Excel
Validate the API response status and the exported Excel content.
With Postman, I am able to save the exported Excel file in .xlsx format using the "Send and Download" option, and later I validate the content (column headers and row values) manually.
Is there any way to automate this scenario end to end through API automation?
Currently, I am doing a GET operation (Karate framework) which is returning these headers in the response:
Content-Type → application/vnd.openxmlformats-officedocument.spreadsheetml.sheet
Accept-Ranges → bytes
Body: stream objects which are not human-readable.
Status: 200 OK
If e2e automation is not possible/feasible, what should the acceptance criteria for automation be in this case then?
Two options:
If you are sure the binary contents of the file will never change, do a binary comparison; see this example in upload-image.feature: And match response == read('karate-logo.jpg')
You have to write some custom code. There are Java libraries to read Excel. Use one of those, read the data, and then compare it with the expected results. Refer to the docs on Java interop and write a helper function to do this.
EDIT - also see this answer: https://stackoverflow.com/a/53050249/143475
I found a way to implement the solution with Java and Karate. Below are the steps:
Send your responseBytes to a Java Utility class. This is how I did it in the Karate feature file:
And def helper = Java.type('com.java.Utility')
And def excel = helper.ByteArrayToExcel(responseBytes)
In the Utility class, you will have a ByteArrayToExcel method which will contain this code:
import java.io.File;
import org.apache.commons.io.FileUtils;
// Write the raw response bytes to disk as an .xlsx file
FileUtils.writeByteArrayToFile(
        new File("src/test/java/testdata/Actual_Response.xlsx"), responseBytes);
Now you will have the Excel file in the specified location.
Write a method to compare two Excel files (the actual one and your expected one for the particular request); give it a boolean return type. Google it and you will find the code (a sketch of such a method is also shown after these steps).
In Karate, use the boolean like this:
And match excelCompareResult == true
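For reference, a minimal sketch of such a comparison method using Apache POI (this assumes the poi-ooxml dependency is on the classpath; the ExcelComparator class and compare method names are just illustrative, not part of the original answer):

import java.io.File;
import org.apache.poi.ss.usermodel.*;

public class ExcelComparator {

    // Returns true when the first sheets of both workbooks contain identical cell text.
    public static boolean compare(String actualPath, String expectedPath) throws Exception {
        try (Workbook actualWb = WorkbookFactory.create(new File(actualPath));
             Workbook expectedWb = WorkbookFactory.create(new File(expectedPath))) {
            Sheet actual = actualWb.getSheetAt(0);
            Sheet expected = expectedWb.getSheetAt(0);
            if (actual.getLastRowNum() != expected.getLastRowNum()) {
                return false;
            }
            DataFormatter fmt = new DataFormatter();
            for (Row expectedRow : expected) {
                Row actualRow = actual.getRow(expectedRow.getRowNum());
                if (actualRow == null) {
                    return false;
                }
                for (Cell expectedCell : expectedRow) {
                    Cell actualCell = actualRow.getCell(expectedCell.getColumnIndex());
                    String expectedText = fmt.formatCellValue(expectedCell);
                    String actualText = actualCell == null ? "" : fmt.formatCellValue(actualCell);
                    if (!expectedText.equals(actualText)) {
                        return false;
                    }
                }
            }
            return true;
        }
    }
}

From Karate you would expose it the same way as the Utility class above, e.g. And def comparator = Java.type('com.java.ExcelComparator') and then And def excelCompareResult = comparator.compare(actualPath, expectedPath) (the package name is assumed for the example).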
Hope it will help.

JMeter - Adding a JSON response from a file and setting it as the response body in a GET request

I'm doing some JSON validation. I need to be able to add a local JSON file as the response in a Dummy Sampler.
This is because I need to validate new JSON files every day.
I figured out that I might use a JSR223 PostProcessor to get the file from a folder and then use SampleResult.setResponseData to set the JSON.
Does anyone have a clue how I do this?
The easiest way is using the __FileToString() function: put it into the "Response Data" field of the Dummy Sampler and it will read the file from the file system and respond with the file's contents.
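For example, assuming the file lives at /data/test.json (an illustrative path, not from the original answer), the "Response Data" field would contain just:

${__FileToString(/data/test.json)}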
If you have a folder with multiple JSON files and want to read them sequentially or randomly, consider using the Directory Listing Config plugin.

Play: Automating test data setup

I have a Play Framework project that has reached beta/user testing.
For this testing we require test data to exist in the environment.
I am looking for a way to automate this via scripts.
The best way will be via calls to the API, passing correctly shaped data based on the models in the project (thus dependent on the project, not external).
Are there any existing SBT plugins that I could utilise that would be able to create the appropriate JSON and pass it to the API to set up the environment?
Why do you need a plugin for this? I think what you want to do is to have a set of JSON, then call the endpoints and see what the response from the back-end is. In the case of "setting up" based on a call that carries JSON, you could use FakeRequest in your tests:
val application = new GuiceApplicationBuilder().build()
val response = route(application, FakeRequest(POST, "/end-point")).get
contentAsString(response) must include("where is Json")
In your test you can also check the response from the back-end against the JSON you are feeding it:
Create a set of JSON using Writes, based on a case class you are using in the back-end. You could also purposely create invalid JSON, for example one that is missing a field or has an invalid structure.
Use table-driven testing, sending FakeRequest with the body/header containing your JSON, and then check the result against the expected values.
I'm on the move; when I get home, I can write some example code here.

Pentaho REST Client with variable URL

I'm new to Pentaho and am using the REST Client. I can get the REST Client to work by using Generate Rows for the URL. But then I need to pass part of the JSON result as part of the URL for the next request. I'm not sure how to do this. Any suggestions?
Remember that PDI works with streams: for each REST request you make, you will have one row as a result. You will have as many rows as requests you make.
I'm not sure if you can deserialize the JSON object directly from the PDI interface, but in the worst case you can use the "User Defined Java Class" step with some external library (like Gson) to deserialize the object.
Then you can create another variable in the UDJC step and concatenate the attributes you need onto the URL string that comes from the previous step.
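For illustration, a rough sketch of what that deserialization could look like with Gson (the NextUrlBuilder class, the buildNextUrl method and the nextId attribute are made up for the example; inside a real UDJC step the JSON string would arrive as a field of the incoming row rather than as a method parameter):

import com.google.gson.Gson;
import com.google.gson.JsonObject;

public class NextUrlBuilder {

    // Pulls one attribute out of the previous REST response and appends it to the base URL.
    public static String buildNextUrl(String jsonResponse, String baseUrl) {
        JsonObject obj = new Gson().fromJson(jsonResponse, JsonObject.class);
        String nextId = obj.get("nextId").getAsString(); // "nextId" is an assumed attribute name
        return baseUrl + "/" + nextId;
    }
}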
On the other hand, you can use "Modified Javascript" to deserialize it and return the attributes you need, then concatenate them with the URL. To use it, just declare variables inside the code, and then use the "Get variables" button to retrieve the available fields to send to the next step.
There are many ways to do it; I suggest you use the Modified Javascript because it's easier to handle.
You CAN parse the JSON response: just use JSON Input as the next step, and then use a JSONPath expression to parse the field you want: $.result.the.thing.u.want.