Validation of REST API with different input files each time - api

We have built a REST API that takes a file as input and returns the expected output/response content after the server-side logic executes.
However, we have hundreds of such input files for which we would like to validate and verify the result/output.
Could you please let us know whether there is any way to do this using a script, Postman, or any other tool, i.e. point it at a folder (which contains those input files) and have it make the API calls?
Thanks.

https://stackoverflow.com/a/64954126/6793637
Check this answer on how to pass a file path as a variable.
Then use a data-driven approach with a data file that lists the file paths, for example:
filePath
c:/1
c:/2
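
If you would rather drive this from a standalone script instead of Postman, here is a minimal Java sketch of the same idea. It assumes a hypothetical endpoint (https://example.com/api/validate), an input folder, and an "<input>.expected" naming convention for the expected responses; adjust all of these to your setup.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.stream.Stream;

    public class FolderApiValidator {

        // Hypothetical endpoint and folder; replace with your real values.
        private static final String API_URL = "https://example.com/api/validate";
        private static final Path INPUT_FOLDER = Path.of("C:/input-files");

        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();

            try (Stream<Path> files = Files.list(INPUT_FOLDER)) {
                for (Path file : (Iterable<Path>) files::iterator) {
                    // Send each input file as the request body.
                    HttpRequest request = HttpRequest.newBuilder()
                            .uri(URI.create(API_URL))
                            .header("Content-Type", "application/octet-stream")
                            .POST(HttpRequest.BodyPublishers.ofFile(file))
                            .build();

                    HttpResponse<String> response =
                            client.send(request, HttpResponse.BodyHandlers.ofString());

                    // Compare against an expected-output file kept next to the input,
                    // e.g. "payload.json.expected" (naming convention assumed here).
                    Path expected = file.resolveSibling(file.getFileName() + ".expected");
                    boolean ok = Files.exists(expected)
                            && Files.readString(expected).equals(response.body());

                    System.out.printf("%s -> HTTP %d, matches expected: %s%n",
                            file.getFileName(), response.statusCode(), ok);
                }
            }
        }
    }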

Related

karate api test: how to store all responses of a feature file in an array [duplicate]

Using Karate, I am making API calls sequentially. I need to store the API request and response for this sequential flow of APIs in separate text files for each API call.
I need to understand how I can achieve this.
I have tried Logback, which stores the entire execution log in a text file.
Take a look at karate.prevRequest, which will give you the "request". Then use some custom Java (or JS) code to write whatever you want to a file.
Personally, I think this is unnecessary because Karate's HTML report has everything you need. If someone is asking you to do this, please try to convince that person that this exercise is a waste of time.
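
As a rough illustration of the "custom Java code" route mentioned above, a tiny helper class (the package, class and method names here are made up for the example) could be called from a feature file via Karate's Java interop:

    package util; // hypothetical package name

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.StandardOpenOption;

    // Called from a feature file via Java interop, e.g.:
    //   * def FileUtil = Java.type('util.FileUtil')
    //   * eval FileUtil.append('target/api-log.txt', karate.prevRequest + '\n' + response)
    public class FileUtil {

        public static void append(String fileName, Object content) throws IOException {
            // Append the stringified content (request or response) to the given file.
            Files.writeString(Path.of(fileName),
                    String.valueOf(content) + System.lineSeparator(),
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        }
    }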

Simplest way/tool to automate API calls and to save Json results in a file?

What would be the simplest tool/editor (ideally for Mac) to run web API queries (against a stateless RESTful web API) in a loop and store the JSON results in a file?
It's basically very simple; I'm just trying to automate the following:
- a first call to get a list of IDs
- then, for each ID, a call to get a few values related to that ID. The values are returned as JSON, and I would like to store them in a file (CSV or Excel); a sketch of this loop is included after the answer below.
To test the queries, I've used "Advanced REST client" to set up a request with my authentication header and run a few test API queries. That works well, but now I basically want to create a script that fetches the whole data set and saves it to a file, with the idea of running this script from time to time. You can't do that with "Advanced REST client", right?
Sorry, it's not (yet!) a super advanced question, but any help would be greatly appreciated.
You may try Postman - definitely works on (accursed) Mac
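
If Postman turns out not to be enough and you do want a small script, the loop described in the question can also be written in plain Java. The sketch below assumes hypothetical endpoints (https://api.example.com/items returning a simple JSON array of IDs, and /items/{id} for the details) and a bearer token; it writes one CSV row per ID.

    import java.io.PrintWriter;
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class IdExporter {

        // Hypothetical endpoints and header; substitute your real API and token.
        private static final String LIST_URL = "https://api.example.com/items";
        private static final String ITEM_URL = "https://api.example.com/items/%s";
        private static final String AUTH_HEADER = "Bearer <token>";

        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();

            // 1. First call: fetch the list of IDs (assumed to be a flat JSON array like ["a","b"]).
            String idsJson = get(client, LIST_URL);
            String[] ids = idsJson.replaceAll("[\\[\\]\"\\s]", "").split(",");

            // 2. For each ID, fetch its details and append one CSV row (id, raw JSON).
            try (PrintWriter csv = new PrintWriter("results.csv")) {
                csv.println("id,json");
                for (String id : ids) {
                    String body = get(client, String.format(ITEM_URL, id));
                    csv.println(id + ",\"" + body.replace("\"", "\"\"") + "\"");
                }
            }
        }

        private static String get(HttpClient client, String url) throws Exception {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(url))
                    .header("Authorization", AUTH_HEADER)
                    .GET()
                    .build();
            return client.send(request, HttpResponse.BodyHandlers.ofString()).body();
        }
    }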

Direct URL to Evernote notebook, note and tag

Our product (Yoke.io) integrates with Evernote through its REST API. We need to generate a direct URL link to a specific notebook, note, or tag so that the user can click the link to access it.
However, the current URL format contains parameters named "ses", "sh" and "sds" in addition to the IDs for the notebook/note/tag. I have no idea what these parameters mean or whether they differ between users, platforms, etc.
For example, if I want to access a notebook with id "3ec5f3c1-bd4d-4f94-b924-367b13eaf3bc" and generate the following links:
https://www.evernote.com/Home.action#b=3ec5f3c1-bd4d-4f94-b924-367b13eaf3bc
https://www.evernote.com/Home.action#b=3ec5f3c1-bd4d-4f94-b924-367b13eaf3bc&ses=4&sh=1&sds=5&
The first link (#1) won't work, but the second link (#2) works.
I could hard-code the "ses", "sh" and "sds" parameters in the URL, but my feeling is that these parameters will change for different users.
Could anyone explain what these parameters are for and how I can generate a direct URL link to a specific Evernote notebook/note/tag?
Thanks a lot for your help.
Regards,
Tao
ses, sh and sds are hash parameters we use when serializing the state of the web client. If you try manipulating the hash parameters to get the web client into a certain state, note that these are undocumented APIs and are subject to change at any time. That said, they won't change super often.
None of those three parameters will change on a per-user basis; they represent the "view" of the client you're in at the time. The simplest way to get a URL in the format you like is to navigate to that view in the web client, copy the hash, and replace the note and/or notebook GUIDs in the URL (b for the notebook GUID and n for the note GUID).

How to find the source location of a dynamic token in JMeter?

I've been using the Fiddler tool to capture HTTP request-responses, then manually finding the source location of a dynamic token (in a recorded page). I then use a Regular Expression Extractor on that source page to extract and store the value of the dynamic token in a variable, and use that variable in later pages.
Just wondering if there's an easier way for this. Is there any tool in JMeter that can help us find the source location of a dynamic token?
Thank you,
--Ishti
As of May 2015, there is nothing available OOTB except saving requests/responses to a file with the View Results Tree listener and searching the resulting file, or searching each response in the View Results Tree GUI.
An option would be to write a BackendListenerClient implementation that writes the data to a JDBC or Elasticsearch instance and uses it to search through SQL or Elasticsearch queries.
A contribution would be welcome.
It is possible that this will be implemented in a future release.
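
For reference, here is a bare-bones sketch of the BackendListenerClient idea described above, assuming only the standard JMeter API (the class name and default file path are made up). It appends every sampler's request and response to a text file so dynamic tokens can be located with an ordinary text search:

    import java.io.FileWriter;
    import java.io.PrintWriter;
    import java.util.List;

    import org.apache.jmeter.config.Arguments;
    import org.apache.jmeter.samplers.SampleResult;
    import org.apache.jmeter.visualizers.backend.AbstractBackendListenerClient;
    import org.apache.jmeter.visualizers.backend.BackendListenerContext;

    // Hypothetical listener that dumps every request/response pair to a text file.
    public class FileDumpBackendListener extends AbstractBackendListenerClient {

        private PrintWriter out;

        @Override
        public Arguments getDefaultParameters() {
            Arguments args = new Arguments();
            args.addArgument("filePath", "jmeter-samples.log");
            return args;
        }

        @Override
        public void setupTest(BackendListenerContext context) throws Exception {
            // Open the output file in append mode once per test run.
            out = new PrintWriter(new FileWriter(context.getParameter("filePath"), true));
            super.setupTest(context);
        }

        @Override
        public void handleSampleResults(List<SampleResult> results, BackendListenerContext context) {
            for (SampleResult result : results) {
                out.println("=== " + result.getSampleLabel());
                out.println("--- request ---");
                out.println(result.getSamplerData());           // recorded request data
                out.println("--- response ---");
                out.println(result.getResponseDataAsString());  // response body
            }
            out.flush();
        }

        @Override
        public void teardownTest(BackendListenerContext context) throws Exception {
            out.close();
            super.teardownTest(context);
        }
    }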

CloudConnect: Dynamic URL in REST Connector

What is the best way to create a dynamic request URL for the REST Connector in CloudConnect?
For example, I want the URL to be something like www.myservice.com/api/{todays-date}/report.json, and the URL must change accordingly every time the ETL runs.
Is there some way to make this happen with code in CloudConnect? I didn't find any straightforward way, but I found that one might be able to import a remote file containing the URL.
Does anyone have experience or tips on this subject?
What should work best is to generate this parameter (e.g. in a Data Generator) or read some data from the source (e.g. a list of IDs) and send it to the REST component as an input parameter (e.g. if the metadata field name is 'today_date', use ${today_date}). As far as I know, this should work.
Another option is to use a parameter from a *.prm file (such as workspace.prm). You can use ${PARAMETER_NAME} in your URL, and this should also work correctly.
Hope this helps.
Radek