karate api test: how to store all responses of a feature file in an array [duplicate]

Using Karate, I am making API calls sequentially. I need to store the API request and response for this sequential flow of APIs in separate text files, one per API call.
I need to understand how I can achieve this.
I have tried Logback, which stores the entire execution log in a single text file.

Take a look at karate.prevRequest, which will give you the "request". Now use some custom Java (or JS) code and write whatever you want to a file.
Personally, I think this is unnecessary because Karate's HTML report has all that you need. If someone is asking you to do this, please try to convince that person that this exercise is a waste of time.
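For example, a minimal sketch of the custom Java half, assuming a hypothetical FileLogger helper class on the classpath; a feature file could call it via Java.type after each request:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;

    // Hypothetical helper. From a feature file, something like:
    //   * def FileLogger = Java.type('FileLogger')
    //   * FileLogger.write('target/call-1.txt', karate.pretty(karate.prevRequest), karate.pretty(response))
    public class FileLogger {

        // Appends the request and response text to the given file,
        // creating it if it does not exist yet.
        public static void write(String path, String request, String response) throws IOException {
            String content = "REQUEST:\n" + request + "\n\nRESPONSE:\n" + response + "\n";
            Files.write(Paths.get(path), content.getBytes(),
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        }
    }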

Related

Validation of REST API with different input files each time

We have built a REST API that takes a file as input and returns the expected output/response content after the server-side logic executes.
However, we have hundreds of such input files that we would like to validate, verifying the result/output for each.
Could you please let us know whether there is any way to do this using a script, Postman, or any other tool: pointing it at a folder (which contains those input files) and making an API call for each file?
Thanks.
Check this answer on how to pass a file path as a variable: https://stackoverflow.com/a/64954126/6793637
Then use a data-driven approach with a table of file paths:

| filePath |
| c:/1     |
| c:/2     |
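If you would rather drive the whole folder from code instead, a rough sketch in plain Java; the endpoint URL, input folder, and content type below are placeholders for your setup:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.stream.Stream;

    public class FolderDrivenTest {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            URI endpoint = URI.create("https://example.com/api/validate"); // placeholder
            // POST every file in the input folder and report the status code.
            try (Stream<Path> inputs = Files.list(Paths.get("c:/inputs"))) {
                for (Path file : (Iterable<Path>) inputs::iterator) {
                    HttpRequest request = HttpRequest.newBuilder(endpoint)
                            .header("Content-Type", "application/octet-stream")
                            .POST(HttpRequest.BodyPublishers.ofFile(file))
                            .build();
                    HttpResponse<String> response =
                            client.send(request, HttpResponse.BodyHandlers.ofString());
                    // Compare response.body() against the expected output for this file here.
                    System.out.println(file + " -> " + response.statusCode());
                }
            }
        }
    }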

Data storage after API call from Postman/SoapUI

I need to create an automated test setup for some web services, and plan to use SoapUI or Postman for that. My question is pretty basic: what happens to the data after a request is made?
E.g. if the response contains data from a system and Postman displays it in its UI, will Postman store the response? Or what happens to it after the request?
I'm asking for security purposes and I was not able to find a concrete answer myself. Thank you in advance.
Postman gives you explicit control over whether data is stored. When you run a collection, you can specify in the settings whether to store responses, cookies, etc. Configure it as per your need.
As per the official site:
"Postman does not track any content of your requests/responses."
See File --> Settings.
You can even avoid the cloud version if you don't want to sync things up.
Re SoapUI...
If you call a service once, the data remains in the UI. If you run it a second or third time, only the last response is shown in the UI.
Once you close SoapUI, the request and response data is gone.
However, you can save the data from every request and response by using a DataSink step, should that be what you want.

Is there any way to update the values of the feature file from the internally called feature file [duplicate]

I need to post a request to get an authorization token and include it in the header for all subsequent test requests. This token changes every time but it is valid for the entire test session as long as I keep sending requests. In each feature file I can call another feature file to get this token. But I don't want to do this for every feature file. I just want to get the token one time at the start of the test and use it for all feature files. How do I do that? I've read the Karate information on GitHub but did not find the answer.
The second example in the demos answers all your questions: karate-demo.
EDIT1: Sorry, I read your question too fast. You can use karate.call() in karate-config.js so that it applies to all feature files. I don't recommend this, because you will always have some features where you DON'T need it. Just use a call to a feature and don't over-engineer your tests.
EDIT2: I thought about this a little more: if you are comfortable with Java, you could make a call to a singleton at the start of each feature (or even in the global karate-config.js) and cache the value of the auth token in that singleton. That way you can do exactly what you need, and it will be flexible.
EDIT3: Based on this question, we added this functionality to Karate as the karate.callSingle() operation; here's the doc: https://github.com/intuit/karate/tree/develop#the-karate-object
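For anyone on a Karate version without callSingle, the singleton idea from EDIT2 might look like this in Java (the class and method names are hypothetical):

    // Hypothetical token cache: fetch the auth token once, reuse it everywhere.
    // A feature (or karate-config.js) can reach it via Java.type('TokenCache').
    public class TokenCache {

        private static String token;

        // Returns the cached token, performing the auth call on the first use only.
        public static synchronized String getToken() {
            if (token == null) {
                token = fetchToken();
            }
            return token;
        }

        private static String fetchToken() {
            // Placeholder: call your auth endpoint here and return the token value.
            return "...";
        }
    }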

Bulk extract with Authenticated Connector (import.io)

I am new to import.io and this forum.
I am trying to extract information from a target database where I have to run a query with an input. With help from support, I successfully created the authenticated connector. With the inputs entered manually in the UI, it fetches the data properly.
The problem is that I have more than 10,000 inputs to run, so it has to be done as a bulk extraction. import.io support told me that they do not have this feature in their UI and suggested using their API, documented here: http://api.docs.import.io/#!/Query_Methods/queryPost.
Could anyone walk me through making use of this? I just need a working script that takes multiple string lines as inputs, runs the connector that I built, and posts the result. I am not very familiar with this kind of technology but I am very willing to learn.
Thanks all in advance!
I would be happy to walk you through a bit of an intro. It will be a bit basic, though, since I don't know your specific use case.
Yes, support was correct. You will need to use the POST query in order to pass your authentication credentials as inputs.
I will break this query down step by step. Essentially, our API docs are just a simple UI to pass through your credentials; then you can generate the API query.
ID - This is the GUID of your connector. This information can be found at the end of the URL, like this: https://import.io/data/mine/?tag=CONNECTOR&id=33f4e828-25ce-40c4-948c-9b734c70d1ab
Query - This is where you will put the inputs for your connector to execute. Be sure to keep this as structured JSON or it will return errors when you query.
Once you have successfully entered that information you will query the API.
This will give you the request URL that you need to query the API.
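Putting it together, a sketch of the bulk loop in Java; the endpoint path and input JSON shape below are my assumptions, so check them against the queryPost doc linked above (the GUID is the one from the example URL):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;

    public class BulkQuery {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            String connectorId = "33f4e828-25ce-40c4-948c-9b734c70d1ab"; // GUID from the connector URL
            String apiKey = "YOUR_API_KEY";
            // One connector input per line, e.g. your 10,000+ query strings.
            List<String> inputs = Files.readAllLines(Paths.get("inputs.txt"));
            for (String input : inputs) {
                // Assumed body shape; the exact input field name comes from your connector.
                String body = "{\"input\": {\"query\": \"" + input.replace("\"", "\\\"") + "\"}}";
                HttpRequest request = HttpRequest.newBuilder(URI.create(
                        "https://api.import.io/store/connector/" + connectorId
                                + "/_query?_apikey=" + apiKey)) // assumed endpoint, see the docs
                        .header("Content-Type", "application/json")
                        .POST(HttpRequest.BodyPublishers.ofString(body))
                        .build();
                HttpResponse<String> response =
                        client.send(request, HttpResponse.BodyHandlers.ofString());
                System.out.println(input + " -> " + response.body());
            }
        }
    }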
If you have any more questions, just let me know.
Thanks,
Meg

Can CSV data be sent to OpenERP/Odoo through the API?

I can import Comma-Separated Values (CSV) data through the admin pages into most models. This process handles the external IDs so that the data can be added to or amended as appropriate in later CSV imports. This is a manual action.
Through the API, the same records can be created and amended, and external IDs can be set. This, however, requires a lot of the logic that would otherwise be handled by the CSV importer to be coded by hand, in the external application that uses the API to push in data. Pushing data through the API can be automated.
Is there a way the API can be used (so no changes need to be made to code within Odoo) to push CSV data (so the logic for insert/update/relationships/external IDs/ etc. is handled by Odoo)? This would be a kind of hybrid approach, and I am trying to avoid the need to create import modules within Odoo.
Edit: the "external ID" is often called the "XML ID". I think it is a terminology that has stuck from earlier versions of OpenERP, rather than having anything specific to do with XML.
Edit
This page describes a load() function that pushes CSV-like data through a pipeline to load it into the system:
http://openerp-server.readthedocs.org/en/latest/06_misc_import.html
I can't see how to translate the summary on that page into an operation through the API, if indeed that is possible. I'm guessing I will need the interface (entry point), model, method (load(), probably), and some additional parameters, but the details are beyond me.
The answer is kind of "yes".
The load() method can be used against any model to load data. This method takes data in the same structure as a CSV file would provide.
The first parameter is an array of field names, like the column headings on a CSV import.
The second parameter is an array of records. Each record is an array of values matching each field.
The API will return a list of errors for the cases OpenERP caters for. Many errors, however, just result in database exceptions on OpenERP and so need to be picked up as an API failure. This is largely because the OpenERP API is not designed as a generic API but as part of the GUI, so the data sent to the API is very much bound to the current state of the application as seen through that GUI. In other words, invalid data will seldom find its way to the API when it comes through the OpenERP GUI.
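For illustration, a call to load() through XML-RPC might look like this in Java with the Apache XML-RPC client; the host, credentials, model, and fields are placeholders, and the /xmlrpc/2 endpoint with execute_kw assumes OpenERP 7 or later:

    import java.net.URL;
    import java.util.Arrays;
    import org.apache.xmlrpc.client.XmlRpcClient;
    import org.apache.xmlrpc.client.XmlRpcClientConfigImpl;

    public class CsvLoad {
        public static void main(String[] args) throws Exception {
            String db = "mydb", password = "admin"; // placeholders
            int uid = 1; // from a prior call to the authenticate() endpoint

            XmlRpcClientConfigImpl config = new XmlRpcClientConfigImpl();
            config.setServerURL(new URL("https://my-openerp-host/xmlrpc/2/object"));
            XmlRpcClient client = new XmlRpcClient();
            client.setConfig(config);

            // First parameter: field names, like the CSV column headings.
            // The "id" column carries the external ID, exactly as in a CSV import.
            Object[] fields = {"id", "name", "email"};
            // Second parameter: one array of values per record, matching the fields.
            Object[] rows = {
                new Object[] {"my_module.partner_1", "Acme Ltd", "info@acme.example"},
                new Object[] {"my_module.partner_2", "Globex", "sales@globex.example"},
            };

            Object result = client.execute("execute_kw", Arrays.asList(
                    db, uid, password,
                    "res.partner", "load",
                    Arrays.asList(fields, rows)));
            // The result holds the created ids plus the messages/errors the server catered for.
            System.out.println(result);
        }
    }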
I have wrapped the loader functionality, catching errors and exceptions, in my PHP OpenERP API library here:
https://github.com/academe/openerpapi/blob/master/src/App/Loader.php
Hopefully that will be useful to others too.
I think the answer is "no".
However, this technique has been explained to me:
Create a module with little in it but CSV files for importing.
Install the module.
When a new CSV file needs to be imported, transfer it into the module (FTP or similar).
Once transferred, run the update() method for the module; this can be done through the API (see the sketch below).
The update method will scan and load all the CSV files set up within the module. Care needs to be taken to make sure that only one upload/update transaction runs at any one time.
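For the update() call through the API, a hypothetical sketch reusing the XML-RPC client, db, uid, and password from the earlier example; it assumes OpenERP 7+, where ir.module.module exposes button_immediate_upgrade, and a module name of my own invention:

    // Look up the module record by name, then trigger an immediate upgrade,
    // which re-reads the CSV data files shipped inside the module.
    Object[] ids = (Object[]) client.execute("execute_kw", Arrays.asList(
            db, uid, password,
            "ir.module.module", "search",
            Arrays.asList(Arrays.asList(Arrays.asList("name", "=", "my_csv_module")))));
    client.execute("execute_kw", Arrays.asList(
            db, uid, password,
            "ir.module.module", "button_immediate_upgrade",
            Arrays.asList((Object) ids)));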
I'll post additional details here when I have got this working, or will happily accept an alternate answer if there is a better way to handle this.