Savon not extracting whether to qualify elements from WSDL properly

Opening an issue as suggested in the description here: https://www.savonrb.com/version2/globals.html.
I am using Savon with this WSDL: https://sandbox.hesapi.labworks.org/st_api/wsdl
However, it is not extracting whether elements should be qualified or not. The request fails unless I pass the element_form_default: :qualified option, like this:
client = Savon.client(
  wsdl: 'https://sandbox.hesapi.labworks.org/st_api/wsdl',
  log: true,
  log_level: :debug,
  pretty_print_xml: true,
  element_form_default: :qualified
)
With that option it works fine.


How to do soft assertions with Karate?

I have a feature that calls two other features, something like this:
When call read(ser.getCarList) headers
When call read(ser.getTaxes) headers
So the first feature, getCarList, has two validations:
When method Get
* configure continueOnStepFailure = true
Then status 200
And match response = read ('this:getCarAssertion')
* configure continueOnStepFailure = true
I have tried the new keyword, but when I get a status code 200 with a bad response body, the next feature, getTaxes, does not continue executing.
continueOnStepFailure is a new keyword meant for validating results without failing immediately on the first failure. Its purpose is assertions and validations, so that as much information as possible can be checked when asserting test results.
To avoid it being used as a pure if-condition around arbitrary steps (with unexpected consequences), the default behavior of * configure continueOnStepFailure = true is to continue execution only if the failures happen in a match step, and once you disable the mechanism with * configure continueOnStepFailure = false the test will fail (but still provide details for each step within the continueOnStepFailure block). This is because match is the recommended keyword for any sort of validation and is how you leverage the powerful JSON assertion library.
It is also recommended to explicitly set * configure continueOnStepFailure = false after the set of match keywords, so there is no unexpected behavior after the conscious decision to keep evaluating keywords after a failure.
That being said, there are ways to extend and configure the behavior of continueOnStepFailure beyond the default. The keyword also takes a JSON value, instead of a boolean, which allows more extensibility. For example, the default behavior can be represented as follows:
* configure continueOnStepFailure = { enabled: true, continueAfter: false, keywords: ['match'] }
This means the continueOnStepFailure mechanism is enabled, the scenario will not continue executing after the mechanism is disabled, and failures are only tolerated if they happen in the match keyword. Note that if you set continueAfter to true, the scenario will continue executing the remaining steps, but the scenario itself will still be marked as failed (with the appropriate output in the report and the usual failed behavior for any caller of that scenario). I strongly discourage setting continueAfter to true.
For your specific use case, the status keyword is definitely within the boundaries of the assertions I've described: status 200 is just a shortcut for match responseStatus == 200. Very likely we should add status to the default behavior, given that it is effectively a match assertion. With the extended JSON configuration you can do the following for your use case:
When method Get
And configure continueOnStepFailure = { enabled: true, continueAfter: false, keywords: ['match', 'status'] }
Then status 200
And match response = read ('this:getCarAssertion')
And configure continueOnStepFailure = false
Some additional examples can be found in the unit tests in this pull request. For quick reference, this is how your Karate test report will look:

Adding an encryption layer to DataTables.js

I'm currently using DataTables.js with a server-side data source written in PHP.
The server-side script outputs the data exactly as required by DataTables:
{"iTotalDisplayRecords":"777","sEcho":0,"aaData":[[row1],[row2],[row3]]}
Now I would like to add an additional security layer by encrypting the response on the server and decrypting it after it is received by DataTables.
I need this because I noticed that some clients work through an HTTPS proxy and the content of some rows gets mistakenly blocked.
I'm using this solution for the server-side PHP script to output encrypted content using openssl_encrypt.
Then on the client side I have:
function datatable_init (source) {
  $.getJSON(source, function(data) {
    decryptedContent = JSON.parse(CryptoJSAesDecrypt("password", data));
    oTable = $('dtable').dataTable({
      "bProcessing": false,
      "bServerSide": true,
      //"sAjaxSource": source,
      "data": decryptedContent
      ...
    });
  });
}
I had to replace "sAjaxSource" with "data" because it is a different data source type now, which requires a different DataTables JSON format:
{data:[[row1],[row2],[row3]]}
and I can't pass iTotalDisplayRecords anymore.
Is there a way I can keep feeding the server-side JSON format to DataTables, but feed it as a local JS object/array?
P.S. Another idea I had is to encrypt/decrypt each individual row of the table, but that's probably going to be more complicated and slower.
The ajax.dataSrc option seems helpful here, because it provides the possibility to modify the data received via Ajax, and thus allows you to define a function that takes care of decrypting the received data. The last example given on the reference page looks especially promising in my opinion.
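For illustration, here is a rough sketch of how that could look with the snippet from the question. It assumes DataTables 1.10+ (which provides the ajax option) and that the server wraps the ciphertext in a small JSON envelope such as {"payload":"<ciphertext>"}; the payload property name is an assumption made here, while CryptoJSAesDecrypt is the helper already used above:
function datatable_init (source) {
  oTable = $('dtable').dataTable({
    "bProcessing": false,
    "bServerSide": true,
    "ajax": {
      "url": source,
      "dataSrc": function (json) {
        // json.payload is the assumed property holding the encrypted server response
        var decrypted = JSON.parse(CryptoJSAesDecrypt("password", json.payload));
        // copy the paging fields back onto the object DataTables reads for server-side processing
        json.sEcho = decrypted.sEcho;
        json.iTotalRecords = decrypted.iTotalRecords;
        json.iTotalDisplayRecords = decrypted.iTotalDisplayRecords;
        // hand back only the row array
        return decrypted.aaData;
      }
    }
  });
}
This way you keep the server-side JSON format (including iTotalDisplayRecords), because dataSrc only transforms the response before DataTables consumes it.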

Extract report results with CloudConnect

I would like to extract raw report results within the CloudConnect process.
So far I have managed to get a response from the raw report API endpoint - https://secure.gooddata.com/gdc/app/projects/{project_id}/execute/raw/
This response contains a URI to the file, and if I put that URI into a browser, the file is downloaded.
I have tried passing this URI to the following readers, without success:
CSV Reader produces the following error:
------------------- Error details ------------------
Component [CSV Reader:CSV_READER] finished with status ERROR.
Parsing error: Unexpected end of file in record 1, field 1 ("date"),
metadata "outOfStock";
value: Raw record data is not available, please turn on verbose mode.
File Download - I don't know how to pass the URI through the port to the "URL to Download" parameter.
HTTP Connector - again, I don't see how to pass the URI from the port.
What is the way to do this?
EDIT
If I use the HTTP Connector as suggested by @Filip, I get the following error:
Error details:
Component [HTTP connector:HTTP_CONNECTOR] finished with status ERROR. hostname in
certificate didn't match: xxx.com != secure.gooddata.com OR secure.gooddata.com
I have tried setting the X-GDC-CHECK-DOMAIN: false header, with no effect.
The HTTP connector is the right component to go with. Leave the URL property empty and use the component's "Input mapping" property, where the graphical editor lets you assign the input edge field to the URL field.
Solution from GoodData support:
The HTTP connector can also be used, but it is more complex, because the login to GoodData has to be implemented manually; the REST connector has it built in.
If you want to run the example graph, you have to be logged in to CloudConnect with a user who has access to the project from which you would like to export the report. You also have to change the URL to that of your white-labeled account in both REST connector components, and change the project and report definition in the first REST connector.
So the graph that works looks like this:
Here are the main fields that you will need to set for each element:
Get Results URI - set params for POST request:
Request URL = https://secure.gooddata.com/gdc/app/projects/${GDC_PROJECT_ID}/execute/raw/
Request Body =
{
"report_req": {
"reportDefinition": "gdc/md/${GDC_PROJECT_ID}/obj/${OBJECT_ID}"
}
}
Get URI from Response - just map the uri value to the corresponding field:
<Mapping cloverField="uri" xpath="uri"/>
Load Results - make sure it is connected to metadata with two fields: one for the response data, the other to pass the uri through.
Load Results - you will need to exclude the uri field to process the data:
Exclude Fields = uri

How do I reference a variable within the JMeter User Defined Variables control?

I'm currently creating a test suite of API tests in JMeter.
I've created a "User Defined Variables" config element to help parameterize the tests. The value goes into the "Path" of the API request.
However.....
When I input
NAME: dev.testAppUrl
VALUE: https://devurl/api/applications/${ID}
the test returns an error because it treats ${ID} as a literal string in the URL path.
If the URL is hardcoded in the test request, it works fine to leave ${ID} in there; that value is extracted from a previous request using a Regular Expression Extractor and populated as expected. But I would love not to hardcode these path values.
You should use the __eval function to get ${ID} replaced at run time. User Defined Variables are evaluated once at test startup, before ${ID} has a value, so the reference is stored literally; __eval re-evaluates the variable's content when the sampler runs.
${__eval(${dev.testAppUrl})}
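For example (reusing the variable name and URL from the question; just a sketch of where each piece goes):
User Defined Variables:
NAME: dev.testAppUrl
VALUE: https://devurl/api/applications/${ID}
HTTP Request -> Path:
${__eval(${dev.testAppUrl})}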

How to pass a regular expression extracted value in JSON format for a PUT call in JMeter?

I am testing a REST API (JSON format) using the HTTP Request sampler in JMeter. I am facing problems with the PUT calls for the update operation.
The PUT call with parameters doesn't work at all using the HTTP Request sampler, so now I am using the post body to pass the JSON.
How can I pass the extracted values from the previous response to the next PUT request in the thread group? Passing the regex variable to the PUT call in the post body doesn't work; it doesn't substitute ${value} in the post body.
How do I carry out UPDATE operations using the HTTP Request sampler in JMeter?
Check that your regular expression extractor really worked by using a Debug Sampler to show the extracted value.
Check that your regular expression extractor is scoped correctly.
See this configuration (screenshots): a variable, its use with a PUT request, and the sampler result.
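As a rough sketch of the idea (${value} is the variable name from the question; the JSON field names are placeholders), configure the HTTP Request sampler like this and add an HTTP Header Manager with Content-Type: application/json:
Method: PUT
Body Data:
{
  "id": "${value}",
  "name": "updated name"
}
If ${value} still shows up literally in the request (check it in a View Results Tree listener), the extractor did not match or is out of scope, as described above.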