POST Request Error (REST API - TALEND ESB)

I'm trying to send data with a POST request (using Postman for testing, with a request body), but the data is saved with NULL values and the ID is always 0 (the ID column in the MySQL database has the auto-increment option).
Any help is appreciated.
Here are the relevant screenshots:

Related

Http status code when data not found in Database

I'm trying to understand which HTTP status code to use in the following use case:
The user tries to do a GET on an endpoint with an input ID.
The requested data is not available in the database.
Should the service send back:
404 - Not Found
As the data is NOT FOUND in the database
400 - Bad Request
As the data in the input request is not valid or present in the db
200 - OK with null response
200 - OK with an error message
In this case we can use a standard error message, with a contract shared across all 200 OK responses (like below):
BaseResponse {
  "Errors": [{
    "Message": "Data Not Found"
  }],
  "Response": null
}
Which is the right (or standard) approach to follow?
Thanks in advance.
If you are following the REST API Architecture, you should follow these guidelines:
400: The request could not be understood by the server due to incorrect syntax. The client SHOULD NOT repeat the request without modifications.
It means the server received bad request data, like an ID in alphanumeric format when only numeric IDs are expected. Typically it refers to bad input formats or failed validation checks (like an input array exceeding its maxLength).
404: The server cannot find the requested resource.
The ID format is valid, but the resource cannot be found in the data source.
If you don't follow any standard architecture, you should define how you want to manage these cases and share that decision with the team and customers.
In many legacy applications, an HTTP status 200 with an errors field is very common, since very old clients did not handle non-200 responses well.
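The 400-vs-404 distinction above can be sketched in a few lines of framework-free Python (the dict standing in for the database and the function name are illustrative assumptions, not part of the question's stack):

```python
# Minimal sketch of the 400-vs-404 decision, assuming an in-memory
# dict stands in for the database (hypothetical data and names).
DB = {1: {"id": 1, "name": "example"}}

def get_item(raw_id):
    if not raw_id.isdigit():
        # Malformed input: this can never be a valid ID -> 400 Bad Request
        return 400, {"error": "ID must be numeric"}
    item = DB.get(int(raw_id))
    if item is None:
        # Well-formed ID, but no matching resource exists -> 404 Not Found
        return 404, {"error": "Data Not Found"}
    return 200, item

print(get_item("abc")[0])  # 400
print(get_item("99")[0])   # 404
print(get_item("1")[0])    # 200
```

The key point is that the two checks are ordered: input validation (400) happens before the lookup (404), so a syntactically invalid ID never reaches the data source.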

How to use data in csv file row wise to be used per request via newman?

I have bunch of requests in my postman collection for example :-
Request 1
Request 2
...
Request N
For each of these requests, I want to pass a client ID which is unique per request. I have created a data file with those client IDs. So the data in the CSV file is as follows:
Client Id
1
2
..
N
My requirement is to use Client ID 1 in Request 1, Client ID 2 in Request 2, and so on, instead of iterating Client ID 1 through the entire collection.
So basically data in CSV file to be used row wise in all the requests.
Would really appreciate suggestions on how this can be achieved.
I tried using the Collection Runner, but it doesn't fit my requirement.
Maybe it would be easier not to use a .csv file here, but Postman environment variables instead.
If the number of client IDs matches the number of requests, you can do something like this:
In the pre-request script of the first request, initialize an array of client IDs:
const clientIdArr = [1,2,3,4,5,6,7,8,9,10];
pm.environment.set('clientIdArr', clientIdArr);
Then shift the first value off the array of client IDs in every subsequent request of the Postman collection:
const currentArr = pm.environment.get('clientIdArr');
const currentValue = currentArr.shift();
pm.environment.set('clientIdArr', currentArr);
pm.environment.set('currentClientId', currentValue);
Then you can use the {{currentClientId}} environment variable in your actual request and execute the Postman collection via the Collection Runner.
For more details on how Array.prototype.shift() works, please refer to the MDN documentation.
If you have a large number of requests in your Postman collection, you might consider having those scripts as Postman global functions.

Can't connect Azure Table Storage to Power BI: (415) Unsupported Media Type

I'm getting the error below while connecting to Azure Table Storage.
Details:
"AzureTables: Request failed: The remote server returned an error: (415) Unsupported Media Type. (None of the provided media types are supported)"
One thing I noticed is that if I fill in only the account name, it automatically appends the rest of the URL, ".table.core.windows.net", whereas in the portal it is "table.cosmosdb.azure.com".
With core.windows.net I'm getting the error "AzureTables: Request failed: The remote name could not be resolved", but using table.cosmosdb.azure.com might be messing up some headers.
Please advise.
Thank you.
You should be able to connect to your Azure Table storage / Cosmos DB account from Power BI using the following link structure: https://STORAGEACCOUNTNAME.table.core.windows.net/ for Table storage, or https://yourcosmosdbname.documents.azure.com:443/ for Cosmos DB.
You can get the correct link in the portal: go to Storage accounts (or Cosmos DB) > click Tables > find the table URL you would like to link to Power BI > remove the last table name after the "/", then use it to connect in Power BI; it will later allow you to select the specific table:
These are screenshots from testing for CosmosDB:
415 errors:
These errors can be caused by the cache, which can be flushed as follows:
In Power BI Desktop, go to "File" and select "Options". Under "Data Load" you have the option to clear the cache. After doing this you can use "Get Data" and "OData feed" as normal, and the URL won't return the 415 error.
It's not clear how you consume the Table service API, but here is the solution that worked for me with a React SPA and the fetch API.
The request headers must contain:
"Content-Type": "application/json"
It was failing for me with single quotes, and worked with double quotes.
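As a minimal, stdlib-only illustration of that header requirement, here is a Python sketch of building such a request (the account name is a placeholder, and a real call would also need an Authorization or SAS token header; the odata metadata level shown is one commonly accepted value, not the only option):

```python
from urllib.request import Request

# Placeholder URL; substitute your own storage account name.
url = "https://STORAGEACCOUNTNAME.table.core.windows.net/Tables"

# The 415 in the question indicates the service rejected the request's
# media type, so the JSON content/accept headers are set explicitly.
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json;odata=nometadata",
}
req = Request(url, headers=headers)
print(req.full_url)  # the prepared request targets the Table endpoint
```

The request is only constructed, not sent, since authentication is out of scope here.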

How to pass parameters in post call in pentaho-spoon?

I have made an API and I want to make a POST call to it. I made the following transformation in Kettle:
with a params field in Generate Rows step as:
and REST Client step configuration as:
but I am unable to get any of the parameters in my POST call on the server side. If I write a simple POST call in Python as:
import requests
url = "http://10.131.70.73:5000/searchByLatest"
payload = {'search_query': 'donald trump', 'till_date': 'Tuesday, 7 June 2016 at 10:40'}
r = requests.post(url, params=payload)
print(r.text)
print(r.status_code)
I am able to get the parameters via request.args.get("search_query") on the server side in Flask. How can I make an equivalent POST call in Kettle?
I found the solution myself eventually. Describe the fields in the Generate Rows step as:
and in the Parameters tab of the REST Client step, use the same fields:
Works perfectly!
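The reason the Python snippet works is worth spelling out: `requests.post(url, params=payload)` appends the payload to the URL as a query string (which Flask reads via `request.args`), whereas `data=payload` would send a form-encoded body (read via `request.form`). The REST Client step's Parameters tab mirrors the query-string behavior. A stdlib sketch of what `params=` produces:

```python
from urllib.parse import urlencode

# Same payload as in the question; urlencode builds the query string
# that requests.post(..., params=payload) would append to the URL.
payload = {
    "search_query": "donald trump",
    "till_date": "Tuesday, 7 June 2016 at 10:40",
}
url = "http://10.131.70.73:5000/searchByLatest?" + urlencode(payload)
print(url.split("?")[1].split("&")[0])  # search_query=donald+trump
```

So the parameters travel in the URL, not in the POST body, which is why describing them as fields in the Parameters tab (rather than the request body) makes them visible to `request.args` on the Flask side.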

Parsing HTTP POST data to XML in BizTalk

How can I parse POST data like this
ELEMENT1=12345&ELEMENT2=56789&ELEMENT3=9
or
{"ELEMENT1":"12345","ELEMENT2":"56789","ELEMENT3":"9"}
to
<ELEMENT1>12345</ELEMENT1>
<ELEMENT2>56789</ELEMENT2>
<ELEMENT3>9</ELEMENT3>
where the element names come from the POST data.
The different POST payloads don't have to conform to the same schema.
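In BizTalk this mapping would typically live in a custom pipeline component; as a language-neutral sketch of the transformation itself, here is a Python version that handles both payload shapes from the question (the wrapping root element is an assumption, since well-formed XML needs a single root):

```python
import json
from urllib.parse import parse_qsl
from xml.etree.ElementTree import Element, SubElement, tostring

def post_data_to_xml(body, root_name="Root"):
    """Convert form-urlencoded or JSON POST data to XML, using each
    key as an element name. Sketch only; root_name is an assumption."""
    try:
        # JSON payload: {"ELEMENT1":"12345", ...}
        pairs = json.loads(body).items()
    except ValueError:
        # Form-urlencoded payload: ELEMENT1=12345&ELEMENT2=56789&...
        pairs = parse_qsl(body)
    root = Element(root_name)
    for key, value in pairs:
        SubElement(root, key).text = str(value)
    return tostring(root, encoding="unicode")

print(post_data_to_xml("ELEMENT1=12345&ELEMENT2=56789&ELEMENT3=9"))
```

Because the element names are taken from whatever keys arrive, the two payloads don't need to share a schema; each request produces XML shaped by its own keys.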