Do any of you know of an open-source API that can process the following input and generate the following output:
Input: CSV File
Output: Graph/Chart representing the simple CSV
I need to consume an API like this in my Spring Boot project, to save the time of writing all the code myself.
I am trying to configure Blue Prism to make an API call, and the response of this API is a CSV file.
Currently I have configured the web service with the "GET" command on the base URL.
But I am not sure what needs to be done in order to download/save the CSV file that the API sends. I am assuming this needs to be configured explicitly.
Please help!
In your process, make the HTTP request to the API using the Utility - HTTP VBO's HTTP Request action. Store the Result output parameter in a Text-typed Data Item.
If you need to process the data in a tabular format...
Once the CSV data is there, you can use the Get CSV As Collection action in the Utility - Strings VBO to parse the CSV content into a collection:
If you need to simply save the file...
... use the Utility - File Management VBO's Write Text File action and point it to the location you need to save the CSV to:
Team,
I have a service to register a user with certain data, along with a unique mail ID and phone number, in JSON format as the body (for example: registerbody.json).
Before the POST call I generate a unique mail ID and phone number and update the fields of the same JSON file (registerbody.json), which is in the same folder as the feature file. I can see the file is updated with the required data at runtime.
I used the read() method and performed the POST request.
Surprisingly, the read() method is not picking up the updated JSON file; instead it is reading the old data in registerbody.json.
Do you have any idea why it is picking up the old data even though the file has been updated with the latest information?
Please assist me with this.
Karate uses the Java classpath, which is typically target/test-classes. So if you edit a file in src/test/java Karate won't see it unless it is copied. This copying is automatically done when you build / compile your code.
My suggestion is to use target/ as a temp folder, and then you can read using the file: prefix:
* def payload = read('file:some.json')
Before the POST call I am generating a unique mail ID and phone number and updating the same JSON file (registerbody.json)
You are making a big mistake here: Karate specializes in updating JSON based on variables. I suggest you take 5 minutes and read this part of the docs VERY carefully: https://github.com/intuit/karate#reading-files
Especially the part about embedded expressions, e.g. writing "email": "#(email)" in registerbody.json so that the field is filled in from the email variable when the file is read: https://github.com/intuit/karate#embedded-expressions
I have a Play Framework project that has reached beta/user testing.
For this testing we require test data to exist in the environment.
I am looking for a way to automate this via scripts.
The best way would be via calls to the API, passing correctly shaped data based on the models in the project (thus dependent on the project, not external).
Are there any existing SBT plugins I could utilise that would be able to create the appropriate JSON and pass it to the API to set up the environment?
Why do you need a plugin for this? I think what you want to do is to have a set of JSON payloads, call the endpoints, and check the response from the back-end. For "setting up" based on a call that carries JSON, you can use FakeRequest in your tests:
val application = new GuiceApplicationBuilder().build()
val response = route(application, FakeRequest(POST, "/end-point").withJsonBody(Json.obj("field" -> "value"))).get
contentAsString(response) must include("expected content")
In your tests you can also check the response from the back-end against the JSON you are feeding it:
Create a set of JSON values using Writes, based on a case class you are using in the back-end. You could also purposely create invalid JSON, for example one that is missing a field or has an invalid structure.
Use table-driven testing, sending a FakeRequest with the body/header containing your JSON, and then checking it against the expected results.
I'm on the move; when I get home, I can write some example code here.
I am using Pub/Sub to capture real-time data, then using GCP Dataflow to stream the data into BigQuery. I am using Java for Dataflow.
I want to try out the templates provided in Dataflow.
The process is: Pub/Sub --> Dataflow --> BigQuery
Currently I am sending messages in string format to Pub/Sub (using Python here), but the Dataflow template only accepts JSON messages. The Python library is not allowing me to publish a JSON message. Can anyone suggest a way to publish a JSON message to Pub/Sub so that I can use the Dataflow template to do the job?
The pipeline provided by Google for pumping data from Pub/Sub to BigQuery now assumes JSON-formatted messages and a matching schema on the other side.
Publishing JSON to Pub/Sub is no different from publishing strings. You can try the following snippet for Python dict to JSON conversion:
import json
py_dict = {"name" : "Peter", "locale" : "en-US"}
json_string = json.dumps(py_dict)
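Once you have the JSON string, publish it as the message body. Here is a minimal sketch assuming the google-cloud-pubsub client library; the project and topic names are placeholders:
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# "my-project" and "my-topic" are placeholder names.
topic_path = publisher.topic_path("my-project", "my-topic")

py_dict = {"name": "Peter", "locale": "en-US"}
# Pub/Sub message data must be bytes, so encode the JSON string.
future = publisher.publish(topic_path, data=json.dumps(py_dict).encode("utf-8"))
print(future.result())  # prints the message ID once the publish succeeds
The template should then be able to parse each message as JSON, provided the fields match your BigQuery schema.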
If you'd like to heavily customize the pipeline, you can also take the source code from the following location and build your own:
https://github.com/GoogleCloudPlatform/DataflowTemplates/blob/master/src/main/java/com/google/cloud/teleport/templates/PubSubToBigQuery.java
How can I create a simple API using MuleSoft Studio? I am using a MySQL database and trying to create REST APIs, following this tutorial:
http://www.mulesoft.org/documentation/display/current/Creating+an+API+for+a+MySQL+Database
But I am facing this error:
Error executing graph: ERROR (com.mulesoft.mule.module.datamapper.api.exception.DataMapperExecutionException). Message payload is of type: ArrayList
There is something wrong with your DataMapper definitions, probably a mismatch between the data coming from MySQL and the data types for the fields in your DataMapper. Check your data types, the complete error message in Mule Studio, and the output of the Logger between the MySQL component and the DataMapper if you have a configuration similar to the example in the tutorial.
I was under the assumption that the DataMapper source data does not match the payload it received.
According to the error, the DataMapper input is of the format you defined, but the received input is an ArrayList.
I am getting a 404 response on the link that you mentioned in your question.
Your error is not related to what you were asking about (how to create a REST API in Mule).
As for creating a REST API, Mule offers two ways:
Using the REST component (recommended when you have a REST specification defined in Java using Jersey or Apache Axis)
Using the APIkit Router with RAML (highly recommended by the MuleSoft community)
As for your error:
You might not have properly mapped the input fields of the DataMapper to the outbound message from the DB connector.
The DB connector always returns the response as an ArrayList, which needs to be cast into another collection type such as a Map or an Array.
I recommend using the DataSense feature of the MuleSoft DB connector, which will automatically map the input fields of the DataMapper.
If possible, share your flow.
Hope this helps.
Check the inbound properties of the DataMapper. The output of the DB query should not be in an unknown format; if it is, then there is something wrong with your query output.
If you are looking for API creation, then check out the link below:
https://docs.mulesoft.com/anypoint-platform-for-apis/walkthrough-design-existing
Check the documentation on how to create an API:
https://docs.mulesoft.com/anypoint-platform-for-apis/
Regarding the DataMapper error, just check the input data type against the output data type you are mapping; cross-check the input format with the output format.
I suggest using DataWeave now; DataMapper is obsolete. Define metadata for your HTTP endpoint by providing some sample JSON, and with the database you will automatically get metadata extracted by DataSense. Now you can use drag and drop to map your fields, and it will transform the input to the required output. You can also look at the generated DataWeave expression and easily tweak it.