POST API call on new data in the table

I need to call a POST API as soon as there is new data in the table, and this needs to be done with PySpark Streaming. Has anyone implemented the same use case?
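One common pattern is Structured Streaming's foreachBatch, which hands you each micro-batch as a regular DataFrame so you can POST its rows. A minimal sketch, assuming a streaming source table and a hypothetical endpoint URL (the table name, endpoint, and schema are placeholders, not from the question):

```python
import json
import urllib.request

API_URL = "https://example.com/ingest"  # hypothetical endpoint

def row_to_payload(row_dict):
    """Serialize one row (as a plain dict) into a JSON request body."""
    return json.dumps(row_dict, sort_keys=True).encode("utf-8")

def post_batch(batch_df, batch_id):
    """foreachBatch handler: POST every row of the new micro-batch."""
    for row in batch_df.collect():  # collect() is fine only for small batches
        req = urllib.request.Request(
            API_URL,
            data=row_to_payload(row.asDict()),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        urllib.request.urlopen(req)

# Wiring (assumes an existing SparkSession `spark` and a streamable table):
# (spark.readStream.table("my_table")
#       .writeStream
#       .foreachBatch(post_batch)
#       .start())
```

With this shape, only new rows arriving in the stream are posted; for large batches you would post from the executors (e.g. `foreachPartition`) instead of collecting to the driver.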

Related

What is the correct way of updating data after it is fetched by a REST API

I have to design a GET REST API that fetches data from a Mongo collection whenever it is called by a consumer. Once the data has been returned, it should not be fetched again on the consumer's next call.
Below are a few approaches I have thought about.
1. Expose a GET API where the consumer sends a timestamp value on each call. The API fetches records matching timestamp >= givenTimeStamp.
2. Create a paginated GET API that sends a specific number of records each time it is called. In this approach, once records are fetched we need some kind of indicator that a record has been delivered, e.g. updating the record as part of the GET implementation. But that breaks the GET contract.
3. Design a GET API that fetches a specific number of records, plus a PUT/PATCH API that the consumer calls with the IDs from the GET once it has processed them, marking those records as fetched. This seems clean and in line with REST principles, but the consumer has to make two calls, GET and PUT.
How should this kind of use case be handled in REST?
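For what it's worth, option 1 keeps the server stateless if the consumer tracks its own cursor. A sketch of the consumer-side cursor logic (the `timestamp` field name and the `since` query parameter are assumptions for illustration):

```python
def advance_cursor(records, last_seen):
    """Return the consumer's new cursor after processing one page.

    Each record is assumed to be a dict carrying a numeric 'timestamp'.
    The consumer persists the cursor and sends it on the next call,
    e.g. GET /records?since=<cursor>. Note that with a '>=' match the
    boundary record is returned twice, so either match with '>' on the
    server or deduplicate by record ID on the consumer.
    """
    if not records:
        return last_seen
    return max(r["timestamp"] for r in records)
```

This avoids mutating state inside a GET (the problem with option 2) while still needing only one call per page, at the cost of requiring a monotonic timestamp or sequence field on the records.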

Update BigQuery dataset access from Java

We have a requirement to give a particular user group access to a BigQuery dataset that contains views created by Java code. I found that the datasets.patch method can help me do it, but I am not able to find documentation of what needs to be passed in the HTTP request.
You can find the complete documentation on how to update BigQuery dataset access controls in the linked documentation page. Given that you are already creating the views in your dataset programmatically, I would advise using the BigQuery client library, which may be more convenient than calling the datasets.patch method yourself. In any case, if you are still interested in calling the API directly, you should provide the relevant portions of a dataset resource in the body of the request.
The first link I shared provides a good example of updating dataset access using the Java client libraries, but in short, this is what you should do:
public List<Acl> updateDatasetAccess(DatasetInfo dataset) {
  // Make a copy of the ACLs in order to modify them (adding the required group)
  List<Acl> acls = new ArrayList<>(dataset.getAcl());
  // Grant READER access to the group (Acl.Group, since the question is about a user group)
  acls.add(Acl.of(new Acl.Group("your_group@gmail.com"), Acl.Role.READER));
  Dataset updated = bigquery.update(dataset.toBuilder().setAcl(acls).build());
  return updated.getAcl();
}
EDIT:
The dataset object can be defined as follows:
BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
Dataset dataset = bigquery.getDataset(DatasetId.of("YOUR_DATASET_NAME"));
Take into account that if you do not specify credentials when constructing the client object bigquery, the client library will look for them in the file referenced by the GOOGLE_APPLICATION_CREDENTIALS environment variable.

Send multiple updates in AtTask PUT request

I am wondering whether the AtTask API has a method for posting multiple updates in a single request.
As an example, I need to update the extRefIDs on 1,000 records. Do I make 1,000 calls to the API (expensive in terms of overhead), or can I send a single request with a JSON or XML payload containing something like this:
{
  "data": [
    {"id": 1234, "extRefID": "xx"},
    {"id": 1235, "extRefID": "xy"}
  ]
}
and so on? It would certainly be less overhead on both systems if there were a method for this. Thanks in advance!
You can do bulk updates on objects of the same type by passing in a single JSON array into the "updates" parameter:
PUT .../api/v4.0/task?updates=[{"ID":"abc123","extRefID":"val1"},{"ID":"def456","extRefID":"val2"}]
Hope this helps.
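If you are building that URL from code rather than by hand, the updates parameter is just a JSON array URL-encoded into the query string. A sketch in Python (the host is a placeholder; the IDs and values are the ones from the answer above):

```python
import json
import urllib.parse

# Bulk update: one JSON array, each element naming the object ID and the
# fields to change on it.
updates = [
    {"ID": "abc123", "extRefID": "val1"},
    {"ID": "def456", "extRefID": "val2"},
]

# URL-encode the array into the 'updates' query parameter.
query = urllib.parse.urlencode({"updates": json.dumps(updates)})
url = "https://example.com/api/v4.0/task?" + query
```

Encoding matters here: the braces, quotes, and commas in the JSON must be percent-encoded, which `urlencode` handles for you.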

Stub for Google Calendar API when using RSpec in Rails 3

The following is the method I am trying to use to get the API to make calls to Google Calendar. I am not sure what the stub should return. Should I capture a normal response and use it as is, or is there a reference with a minimal set of parameters?
api = client.discovered_api('calendar', 'v3')
result = client.execute!(:api_method => api.calendar_list.list)
I can see that OmniAuth provides its own mock support, and that Google provides Python mock libraries, but I'm not aware of any direct Google support for mocking from Ruby.
That said, given your example, you would need test doubles for client and api. It's not clear where client is coming from, but assuming that's established as a double somehow, you'd have at a minimum:
api = double('api')
client.should_receive(:discovered_api).and_return(api)
api.stub_chain(:calendar_list, :list)
client.should_receive(:execute!).and_return(... whatever result you want ...)
If in addition you want to confirm that your code is passing the right parameters to the Google API, then you'd need to augment the above with message expectations and, in the case of the api stub_chain, a return value which would then have to feed into the message expectations for the execute! call.
I'm still not sure whether that answers your question, but if not, I'll look forward to reading any additional comments.

How to capture the response object from a webhook call

I am creating a Rails 3.2 app and I am using Paymill as the payment gateway.
I am trying to set up a webhook on my system (already set up on the Paymill side). This webhook should respond to callbacks when a transaction is successful.
How can I "capture" the response object in my code? Using params?
Thankful for all help!
I don't know Paymill, but it looks like it works the same way as Stripe.
Thus, you have to handle the response with params.
You can have a look on this code sample: https://github.com/apalancat/paymill-rails
A webhook call from Paymill includes a JSON payload in the request body. This payload includes some metadata about the event that was triggered and the objects affected. So you'd have to take the request body and parse the JSON to extract the information you are looking for. You can see a sample JSON file here:
https://www.paymill.com/de-de/dokumentation/referenz/api-referenz/#events