How to invoke a client's callback at the server end? - serialization

RPC does things like this:
the server defines a function foo_max(para)
the client calls foo_max, sending 'max' + para
the server receives 'max' and calls foo_max(para)
the server sends back the return value
the client receives the result and returns it
But this approach is not flexible, because I must define every function at the server end. My case is that I want to support user-defined callback functions at the client end, so how can I call the client's callback function at the server end?
Thanks
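
A minimal sketch of one common way to do this, assuming a toy JSON-over-TCP RPC: the client never sends the callback itself, only a name it has registered locally; when the server wants the callback run, it sends a callback request back over the same connection and the client executes it and returns the result. All names here (CALLBACKS, call_remote, foo_max, on_progress) are illustrative, not from any particular framework.

import json
import socket

CALLBACKS = {}  # client-side registry: callback name -> callable

def register_callback(name, fn):
    CALLBACKS[name] = fn

def call_remote(sock, func, *args):
    """Call a remote function; service any callback requests the server sends back."""
    reader = sock.makefile("r")
    sock.sendall((json.dumps({"call": func, "args": list(args)}) + "\n").encode())
    while True:
        msg = json.loads(reader.readline())
        if "callback" in msg:
            # The server asked us to run a client-side callback by name.
            result = CALLBACKS[msg["callback"]](*msg["args"])
            sock.sendall((json.dumps({"callback_result": result}) + "\n").encode())
        else:
            return msg["result"]

# Usage on the client:
#   register_callback("on_progress", lambda pct: print("progress", pct))
#   result = call_remote(sock, "foo_max", [3, 1, 2])

The point is that only a name (or an opaque handle) crosses the wire; the callable stays on the client, and the server just asks for it to be invoked. Frameworks with bidirectional streaming (e.g. gRPC) let you model the same idea without hand-rolling the protocol.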

Related

gRPC and C#: receive message bigger than maximum allowed

I am doing some tests requesting data from a remote database from a client. For that, I have a gRPC client that calls a method on a gRPC server; the server uses EF to get the data and sends the result to the client.
Well, in my case I get about 3 MB of data, which is larger than the default maximum message size allowed for the channel.
I know that I can work around the problem when I create the channel in the client, by raising the limit, for example to 60 MB:
var channel = GrpcChannel.ForAddress("http://localhost:5223",
    new GrpcChannelOptions
    {
        MaxReceiveMessageSize = 62914560,
        MaxSendMessageSize = 62914560,
    });
But although I can increase this limit when I create the channel, I can't rule out that some query will return more data than the maximum allowed.
So I would like to know how I can handle this.
In this case the method is unary, it is not a stream.
Thanks.
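
For comparison, a minimal sketch of the same pair of limits in Python gRPC (grpcio), just to show that the caps are per-endpoint options that exist on both the client channel and the server; the address and the 60 MB figure are placeholders taken from the question, not defaults.

from concurrent import futures
import grpc

MAX_MESSAGE = 60 * 1024 * 1024  # 60 MB, mirroring the question's value

# Client side: raise the receive/send caps on the channel.
channel = grpc.insecure_channel(
    "localhost:5223",
    options=[
        ("grpc.max_receive_message_length", MAX_MESSAGE),
        ("grpc.max_send_message_length", MAX_MESSAGE),
    ],
)

# Server side: the server has its own, independent limits.
server = grpc.server(
    futures.ThreadPoolExecutor(max_workers=4),
    options=[
        ("grpc.max_send_message_length", MAX_MESSAGE),
        ("grpc.max_receive_message_length", MAX_MESSAGE),
    ],
)

Whatever value you pick, a unary response larger than the receive cap still fails, which is why responses of unbounded size are usually moved to a server-streaming method that returns the rows in bounded chunks.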

Insert Record to BigQuery or some RDB during API Call

I am writing a REST API GET endpoint that needs to both return a response and store records to either BigQuery or GCP Cloud SQL (MySQL), but I want the return not to depend on the writing of the records having completed. Basically, my code will look like:
def predict():
    req = request.json.get("instances")
    resp = make_response(req)
    write_to_bq(req)
    write_to_bq(resp)
    return resp
Is there any easy way to do this with Cloud SQL Client Library or something?
Turns out Flask has functionality that does what I require:
@app.route("/predict", methods=["GET"])
def predict():
    # do some stuff with the request.json object
    return jsonify(response)

@app.after_request
def after_request_func(response):
    # do anything you want that relies on context of predict()
    @response.call_on_close
    def persist():
        # this will happen after the response is sent,
        # so even if this function fails, predict()
        # will still get its response out
        write_to_db()
    return response
One important thing is that a function decorated with after_request must take an argument and return something of type flask.Response. Also, I think a function decorated with call_on_close cannot access the request context of the main view, so anything you want to use from the main view has to be captured inside the after_request function but outside (above) the call_on_close function.
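
A minimal sketch of that capture pattern, assuming flask.g is used to hand data from the view to the after_request hook (g is still available there, but may be gone by the time call_on_close runs, hence the local variable); write_to_db and the payload names are placeholders:

from flask import Flask, g, jsonify, request

app = Flask(__name__)

@app.route("/predict", methods=["GET"])
def predict():
    req = request.json.get("instances")
    g.payload = req                        # stash what we want to persist later
    return jsonify(req)

@app.after_request
def after_request_func(response):
    payload = getattr(g, "payload", None)  # capture while the context is alive

    @response.call_on_close
    def persist():
        # Runs after the response has been sent; uses the captured local
        # instead of g, since the request context may be torn down by now.
        if payload is not None:
            write_to_db(payload)           # placeholder persistence call

    return response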

How to set http response code in Parse Server cloud function?

A Parse Server cloud function is defined via
Parse.Cloud.define("hello", function(request, response) {..});
On the response, I can call response.success(X) and response.error(Y), and that sets the HTTP response code and the body of the response.
But how do I return a different code, like Created (201)?
And how do I set the headers of the response?
thanks, Tim
You are allowed to return any valid JSON from response.success(). Therefore, you could create an object with fields such as code, message, and value, so you can set the code, give it a string descriptor, and pass back the value you normally would, if there is one. This seems to accomplish what you need, though you will have to keep track of those codes across your platforms. I recommend looking up the standard HTTP response codes and making sure you don't overlap with any of them.
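
For illustration, a minimal sketch of how a caller could unwrap such an envelope over the Parse REST API; the server URL, keys, and the code/message/value field names are assumptions following the suggestion above, not a Parse convention, and the transport-level status from Parse itself will still be 200.

import requests

resp = requests.post(
    "https://example.com/parse/functions/hello",   # placeholder Parse Server URL
    headers={
        "X-Parse-Application-Id": "APP_ID",
        "X-Parse-REST-API-Key": "REST_KEY",
        "Content-Type": "application/json",
    },
    json={},
)

# Parse wraps whatever was passed to response.success() under "result",
# so the application-level code has to be read out of that envelope.
result = resp.json()["result"]
print(result["code"], result["message"], result.get("value"))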

Can I use lua in the openresty nginx http block

I want to request some API and set the response as an nginx variable, but it says the "set_by_lua_block" directive is not allowed here. How can I achieve this?
http {
    set_by_lua_block $hostIp {
        local http = require 'resty.http'
        local httpc = http.new()
        local res, err = httpc:request_uri('http://some-pai')
        local body, err = res:read_body()
        ngx.log(ngx.INFO, "Using ngx.INFO")
        ngx.log(ngx.INFO, body)
        return body
    }
    ...
}
set_by_lua_block is not allowed in http context
https://github.com/openresty/lua-nginx-module#set_by_lua
set_by_lua_* may be used within server context.
But your code will not work anyway, because resty.http uses the cosocket API.
At least the following API functions are currently disabled within the
context of set_by_lua:
Output API functions (e.g., ngx.say and ngx.send_headers)
Control API functions (e.g., ngx.exit)
Subrequest API functions (e.g., ngx.location.capture and ngx.location.capture_multi)
Cosocket API functions (e.g., ngx.socket.tcp and ngx.req.socket).
Sleeping API function ngx.sleep.
If you really need to request something once before nginx starts, write a script and set an environment variable. Then:
set_by_lua $my_var 'return os.getenv("MY_VAR")';
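
A minimal sketch of that bootstrap step, assuming a small wrapper script that fetches the value once, exports it, and then execs nginx; the URL and variable name are placeholders. Note that nginx also needs an env MY_VAR; directive in the main configuration, otherwise the variable is stripped from the environment before os.getenv can see it.

#!/usr/bin/env python3
# Fetch a value once, export it, then hand the process over to nginx.
import os
import urllib.request

# Placeholder endpoint: whatever API the variable should come from.
body = urllib.request.urlopen("http://some-api.example/hostip", timeout=5).read()
os.environ["MY_VAR"] = body.decode().strip()

# Replace this process with nginx so MY_VAR is part of its environment.
os.execvp("nginx", ["nginx", "-g", "daemon off;"])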

After receiving a SOAP response, modify the XML and send it to another service using the HTTP adapter

While using the HTTP adapter I need to call a first service that returns XML.
After receiving the response I want to change some values and send the result to another service.
How can I do it?
Does the HTTP adapter have a JSON-to-XML function?
The WL adapter will automatically convert XML to JSON for you; however, it doesn't have any manual JSON<->XML conversion APIs.
In your case a possible solution might be to retrieve the XML as plain text by supplying returnedContentType:"plain" in the invocation options, alter whatever you need using regex/string replace, and use the resulting string as the post body of the 2nd procedure invocation.
Alternatively, you can use a 3rd-party library to parse/convert/do whatever you need with the XML, e.g. http://www.json.org/java/ (more info about how to use it in your adapter: http://public.dhe.ibm.com/software/mobile-solutions/worklight/docs/v506/04_08_Using_Java_in_adapters.pdf).
After checking a number of solutions, I set the HTTP result to be plain text,
then made a call to a Java function, sending the XML as a String, and used
javax.xml to hold and alter the XML, with XPath (via org.w3c.dom.*) to retrieve the correct node.
Hope this will help you too.
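
Illustrative only, since the actual adapter above uses JavaScript plus Java: the same "parse the XML reply, change a value, forward it" pattern sketched with the Python standard library. The URLs and the Amount tag are placeholders, and real SOAP payloads will need proper namespace handling.

import urllib.request
import xml.etree.ElementTree as ET

# Call the first service and take its reply as text.
xml_text = urllib.request.urlopen("http://first-service.example/soap").read()

# Parse, locate the node to change, and alter its value.
root = ET.fromstring(xml_text)
node = root.find(".//Amount")        # adjust the path/namespaces for your payload
if node is not None:
    node.text = "42"

# Forward the modified document to the second service.
req = urllib.request.Request(
    "http://second-service.example/endpoint",
    data=ET.tostring(root),
    headers={"Content-Type": "text/xml"},
)
urllib.request.urlopen(req)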