Converting String to Date when using functional WebFlux - spring-webflux

When we send a URL with a request parameter that needs to be converted to a date, in Spring MVC we can do something like the code below in the controller, and Spring performs the conversion automatically!
public String getFare(@RequestParam(value = "flightDate") @DateTimeFormat(iso = ISO.DATE) LocalDate date)
But how can we achieve the same when using a HandlerFunction (Spring WebFlux)? For example, in my HandlerFunction:
public HandlerFunction<ServerResponse> getFare = serverRequest -> {
    Optional<String> flightDate = serverRequest.queryParam("flightDate");
};
The code serverRequest.queryParam("flightDate") gives a String. Is it possible to get the same automatic conversion here?

No. (You can look at Spring's source code and see that there is no way to get the query params other than as Optional<String>.)
You must convert the field to a Date yourself:
Date flightDate = request.queryParam("flightDate")
        .map(date -> {
            try {
                return new SimpleDateFormat("dd-MMM-yyyy").parse(date);
            } catch (ParseException e) {
                return null;
            }
        })
        .orElse(null);
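
If you are on Java 8+ and the parameter uses the ISO format, here is a minimal sketch of the same idea with java.time instead of SimpleDateFormat (the response bodies are placeholders, and bodyValue assumes Spring 5.2+):

public HandlerFunction<ServerResponse> getFare = serverRequest ->
        serverRequest.queryParam("flightDate")
                .map(raw -> {
                    try {
                        // LocalDate.parse expects ISO yyyy-MM-dd by default
                        LocalDate flightDate = LocalDate.parse(raw);
                        return ServerResponse.ok().bodyValue("fare for " + flightDate);
                    } catch (DateTimeParseException e) {
                        // reject unparseable dates instead of silently returning null
                        return ServerResponse.badRequest().build();
                    }
                })
                .orElseGet(() -> ServerResponse.badRequest().build());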

Related

Google Cloud Functions - Realtime Database Trigger - how to deserialize data JSON to POJO?

As described in the Google Cloud Functions docs, it is possible to trigger a Function based on Firebase Realtime Database events (write/create/update/delete).
The following docs sample explains how to get the delta snapshot.
public class FirebaseRtdb implements RawBackgroundFunction {
    private static final Logger logger = Logger.getLogger(FirebaseRtdb.class.getName());

    // Use GSON (https://github.com/google/gson) to parse JSON content.
    private static final Gson gson = new Gson();

    @Override
    public void accept(String json, Context context) {
        logger.info("Function triggered by change to: " + context.resource());
        JsonObject body = gson.fromJson(json, JsonObject.class);
        boolean isAdmin = false;
        if (body != null && body.has("auth")) {
            JsonObject authObj = body.getAsJsonObject("auth");
            isAdmin = authObj.has("admin") && authObj.get("admin").getAsBoolean();
        }
        logger.info("Admin?: " + isAdmin);
        if (body != null && body.has("delta")) {
            logger.info("Delta:");
            logger.info(body.get("delta").toString());
        }
    }
}
The sample works perfectly but the question is: How can I deserialize this delta to a POJO?
I tried:
val mObject = gson.fromJson(body.get("delta").toString(), MyCustomObject::class.java)
But I am getting:
com.google.gson.JsonSyntaxException: java.lang.IllegalStateException: Expected BEGIN_ARRAY but was BEGIN_OBJECT
As far as I know, this is because MyCustomObject has a List<T> field, and Firebase Realtime Database always converts lists to maps with integer keys.
I would prefer not to change every List<T> to Map<Int, T>, because I have a lot of classes :(
Thanks in advance!
So, here is what I ended up doing (maybe not the best solution!):
1) Create a custom JSON deserializer for the lists coming from Firebase:
class ListFirebaseDeserializer<T> : JsonDeserializer<ArrayList<T>> {
    override fun deserialize(json: JsonElement?, typeOfT: Type?, context: JsonDeserializationContext?): ArrayList<T> {
        val result = ArrayList<T>()
        val typeOfElement = (typeOfT as ParameterizedType).actualTypeArguments[0]
        json?.let {
            json.asJsonObject.entrySet().forEach { entry ->
                result.add(Gson().fromJson(entry.value, typeOfElement))
            }
        }
        return result
    }
}
This takes the lists that Firebase turned into maps and converts them back into actual lists.
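For illustration: a two-element list arrives from the Realtime Database shaped like a JSON object keyed by indices, e.g. {"0": {...}, "1": {...}}, rather than the array [{...}, {...}] that Gson expects for a List field, which is exactly why the Expected BEGIN_ARRAY but was BEGIN_OBJECT error appears.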
2) Annotate every list in my POJO with @JsonAdapter(ListFirebaseDeserializer::class), for instance:
class MyCustomObject {
    @JsonAdapter(ListFirebaseDeserializer::class)
    var myPaymentList = ArrayList<Payment>()
}
It could be a pain if you already have lots of lists to annotate, but it is better than having to use maps instead.
Hope it helps!

How can I convert a Stream of Mono to Flux

I have a method that tries to use WebClient to return a Mono:
#GetMapping("getMatch")
public Mono<Object> getMatch(#RequestParam Long matchId) {
return WebClient.create(OpenDotaConstant.BASE_URL).get()
.uri("/matches/{matchId}", matchId)
.accept(MediaType.APPLICATION_JSON)
.retrieve()
.bodyToMono(Object.class);
}
It returns the result that I expected.
Then I tried to create another method to support a list of ids as a parameter:
#GetMapping("getMatches")
public Flux<Object> getMatches(#RequestParam String matchesId) {
List<Long> matchesList = JSON.parseArray(matchesId, Long.class);
return Flux.fromStream(matchesList.parallelStream().map(this::getMatch));
}
But this time it returns a weird result:
[
    {
        "scanAvailable": true
    },
    {
        "scanAvailable": true
    }
]
I'm new to reactive programming. What is the correct way to combine Stream and Mono, and then convert to a Flux?
Probably, what you need is the following:
#GetMapping("getMatches")
public Flux<Object> getMatches(#RequestParam String matchesId) {
List<Long> matchesList = JSON.parseArray(matchesId, Long.class);
return Flux.fromStream(matchesList.stream())
.flatMap(this::getMatch);
}
Instead of:
#GetMapping("getMatches")
public Flux<Object> getMatches(#RequestParam String matchesId) {
List<Long> matchesList = JSON.parseArray(matchesId, Long.class);
return Flux.fromStream(matchesList.parallelStream().map(this::getMatch));
}
Notes:
Basically, you expect the getMatches endpoint to return Flux<Object>. However, as written it actually returns Flux<Mono<Object>>, which is why you see the strange output. To get Flux<Object>, I suggest first creating a Flux<Long> of match ids, and then flatMapping the result of calling getMatch (which returns Mono<Object>); this finally gives Flux<Object>.
Also, there is no need to use parallelStream(). Because you're already using Reactor, everything will be performed concurrently on the Reactor scheduler.
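As a side note, if you ever need to cap how many getMatch calls are in flight at once, flatMap accepts an optional concurrency argument; a small sketch (the limit of 4 is an arbitrary example):

return Flux.fromIterable(matchesList)
        .flatMap(this::getMatch, 4); // subscribes to at most 4 inner Monos at a time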

Lagom http status code / header returned as json

I have a sample where I make a debug-token request to the Facebook API and return the result to the client.
Depending on whether the access token is valid, an appropriate header should be returned:
@Override
public ServerServiceCall<LoginUser, Pair<ResponseHeader, String>> login() {
    return this::loginUser;
}
public CompletionStage<Pair<ResponseHeader, String>> loginUser(LoginUser user) {
    ObjectMapper jsonMapper = new ObjectMapper();
    String responseString = null;
    DebugTokenResponse.DebugTokenResponseData response = null;
    ResponseHeader responseHeader = null;
    try {
        response = fbClient.verifyFacebookToken(user.getFbAccessToken(), config.underlying().getString("facebook.app_token"));
        responseString = jsonMapper.writeValueAsString(response);
    } catch (ExecutionException | InterruptedException | JsonProcessingException e) {
        LOG.error(e.getMessage());
    }
    if (response != null) {
        if (!response.isValid()) {
            responseHeader = ResponseHeader.NO_CONTENT.withStatus(401);
        } else {
            responseHeader = ResponseHeader.OK.withStatus(200);
        }
    }
    return completedFuture(Pair.create(responseHeader, responseString));
}
However, the result I get is the response header serialized into the JSON response body along with the payload. This isn't really what I expected: what I expect to receive is an HTTP 401 status code, and the JSON string as defined in the code.
I'm not sure why I would need header info in the response body.
There is also a strange compilation error that occurs when I try to return a HeaderServiceCall instead.
I'm not sure if this is a bug; I am also a bit unclear about the difference between a ServerServiceCall and a HeaderServiceCall.
Could someone help?
The types for HeaderServiceCall are defined this way:
interface HeaderServiceCall<Request, Response>

and

CompletionStage<Pair<ResponseHeader, Response>> invokeWithHeaders(RequestHeader requestHeader, Request request)
What this means is that when you define a response type, the return value should be a CompletionStage of a Pair of the ResponseHeader with the response type.
In your code, the response type should be String, but you have defined it as Pair<ResponseHeader, String>, which means it expects the return value to be nested: CompletionStage<Pair<ResponseHeader,Pair<ResponseHeader, String>>>. Note the extra nested Pair<ResponseHeader, String>.
When used with HeaderServiceCall, which requires you to implement invokeWithHeaders, you get a compilation error, which indicates the mismatched types. This is the error in your screenshot above.
When you implement ServerServiceCall instead, your method is inferred to implement ServiceCall.invoke, which is defined as:
CompletionStage<Response> invoke()
In other words, the return type of the method does not expect the additional Pair<ResponseHeader, Response>, so your implementation compiles, but produces the incorrect result. The pair including the ResponseHeader is automatically serialized to JSON and returned to the client that way.
Correcting the code requires changing the method signature:
@Override
public HeaderServiceCall<LoginUser, String> login() {
    return this::loginUser;
}
You also need to change the loginUser method to accept the RequestHeader parameter, even if it isn't used, so that it matches the signature of invokeWithHeaders:
public CompletionStage<Pair<ResponseHeader, String>> loginUser(RequestHeader requestHeader, LoginUser user)
This should solve your problem, but it would be more typical for a Lagom service to use domain types directly and rely on the built-in JSON serialization support, rather than serializing directly in your service implementation. You also need to watch out for null values: you shouldn't return a null ResponseHeader under any circumstances.
@Override
public HeaderServiceCall<LoginUser, DebugTokenResponse.DebugTokenResponseData> login() {
    return this::loginUser;
}

public CompletionStage<Pair<ResponseHeader, DebugTokenResponse.DebugTokenResponseData>> loginUser(RequestHeader requestHeader, LoginUser user) {
    try {
        DebugTokenResponse.DebugTokenResponseData response = fbClient.verifyFacebookToken(user.getFbAccessToken(), config.underlying().getString("facebook.app_token"));
        ResponseHeader responseHeader;
        if (!response.isValid()) {
            responseHeader = ResponseHeader.NO_CONTENT.withStatus(401);
        } else {
            responseHeader = ResponseHeader.OK.withStatus(200);
        }
        return completedFuture(Pair.create(responseHeader, response));
    } catch (ExecutionException | InterruptedException e) {
        LOG.error(e.getMessage());
        // rethrow as an unchecked exception so it propagates through the CompletionStage
        throw new CompletionException(e);
    }
}
Finally, it appears that fbClient.verifyFacebookToken is a blocking method (it doesn't return until the call completes). Blocking should be avoided in a Lagom service call, as it has the potential to cause performance issues and instability. If this is code you control, it should be written to use a non-blocking style (that returns a CompletionStage). If not, you should use CompletableFuture.supplyAsync to wrap the call in a CompletionStage, and execute it in another thread pool.
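A rough sketch of that wrapping (the dedicated executor and the helper method name are assumptions for illustration, not part of the Lagom API):

// Run the blocking Facebook call on a separate thread pool so it cannot
// starve Lagom's main dispatcher.
private final Executor blockingIoExecutor = Executors.newFixedThreadPool(4);

private CompletionStage<DebugTokenResponse.DebugTokenResponseData> verifyFacebookTokenAsync(String accessToken, String appToken) {
    return CompletableFuture.supplyAsync(() -> {
        try {
            return fbClient.verifyFacebookToken(accessToken, appToken);
        } catch (ExecutionException | InterruptedException e) {
            // surface the failure through the CompletionStage instead of swallowing it
            throw new CompletionException(e);
        }
    }, blockingIoExecutor);
}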
I found this example on GitHub that you might be able to adapt: https://github.com/dmbuchta/empty-play-authentication/blob/0a01fd1bd2d8ef777c6afe5ba313eccc9eb8b878/app/services/login/impl/FacebookLoginService.java#L59-L74

Pig udf on Filter

I have a use case in which I need to take a date in a month and return the previous month's last date.
Ex: input: 20150331, output: 20150228
I will be using this previous month's last date to filter a daily partition(in pig script).
B = filter A by daily_partition == GetPrevMonth(20150331);
I have created a UDF (GetPrevMonth) which takes the date and returns the previous month's last date, but I am unable to use it in the filter:
ERROR: Could not infer the matching function for GetPrevMonth as multiple or none of them fit. Please use an explicit cast.
My UDF takes a tuple as input.
Googling suggests that UDFs cannot be applied in filters.
Is there any workaround, or am I going wrong somewhere?
UDF:

public class GetPrevMonth extends EvalFunc<Integer> {
    public Integer exec(Tuple input) throws IOException {
        String getdate = (String) input.get(0);
        if (getdate != null) {
            try {
                // LOGIC to return prev month date
            }
            // ... (rest omitted)

Need help. Thanks in advance.
You can call a UDF in a FILTER, but you are passing a number to the function while you expect it to receive a String (chararray inside Pig):
String getdate = (String) input.get(0);
The simple solution would be to cast it to chararray when calling the UDF:
B = filter A by daily_partition == GetPrevMonth((chararray)20150331);
Generally, when you see an error like Could not infer the matching function for X as multiple or none of them fit, 99% of the time the reason is that the values you are trying to pass to the UDF are of the wrong type.
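As an aside, the elided date logic itself is straightforward with java.time; a sketch of the complete UDF, assuming Java 8+ and yyyyMMdd input as in the question's example (20150331 -> 20150228):

public class GetPrevMonth extends EvalFunc<Integer> {
    // BASIC_ISO_DATE is the built-in yyyyMMdd formatter
    private static final DateTimeFormatter FMT = DateTimeFormatter.BASIC_ISO_DATE;

    @Override
    public Integer exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null;
        }
        String getdate = (String) input.get(0);
        // First day of the input's month minus one day = last day of the previous month.
        LocalDate prevMonthLast = LocalDate.parse(getdate, FMT).withDayOfMonth(1).minusDays(1);
        return Integer.parseInt(prevMonthLast.format(FMT));
    }
}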
One last thing: even though it is not necessary here, in the future you might want to write a pure FILTER UDF. In that case, instead of inheriting from EvalFunc, you need to inherit from FilterFunc and return a Boolean value:
public class IsPrevMonth extends FilterFunc {
    @Override
    public Boolean exec(Tuple input) throws IOException {
        try {
            String getdate = (String) input.get(0);
            if (getdate != null) {
                // LOGIC to retrieve prevMonthDate
                if (getdate.equals(prevMonthDate)) {
                    return true;
                } else {
                    return false;
                }
            } else {
                return false;
            }
        } catch (ExecException ee) {
            throw ee;
        }
    }
}
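
With a FilterFunc, the call site in the script changes too: the UDF is used as the boolean condition itself instead of being compared with ==. A sketch of how that would look (assuming the UDF internally knows which date to keep):

B = FILTER A BY IsPrevMonth((chararray)daily_partition);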

Pig - passing Databag to UDF constructor

I have a script which is loading some data about venues:
venues = LOAD 'venues_extended_2.csv' USING org.apache.pig.piggybank.storage.CSVLoader() AS (Name:chararray, Type:chararray, Latitude:double, Longitude:double, City:chararray, Country:chararray);
Then I want to create a UDF with a constructor that accepts the venues relation.
So I tried to define the UDF like this:
DEFINE GenerateVenues org.gla.anton.udf.main.GenerateVenues(venues);
And here is the actual UDF:
public class GenerateVenues extends EvalFunc<Tuple> {
    TupleFactory mTupleFactory = TupleFactory.getInstance();
    BagFactory mBagFactory = BagFactory.getInstance();
    private static final String ALLCHARS = "(.*)";
    private ArrayList<String> venues;
    private String regex;

    public GenerateVenues(DataBag venuesBag) {
        Iterator<Tuple> it = venuesBag.iterator();
        venues = new ArrayList<String>((int) (venuesBag.size() + 1)); // possible fails!!!
        String current = "";
        regex = "";
        while (it.hasNext()) {
            Tuple t = it.next();
            try {
                current = "(" + ALLCHARS + t.get(0) + ALLCHARS + ")";
                venues.add((String) t.get(0));
            } catch (ExecException e) {
                throw new IllegalArgumentException("VenuesRegex: requires tuple with at least one value");
            }
            regex += current + (it.hasNext() ? "|" : "");
        }
    }

    @Override
    public Tuple exec(Tuple tuple) throws IOException {
        // expect one string
        if (tuple == null || tuple.size() != 2) {
            throw new IllegalArgumentException(
                    "BagTupleExampleUDF: requires two input parameters.");
        }
        try {
            String tweet = (String) tuple.get(0);
            for (String venue : venues) {
                if (tweet.matches(ALLCHARS + venue + ALLCHARS)) {
                    Tuple output = mTupleFactory.newTuple(Collections.singletonList(venue));
                    return output;
                }
            }
            return null;
        } catch (Exception e) {
            throw new IOException(
                    "BagTupleExampleUDF: caught exception processing input.", e);
        }
    }
}
When executed, the script fires an error at the DEFINE part, just before (venues);:
2013-12-19 04:28:06,072 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1200: <file script.pig, line 6, column 60> mismatched input 'venues' expecting RIGHT_PAREN
Obviously I'm doing something wrong; can you help me figure out what it is?
Is it that the UDF cannot accept the venues relation as a parameter? Or is the relation not represented by a DataBag, as in public GenerateVenues(DataBag venuesBag)?
Thanks!
PS I'm using Pig version 0.11.1.1.3.0.0-107.
As @WinnieNicklaus already said, you can only pass strings to UDF constructors.
Having said that, the solution to your problem is to use the distributed cache: you need to override public List<String> getCacheFiles() to return a list of filenames that will be made available via the distributed cache. With that, you can read the file as a local file and build your table.
The downside is that Pig has no initialization function, so you have to implement something like
private void init() {
    if (!this.initialized) {
        // read table
    }
}
and then call that as the first thing from exec.
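A rough sketch of that pattern, assuming a Pig version whose EvalFunc exposes getCacheFiles() (the HDFS path and the venues alias are placeholders for illustration):

public class GenerateVenues extends EvalFunc<Tuple> {
    private boolean initialized = false;
    private List<String> venues;

    @Override
    public List<String> getCacheFiles() {
        // Ship the HDFS file to every node; it becomes readable locally
        // under the alias given after the '#'.
        return Collections.singletonList("/user/me/venues_extended_2.csv#venues");
    }

    private void init() throws IOException {
        if (!initialized) {
            // The alias is now a plain local file inside the task's working directory.
            venues = Files.readAllLines(Paths.get("./venues"), StandardCharsets.UTF_8);
            initialized = true;
        }
    }

    @Override
    public Tuple exec(Tuple tuple) throws IOException {
        init();
        // ... matching logic from the question, using the venues list ...
        return null;
    }
}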
You can't use a relation as a parameter in a UDF constructor. Only strings can be passed as arguments, and if they are really of another type, you will have to parse them out in the constructor.