ScalaQuery: how to create a datetime field

I found out that ScalaQuery uses java.sql.Date as its date type, but the time portion is dropped when I create a java.sql.Date.
Is there any way to create a MySQL datetime field in ScalaQuery?

There is no java.util.Date support in ScalaQuery. However, it's possible to enhance ScalaQuery with your own type mappers. For java.util.Date such a mapper could look like this:
implicit val JavaUtilDateTypeMapper = MappedTypeMapper.base[java.util.Date, Long] (_.getTime, new java.util.Date(_))
converting the java.util.Date into a simple Long. Which target type is best depends on your database; in my case a Long works perfectly.
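With that implicit mapper in scope, columns can then be declared directly against java.util.Date. A minimal sketch of a table definition (the table and column names are made up, and the exact imports depend on your ScalaQuery version and driver):

import org.scalaquery.ql.extended.MySQLDriver.Implicit._
import org.scalaquery.ql.extended.{ExtendedTable => Table}

object Events extends Table[(Int, java.util.Date)]("events") {
  def id = column[Int]("id")
  // resolved through the implicit JavaUtilDateTypeMapper above, persisted as a Long
  def createdAt = column[java.util.Date]("created_at")
  def * = id ~ createdAt
}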

Instead of using java.sql.Date, try using java.sql.Timestamp when declaring the column.
It looks like that yields a better mapping for a column with both date and time elements.
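For example, a sketch of such a column declaration inside a table definition (the column name is made up):

def createdAt = column[java.sql.Timestamp]("created_at")

The built-in mapping for java.sql.Timestamp keeps both the date and the time parts, which is what a MySQL DATETIME/TIMESTAMP column needs.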

Related

java.util.Date to kotlinx.datetime.LocalDateTime

I have a value of type java.util.Date which was obtained from a legacy third-party API. Is there a direct way of converting it to kotlinx.datetime.LocalDateTime? I know how to do it only in a roundabout way, such as serializing to String and deserializing, or by converting to java.time.LocalDateTime first and then to the wanted type.
I think the best thing you can do is to convert it to a string and then convert that into a kotlinx.datetime.LocalDateTime.
The problem is that not all Java types can be converted directly into a Kotlin one, especially not the old data type java.util.Date.
It is also possible with the newer type java.time.LocalDateTime.

Performance difference - Jackson ObjectMapper.writeValue(writer, val) vs ObjectMapper.writeValueAsString(val)

Is there any significant performance difference between the following two?
String json = mapper.writeValueAsString(searchResult);
response.getWriter().write(json);
vs
mapper.writeValue(response.getWriter(), searchResult);
writeValueAsString JavaDoc says:
Method that can be used to serialize any Java value as a String.
Functionally equivalent to calling writeValue(Writer,Object) with
StringWriter and constructing String, but more efficient.
So, in case you want the JSON as a String, it is better to use this method than writeValue. Both of these methods use _configAndWriteValue internally.
In your case, however, it is better to write the JSON directly to response.getWriter() than to generate a String object and then write it to response.getWriter().

What is a good alternative to Spark's CatalystSqlParser?

I have previously used CatalystSqlParser to parse input strings to DataType like this:
private def convertToDataType(inputType: String): DataType = CatalystSqlParser.parseDataType(inputType)
It was very convenient and easy to implement. However, as far as I can see, CatalystSqlParser is no longer available for use; the import org.apache.spark.sql.catalyst.parser.CatalystSqlParser does not work.
Is there any alternative similar to CatalystSqlParser?
You can get the same behavior as CatalystSqlParser.parseDataType by calling DataType.fromDDL().
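A minimal sketch of the replacement (the example type strings are arbitrary):

import org.apache.spark.sql.types.DataType

private def convertToDataType(inputType: String): DataType =
  DataType.fromDDL(inputType)

// e.g. convertToDataType("int"), convertToDataType("array<double>"),
//      convertToDataType("struct<name:string,age:int>")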

How to convert a PCollection<String> variable into a String

I have a PCollection<String> and I want to transform it to get the values of a specific column from a BigQuery table, so I used BigQueryIO.readTableRows to read the values from BigQuery.
Here is my Code:
PCollection<TableRow> getConfigTable = pipeline.apply("read from Table",
    BigQueryIO.readTableRows().from("TableName"));
RetrieveDestTableName retrieveDestTableName = new RetrieveDestTableName();
PCollection<String> getDestTableName = getConfigTable.apply(ParDo.of(new DoFn<TableRow, String>() {
    @ProcessElement
    public void processElement(ProcessContext c) {
        c.output(c.element().get("ColumnName").toString());
    }
}));
As per the above code I get an output getDestTableName of type PCollection<String>, but I want this output in a String variable.
Is there any way to convert a PCollection<String> to a String variable so that I can use it in my code?
Converting a PCollection<String> to a String is not possible in the Apache Beam programming model. A PCollection simply describes the state of the pipeline at any given point. During development, you do not have literal access to the strings in the PCollection.
You can process the strings in a PCollection through transforms. However, it seems like you need the table configuration to construct the rest of the pipeline. You'll need to know the destination ahead of time or you can use DynamicDestinations to determine which table to write to during pipeline execution. You cannot get the table configuration value from the PCollection and use it to further construct the pipeline.
It seems that you want something like JdbcIO.readAll() but for BigQuery, allowing the read configuration(s) to be dynamically computed by the pipeline. This is currently not implemented for BigQuery, but it'd be a reasonable request.
Meanwhile your options are:
Express what you're doing as a more complex BigQuery SQL query, and use a single BigQueryIO.read().fromQuery()
Express the part of your pipeline where you extract the table of interest without the Beam API, instead using the BigQuery API directly, so you are operating on regular Java variables instead of PCollections.
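A minimal sketch of that second option, shown here in Scala with the google-cloud-bigquery client (the project, dataset, table, and column names are hypothetical):

import com.google.cloud.bigquery.{BigQueryOptions, QueryJobConfiguration}

// Look up the destination table name outside of Beam, before the pipeline is built,
// so it ends up in a plain String variable rather than a PCollection.
val bigquery = BigQueryOptions.getDefaultInstance.getService
val configQuery = QueryJobConfiguration.newBuilder(
  "SELECT DestTableName FROM `my-project.my_dataset.ConfigTable` LIMIT 1").build()
val destTableName: String =
  bigquery.query(configQuery).iterateAll().iterator().next().get("DestTableName").getStringValue

// destTableName can now be used while constructing the rest of the pipeline,
// e.g. in BigQueryIO.writeTableRows().to(destTableName).

The same lookup can of course be written with the Java client directly; the point is that it happens before the pipeline runs, so the result is an ordinary variable.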

Apply SQL "LIKE" to bytes

I must create a DAO with Hibernate that works in a generic way, that is, it executes queries based on the types of the properties.
My generic DAO works fine when filtering String properties of any class; it accepts "contains", "starts with", and "ends with" using "like" restrictions:
Restrictions.like(propertyName, (String) value, getMatchMode());
The problem is that I also need similar "contains", "starts with", and "ends with" filters for byte array (byte[]) properties, and the Hibernate
SimpleExpression like(String propertyName, Object value)
API does not work (probably entirely expected), so I was thinking maybe I could convert the bytes stored in the DB into a String and then, as a workaround, apply the normal String-based Restrictions.like API.
The problem is that I think there's no standard way to convert byte[] into a String, since there's no standard binary data type among DB platforms: Oracle uses RAW, HSQL uses VARBINARY, and so on (Oracle has its own RAWTOHEX, for instance).
If any of you have an idea how to sort out this problem, it will be very welcome.
Cheers.
///RGB
In MySQL you could use HEX to convert BINARY to a String, i.e.:
SELECT *
FROM myTable
WHERE HEX(myBinaryField) LIKE 'abc%'
In your Java code you could use a Base64 encoder, which converts the bytes to a String. Then you could just persist the Base64-encoded String and use normal LIKE queries. Maybe not the most efficient way, but it should work well.
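A sketch of that suggestion (in Scala, reusing the propertyName, value, and getMatchMode() pieces from the generic DAO above; it assumes the property is persisted as a Base64-encoded String column):

import java.util.Base64
import org.hibernate.criterion.Restrictions

// Encode the search bytes the same way the persisted values were encoded,
// then fall back to the ordinary String-based like restriction.
val encoded = Base64.getEncoder.encodeToString(value.asInstanceOf[Array[Byte]])
val criterion = Restrictions.like(propertyName, encoded, getMatchMode())

Note that Base64 encodes 3 bytes as 4 characters, so a "contains" match only lines up when the sought bytes start on a 3-byte boundary; the HEX approach above does not have that limitation.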