Eclipselink not converting oracle.sql.TIMESTAMPTZ

I am getting an error when fetching a value from the DB using EclipseLink as the persistence provider. It is not converting oracle.sql.TIMESTAMPTZ to java.sql.Timestamp or java.util.Date.
Query q = em.createNativeQuery("SELECT * FROM MY_Schema.MyTable MT WHERE MT.START_DT = (SELECT MAX(START_DT) FROM MY_Schema.MyTable)",MyTable.class);
@Entity
@Table(name = "MyTable", schema = "MY_Schema")
public class MyTable implements Serializable {
    @EmbeddedId
    private MyTableId id;
}

@Embeddable
public class MyTableId implements Serializable {
    @Temporal(TemporalType.TIMESTAMP)
    @Column(name = "END_DT")
    private Calendar endTime;

    @Temporal(TemporalType.TIMESTAMP)
    @Column(name = "START_DT")
    private Calendar startTime;
}
Exception
Caused by: Exception [EclipseLink-3001] (Eclipse Persistence Services - 2.6.8.WAS-v20181218-0accd7f): org.eclipse.persistence.exceptions.ConversionException
Exception Description: The object [oracle.sql.TIMESTAMPTZ@6156ebf7], of class [class oracle.sql.TIMESTAMPTZ], could not be converted to [class java.sql.Timestamp].
    at org.eclipse.persistence.exceptions.ConversionException.couldNotBeConverted(ConversionException.java:78)
    at org.eclipse.persistence.internal.helper.ConversionManager.convertObjectToTimestamp(ConversionManager.java:751)
    at org.eclipse.persistence.internal.helper.ConversionManager.convertObject(ConversionManager.java:112)
Things I found
While debugging, I found that EclipseLink's ConversionManager class has no handling for oracle.sql.TIMESTAMPTZ; it throws the exception directly in this method:
/**
 * INTERNAL:
 * Build a valid instance of java.sql.Timestamp from the given source object.
 * @param sourceObject Valid object of class java.sql.Timestamp, String, java.util.Date, or Long
 */
protected java.sql.Timestamp convertObjectToTimestamp(Object sourceObject) throws ConversionException {
    java.sql.Timestamp timestamp = null;
    if (sourceObject instanceof java.sql.Timestamp) {
        return (java.sql.Timestamp) sourceObject; // Helper timestamp is not caught on class check.
    }
    if (sourceObject instanceof String) {
        timestamp = Helper.timestampFromString((String) sourceObject);
    } else if (sourceObject instanceof java.util.Date) { // This handles all date and subclasses, sql.Date, sql.Time conversions.
        timestamp = Helper.timestampFromDate((java.util.Date) sourceObject);
    } else if (sourceObject instanceof Calendar) {
        return Helper.timestampFromCalendar((Calendar) sourceObject);
    } else if (sourceObject instanceof Long) {
        timestamp = Helper.timestampFromLong((Long) sourceObject);
    } else {
        throw ConversionException.couldNotBeConverted(sourceObject, ClassConstants.TIMESTAMP);
    }
    return timestamp;
}

oracle.sql.TIMESTAMPTZ handling is database specific and is done by subclasses of Oracle9Platform. Make sure you have specified the correct target-database platform class matching your database via the 'target-database' persistence property.
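For example, a minimal persistence.xml sketch (the unit name is illustrative; this assumes an Oracle 12c database and EclipseLink 2.6+, so pick the value matching your Oracle version, e.g. 'Oracle11'):

<!-- Selects Oracle12Platform, a subclass of Oracle9Platform that
     knows how to handle oracle.sql.TIMESTAMPTZ. -->
<persistence-unit name="myUnit">
    <properties>
        <property name="eclipselink.target-database" value="Oracle12"/>
    </properties>
</persistence-unit>

With a matching Oracle platform configured, the Oracle-specific type is handled while reading the result set, so ConversionManager should no longer receive the raw TIMESTAMPTZ.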

Related

jackson cannot deserialize (spring cloud stream kafka)

I am trying to read a JSON message from Kafka and got an exception saying that Jackson cannot deserialize the JSON to a POJO.
The JSON looks like {"code":"500","count":22,"from":1528343820000,"to":1528343880000}, which is the output of a Kafka stream.
The POJO declares all attributes of the JSON and is exactly the same POJO used to produce the message, so I have no idea why this happens.
I am using Spring Cloud Stream 2.0.0.RELEASE.
Any help would be appreciated. Thanks.
POJO:
public class CodeCount {
    private String code;
    private long count;
    private Date from;
    private Date to;

    @Override
    public String toString() {
        return "CodeCount [code=" + code + ", count=" + count + ", from=" + from + ", to=" + to + "]";
    }

    public CodeCount(String code, long count, Date from, Date to) {
        super();
        this.code = code;
        this.count = count;
        this.from = from;
        this.to = to;
    }

    public String getCode() {
        return code;
    }

    public void setCode(String code) {
        this.code = code;
    }

    public long getCount() {
        return count;
    }

    public void setCount(long count) {
        this.count = count;
    }

    public Date getFrom() {
        return from;
    }

    public void setFrom(Date from) {
        this.from = from;
    }

    public Date getTo() {
        return to;
    }

    public void setTo(Date to) {
        this.to = to;
    }
}
Stacktrace:
2018-06-07 15:18:51.572 ERROR 1 --- [container-0-C-1] o.s.integration.handler.LoggingHandler : org.springframework.messaging.converter.MessageConversionException: Could not read JSON: Cannot construct instance of `com.example.CodeCount` (no Creators, like default construct, exist): cannot deserialize from Object value (no delegate- or property-based Creator)
at [Source: (byte[])"{"code":"500","count":22,"from":1528343820000,"to":1528343880000}"; line: 1, column: 2]; nested exception is com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of `com.example.CodeCount` (no Creators, like default construct, exist): cannot deserialize from Object value (no delegate- or property-based Creator)
at com.fasterxml.jackson.databind.exc.InvalidDefinitionException.from(InvalidDefinitionException.java:67) ~[jackson-databind-2.9.3.jar!/:2.9.3]
at com.fasterxml.jackson.databind.DeserializationContext.reportBadDefinition(DeserializationContext.java:1451) ~[jackson-databind-2.9.3.jar!/:2.9.3]
at com.fasterxml.jackson.databind.DeserializationContext.handleMissingInstantiator(DeserializationContext.java:1027) ~[jackson-databind-2.9.3.jar!/:2.9.3]
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1275) ~[jackson-databind-2.9.3.jar!/:2.9.3]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:325) ~[jackson-databind-2.9.3.jar!/:2.9.3]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:159) ~[jackson-databind-2.9.3.jar!/:2.9.3]
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4001) ~[jackson-databind-2.9.3.jar!/:2.9.3]
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3109) ~[jackson-databind-2.9.3.jar!/:2.9.3]
at org.springframework.messaging.converter.MappingJackson2MessageConverter.convertFromInternal(MappingJackson2MessageConverter.java:221) ~[spring-messaging-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
... 37 common frames omitted
Jackson needs access to a default constructor to deserialize, so add one to the POJO, i.e.:
public CodeCount() {
}
Alternatively, you can annotate the existing constructor and its arguments, and Jackson will use it:
@JsonCreator
public CodeCount(@JsonProperty("code") String code,
                 @JsonProperty("count") long count,
                 @JsonProperty("from") Date from,
                 @JsonProperty("to") Date to) {
    super();
    this.code = code;
    this.count = count;
    this.from = from;
    this.to = to;
}
Passing in the dates may complicate it a bit, but it is definitely still possible.
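In this case it should still work out of the box, because Jackson by default maps numeric JSON values into java.util.Date as epoch milliseconds. A minimal round-trip sketch (class and variable names are illustrative, assuming the CodeCount POJO above with the annotated constructor):

import com.fasterxml.jackson.databind.ObjectMapper;

public class CodeCountDemo {
    public static void main(String[] args) throws Exception {
        String json = "{\"code\":\"500\",\"count\":22,\"from\":1528343820000,\"to\":1528343880000}";
        ObjectMapper mapper = new ObjectMapper();
        // "from" and "to" are numeric, so Jackson builds Dates from epoch millis.
        CodeCount cc = mapper.readValue(json, CodeCount.class);
        System.out.println(cc); // prints via the POJO's toString()
    }
}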

The implementation of the FlinkKafkaConsumer010 is not serializable error

I created a custom class that is based on Apache Flink. The following are some parts of the class definition:
public class StreamData {
    private StreamExecutionEnvironment env;
    private DataStream<byte[]> data;
    private Properties properties;

    public StreamData() {
        env = StreamExecutionEnvironment.getExecutionEnvironment();
    }

    public StreamData(StreamExecutionEnvironment e, DataStream<byte[]> d) {
        env = e;
        data = d;
    }

    public StreamData getDataFromESB(String id, int from) {
        final Pattern TOPIC = Pattern.compile(id);
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        properties.setProperty("group.id", Long.toString(System.currentTimeMillis()));
        properties.setProperty("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.setProperty("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        properties.put("metadata.max.age.ms", 30000);
        properties.put("enable.auto.commit", "false");
        if (from == 0)
            properties.setProperty("auto.offset.reset", "earliest");
        else
            properties.setProperty("auto.offset.reset", "latest");
        StreamExecutionEnvironment e = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<byte[]> stream = env
                .addSource(new FlinkKafkaConsumer011<>(TOPIC, new AbstractDeserializationSchema<byte[]>() {
                    @Override
                    public byte[] deserialize(byte[] bytes) {
                        return bytes;
                    }
                }, properties));
        return new StreamData(e, stream);
    }

    public void print() {
        data.print();
    }

    public void execute() throws Exception {
        env.execute();
    }
}
Using the StreamData class, I try to get some data from Apache Kafka and print it in the main function:
StreamData stream = new StreamData();
stream.getDataFromESB("original_data", 0);
stream.print();
stream.execute();
I got the error:
Exception in thread "main" org.apache.flink.api.common.InvalidProgramException: The implementation of the FlinkKafkaConsumer010 is not serializable. The object probably contains or references non serializable fields.
Caused by: java.io.NotSerializableException: StreamData
As mentioned here, I think it's because some data type in the getDataFromESB function is not serializable, but I don't know how to solve the problem!
Your AbstractDeserializationSchema is an anonymous inner class, which as a result contains a reference to the outer StreamData class, which isn't serializable. Either let StreamData implement Serializable, or define your schema as a top-level class, as sketched below.
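A minimal sketch of the second option (the class name is illustrative):

// A top-level schema holds no hidden reference to StreamData,
// so Flink can serialize it and ship it to the task managers.
public class PassthroughSchema extends AbstractDeserializationSchema<byte[]> {
    @Override
    public byte[] deserialize(byte[] bytes) {
        return bytes;
    }
}

In getDataFromESB, replace the anonymous class with new PassthroughSchema().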
It seems that you are importing FlinkKafkaConsumer010 but using FlinkKafkaConsumer011 in your code (the error names FlinkKafkaConsumer010). Please use the matching dependency in your sbt file:
"org.apache.flink" %% "flink-connector-kafka-0.11" % flinkVersion

Codec not found for requested operation: [list<varchar> <-> java.nio.HeapByteBuffer]

I am having an issue storing data to a Cassandra table from Apache Ignite when inserting into a column of list data type in Cassandra.
Cassandra table:
CREATE TABLE business_categories (
    id int,
    category_name TEXT,
    sub_categories list<TEXT>,
    PRIMARY KEY(category_name, id)
);
XML file:
<persistence keyspace="ignite" table="business_categories">
    <keyspaceOptions>
        REPLICATION = {'class' : 'SimpleStrategy', 'replication_factor' : 1}
        AND DURABLE_WRITES = true
    </keyspaceOptions>
    <tableOption>
        comment = 'Cache test'
        AND read_repair_chance = 0.2
    </tableOption>
    <keyPersistence class="com.cache.business.model.BusinessCategoriesKey" strategy="POJO"/>
    <valuePersistence class="com.cache.business.model.BusinessCategoriesValue" strategy="POJO"/>
</persistence>
key class object:
public class BusinessCategoriesKey implements Serializable {
    private static final long serialVersionUID = 581472167344584014L;
    private int id;
    private String category_name;

    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    public String getCategory_name() {
        return category_name;
    }

    public void setCategory_name(String category_name) {
        this.category_name = category_name;
    }
}
value class object:
public class BusinessCategoriesValue implements Serializable {
    private static final long serialVersionUID = -1694694702874919854L;
    private List<String> sub_categories = new ArrayList<>();

    public List<String> getSub_categories() {
        return sub_categories;
    }

    public void setSub_categories(List<String> sub_categories) {
        this.sub_categories = sub_categories;
    }

    public static long getSerialversionuid() {
        return serialVersionUID;
    }
}
I am getting the below error message
Caused by: com.datastax.driver.core.exceptions.CodecNotFoundException: Codec not found for requested operation: [list<varchar> <-> java.nio.HeapByteBuffer]
The sub_categories field is a java.util.List, and it seems Apache Ignite does not provide a direct mapping to an appropriate Cassandra type for such Java types.
This field can therefore be persisted into Cassandra only if you manually specify all mapping details for the object type and if the field type itself implements the java.io.Serializable interface.
In that case, the field will be persisted into a separate table column as a blob.
Please try to modify your code in the following way:
CREATE TABLE business_categories (
    id int,
    category_name text,
    sub_categories blob,
    PRIMARY KEY(category_name, id)
);
Persistence descriptor:
<persistence keyspace="ignite" table="business_categories">
    <keyspaceOptions>
        REPLICATION = {'class' : 'SimpleStrategy', 'replication_factor' : 1}
        AND DURABLE_WRITES = true
    </keyspaceOptions>
    <tableOption>
        comment = 'Cache test'
        AND read_repair_chance = 0.2
    </tableOption>
    <keyPersistence class="com.cache.business.model.BusinessCategoriesKey" strategy="POJO"/>
    <valuePersistence class="com.cache.business.model.BusinessCategoriesValue"
                      strategy="POJO"
                      serializer="org.apache.ignite.cache.store.cassandra.serializer.JavaSerializer">
        <field name="sub_categories" column="sub_categories"/>
    </valuePersistence>
</persistence>
You can find additional details here: Cassandra Integration Examples

OrmLite Foreign Collection to List

I am trying to use foreign collections in ORMLite. However, I don't know how to convert one into a list. I tried something like this:
public class Car implements Serializable {
    @DatabaseField(columnName = "carId", generatedId = true, id = true)
    private int id;

    @DatabaseField(columnName = "carNumber")
    private String mNumber;

    @DatabaseField(columnName = "carName")
    private String mName;

    @ForeignCollectionField(eager = true, columnName = "carParts")
    private Collection<Part> mParts;

    ArrayList<Part> parts = new ArrayList<>(mParts);

    public ArrayList<Part> getParts() {
        return parts;
    }

    public void setParts(ArrayList<Part> parts) {
        this.parts = parts;
    }
}
but when I try to use it I get an exception:
java.lang.NullPointerException: collection == null
at this line:
ArrayList<Part> parts = new ArrayList<>(mParts);
Please help.
The reason is simple: you have to wait until mParts is initialized by the ORMLite library; then you can create an ArrayList from it:
public ArrayList<Part> getParts() {
    return new ArrayList<>(mParts);
}
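A usage sketch under that assumption (the DAO setup and id are illustrative): the foreign collection is populated when ORMLite loads the entity, so building the list inside the getter is safe.

// Hypothetical setup; connectionSource comes from your OrmLite configuration.
Dao<Car, Integer> carDao = DaoManager.createDao(connectionSource, Car.class);
Car car = carDao.queryForId(1);         // mParts is filled here (eager = true)
ArrayList<Part> parts = car.getParts(); // safe: the collection is already initialized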

java.lang.IllegalStateException: No application context active

I am getting an IllegalStateException in the code below, at
PropertyDao propertyDao = PropertyDao.getInstance();
public class JMySpellCheckerServlet extends TinyMCESpellCheckerServlet {
    private static final long serialVersionUID = -2460237918745522935L;

    private SpellChecker loadSpellChecker(final String lang) throws SpellCheckException {
        PropertyDao propertyDao = PropertyDao.getInstance();
        McsProperty messageLangProperty =
                propertyDao.getMcsProperty(PropertyDao.PROPERTY_MESSAGE_LANG);
    }
}
Obviously the application context is not yet active... you can set it up explicitly like this (before calling PropertyDao.getInstance):
if (!Contexts.isApplicationContextActive()) {
    Lifecycle.setupApplication();
}
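A sketch of where the guard could go, assuming loadSpellChecker is the first code to touch the Seam context (the rest of the method is elided):

private SpellChecker loadSpellChecker(final String lang) throws SpellCheckException {
    // Bootstrap Seam's application context when the servlet runs outside a Seam request.
    if (!Contexts.isApplicationContextActive()) {
        Lifecycle.setupApplication();
    }
    PropertyDao propertyDao = PropertyDao.getInstance();
    McsProperty messageLangProperty =
            propertyDao.getMcsProperty(PropertyDao.PROPERTY_MESSAGE_LANG);
    // ...
}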