UnsupportedOperationException with a MyBatis SQL query

What I want in this code is to take an arbitrary SQL query from the outside, loaded from a property file. For now the property file contains a single query so I can test this. I'd like to get some data out, with a headline row and all the records beneath it; basically just the data to start with, which is what this test should do. But I get the error message I've included below, and I can't for the life of me figure out where my problem is. Help please! :)
I've got the following code:
DataHandler class:
public class DataHandler {

    DataService dataService = new DataService();

    public String getPropertyValue() throws IOException {
        Properties prop = new Properties();
        String propFileName = "randomSqlQuery.properties";
        InputStream inputStream = getClass().getClassLoader().getResourceAsStream(propFileName);
        prop.load(inputStream);
        if (inputStream == null) {
            throw new FileNotFoundException("property file '" + propFileName + "' not found in the classpath");
        }
        String result = prop.getProperty("sqlQuery");
        return result;
    }

    public Data getKeysAndValues() throws IOException {
        String query = getPropertyValue();
        List<List<Object>> randomSqlQuery = dataService.getRandomSqlQuery(query);
        List<List<Object>> recordList = new ArrayList<>();
        List<String> headline = new ArrayList<>();
        if (randomSqlQuery != null && randomSqlQuery.size() > 0) {
            {
                List<Object> record = randomSqlQuery.get(0);
                getHeadlines(record, headline);
            }
            for (int i = 1; i < randomSqlQuery.size(); i++) {
                List<Object> singleRecord = randomSqlQuery.get(i);
                recordList.add(singleRecord);
                System.out.println(recordList);
            }
        }
        return new DataImpl(headline, recordList);
    }

    private void getHeadlines(List<Object> record, List<String> headline) {
        for (Object headlineName : record) {
            headline.add((String) headlineName);
            System.out.println(headlineName);
        }
    }
}
DataMapper class:
public interface DataMapper {
    public List<List<Object>> getRandomSqlQuery(@Param("query") String query);
}
DataMapper XML:
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN"
        "http://mybatis.org/dtd/mybatis-3-mapper.dtd">
<mapper namespace="nd.mappers.DataMapper">
    <select id="getRandomSqlQuery" resultType="java.util.List">
        ${query}
    </select>
</mapper>
DataImpl class, which has an interface connected to it:
public class DataImpl implements Serializable, Data {

    private final List<String> headers;
    private final List<List<Object>> records;

    public DataImpl(List<String> headers, List<List<Object>> records) {
        this.headers = Collections.unmodifiableList(headers);
        this.records = Collections.unmodifiableList(records);
    }

    @Override
    public List<String> getHeaders() {
        return this.headers;
    }

    @Override
    public List<List<Object>> getRecords() {
        return this.records;
    }
}
And a DataService class:
public class DataService implements DataMapper {

    @Override
    public List<List<Object>> getRandomSqlQuery(String query) {
        SqlSession sqlSession = MyBatisUtil.getSqlSessionFactory().openSession();
        try {
            DataMapper dataMapper = sqlSession.getMapper(DataMapper.class);
            return dataMapper.getRandomSqlQuery(query);
        } finally {
            sqlSession.close();
        }
    }
}
Finally, the test:
// dataHandler instantiated at the top
@Test
public void getKeysAndValues() throws IOException {
    dataHandler.getKeysAndValues();
}
And here is my error!
### Error querying database. Cause: java.lang.UnsupportedOperationException
### The error may exist in nd/mappers/DataMapper.xml
### The error may involve defaultParameterMap
### The error occurred while setting parameters
### SQL: SELECT * FROM PERSON
### Cause: java.lang.UnsupportedOperationException
org.apache.ibatis.exceptions.PersistenceException
### Error querying database. Cause: java.lang.UnsupportedOperationException
### The error may exist in nd/mappers/DataMapper.xml
### The error may involve defaultParameterMap
### The error occurred while setting parameters
### SQL: SELECT * FROM PERSON
### Cause: java.lang.UnsupportedOperationException
at org.apache.ibatis.exceptions.ExceptionFactory.wrapException(ExceptionFactory.java:26)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:111)
No idea what to do. The SQL is coming from the property file. Sorry for the massive text.

I just resolved the same error message. The problem lies here:
<select id="getRandomSqlQuery" resultType="Person">
    ${query}
</select>
The result type should not be a collection but the type that the collection contains. In your case it should be some Person POJO. This type then must be defined in the configuration file (say, mybatis-config.xml):
<typeAliases>
    <typeAlias alias="Person" type="com.example.bean.Person"/>
</typeAliases>
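With that in place, the mapper pieces would change as well. Here is a rough sketch of how they might line up; the Person bean and its fields are placeholders for illustration, not code from the original project:
// Hypothetical Person bean; the fields are only an example of what
// the columns of SELECT * FROM PERSON might map to.
public class Person {
    private Long id;
    private String name;
    // getters and setters omitted
}

// The mapper method then returns a list of that element type
// instead of List<List<Object>>.
public interface DataMapper {
    List<Person> getRandomSqlQuery(@Param("query") String query);
}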
Also, you would want to check for null here:
try {
    DataMapper dataMapper = sqlSession.getMapper(DataMapper.class);
    return dataMapper.getRandomSqlQuery(query);
} finally {
    if (sqlSession != null) {
        sqlSession.close();
    }
}

Related

JUnit 5 Parameterized test @ArgumentsSource parameters not loading

I have created the JUnit 5 parameterized test below, with an ArgumentsSource that loads the arguments for the test:
public class DemoModelValidationTest {

    public ParamsProvider paramsProvider;

    public DemoModelValidationTest() {
        try {
            paramsProvider = new ParamsProvider();
        }
        catch (Exception iaex) {
        }
    }

    @ParameterizedTest
    @ArgumentsSource(ParamsProvider.class)
    void testAllConfigurations(int configIndex, String a) throws Exception {
        paramsProvider.executeSimulation(configIndex);
    }
}
and the ParamsProvider class looks like below:
public class ParamsProvider implements ArgumentsProvider {

    public static final String modelPath = System.getProperty("user.dir") + File.separator + "demoModels";

    YAMLDeserializer deserializedYAML;
    MetaModelToValidationModel converter;
    ValidationRunner runner;
    List<Configuration> configurationList;
    List<Arguments> listOfArguments;

    public ParamsProvider() throws Exception {
        configurationList = new ArrayList<>();
        listOfArguments = new LinkedList<>();
        deserializedYAML = new YAMLDeserializer(modelPath);
        deserializedYAML.load();
        converter = new MetaModelToValidationModel(deserializedYAML);
        runner = converter.convert();
        configurationList = runner.getConfigurations();
        for (int i = 0; i < configurationList.size(); i++) {
            listOfArguments.add(Arguments.of(i, configurationList.get(i).getName()));
        }
    }

    public void executeSimulation(int configListIndex) throws Exception {
        final Configuration config = runner.getConfigurations().get(configListIndex);
        runner.run(config);
        runner.getReporter().consolePrintReport();
    }

    @Override
    public Stream<? extends Arguments> provideArguments(ExtensionContext context) {
        return listOfArguments.stream().map(Arguments::of);
        // return Stream.of(Arguments.of(0, "Actuator Power"), Arguments.of(1, "Error Logging"));
    }
}
In the provideArguments() method, the commented out code is working fine, but the first line of code
listOfArguments.stream().map(Arguments::of)
is returning the following error:
org.junit.platform.commons.PreconditionViolationException: Configuration error: You must configure at least one set of arguments for this #ParameterizedTest
I am not sure whether I have a casting problem with the stream in the provideArguments() method, but I guess it somehow cannot map the elements of listOfArguments to a stream that finally takes the form below:
Stream.of(Arguments.of(0, "Actuator Power"), Arguments.of(1, "Error Logging"))
Am I missing a proper stream mapping of listOfArguments?
provideArguments(…) is called before your test is invoked.
Your ParamsProvider class is instantiated by JUnit, so whatever you're doing in desiralizeAndCreateValidationRunnerInstance should be done in the ParamsProvider constructor.
Also, you're already wrapping the values from the deserialised configurations in Arguments, and then you're double-wrapping them in provideArguments.
Do this:
@Override
public Stream<? extends Arguments> provideArguments(ExtensionContext context) {
    return listOfArguments.stream();
}
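To see what the double wrapping the answer mentions amounts to, here is a small sketch using the same types and the example values from the commented-out line (assumed imports are the same as in the question):
// listOfArguments already holds Arguments instances, e.g.:
List<Arguments> listOfArguments = new LinkedList<>();
listOfArguments.add(Arguments.of(0, "Actuator Power"));

// Mapping through Arguments::of wraps each element again, so every
// invocation receives a single Arguments object instead of the
// (int, String) pair the test method declares:
Stream<? extends Arguments> doubleWrapped = listOfArguments.stream().map(Arguments::of);

// Returning the stream directly keeps one Arguments per invocation:
Stream<? extends Arguments> correct = listOfArguments.stream();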

Upgrade Solution to use FluentValidation Ver 10 Exception Issue

Please, I need your help to solve a FluentValidation issue. I have an old desktop application that I wrote a few years ago using FluentValidation v4. Now I'm trying to upgrade this application to .NET Framework 4.8 and FluentValidation v10, but unfortunately I can't continue because of an exception that I still cannot fix.
I have this customer class:
class Customer : MyClassBase
{
    string _CustomerName = string.Empty;

    public string CustomerName
    {
        get { return _CustomerName; }
        set
        {
            if (_CustomerName == value)
                return;
            _CustomerName = value;
        }
    }

    class CustomerValidator : AbstractValidator<Customer>
    {
        public CustomerValidator()
        {
            RuleFor(obj => obj.CustomerName).NotEmpty().WithMessage("{PropertyName} is Empty");
        }
    }

    protected override IValidator GetValidator()
    {
        return new CustomerValidator();
    }
}
This is my base class:
class MyClassBase
{
    public MyClassBase()
    {
        _Validator = GetValidator();
        Validate();
    }

    protected IValidator _Validator = null;
    protected IEnumerable<ValidationFailure> _ValidationErrors = null;

    protected virtual IValidator GetValidator()
    {
        return null;
    }

    public IEnumerable<ValidationFailure> ValidationErrors
    {
        get { return _ValidationErrors; }
        set { }
    }

    public void Validate()
    {
        if (_Validator != null)
        {
            var context = new ValidationContext<Object>(_Validator);
            var results = _Validator.Validate(context); // <======= Exception is thrown on this line
            _ValidationErrors = results.Errors;
        }
    }

    public virtual bool IsValid
    {
        get
        {
            if (_ValidationErrors != null && _ValidationErrors.Count() > 0)
                return false;
            else
                return true;
        }
    }
}
When I run the application test I get the below exception:
System.InvalidOperationException
HResult=0x80131509
Message=Cannot validate instances of type 'CustomerValidator'. This validator can only validate instances of type 'Customer'.
Source=FluentValidation
StackTrace:
   at FluentValidation.ValidationContext`1.GetFromNonGenericContext(IValidationContext context) in C:\Projects\FluentValidation\src\FluentValidation\IValidationContext.cs:line 211
   at FluentValidation.AbstractValidator`1.FluentValidation.IValidator.Validate(IValidationContext context)
Please, what is the issue here and how can I fix it?
Thank you
Your overall implementation isn't what I'd consider normal usage; however, the problem is that you're asking FV to validate the validator instance rather than the customer instance:
var context = new ValidationContext<Object>(_Validator);
var results = _Validator.Validate(context);
It should start working if you change it to:
var context = new ValidationContext<object>(this);
var results = _Validator.Validate(context);
You're stuck with using the object argument for the validation context unless you introduce a generic argument to the base class, or create it using reflection.

java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to java.util.List

I have a List<List<String>> dataTableList and I would like to get a specific list from it and put it into my List<String> dataList, so that I can loop through that specific list's values and alter them.
However, whenever I try to do that, I always get an error of:
java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to java.util.List.
Here's a sample of how I am trying to assign a specific list from dataTableList to dataList:
// First I looped through the list of lists and called each fetched list dataList
for (List<String> dataList : getTryLang().getDataTableList()) {
    // then, I created an iterator to be used later when I return the list I fetched with altered values
    int iter = 0;
    // next, I created a for-loop to iterate through the values of the list that I fetched
    for (int x = 0; x < dataList.size(); x++) {
        // here, I formatted each value to amount or currency format.
        dataList.set(x, getDataConvert().convertAmount(dataList.get(x)));
        // finally, after I formatted everything, I returned it or set it to the specific index it was located at before
        getTryLang().getDataTableList().set(iter, dataList);
    }
    iter++;
}
EDIT:
Here's my code. I modified some of it and left some parts out so that I can focus on showing where the problem occurs.
Here's my TryLang.java:
@ManagedBean
@SessionScoped
public class TryLang implements Serializable {

    public TryLang() {}

    // declare
    private List<List<String>> dataTableList;

    // getter setter
    public List<List<String>> getDataTableList() {
        return dataTableList == null ? dataTableList = new ArrayList<>() : dataTableList;
    }

    public void setDataTableList(List<List<String>> dataTableList) {
        this.dataTableList = dataTableList;
    }
}
Then here's my BookOfAccountsController.java:
@ManagedBean
@RequestScoped
public class BooksOfAccountsController implements Serializable {

    public BooksOfAccountsController() {}

    // declare
    @ManagedProperty(value = "#{dataConvert}")
    private DataConvert dataConvert;

    @ManagedProperty(value = "#{tryLang}")
    private TryLang tryLang;

    // getter setter NOTE: I wouldn't include other getter setters to shorten the code here :)
    public TryLang getTryLang() {
        return tryLang == null ? tryLang = new TryLang() : tryLang;
    }

    public void setTryLang(TryLang tryLang) {
        this.tryLang = tryLang;
    }

    // I would just go straight to the method instead
    public void runBooksOfAccounts() throws SystemException, SQLException {
        // So there are dbCons here to connect to my DB and all. I'll just go straight to where the List<List<String>> is being set
        // Here's where the List<List<String>> is being set
        getTryLang().setDataTableList(getCemf().getFdemf().createEntityManager().createNativeQuery("SELECT crj.* FROM crj_rep crj").getResultList());
        getTryLang().setDataTableColumns(getCemf().getFdemf().createEntityManager().createNativeQuery("SELECT col.column_name FROM information_schema.columns col WHERE table_schema = 'public' AND table_name = 'crj_rep'").getResultList());
        for (int x = 0; x < getTryLang().getDataTableColumns().size(); x++) {
            try {
                Integer.parseInt(getTryLang().getDataTableColumns().get(x));
                getTryLang().getDataTableColumns().set(x, getDataConvert().accountCodeConvert(getTryLang().getDataTableColumns().get(x)));
                // then here is where the error points at
                for (List<String> dataList : getTryLang().getDataTableList()) {
                    try {
                        int iter = 0;
                        dataList.set(x, getDataConvert().convertAmount(new BigDecimal(dataList.get(x))));
                        getTryLang().getDataTableList().set(iter, dataList);
                        iter++;
                    } catch (ClassCastException ne) {
                        System.out.println("cannot convert " + ne);
                    }
                }
            } catch (NumberFormatException ne) {
                // print the error
            }
        }
    }
}

The implementation of the FlinkKafkaConsumer010 is not serializable error

I created a custom class that is based on Apache Flink. The following are some parts of the class definition:
public class StreamData {

    private StreamExecutionEnvironment env;
    private DataStream<byte[]> data;
    private Properties properties;

    public StreamData() {
        env = StreamExecutionEnvironment.getExecutionEnvironment();
    }

    public StreamData(StreamExecutionEnvironment e, DataStream<byte[]> d) {
        env = e;
        data = d;
    }

    public StreamData getDataFromESB(String id, int from) {
        final Pattern TOPIC = Pattern.compile(id);
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        properties.setProperty("group.id", Long.toString(System.currentTimeMillis()));
        properties.setProperty("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.setProperty("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        properties.put("metadata.max.age.ms", 30000);
        properties.put("enable.auto.commit", "false");
        if (from == 0)
            properties.setProperty("auto.offset.reset", "earliest");
        else
            properties.setProperty("auto.offset.reset", "latest");
        StreamExecutionEnvironment e = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<byte[]> stream = env
                .addSource(new FlinkKafkaConsumer011<>(TOPIC, new AbstractDeserializationSchema<byte[]>() {
                    @Override
                    public byte[] deserialize(byte[] bytes) {
                        return bytes;
                    }
                }, properties));
        return new StreamData(e, stream);
    }

    public void print() {
        data.print();
    }

    public void execute() throws Exception {
        env.execute();
    }
Using the StreamData class, I try to get some data from Apache Kafka and print it in the main function:
StreamData stream = new StreamData();
stream.getDataFromESB("original_data", 0);
stream.print();
stream.execute();
I got the error:
Exception in thread "main" org.apache.flink.api.common.InvalidProgramException: The implementation of the FlinkKafkaConsumer010 is not serializable. The object probably contains or references non serializable fields.
Caused by: java.io.NotSerializableException: StreamData
As mentioned here, I think it's because some data type in the getDataFromESB function is not serializable, but I don't know how to solve the problem!
Your AbstractDeserializationSchema is an anonymous inner class, which as a result contains a reference to the outer StreamData class which isn't serializable. Either let StreamData implement Serializable, or define your schema as a top-level class.
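A minimal sketch of the top-level-class option the answer describes (the class name here is illustrative):
// Defined as its own top-level class, the schema carries no implicit
// reference to StreamData, so Flink can serialize the Kafka source that uses it.
public class ByteArrayDeserializationSchema extends AbstractDeserializationSchema<byte[]> {
    @Override
    public byte[] deserialize(byte[] bytes) {
        return bytes;
    }
}

// and in getDataFromESB:
// .addSource(new FlinkKafkaConsumer011<>(TOPIC, new ByteArrayDeserializationSchema(), properties))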
It seems that you are importing FlinkKafkaConsumer010 in your code but using FlinkKafkaConsumer011. Please use the following dependency in your sbt file:
"org.apache.flink" %% "flink-connector-kafka-0.11" % flinkVersion

Storm Kafkaspout KryoSerialization issue for java bean from kafka topic

Hi, I am new to Storm and Kafka.
I am using Storm 1.0.1 and Kafka 0.10.0.
We have a KafkaSpout that receives a Java bean from a Kafka topic.
I have spent several hours digging to find the right approach for this.
I found a few articles that are useful, but none of the approaches have worked for me so far.
Following is my code:
StormTopology:
public class StormTopology {

    public static void main(String[] args) throws Exception {
        // Topo test /zkroot test
        if (args.length == 4) {
            System.out.println("started");
            BrokerHosts hosts = new ZkHosts("localhost:2181");
            SpoutConfig kafkaConf1 = new SpoutConfig(hosts, args[1], args[2], args[3]);
            kafkaConf1.zkRoot = args[2];
            kafkaConf1.useStartOffsetTimeIfOffsetOutOfRange = true;
            kafkaConf1.startOffsetTime = kafka.api.OffsetRequest.LatestTime();
            kafkaConf1.scheme = new SchemeAsMultiScheme(new KryoScheme());
            KafkaSpout kafkaSpout1 = new KafkaSpout(kafkaConf1);
            System.out.println("started");

            ShuffleBolt shuffleBolt = new ShuffleBolt(args[1]);
            AnalysisBolt analysisBolt = new AnalysisBolt(args[1]);

            TopologyBuilder topologyBuilder = new TopologyBuilder();
            topologyBuilder.setSpout("kafkaspout", kafkaSpout1, 1);
            // builder.setBolt("counterbolt2", countbolt2, 3).shuffleGrouping("kafkaspout");
            // This is for field grouping in the bolt; we need two bolts for field grouping or it won't work
            topologyBuilder.setBolt("shuffleBolt", shuffleBolt, 3).shuffleGrouping("kafkaspout");
            topologyBuilder.setBolt("analysisBolt", analysisBolt, 5).fieldsGrouping("shuffleBolt", new Fields("trip"));

            Config config = new Config();
            config.registerSerialization(VehicleTrip.class, VehicleTripKyroSerializer.class);
            config.setDebug(true);
            config.setNumWorkers(1);

            LocalCluster cluster = new LocalCluster();
            cluster.submitTopology(args[0], config, topologyBuilder.createTopology());
            // StormSubmitter.submitTopology(args[0], config,
            //         builder.createTopology());
        } else {
            System.out.println("Insufficent Arguements - topologyName kafkaTopic ZKRoot ID");
        }
    }
}
I am serializing the data at Kafka using Kryo.
KafkaProducer:
public class StreamKafkaProducer {

    private static Producer producer;
    private final Properties props = new Properties();
    private static final StreamKafkaProducer KAFKA_PRODUCER = new StreamKafkaProducer();

    private StreamKafkaProducer() {
        props.put("bootstrap.servers", "localhost:9092");
        props.put("acks", "all");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "com.abc.serializer.MySerializer");
        producer = new org.apache.kafka.clients.producer.KafkaProducer(props);
    }

    public static StreamKafkaProducer getStreamKafkaProducer() {
        return KAFKA_PRODUCER;
    }

    public void produce(String topic, VehicleTrip vehicleTrip) {
        ProducerRecord<String, VehicleTrip> producerRecord = new ProducerRecord<>(topic, vehicleTrip);
        producer.send(producerRecord);
        // producer.close();
    }

    public static void closeProducer() {
        producer.close();
    }
}
Kryo serializer:
public class DataKyroSerializer extends Serializer<Data> implements Serializable {

    @Override
    public void write(Kryo kryo, Output output, VehicleTrip vehicleTrip) {
        output.writeLong(data.getStartedOn().getTime());
        output.writeLong(data.getEndedOn().getTime());
    }

    @Override
    public Data read(Kryo kryo, Input input, Class<VehicleTrip> aClass) {
        Data data = new Data();
        data.setStartedOn(new Date(input.readLong()));
        data.setEndedOn(new Date(input.readLong()));
        return data;
    }
}
I need to get the data back into the Data bean.
As per a few articles, I need to provide a custom scheme and make it part of the topology, but so far I have had no luck.
Code for Bolt and Scheme
Scheme:
public class KryoScheme implements Scheme {

    private ThreadLocal<Kryo> kryos = new ThreadLocal<Kryo>() {
        protected Kryo initialValue() {
            Kryo kryo = new Kryo();
            kryo.addDefaultSerializer(Data.class, new DataKyroSerializer());
            return kryo;
        };
    };

    @Override
    public List<Object> deserialize(ByteBuffer ser) {
        return Utils.tuple(kryos.get().readObject(new ByteBufferInput(ser.array()), Data.class));
    }

    @Override
    public Fields getOutputFields() {
        return new Fields("data");
    }
}
and bolt:
public class AnalysisBolt implements IBasicBolt {

    private static final long serialVersionUID = 1L;
    private String topicname = null;

    public AnalysisBolt(String topicname) {
        this.topicname = topicname;
    }

    public void prepare(Map stormConf, TopologyContext topologyContext) {
        System.out.println("prepare");
    }

    public void execute(Tuple input, BasicOutputCollector collector) {
        System.out.println("execute");
        Fields fields = input.getFields();
        try {
            JSONObject eventJson = (JSONObject) JSONSerializer.toJSON((String) input.getValueByField(fields.get(1)));
            String StartTime = (String) eventJson.get("startedOn");
            String EndTime = (String) eventJson.get("endedOn");
            String Oid = (String) eventJson.get("_id");
            int V_id = (Integer) eventJson.get("vehicleId");
            // call method getEventForVehicleWithinTime(Long vehicleId, Date startTime, Date endTime)
            System.out.println("===========" + Oid + "| " + V_id + "| " + StartTime + "| " + EndTime);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
But if I submit the Storm topology I am getting this error:
java.lang.IllegalStateException: Spout 'kafkaspout' contains a
non-serializable field of type com.abc.topology.KryoScheme$1, which
was instantiated prior to topology creation.
com.minda.iconnect.topology.KryoScheme$1 should be instantiated within
the prepare method of 'kafkaspout at the earliest.
I'd appreciate help debugging the issue and guidance toward the right path.
Thanks
Your ThreadLocal is not Serializable. The preferable solution would be to make your serializer both Serializable and thread-safe. If this is not possible, then I see two alternatives, since there is no prepare method as you would get in a bolt:
1. Declare it as static, which is inherently transient.
2. Declare it transient and access it via a private get method, initializing the variable on first access.
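A rough sketch of the second alternative, reusing the Data and DataKyroSerializer types from the question (one possible shape, not the only one):
public class KryoScheme implements Scheme {

    // transient: not serialized with the topology, so it never has to
    // survive the trip through ZooKeeper.
    private transient ThreadLocal<Kryo> kryos;

    // Lazily initialized on first access, after the spout has been
    // deserialized on the worker.
    private ThreadLocal<Kryo> kryos() {
        if (kryos == null) {
            kryos = ThreadLocal.withInitial(() -> {
                Kryo kryo = new Kryo();
                kryo.addDefaultSerializer(Data.class, new DataKyroSerializer());
                return kryo;
            });
        }
        return kryos;
    }

    @Override
    public List<Object> deserialize(ByteBuffer ser) {
        return Utils.tuple(kryos().get().readObject(new ByteBufferInput(ser.array()), Data.class));
    }

    @Override
    public Fields getOutputFields() {
        return new Fields("data");
    }
}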
Within the Storm lifecycle, the topology is instantiated and then serialized to byte format to be stored in ZooKeeper, prior to the topology being executed. Within this step, if a spout or bolt within the topology has an initialized unserializable property, serialization will fail.
If there is a need for a field that is unserializable, initialize it within the bolt or spout's prepare method, which is run after the topology is delivered to the worker.
Source: Best Practices for implementing Apache Storm