MyBatis TypeHandler to convert Integer to String - sql

I'm using MyBatis ORM with Vertica DB. I have an object like below, which I can't modify:
class Employee {
    Integer empId;
    Integer deptId;
    Integer salary;
    // getters and setters for the fields above
}
And the code for the TypeHandler is like below:
import java.sql.CallableStatement;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.apache.ibatis.type.BaseTypeHandler;
import org.apache.ibatis.type.JdbcType;

public class IntegerToStringTypeHandler extends BaseTypeHandler<Integer> {

    @Override
    public void setNonNullParameter(PreparedStatement preparedStatement, int i, Integer integer, JdbcType jdbcType) throws SQLException {
        System.out.println("i " + i + " integer " + integer);
        preparedStatement.setString(i, integer.toString());
    }

    @Override
    public Integer getNullableResult(ResultSet resultSet, String columnName) throws SQLException {
        return Integer.valueOf(resultSet.getString(columnName));
    }

    @Override
    public Integer getNullableResult(ResultSet resultSet, int i) throws SQLException {
        return Integer.valueOf(resultSet.getString(i));
    }

    @Override
    public Integer getNullableResult(CallableStatement callableStatement, int i) throws SQLException {
        return Integer.valueOf(callableStatement.getString(i));
    }
}
And the table is like below:
Desc Employee
Column    Type
empId     Integer
deptId    Varchar(10)
salary    Integer
I have a MyBatis query like below:
<select id=“getCount” parameter=“employee”>
Select count(1) from Employee
Where empid = #{employee.empid}
AND. deptId = #{employee. deptId,typeHandler=com.convert.type.IntegerToStringTypeHandler}
AND salary= #{employee.salary}
</select>
But it doesn't work; it fails with the following exception:
Caused by: org.apache.ibatis.type.TypeException: Could not set parameters for mapping: ParameterMapping{property='__frch_employee_0.deptid', mode=IN, javaType=class java.lang.Integer, jdbcType=null, numericScale=null, resultMapId='null', jdbcTypeName='null', expression='null'}. Cause: org.apache.ibatis.type.TypeException: Error setting non null for parameter #3 with JdbcType null . Try setting a different JdbcType for this parameter or a different configuration property. Cause: java.sql.SQLException: [Vertica][JDBC](10940) Invalid parameter index: 3.
at org.apache.ibatis.scripting.defaults.DefaultParameterHandler.setParameters(DefaultParameterHandler.java:89) ~[mybatis-3.4.5.jar:3.4.5]
Caused by: org.apache.ibatis.type.TypeException: Error setting non null for parameter #3 with JdbcType null . Try setting a different JdbcType for this parameter or a different configuration property. Cause: java.sql.SQLException: [Vertica][JDBC](10940) Invalid parameter index: 3.
at org.apache.ibatis.type.BaseTypeHandler.setParameter(BaseTypeHandler.java:55) ~[mybatis-3.4.5.jar:3.4.5]
Caused by: java.sql.SQLException: [Vertica][JDBC](10940) Invalid parameter index: 3.
at com.vertica.exceptions.ExceptionConverter.toSQLException(Unknown Source) ~[vertica-jdbc-8.1.1-7.jar:?]
at com.vertica.jdbc.common.SPreparedStatement.checkValidParameterIndex(Unknown Source) ~[vertica-jdbc-8.1.1-7.jar:?]
at com.vertica.jdbc.common.SPreparedStatement.setString(Unknown Source) ~[vertica-jdbc-8.1.1-7.jar:?]
at com.convert.type.IntegerToStringTypeHandler.setNonNullParameter(IntegerToStringTypeHandler.java:16) ~[classes/:?]
at com.convert.type.IntegerToStringTypeHandler.setNonNullParameter(IntegerToStringTypeHandler.java:11) ~[classes/:?]
at org.apache.ibatis.type.BaseTypeHandler.setParameter(BaseTypeHandler.java:53) ~[mybatis-3.4.5.jar:3.4.5]
at org.apache.ibatis.scripting.defaults.DefaultParameterHandler.setParameters(DefaultParameterHandler.java:87) ~[mybatis-3.4.5.jar:3.4.5]
at org.apache.ibatis.executor.statement.PreparedStatementHandler.parameterize(PreparedStatementHandler.java:93) ~[mybatis-3.4.5.jar:3.4.5]
at org.apache.ibatis.executor.statement.RoutingStatementHandler.parameterize(RoutingStatementHandler.java:64) ~[mybatis-3.4.5.jar:3.4.5]
at org.apache.ibatis.executor.SimpleExecutor.prepareStatement(SimpleExecutor.java:86) ~[mybatis-3.4.5.jar:3.4.5]
I have spent a lot of time on this. Could someone please let me know what I am missing?
The main problem is in this line:
#{employee. deptId,typeHandler=com.convert.type.IntegerToStringTypeHandler}
Thanks

The code here cannot be the one that is actually executed; I guess you have rewritten the code in the post, because:
- the quoting characters “ / ” are not supported by the XML parser
- there is an unexpected dot after the first AND that would likely end up in a SQL syntax exception
- the select specifies neither a resultType nor a resultMap
So please copy/paste the code that is actually executed; that will allow giving a more accurate answer. Your TypeHandler is working fine.
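For comparison, a minimal sketch of an equivalent mapping in annotation form, once the quotes, the stray dot, and the result type are taken care of, could look like this (the EmployeeMapper interface name and the use of annotations instead of XML are my assumptions, not part of the original post; with annotations, the int return type on the method plays the role of resultType):
import org.apache.ibatis.annotations.Param;
import org.apache.ibatis.annotations.Select;

public interface EmployeeMapper {

    // The typeHandler attribute is attached only to the #{} parameter that
    // needs the Integer-to-String conversion; the other parameters are untouched.
    @Select("SELECT count(1) FROM Employee " +
            "WHERE empId = #{employee.empId} " +
            "AND deptId = #{employee.deptId, typeHandler=com.convert.type.IntegerToStringTypeHandler} " +
            "AND salary = #{employee.salary}")
    int getCount(@Param("employee") Employee employee);
}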

Related

Java 15 - UnsupportedOperationException: can't get field offset on a hidden class

Java 15, java-driver-core 4.15.0, and cassandra-unit 4.3.1.0 are used in the test.
When I try to prepare a statement in my test using the cqlSession from EmbeddedCassandraServerHelper, I get a 'com.datastax.oss.driver.api.core.AllNodesFailedException' exception with the message:
All 1 node(s) tried for the query failed (showing first 1 nodes, use getAllErrors() for more): Node(endPoint=localhost/127.0.0.1:9142, hostId=4e8b27ae-79f5-496a-a4ed-8ce0cb7bbb32, hashCode=aa1d75e): [com.datastax.oss.driver.api.core.servererrors.ServerError: java.lang.UnsupportedOperationException: can't get field offset on a hidden class: private final org.apache.cassandra.db.ClusteringComparator org.apache.cassandra.db.ClusteringComparator$$Lambda$143/0x0000000800d87768.arg$1]
With Java 14 it works fine.
So does Java 15 really not work properly with EmbeddedCassandraServerHelper?
Code sample:
public CqlSession cqlSession;

@Before
public void setUp() throws Exception {
    EmbeddedCassandraServerHelper.startEmbeddedCassandra();
    cqlSession = EmbeddedCassandraServerHelper.getSession();
    new CQLDataLoader(cqlSession)
            .load(new ClassPathCQLDataSet("people.cql"));
    PreparedStatement ps = cqlSession.prepare("INSERT INTO person(id, name) values(?,?)");
    BoundStatement bs = ps.bind("1234", "Mike");
    cqlSession.execute(bs);
}
people.cql:
CREATE TABLE person(
id varchar,
name varchar,
PRIMARY KEY(id));

Error while executing custom UDF

Hi, I have a Hive table which got its data from Sqoop, so there is a string field fc which has got null values; all other values are in the form of numbers.
I have written a UDF so that I get the value 1000 if that column is null, and if it is not null I should get the same value. My UDF code looks like below:
package com.cascrmg.customudf;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;

public class sample extends UDF {
    int returnVal;

    // Accept a string input
    public int evaluate(Text input) {
        // If the value is null, return 1000
        if (input == null) {
            returnVal = 1000;
        } else {
            returnVal = Integer.parseInt(input.toString());
        }
        // Return the parsed (or default) value
        return returnVal;
    }
}
But when I add it and try to execute it, I get the errors below.
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public int com.cascrmg.customudf.sample.evaluate(org.apache.hadoop.io.Text) on object com.cascrmg.customudf.sample#408e96d9 of class com.cascrmg.customudf.sample with arguments {null:org.apache.hadoop.io.Text} of size 1
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:993)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.evaluate(GenericUDFBridge.java:182)
at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:186)
at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:77)
at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:65)
at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:77)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:97)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:157)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:497)
... 9 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:969)
... 18 more
Caused by: java.lang.NumberFormatException: For input string: "null"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:580)
at java.lang.Integer.parseInt(Integer.java:615)
at com.cascrmg.customudf.sample.evaluate(sample.java:16)
Any help would be really appreciated.
The input Text/String you are passing as a parameter to your function may contain "null" itself, in the form of a string or text.
That means you are comparing "null" (which is a string) with null, so the check will always give you a false result.
So in this case, if the input is the string "null" it will never be equal to null, and since the check is false you go on to parse that string to an int. Hence you are getting java.lang.NumberFormatException: For input string: "null". So you can check for both null and "null":
if (input == null || "null".equals(input.toString())) {
    returnVal = 1000;
} else {
    returnVal = Integer.parseInt(input.toString());
}
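For illustration, a complete version of the UDF with this check could look like the sketch below (keeping the original class name; this is a minimal sketch of the idea, not a drop-in replacement for anything else the real class may do):
package com.cascrmg.customudf;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class sample extends UDF {

    // Return 1000 when the column is null or holds the literal string "null",
    // otherwise parse and return the numeric value unchanged.
    public int evaluate(Text input) {
        if (input == null || "null".equals(input.toString())) {
            return 1000;
        }
        return Integer.parseInt(input.toString());
    }
}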

java.lang.NoClassDefFoundError: antlr/ANTLRException

I am trying to update a value in the database, but I don't know why this error is coming.
java.lang.NoClassDefFoundError: antlr/ANTLRException
at org.hibernate.hql.ast.ASTQueryTranslatorFactory.createQueryTranslator(ASTQueryTranslatorFactory.java:35)
at org.hibernate.engine.query.HQLQueryPlan.<init>(HQLQueryPlan.java:74)
at org.hibernate.engine.query.HQLQueryPlan.<init>(HQLQueryPlan.java:56)
DetailsDaoImpl.java
@Override
public boolean updateUserProfile(String name, String displayname) {
    String profileUpdateQuery = "UPDATE demo.updateprofile SET displayname='" + displayname + "' WHERE userId=:userId";
    System.out.println("query is" + profileUpdateQuery);
    Query updateQuery = sessionFactory.getCurrentSession().createQuery(profileUpdateQuery);
    System.out.println(updateQuery + "-----");
    updateQuery.setParameter("userId", name);
    int resultOfUpdate = updateQuery.executeUpdate();
    System.out.println(resultOfUpdate);
    return resultOfUpdate > 0 ? true : false;
}
console
query isUPDATE demo.updateprofile SET displayname='xyz' WHERE userId=:userId

Spring Transient Data Access Resource Exception in jdbcTemplate update

I have a method to detect a duplicate entry for a column (I inject the jdbcTemplate correctly):
private boolean isDuplicate(String username) {
    String sql = " select username from users where username=?";
    int result = jdbcTemplate.update(sql, new Object[]{username}, String.class);
    return result;
}
But I got this exception at runtime:
org.springframework.dao.TransientDataAccessResourceException:
PreparedStatementCallback; SQL [ select username from users where username=?]; Invalid argument value: java.lang.ArrayIndexOutOfBoundsException;
nested exception is java.sql.SQLException: Invalid argument value: java.lang.ArrayIndexOutOfBoundsException
We can use the queryForList() method of jdbcTemplate like this:
results = jdbcTemplate.queryForList(sql, new Object[]{username}, String.class);
if (results.isEmpty()) {
    // no duplicate
} else {
    // duplicate
}
Where results is a List<String>.
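Put together, the duplicate check could look roughly like the sketch below (a minimal illustration only; the UserDao class name and constructor injection are my assumptions, while the query and the queryForList call are the ones from the answer):
import java.util.List;

import org.springframework.jdbc.core.JdbcTemplate;

public class UserDao {

    private final JdbcTemplate jdbcTemplate;

    public UserDao(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Returns true when a row with this username already exists.
    public boolean isDuplicate(String username) {
        String sql = "select username from users where username = ?";
        // queryForList runs the SELECT and maps each row to a String,
        // so an empty result simply means there is no duplicate.
        List<String> results = jdbcTemplate.queryForList(sql, new Object[]{username}, String.class);
        return !results.isEmpty();
    }
}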

HibernateException: Errors in named query

When running a particular unit-test, I am getting the exception:
Caused by: org.hibernate.HibernateException: Errors in named queries: UPDATE_NEXT_FIRE_TIME
at org.hibernate.impl.SessionFactoryImpl.<init>(SessionFactoryImpl.java:437)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1385)
at org.hibernate.cfg.AnnotationConfiguration.buildSessionFactory(AnnotationConfiguration.java:954)
at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:891)
... 44 more
for the named query defined here:
@Entity(name="fireTime")
@Table(name="qrtz_triggers")
@NamedQueries({
    @NamedQuery(
        name="UPDATE_NEXT_FIRE_TIME",
        query= "update fireTime t set t.next_fire_time = :epochTime where t.trigger_name = 'CalculationTrigger'")
})
public class JpaFireTimeUpdaterImpl implements FireTimeUpdater {

    @Id
    @Column(name="next_fire_time", insertable=true, updatable=true)
    private long epochTime;

    public JpaFireTimeUpdaterImpl() {}

    public JpaFireTimeUpdaterImpl(final long epochTime) {
        this.epochTime = epochTime;
    }

    @Override
    public long getEpochTime() {
        return this.epochTime;
    }

    public void setEpochTime(final long epochTime) {
        this.epochTime = epochTime;
    }
}
After debugging as deep as I could, I've found that the exception occurs in w.statement(hqlAst) in QueryTranslatorImpl:
private HqlSqlWalker analyze(HqlParser parser, String collectionRole) throws QueryException, RecognitionException {
    HqlSqlWalker w = new HqlSqlWalker( this, factory, parser, tokenReplacements, collectionRole );
    AST hqlAst = parser.getAST();

    // Transform the tree.
    w.statement( hqlAst );

    if ( AST_LOG.isDebugEnabled() ) {
        ASTPrinter printer = new ASTPrinter( SqlTokenTypes.class );
        AST_LOG.debug( printer.showAsString( w.getAST(), "--- SQL AST ---" ) );
    }

    w.getParseErrorHandler().throwQueryException();
    return w;
}
Is there something wrong with my query or annotations?
A NamedQuery should be written in JPQL, but the query seems to mix names of persistent attributes with names of database columns. Names of database columns cannot be used in JPQL.
In this case, instead of next_fire_time, the name of the persistent attribute epochTime should be used. Also, trigger_name looks more like the name of a database column than the name of a persistent attribute, but it does not seem to be mapped in your current class at all. After it is mapped, the query is as follows:
update fireTime t set t.epochTime = :epochTime
where t.triggerName = 'CalculationTrigger'
If an SQL query is preferred, then @NamedNativeQuery should be used instead.
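For illustration, the mapping described above could look roughly like the sketch below (the triggerName field mapping the trigger_name column is the addition the answer asks for; the rest mirrors the class from the question):
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.NamedQueries;
import javax.persistence.NamedQuery;
import javax.persistence.Table;

@Entity(name="fireTime")
@Table(name="qrtz_triggers")
@NamedQueries({
    @NamedQuery(
        name="UPDATE_NEXT_FIRE_TIME",
        // Only persistent attribute names appear in the JPQL now.
        query="update fireTime t set t.epochTime = :epochTime where t.triggerName = 'CalculationTrigger'")
})
public class JpaFireTimeUpdaterImpl implements FireTimeUpdater {

    @Id
    @Column(name="next_fire_time", insertable=true, updatable=true)
    private long epochTime;

    // Maps the trigger_name column so it can be referenced as triggerName in JPQL.
    @Column(name="trigger_name")
    private String triggerName;

    @Override
    public long getEpochTime() {
        return this.epochTime;
    }

    public void setEpochTime(final long epochTime) {
        this.epochTime = epochTime;
    }

    // constructors as in the original class
}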
As a side note, the JPA 2.0 specification doesn't encourage changing the primary key:
The application must not change the value of the primary key[10]. The
behavior is undefined if this occurs.[11]
In general, entities are not aware of changes made via JPQL queries. That gets especially interesting when trying to refresh an entity that does not exist anymore (because the primary key was changed).
Additionally, the naming is a little bit confusing:
- The name of the class looks more like the name of a service class than the name of an entity.
- Starting the name of the entity with a lower-case letter is a rather rare style.
- The name of the entity, the name of the table, and the name of the class do not match too well.