Error while executing custom UDF - Hive

Hi, I have a Hive table whose data came in via Sqoop, so there is a string field fc that contains null values; all the other values are numbers.
I have written a UDF so that I get 1000 if the column value is null, and the value itself otherwise. My UDF code looks like below:
package com.cascrmg.customudf;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class sample extends UDF {
    int returnVal;

    // Accept a string input
    public int evaluate(Text input) {
        // If the value is null, return 1000
        if (input == null) {
            returnVal = 1000;
        } else {
            returnVal = Integer.parseInt(input.toString());
        }
        return returnVal;
    }
}
But when I add it and try to execute it, I get the errors below.
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public int com.cascrmg.customudf.sample.evaluate(org.apache.hadoop.io.Text) on object com.cascrmg.customudf.sample#408e96d9 of class com.cascrmg.customudf.sample with arguments {null:org.apache.hadoop.io.Text} of size 1
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:993)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.evaluate(GenericUDFBridge.java:182)
at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:186)
at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:77)
at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:65)
at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:77)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:97)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:157)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:497)
... 9 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:969)
... 18 more
Caused by: java.lang.NumberFormatException: For input string: "null"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:580)
at java.lang.Integer.parseInt(Integer.java:615)
at com.cascrmg.customudf.sample.evaluate(sample.java:16)
Any help would be really appreciated.

The input Text you are passing to your function may contain the literal string "null" rather than an actual null reference.
That means you are comparing the string "null" with null, which always evaluates to false, so the else branch parses that string as an int. Hence you get java.lang.NumberFormatException: For input string: "null". The fix is to check for both null and "null":
if (input == null || "null".equalsIgnoreCase(input.toString())) {
    returnVal = 1000;
} else {
    returnVal = Integer.parseInt(input.toString());
}
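If you want the UDF to survive other dirty values as well (empty strings, stray whitespace, non-numeric tokens), you could also guard the parse itself. A minimal sketch, assuming the same 1000 default from the question; falling back to the default for any unparseable value is my assumption, not something the question specifies:

package com.cascrmg.customudf;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class SafeSample extends UDF {
    private static final int DEFAULT_VALUE = 1000; // default taken from the question

    public int evaluate(Text input) {
        // Treat a missing value or the literal string "null" as SQL NULL
        if (input == null || "null".equalsIgnoreCase(input.toString().trim())) {
            return DEFAULT_VALUE;
        }
        try {
            return Integer.parseInt(input.toString().trim());
        } catch (NumberFormatException e) {
            // Assumption: any other unparseable token also falls back to the default
            return DEFAULT_VALUE;
        }
    }
}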

Related

Mybatis TypeHandler to convert Integer to String

I'm using the MyBatis ORM with a Vertica DB. I have an object like below, which I can't modify:
class Employee {
    Integer empId;
    Integer deptId;
    Integer salary;
    // get and set methods for above
}
And the code for the TypeHandler is like below:
public class IntegerToStringTypeHandler extends BaseTypeHandler<Integer> {
    @Override
    public void setNonNullParameter(PreparedStatement preparedStatement, int i, Integer integer, JdbcType jdbcType) throws SQLException {
        System.out.println("i " + i + " integer " + integer + " " + integer);
        preparedStatement.setString(i, integer.toString());
    }

    @Override
    public Integer getNullableResult(ResultSet resultSet, String columnName) throws SQLException {
        return Integer.valueOf(resultSet.getString(columnName));
    }

    @Override
    public Integer getNullableResult(ResultSet resultSet, int i) throws SQLException {
        return Integer.valueOf(resultSet.getString(i));
    }

    @Override
    public Integer getNullableResult(CallableStatement callableStatement, int i) throws SQLException {
        return Integer.valueOf(callableStatement.getString(i));
    }
}
And the table is like below:

Desc Employee
Column    Type
empId     Integer
deptId    Varchar(10)
salary    Integer
I have a MyBatis query like below:
<select id=“getCount” parameter=“employee”>
Select count(1) from Employee
Where empid = #{employee.empid}
AND. deptId = #{employee. deptId,typeHandler=com.convert.type.IntegerToStringTypeHandler}
AND salary= #{employee.salary}
</select>
But it doesn't work; it fails with the following exception:
Caused by: org.apache.ibatis.type.TypeException: Could not set parameters for mapping: ParameterMapping{property='__frch_employee_0.deptid', mode=IN, javaType=class java.lang.Integer, jdbcType=null, numericScale=null, resultMapId='null', jdbcTypeName='null', expression='null'}. Cause: org.apache.ibatis.type.TypeException: Error setting non null for parameter #3 with JdbcType null . Try setting a different JdbcType for this parameter or a different configuration property. Cause: java.sql.SQLException: [Vertica][JDBC](10940) Invalid parameter index: 3.
at org.apache.ibatis.scripting.defaults.DefaultParameterHandler.setParameters(DefaultParameterHandler.java:89) ~[mybatis-3.4.5.jar:3.4.5]
Caused by: org.apache.ibatis.type.TypeException: Error setting non null for parameter #3 with JdbcType null . Try setting a different JdbcType for this parameter or a different configuration property. Cause: java.sql.SQLException: [Vertica][JDBC](10940) Invalid parameter index: 3.
at org.apache.ibatis.type.BaseTypeHandler.setParameter(BaseTypeHandler.java:55) ~[mybatis-3.4.5.jar:3.4.5]
Caused by: java.sql.SQLException: [Vertica][JDBC](10940) Invalid parameter index: 3.
at com.vertica.exceptions.ExceptionConverter.toSQLException(Unknown Source) ~[vertica-jdbc-8.1.1-7.jar:?]
at com.vertica.jdbc.common.SPreparedStatement.checkValidParameterIndex(Unknown Source) ~[vertica-jdbc-8.1.1-7.jar:?]
at com.vertica.jdbc.common.SPreparedStatement.setString(Unknown Source) ~[vertica-jdbc-8.1.1-7.jar:?]
at com.convert.type.IntegerToStringTypeHandler.setNonNullParameter(IntegerToStringTypeHandler.java:16) ~[classes/:?]
at com.convert.type.IntegerToStringTypeHandler.setNonNullParameter(IntegerToStringTypeHandler.java:11) ~[classes/:?]
at org.apache.ibatis.type.BaseTypeHandler.setParameter(BaseTypeHandler.java:53) ~[mybatis-3.4.5.jar:3.4.5]
at org.apache.ibatis.scripting.defaults.DefaultParameterHandler.setParameters(DefaultParameterHandler.java:87) ~[mybatis-3.4.5.jar:3.4.5]
at org.apache.ibatis.executor.statement.PreparedStatementHandler.parameterize(PreparedStatementHandler.java:93) ~[mybatis-3.4.5.jar:3.4.5]
at org.apache.ibatis.executor.statement.RoutingStatementHandler.parameterize(RoutingStatementHandler.java:64) ~[mybatis-3.4.5.jar:3.4.5]
at org.apache.ibatis.executor.SimpleExecutor.prepareStatement(SimpleExecutor.java:86) ~[mybatis-3.4.5.jar:3.4.5]
I have spent a lot of time on this. Could someone please let me know what I am missing?
The main problem is in this line:
#{employee. deptId,typeHandler=com.convert.type.IntegerToStringTypeHandler}
Thanks
The code here cannot be the one that is actually executed; I guess you rewrote the code in the post, because:
- the quoting characters “ / ” are not supported by the XML parser
- there is an unexpected dot after the first AND that likely ends up in a SQL syntax exception
- the select specifies neither a result type nor a result map
So please copy/paste the code actually executed; that will allow a more accurate answer. Your TypeHandler is working fine.
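For reference, here is how the same query could look once those issues are fixed, written as an annotated Java mapper so the XML quoting problem disappears entirely. This is a sketch only; the EmployeeMapper interface name and the @Param name are illustrative, not from the original post:

import org.apache.ibatis.annotations.Param;
import org.apache.ibatis.annotations.Select;

public interface EmployeeMapper {
    // Plain ASCII quotes, no stray dot after AND, and the int return type
    // plays the role of the missing result type.
    @Select("SELECT COUNT(1) FROM Employee"
            + " WHERE empId = #{employee.empId}"
            + " AND deptId = #{employee.deptId,typeHandler=com.convert.type.IntegerToStringTypeHandler}"
            + " AND salary = #{employee.salary}")
    int getCount(@Param("employee") Employee employee);
}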

Infinite recursion when emitting nested TableRow in Google Cloud Dataflow

I'm trying to pass a TableRow I've generated between stages of my pipeline, and I get the following error:
Exception in thread "main"
com.google.cloud.dataflow.sdk.Pipeline$PipelineExecutionException:
java.lang.IllegalArgumentException: Forbidden IOException when writing to OutputStream
[... exception propagation ...]
Caused by: com.fasterxml.jackson.databind.JsonMappingException:
Infinite recursion (StackOverflowError) (through reference chain:
com.google.protobuf.Descriptors$Descriptor["file"]
->com.google.protobuf.Descriptors$FileDescriptor["messageTypes"]
->java.util.Collections$UnmodifiableRandomAccessList[0]->
[... many, many lines of this ...]
at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:733)
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:155)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serializeContentsUsing(IndexedListSerializer.java:142)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serializeContents(IndexedListSerializer.java:88)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:79)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:18)
at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727)
at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:717)
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:155)
at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727)
at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:717)
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:155)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serializeContentsUsing(IndexedListSerializer.java:142)
[... many, many lines of this ...]
Caused by: java.lang.StackOverflowError
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:736)
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:155)
at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727)
at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:717)
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:155)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serializeContentsUsing(IndexedListSerializer.java:142)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serializeContents(IndexedListSerializer.java:88)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:79)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:18)
[... snip ...]
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:79)
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:18)
at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727)
at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:717)
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:155)
I'm constructing my TableRow recursively from a Google protobuf via its Descriptor: since protobufs may have nested definitions, I traverse the descriptor depth-first and build the TableRow as I go. Below is an excerpt from the TableRow creation class:
public void processElement(ProcessContext c) throws Exception {
    TableRow row = getTableRow(c.element());
    LOG.info(row.toPrettyString());
    c.output(row);
}

private TableRow getTableRow(TMessage message) throws Exception {
    TableRow row = new TableRow();
    encode(message, row);
    return row;
}

private TableCell getTableCell(TMessage message) throws Exception {
    TableCell cell = new TableCell();
    encode(message, cell);
    return cell;
}

private void encode(TMessage message, GenericJson row) throws Exception {
    Descriptors.Descriptor descriptor = message.getDescriptorForType();
    List<Descriptors.FieldDescriptor> fields = descriptor.getFields();
    for (Descriptors.FieldDescriptor fieldDescriptor : fields) {
        Descriptors.FieldDescriptor.Type fieldType = fieldDescriptor.getType();
        switch (fieldType) {
            case DOUBLE:
            case FLOAT:
            case INT64:
            case UINT64:
            case INT32:
            case FIXED64:
            case FIXED32:
            case UINT32:
            case SFIXED32:
            case SFIXED64:
            case SINT32:
            case SINT64:
            case BOOL:
            case STRING:
            case BYTES:
            case ENUM:
                if (fieldDescriptor.isRepeated()) {
                    List<Object> tableCells = new ArrayList<>();
                    tableCells.addAll((List<?>) message.getField(fieldDescriptor));
                    row.set(fieldDescriptor.getName(), tableCells);
                } else {
                    row.set(fieldDescriptor.getName(), message.getField(fieldDescriptor));
                }
                break;
            case MESSAGE:
                if (fieldDescriptor.isRepeated()) {
                    List<TableRow> tableRows = new ArrayList<>();
                    for (Object o : (List<?>) message.getField(fieldDescriptor)) {
                        TMessage nestedMessage = (TMessage) o;
                        TableRow tableRow = getTableRow(nestedMessage);
                        tableRows.add(tableRow);
                    }
                    row.set(fieldDescriptor.getName(), tableRows);
                } else {
                    row.set(fieldDescriptor.getName(), getTableCell((TMessage) message.getField(fieldDescriptor)));
                }
                break;
            case GROUP:
                throw new Exception("groups are deprecated");
        }
    }
}
I believe the TableRow is being created correctly: I have tested this DoFn with some simple dummy data, and I have inspected the result of the TableRow creation on a subset of my dataset (see the snippet above, where I LOG.info the result of the encoding). The resulting TableRow seems to contain all of the data I expect, with no extra fields.
Based on the stack trace and the code, it looks like something in the Protocol Buffer message may be self-referential. The JSON encoding is failing while following these references.
Looking at the code, my guess would be that you are encountering an enum. If you look at the protocol buffer documentation for getField, it says it returns an EnumValueDescriptor.
Looking at EnumValueDescriptor, it has a link to the FileDescriptor, which has a link to the EnumDescriptor, which has a link back to the FileDescriptor, which has a list of all the EnumDescriptors, which link to the FileDescriptor, and so on.
If you handle the ENUM case specially (specifically, to prevent protos from appearing as values in the JSON map), it should fix your problem.
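Concretely, that means pulling ENUM out of the scalar fall-through group in encode() and storing the literal's name instead of the descriptor object. A sketch of what the dedicated case could look like, using the same types as the question's code; emitting the enum's name as a string (rather than, say, its number) is an assumption:

// In encode(): remove ENUM from the scalar case group and give it its own case,
// so EnumValueDescriptor objects (whose descriptor graph is self-referential)
// never reach the Jackson serializer.
case ENUM:
    if (fieldDescriptor.isRepeated()) {
        List<Object> enumNames = new ArrayList<>();
        for (Object o : (List<?>) message.getField(fieldDescriptor)) {
            enumNames.add(((Descriptors.EnumValueDescriptor) o).getName());
        }
        row.set(fieldDescriptor.getName(), enumNames);
    } else {
        Descriptors.EnumValueDescriptor value =
                (Descriptors.EnumValueDescriptor) message.getField(fieldDescriptor);
        row.set(fieldDescriptor.getName(), value.getName());
    }
    break;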

How do we check for value of a dataframe column is not null?

I would like to read the value of a column from a DataFrame and check that the value is not null and that its length is <= 500.
My code:
import org.apache.spark.sql.functions._

object OmegaProcess {
  // Some scala lines of Code
  ....

  val line_flag = generateomegaLineFlag(omegaDF)

  def generateomegaLineFlag(omegaDF: DataFrame): Int = {
    if (omegaDF("omega_file_name") != null && length(omegaDF("omega_file_name")) <= 500) {
      // Some Lines of code .....
    }
    100
  }
}
But it does not compile, due to the error below:
Type mismatch: expected: Boolean, actual: Column
Could someone help me to fix this issue?
Try using isNotNull for checking against null values. Note that the whole condition is still a Column, not a Boolean, so it belongs inside a DataFrame operation such as filter rather than a plain Scala if:
omegaDF.filter(omegaDF("omega_file_name").isNotNull && length(omegaDF("omega_file_name")) <= 500)

ImmutablePropertyException periodically when changing enum field's value via Rational Team Concert API

I'm hitting this issue when changing certain enumeration-based fields in my new RTC work item, for an RTC API tool I'm working on.
Basically, I get an ImmutablePropertyException the first time I change the field, but the next time it works without an exception.
I want to get rid of the exceptions. I'm using a value that RTC actually returns to me as a valid enum value for the field.
Assigning RTC work item field: odc.impact a field value of -> Integrity [odc.impact.literal.l4]
EXCEPTION: Could not assign value, even though it was found in the enumeration list: [Unassigned, Installability, Standards, Integrity]
com.ibm.team.repository.common.internal.ImmutablePropertyException
at com.ibm.team.repository.common.internal.util.ItemUtil$ProtectAdapter.notifyChanged(ItemUtil.java:2070)
at org.eclipse.emf.common.notify.impl.BasicNotifierImpl.eNotify(BasicNotifierImpl.java:380)
at com.ibm.team.repository.common.model.impl.StringExtensionEntryImpl.setTypedValue(StringExtensionEntryImpl.java:178)
at com.ibm.team.repository.common.model.impl.StringExtensionEntryImpl.setValue(StringExtensionEntryImpl.java:360)
at org.eclipse.emf.common.util.BasicEMap.putEntry(BasicEMap.java:303)
at org.eclipse.emf.common.util.BasicEMap.put(BasicEMap.java:584)
at org.eclipse.emf.common.util.BasicEMap$DelegatingMap.put(BasicEMap.java:799)
at com.ibm.team.repository.common.model.impl.ItemImpl.setStringExtension(ItemImpl.java:1228)
at com.ibm.team.workitem.common.internal.model.impl.WorkItemImpl.setEnumeration(WorkItemImpl.java:3779)
at com.ibm.team.workitem.common.internal.model.impl.WorkItemImpl.setValue(WorkItemImpl.java:2915)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
at java.lang.reflect.Method.invoke(Method.java:620)
at com.ibm.team.repository.common.internal.util.ItemStore$ItemInvocationHandler.invoke(ItemStore.java:597)
at com.sun.proxy.$Proxy18.setValue(Unknown Source)
at com.rtc.vda.WorkItemInitialization.setAttributeValueEx(WorkItemInitialization.java:237)
at com.rtc.vda.WorkItemInitialization.setAttributeValue(WorkItemInitialization.java:210)
at com.rtc.vda.WorkItemInitialization.execute(WorkItemInitialization.java:186)
at com.ibm.team.workitem.client.WorkItemOperation.execute(WorkItemOperation.java:85)
at com.ibm.team.workitem.client.WorkItemOperation.doRun(WorkItemOperation.java:272)
at com.ibm.team.workitem.client.WorkItemOperation.run(WorkItemOperation.java:242)
at com.ibm.team.workitem.client.WorkItemOperation.run(WorkItemOperation.java:189)
at com.rtc.vda.RTCUtilities.createWorkItem(RTCUtilities.java:191)
at com.rtc.vda.RTCMain.main(RTCMain.java:178)
Assigning: odc.impact -> Integrity [odc.impact.literal.l4]
This is the code snippet to set the enum value:
public boolean setAttributeValueEx(IWorkItem w, String attributeKey, String valueName) {
    // (REO) Get the attribute
    IAttribute a = customAttributesMap.get(attributeKey);
    // (REO) Buffer of valid values for error reporting
    StringBuffer b = new StringBuffer();
    try {
        // (REO) Get the enumeration for this attribute from the repository (DO NOT CACHE IT OR YOU WILL HAVE PROBLEMS)
        IWorkItemClient workItemClient = (IWorkItemClient) rtcParameters.getTeamRepository().getClientLibrary(IWorkItemClient.class);
        IEnumeration<? extends ILiteral> rtcAttrEnumeration = workItemClient.resolveEnumeration(a, curMonitor);
        // (REO) Find an enum value that matches this string and assign it
        for (ILiteral literal : rtcAttrEnumeration.getEnumerationLiterals()) {
            String vName = literal.getName();
            String vId = literal.getIdentifier2().getStringIdentifier();
            b.append(",");
            b.append(vName);
            if (valueName.equalsIgnoreCase(vName)) {
                String msg2 = "Assigning: " + a.getIdentifier() + " -> " + vName + " [" + vId + "]";
                RTCMain.out(msg2);
                w.setValue(a, literal.getIdentifier2()); // (REO) SOURCE OF PERIODIC EXCEPTION
                return true;
            }
        }
    } catch (Exception e) {
        RTCMain.out("EXCEPTION: Could not assign value, even though it was found in the enumeration list:\n\t[" + b + "]");
        e.printStackTrace();
        RTCMain.out("");
        return false;
    }
    RTCMain.out("VALUE NOT FOUND: Valid values are:" + b);
    return false;
}
Anyone know why I'm getting the periodic ImmutablePropertyException for only some of the fields, and why it goes away on the second call?
Thanks!
You just need to use the workingCopy.getWorkItem() object passed into the execute() call, rather than a cached version in a member variable. The attributes on the workingCopy object are not immutable and work fine.
public class WorkItemCreator extends WorkItemOperation {
    ...
    @Override
    protected void execute(WorkItemWorkingCopy workingCopy, IProgressMonitor monitor) throws TeamRepositoryException {
        IWorkItem newWorkItem = workingCopy.getWorkItem();
        // Set attribute values on newWorkItem to avoid ImmutablePropertyExceptions,
        // e.g. setAttributeValueEx(newWorkItem, "odc.impact", "Integrity");
    }
}

jdbcTemplate query row map date column generically

I have a database with a date column, and when I perform a query I get each row as a Map of column names to column values. My problem is that I do not know how to get the date column generically.
At the moment I am simply trying to cast it to a String and then parse it as a java.util.Date, but this errors at the cast, and I am otherwise unsure how to get the data.
This code is supposed to work with both Sybase and Oracle databases, so a generic answer would be greatly appreciated!
private static final String USER_QUERY = "SELECT USERNAME, PASSWORD, SUSPEND_START_DATE, SUSPEND_END_DATE FROM USERS";

public List<User> readUsers(Subjects subjects) throws SubjectReaderException {
    /* Perform the query */
    List<User> users = new ArrayList<User>();
    List<Map<String, Object>> rows = jdbcTemplate.queryForList(USER_QUERY);
    /* Map the returned rows to our User objects */
    for (Map<String, Object> row : rows) {
        String username = (String) row.get("USERNAME");
        /* Check if the user is suspended */
        if (checkUserIsSuspended(row)) {
            continue;
        }
        User user = new User();
        user.setUsername(username);
        user.setPassword((String) row.get("PASSWORD"));
        users.add(user);
    }
    return users;
}

private boolean checkUserIsSuspended(Map<String, Object> row) throws SubjectReaderException {
    final String startDateString = (String) row.get("SUSPEND_START_DATE"); // this errors
    if (startDateString != null) {
        final String endDateString = (String) row.get("SUSPEND_END_DATE");
        if (null != endDateString) {
            return checkDate(startDateString, endDateString); // this just compares the current date etc.
        }
        /* Return true if the suspended start date is not null and the end date is missing or null */
        return true;
    }
    /* Return false if the suspended start date String is null - i.e. they have not been suspended */
    return false;
}
The error:
java.lang.ClassCastException: com.sybase.jdbc3.tds.SybTimestamp cannot be cast to java.lang.String
It will always give this error because you are casting an Object of runtime type com.sybase.jdbc3.tds.SybTimestamp to String.
Why don't you make this check directly in the SQL instead of filtering in Java? Something like
SELECT USERNAME, PASSWORD, SUSPEND_START_DATE, SUSPEND_END_DATE
FROM USERS WHERE SUSPEND_START_DATE >= ?
and now you can use queryForList, passing the current time as the parameter.
Another way to avoid these direct casts is to use a RowMapper. That way you can call ResultSet#getDate(String) and you won't need to cast anything, as the JDBC driver will take care of the conversion for you :)
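A minimal sketch of that RowMapper approach, reusing the query and User type from the question; doing the suspension check on real Date objects, and the exact getters shown, are illustrative choices rather than the asker's code:

import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.Date;
import java.util.List;
import org.springframework.jdbc.core.RowMapper;

// inside the same reader class as the question's code
public List<User> readUsers() {
    return jdbcTemplate.query(USER_QUERY, new RowMapper<User>() {
        @Override
        public User mapRow(ResultSet rs, int rowNum) throws SQLException {
            // getDate returns java.sql.Date (a java.util.Date subclass) on both
            // Sybase and Oracle, so no String cast or parsing is needed
            Date suspendStart = rs.getDate("SUSPEND_START_DATE");
            Date suspendEnd = rs.getDate("SUSPEND_END_DATE");

            User user = new User();
            user.setUsername(rs.getString("USERNAME"));
            user.setPassword(rs.getString("PASSWORD"));
            // the suspension check can now compare suspendStart/suspendEnd
            // against new Date() directly instead of parsing strings
            return user;
        }
    });
}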