Hibernate Custom SQL Enum transformation fails

There are similar questions here, but none of them worked in my case. I have a custom SQL query which returns two columns: one string, one number. The string column is always a full uppercase enum name, and I want to feed this result set into my custom bean, which holds the said enum.
Following the answer here for Hibernate 5.x, my code is below:
Properties params = new Properties();
params.put("enumClass", "MyEnumClass");
params.put("useNamed", true);
Type myEnumType = new TypeLocatorImpl(new TypeResolver()).custom(MyEnumClass.class, params);
final Query query = getCurrentSession().createSQLQuery(MY_CUSTOM_SQL)
        .addScalar("col1", myEnumType)
        .addScalar("col2", StandardBasicTypes.INTEGER)
        .setLong("someSqlVar", someVal)
        .setResultTransformer(Transformers.aliasToBean(MyCustomBean.class));
return query.list();
This code does not even reach the query.list() call; it fails at this line:
Type myEnumType = new TypeLocatorImpl(new TypeResolver()).custom(MyEnumClass.class, params);
Exception trace:
Caused by: org.hibernate.MappingException: Unable to instantiate custom type: com.example.MyEnumClass
...
Caused by: java.lang.InstantiationException: com.example.MyEnumClass
...
Caused by: java.lang.NoSuchMethodException: com.example.MyEnumClass.<init>()
at java.lang.Class.getConstructor0(Class.java:3082) ~[?:1.8.0_60]
at java.lang.Class.newInstance(Class.java:412) ~[?:1.8.0_60]
at org.hibernate.type.TypeFactory.custom(TypeFactory.java:202) ~[hibernate-core-5.1.0.Final.jar:5.1.0.Final]
at org.hibernate.type.TypeFactory.custom(TypeFactory.java:193) ~[hibernate-core-5.1.0.Final.jar:5.1.0.Final]
at org.hibernate.internal.TypeLocatorImpl.custom(TypeLocatorImpl.java:144) ~[hibernate-core-5.1.0.Final.jar:5.1.0.Final]
...
So Hibernate is trying to call MyEnumClass.class.newInstance() and failing; it does not even look at the properties I passed. I am using Hibernate 5.1.0.Final. Am I not supposed to use a custom type this way?

I found a way to do it. TypeLocatorImpl.custom(...) expects a class that implements UserType and instantiates it via its no-arg constructor, which an enum class does not have. Instead, instantiate Hibernate's EnumType directly, parameterize it, and wrap it in a CustomType:
Properties params = new Properties();
params.put("enumClass", MyEnumClass.class.getName());
params.put("useNamed", true); // map by enum name rather than ordinal
EnumType enumType = new EnumType(); // org.hibernate.type.EnumType
enumType.setParameterValues(params);
CustomType customType = new CustomType(enumType); // org.hibernate.type.CustomType
final Query query = getCurrentSession().createSQLQuery(MY_CUSTOM_SQL)
        .addScalar("col1", customType)
        .addScalar("col2", StandardBasicTypes.INTEGER)
        .setLong("someSqlVar", someVal)
        .setResultTransformer(Transformers.aliasToBean(MyCustomBean.class));
return query.list();
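For the aliasToBean transformer to work, the target bean needs a public no-arg constructor and setters matching the scalar aliases. A minimal sketch of what MyCustomBean could look like (the field names col1 and col2 are assumed from the aliases above):

public class MyCustomBean {
    private MyEnumClass col1; // populated from the uppercase enum name
    private Integer col2;

    public MyEnumClass getCol1() { return col1; }
    public void setCol1(MyEnumClass col1) { this.col1 = col1; }

    public Integer getCol2() { return col2; }
    public void setCol2(Integer col2) { this.col2 = col2; }
}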

Related

Jira Rest API - Problems to set custom fields

I am trying to set the Testcases field of a Testplan. The value I get when reading the field is a JSONArray, but when I write that very same JSONArray to a newly created Testplan, I get an error message:
Exception in thread "main" com.atlassian.jira.rest.client.api.domain.input.CannotTransformValueException: Any of available transformers was able to transform given value. Value is: org.codehaus.jettison.json.JSONArray: ["SBNDTST-361","SBNDTST-360","SBNDTST-358","SBNDTST-359"]
at com.atlassian.jira.rest.client.api.domain.input.ValueTransformerManager.apply(ValueTransformerManager.java:83)
at com.atlassian.jira.rest.client.api.domain.input.IssueInputBuilder.setFieldValue(IssueInputBuilder.java:134)
My method to set the field is this:
public void updateIssue(String issueKey, String fieldId, Object fieldValue) {
    IssueInput input = new IssueInputBuilder()
            .setFieldValue(fieldId, fieldValue)
            .build();
    restClient.getIssueClient()
            .updateIssue(issueKey, input)
            .claim();
}
The value for the fieldId is "customfield_17473". There is very little documentation on this. Does anyone have an idea how to proceed?
I found the solution by trial and error: when I send an ArrayList instead of the JSONArray, it works.
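In other words, copy the entries out of the JSONArray into a plain java.util.List before handing it to setFieldValue. A minimal sketch, assuming the jettison JSONArray from the error above (the issue key here is hypothetical):

List<String> testCases = new ArrayList<>();
for (int i = 0; i < jsonArray.length(); i++) {
    testCases.add(jsonArray.getString(i)); // getString may throw JSONException
}
updateIssue("SBNDTST-123", "customfield_17473", testCases);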

.NET6/EF: The best overloaded method match for 'Microsoft.EntityFrameworkCore.DbSet<...>.Add(...)' has some invalid arguments

I'm working on generic code to add .NET 6 Entity Framework DbSet<...> records, deserialized from JSON strings. The original code is (much) more elaborate; the samples below just demonstrate the issue. The following method:
public static void AddRecord(dynamic dbSet, Type entityType, string json)
{
var dataRecord = System.Text.Json.JsonSerializer.Deserialize(json, entityType);
dbSet.Add(dataRecord);
}
results in a run-time error at dbSet.Add(dataRecord) call:
"The best overloaded method match for
'Microsoft.EntityFrameworkCore.DbSet<Northwind.Models.Category>.Add
(Northwind.Models.Category)' has some invalid arguments"}
...
This exception was originally thrown at this call stack:
System.Dynamic.UpdateDelegates.UpdateAndExecuteVoid2<T0, T1>
(System.Runtime.CompilerServices.CallSite, T0, T1)
if you call it, e.g., this way:
using (var ctx = ...)
{
...
var json = ...
...
AddRecord(ctx.Categories, typeof(Category), json);
}
I have intentionally, for clarity, used a concrete DbSet (ctx.Categories) and a compile-time type (typeof(Category)) in the above code - in the actual code these are run-time defined variables.
If you "unroll" the method code and write it this way:
using (var ctx = ...)
{
...
var json = ...
...
var dataRecord = System.Text.Json.JsonSerializer.Deserialize(json, typeof(Category));
ctx.Categories.Add(dataRecord);
}
you would still get the mentioned above run-time error for the .Add method.
But if you write:
var dataRecord = System.Text.Json.JsonSerializer.Deserialize<Category>(json);
ctx.Categories.Add(dataRecord);
or
var dataRecord = System.Text.Json.JsonSerializer.Deserialize(json, typeof(Category));
ctx.Categories.Add((Category)dataRecord);
the code will work without any issues.
Finally, an attempt to use Convert.ChangeType doesn't help:
var dataRecord = System.Text.Json.JsonSerializer.Deserialize(json, typeof(Category));
ctx.Categories.Add(Convert.ChangeType(dataRecord, typeof(Category)));
So, it looks like an explicit object type cast is compiled into some "special object interface arrangement" at run time, which a dynamic object type cast doesn't provide?
[Update]
Okan Karadag's prompt answer below gave me a hint on how to change the AddRecord(...) method to work around the issue:
public static void AddRecord(DbContext dbContext, Type entityType, string json)
{
var dataRecord = System.Text.Json.JsonSerializer.Deserialize(json, entityType);
dbContext.Add(dataRecord);
}
This method works flawlessly, although it doesn't answer the question of why the original AddRecord(...) method
public static void AddRecord(dynamic dbSet, Type entityType, string json)
{
    var dataRecord = System.Text.Json.JsonSerializer.Deserialize(json, entityType);
    dbSet.Add(dataRecord);
}
results in the 'The best overloaded method match for 'Microsoft.EntityFrameworkCore.DbSet<...>.Add(...)' has some invalid arguments...' runtime error at the
dbSet.Add(dataRecord);
line.
You can use a generic method instead of dynamic. (As to the why: Deserialize(json, entityType) returns a value statically typed as object, and with a dynamic receiver the runtime binder resolves overloads against that static type; DbSet<Category>.Add only accepts a Category, whereas the DbContext workaround compiles because DbContext also exposes a non-generic Add(object) overload.)
public void AddEntity<T>(string json) where T:class
{
var entity = JsonSerializer.Deserialize<T>(json);
ArgumentNullException.ThrowIfNull(entity);
dbContext.Add<T>(entity);
dbContext.SaveChanges();
}
See the linked page for more on this error.

Kafka to Flink to Hive - Writes failing

I am trying to sink data to Hive via Kafka -> Flink -> Hive using the following code snippet:
private static final TypeInformation[] FIELD_TYPES = new TypeInformation[]{
    BasicTypeInfo.INT_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO
};

final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<GenericRecord> stream = readFromKafka(env);

JDBCAppendTableSink sink = JDBCAppendTableSink.builder()
    .setDrivername("org.apache.hive.jdbc.HiveDriver")
    .setDBUrl("jdbc:hive2://hiveconnstring")
    .setUsername("myuser")
    .setPassword("mypass")
    .setQuery("INSERT INTO testHiveDriverTable (key,value) VALUES (?,?)")
    .setBatchSize(1000)
    .setParameterTypes(FIELD_TYPES)
    .build();

DataStream<Row> rows = stream.map((MapFunction<GenericRecord, Row>) st1 -> {
    Row row = new Row(2); // (key, value)
    row.setField(0, st1.get("SOME_ID"));
    row.setField(1, st1.get("SOME_ADDRESS"));
    return row;
});

sink.emitDataStream(rows);
env.execute("Flink101");
But I am getting the following error:
Caused by: java.lang.RuntimeException: Execution of JDBC statement failed.
at org.apache.flink.api.java.io.jdbc.JDBCOutputFormat.flush(JDBCOutputFormat.java:219)
at org.apache.flink.api.java.io.jdbc.JDBCSinkFunction.snapshotState(JDBCSinkFunction.java:43)
at org.apache.flink.streaming.util.functions.StreamingFunctionUtils.trySnapshotFunctionState(StreamingFunctionUtils.java:118)
at org.apache.flink.streaming.util.functions.StreamingFunctionUtils.snapshotFunctionState(StreamingFunctionUtils.java:99)
at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.snapshotState(AbstractUdfStreamOperator.java:90)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator.snapshotState(AbstractStreamOperator.java:356)
... 12 more
Caused by: java.sql.SQLException: Method not supported
at org.apache.hive.jdbc.HiveStatement.executeBatch(HiveStatement.java:381)
at org.apache.flink.api.java.io.jdbc.JDBCOutputFormat.flush(JDBCOutputFormat.java:216)
... 17 more
I checked the hive-jdbc driver, and it seems that executeBatch() is indeed not supported there:
public class HiveStatement implements java.sql.Statement {
    ...
    @Override
    public int[] executeBatch() throws SQLException {
        throw new SQLFeatureNotSupportedException("Method not supported");
    }
    ...
}
Is there any way we can achieve this using the JDBC driver?
Thanks in advance.
Hive's JDBC implementation is not complete yet. Your problem is tracked by this issue.
You could try to patch Flink's JDBCOutputFormat to not use batching, by replacing upload.addBatch with upload.execute in JDBCOutputFormat.java:202 and removing the call to upload.executeBatch in JDBCOutputFormat.java:216. The downside is that you issue a dedicated SQL query for every record, which might slow things down. A sketch of the patch is below.
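A minimal sketch of what the patched methods could look like, assuming the JDBCOutputFormat internals referenced by the line numbers above (upload is the class's PreparedStatement field; setRecordToStatement stands in for the existing field-binding code):

@Override
public void writeRecord(Row row) throws IOException {
    try {
        setRecordToStatement(upload, typesArray, row); // bind row fields as before
        upload.execute(); // was: upload.addBatch()
    } catch (SQLException e) {
        throw new IOException("Execution of JDBC statement failed.", e);
    }
}

void flush() {
    // was: upload.executeBatch(), which HiveStatement does not support;
    // with per-record execute() there is nothing left to flush
}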

Unable to iterate over a HashMap using Java 8

Below is a snippet of the code I am facing trouble with; I am using JDK 8. I am getting an error at the for-loop line, and I have included the error message too:
do {
    jobid = br.readLine();
    metajson = br.readLine();
    JSONObject obj = (JSONObject) jsonParser.parse(metajson);
    System.out.println(jobid + " " + obj.toString());
    // The below one should work
    for (HashMap.Entry<String, String> entry : obj.entrySet()) {
        System.out.println(entry.getKey() + "/" + entry.getValue());
    }
}
Error:
Exception in thread "main" java.lang.Error: Unresolved compilation problems:
PropertyEntry cannot be resolved to a type
Duplicate local variable entry
Entry cannot be resolved to a type
at com.journaldev.json.Insert3.main(Insert3.java:64)
There is also an error at the for-loop line that says "Type mismatch: cannot convert from element type Object to Map.Entry<String, String>".
I tried Property.Map() and concept.Map() too, but the same issue is there. I also imported the whole collections package, but the error is not resolving.
That is because the returned set of entries has the type <String, JSONObject> instead of <String, String>.
The following code compiles:
Set<Entry<String, JSONObject>> entrySet = jsonObject.entrySet();
for (Entry<String, JSONObject> entry : entrySet) {
String key = entry.getKey();
JSONObject innerJsonObject = entry.getValue();
}
Anyway, that library (json-simple) has a bad design: JSONObject inherits from the raw HashMap type (without filling in its generic type declaration), so the resulting entrySet's element type is not known at compile time.
Tip: Use Gson or another JSON library instead.
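For comparison, a minimal sketch with Gson 2.8.6+ (the JSON literal is made up): JsonObject.entrySet() is properly typed as Set<Map.Entry<String, JsonElement>>, so the enhanced for loop compiles without casts:

import com.google.gson.JsonElement;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;
import java.util.Map;

JsonObject obj = JsonParser.parseString("{\"jobId\":\"42\",\"status\":\"done\"}").getAsJsonObject();
for (Map.Entry<String, JsonElement> entry : obj.entrySet()) {
    System.out.println(entry.getKey() + "/" + entry.getValue());
}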

JSF selectItems value from sql query

I want to populate a selectOneMenu component with values extracted from the database by an SQL query. The query returns only store names, which I want to use as the values of the selectOneMenu.
I get a java.lang.IllegalArgumentException with a stack trace starting with:
java.lang.IllegalArgumentException at com.sun.faces.renderkit.SelectItemsIterator.initializeItems(SelectItemsIterator.java:216)
at com.sun.faces.renderkit.SelectItemsIterator.hasNext(SelectItemsIterator.java:135)
at com.sun.faces.renderkit.html_basic.MenuRenderer.renderOptions(MenuRenderer.java:762)
This is my XHTML code (this is the only use of selectItems):
<h:selectOneMenu id="storeName" value="#{shoplist.store}">
<f:selectItems value="#{buyHistory.stores}" />
</h:selectOneMenu>
This is query from buyHistory bean:
public ResultSet getStores() throws SQLException {
...
PreparedStatement getStores = connection.prepareStatement(
"SELECT distinct STORE_NAME "
+ "FROM BuyingHistory ORDER BY STORE_NAME");
CachedRowSet rowSet = new com.sun.rowset.CachedRowSetImpl();
rowSet.populate(getStores.executeQuery());
return rowSet;
}
What am I doing wrong? Should I convert somehow from resultSet to SelectItem array/list?
Should I convert somehow from resultSet to SelectItem array/list?
Yes, that's one of the solutions. See also our h:selectOneMenu wiki page. The IllegalArgumentException is thrown when the value is not an instance of SelectItem, an array, an Iterable, or a Map.
Ultimately, your JSF backing beans should be completely free of java(x).sql dependencies. That is, you should not have a single import java(x).sql.Something; line in your JSF code. Otherwise it's bad design anyway (tight coupling). Learn how to create proper DAO classes; a minimal sketch follows.
Why do you think that JSF would know how to transform a ResultSet from the persistence layer? JSF is a presentation layer framework :)
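As an illustration of that DAO boundary (names are hypothetical), the backing bean would depend only on an interface like this, with the JDBC code living in its implementation:

public interface StoreDao {
    // distinct store names, ordered; implemented with JDBC elsewhere
    List<String> listStoreNames();
}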
Yes, you need to convert it to a List<SelectItem>, like this:
private List<SelectItem> transformToSelectItems(ResultSet resultSet) throws SQLException {
    List<SelectItem> selectItems = new ArrayList<SelectItem>();
    while (resultSet.next()) {
        String storeName = resultSet.getString("STORE_NAME");
        SelectItem item = new SelectItem(storeName, storeName);
        selectItems.add(item);
    }
    return selectItems;
}
Be sure to notice BalusC's answer. This is just an example of how to construct a dynamic SelectItem list, but you should definitely not have a ResultSet in your JSF managed beans.
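For completeness, a sketch of how the backing bean could expose the converted list (queryStores() stands in for the JDBC call from the question): fetch and convert once, cache the result, and let <f:selectItems> read the list:

private List<SelectItem> stores;

public List<SelectItem> getStores() throws SQLException {
    if (stores == null) {
        // convert once and cache, so repeated EL calls stay cheap
        stores = transformToSelectItems(queryStores());
    }
    return stores;
}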