Kotlin-Exposed's own example on their wiki does not work - intellij-idea

This code is pulled directly from the Kotlin-Exposed wiki, yet it does not work. It is strange and frustrating that I cannot get it to work, since I have an idea for a cool project that requires an RDBMS. What am I missing? Is it broken?
import org.jetbrains.exposed.sql.StdOutSqlLogger
import org.jetbrains.exposed.sql.Database
import org.jetbrains.exposed.sql.Table
import org.jetbrains.exposed.sql.insert
import org.jetbrains.exposed.sql.transactions.transaction
import org.jetbrains.exposed.sql.selectAll
fun main(args: Array<String>) {
    Database.connect("jdbc:h2:mem:test", driver = "org.h2.Driver")

    transaction {
        logger.addLogger(StdOutSqlLogger)

        val stPeteId = Cities.insert {
            it[name] = "St. Petersburg"
        } get Cities.id

        println("Cities: ${Cities.selectAll()}")
    }
}
// Table definition
object Cities : Table() {
    val id = integer("id").autoIncrement().primaryKey()
    val name = varchar("name", 50)
}
// Entity definition
data class City(
    val id: Int,
    val name: String
)
When run in IntelliJ, I receive this error message:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" org.h2.jdbc.JdbcSQLException: Table "CITIES" not found; SQL statement:
INSERT INTO CITIES (NAME) VALUES (?) [42102-196]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:345)
at org.h2.message.DbException.get(DbException.java:179)
at org.h2.message.DbException.get(DbException.java:155)
at org.h2.command.Parser.readTableOrView(Parser.java:5552)
at org.h2.command.Parser.readTableOrView(Parser.java:5529)
at org.h2.command.Parser.parseInsert(Parser.java:1062)
at org.h2.command.Parser.parsePrepared(Parser.java:417)
at org.h2.command.Parser.parse(Parser.java:321)
at org.h2.command.Parser.parse(Parser.java:293)
at org.h2.command.Parser.prepareCommand(Parser.java:258)
at org.h2.engine.Session.prepareLocal(Session.java:578)
at org.h2.engine.Session.prepareCommand(Session.java:519)
at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1204)
at org.h2.jdbc.JdbcPreparedStatement.<init>(JdbcPreparedStatement.java:73)
at org.h2.jdbc.JdbcConnection.prepareStatement(JdbcConnection.java:288)
at org.h2.jdbc.JdbcConnection.prepareStatement(JdbcConnection.java:1188)
at org.jetbrains.exposed.sql.statements.InsertStatement.prepared(InsertStatement.kt:58)
at org.jetbrains.exposed.sql.statements.Statement.executeIn$exposed(Statement.kt:46)
at org.jetbrains.exposed.sql.Transaction.exec(Transaction.kt:103)
at org.jetbrains.exposed.sql.Transaction.exec(Transaction.kt:97)
at org.jetbrains.exposed.sql.statements.Statement.execute(Statement.kt:27)
at org.jetbrains.exposed.sql.QueriesKt.insert(Queries.kt:43)
at MainKt$main$1.invoke(Main.kt:14)
at MainKt$main$1.invoke(Main.kt)
at org.jetbrains.exposed.sql.transactions.ThreadLocalTransactionManagerKt.inTopLevelTransaction(ThreadLocalTransactionManager.kt:92)
at org.jetbrains.exposed.sql.transactions.ThreadLocalTransactionManagerKt.transaction(ThreadLocalTransactionManager.kt:64)
at org.jetbrains.exposed.sql.transactions.ThreadLocalTransactionManagerKt.transaction(ThreadLocalTransactionManager.kt:55)
at MainKt.main(Main.kt:11)
Process finished with exit code 1

I think the create statement is missing from their example. The DAO example on the project's GitHub page seems to have a few statements that the example you point to does not.
Try adding a create statement:
fun main(args: Array<String>) {
    Database.connect("jdbc:h2:mem:test", driver = "org.h2.Driver")

    transaction {
        logger.addLogger(StdOutSqlLogger)

        // ADD THIS - Create tables
        create(Cities)

        val stPeteId = Cities.insert {
            it[name] = "St. Petersburg"
        } get Cities.id

        println("Cities: ${Cities.selectAll()}")
    }
}
And if that works, I bet they would accept a PR for their documentation.
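For reference, here is a minimal, self-contained sketch of what the full program could look like with the table-creation step added. It keeps the question's API style and uses SchemaUtils.create(Cities), which is how more recent Exposed versions spell the schema-creation helper; adjust that call to whichever variant your Exposed version provides:
import org.jetbrains.exposed.sql.Database
import org.jetbrains.exposed.sql.SchemaUtils
import org.jetbrains.exposed.sql.StdOutSqlLogger
import org.jetbrains.exposed.sql.Table
import org.jetbrains.exposed.sql.insert
import org.jetbrains.exposed.sql.selectAll
import org.jetbrains.exposed.sql.transactions.transaction

// Same table definition as in the question.
object Cities : Table() {
    val id = integer("id").autoIncrement().primaryKey()
    val name = varchar("name", 50)
}

fun main(args: Array<String>) {
    Database.connect("jdbc:h2:mem:test", driver = "org.h2.Driver")

    transaction {
        logger.addLogger(StdOutSqlLogger)

        // Create the table before inserting into it; this is the missing step.
        SchemaUtils.create(Cities)

        val stPeteId = Cities.insert {
            it[name] = "St. Petersburg"
        } get Cities.id
        println("Inserted city with id $stPeteId")

        // Iterate over the result rows instead of printing the query object itself.
        Cities.selectAll().forEach { println("${it[Cities.id]}: ${it[Cities.name]}") }
    }
}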

Related

MapStruct Property has no write accessor in MetaData for target name

I have a FlatDTO that needs to be mapped to a nested Response containing InfoData and MetaData.
The code for the response is generated by OpenAPI, so the definitions below can't be changed.
package org.mapstruct.example.kotlin.openapi

import com.fasterxml.jackson.annotation.JsonProperty
import javax.validation.Valid

data class Response(
    @field:Valid @field:JsonProperty("infoData", required = true) val infoData: InfoData,
    @field:Valid @field:JsonProperty("metaData", required = true) val metaData: MetaData
)

data class InfoData(
    @field:JsonProperty("id", required = true) val id: kotlin.String,
)

data class MetaData(
    @field:JsonProperty("firstProperty") val firstProperty: String? = null,
)
I have defined FlatDTO as follows.
package org.mapstruct.example.kotlin.models

data class FlatDTO(
    var id: String? = null,
    var firstProperty: String,
)
Here is my mapper class, which maps FlatDTO to Response:
package org.mapstruct.example.kotlin.mapper

import org.mapstruct.Mapper
import org.mapstruct.Mapping
import org.mapstruct.Mappings
import org.mapstruct.example.kotlin.models.FlatDTO
import org.mapstruct.example.kotlin.openapi.Response

@Mapper
interface DataMapper {

    @Mappings(
        Mapping(target = "infoData.id", source = "id"),
        Mapping(target = "metaData.firstProperty", source = "firstProperty")
    )
    fun flatToResponse(flatDTO: FlatDTO): Response
}
When I try to build the code using mvn clean install, I get the following error.
error: Property "firstProperty" has no write accessor in MetaData for target name "metaData.firstProperty".
[ERROR] @org.mapstruct.Mappings(value = {@org.mapstruct.Mapping(target = "infoData.id", source = "id"), @org.mapstruct.Mapping(target = "metaData.firstProperty", source = "firstProperty")})
I understand that this message is saying there is no setter for firstProperty because it is declared as a val, but that generated code cannot be edited. I can write my own custom mapper that works just fine, as sketched below.
I wanted to understand if there is a way to use MapStruct in this scenario.
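For context, the kind of hand-written mapper mentioned above might look roughly like the following sketch (hypothetical; it simply builds the nested objects through their constructors, so no setters are needed):
import org.mapstruct.example.kotlin.models.FlatDTO
import org.mapstruct.example.kotlin.openapi.InfoData
import org.mapstruct.example.kotlin.openapi.MetaData
import org.mapstruct.example.kotlin.openapi.Response

// Hypothetical hand-written replacement for the generated mapper.
fun flatToResponse(flatDTO: FlatDTO): Response =
    Response(
        // Assumption: a null id is treated as an error here.
        infoData = InfoData(id = requireNotNull(flatDTO.id) { "id must not be null" }),
        metaData = MetaData(firstProperty = flatDTO.firstProperty)
    )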

Migration from Spring Batch 3 to 4

I'm trying to migrate from Spring Batch 3.0.6 to 4.1.1.
I'm stuck handling Spring Batch's execution context data as explained here.
What I would like is to update the execution context tables (batch_job_execution_context and batch_step_execution_context) as follows, for each record:
deserialize the context data using the deprecated XStreamExecutionContextStringSerializer
serialize the resulting Java object using a Jackson ObjectMapper
update the table using this new string
It has worked for some records and I could run Spring Batch 4. But for some jobs, the new string is not deserializable as an ExecutionContext.
Here is an example of such a string:
{
"setMoisAffaires":[
{
"premierJourDuMois":{
"date":1257030000000,
"stringFromDatePatternAmericain":"2009-11-01",
"hour":0,
"year":2009,
"dayOfMonth":1,
"dayOfWeek":7,
"dateString":"20091101",
"monthOfYear":11,
"minuteOfHour":0,
"secondOfMinute":0
},
"date":1257030000000,
"stringFromDatePatternAmericain":"2009-11-01",
"hour":0,
"year":2009,
"dayOfMonth":1,
"dayOfWeek":7,
"dateString":"200911",
"monthOfYear":11,
"minuteOfHour":0,
"secondOfMinute":0
},
//a lot of records like the previous...
],
"batch.taskletType":"com.sun.proxy.$Proxy57",
"batch.stepType":"org.springframework.batch.core.step.tasklet.TaskletStep"
}
Here is the trace when I try to do: ExecutionContext a = objectMapper.readValue(objectMapper.writeValueAsString(javaObject), ExecutionContext.class);
Exception in thread "main" com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "setMoisAffaires" (class org.springframework.batch.item.ExecutionContext), not marked as ignorable (one known property: "dirty"])
at [Source: (String)"{"setMoisAffaires":[{"prem...[truncated 15163 chars]; line: 1, column: 21] (through reference chain: org.springframework.batch.item.ExecutionContext["setMoisAffaires"])
at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:61)
at com.fasterxml.jackson.databind.DeserializationContext.handleUnknownProperty(DeserializationContext.java:823)
at com.fasterxml.jackson.databind.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:1153)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperty(BeanDeserializerBase.java:1589)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownVanilla(BeanDeserializerBase.java:1567)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:294)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:151)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4013)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3004)
Here is a unit test to reproduce the problem:
package springBatch4Migration;
import static org.junit.Assert.assertNotNull;
import java.io.ByteArrayInputStream;
import org.junit.Test;
import org.springframework.batch.core.repository.dao.XStreamExecutionContextStringSerializer;
import org.springframework.batch.item.ExecutionContext;
import com.fasterxml.jackson.databind.ObjectMapper;
public class SerializeTest {
@Test
public void test() throws Exception {
String oldJson = "{\"map\":[{\"entry\":[{\"string\":[\"batch.stepType\",\"org.springframework.batch.core.step.tasklet.TaskletStep\"]},{\"string\":[\"batch.taskletType\",\"com.sun.proxy.$Proxy56\"]},{\"string\":\"setMoisAffaires\",\"set\":[{\"tools.date.dates.Mois\":[{\"dateLocale\":{\"iLocalMillis\":1264982400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#resolves-to\":\"org.joda.time.chrono.ISOChronology$Stub\",\"#serialization\":\"custom\",\"org.joda.time.chrono.ISOChronology$Stub\":{\"org.joda.time.UTCDateTimeZone\":{\"#resolves-to\":\"org.joda.time.DateTimeZone$Stub\",\"#serialization\":\"custom\",\"org.joda.time.DateTimeZone$Stub\":{\"string\":\"UTC\"}}}}},\"categorie\":\"MOIS\",\"dateAsString\":201002},{\"dateLocale\":{\"iLocalMillis\":1312156800000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201108},{\"dateLocale\":{\"iLocalMillis\":1288569600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201011},{\"dateLocale\":{\"iLocalMillis\":1285891200000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201010},{\"dateLocale\":{\"iLocalMillis\":1243814400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200906},{\"dateLocale\":{\"iLocalMillis\":1257033600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200911},{\"dateLocale\":{\"iLocalMillis\":1277942400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201007},{\"dateLocale\":{\"iLocalMillis\":1249084800000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200908},{\"dateLocale\":{\"iLocalMillis\":1320105600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201111},{\"dateLocale\":{\"iLocalMillis\":1283299200000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201009},{\"dateLocale\":{\"iLocalMillis\":1328054400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201202},{\"dateLocale\":{\"iLocalMillis\":1325376000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201201},{\"dateLocale\":{\"iLocalMillis\":1296518400000,\"iChronology\":{\"#class\":\"org.
joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201102},{\"dateLocale\":{\"iLocalMillis\":1306886400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201106},{\"dateLocale\":{\"iLocalMillis\":1267401600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201003},{\"dateLocale\":{\"iLocalMillis\":1251763200000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200909},{\"dateLocale\":{\"iLocalMillis\":1262304000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201001},{\"dateLocale\":{\"iLocalMillis\":1280620800000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201008},{\"dateLocale\":{\"iLocalMillis\":1270080000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201004},{\"dateLocale\":{\"iLocalMillis\":1317427200000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201110},{\"dateLocale\":{\"iLocalMillis\":1272672000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201005},{\"dateLocale\":{\"iLocalMillis\":1304208000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201105},{\"dateLocale\":{\"iLocalMillis\":1233446400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200902},{\"dateLocale\":{\"iLocalMillis\":1254355200000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200910},{\"dateLocale\":{\"iLocalMillis\":1246406400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200907},{\"dateLocale\":{\"iLocalMillis\":1309478400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201107},{\"dateLocale\":{\"iLocalMillis\":1259625600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.M
ois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200912},{\"dateLocale\":{\"iLocalMillis\":1238544000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200904},{\"dateLocale\":{\"iLocalMillis\":1314835200000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201109},{\"dateLocale\":{\"iLocalMillis\":1330560000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201203},{\"dateLocale\":{\"iLocalMillis\":1241136000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200905},{\"dateLocale\":{\"iLocalMillis\":1322697600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201112},{\"dateLocale\":{\"iLocalMillis\":1301616000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201104},{\"dateLocale\":{\"iLocalMillis\":1275350400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201006},{\"dateLocale\":{\"iLocalMillis\":1293840000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201101},{\"dateLocale\":{\"iLocalMillis\":1235865600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200903},{\"dateLocale\":{\"iLocalMillis\":1291161600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201012},{\"dateLocale\":{\"iLocalMillis\":1298937600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201103}]}]}]}]}";
/*
* Unserializing oldJson to a Java object...
*/
XStreamExecutionContextStringSerializer oldSerializer = new XStreamExecutionContextStringSerializer();
oldSerializer.init();
ByteArrayInputStream oldInputStream = new ByteArrayInputStream(oldJson.getBytes());
Object javaObject = oldSerializer.deserialize(oldInputStream);
/*
* Serializing back to a new Json string using Jackson...
*/
ObjectMapper newSerializer = new ObjectMapper();
ExecutionContext a = newSerializer.readValue(newSerializer.writeValueAsString(javaObject),ExecutionContext.class);
assertNotNull(a);
}
}
And here is what I get if I try to run a job using such data:
ERROR - CommandLineJobRunner:367 - Job Terminated in error: Unable to deserialize the execution context
java.lang.IllegalArgumentException: Unable to deserialize the execution context
at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao$ExecutionContextRowMapper.mapRow(JdbcExecutionContextDao.java:328)
at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao$ExecutionContextRowMapper.mapRow(JdbcExecutionContextDao.java:312)
at org.springframework.jdbc.core.RowMapperResultSetExtractor.extractData(RowMapperResultSetExtractor.java:94)
at org.springframework.jdbc.core.RowMapperResultSetExtractor.extractData(RowMapperResultSetExtractor.java:61)
at org.springframework.jdbc.core.JdbcTemplate$1.doInPreparedStatement(JdbcTemplate.java:679)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:617)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:669)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:700)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:712)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:768)
at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao.getExecutionContext(JdbcExecutionContextDao.java:129)
at org.springframework.batch.core.explore.support.SimpleJobExplorer.getStepExecutionDependencies(SimpleJobExplorer.java:210)
at org.springframework.batch.core.explore.support.SimpleJobExplorer.getJobExecutions(SimpleJobExplorer.java:87)
at org.springframework.batch.core.JobParametersBuilder.getNextJobParameters(JobParametersBuilder.java:266)
at org.springframework.batch.core.launch.support.CommandLineJobRunner.start(CommandLineJobRunner.java:357)
at org.springframework.batch.core.launch.support.CommandLineJobRunner.main(CommandLineJobRunner.java:564)
at fr.mycompany.myapp.commun.batch.CommandLineJobRunnerTest.runJob(CommandLineJobRunnerTest.java:180)
at fr.mycompany.myapp.commun.batch.CommandLineJobRunnerTest.execute(CommandLineJobRunnerTest.java:496)
at fr.mycompany.myapp.commun.batch.CommandLineJobRunnerTest.main(CommandLineJobRunnerTest.java:529)
Caused by: com.fasterxml.jackson.databind.exc.MismatchedInputException: Unexpected token (START_OBJECT), expected VALUE_STRING: need JSON String that contains type id (for subtype of java.lang.Object)
at [Source: (ByteArrayInputStream); line: 1, column: 21] (through reference chain: java.util.HashMap["setMoisAffaires"])
at com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:59)
at com.fasterxml.jackson.databind.DeserializationContext.wrongTokenException(DeserializationContext.java:1499)
at com.fasterxml.jackson.databind.DeserializationContext.reportWrongTokenException(DeserializationContext.java:1274)
at com.fasterxml.jackson.databind.jsontype.impl.AsArrayTypeDeserializer._locateTypeId(AsArrayTypeDeserializer.java:151)
at com.fasterxml.jackson.databind.jsontype.impl.AsArrayTypeDeserializer._deserialize(AsArrayTypeDeserializer.java:96)
at com.fasterxml.jackson.databind.jsontype.impl.AsArrayTypeDeserializer.deserializeTypedFromAny(AsArrayTypeDeserializer.java:71)
at com.fasterxml.jackson.databind.deser.std.UntypedObjectDeserializer$Vanilla.deserializeWithType(UntypedObjectDeserializer.java:712)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer._readAndBindStringKeyMap(MapDeserializer.java:529)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(MapDeserializer.java:364)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(MapDeserializer.java:29)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4013)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3077)
at org.springframework.batch.core.repository.dao.Jackson2ExecutionContextStringSerializer.deserialize(Jackson2ExecutionContextStringSerializer.java:70)
at org.springframework.batch.core.repository.dao.Jackson2ExecutionContextStringSerializer.deserialize(Jackson2ExecutionContextStringSerializer.java:50)
at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao$ExecutionContextRowMapper.mapRow(JdbcExecutionContextDao.java:325)
... 18 more
Here is an example of a step using such a setMoisAffaires:
<bean id="jobname.jobstep" parent="jobItemParent"
      class="fr.mycompany.myapp.collecte.batch.jobname.jobstep" scope="step">
    <property name="setMoisAffaires" value="#{stepExecutionContext['setMoisAffaires']}" />
</bean>
What has been serialized with XStream is not necessarily deserializable with Jackson, so I'm not sure how you can solve this issue. See the breaking change note here.
EDIT: After adding a minimal complete verifiable example
The XStreamExecutionContextStringSerializer#deserialize method returns a Map<String, Object> of the key/value pairs of the old execution context. I would use this map directly as input to the Jackson serializer instead of marshalling it to a String and then recreating it from that String. Here is how I updated the test:
import static org.junit.Assert.assertNotNull;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.Map;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.junit.Test;
import org.springframework.batch.core.repository.dao.Jackson2ExecutionContextStringSerializer;
import org.springframework.batch.core.repository.dao.XStreamExecutionContextStringSerializer;
public class SerializeTest {
@Test
public void test() throws Exception {
String oldJson = "{\"map\":[{\"entry\":[{\"string\":[\"batch.stepType\",\"org.springframework.batch.core.step.tasklet.TaskletStep\"]},{\"string\":[\"batch.taskletType\",\"com.sun.proxy.$Proxy56\"]},{\"string\":\"setMoisAffaires\",\"set\":[{\"tools.date.dates.Mois\":[{\"dateLocale\":{\"iLocalMillis\":1264982400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#resolves-to\":\"org.joda.time.chrono.ISOChronology$Stub\",\"#serialization\":\"custom\",\"org.joda.time.chrono.ISOChronology$Stub\":{\"org.joda.time.UTCDateTimeZone\":{\"#resolves-to\":\"org.joda.time.DateTimeZone$Stub\",\"#serialization\":\"custom\",\"org.joda.time.DateTimeZone$Stub\":{\"string\":\"UTC\"}}}}},\"categorie\":\"MOIS\",\"dateAsString\":201002},{\"dateLocale\":{\"iLocalMillis\":1312156800000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201108},{\"dateLocale\":{\"iLocalMillis\":1288569600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201011},{\"dateLocale\":{\"iLocalMillis\":1285891200000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201010},{\"dateLocale\":{\"iLocalMillis\":1243814400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200906},{\"dateLocale\":{\"iLocalMillis\":1257033600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200911},{\"dateLocale\":{\"iLocalMillis\":1277942400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201007},{\"dateLocale\":{\"iLocalMillis\":1249084800000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200908},{\"dateLocale\":{\"iLocalMillis\":1320105600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201111},{\"dateLocale\":{\"iLocalMillis\":1283299200000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201009},{\"dateLocale\":{\"iLocalMillis\":1328054400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201202},{\"dateLocale\":{\"iLocalMillis\":1325376000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201201},{\"dateLocale\":{\"iLocalMillis\":1296518400000,\"iChronology\":{\"#class\":\"org.
joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201102},{\"dateLocale\":{\"iLocalMillis\":1306886400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201106},{\"dateLocale\":{\"iLocalMillis\":1267401600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201003},{\"dateLocale\":{\"iLocalMillis\":1251763200000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200909},{\"dateLocale\":{\"iLocalMillis\":1262304000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201001},{\"dateLocale\":{\"iLocalMillis\":1280620800000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201008},{\"dateLocale\":{\"iLocalMillis\":1270080000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201004},{\"dateLocale\":{\"iLocalMillis\":1317427200000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201110},{\"dateLocale\":{\"iLocalMillis\":1272672000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201005},{\"dateLocale\":{\"iLocalMillis\":1304208000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201105},{\"dateLocale\":{\"iLocalMillis\":1233446400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200902},{\"dateLocale\":{\"iLocalMillis\":1254355200000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200910},{\"dateLocale\":{\"iLocalMillis\":1246406400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200907},{\"dateLocale\":{\"iLocalMillis\":1309478400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201107},{\"dateLocale\":{\"iLocalMillis\":1259625600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.M
ois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200912},{\"dateLocale\":{\"iLocalMillis\":1238544000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200904},{\"dateLocale\":{\"iLocalMillis\":1314835200000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201109},{\"dateLocale\":{\"iLocalMillis\":1330560000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201203},{\"dateLocale\":{\"iLocalMillis\":1241136000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200905},{\"dateLocale\":{\"iLocalMillis\":1322697600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201112},{\"dateLocale\":{\"iLocalMillis\":1301616000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201104},{\"dateLocale\":{\"iLocalMillis\":1275350400000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201006},{\"dateLocale\":{\"iLocalMillis\":1293840000000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201101},{\"dateLocale\":{\"iLocalMillis\":1235865600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":200903},{\"dateLocale\":{\"iLocalMillis\":1291161600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201012},{\"dateLocale\":{\"iLocalMillis\":1298937600000,\"iChronology\":{\"#class\":\"org.joda.time.chrono.ISOChronology\",\"#reference\":\"..\\/..\\/..\\/tools.date.dates.Mois\\/dateLocale\\/iChronology\"}},\"categorie\":\"MOIS\",\"dateAsString\":201103}]}]}]}]}";
/*
* Unserializing oldJson to a Java object...
*/
XStreamExecutionContextStringSerializer oldSerializer = new XStreamExecutionContextStringSerializer();
oldSerializer.init();
ByteArrayInputStream oldInputStream = new ByteArrayInputStream(oldJson.getBytes());
Map<String, Object> javaObject = oldSerializer.deserialize(oldInputStream);
/*
* Serializing back to a new Json string using Jackson...
*/
ObjectMapper newSerializer = new ObjectMapper();
String newJson = newSerializer.writeValueAsString(javaObject);
assertNotNull(newJson);
System.out.println("newJson = " + newJson);
/*
* Or more correctly
*/
Jackson2ExecutionContextStringSerializer stringSerializer = new Jackson2ExecutionContextStringSerializer();
ByteArrayOutputStream out = new ByteArrayOutputStream();
stringSerializer.serialize(javaObject, out);
newJson = new String(out.toByteArray());
assertNotNull(newJson);
System.out.println("newJson = " + newJson);
}
}
Hope this helps.
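To round this out with the table update described in the question (rewriting each stored context row with the new serializer), a rough sketch using Spring's JdbcTemplate could look like the following. It assumes the default Spring Batch schema (BATCH_STEP_EXECUTION_CONTEXT with STEP_EXECUTION_ID and SERIALIZED_CONTEXT columns); the job-level context table, and possibly the SHORT_CONTEXT column, would need the same treatment:
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.Map;

import org.springframework.batch.core.repository.dao.Jackson2ExecutionContextStringSerializer;
import org.springframework.batch.core.repository.dao.XStreamExecutionContextStringSerializer;
import org.springframework.jdbc.core.JdbcTemplate;

public class ExecutionContextMigrator {

    // Sketch only: re-serializes every stored step execution context with the Jackson-based serializer.
    public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
        XStreamExecutionContextStringSerializer oldSerializer = new XStreamExecutionContextStringSerializer();
        oldSerializer.init();
        Jackson2ExecutionContextStringSerializer newSerializer = new Jackson2ExecutionContextStringSerializer();

        List<Map<String, Object>> rows = jdbcTemplate.queryForList(
                "SELECT STEP_EXECUTION_ID, SERIALIZED_CONTEXT FROM BATCH_STEP_EXECUTION_CONTEXT");
        for (Map<String, Object> row : rows) {
            Number id = (Number) row.get("STEP_EXECUTION_ID");
            String oldJson = (String) row.get("SERIALIZED_CONTEXT");
            if (oldJson == null) {
                continue; // short contexts are stored only in SHORT_CONTEXT
            }
            // Deserialize with the old XStream-based serializer, then re-serialize with Jackson.
            Map<String, Object> context = oldSerializer.deserialize(
                    new ByteArrayInputStream(oldJson.getBytes(StandardCharsets.UTF_8)));
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            newSerializer.serialize(context, out);
            String newJson = new String(out.toByteArray(), StandardCharsets.UTF_8);
            jdbcTemplate.update(
                    "UPDATE BATCH_STEP_EXECUTION_CONTEXT SET SERIALIZED_CONTEXT = ? WHERE STEP_EXECUTION_ID = ?",
                    newJson, id);
        }
    }
}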

Why do I receive an error when I use Exposed in Kotlin?

I need to retrieve region_name by region_code from an Oracle DB.
I use Exposed for my program, but I receive this error:
in thread "main" java.lang.AbstractMethodError
at org.jetbrains.exposed.sql.Transaction.closeExecutedStatements(Transaction.kt:181)
at org.jetbrains.exposed.sql.transactions.ThreadLocalTransactionManagerKt.inTopLevelTransaction(ThreadLocalTransactionManager.kt:137)
at org.jetbrains.exposed.sql.transactions.ThreadLocalTransactionManagerKt.transaction(ThreadLocalTransactionManager.kt:75)
Code is
object Codes : Table("REGIONS") {
    val region_code = varchar("region_code", 32)
    val region_name = varchar("region_name", 32)
}
The main fun contains
.......
val conn = Database.connect("jdbc:oracle:thin:@//...", driver = "oracle.jdbc.OracleDriver",
    user = "...", password = "...")

transaction(java.sql.Connection.TRANSACTION_READ_COMMITTED, 1, conn) {
    addLogger(StdOutSqlLogger)

    Codes.select { Codes.region_code eq "a" }.limit(1).forEach {
        print(it[Codes.region_name])
    }
}
AbstractMethodError usually means that you compiled the code against one version of a library but are running it against a different (incompatible) version. (See, for example, these questions.)
So I'd check your dependencies and their versions carefully.
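For instance, if Exposed is pulled in along more than one dependency path, forcing a single version in the build script is one way to rule out such a mismatch. A Gradle Kotlin DSL illustration follows; the coordinates and version are placeholders for whatever release you actually target:
dependencies {
    // Placeholder version: the point is that every module should resolve to the same Exposed release.
    implementation("org.jetbrains.exposed:exposed:0.17.7")
}

configurations.all {
    resolutionStrategy {
        // Fail the build instead of silently picking a different Exposed version.
        failOnVersionConflict()
    }
}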

Facing issue while using SparkUDF with multiple arguments

I am trying to encrypt the data using SHA-256 by passing it as an argument to a Spark UDF, but I am getting the error below. Please find the program snippet and error details below.
Code Snippet:
package com.sample

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import java.security.MessageDigest
import org.apache.spark.sql.functions._
import org.apache.spark.sql.expressions.UserDefinedFunction
import javax.xml.bind.DatatypeConverter;
import org.apache.spark.sql.Column

object Customer {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Customer-data").setMaster("local[2]").set("spark.executor.memory", "1g");
    val sc = new SparkContext(conf)
    val spark = SparkSession.builder().config(sc.getConf).getOrCreate()

    //val hash_algm=sc.getConf.get("halgm")
    val hash_algm = "SHA-256"

    val df = spark.read.format("csv").option("header", "true").load("file:///home/tcs/Documents/KiranDocs/Data_files/sample_data")

    spark.udf.register("encriptedVal1", encriptedVal)

    //calling encription UDF function
    //val resDF1 = df.withColumn(("ssn_number"), encriptedVal(df("customer_id"))).show()
    val resDF2 = df.withColumn(("ssn_number"), encriptedVal(array("customer_id", hash_algm))).show()

    println("data set" + resDF2)
    sc.stop()
  }

  def encriptedVal = udf((s: String, s1: String) => {
    val digest = MessageDigest.getInstance(s1)
    val hash = digest.digest(s.getBytes("UTF-8"))
    DatatypeConverter.printHexBinary(hash)
  })
}
Error details are below:
2019-01-21 19:42:48 INFO SparkContext:54 - Invoking stop() from shutdown hook
Exception in thread "main" java.lang.ClassCastException: com.sample.Customer$$anonfun$encriptedVal$1 cannot be cast to scala.Function1
    at org.apache.spark.sql.catalyst.expressions.ScalaUDF.<init>(ScalaUDF.scala:104)
    at org.apache.spark.sql.expressions.UserDefinedFunction.apply(UserDefinedFunction.scala:85)
    at com.sample.Customer$.main(Customer.scala:26)
    at com.sample.Customer.main(Customer.scala)
The problem here is how you call the defined UDF. You should use it like the following:
val resDF1 = df.withColumn(("ssn_number"), encriptedVal(df.col("customer_id"), lit(hash_algm)))
because it accepts two Column objects (both columns must be of String type, as defined in your UDF).
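Put together, a minimal self-contained version of that call might look like the sketch below (the sample rows and column names are made up to stand in for the CSV in the question):
import java.security.MessageDigest
import javax.xml.bind.DatatypeConverter

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{lit, udf}

object UdfExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("udf-example").master("local[2]").getOrCreate()
    import spark.implicits._

    // Sample data standing in for the CSV in the question.
    val df = Seq(("1", "Alice"), ("2", "Bob")).toDF("customer_id", "name")

    // Same two-argument UDF as in the question: the value to hash and the algorithm name.
    val encriptedVal = udf((s: String, algorithm: String) => {
      val digest = MessageDigest.getInstance(algorithm)
      DatatypeConverter.printHexBinary(digest.digest(s.getBytes("UTF-8")))
    })

    // Pass the algorithm as a literal Column instead of wrapping both arguments in array().
    df.withColumn("ssn_number", encriptedVal(df.col("customer_id"), lit("SHA-256"))).show(false)

    spark.stop()
  }
}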

How to enable SQL on SchemaRDD via the JDBC interface? (is it even possible ?)

UPDATING the problem statement
We are using Spark 1.2.0 (Hadoop 2.4). We have defined SchemaRDDs from data files in HDFS and would like to enable querying them as tables via HiveServer2. We are encountering runtime exceptions while trying to saveAsTable and would like guidance on how to proceed.
Source code:
package foo.bar

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql._
import org.apache.spark._
import org.apache.spark.sql.hive._

object HiveDemo {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Demo")
    val sc = new SparkContext(conf)

    // sc is an existing SparkContext.
    val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

    // Create an RDD
    val zipRDD = sc.textFile("/model-inputs/all_zip_state.csv")

    // The schema is encoded in a string
    val schemaString = "ODSMEMBERID,ZIPCODE,STATE,TEST_SUPPLIERID,ratio_death_readm_low,ratio_death_readm_high,regions"

    // Generate the schema based on the string of schema
    val schema =
      StructType(
        schemaString.split(",").map(fieldName => StructField(fieldName, StringType, true)))

    // Convert records of the RDD (zip) to Rows.
    val rowRDD = zipRDD.map(_.split(",")).map(p => Row(p(0), p(1), p(2), p(3), p(4), p(5), ""))

    // Apply the schema to the RDD.
    val zipSchemaRDD = hiveContext.applySchema(rowRDD, schema)

    // HiveContext's save as Table
    zipSchemaRDD.saveAsTable("allzipstable")
  }
}
spark-submit Command:
./bin/spark-submit --class foo.bar.HiveDemo --master yarn-cluster --jars /usr/lib/hive/lib/hive-metastore.jar,/usr/lib/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar,/usr/lib/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar,/usr/lib/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar --num-executors 3 --driver-memory 4g --executor-memory 2g --executor-cores 1 lib/datapipe_2.10-1.0.jar 10
Exception at runtime on Node:
15/01/29 22:35:50 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: Unresolved plan found, tree:
'CreateTableAsSelect None, allzipstable, false, None
LogicalRDD [ODSMEMBERID#0,ZIPCODE#1,STATE#2,TEST_SUPPLIERID#3,ratio_death_readm_low#4,ratio_death_readm_high#5,regions#6], MappedRDD[3] at map at HiveDemo.scala:30
)
Exception in thread "Driver" org.apache.spark.sql.catalyst.errors.package$TreeNodeException: Unresolved plan found, tree:
'CreateTableAsSelect None, allzipstable, false, None
LogicalRDD [ODSMEMBERID#0,ZIPCODE#1,STATE#2,TEST_SUPPLIERID#3,ratio_death_readm_low#4,ratio_death_readm_high#5,regions#6], MappedRDD[3] at map at HiveDemo.scala:30
at org.apache.spark.sql.catalyst.analysis.Analyzer$CheckResolution$$anonfun$1.applyOrElse(Analyzer.scala:83)
at org.apache.spark.sql.catalyst.analysis.Analyzer$CheckResolution$$anonfun$1.applyOrElse(Analyzer.scala:78)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:144)
at org.apache.spark.sql.catalyst.trees.TreeNode.transform(TreeNode.scala:135)
at org.apache.spark.sql.catalyst.analysis.Analyzer$CheckResolution$.apply(Analyzer.scala:78)
at org.apache.spark.sql.catalyst.analysis.Analyzer$CheckResolution$.apply(Analyzer.scala:76)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1$$anonfun$apply$2.apply(RuleExecutor.scala:61)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1$$anonfun$apply$2.apply(RuleExecutor.scala:59)
at scala.collection.IndexedSeqOptimized$class.foldl(IndexedSeqOptimized.scala:51)
at scala.collection.IndexedSeqOptimized$class.foldLeft(IndexedSeqOptimized.scala:60)
at scala.collection.mutable.WrappedArray.foldLeft(WrappedArray.scala:34)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1.apply(RuleExecutor.scala:59)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1.apply(RuleExecutor.scala:51)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.apply(RuleExecutor.scala:51)
at org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:411)
at org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:411)
at org.apache.spark.sql.SQLContext$QueryExecution.withCachedData$lzycompute(SQLContext.scala:412)
at org.apache.spark.sql.SQLContext$QueryExecution.withCachedData(SQLContext.scala:412)
at org.apache.spark.sql.SQLContext$QueryExecution.optimizedPlan$lzycompute(SQLContext.scala:413)
at org.apache.spark.sql.SQLContext$QueryExecution.optimizedPlan(SQLContext.scala:413)
at org.apache.spark.sql.SQLContext$QueryExecution.sparkPlan$lzycompute(SQLContext.scala:418)
at org.apache.spark.sql.SQLContext$QueryExecution.sparkPlan(SQLContext.scala:416)
at org.apache.spark.sql.SQLContext$QueryExecution.executedPlan$lzycompute(SQLContext.scala:422)
at org.apache.spark.sql.SQLContext$QueryExecution.executedPlan(SQLContext.scala:422)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
at org.apache.spark.sql.SchemaRDDLike$class.saveAsTable(SchemaRDDLike.scala:126)
at org.apache.spark.sql.SchemaRDD.saveAsTable(SchemaRDD.scala:108)
at com.healthagen.datapipe.ahm.HiveDemo$.main(HiveDemo.scala:36)
at com.healthagen.datapipe.ahm.HiveDemo.main(HiveDemo.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:427)
15/01/29 22:35:50 INFO yarn.ApplicationMaster: Invoking sc stop from shutdown hook
Another attempt:
package foo.bar

import org.apache.spark.{ SparkConf, SparkContext }
import org.apache.spark.sql._

case class AllZips(
  ODSMEMBERID: String,
  ZIPCODE: String,
  STATE: String,
  TEST_SUPPLIERID: String,
  ratio_death_readm_low: String,
  ratio_death_readm_high: String,
  regions: String)

object HiveDemo {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("HiveDemo")
    val sc = new SparkContext(conf)
    val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
    import hiveContext._

    val allZips = sc.textFile("/model-inputs/all_zip_state.csv").map(_.split(",")).map(p => AllZips(p(0), p(1), p(2), p(3), p(4), p(5), ""))
    val allZipsSchemaRDD = createSchemaRDD(allZips)
    allZipsSchemaRDD.saveAsTable("allzipstable")
  }
}
Exception on node:
15/01/30 00:28:19 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: Unresolved plan found, tree:
'CreateTableAsSelect None, allzipstable, false, None
LogicalRDD [ODSMEMBERID#0,ZIPCODE#1,STATE#2,TEST_SUPPLIERID#3,ratio_death_readm_low#4,ratio_death_readm_high#5,regions#6], MapPartitionsRDD[4] at mapPartitions at ExistingRDD.scala:36
)
Exception in thread "Driver" org.apache.spark.sql.catalyst.errors.package$TreeNodeException: Unresolved plan found, tree:
'CreateTableAsSelect None, allzipstable, false, None
LogicalRDD [ODSMEMBERID#0,ZIPCODE#1,STATE#2,TEST_SUPPLIERID#3,ratio_death_readm_low#4,ratio_death_readm_high#5,regions#6], MapPartitionsRDD[4] at mapPartitions at ExistingRDD.scala:36
at org.apache.spark.sql.catalyst.analysis.Analyzer$CheckResolution$$anonfun$1.applyOrElse(Analyzer.scala:83)
at org.apache.spark.sql.catalyst.analysis.Analyzer$CheckResolution$$anonfun$1.applyOrElse(Analyzer.scala:78)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:144)
at org.apache.spark.sql.catalyst.trees.TreeNode.transform(TreeNode.scala:135)
at org.apache.spark.sql.catalyst.analysis.Analyzer$CheckResolution$.apply(Analyzer.scala:78)
at org.apache.spark.sql.catalyst.analysis.Analyzer$CheckResolution$.apply(Analyzer.scala:76)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1$$anonfun$apply$2.apply(RuleExecutor.scala:61)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1$$anonfun$apply$2.apply(RuleExecutor.scala:59)
at scala.collection.IndexedSeqOptimized$class.foldl(IndexedSeqOptimized.scala:51)
at scala.collection.IndexedSeqOptimized$class.foldLeft(IndexedSeqOptimized.scala:60)
at scala.collection.mutable.WrappedArray.foldLeft(WrappedArray.scala:34)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1.apply(RuleExecutor.scala:59)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1.apply(RuleExecutor.scala:51)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.apply(RuleExecutor.scala:51)
at org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:411)
at org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:411)
at org.apache.spark.sql.SQLContext$QueryExecution.withCachedData$lzycompute(SQLContext.scala:412)
at org.apache.spark.sql.SQLContext$QueryExecution.withCachedData(SQLContext.scala:412)
at org.apache.spark.sql.SQLContext$QueryExecution.optimizedPlan$lzycompute(SQLContext.scala:413)
at org.apache.spark.sql.SQLContext$QueryExecution.optimizedPlan(SQLContext.scala:413)
at org.apache.spark.sql.SQLContext$QueryExecution.sparkPlan$lzycompute(SQLContext.scala:418)
at org.apache.spark.sql.SQLContext$QueryExecution.sparkPlan(SQLContext.scala:416)
at org.apache.spark.sql.SQLContext$QueryExecution.executedPlan$lzycompute(SQLContext.scala:422)
at org.apache.spark.sql.SQLContext$QueryExecution.executedPlan(SQLContext.scala:422)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
at org.apache.spark.sql.SchemaRDDLike$class.saveAsTable(SchemaRDDLike.scala:126)
at org.apache.spark.sql.SchemaRDD.saveAsTable(SchemaRDD.scala:108)
at com.healthagen.datapipe.ahm.HiveDemo$.main(HiveDemo.scala:22)
at com.healthagen.datapipe.ahm.HiveDemo.main(HiveDemo.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:427)
15/01/30 00:28:19 INFO yarn.ApplicationMaster: Invoking sc stop from shutdown hook
You need to use a HiveContext
Here are the java/scala docs:
/**
 * Note that this currently only works with SchemaRDDs that are created from a HiveContext as
 * there is no notion of a persisted catalog in a standard SQL context.
 */
@Experimental
def saveAsTable(tableName: String): Unit =
  sqlContext.executePlan(CreateTableAsSelect(None, tableName, logicalPlan, false)).toRdd
So in your code, change it to:
val sc = new HiveContext(conf)
Actually, you should rename it to:
val sqlc = new HiveContext(conf)
FYI, here is more info about registering tables (in SQLContext); note that the tables are transient if done this way:
/**
 * Temporary tables exist only
 * during the lifetime of this instance of SQLContext.
 *
 * @group userf
 */
def registerRDDAsTable(rdd: SchemaRDD, tableName: String): Unit = {
  catalog.registerTable(Seq(tableName), rdd.queryExecution.logical)
}
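As an illustration of that route (a sketch only), the SchemaRDD from the first listing could be registered as a transient table and queried through the same HiveContext, instead of being persisted with saveAsTable:
// Register the SchemaRDD as a temporary table; it only exists for the lifetime of this context.
zipSchemaRDD.registerTempTable("allzipstable")

// Query it through the same HiveContext.
hiveContext.sql("SELECT ZIPCODE, STATE FROM allzipstable LIMIT 10").collect().foreach(println)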
UPDATE: Your new stack trace includes the following phrase:
Unresolved plan found, tree:
That typically means you have a column that does not match the underlying table. I will look further to see if I am able to isolate it, but in the meantime you might also investigate from that perspective.
The createSchemaRDD code snippet from above works fine on Spark 1.2.1.
There was a CTAS defect in 1.2.0.