NPE with @JoinColumn, Play 2 and Ebean (NullPointerException)

I have the following property in my entity:
@OneToOne
@JoinColumn(unique = true)
@NotNull(message = ValidatorUtils.ERROR_NOT_NULL)
public User user;
and I get the following stack trace when I refresh my browser while the database migration is being generated:
[error] application -
! @6gmdbcgi0 - Internal server error, for (GET) [/] ->
play.api.UnexpectedException: Unexpected exception[NullPointerException: null]
at play.core.ReloadableApplication$$anonfun$get$1$$anonfun$apply$1$$anonfun$1.apply(ApplicationProvider.scala:148) ~[play_2.10.jar:2.2.0]
at play.core.ReloadableApplication$$anonfun$get$1$$anonfun$apply$1$$anonfun$1.apply(ApplicationProvider.scala:112) ~[play_2.10.jar:2.2.0]
at scala.Option.map(Option.scala:145) ~[scala-library.jar:na]
at play.core.ReloadableApplication$$anonfun$get$1$$anonfun$apply$1.apply(ApplicationProvider.scala:112) ~[play_2.10.jar:2.2.0]
at play.core.ReloadableApplication$$anonfun$get$1$$anonfun$apply$1.apply(ApplicationProvider.scala:110) ~[play_2.10.jar:2.2.0]
at scala.util.Success.flatMap(Try.scala:200) ~[scala-library.jar:na]
Caused by: java.lang.NullPointerException: null
at com.avaje.ebeaninternal.server.ddl.CreateTableVisitor.isDbColumnWritten(CreateTableVisitor.java:59) ~[avaje-ebeanorm.jar:na]
at com.avaje.ebeaninternal.server.ddl.CreateTableColumnVisitor.visitOneImported(CreateTableColumnVisitor.java:111) ~[avaje-ebeanorm.jar:na]
at com.avaje.ebeaninternal.server.ddl.VisitorUtil.visit(VisitorUtil.java:109) ~[avaje-ebeanorm.jar:na]
at com.avaje.ebeaninternal.server.ddl.VisitorUtil.visit(VisitorUtil.java:73) ~[avaje-ebeanorm.jar:na]
at com.avaje.ebeaninternal.server.ddl.VisitorUtil.visitBean(VisitorUtil.java:62) ~[avaje-ebeanorm.jar:na]
at com.avaje.ebeaninternal.server.ddl.VisitorUtil.visit(VisitorUtil.java:36) ~[avaje-ebeanorm.jar:na]
If I remove the @JoinColumn annotation, my schema is migrated correctly.
How can I specify that the column should be unique without getting this error? Thanks.
EDIT 01/09/2014: added the generated DDL.
Here is the generated DDL:
I'd like the generated DDL to contain something like:
constraint uq_user_id unique (user_id)

Instead of @JoinColumn(unique = true), try @Column(unique = true).
Also, if you want to make a combination of multiple columns unique, you can use: @Table(name = "table_name", uniqueConstraints = @UniqueConstraint(columnNames = {"column1", "column2"}))
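A minimal sketch of how the two suggestions above might be applied to the entity from the question (Profile is a hypothetical owning entity, User and ValidatorUtils are the question's own types, and "column1"/"column2" are placeholders):
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.OneToOne;
import javax.persistence.Table;
import javax.persistence.UniqueConstraint;
import javax.validation.constraints.NotNull;
import play.db.ebean.Model;

@Entity
// Multi-column uniqueness, per the second suggestion above (placeholder column names).
@Table(name = "profile", uniqueConstraints = @UniqueConstraint(columnNames = {"column1", "column2"}))
public class Profile extends Model {

    // Single-column uniqueness: @Column(unique = true) in place of @JoinColumn(unique = true),
    // as suggested above, aiming for the uq_user_id constraint on user_id in the generated DDL.
    @OneToOne
    @Column(unique = true)
    @NotNull(message = ValidatorUtils.ERROR_NOT_NULL)
    public User user;
}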

Related

Loading CSV data containing string and numeric formats into Ignite is failing

I am evaluating Ignite and trying to load CSV data into it. I have created a table in Ignite:
jdbc:ignite:thin://127.0.0.1/> create table if not exists SAMPLE_DATA_PK(SID varchar(30),id_status varchar(50), active varchar, count_opening int,count_updated int,ID_caller varchar(50),opened_time varchar(50),created_at varchar(50),type_contact varchar, location varchar,support_incharge varchar,pk varchar(10) primary key);
I tried to load data into this table with the following command:
copy from '/home/kkn/data/sample_data_pk.csv' into SAMPLE_DATA_PK(SID,ID_status,active,count_opening,count_updated,ID_caller,opened_time,created_at,type_contact,location,support_incharge,pk) format csv;
But the data load is failing with this error:
Error: Server error: class org.apache.ignite.internal.processors.query.IgniteSQLException: Value conversion failed [column=COUNT_OPENING, from=java.lang.String, to=java.lang.Integer] (state=50000,code=1)
java.sql.SQLException: Server error: class org.apache.ignite.internal.processors.query.IgniteSQLException: Value conversion failed [column=COUNT_OPENING, from=java.lang.String, to=java.lang.Integer]
at org.apache.ignite.internal.jdbc.thin.JdbcThinConnection.sendRequest(JdbcThinConnection.java:1009)
at org.apache.ignite.internal.jdbc.thin.JdbcThinStatement.sendFile(JdbcThinStatement.java:336)
at org.apache.ignite.internal.jdbc.thin.JdbcThinStatement.execute0(JdbcThinStatement.java:243)
at org.apache.ignite.internal.jdbc.thin.JdbcThinStatement.execute(JdbcThinStatement.java:560)
at sqlline.Commands.executeSingleQuery(Commands.java:1054)
at sqlline.Commands.execute(Commands.java:1003)
at sqlline.Commands.sql(Commands.java:967)
at sqlline.SqlLine.dispatch(SqlLine.java:734)
at sqlline.SqlLine.begin(SqlLine.java:541)
at sqlline.SqlLine.start(SqlLine.java:267)
at sqlline.SqlLine.main(SqlLine.java:206)
Below is the sample data I am trying to load:
SID|ID_status|active|count_opening|count_updated|ID_caller|opened_time|created_at|type_contact|location|support_incharge|pk
|---|---------|------|-------------|-------------|---------|-----------|----------|------------|--------|----------------|--|
INC0000045|New|true|1000|0|Caller2403|29-02-2016 01:16|29-02-2016 01:23|Phone|Location143||1
INC0000045|Resolved|true|0|3|Caller2403|29-02-2016 01:16|29-02-2016 01:23|Phone|Location143||2
INC0000045|Closed|false|0|1|Caller2403|29-02-2016 01:16|29-02-2016 01:23|Phone|Location143||3
INC0000047|Active|true|0|1|Caller2403|29-02-2016 04:40|29-02-2016 04:57|Phone|Location165||4
INC0000047|Active|true|0|2|Caller2403|29-02-2016 04:40|29-02-2016 04:57|Phone|Location165||5
INC0000047|Active|true|0|489|Caller2403|29-02-2016 04:40|29-02-2016 04:57|Phone|Location165||6
INC0000047|Active|true|0|5|Caller2403|29-02-2016 04:40|29-02-2016 04:57|Phone|Location165||7
INC0000047|AwaitingUserInfo|true|0|6|Caller2403|29-02-2016 04:40|29-02-2016 04:57|Phone|Location165||8
INC0000047|Closed|false|0|8|Caller2403|29-02-2016 04:40|29-02-2016 04:57|Phone|Location165||9
INC0000057|New|true|0|0|Caller4416|29-02-2016 06:10||Phone|Location204||10
I need help understanding how to figure out what the issue is and how to resolve it.
You have to upload the CSV without the header line, which contains the column names. The error is thrown when Ignite tries to convert the header value "count_opening" to an Integer.
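If trimming the header by hand is inconvenient, a small JDBC sketch along these lines could do it (file paths and the connection URL are taken from the question; writing a temporary headerless copy is just one option, and the Ignite JDBC thin driver is assumed to be on the classpath):
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.List;

public class LoadCsvWithoutHeader {
    public static void main(String[] args) throws Exception {
        // Write a copy of the CSV with the header row (column names) removed.
        Path source = Paths.get("/home/kkn/data/sample_data_pk.csv");
        Path target = Paths.get("/home/kkn/data/sample_data_pk_noheader.csv");
        List<String> lines = Files.readAllLines(source);
        Files.write(target, lines.subList(1, lines.size()));

        // Run the same COPY command from the question against the headerless file.
        try (Connection conn = DriverManager.getConnection("jdbc:ignite:thin://127.0.0.1/");
             Statement stmt = conn.createStatement()) {
            stmt.execute("copy from '" + target + "' into SAMPLE_DATA_PK"
                + "(SID,ID_status,active,count_opening,count_updated,ID_caller,opened_time,"
                + "created_at,type_contact,location,support_incharge,pk) format csv");
        }
    }
}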

Unable to create a VIEW in Apache Ignite

I am trying to create a view on Apache Ignite using the following syntax:
emplCache.query(new SqlFieldsQuery(
"CREATE VIEW EmployeeCopy AS (SELECT * FROM Employee);")).getAll();
(Assume that table Employee is available and has data in it).
When this line gets executed, I get the following exception:
javax.cache.CacheException: class org.apache.ignite.IgniteCheckedException: null
at org.apache.ignite.internal.processors.query.GridQueryProcessor.querySqlFields(GridQueryProcessor.java:1823)
at org.apache.ignite.internal.processors.cache.IgniteCacheProxy.query(IgniteCacheProxy.java:795)
at org.apache.ignite.internal.processors.cache.IgniteCacheProxy.query(IgniteCacheProxy.java:765)
at com.demo.ignite.test1.EmployeeQuery2.createCopyTable(EmployeeQuery2.java:71)
at com.demo.ignite.test1.EmployeeQuery2.main(EmployeeQuery2.java:55)
Caused by: class org.apache.ignite.IgniteCheckedException: null
at org.apache.ignite.internal.processors.query.GridQueryProcessor.executeQuery(GridQueryProcessor.java:2316)
at org.apache.ignite.internal.processors.query.GridQueryProcessor.querySqlFields(GridQueryProcessor.java:1820)
... 4 more
Caused by: java.lang.NullPointerException
at org.apache.ignite.internal.processors.query.h2.IgniteH2Indexing.queryDistributedSqlFields(IgniteH2Indexing.java:1343)
at org.apache.ignite.internal.processors.query.GridQueryProcessor$5.applyx(GridQueryProcessor.java:1815)
at org.apache.ignite.internal.processors.query.GridQueryProcessor$5.applyx(GridQueryProcessor.java:1813)
at org.apache.ignite.internal.util.lang.IgniteOutClosureX.apply(IgniteOutClosureX.java:36)
at org.apache.ignite.internal.processors.query.GridQueryProcessor.executeQuery(GridQueryProcessor.java:2293)
... 5 more
I can see that the variable "twoStepQry" is null at line 1343 of IgniteH2Indexing.java, but I am not able to tell whether I have missed something.
I am using apache-ignite-2.1.0.
Also, if I do create a VIEW, how does it work internally? Does it lock those entries in the cache, or does it copy them to some other cache?
Ignite does not support CREATE VIEW for now.
I have created a ticket for this: https://issues.apache.org/jira/browse/IGNITE-5951
The list of supported DDL statements can be found at https://apacheignite.readme.io/docs/distributed-ddl
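As a workaround sketch until views are supported, the underlying SELECT can be issued directly as a fields query (emplCache and the Employee table are the ones from the question):
import java.util.List;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.cache.query.SqlFieldsQuery;

public class EmployeeQueries {

    // Runs the view's defining SELECT directly instead of CREATE VIEW.
    public static List<List<?>> selectAllEmployees(IgniteCache<?, ?> emplCache) {
        return emplCache.query(new SqlFieldsQuery("SELECT * FROM Employee")).getAll();
    }
}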

Phoenix: using jdbcTemplate batchUpdate when a column contains a sequence

I'm using Phoenix combined with JdbcTemplate to insert data into HBase.
Phoenix 4.7.0
HBase 1.1.2
One of the columns (id) in the table is generated monotonically from a sequence. Normally I use the jdbcTemplate.execute(sql) method to insert data, for example:
upsert into table(row, id) values ('row', NEXT VALUE FOR table.id_sequence)
The second column (id) is generated automatically from the sequence, and that works fine. But when I use the jdbcTemplate.batchUpdate method with the same SQL to batch-insert, something goes wrong:
Caused by: org.apache.phoenix.exception.BatchUpdateExecution: ERROR 1106 (XCL06): Exception while executing batch.
at org.apache.phoenix.jdbc.PhoenixStatement.executeBatch(PhoenixStatement.java:1226)
at org.apache.commons.dbcp.DelegatingStatement.executeBatch(DelegatingStatement.java:297)
at org.apache.commons.dbcp.DelegatingStatement.executeBatch(DelegatingStatement.java:297)
at org.springframework.jdbc.core.JdbcTemplate$4.doInPreparedStatement(JdbcTemplate.java:905)
at org.springframework.jdbc.core.JdbcTemplate$4.doInPreparedStatement(JdbcTemplate.java:890)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:589)
... 45 more
Caused by: org.apache.phoenix.schema.TypeMismatchException: ERROR 203 (22005): Type mismatch. BIGINT and VARCHAR for NEXT VALUE FOR AQMDATA_ALL.id_sequence
How can I fix it?
Please add hbase-protocol.jar to the classpath so that it is available to the program.
Please reply if there are any issues.
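Separately, if the type mismatch comes from the sequence expression being sent as a string value during the batch, one hedged sketch (not confirmed by the answer above; table and column names are placeholders for the real schema) is to keep NEXT VALUE FOR in the SQL text and bind only the other columns:
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;
import org.springframework.jdbc.core.BatchPreparedStatementSetter;
import org.springframework.jdbc.core.JdbcTemplate;

public class PhoenixBatchUpsert {

    // Placeholder table/column names; the sequence stays in the SQL text, not in the bind values.
    private static final String UPSERT_SQL =
        "UPSERT INTO AQMDATA_ALL (row_key, id) VALUES (?, NEXT VALUE FOR AQMDATA_ALL.id_sequence)";

    public static void batchInsert(JdbcTemplate jdbcTemplate, List<String> rowKeys) {
        jdbcTemplate.batchUpdate(UPSERT_SQL, new BatchPreparedStatementSetter() {
            @Override
            public void setValues(PreparedStatement ps, int i) throws SQLException {
                // Bind only the non-sequence column for each batch entry.
                ps.setString(1, rowKeys.get(i));
            }

            @Override
            public int getBatchSize() {
                return rowKeys.size();
            }
        });
    }
}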

org.springframework.data.mapping.PropertyReferenceException: No property greaterThan found for type

Does MongoRepository have an $and operator? I get an error when I try the and operator like below. I have checked the documentation and haven't noticed an "and" keyword.
public interface AssetRepository extends MongoRepository<Asset, String>{
@Query("select a from Asset a where ownerId = ? and timeUpdated > ?")
public List<Asset> findByOwnerIdAndGreaterThan(final String ownerId, Date timeUpdated);
}
Error
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'assetRepository': Invocation of init method failed; nested exception is org.springframework.data.mapping.PropertyReferenceException: No property greaterThan found for type Asset!
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1574)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:476)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:303)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:299)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.findAutowireCandidates(DefaultListableBeanFactory.java:1120)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1044)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:942)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:813)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:741)
... 29 more
Caused by: org.springframework.data.mapping.PropertyReferenceException: No property greaterThan found for type Asset!
at org.springframework.data.mapping.PropertyPath.<init>(PropertyPath.java:75)
at org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:327)
at org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:307)
at org.springframework.data.mapping.PropertyPath.from(PropertyPath.java:270)
at org.springframework.data.mapping.PropertyPath.from(PropertyPath.java:241)
at org.springframework.data.repository.query.parser.Part.<init>(Part.java:76)
at org.springframework.data.repository.query.parser.PartTree$OrPart.<init>(PartTree.java:235)
at org.springframework.data.repository.query.parser.PartTree$Predicate.buildTree(PartTree.java:373)
at org.springframework.data.repository.query.parser.PartTree$Predicate.<init>(PartTree.java:353)
at org.springframework.data.repository.query.parser.PartTree.<init>(PartTree.java:87)
at org.springframework.data.mongodb.repository.query.PartTreeMongoQuery.<init>(PartTreeMongoQuery.java:53)
at org.springframework.data.mongodb.repository.support.MongoRepositoryFactory$MongoQueryLookupStrategy.resolveQuery(MongoRepositoryFactory.java:128)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.<init>(RepositoryFactorySupport.java:369)
at org.springframework.data.repository.core.support.RepositoryFactorySupport.getRepository(RepositoryFactorySupport.java:192)
at org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport.initAndReturn(RepositoryFactoryBeanSupport.java:239)
at org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport.afterPropertiesSet(RepositoryFactoryBeanSupport.java:225)
at org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean.afterPropertiesSet(MongoRepositoryFactoryBean.java:108)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1633)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1570)
Judging from your exception, your query lookup strategy is CREATE, or you don't import @Query from the correct package.
You should change the method name to findByOwnerIdAndTimeUpdatedGreaterThan, assuming the Asset class has a timeUpdated property.
Or, falling back to the default lookup strategy, the following should work (I think):
@Query("{ 'ownerId' : ?0 , 'timeUpdated' : {$gt : ?1}}")
Notice that you are currently using JPA syntax in @Query, not a MongoDB query.
The code doesn't work for two reasons:
1. You're using JPQL with MongoDB, which is not supported. If you want to define the query manually, use a MongoDB query (i.e. JSON).
2. The exception indicates that the query is derived from the method name. findByOwnerIdAndGreaterThan doesn't make much sense, as you need to name the property you want to apply GreaterThan to. So it probably needs to be something along the lines of findByOwnerIdAndTimeUpdatedGreaterThan, judging from the JPQL you listed.
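Putting both answers together, a sketch of the repository might look like this (assuming Asset has ownerId and timeUpdated properties, as the question implies):
import java.util.Date;
import java.util.List;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

public interface AssetRepository extends MongoRepository<Asset, String> {

    // Derived query: property name (timeUpdated) followed by the GreaterThan keyword.
    List<Asset> findByOwnerIdAndTimeUpdatedGreaterThan(String ownerId, Date timeUpdated);

    // Equivalent manually declared query in MongoDB JSON syntax, as shown above.
    @Query("{ 'ownerId' : ?0, 'timeUpdated' : { $gt : ?1 } }")
    List<Asset> findAssetsUpdatedAfter(String ownerId, Date timeUpdated);
}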

Trouble running Liquibase from different agents

I need to execute the same db-changelog with Ant and then with Spring. I expect Ant to run the changelog, and when Spring runs it should not do anything and just finish normally. Ant runs the db-changelog successfully, but then Spring runs and throws an exception; part of the stack trace:
Reason: liquibase.exception.JDBCException: Error executing SQL CREATE TABLE action (action_id int8 NOT NULL, action_name VARCHAR(255), version_no int8, reason_required BOOLEAN, comment_required BOOLEAN, step_id int8, CONSTRAINT action_pkey PRIMARY KEY (action_id)):
Caused By: Error executing SQL CREATE TABLE action (action_id int8 NOT NULL, action_name VARCHAR(255), version_no int8, reason_required BOOLEAN, comment_required BOOLEAN, step_id int8, CONSTRAINT action_pkey PRIMARY KEY (action_id)):
Caused By: ERROR: relation "action" already exists; nested exception is org.springframework.beans.factory.BeanCreationException....
Any help would be much appreciated.
It does sound like it is trying to run the changelog again. Each changeSet in the changelog is identified by a combination of its id, author, and the changelog path/filename. If you run "select * from databasechangelog" you can see the values used.
Your problem may be that you are referencing the changelog file differently from Ant and from Spring, therefore generating different filename values. Usually you will want to reference the changelog via the classpath, so that no matter where and how you run it, it has the same path (like "com/example/db.changelog.xml").
I ran into this same problem and was able to fix it by altering the FILENAME column of DATABASECHANGELOG to reference the Spring resource path. In my case I was using a ServletContextResource under the WEB-INF directory:
update DATABASECHANGELOG set FILENAME = 'WEB-INF/path/to/changelog.xml' where FILENAME = 'changelog.xml'
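For the classpath approach from the first answer, a hedged sketch of the Spring side in Java config might look like this (bean and class names are illustrative; the point is that the changelog path matches what the Ant task records in DATABASECHANGELOG):
import javax.sql.DataSource;
import liquibase.integration.spring.SpringLiquibase;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LiquibaseConfig {

    @Bean
    public SpringLiquibase liquibase(DataSource dataSource) {
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setDataSource(dataSource);
        // Use the same classpath-relative path that the Ant task uses, so the FILENAME values match.
        liquibase.setChangeLog("classpath:com/example/db.changelog.xml");
        return liquibase;
    }
}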