Spring Liquibase / HSQLDB integration

I am trying to load an embedded HSQLDB with data using the SpringLiquibase integration.
Java config:
@Configuration
@Profile("testing")
...
@Bean
public DataSource dataSource() {
    DataSource ds = new EmbeddedDatabaseBuilder()
            .setType(EmbeddedDatabaseType.HSQL)
            .setName("junittest")
            .build();
    return ds;
}

@Bean
SpringLiquibase liquibase() {
    SpringLiquibase lb = new SpringLiquibase();
    lb.setDataSource(dataSource());
    lb.setChangeLog("classpath:liquibase/example.xml");
    lb.setContexts("testing");
    return lb;
}
...
Liquibase schema (example.sql):
create table xyz (
    id integer,
    name varchar(10)
);
But I get the following exception:
Caused by: liquibase.exception.LockException: liquibase.exception.UnexpectedLiquibaseException: liquibase.snapshot.InvalidExampleException: Found multiple catalog/schemas matching null.PUBLIC
at liquibase.lockservice.StandardLockService.acquireLock(StandardLockService.java:211)
at liquibase.lockservice.StandardLockService.waitForLock(StandardLockService.java:151)
at liquibase.Liquibase.update(Liquibase.java:182)
at liquibase.Liquibase.update(Liquibase.java:174)
at liquibase.integration.spring.SpringLiquibase.performUpdate(SpringLiquibase.java:345)
at liquibase.integration.spring.SpringLiquibase.afterPropertiesSet(SpringLiquibase.java:302)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1571)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1509)
... 40 more
Caused by: liquibase.exception.UnexpectedLiquibaseException: liquibase.snapshot.InvalidExampleException: Found multiple catalog/schemas matching null.PUBLIC
at liquibase.snapshot.SnapshotGeneratorFactory.hasDatabaseChangeLogLockTable(SnapshotGeneratorFactory.java:168)
at liquibase.lockservice.StandardLockService.hasDatabaseChangeLogLockTable(StandardLockService.java:138)
at liquibase.lockservice.StandardLockService.init(StandardLockService.java:85)
at liquibase.lockservice.StandardLockService.acquireLock(StandardLockService.java:185)
... 47 more
Caused by: liquibase.snapshot.InvalidExampleException: Found multiple catalog/schemas matching null.PUBLIC
at liquibase.snapshot.jvm.SchemaSnapshotGenerator.snapshotObject(SchemaSnapshotGenerator.java:74)
at liquibase.snapshot.jvm.JdbcSnapshotGenerator.snapshot(JdbcSnapshotGenerator.java:59)
at liquibase.snapshot.SnapshotGeneratorChain.snapshot(SnapshotGeneratorChain.java:50)
at liquibase.snapshot.jvm.JdbcSnapshotGenerator.snapshot(JdbcSnapshotGenerator.java:62)
at liquibase.snapshot.SnapshotGeneratorChain.snapshot(SnapshotGeneratorChain.java:50)
at liquibase.snapshot.jvm.JdbcSnapshotGenerator.snapshot(JdbcSnapshotGenerator.java:62)
at liquibase.snapshot.SnapshotGeneratorChain.snapshot(SnapshotGeneratorChain.java:50)
at liquibase.snapshot.jvm.JdbcSnapshotGenerator.snapshot(JdbcSnapshotGenerator.java:62)
at liquibase.snapshot.SnapshotGeneratorChain.snapshot(SnapshotGeneratorChain.java:50)
at liquibase.snapshot.DatabaseSnapshot.include(DatabaseSnapshot.java:153)
at liquibase.snapshot.DatabaseSnapshot.init(DatabaseSnapshot.java:56)
at liquibase.snapshot.DatabaseSnapshot.<init>(DatabaseSnapshot.java:33)
at liquibase.snapshot.JdbcDatabaseSnapshot.<init>(JdbcDatabaseSnapshot.java:22)
at liquibase.snapshot.SnapshotGeneratorFactory.createSnapshot(SnapshotGeneratorFactory.java:126)
at liquibase.snapshot.SnapshotGeneratorFactory.createSnapshot(SnapshotGeneratorFactory.java:119)
at liquibase.snapshot.SnapshotGeneratorFactory.createSnapshot(SnapshotGeneratorFactory.java:107)
at liquibase.snapshot.SnapshotGeneratorFactory.has(SnapshotGeneratorFactory.java:97)
at liquibase.snapshot.SnapshotGeneratorFactory.hasDatabaseChangeLogLockTable(SnapshotGeneratorFactory.java:166)
... 50 more
HSQLDB / Liquibase versions: 1.8.0.10 / 3.2.3
Am I missing some config here?

Related

Trying to retrieve all rows from local H2 database with Quarkus

I'm creating a Quarkus project in Kotlin. I'm trying to implement an API where I hit the "/users" endpoint and it returns all the users I have in my local database.
Unfortunately, I'm getting an error. The stacktrace:
The stacktrace below has been reversed to show the root cause first.
org.h2.jdbc.JdbcSQLException: Column "USER0_.CREATEDAT" not found; SQL statement:
select user0_.id as id1_7_, user0_.createdAt as createda2_7_, user0_.email as email3_7_, user0_.fullName as fullname4_7_, user0_.updatedAt as updateda5_7_ from User user0_ [42122-197]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:357)
at org.h2.message.DbException.get(DbException.java:179)
at org.h2.message.DbException.get(DbException.java:155)
at org.h2.expression.ExpressionColumn.optimize(ExpressionColumn.java:150)
at org.h2.expression.Alias.optimize(Alias.java:51)
at org.h2.command.dml.Select.prepare(Select.java:858)
at org.h2.command.Parser.prepareCommand(Parser.java:283)
at org.h2.engine.Session.prepareLocal(Session.java:611)
at org.h2.engine.Session.prepareCommand(Session.java:549)
at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1247)
at org.h2.jdbc.JdbcPreparedStatement.<init>(JdbcPreparedStatement.java:76)
at org.h2.jdbc.JdbcConnection.prepareStatement(JdbcConnection.java:304)
at io.agroal.pool.wrapper.ConnectionWrapper.prepareStatement(ConnectionWrapper.java:659)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$5.doPrepare(StatementPreparerImpl.java:149)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$StatementPreparationTemplate.prepareStatement(StatementPreparerImpl.java:176)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl.prepareQueryStatement(StatementPreparerImpl.java:151)
at org.hibernate.loader.Loader.prepareQueryStatement(Loader.java:2103)
at org.hibernate.loader.Loader.executeQueryStatement(Loader.java:2040)
at org.hibernate.loader.Loader.executeQueryStatement(Loader.java:2018)
at org.hibernate.loader.Loader.doQuery(Loader.java:948)
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:349)
at org.hibernate.loader.Loader.doList(Loader.java:2849)
at org.hibernate.loader.Loader.doList(Loader.java:2831)
at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2663)
at org.hibernate.loader.Loader.list(Loader.java:2658)
at org.hibernate.loader.hql.QueryLoader.list(QueryLoader.java:506)
at org.hibernate.hql.internal.ast.QueryTranslatorImpl.list(QueryTranslatorImpl.java:400)
at org.hibernate.engine.query.spi.HQLQueryPlan.performList(HQLQueryPlan.java:219)
at org.hibernate.internal.SessionImpl.list(SessionImpl.java:1414)
at org.hibernate.query.internal.AbstractProducedQuery.doList(AbstractProducedQuery.java:1625)
at org.hibernate.query.internal.AbstractProducedQuery.list(AbstractProducedQuery.java:1593)
at org.hibernate.query.Query.getResultList(Query.java:165)
at io.quarkus.hibernate.orm.panache.common.runtime.CommonPanacheQueryImpl.list(CommonPanacheQueryImpl.java:239)
at io.quarkus.hibernate.orm.panache.kotlin.runtime.PanacheQueryImpl.list(PanacheQueryImpl.java:154)
at io.quarkus.hibernate.orm.panache.kotlin.runtime.KotlinJpaOperations.list(KotlinJpaOperations.java:24)
at io.quarkus.hibernate.orm.panache.kotlin.runtime.KotlinJpaOperations.list(KotlinJpaOperations.java:10)
at io.quarkus.hibernate.orm.panache.common.runtime.AbstractJpaOperations.listAll(AbstractJpaOperations.java:289)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository.listAll(UserRepository.kt)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository_Subclass.listAll$$superaccessor28(UserRepository_Subclass.zig:4915)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository_Subclass$$function$$28.apply(UserRepository_Subclass$$function$$28.zig:29)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:54)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.proceed(InvocationInterceptor.java:63)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.monitor(InvocationInterceptor.java:49)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor_Bean.intercept(InvocationInterceptor_Bean.zig:521)
at io.quarkus.arc.impl.InterceptorInvocation.invoke(InterceptorInvocation.java:41)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.perform(AroundInvokeInvocationContext.java:41)
at io.quarkus.arc.impl.InvocationContexts.performAroundInvoke(InvocationContexts.java:32)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository_Subclass.listAll(UserRepository_Subclass.zig:4873)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository_ClientProxy.listAll(UserRepository_ClientProxy.zig:1353)
at com.fortuneapp.backend.application.rest.UsersResource.getAllUsers(UserResource.kt:24)
at com.fortuneapp.backend.application.rest.UsersResource_Subclass.getAllUsers$$superaccessor2(UsersResource_Subclass.zig:354)
at com.fortuneapp.backend.application.rest.UsersResource_Subclass$$function$$2.apply(UsersResource_Subclass$$function$$2.zig:29)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:54)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.proceed(InvocationInterceptor.java:63)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.monitor(InvocationInterceptor.java:49)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor_Bean.intercept(InvocationInterceptor_Bean.zig:521)
at io.quarkus.arc.impl.InterceptorInvocation.invoke(InterceptorInvocation.java:41)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.perform(AroundInvokeInvocationContext.java:41)
at io.quarkus.arc.impl.InvocationContexts.performAroundInvoke(InvocationContexts.java:32)
at com.fortuneapp.backend.application.rest.UsersResource_Subclass.getAllUsers(UsersResource_Subclass.zig:312)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.jboss.resteasy.core.MethodInjectorImpl.invoke(MethodInjectorImpl.java:170)
at org.jboss.resteasy.core.MethodInjectorImpl.invoke(MethodInjectorImpl.java:130)
at org.jboss.resteasy.core.ResourceMethodInvoker.internalInvokeOnTarget(ResourceMethodInvoker.java:660)
at org.jboss.resteasy.core.ResourceMethodInvoker.invokeOnTargetAfterFilter(ResourceMethodInvoker.java:524)
at org.jboss.resteasy.core.ResourceMethodInvoker.lambda$invokeOnTarget$2(ResourceMethodInvoker.java:474)
at org.jboss.resteasy.core.interception.jaxrs.PreMatchContainerRequestContext.filter(PreMatchContainerRequestContext.java:364)
at org.jboss.resteasy.core.ResourceMethodInvoker.invokeOnTarget(ResourceMethodInvoker.java:476)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:434)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:408)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:69)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:492)
at org.jboss.resteasy.core.SynchronousDispatcher.lambda$invoke$4(SynchronousDispatcher.java:261)
at org.jboss.resteasy.core.SynchronousDispatcher.lambda$preprocess$0(SynchronousDispatcher.java:161)
at org.jboss.resteasy.core.interception.jaxrs.PreMatchContainerRequestContext.filter(PreMatchContainerRequestContext.java:364)
at org.jboss.resteasy.core.SynchronousDispatcher.preprocess(SynchronousDispatcher.java:164)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:247)
at io.quarkus.resteasy.runtime.standalone.RequestDispatcher.service(RequestDispatcher.java:73)
at io.quarkus.resteasy.runtime.standalone.VertxRequestHandler.dispatch(VertxRequestHandler.java:138)
at io.quarkus.resteasy.runtime.standalone.VertxRequestHandler.access$000(VertxRequestHandler.java:41)
at io.quarkus.resteasy.runtime.standalone.VertxRequestHandler$1.run(VertxRequestHandler.java:93)
at io.quarkus.runtime.CleanableExecutor$CleaningRunnable.run(CleanableExecutor.java:231)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.jboss.threads.EnhancedQueueExecutor$Task.run(EnhancedQueueExecutor.java:2415)
at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1452)
at org.jboss.threads.DelegatingRunnable.run(DelegatingRunnable.java:29)
at org.jboss.threads.ThreadLocalResettingRunnable.run(ThreadLocalResettingRunnable.java:29)
at java.base/java.lang.Thread.run(Thread.java:834)
at org.jboss.threads.JBossThread.run(JBossThread.java:501)
Resulted in: org.hibernate.exception.SQLGrammarException: could not prepare statement
at org.hibernate.exception.internal.SQLStateConversionDelegate.convert(SQLStateConversionDelegate.java:103)
at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:42)
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:113)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$StatementPreparationTemplate.prepareStatement(StatementPreparerImpl.java:186)
... 78 more
Resulted in: javax.persistence.PersistenceException: org.hibernate.exception.SQLGrammarException: could not prepare statement
at org.hibernate.internal.ExceptionConverterImpl.convert(ExceptionConverterImpl.java:154)
at org.hibernate.query.internal.AbstractProducedQuery.list(AbstractProducedQuery.java:1602)
... 62 more
Resulted in: org.jboss.resteasy.spi.UnhandledException: javax.persistence.PersistenceException: org.hibernate.exception.SQLGrammarException: could not prepare statement
at org.jboss.resteasy.core.ExceptionHandler.handleApplicationException(ExceptionHandler.java:106)
at org.jboss.resteasy.core.ExceptionHandler.handleException(ExceptionHandler.java:372)
at org.jboss.resteasy.core.SynchronousDispatcher.writeException(SynchronousDispatcher.java:218)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:519)
... 18 more
I've set up my local H2 database connection by adding this to my application.yaml file:
quarkus:
  datasource:
    db-kind: h2
    username: sa
    jdbc:
      url: "jdbc:h2:mem:default"
  flyway:
    migrate-at-start: true
Furthermore, I'm using https://quarkus.io/guides/hibernate-orm-panache, which is quite easy to use. I've created a User entity and a User repository in my project. I then use those in my User resource, where I define the API.
@Path("/users")
class UsersResource {

    @Inject
    lateinit var userRepository: UserRepository

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    fun getAllUsers(): GetAllUsersResponse =
        try {
            GetAllUsersSuccess(userRepository.listAll())
        } catch (e: NotFoundException) {
            GetAllUsersFailure(e)
        }
}
I've already implemented Flyway for my database migration, and that seems to go well. I'm also adding a row to the table when the migration runs, and I can see that "1 row is affected", so I think Flyway has access to the H2 database.
Do any of you guys know what I'm missing?
Kind regards
I found the problem. I defined my column names in snake case, while my entity properties were defined in camel case.
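For anyone hitting the same mismatch, one way to reconcile it is an explicit @Column mapping on the entity so that snake_case columns map to camelCase properties. A minimal sketch with plain JPA annotations (shown in Java for brevity; the Kotlin annotations are identical, and the snake_case column names and field types here are assumptions, not taken from the question):
import java.time.LocalDateTime;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
public class User {

    @Id
    public Long id;

    // Database column is snake_case, entity property is camelCase.
    @Column(name = "created_at")
    public LocalDateTime createdAt;

    @Column(name = "updated_at")
    public LocalDateTime updatedAt;

    @Column(name = "full_name")
    public String fullName;

    public String email;
}
The alternative is to rename the columns in the Flyway migration so they match the entity property names exactly.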

Apache Ignite .Net 2.8.1: Affinity collocation with QueryEntity

I'm using Apache Ignite v.2.8.1 .Net on a Windows 10 machine.
I am trying to use affinity collocation on a query-enabled cache. The entities I store in cache have a primary key named "Id" and an affinity key named "PartnerId". Both keys are of the type Int32. I'm defining the cache as follows:
new CacheConfiguration("BespokeCharge")
{
    KeyConfiguration = new[]
    {
        new CacheKeyConfiguration()
        {
            AffinityKeyFieldName = "PartnerId",
            TypeName = typeof(BespokeCharge).Name
        }
    }
};
Next I use the following code to add the data:
var cache = Ignite.GetCache<AffinityKey, BespokeCharge>("BespokeCharge");
cache.Put(new AffinityKey(entity.Id, entity.PartnerId), entity);
So far so good. Since I want to be able to use SQL to search for bespoke charges, I also add a QueryEntity configuration:
new CacheConfiguration("BespokeCharge",
    new QueryEntity(typeof(AffinityKey), typeof(BespokeCharge))
    {
        KeyFieldName = "Id",
        TableName = "BespokeCharge"
    })
{
    KeyConfiguration = new[]
    {
        new CacheKeyConfiguration()
        {
            AffinityKeyFieldName = "PartnerId",
            TypeName = typeof(BespokeCharge).Name
        }
    }
};
When I run the code, both Ignite and my app crash and the following error is logged:
JVM will be halted immediately due to the failure: [failureCtx=FailureContext [type=CRITICAL_ERROR, err=class o.a.i.i.processors.cache.persistence.tree.CorruptedTreeException: B+Tree is corrupted [pages(groupId, pageId)=[IgniteBiTuple [val1=1565129718, val2=844420635164729]], cacheId=-1278247946, cacheName=BESPOKECHARGES, indexName=BESPOKECHARGES_ID_ASC_IDX, msg=Runtime failure on row: Row#29db2fbe[ key: AffinityKey [idHash=8137191, hash=783909474, key=3, affKey=2], val: UtilityClick.BillValidation.Shared.InMemory.Model.BespokeCharge [idHash=889383506, hash=-399638125, BespokeChargeTypeId=4, ChargeValue=100.0000, ChargeValueIncCommission=100.0000, Id=3, PartnerId=2, QuoteRecordId=5] ][ 4, 100.0000, 100.0000, , 2, 5 ]]]]
When I tried to define QueryEntity with the key type of int instead of AffinityKey, I got a different error with the same outcome -- a crash.
JVM will be halted immediately due to the failure: [failureCtx=FailureContext [type=CRITICAL_ERROR, err=java.lang.ClassCastException: class o.a.i.cache.affinity.AffinityKey cannot be cast to class java.lang.Integer (o.a.i.cache.affinity.AffinityKey is in unnamed module of loader 'app'; java.lang.Integer is in module java.base of loader 'bootstrap')]]
What am I doing wrong? Thank you for your help!
The KeyFieldName = "Id" setting is the problem. It tells Ignite to use the Id field, which is an int, as the cache key, but the cache is then used with AffinityKey keys, which causes a type mismatch.
In this case I don't think KeyFieldName is needed at all; removing it fixes the problem, and SELECT queries should not be affected by this change.

WebLogic update from 11g to 12c

Currently I am trying to update the project to WebLogic 12c. After building it with JDK 8 and trying to deploy it to WebLogic 12c, it throws the exception below:
Caused by: java.lang.IllegalArgumentException: Class not found for link: DataSourceStatusEnquiryWebS: available:{AppointmentSlotSearchingWebS=class hk.org.ha.pas.webservice.AppointmentSlotSearchEJBBean, CheckIssuedGopdHandheldFolderWebS=class hk.org.ha.pas.webservice.CheckIssuedGopdHandheldFolderEJBBean, ReferralFeedbackEnquiryWebS=class hk.org.ha.pas.webservice.ReferralFeedbackEnquiryEJBBean, PasmShrRelatedServiceWebS=class hk.org.ha.pas.webservice.PasmShrRelatedServiceEJBBean, AppointmentCancelWebS_1=class hk.org.ha.pas.webservice.appointmentcancel.v1.AppointmentCancelEJBBean, OpasServiceEnquiryWebS=class hk.org.ha.pas.webservice.OpasServiceEnquiryEJBBean, OpasServiceEnquiryWebS_3=class hk.org.ha.pas.webservice.opasserviceenquiry.v3.OpasServiceEnquiryEJBBean, PasCmsSecurityWebS=class hk.org.ha.pas.webservice.PasCmsSecurityEJBBean, OpasConDisSummaryWebS_1=class hk.org.ha.pas.webservice.opascondissummary.v1.OpasConDisSummaryEJBBean, AppointmentEnquiryWebS_4=class hk.org.ha.pas.webservice.appointmentenquiry.v4.AppointmentEnquiryEJBBean, PasNtssRelatedServiceWebS_1=class hk.org.ha.pas.webservice.pasntssrelatedservice.v1.PasNtssRelatedServiceEJBBean, DmsGetOpasInfoWebS_1=class hk.org.ha.pas.webservice.dmsgetopasinfo.v1.DmsGetOpasInfoEJBBean, MarkAppointmentAssessmentConsultationStatusWebS=class hk.org.ha.pas.webservice.MarkAppointmentAssessmentConsultationStatusEJBBean, PdisRelatedServiceWebS_1=class hk.org.ha.pas.webservice.pdisrelatedservice.v1.PdisRelatedServiceEJBBean, AppointmentEnquiryWebS_3=class hk.org.ha.pas.webservice.appointmentenquiry.v3.AppointmentEnquiryEJBBean, PspPatientListWebS=class hk.org.ha.pas.webservice.PspPatientListEJBBean, CheckIssuedGopdA4FolderWebS=class hk.org.ha.pas.webservice.CheckIssuedGopdA4FolderEJBBean, PasAdsRelatedServiceWebS_1=class hk.org.ha.pas.webservice.adswebservice.v1.PasAdsRelatedServiceEJBBean, OpasMoeServiceWebS=class hk.org.ha.pas.webservice.OpasMoeServiceEJBBean, WorkStationServiceWebS_1=class hk.org.ha.pas.webservice.workstationservice.v1.WorkStationServiceEJBBean, PasCcaRelatedServiceWebS=class hk.org.ha.pas.webservice.PasCcaRelatedServiceWebSEJBBean, FcsSopdAttendanceWebS_1=class hk.org.ha.pas.webservice.fcssopdattendance.v1.FcsSopdAttendanceEJBBean, AppointmentEnquiryWebS=class hk.org.ha.pas.webservice.AppointmentEnquiryEJBBean, CheckPatientUnderCareWebS=class hk.org.ha.pas.webservice.CheckPatientUnderCareEJBBean, BookedEdcUpdateWebS=class hk.org.ha.pas.webservice.BookedEdcUpdateEJBBean, IssueGopdHandheldFolderWebS=class hk.org.ha.pas.webservice.IssueGopdHandheldFolderEJBBean, BookedEdcEnquiryWebS=class hk.org.ha.pas.webservice.BookedEdcEnquiryEJBBean, AppointmentBookingWebS=class hk.org.ha.pas.webservice.AppointmentBookingEJBBean, FcsGetOpasInfoWebS_1=class hk.org.ha.pas.webservice.fcsgetopasinfo.v1.FcsGetOpasInfoEJBBean, OpasLetterServiceWebS_3=class hk.org.ha.pas.webservice.opasletterservice.v3.OpasLetterServiceEJBBean, UpdateUnconfirmAttendanceWebS=class hk.org.ha.pas.webservice.UpdateUnconfirmAttendanceEJBBean}
at weblogic.wsee.deploy.WSEEAnnotationProcessor.loadPorts(WSEEAnnotationProcessor.java:371)
at weblogic.wsee.deploy.WSEEAnnotationProcessor.load(WSEEAnnotationProcessor.java:337)
at weblogic.wsee.deploy.WSEEAnnotationProcessor.process(WSEEAnnotationProcessor.java:75)
at weblogic.wsee.tools.jws.jaxws.JAXWSAnnotationProcessor.process(JAXWSAnnotationProcessor.java:73)
... 134 more
Caused by: java.lang.AssertionError: Unable to invoke Annotation processoror
at weblogic.j2ee.wsee.compiler.WSEEModuleHelper.processAnnotations(WSEEModuleHelper.java:287)
at weblogic.j2ee.wsee.compiler.WSEEModuleHelper.processAnnotationsWithServiceLinks(WSEEModuleHelper.java:245)
at weblogic.j2ee.wsee.compiler.WSEEModuleHelper.processAnnotations(WSEEModuleHelper.java:187)
at weblogic.wsee.tools.WSEEEJBToolsModuleExtension.processAnnotations(WSEEEJBToolsModuleExtension.java:121)
at weblogic.wsee.tools.WSEEEJBToolsModuleExtension.merge(WSEEEJBToolsModuleExtension.java:87)
at weblogic.application.compiler.flow.MergeModuleFlow.compile(MergeModuleFlow.java:44)
at weblogic.application.compiler.FlowDriver$FlowStateChange.next(FlowDriver.java:70)
at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:45)
at weblogic.application.compiler.FlowDriver.nextState(FlowDriver.java:37)
at weblogic.application.compiler.BaseMerger.merge(BaseMerger.java:20)
at weblogic.application.compiler.flow.AppMergerFlow.mergeInput(AppMergerFlow.java:75)
at weblogic.application.compiler.flow.AppMergerFlow.compile(AppMergerFlow.java:40)
at weblogic.application.compiler.FlowDriver$FlowStateChange.next(FlowDriver.java:70)
at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:45)
at weblogic.application.compiler.FlowDriver.nextState(FlowDriver.java:37)
... 115 more
and one of the web service classes is annotated with:
@Stateless(name = "DataSourceStatusEnquiryWebS",
           mappedName = "pas_service-DataSourceEnquiryEJBBean")
@SOAPBinding(style = SOAPBinding.Style.RPC)
@WebService(serviceName = "DataSourceStatusEnquiryWebS")
@Policy(uri = "policy:UsernameToken.xml")
public class DataSourceStatusEnquiryEJBBean
There was a missing definition in ejb-jar.xml. After I defined the EJB there, I could deploy successfully.
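For reference, the missing piece in such cases is a session-bean entry for the web service EJB in META-INF/ejb-jar.xml. A rough sketch of the kind of entry involved, assuming the bean name from the @Stateless annotation above and the package shown for the sibling beans in the error message (the schema version must match the Java EE level of the target WebLogic server):
<?xml version="1.0" encoding="UTF-8"?>
<ejb-jar xmlns="http://xmlns.jcp.org/xml/ns/javaee" version="3.2">
  <enterprise-beans>
    <session>
      <!-- Must match @Stateless(name = "DataSourceStatusEnquiryWebS") -->
      <ejb-name>DataSourceStatusEnquiryWebS</ejb-name>
      <ejb-class>hk.org.ha.pas.webservice.DataSourceStatusEnquiryEJBBean</ejb-class>
      <session-type>Stateless</session-type>
    </session>
  </enterprise-beans>
</ejb-jar>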

"Unknown Pair" exception when using StreamTransformer for BinaryObject

I have a cache which stores BinaryObject values in a cluster (2 nodes). The Ignite version is 2.1.0.
If I don't use any StreamReceiver (including StreamTransformer), there are no problems when adding lots of BinaryObject data with the following code:
IgniteDataStreamer<Long, BinaryObject> ds = ignite.dataStreamer(CACHE_NAME);
SecureRandom random = new SecureRandom();
long i = 0;
long count = 1000000;
while (i++ < count) {
    builder.setField("id", i);
    builder.setField("name", "Test" + i);
    builder.setField("age", random.nextInt(30));
    builder.setField("score", random.nextDouble() * 100d);
    builder.setField("birthday", new Date());
    ds.addData(i, builder.build());
    if (i % 10000 == 0) {
        System.out.println(i + " added...");
    }
}
But now I want to modify my BinaryObject values before adding them, so I tried a StreamTransformer like this:
ds.receiver(new StreamTransformer<Long, BinaryObject>() {
    @Override
    public Object process(MutableEntry<Long, BinaryObject> entry, Object... arguments)
            throws EntryProcessorException {
        Long key = entry.getKey();
        BinaryObject value = entry.getValue();
        BinaryObjectBuilder builder = value.toBuilder();
        // want to change the value of the "name" field
        builder.setField("name", "Modify" + builder.getField("name"));
        entry.setValue(builder.build());
        return null;
    }
});
while (...) {
    // ... original code to build BinaryObject data and call ds.add method
}
Unfortunately, the following exceptions occurred:
[09:52:36] Topology snapshot [ver=61, servers=2, clients=0, CPUs=8, heap=2.7GB]
10000 added...
20000 added...
30000 added...
40000 added...
[09:52:39,174][SEVERE][data-streamer-#54%null%][DataStreamerImpl] DataStreamer operation failed.
class org.apache.ignite.IgniteCheckedException: Failed to finish operation (too many remaps): 32
at org.apache.ignite.internal.processors.datastreamer.DataStreamerImpl$5.apply(DataStreamerImpl.java:869)
at org.apache.ignite.internal.processors.datastreamer.DataStreamerImpl$5.apply(DataStreamerImpl.java:834)
at org.apache.ignite.internal.util.future.GridFutureAdapter.notifyListener(GridFutureAdapter.java:382)
at org.apache.ignite.internal.util.future.GridFutureAdapter.unblock(GridFutureAdapter.java:346)
at org.apache.ignite.internal.util.future.GridFutureAdapter.unblockAll(GridFutureAdapter.java:334)
at org.apache.ignite.internal.util.future.GridFutureAdapter.onDone(GridFutureAdapter.java:494)
at org.apache.ignite.internal.util.future.GridFutureAdapter.onDone(GridFutureAdapter.java:473)
at org.apache.ignite.internal.util.future.GridFutureAdapter.onDone(GridFutureAdapter.java:461)
at org.apache.ignite.internal.processors.datastreamer.DataStreamerImpl$Buffer$2.apply(DataStreamerImpl.java:1572)
at org.apache.ignite.internal.processors.datastreamer.DataStreamerImpl$Buffer$2.apply(DataStreamerImpl.java:1562)
at org.apache.ignite.internal.util.future.GridFutureAdapter.notifyListener(GridFutureAdapter.java:382)
at org.apache.ignite.internal.util.future.GridFutureAdapter.unblock(GridFutureAdapter.java:346)
at org.apache.ignite.internal.util.future.GridFutureAdapter.unblockAll(GridFutureAdapter.java:334)
at org.apache.ignite.internal.util.future.GridFutureAdapter.onDone(GridFutureAdapter.java:494)
at org.apache.ignite.internal.util.future.GridFutureAdapter.onDone(GridFutureAdapter.java:473)
at org.apache.ignite.internal.util.future.GridFutureAdapter.onDone(GridFutureAdapter.java:461)
at org.apache.ignite.internal.processors.closure.GridClosureProcessor$2.body(GridClosureProcessor.java:967)
at org.apache.ignite.internal.util.worker.GridWorker.run(GridWorker.java:110)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: class org.apache.ignite.IgniteCheckedException: Unknown pair [platformId=0, typeId=-1496463502]
at org.apache.ignite.internal.util.IgniteUtils.cast(IgniteUtils.java:7229)
... 5 more
Caused by: java.lang.ClassNotFoundException: Unknown pair [platformId=0, typeId=-1496463502]
at org.apache.ignite.internal.MarshallerContextImpl.getClassName(MarshallerContextImpl.java:392)
at org.apache.ignite.internal.MarshallerContextImpl.getClass(MarshallerContextImpl.java:342)
at org.apache.ignite.internal.binary.BinaryContext.descriptorForTypeId(BinaryContext.java:686)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize0(BinaryReaderExImpl.java:1755)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize(BinaryReaderExImpl.java:1714)
at org.apache.ignite.internal.binary.BinaryObjectImpl.deserializeValue(BinaryObjectImpl.java:797)
at org.apache.ignite.internal.binary.BinaryObjectImpl.value(BinaryObjectImpl.java:143)
at org.apache.ignite.internal.processors.cache.CacheObjectUtils.unwrapBinary(CacheObjectUtils.java:161)
at org.apache.ignite.internal.processors.cache.CacheObjectUtils.unwrapBinaryIfNeeded(CacheObjectUtils.java:41)
at org.apache.ignite.internal.processors.cache.CacheObjectContext.unwrapBinaryIfNeeded(CacheObjectContext.java:125)
at org.apache.ignite.internal.processors.datastreamer.DataStreamerEntry$1.getValue(DataStreamerEntry.java:96)
at org.apache.ignite.stream.StreamTransformer.receive(StreamTransformer.java:45)
at org.apache.ignite.internal.processors.datastreamer.DataStreamerUpdateJob.call(DataStreamerUpdateJob.java:137)
at org.apache.ignite.internal.util.IgniteUtils.wrapThreadLoader(IgniteUtils.java:6608)
at org.apache.ignite.internal.processors.closure.GridClosureProcessor$2.body(GridClosureProcessor.java:959)
... 4 more
What should I do to fix it?
You get the ClassNotFoundException because the DataStreamer internally tries to deserialize the stored BinaryObject. To make it use BinaryObjects directly, invoke ds.keepBinary(true) before using the streamer.
Another problem in your code is the way you use the result of entry.getValue(). The entry passed to the process method represents the record previously stored in the cache, so you will most likely get a null value there. If you want the newly streamed value, use arguments[0] instead.
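Putting both points together, a minimal sketch of the corrected streamer setup based on this answer (the class and method names here are only for the sketch; the "name" field and the general flow come from the question):
import javax.cache.processor.EntryProcessorException;
import javax.cache.processor.MutableEntry;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.binary.BinaryObject;
import org.apache.ignite.binary.BinaryObjectBuilder;
import org.apache.ignite.stream.StreamTransformer;

public class StreamerExample {
    public static void streamWithTransform(Ignite ignite, String cacheName) {
        try (IgniteDataStreamer<Long, BinaryObject> ds = ignite.dataStreamer(cacheName)) {
            // Keep values in binary form so the receiving node does not try to
            // deserialize the BinaryObject into a class it may not have.
            ds.keepBinary(true);

            ds.receiver(new StreamTransformer<Long, BinaryObject>() {
                @Override
                public Object process(MutableEntry<Long, BinaryObject> entry, Object... arguments)
                        throws EntryProcessorException {
                    // entry.getValue() is the value currently stored in cache (usually
                    // null for a new key); the newly streamed value is arguments[0].
                    BinaryObject incoming = (BinaryObject) arguments[0];

                    BinaryObjectBuilder builder = incoming.toBuilder();
                    builder.setField("name", "Modify" + incoming.<String>field("name"));

                    entry.setValue(builder.build());
                    return null;
                }
            });

            // ... build BinaryObject values as in the question and call ds.addData(key, value)
        }
    }
}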

Apache Ignite: Failed to load job class [class=org.apache.ignite.internal.processors.hadoop.v2.HadoopV2Job]

I am trying to start a server using the ignite.sh script and am getting the above error (failed to load HadoopV2Job). "config/default-config.xml" is being passed to CommandLineStartup and hasn't been changed.
Has anyone come across this issue, or does anyone know how to fix it?
My Ignite version is 1.4.0 and here is the full stack trace:
class org.apache.ignite.IgniteException: Failed to start processor: HadoopProcessor [idCtr=0]
at org.apache.ignite.internal.util.IgniteUtils.convertException(IgniteUtils.java:881)
at org.apache.ignite.Ignition.start(Ignition.java:349)
at org.apache.ignite.startup.cmdline.CommandLineStartup.main(CommandLineStartup.java:302)
Caused by: class org.apache.ignite.IgniteCheckedException: Failed to start processor: HadoopProcessor [idCtr=0]
at org.apache.ignite.internal.IgniteKernal.startProcessor(IgniteKernal.java:1504)
at org.apache.ignite.internal.IgniteKernal.start(IgniteKernal.java:888)
at org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start0(IgnitionEx.java:1617)
at org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start(IgnitionEx.java:1484)
at org.apache.ignite.internal.IgnitionEx.start0(IgnitionEx.java:965)
at org.apache.ignite.internal.IgnitionEx.startConfigurations(IgnitionEx.java:892)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:784)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:705)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:576)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:546)
at org.apache.ignite.Ignition.start(Ignition.java:346)
... 1 more
Caused by: class org.apache.ignite.IgniteCheckedException: Failed to load job class [class=org.apache.ignite.internal.processors.hadoop.v2.HadoopV2Job]
at org.apache.ignite.internal.processors.hadoop.jobtracker.HadoopJobTracker.start(HadoopJobTracker.java:167)
at org.apache.ignite.internal.processors.hadoop.HadoopProcessor.start(HadoopProcessor.java:103)
at org.apache.ignite.internal.IgniteKernal.startProcessor(IgniteKernal.java:1501)
... 11 more
Caused by: java.lang.IllegalArgumentException
at org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.apache.ignite.internal.processors.hadoop.HadoopClassLoader.hasExternalDependencies(HadoopClassLoader.java:288)
at org.apache.ignite.internal.processors.hadoop.HadoopClassLoader.loadClass(HadoopClassLoader.java:162)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.ignite.internal.processors.hadoop.jobtracker.HadoopJobTracker.start(HadoopJobTracker.java:163)
... 13 more
Shane, did you compile the Ignite code with JDK 8?
It looks like the org.objectweb.asm library failed to parse the bytecode of the class "org.apache.ignite.internal.processors.hadoop.v2.HadoopV2Job" because the bytecode version is higher than 1.7:
/**
 * Constructs a new {@link ClassReader} object.
 *
 * @param b
 *            the bytecode of the class to be read.
 * @param off
 *            the start offset of the class data.
 * @param len
 *            the length of the class data.
 */
public ClassReader(final byte[] b, final int off, final int len) {
    this.b = b;
    // checks the class version
    if (readShort(off + 6) > Opcodes.V1_7) {
        throw new IllegalArgumentException();
    }
Please try to build Ignite with JDK 1.7, or specify target level = 1.7 with JDK 8. Does that solve the problem?