I am currently trying to upgrade the project to WebLogic 12c. After building it with JDK 8 and deploying it to WebLogic 12c, it throws the exception below:
Caused by: java.lang.IllegalArgumentException: Class not found for link: DataSourceStatusEnquiryWebS: available:{AppointmentSlotSearchingWebS=class hk.org.ha.pas.webservice.AppointmentSlotSearchEJBBean, CheckIssuedGopdHandheldFolderWebS=class hk.org.ha.pas.webservice.CheckIssuedGopdHandheldFolderEJBBean, ReferralFeedbackEnquiryWebS=class hk.org.ha.pas.webservice.ReferralFeedbackEnquiryEJBBean, PasmShrRelatedServiceWebS=class hk.org.ha.pas.webservice.PasmShrRelatedServiceEJBBean, AppointmentCancelWebS_1=class hk.org.ha.pas.webservice.appointmentcancel.v1.AppointmentCancelEJBBean, OpasServiceEnquiryWebS=class hk.org.ha.pas.webservice.OpasServiceEnquiryEJBBean, OpasServiceEnquiryWebS_3=class hk.org.ha.pas.webservice.opasserviceenquiry.v3.OpasServiceEnquiryEJBBean, PasCmsSecurityWebS=class hk.org.ha.pas.webservice.PasCmsSecurityEJBBean, OpasConDisSummaryWebS_1=class hk.org.ha.pas.webservice.opascondissummary.v1.OpasConDisSummaryEJBBean, AppointmentEnquiryWebS_4=class hk.org.ha.pas.webservice.appointmentenquiry.v4.AppointmentEnquiryEJBBean, PasNtssRelatedServiceWebS_1=class hk.org.ha.pas.webservice.pasntssrelatedservice.v1.PasNtssRelatedServiceEJBBean, DmsGetOpasInfoWebS_1=class hk.org.ha.pas.webservice.dmsgetopasinfo.v1.DmsGetOpasInfoEJBBean, MarkAppointmentAssessmentConsultationStatusWebS=class hk.org.ha.pas.webservice.MarkAppointmentAssessmentConsultationStatusEJBBean, PdisRelatedServiceWebS_1=class hk.org.ha.pas.webservice.pdisrelatedservice.v1.PdisRelatedServiceEJBBean, AppointmentEnquiryWebS_3=class hk.org.ha.pas.webservice.appointmentenquiry.v3.AppointmentEnquiryEJBBean, PspPatientListWebS=class hk.org.ha.pas.webservice.PspPatientListEJBBean, CheckIssuedGopdA4FolderWebS=class hk.org.ha.pas.webservice.CheckIssuedGopdA4FolderEJBBean, PasAdsRelatedServiceWebS_1=class hk.org.ha.pas.webservice.adswebservice.v1.PasAdsRelatedServiceEJBBean, OpasMoeServiceWebS=class hk.org.ha.pas.webservice.OpasMoeServiceEJBBean, WorkStationServiceWebS_1=class hk.org.ha.pas.webservice.workstationservice.v1.WorkStationServiceEJBBean, PasCcaRelatedServiceWebS=class hk.org.ha.pas.webservice.PasCcaRelatedServiceWebSEJBBean, FcsSopdAttendanceWebS_1=class hk.org.ha.pas.webservice.fcssopdattendance.v1.FcsSopdAttendanceEJBBean, AppointmentEnquiryWebS=class hk.org.ha.pas.webservice.AppointmentEnquiryEJBBean, CheckPatientUnderCareWebS=class hk.org.ha.pas.webservice.CheckPatientUnderCareEJBBean, BookedEdcUpdateWebS=class hk.org.ha.pas.webservice.BookedEdcUpdateEJBBean, IssueGopdHandheldFolderWebS=class hk.org.ha.pas.webservice.IssueGopdHandheldFolderEJBBean, BookedEdcEnquiryWebS=class hk.org.ha.pas.webservice.BookedEdcEnquiryEJBBean, AppointmentBookingWebS=class hk.org.ha.pas.webservice.AppointmentBookingEJBBean, FcsGetOpasInfoWebS_1=class hk.org.ha.pas.webservice.fcsgetopasinfo.v1.FcsGetOpasInfoEJBBean, OpasLetterServiceWebS_3=class hk.org.ha.pas.webservice.opasletterservice.v3.OpasLetterServiceEJBBean, UpdateUnconfirmAttendanceWebS=class hk.org.ha.pas.webservice.UpdateUnconfirmAttendanceEJBBean}
at weblogic.wsee.deploy.WSEEAnnotationProcessor.loadPorts(WSEEAnnotationProcessor.java:371)
at weblogic.wsee.deploy.WSEEAnnotationProcessor.load(WSEEAnnotationProcessor.java:337)
at weblogic.wsee.deploy.WSEEAnnotationProcessor.process(WSEEAnnotationProcessor.java:75)
at weblogic.wsee.tools.jws.jaxws.JAXWSAnnotationProcessor.process(JAXWSAnnotationProcessor.java:73)
... 134 more
Caused by: java.lang.AssertionError: Unable to invoke Annotation processoror
at weblogic.j2ee.wsee.compiler.WSEEModuleHelper.processAnnotations(WSEEModuleHelper.java:287)
at weblogic.j2ee.wsee.compiler.WSEEModuleHelper.processAnnotationsWithServiceLinks(WSEEModuleHelper.java:245)
at weblogic.j2ee.wsee.compiler.WSEEModuleHelper.processAnnotations(WSEEModuleHelper.java:187)
at weblogic.wsee.tools.WSEEEJBToolsModuleExtension.processAnnotations(WSEEEJBToolsModuleExtension.java:121)
at weblogic.wsee.tools.WSEEEJBToolsModuleExtension.merge(WSEEEJBToolsModuleExtension.java:87)
at weblogic.application.compiler.flow.MergeModuleFlow.compile(MergeModuleFlow.java:44)
at weblogic.application.compiler.FlowDriver$FlowStateChange.next(FlowDriver.java:70)
at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:45)
at weblogic.application.compiler.FlowDriver.nextState(FlowDriver.java:37)
at weblogic.application.compiler.BaseMerger.merge(BaseMerger.java:20)
at weblogic.application.compiler.flow.AppMergerFlow.mergeInput(AppMergerFlow.java:75)
at weblogic.application.compiler.flow.AppMergerFlow.compile(AppMergerFlow.java:40)
at weblogic.application.compiler.FlowDriver$FlowStateChange.next(FlowDriver.java:70)
at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:45)
at weblogic.application.compiler.FlowDriver.nextState(FlowDriver.java:37)
... 115 more
One of the web service classes is annotated with:
@Stateless(name = "DataSourceStatusEnquiryWebS",
           mappedName = "pas_service-DataSourceEnquiryEJBBean")
@SOAPBinding(style = SOAPBinding.Style.RPC)
@WebService(serviceName = "DataSourceStatusEnquiryWebS")
@Policy(uri = "policy:UsernameToken.xml")
public class DataSourceStatusEnquiryEJBBean
The problem was a missing definition in ejb-jar.xml. After I defined the EJB there, the application deployed successfully.
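For anyone hitting the same error: the missing piece was a session-bean entry for the linked name in ejb-jar.xml. A minimal sketch of what such an entry could look like, with the bean's package assumed from the sibling beans listed in the error message:
<enterprise-beans>
    <session>
        <ejb-name>DataSourceStatusEnquiryWebS</ejb-name>
        <!-- package assumed from the other beans in the error message -->
        <ejb-class>hk.org.ha.pas.webservice.DataSourceStatusEnquiryEJBBean</ejb-class>
        <session-type>Stateless</session-type>
        <transaction-type>Container</transaction-type>
    </session>
</enterprise-beans>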
I am setting up the structure for MUnit tests in our API and I am trying to use the following folder structure:
src/test/munit
data.dwl
testdata/api-common/attributes.json
testdata/api-common/audit.json
testdata/api-common/common.dwl
testdata/getbalances/getbalancesrequest.json
testdata/getbalances/getbalancesresponse.json
testdata/getbalances/getbalancesdata.dwl
Sample DWL code is as follows:
common.dwl
import getResourceAsString from MunitTools
var attributes = readUrl('classpath://testdata/api-common.json')
var audit = readUrl('classpath://testdata/audit.json')
data.dwl
import getResourceAsString from MunitTools
var common = readUrl('classpath://testdata/api-common/common.dwl')
This is the way I use it inside the set-event:
<munit-tools:then-return>
    <munit-tools:attributes value="#[output application/java --- data::common::attributes]" mediaType="application/java" encoding="UTF-8" />
</munit-tools:then-return>
But I am getting an exception like "Missing Mapping Expression ie. var a = 1":
WARNING: Using Weave Reader at Runtime May Cause Performance Issues.
It is strongly advised to either use it with onlyData=true or try another
MimeType. This format was designed for debugging and design only.
org.mule.munit.runner.model.TestExecutionException: Error [MULE:EXPRESSION] while running test 'credit-api-getBalances-test-suite':
"Exception while reading 'classpath://testdata/cr...' as 'application/dw' caused by: Missing Mapping Expression ie. var a = 1
3| var attributes = readUrl('classpath://testdata/credit-api-common/attributes.json')
Trace: at root::main (line: 3, column: 83)
4| var attributes = readUrl('classpath://testdata/credit-api-common/commonTestData.dwl')
Trace: at data::readUrl (line: 4, column: 26)
at data::main (line: 4, column: 18)"
evaluating expression: "output application/java --- data::attributes"
at org.mule.munit.runner.flow.TestFlow.run(TestFlow.java:320)
at org.mule.munit.runner.model.Test.run(Test.java:94)
at org.mule.munit.runner.model.Suite.run(Suite.java:112)
at org.mule.munit.runner.SuiteRunner.doRun(SuiteRunner.java:61)
at org.mule.munit.runner.SuiteRunner.run(SuiteRunner.java:46)
at org.mule.munit.runner.remote.api.server.RunMessageHandler.runSuite(RunMessageHandler.java:99)
at org.mule.munit.runner.remote.api.server.RunMessageHandler.parseAndRun(RunMessageHandler.java:82)
at org.mule.munit.runner.remote.api.server.RunMessageHandler.handle(RunMessageHandler.java:75)
at org.mule.munit.runner.remote.api.server.RunnerServer.handleClientMessage(RunnerServer.java:145)
at org.mule.munit.runner.remote.api.server.RunnerServer.run(RunnerServer.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.mule.service.scheduler.internal.AbstractRunnableFutureDecorator.doRun(AbstractRunnableFutureDecorator.java:113)
at org.mule.service.scheduler.internal.RunnableFutureDecorator.run(RunnableFutureDecorator.java:54)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: org.mule.runtime.api.component.execution.ComponentExecutionException: org.mule.runtime.core.api.expression.ExpressionRuntimeException: "Exception while reading 'classpath://testdata/cr...' as 'application/dw' caused by: Missing Mapping Expression ie. var a = 1
3| var attributes = readUrl('classpath://testdata/credit-api-common/attributes.json')
Trace: at root::main (line: 3, column: 83)
4| var attributes = readUrl('classpath://testdata/credit-api-common/commonTestData.dwl')
Trace: at data::readUrl (line: 4, column: 26)
at data::main (line: 4, column: 18)"
evaluating expression: "output application/java --- data::attributes".
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.mule.munit.runner.flow.TestFlow$ExceptionAwareCallable.call(TestFlow.java:384)
at org.mule.munit.runner.flow.TestFlow$ExceptionAwareCallable.call(TestFlow.java:373)
... 6 more
Caused by: org.mule.runtime.api.component.execution.ComponentExecutionException: org.mule.runtime.core.api.expression.ExpressionRuntimeException: "Exception while reading 'classpath://testdata/cr...' as 'application/dw' caused by: Missing Mapping Expression ie. var a = 1
3| var attributes = readUrl('classpath://testdata/credit-api-common/attributes.json')
Trace: at root::main (line: 3, column: 83)
4| var attributes = readUrl('classpath://testdata/credit-api-common/commonTestData.dwl')
Trace: at data::readUrl (line: 4, column: 26)
at data::main (line: 4, column: 18)"
evaluating expression: "output application/java --- data::attributes".
Caused by: org.mule.runtime.core.api.expression.ExpressionRuntimeException: "Exception while reading 'classpath://testdata/cr...' as 'application/dw' caused by:
Could you please help me fix this and tell me what I am doing wrong?
Try telling readUrl() that you are actually trying to read a JSON file by using its second argument, 'contentType'.
Example:
var audit = readUrl('classpath://testdata/audit.json', 'application/json')
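Applied to the common.dwl from the question, the same fix would look roughly like this (the paths are assumptions based on the folder structure above):
import getResourceAsString from MunitTools
var attributes = readUrl('classpath://testdata/api-common/attributes.json', 'application/json')
var audit = readUrl('classpath://testdata/api-common/audit.json', 'application/json')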
I'm creating a Quarkus project in Kotlin. I'm trying to implement an API where I hit the "/users" endpoint, and it returns all the users I have in my local database.
Unfortunately, I'm getting an error. The stacktrace:
The stacktrace below has been reversed to show the root cause first.
org.h2.jdbc.JdbcSQLException: Column "USER0_.CREATEDAT" not found; SQL statement:
select user0_.id as id1_7_, user0_.createdAt as createda2_7_, user0_.email as email3_7_, user0_.fullName as fullname4_7_, user0_.updatedAt as updateda5_7_ from User user0_ [42122-197]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:357)
at org.h2.message.DbException.get(DbException.java:179)
at org.h2.message.DbException.get(DbException.java:155)
at org.h2.expression.ExpressionColumn.optimize(ExpressionColumn.java:150)
at org.h2.expression.Alias.optimize(Alias.java:51)
at org.h2.command.dml.Select.prepare(Select.java:858)
at org.h2.command.Parser.prepareCommand(Parser.java:283)
at org.h2.engine.Session.prepareLocal(Session.java:611)
at org.h2.engine.Session.prepareCommand(Session.java:549)
at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1247)
at org.h2.jdbc.JdbcPreparedStatement.<init>(JdbcPreparedStatement.java:76)
at org.h2.jdbc.JdbcConnection.prepareStatement(JdbcConnection.java:304)
at io.agroal.pool.wrapper.ConnectionWrapper.prepareStatement(ConnectionWrapper.java:659)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$5.doPrepare(StatementPreparerImpl.java:149)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$StatementPreparationTemplate.prepareStatement(StatementPreparerImpl.java:176)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl.prepareQueryStatement(StatementPreparerImpl.java:151)
at org.hibernate.loader.Loader.prepareQueryStatement(Loader.java:2103)
at org.hibernate.loader.Loader.executeQueryStatement(Loader.java:2040)
at org.hibernate.loader.Loader.executeQueryStatement(Loader.java:2018)
at org.hibernate.loader.Loader.doQuery(Loader.java:948)
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:349)
at org.hibernate.loader.Loader.doList(Loader.java:2849)
at org.hibernate.loader.Loader.doList(Loader.java:2831)
at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2663)
at org.hibernate.loader.Loader.list(Loader.java:2658)
at org.hibernate.loader.hql.QueryLoader.list(QueryLoader.java:506)
at org.hibernate.hql.internal.ast.QueryTranslatorImpl.list(QueryTranslatorImpl.java:400)
at org.hibernate.engine.query.spi.HQLQueryPlan.performList(HQLQueryPlan.java:219)
at org.hibernate.internal.SessionImpl.list(SessionImpl.java:1414)
at org.hibernate.query.internal.AbstractProducedQuery.doList(AbstractProducedQuery.java:1625)
at org.hibernate.query.internal.AbstractProducedQuery.list(AbstractProducedQuery.java:1593)
at org.hibernate.query.Query.getResultList(Query.java:165)
at io.quarkus.hibernate.orm.panache.common.runtime.CommonPanacheQueryImpl.list(CommonPanacheQueryImpl.java:239)
at io.quarkus.hibernate.orm.panache.kotlin.runtime.PanacheQueryImpl.list(PanacheQueryImpl.java:154)
at io.quarkus.hibernate.orm.panache.kotlin.runtime.KotlinJpaOperations.list(KotlinJpaOperations.java:24)
at io.quarkus.hibernate.orm.panache.kotlin.runtime.KotlinJpaOperations.list(KotlinJpaOperations.java:10)
at io.quarkus.hibernate.orm.panache.common.runtime.AbstractJpaOperations.listAll(AbstractJpaOperations.java:289)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository.listAll(UserRepository.kt)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository_Subclass.listAll$$superaccessor28(UserRepository_Subclass.zig:4915)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository_Subclass$$function$$28.apply(UserRepository_Subclass$$function$$28.zig:29)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:54)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.proceed(InvocationInterceptor.java:63)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.monitor(InvocationInterceptor.java:49)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor_Bean.intercept(InvocationInterceptor_Bean.zig:521)
at io.quarkus.arc.impl.InterceptorInvocation.invoke(InterceptorInvocation.java:41)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.perform(AroundInvokeInvocationContext.java:41)
at io.quarkus.arc.impl.InvocationContexts.performAroundInvoke(InvocationContexts.java:32)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository_Subclass.listAll(UserRepository_Subclass.zig:4873)
at com.fortuneapp.backend.application.domain.adapters.databases.panache.UserRepository_ClientProxy.listAll(UserRepository_ClientProxy.zig:1353)
at com.fortuneapp.backend.application.rest.UsersResource.getAllUsers(UserResource.kt:24)
at com.fortuneapp.backend.application.rest.UsersResource_Subclass.getAllUsers$$superaccessor2(UsersResource_Subclass.zig:354)
at com.fortuneapp.backend.application.rest.UsersResource_Subclass$$function$$2.apply(UsersResource_Subclass$$function$$2.zig:29)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:54)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.proceed(InvocationInterceptor.java:63)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor.monitor(InvocationInterceptor.java:49)
at io.quarkus.arc.runtime.devconsole.InvocationInterceptor_Bean.intercept(InvocationInterceptor_Bean.zig:521)
at io.quarkus.arc.impl.InterceptorInvocation.invoke(InterceptorInvocation.java:41)
at io.quarkus.arc.impl.AroundInvokeInvocationContext.perform(AroundInvokeInvocationContext.java:41)
at io.quarkus.arc.impl.InvocationContexts.performAroundInvoke(InvocationContexts.java:32)
at com.fortuneapp.backend.application.rest.UsersResource_Subclass.getAllUsers(UsersResource_Subclass.zig:312)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.jboss.resteasy.core.MethodInjectorImpl.invoke(MethodInjectorImpl.java:170)
at org.jboss.resteasy.core.MethodInjectorImpl.invoke(MethodInjectorImpl.java:130)
at org.jboss.resteasy.core.ResourceMethodInvoker.internalInvokeOnTarget(ResourceMethodInvoker.java:660)
at org.jboss.resteasy.core.ResourceMethodInvoker.invokeOnTargetAfterFilter(ResourceMethodInvoker.java:524)
at org.jboss.resteasy.core.ResourceMethodInvoker.lambda$invokeOnTarget$2(ResourceMethodInvoker.java:474)
at org.jboss.resteasy.core.interception.jaxrs.PreMatchContainerRequestContext.filter(PreMatchContainerRequestContext.java:364)
at org.jboss.resteasy.core.ResourceMethodInvoker.invokeOnTarget(ResourceMethodInvoker.java:476)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:434)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:408)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:69)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:492)
at org.jboss.resteasy.core.SynchronousDispatcher.lambda$invoke$4(SynchronousDispatcher.java:261)
at org.jboss.resteasy.core.SynchronousDispatcher.lambda$preprocess$0(SynchronousDispatcher.java:161)
at org.jboss.resteasy.core.interception.jaxrs.PreMatchContainerRequestContext.filter(PreMatchContainerRequestContext.java:364)
at org.jboss.resteasy.core.SynchronousDispatcher.preprocess(SynchronousDispatcher.java:164)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:247)
at io.quarkus.resteasy.runtime.standalone.RequestDispatcher.service(RequestDispatcher.java:73)
at io.quarkus.resteasy.runtime.standalone.VertxRequestHandler.dispatch(VertxRequestHandler.java:138)
at io.quarkus.resteasy.runtime.standalone.VertxRequestHandler.access$000(VertxRequestHandler.java:41)
at io.quarkus.resteasy.runtime.standalone.VertxRequestHandler$1.run(VertxRequestHandler.java:93)
at io.quarkus.runtime.CleanableExecutor$CleaningRunnable.run(CleanableExecutor.java:231)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.jboss.threads.EnhancedQueueExecutor$Task.run(EnhancedQueueExecutor.java:2415)
at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1452)
at org.jboss.threads.DelegatingRunnable.run(DelegatingRunnable.java:29)
at org.jboss.threads.ThreadLocalResettingRunnable.run(ThreadLocalResettingRunnable.java:29)
at java.base/java.lang.Thread.run(Thread.java:834)
at org.jboss.threads.JBossThread.run(JBossThread.java:501)
Resulted in: org.hibernate.exception.SQLGrammarException: could not prepare statement
at org.hibernate.exception.internal.SQLStateConversionDelegate.convert(SQLStateConversionDelegate.java:103)
at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:42)
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:113)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$StatementPreparationTemplate.prepareStatement(StatementPreparerImpl.java:186)
... 78 more
Resulted in: javax.persistence.PersistenceException: org.hibernate.exception.SQLGrammarException: could not prepare statement
at org.hibernate.internal.ExceptionConverterImpl.convert(ExceptionConverterImpl.java:154)
at org.hibernate.query.internal.AbstractProducedQuery.list(AbstractProducedQuery.java:1602)
... 62 more
Resulted in: org.jboss.resteasy.spi.UnhandledException: javax.persistence.PersistenceException: org.hibernate.exception.SQLGrammarException: could not prepare statement
at org.jboss.resteasy.core.ExceptionHandler.handleApplicationException(ExceptionHandler.java:106)
at org.jboss.resteasy.core.ExceptionHandler.handleException(ExceptionHandler.java:372)
at org.jboss.resteasy.core.SynchronousDispatcher.writeException(SynchronousDispatcher.java:218)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:519)
... 18 more
I've set up my local H2 database connection by adding this to my application.yaml file:
quarkus:
  datasource:
    db-kind: h2
    username: sa
    jdbc:
      url: "jdbc:h2:mem:default"
  flyway:
    migrate-at-start: true
Furthermore, I'm using https://quarkus.io/guides/hibernate-orm-panache, which is quite easy to use. I've created a User entity and a User repository in my project. I then use those in my User resource, where I define the API.
#Path("/users")
class UsersResource {
#Inject
lateinit var userRepository: UserRepository
#GET
#Produces(MediaType.APPLICATION_JSON)
fun getAllUsers() : GetAllUsersResponse =
try {
GetAllUsersSuccess(userRepository.listAll())
} catch (e: NotFoundException) {
GetAllUsersFailure(e)
}
}
I've already implemented Flyway, for my database migration. This seems to go well. I'm also adding a row to the table when the migration runs, and I can see that "1 row is affected", so I think Flyway has access to the H2 database.
Do any of you guys know what I'm missing?
Kind regards
I found the problem. I defined my column names in snake case, while my entity properties were defined in camel case.
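One way to reconcile the two without renaming the database columns is to map each property to its snake_case column explicitly with @Column. A minimal sketch in Kotlin; the property and column names are assumptions based on the generated SQL above:
import java.time.LocalDateTime
import javax.persistence.Column
import javax.persistence.Entity
import javax.persistence.Id

@Entity
class User {

    @Id
    var id: Long? = null

    // maps the camelCase property to the snake_case column created by the migration
    @Column(name = "created_at")
    var createdAt: LocalDateTime? = null

    @Column(name = "updated_at")
    var updatedAt: LocalDateTime? = null

    @Column(name = "full_name")
    var fullName: String? = null

    var email: String? = null
}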
I am trying to start a server using the ignite.sh script and am getting the error below (failed to load HadoopV2Job). "config/default-config.xml" is being passed to CommandLineStartup and hasn't been changed.
Has anyone come across this issue, or does anyone know how to fix it?
My Ignite version is 1.4.0 and here is the full stack trace:
class org.apache.ignite.IgniteException: Failed to start processor: HadoopProcessor [idCtr=0]
at org.apache.ignite.internal.util.IgniteUtils.convertException(IgniteUtils.java:881)
at org.apache.ignite.Ignition.start(Ignition.java:349)
at org.apache.ignite.startup.cmdline.CommandLineStartup.main(CommandLineStartup.java:302)
Caused by: class org.apache.ignite.IgniteCheckedException: Failed to start processor: HadoopProcessor [idCtr=0]
at org.apache.ignite.internal.IgniteKernal.startProcessor(IgniteKernal.java:1504)
at org.apache.ignite.internal.IgniteKernal.start(IgniteKernal.java:888)
at org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start0(IgnitionEx.java:1617)
at org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start(IgnitionEx.java:1484)
at org.apache.ignite.internal.IgnitionEx.start0(IgnitionEx.java:965)
at org.apache.ignite.internal.IgnitionEx.startConfigurations(IgnitionEx.java:892)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:784)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:705)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:576)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:546)
at org.apache.ignite.Ignition.start(Ignition.java:346)
... 1 more
Caused by: class org.apache.ignite.IgniteCheckedException: Failed to load job class [class=org.apache.ignite.internal.processors.hadoop.v2.HadoopV2Job]
at org.apache.ignite.internal.processors.hadoop.jobtracker.HadoopJobTracker.start(HadoopJobTracker.java:167)
at org.apache.ignite.internal.processors.hadoop.HadoopProcessor.start(HadoopProcessor.java:103)
at org.apache.ignite.internal.IgniteKernal.startProcessor(IgniteKernal.java:1501)
... 11 more
Caused by: java.lang.IllegalArgumentException
at org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.apache.ignite.internal.processors.hadoop.HadoopClassLoader.hasExternalDependencies(HadoopClassLoader.java:288)
at org.apache.ignite.internal.processors.hadoop.HadoopClassLoader.loadClass(HadoopClassLoader.java:162)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.ignite.internal.processors.hadoop.jobtracker.HadoopJobTracker.start(HadoopJobTracker.java:163)
... 13 more
Shane, did you compile the Ignite code with JDK 8?
It looks like the org.objectweb.asm library failed to parse the bytecode of class "org.apache.ignite.internal.processors.hadoop.v2.HadoopV2Job" because the bytecode version is higher than 1.7:
/**
 * Constructs a new {@link ClassReader} object.
 *
 * @param b
 *            the bytecode of the class to be read.
 * @param off
 *            the start offset of the class data.
 * @param len
 *            the length of the class data.
 */
public ClassReader(final byte[] b, final int off, final int len) {
    this.b = b;
    // checks the class version
    if (readShort(off + 6) > Opcodes.V1_7) {
        throw new IllegalArgumentException();
    }
Please try to build Ignite with JDK 1.7, or specify target level 1.7 when building with JDK 8. Does that solve the problem?
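If the build uses Maven, pinning the compiler level could look roughly like this (a sketch, assuming the standard maven-compiler-plugin):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
        <!-- compile to 1.7 bytecode even when building with JDK 8 -->
        <source>1.7</source>
        <target>1.7</target>
    </configuration>
</plugin>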
Running Spark code in IntelliJ IDEA is painful as a new Spark/IntelliJ user.
I googled many pages, but didn't find a solution to this.
The code is very simple, as shown below.
I'm getting a run-time error on this line:
val conf = new SparkConf().setAppName("Spark Pi")
import org.apache.spark.SparkConf
import scala.math.random
object HelloWorld {
  def main(args: Array[String]) {
    println("Hello World")
    val conf = new SparkConf().setAppName("Spark Pi")
    // val spark = new SparkContext(conf)
    // val slices = if (args.length > 0) args(0).toInt else 3
    // val n = 100000 * slices
    // val count = spark.parallelize(1 to n, slices).map { i =>
    //   val x = random * 2 - 1
    //   val y = random * 2 - 1
    //   if (x*x + y*y < 1) 1 else 0
    // }.reduce(_ + _)
    // println("Pi is roughly " + 4.0 * count / n)
    // val pi = 4.0 * count / n
    // val ppi = spark.parallelize(Seq(pi))
    // ppi.saveAsTextFile("/tmp/bryan/spark/output.pi")
    // spark.stop()
  }
}
The error message is :
"C:\Program Files\Java\jdk8\bin\java" -Didea.launcher.port=7534 "-Didea.launcher.bin.path=C:\Program Files (x86)\JetBrains\IntelliJ IDEA Community Edition 14.1.5\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk8\jre\lib\charsets.jar;C:\Program Files\Java\jdk8\jre\lib\deploy.jar;C:\Program Files\Java\jdk8\jre\lib\javaws.jar;C:\Program Files\Java\jdk8\jre\lib\jce.jar;C:\Program Files\Java\jdk8\jre\lib\jfr.jar;C:\Program Files\Java\jdk8\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk8\jre\lib\jsse.jar;C:\Program Files\Java\jdk8\jre\lib\management-agent.jar;C:\Program Files\Java\jdk8\jre\lib\plugin.jar;C:\Program Files\Java\jdk8\jre\lib\resources.jar;C:\Program Files\Java\jdk8\jre\lib\rt.jar;C:\Program Files\Java\jdk8\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk8\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk8\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk8\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk8\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk8\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk8\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk8\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk8\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk8\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk8\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk8\jre\lib\ext\zipfs.jar;D:\code\spark-cdh\analysis-jobs\example\target\scala-2.10\classes;C:\Users\spark39\.ivy2\cache\org.scala-lang\scala-compiler\jars\scala-compiler-2.10.0.jar;C:\Users\spark39\.sbt\boot\scala-2.10.4\lib\scala-library.jar;C:\Program Files (x86)\JetBrains\IntelliJ IDEA Community Edition 14.1.5\lib\idea_rt.jar" com.intellij.rt.execution.application.AppMain com.spark.example.test.HelloWorld
Hello World
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
at com.samsungaustin.yac.spark.example.test.HelloWorld$.main(Test.scala:15)
at com.samsungaustin.yac.spark.example.test.HelloWorld.main(Test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
Someone mentioned adding the classpath through the 'Edit Configurations' menu, but can anybody tell me exactly how to do this?
Thanks.
Starting with a new technology can often be a bit painful, but it might lead to valuable new knowledge and experience. Do you use a Maven pom.xml file for building your project? I would advise you to use Maven or something similar to keep track of which libraries your code needs.
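As a sketch, the Spark dependency in a pom.xml could look like the following; the Scala suffix and version are assumptions and must match your project (the classpath above suggests Scala 2.10):
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <!-- version is an assumption; use the version of your cluster/distribution -->
    <version>1.5.2</version>
</dependency>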
You can also try one of the existing examples that are available for Spark on the web site: http://spark.apache.org/examples.html. There is also a GitHub repository with even more examples: https://github.com/apache/spark/tree/master/examples. (This repository already has a Maven pom.xml file: https://github.com/apache/spark/blob/master/examples/pom.xml.)
The following page contains (hopefully) Useful Developer Tools: https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools
There are also several tutorials that use Spark, Scala, and IntelliJ IDEA:
https://medium.com/large-scale-data-processing/how-to-kick-start-spark-development-on-intellij-idea-in-4-steps-c7c8f5c2fe63#.y0uxk7gay
https://docs.sigmoidanalytics.com/index.php/Step_by_Step_instructions_on_how_to_build_Spark_App_with_IntelliJ_IDEA
I am trying to load an embedded HSQLDB with data using the SpringLiquibase integration.
Java config:
@Configuration
@Profile("testing")
...

@Bean
public DataSource dataSource() {
    DataSource ds = new EmbeddedDatabaseBuilder()
            .setType(EmbeddedDatabaseType.HSQL)
            .setName("junittest")
            .build();
    return ds;
}

@Bean
SpringLiquibase liquibase() {
    SpringLiquibase lb = new SpringLiquibase();
    lb.setDataSource(dataSource());
    lb.setChangeLog("classpath:liquibase/example.xml");
    lb.setContexts("testing");
    return lb;
}
...
Liquibase schema (example.sql):
create table xyz(
id integer,
name varchar(10)
);
But I get the following exception:
Caused by: liquibase.exception.LockException: liquibase.exception.UnexpectedLiquibaseException: liquibase.snapshot.InvalidExampleException: Found multiple catalog/schemas matching null.PUBLIC
at liquibase.lockservice.StandardLockService.acquireLock(StandardLockService.java:211)
at liquibase.lockservice.StandardLockService.waitForLock(StandardLockService.java:151)
at liquibase.Liquibase.update(Liquibase.java:182)
at liquibase.Liquibase.update(Liquibase.java:174)
at liquibase.integration.spring.SpringLiquibase.performUpdate(SpringLiquibase.java:345)
at liquibase.integration.spring.SpringLiquibase.afterPropertiesSet(SpringLiquibase.java:302)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1571)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1509)
... 40 more
Caused by: liquibase.exception.UnexpectedLiquibaseException: liquibase.snapshot.InvalidExampleException: Found multiple catalog/schemas matching null.PUBLIC
at liquibase.snapshot.SnapshotGeneratorFactory.hasDatabaseChangeLogLockTable(SnapshotGeneratorFactory.java:168)
at liquibase.lockservice.StandardLockService.hasDatabaseChangeLogLockTable(StandardLockService.java:138)
at liquibase.lockservice.StandardLockService.init(StandardLockService.java:85)
at liquibase.lockservice.StandardLockService.acquireLock(StandardLockService.java:185)
... 47 more
Caused by: liquibase.snapshot.InvalidExampleException: Found multiple catalog/schemas matching null.PUBLIC
at liquibase.snapshot.jvm.SchemaSnapshotGenerator.snapshotObject(SchemaSnapshotGenerator.java:74)
at liquibase.snapshot.jvm.JdbcSnapshotGenerator.snapshot(JdbcSnapshotGenerator.java:59)
at liquibase.snapshot.SnapshotGeneratorChain.snapshot(SnapshotGeneratorChain.java:50)
at liquibase.snapshot.jvm.JdbcSnapshotGenerator.snapshot(JdbcSnapshotGenerator.java:62)
at liquibase.snapshot.SnapshotGeneratorChain.snapshot(SnapshotGeneratorChain.java:50)
at liquibase.snapshot.jvm.JdbcSnapshotGenerator.snapshot(JdbcSnapshotGenerator.java:62)
at liquibase.snapshot.SnapshotGeneratorChain.snapshot(SnapshotGeneratorChain.java:50)
at liquibase.snapshot.jvm.JdbcSnapshotGenerator.snapshot(JdbcSnapshotGenerator.java:62)
at liquibase.snapshot.SnapshotGeneratorChain.snapshot(SnapshotGeneratorChain.java:50)
at liquibase.snapshot.DatabaseSnapshot.include(DatabaseSnapshot.java:153)
at liquibase.snapshot.DatabaseSnapshot.init(DatabaseSnapshot.java:56)
at liquibase.snapshot.DatabaseSnapshot.<init>(DatabaseSnapshot.java:33)
at liquibase.snapshot.JdbcDatabaseSnapshot.<init>(JdbcDatabaseSnapshot.java:22)
at liquibase.snapshot.SnapshotGeneratorFactory.createSnapshot(SnapshotGeneratorFactory.java:126)
at liquibase.snapshot.SnapshotGeneratorFactory.createSnapshot(SnapshotGeneratorFactory.java:119)
at liquibase.snapshot.SnapshotGeneratorFactory.createSnapshot(SnapshotGeneratorFactory.java:107)
at liquibase.snapshot.SnapshotGeneratorFactory.has(SnapshotGeneratorFactory.java:97)
at liquibase.snapshot.SnapshotGeneratorFactory.hasDatabaseChangeLogLockTable(SnapshotGeneratorFactory.java:166)
... 50 more
Versions of HSQLDB / Liquibase: 1.8.0.10 / 3.2.3
Am I missing some config here?