Endeca deployment - initialize_services.bat command failed

I get the following error when I run the initialize_services.bat command from cmd:
SEVERE: Caught an exception while invoking method 'run' on object 'InitialSetup'. Releasing locks.
Caused by java.lang.reflect.InvocationTargetException
sun.reflect.NativeMethodAccessorImpl invoke0 - null
Caused by com.endeca.soleng.eac.toolkit.exception.AppControlException
com.endeca.soleng.eac.toolkit.script.Script runBeanShellScript - Unknown error executing a BeanShell script.
Caused by bsh.EvalError
bsh.BSHMethodInvocation eval - Sourced file: inline evaluation of: `` IFCR.provisionSite(); CAS.importDimensionValueIdMappings("Disco . . . '' : Error in method invocation: Method importDimensionValueIdMappings( java.lang.String, java.lang.String ) not found in class'com.endeca.soleng.eac.toolkit.component.CustomComponent'
Failure to initialize EAC application.

According to this, adding casStubs.jar from CAS_ROOT\lib\cas-dt\ to the CLASSPATH in the runcommand script solves the problem:
set CLASSPATH=%CLASSPATH%;C:\Endeca\CAS\11.1.0\lib\cas-dt\casStubs.jar
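For context, a minimal sketch of where that line sits in <app>\control\runcommand.bat (the surrounding lines are only illustrative placeholders; the actual script generated by the Deployment Template varies by version, and the CAS path must match your install):

rem ... existing environment setup in runcommand.bat ...

rem Add the CAS stubs jar so the EAC toolkit can resolve
rem CAS.importDimensionValueIdMappings(...) called from the BeanShell scripts
set CLASSPATH=%CLASSPATH%;C:\Endeca\CAS\11.1.0\lib\cas-dt\casStubs.jar

rem ... existing java invocation that uses %CLASSPATH% ...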

Related

spark.read.parquet not working in Colab

Py4JJavaError: An error occurred while calling o188.parquet.
: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.S3AFileSystem not found
I tried adding the missing hadoop-aws jar to the classpath using spark-submit, but was unable to. This is what I tried:
!spark-submit --jars /content/hadoop-aws-2.7.1.jar
Exception in thread "main" java.lang.IllegalArgumentException: Missing application resource.
os.environ['PYSPARK_SUBMIT_ARGS'] = "--packages=org.apache.hadoop:hadoop-aws:2.7.3 pyspark-shell"
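Setting PYSPARK_SUBMIT_ARGS before the SparkSession is created (as in the last line above) is the usual workaround. A minimal sketch for a Colab cell, with the bucket name and credentials as placeholders (the hadoop-aws version should match the Hadoop build bundled with your Spark install):

import os

# Must be set before the JVM starts so that --packages takes effect
os.environ['PYSPARK_SUBMIT_ARGS'] = '--packages org.apache.hadoop:hadoop-aws:2.7.3 pyspark-shell'

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('s3a-parquet').getOrCreate()

# Placeholder credentials -- replace with your own
hadoop_conf = spark._jsc.hadoopConfiguration()
hadoop_conf.set('fs.s3a.access.key', '<AWS_ACCESS_KEY>')
hadoop_conf.set('fs.s3a.secret.key', '<AWS_SECRET_KEY>')
hadoop_conf.set('fs.s3a.impl', 'org.apache.hadoop.fs.s3a.S3AFileSystem')

df = spark.read.parquet('s3a://<your-bucket>/path/to/data.parquet')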

Not able to start Datastax Cassandra due to missing SystemInfo class

I have installed DataStax Cassandra 5.0.4 on my Mac, but when I try to start it, it gives an error about the SystemInfo class.
Here is the related log:
Exception (com.google.inject.ProvisionException) encountered during startup: Unable to provision, see the following errors:
1) Error injecting constructor, java.lang.NoClassDefFoundError: com/datastax/hadoop/hive/util/SystemInfo
at com.datastax.bdp.plugin.HiveMetaStorePlugin.<init>(HiveMetaStorePlugin.java:39)
at com.datastax.bdp.DseHiveModule.configure(Unknown Source) (via modules: com.datastax.bdp.DseModule -> com.datastax.bdp.DseAnalyticsModule -> com.datastax.bdp.DseHiveModule)
while locating com.datastax.bdp.plugin.HiveMetaStorePlugin
1 error
com.google.inject.ProvisionException: Unable to provision, see the following errors:
1) Error injecting constructor, java.lang.NoClassDefFoundError: com/datastax/hadoop/hive/util/SystemInfo
at com.datastax.bdp.plugin.HiveMetaStorePlugin.<init>(HiveMetaStorePlugin.java:39)
at com.datastax.bdp.DseHiveModule.configure(Unknown Source) (via modules: com.datastax.bdp.DseModule -> com.datastax.bdp.DseAnalyticsModule -> com.datastax.bdp.DseHiveModule)
while locating com.datastax.bdp.plugin.HiveMetaStorePlugin
1 error
at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1025)
at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1051)
at com.datastax.bdp.plugin.PluginManager.register(PluginManager.java:139)
at com.datastax.bdp.plugin.PluginManager.postSetup(PluginManager.java:68)
at com.datastax.bdp.server.DseDaemon.postSetup(DseDaemon.java:824)
at com.datastax.bdp.server.DseDaemon.setup(DseDaemon.java:447)
at org.apache.cassandra.service.CassandraDaemon.activate(CassandraDaemon.java:568)
at com.datastax.bdp.DseModule.main(DseModule.java:91)
Caused by: java.lang.NoClassDefFoundError: com/datastax/hadoop/hive/util/SystemInfo
at com.datastax.bdp.plugin.HiveMetaStorePlugin.<init>(HiveMetaStorePlugin.java:40)
at com.datastax.bdp.plugin.HiveMetaStorePlugin$$FastClassByGuice$$7e2893c3.newInstance(<generated>)
at com.google.inject.internal.cglib.reflect.$FastConstructor.newInstance(FastConstructor.java:40)

Appfuse Tutorial troubles - BeanCreationException

I'm working through the Appfuse tutorial for version 3.5 (as previously advised by Matt Raible in response to another question).
I'm on this step: http://appfuse.org/display/APF/Using+Spring+MVC#UsingSpringMVC-listview
But when I build (mvn package), I get the following error:
Running gov.nysed.archives.Nimbus.dao.PersonDaoTest
Running gov.nysed.archives.Nimbus.webapp.controller.UpdatePasswordControllerTest
WARN [main] GenericApplicationContext.refresh(487) | Exception encountered during context initialization - cancelling refresh attempt org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'userManager': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire method: public void gov.nysed.archives.Nimbus.service.impl.UserManagerImpl.setMailMessage(org.springframework.mail.SimpleMailMessage); nested exception is java.lang.NoClassDefFoundError: [Lorg/hibernate/engine/FilterDefinition;
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:334) ~[spring-beans-4.1.3.RELEASE.jar:4.1.3.RELEASE] ...
Could my adding the "persons" section (entity, controller, etc.) have caused this problem? It passed the tests before I added the new entity... so presumably yes, but how?
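One thing worth checking: [Lorg/hibernate/engine/FilterDefinition; points at the Hibernate jars on the test classpath rather than at the new entity, and FilterDefinition moved from org.hibernate.engine (Hibernate 3) to org.hibernate.engine.spi (Hibernate 4), so mixed Hibernate versions are a common cause of this error. A quick way to see which Hibernate artifacts Maven actually resolves:

mvn dependency:tree -Dincludes=org.hibernate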

Oozie hive action fails

I am creating an Oozie workflow for a Hive CREATE TABLE command.
I have added hive-site.xml to the HDFS location.
I am getting the error below:
Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.HiveMain], main() threw exception, com/facebook/fb303/FacebookService$Iface
java.lang.NoClassDefFoundError: com/facebook/fb303/FacebookService$Iface
at java.lang.ClassLoader.defineClass1(Native Method)
This might be because the Thrift jar is missing or there is a version mismatch.
Refer to the following:
Error while executing program with Hive JDBC
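FacebookService$Iface comes from the Thrift fb303 library used by the Hive metastore client, so the usual fix is to make the Hive jars visible to the Oozie launcher. A minimal sketch, with paths and jar versions as placeholders:

# job.properties: let the action pick up the Hive jars from the Oozie sharelib
oozie.use.system.libpath=true

# Alternatively, copy the missing jars into the workflow's lib/ directory in HDFS
hdfs dfs -mkdir -p /user/<you>/hive-wf/lib
hdfs dfs -put $HIVE_HOME/lib/libfb303-*.jar /user/<you>/hive-wf/lib/
hdfs dfs -put $HIVE_HOME/lib/libthrift-*.jar /user/<you>/hive-wf/lib/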

Grails can't use Spock framework?

I am currently working on a project that started with no TDD in place. I have now taken the following steps to start using Spock for unit and integration testing on this project:
I added the following to BuildConfig.groovy:
test ":spock:0.7"
Then I created a spec at "test/unit/MYCLASSNAME" called MyfunctionControllerSpec, as shown below:
import grails.test.mixin.*
import spock.lang.Specification

// @TestFor provides the 'controller' reference used in the feature method
@TestFor(MyfunctionController)
class MyfunctionControllerSpec extends Specification {

    void "list() should return no results with no records in DB"() {
        given:
        def model = controller.list()

        expect:
        model.taskInstanceList.size() == 0
        model.taskInstanceTotal == 0
    }
}
However, I am getting the following error on the Specification import line:
Groovy:unable to resolve class spock.lang.Specification
I don't understand what I am doing wrong. Have I imported or installed Spock incorrectly?
Thanks in advance
EDIT:
I have tried the suggestion below, but the solution won't run and it still doesn't recognise the Specification class. Even when I start typing "import spo" and press Ctrl+Space, nothing comes up, as if it can't recognise the plugin either:
Loading Grails 2.1.0
| Configuring classpath
| Downloading: spock-grails-support-0.7-groovy-2.0.pom.sha1
| Downloading: spock-core-0.7-groovy-2.0.pom.sha1
| Downloading: spock-grails-support-0.7-groovy-2.0.jar.sha1
| Downloading: spock-core-0.7-groovy-2.0.jar.sha1.
| Environment set to development....
| Error Error loading event script from file [/media/system/workspace/sms_bskyb_New_V2(Dynam Messages)/plugins/tool-ui/scripts/_Events.groovy] startup failed:
Could not instantiate global transform class org.spockframework.compiler.SpockTransform specified at jar:file:/home/system/.grails/ivy-cache/org.spockframework/spock-core/jars/spock-core-0.7-groovy-2.0.jar!/META-INF/services/org.codehaus.groovy.transform.ASTTransformation because of exception java.lang.reflect.InvocationTargetException
1 error
(Use --stacktrace to see the full trace)
| Error Error loading event script from file [/home/system/.grails/2.1.0/projects/sms_bskyb/plugins/database-migration-1.1/scripts/_Events.groovy] startup failed:
Could not instantiate global transform class org.spockframework.compiler.SpockTransform specified at jar:file:/home/system/.grails/ivy-cache/org.spockframework/spock-core/jars/spock-core-0.7-groovy-2.0.jar!/META-INF/services/org.codehaus.groovy.transform.ASTTransformation because of exception java.lang.reflect.InvocationTargetException
1 error
(Use --stacktrace to see the full trace)
| Error Error loading event script from file [/home/system/.grails/2.1.0/projects/sms_bskyb/plugins/tomcat-2.1.0/scripts/_Events.groovy] startup failed:
Could not instantiate global transform class org.spockframework.compiler.SpockTransform specified at jar:file:/home/system/.grails/ivy-cache/org.spockframework/spock-core/jars/spock-core-0.7-groovy-2.0.jar!/META-INF/services/org.codehaus.groovy.transform.ASTTransformation because of exception java.lang.reflect.InvocationTargetException
1 error
(Use --stacktrace to see the full trace)
| Error Error loading event script from file [/home/system/.grails/2.1.0/projects/sms_bskyb/plugins/spock-0.7/scripts/_Events.groovy] startup failed:
Could not instantiate global transform class org.spockframework.compiler.SpockTransform specified at jar:file:/home/system/.grails/ivy-cache/org.spockframework/spock-core/jars/spock-core-0.7-groovy-2.0.jar!/META-INF/services/org.codehaus.groovy.transform.ASTTransformation because of exception java.lang.reflect.InvocationTargetException
1 error
(Use --stacktrace to see the full trace)
| Error Error loading event script from file [/home/system/.grails/2.1.0/projects/sms_bskyb/plugins/webxml-1.4.1/scripts/_Events.groovy] startup failed:
Could not instantiate global transform class org.spockframework.compiler.SpockTransform specified at jar:file:/home/system/.grails/ivy-cache/org.spockframework/spock-core/jars/spock-core-0.7-groovy-2.0.jar!/META-INF/services/org.codehaus.groovy.transform.ASTTransformation because of exception java.lang.reflect.InvocationTargetException
1 error
(Use --stacktrace to see the full trace)
| Environment set to development.....
| Packaging Grails application.
| Error Fatal error during compilation org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
Could not instantiate global transform class org.spockframework.compiler.SpockTransform specified at jar:file:/home/system/.grails/ivy-cache/org.spockframework/spock-core/jars/spock-core-0.7-groovy-2.0.jar!/META-INF/services/org.codehaus.groovy.transform.ASTTransformation because of exception java.lang.reflect.InvocationTargetException
1 error
(Use --stacktrace to see the full trace)
You are using Grails 2.1.0, whereas Groovy 2.0 was only introduced in Grails 2.2.0 and above, so the -groovy-2.0 builds of Spock cannot load. You should not need the explicit dependency org.spockframework:spock-grails-support:0.7-groovy-2.0; use only the plugin, as below:
plugins {
    test ":spock:0.7"
}
If you still see the issue, isolate the problem by creating a fresh, bare-bones Grails app and installing the plugin as described in the plugin docs, and check for a classpath clash. If the problem persists, clear the ivy-cache and/or .m2 and retry.
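A minimal sketch of the Spock-related part of BuildConfig.groovy under that advice (only these entries are shown; the rest of your resolution block stays as it is):

grails.project.dependency.resolution = {
    // ...

    dependencies {
        // Do NOT add org.spockframework:spock-grails-support:0.7-groovy-2.0 here:
        // the -groovy-2.0 builds of Spock require Groovy 2.0, which ships with
        // Grails 2.2.0+, not with Grails 2.1.0.
    }

    plugins {
        // The plugin alone is enough on Grails 2.1.x
        test ":spock:0.7"
    }
}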
I almost answered this here, but I see that you are using Grails 2.1.
For others getting this error with Grails 2.2.x, see this answer.