Ignite configuration with absolute path - ignite

I downloaded Ignite 2.5.0 (I use Maven dependencies in Eclipse on a Mac for my Java class) and tried to start Ignite with a configuration file given as an absolute path:
public static void main(String[] args) throws Exception {
    try (Ignite ignite = Ignition.start("/Users/ahajnal/Documents/git/ignite/target/classes/default-config.xml")) {}
}
but I got this exception:
Exception in thread "main" class org.apache.ignite.IgniteException: Failed to find configuration in: file:/Users/ahajnal/Documents/git/ignite/target/classes/default-config.xml
at org.apache.ignite.internal.util.IgniteUtils.convertException(IgniteUtils.java:990)
at org.apache.ignite.Ignition.start(Ignition.java:355)
at hu.sztaki.lpds.ml.ignite.WekaIgnite.main(WekaIgnite.java:82)
Caused by: class org.apache.ignite.IgniteCheckedException: Failed to find configuration in: file:/Users/ahajnal/Documents/git/ignite/target/classes/default-config.xml
at org.apache.ignite.internal.util.spring.IgniteSpringHelperImpl.loadConfigurations(IgniteSpringHelperImpl.java:116)
at org.apache.ignite.internal.util.spring.IgniteSpringHelperImpl.loadConfigurations(IgniteSpringHelperImpl.java:98)
at org.apache.ignite.internal.IgnitionEx.loadConfigurations(IgnitionEx.java:744)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:945)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:854)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:724)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:693)
at org.apache.ignite.Ignition.start(Ignition.java:352)
... 1 more
The config file is there:
$ cat /Users/ahajnal/Documents/git/ignite/target/classes/default-config.xml
<?xml version="1.0" encoding="UTF-8"?>...
and:
new File("/Users/ahajnal/Documents/git/ignite/target/classes/default-config.xml").exists() is true
According to the docs, this path can be absolute.
What am I doing wrong?
Thank you.

I think the problem is that the default-config.xml file contains only an abstract IgniteConfiguration bean. This is the case in the default configuration file shipped with the examples.
Check whether the configuration bean's definition has the abstract="true" attribute, and remove it if it does; see the sketch below.
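For example (a sketch; the ignite.cfg bean id is an assumption based on the example configs and may differ in your file), the definition should look like this once abstract="true" is removed:
<!-- no abstract="true", so Spring actually instantiates the bean -->
<bean id="ignite.cfg" class="org.apache.ignite.configuration.IgniteConfiguration">
    <!-- your discovery SPI, cache configurations, etc. -->
</bean>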
P.S.
Creating Ignite as a resource of a try-with-resources block is a pretty bad idea, since the node will stop as soon as the try block finishes executing.
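If you want the node to keep running, a minimal sketch (reusing the config path from the question) is to start it outside try-with-resources and close it explicitly only when you are done:
// start the node and keep it alive for the lifetime of the application
Ignite ignite = Ignition.start("/Users/ahajnal/Documents/git/ignite/target/classes/default-config.xml");
// ... run compute jobs, use caches, etc. ...
// stop the node only when the work is finished
ignite.close();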

Related

How to use MicroProfile ConfigProperty injection from a microprofile-config.properties file in Open Liberty test using ShrinkWrap + Arquillian?

Problem
I added a microprofile-config.properties file to the Liberty "Testing microservices with the Arquillian managed container" guide sample, but my microprofile-config.properties isn't picked up by my test.
Symptom
> Exception : io.smallrye.config.inject.ConfigException: SRCFG02000:
> Failed to Inject @ConfigProperty for key serviceName into
> io.openliberty.guides.system.AppConfig.serviceName since the config
> property could not be found in any config source at
> io.smallrye.config.inject.ConfigExtension.validate(ConfigExtension.java:183)
> at
> io.openliberty.microprofile.config.internal.extension.OLSmallRyeConfigExtension.validate(OLSmallRyeConfigExtension.java:65)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> ...
Starting point
git clone https://github.com/openliberty/guide-arquillian-managed.git; cd finish
microprofile-config.properties
Path: src/main/resources/META-INF/microprofile-config.properties
serviceName=myService
Bean to inject into: AppConfig.java
@ApplicationScoped
public class AppConfig {
    @Inject @ConfigProperty(name="serviceName")
    private String serviceName;
    ...
}
Liberty server config (server.xml)
<featureManager>
<feature>restfulWS-3.0</feature>
<feature>jsonb-2.0</feature>
<feature>jsonp-2.0</feature>
<feature>cdi-3.0</feature>
<feature>mpConfig-3.0</feature>
<!--Enable the following features to run tests with Arquillian managed container-->
<feature>localConnector-1.0</feature>
<feature>servlet-5.0</feature>
</featureManager>
You need to explicitly package the microprofile-config.properties file into the ShrinkWrap archive, like:
.addAsManifestResource(new File("src/main/resources/META-INF", "microprofile-config.properties"))
More completely, in the context of this sample, it would look like:
WebArchive archive = ShrinkWrap.create(WebArchive.class, WARNAME)
.addAsManifestResource(new File("src/main/resources/META-INF", "microprofile-config.properties"))
.addPackages(true, "io.openliberty.guides.system");
Explanation:
Since the sample uses ShrinkWrap to package the test deployment, microprofile-config.properties must be added to the ShrinkWrap archive programmatically. It does not become part of the package merely by being in src/main/resources, the way it would in a standard Maven WAR built by the maven-war-plugin.
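Putting it together, a sketch of the test's deployment method might look like the following (WARNAME comes from the snippet above; the method name and the usual Arquillian/ShrinkWrap imports are assumptions based on the guide):
// hypothetical deployment method for the Arquillian test class
@Deployment(testable = true)
public static WebArchive createDeployment() {
    return ShrinkWrap.create(WebArchive.class, WARNAME)
            // explicitly package the MicroProfile Config file into the test archive
            .addAsManifestResource(new File("src/main/resources/META-INF", "microprofile-config.properties"))
            .addPackages(true, "io.openliberty.guides.system");
}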

Struts2 file upload - No mapping found for dependency

I'm using Struts 2.3.20.1 with Commons File Upload 1.3.1 and Commons IO 2.4 to upload a (CSV) file. When I try to upload, this error appears in the server log:
ERROR [io.undertow.request] (default task-24) UT005023: Exception
handling request to /private/createDatasetFromCSV:
java.lang.RuntimeException: java.lang.RuntimeException:
java.lang.RuntimeException:
com.opensymphony.xwork2.inject.DependencyException:
com.opensymphony.xwork2.inject.ContainerImpl$MissingDependencyException:
No mapping found for dependency [type=java.lang.String,
name='struts.multipart.bufferSize'] in public void
org.apache.struts2.dispatcher.multipart.JakartaStreamMultiPartRequest.setBufferSize(java.lang.String).
I've followed the official guidelines here, creating an Action class, using the JSP form tags, and so on.
In struts.xml, for the file upload section, I have:
<constant name="struts.multipart.maxSize" value="209715200" />
<constant name="struts.multipart.parser" value="jakarta-stream" />
The version of Struts should be updated due to WW-4466. WW-3025 introduced a new config constant, struts.multipart.bufferSize; as the issue notes:
Currently it is set as required and hence applications must specify it. The default value is always overridden.
-> should be required = false
Fix Version/s: 2.3.24
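If you cannot upgrade right away, a possible workaround (a sketch; the buffer size value below is an arbitrary assumption, not a recommended setting) is to define the missing constant yourself in struts.xml so the required injection can be satisfied:
<!-- workaround until Struts >= 2.3.24: supply the constant the dispatcher asks for -->
<constant name="struts.multipart.bufferSize" value="8192" />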

How can I config to access HDFS(namenode HA) with HiveConf class by using hive-common-1.2.1.jar?

Does anyone know why the HiveConf class no longer has a HADOOPCONF enum constant in the hive-common jar?
I wrote code using the HiveConf class from hive-common-1.2.1.jar to access HDFS (HA namenode), and I get the error below.
I realized my code doesn't configure HADOOPCONF, so it can't connect to HDFS, but HADOOPCONF no longer exists in hive-common-1.2.1.jar; I found that previous versions of hive-common did have it.
http://www.docjar.com/html/api/org/apache/hadoop/hive/conf/HiveConf.java.html
My question is: how can I configure access to HDFS (namenode HA) with the HiveConf class using hive-common-1.2.1.jar?
Here is the error:
Caused by: java.lang.IllegalArgumentException: java.net.UnknownHostException: cluster
at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:312)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)
My code is:
hiveConf.setVar(HiveConf.ConfVars.HADOOPBIN, "/opt/modules/hadoop/bin");
hiveConf.setVar(HiveConf.ConfVars.HADOOPFS, "hdfs://cluster");
hiveConf.setVar(HiveConf.ConfVars.LOCALSCRATCHDIR, "/opt/modules/hive/temp");
hiveConf.setVar(HiveConf.ConfVars.DOWNLOADED_RESOURCES_DIR, "/opt/modules/hive/temp");
hiveConf.setBoolVar(HiveConf.ConfVars.HIVE_SUPPORT_CONCURRENCY, false);
hiveConf.setVar(HiveConf.ConfVars.METASTOREWAREHOUSE, "/warehouse");
hiveConf.setVar(HiveConf.ConfVars.METASTOREURIS, "thrift://127.0.0.1:9083");
hiveConf.setVar(HiveConf.ConfVars.METASTORE_CONNECTION_DRIVER, "com.mysql.jdbc.Driver");
hiveConf.setVar(HiveConf.ConfVars.METASTORECONNECTURLKEY, "jdbc:mysql://192.168.5.29:3306/hive?createDatabaseIfNotExist=true");
hiveConf.setVar(HiveConf.ConfVars.METASTORE_CONNECTION_USER_NAME, "hive");
hiveConf.setVar(HiveConf.ConfVars.METASTOREPWD, "123456");
hiveConf.setVar(HiveConf.ConfVars.HIVEHISTORYFILELOC, "/opt/modules/hive/temp");
OK, I resolved this issue.
The HiveConf class in the hive-common jar loads Hadoop's hdfs-site.xml by default, as long as the classpath points to the folder containing hdfs-site.xml when you run the application:
CLASSPATH=$CLASSPATH:/opt/modules/hadoop/conf
$JAVA -cp $CLASSPATH com.baofeng.data.writer.HiveHcatalogWriter
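If adjusting the launch classpath is not an option, an alternative sketch (untested; the paths below are assumptions based on the directories used earlier) is to add the Hadoop client configs to the HiveConf programmatically, since HiveConf extends Hadoop's Configuration:
// hypothetical: point HiveConf at the Hadoop configs that define the "cluster" nameservice
hiveConf.addResource(new org.apache.hadoop.fs.Path("/opt/modules/hadoop/conf/core-site.xml"));
hiveConf.addResource(new org.apache.hadoop.fs.Path("/opt/modules/hadoop/conf/hdfs-site.xml"));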

hector cassandra cluster error

Hello, I am trying to connect a Java Maven application to Cassandra with Hector. The code is very simple:
imports......
public class App {
    public static void main(String[] args) {
        Cluster cluster = HFactory.getOrCreateCluster("TestCluster",
                new CassandraHostConfigurator("localhost:9042"));
    }
}
When I run it, I get these warnings:
log4j:WARN No appenders could be found for logger (me.prettyprint.cassandra.connection.CassandraHostRetryService).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
I think I need to set something up, but I cannot find what it is.
You need a log4j properties file. From the Hector docs:
Run your application with the following parameter:
-Dlog4j.configuration=file:///path/to/log4j.properties
Where log4j.properties contains:
log4j.rootLogger=DEBUG,stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.SimpleLayout
Log4j is very powerful; check out the project site for more information.
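Alternatively, in a Maven project you can skip the -Dlog4j.configuration flag by putting the same properties file on the classpath, where Log4j 1.x picks it up automatically (a sketch assuming the standard Maven layout):
# place the properties shown above at this location
src/main/resources/log4j.properties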

spring Jaxb2Marshaller+contextPath+tomcat embedded in maven throws

I'm using Spring's Jaxb2Marshaller (without web services) to unmarshal some XML. The JAXB classes are generated via the maven-jaxb-plugin, and I instantiate the Jaxb2Marshaller in Spring via:
<bean id="unmarshaller" class="...Jaxb2Marshaller" p:contextPath="my.package.path" />
Then I start it with:
mvn clean package
mvn tomcat:run
I actually have two unmarshallers. The first is created fine; the second throws org.springframework.oxm.jaxb.JaxbSystemException because it can't find ObjectFactory (which is generated by the maven-jaxb-plugin, and which I've verified is in fact present in the jar, in the correct package). I've also tried with a single unmarshaller and a contextPath of colon-separated package paths, with the same results.
I don't think this is generally a problem with Spring or my configuration, because if I deploy into a full Tomcat container it works fine. I did notice that Maven puts Tomcat in my project/target/tomcat folder and there are some differences, such as there being no lib directory. I don't know what all the differences are between embedded Tomcat and a regular installation.
Can someone explain:
1) exactly how embedded tomcat differs from a regular installation
2) if there are known limitations
3) if it can be configured to work properly in this situation
Full stack trace:
SEVERE: Exception sending context initialized event to listener instance of class org.springframework.web.context.ContextLoaderListener
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'unmarshaller' defined in class path resource [spring.xml]: Invocation of init method failed; nested exception is org.springframework.oxm.jaxb.JaxbSystemException: "my.package.path" doesnt contain ObjectFactory.class or jaxb.index; nested exception is javax.xml.bind.JAXBException: "my.package.path" doesnt contain ObjectFactory.class or jaxb.index
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1338)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:473)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory$1.run(AbstractAutowireCapableBeanFactory.java:409)
at java.security.AccessController.doPrivileged(Native Method)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:380)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:264)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:261)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:185)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:164)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:429)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:728)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:380)
at org.springframework.web.context.ContextLoader.createWebApplicationContext(ContextLoader.java:255)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:199)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:45)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4135)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4630)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:785)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:445)
at org.apache.catalina.startup.Embedded.start(Embedded.java:825)
at org.codehaus.mojo.tomcat.AbstractRunMojo.startContainer(AbstractRunMojo.java:558)
at org.codehaus.mojo.tomcat.AbstractRunMojo.execute(AbstractRunMojo.java:255)
For anyone else who runs across this, I eventually solved the problem by using the classesToBeBound property instead of contextPath. The reason I had initially avoided classesToBeBound was that I thought I had to specify every single class in the model in the classesToBeBound list, which isn't the case. You simply specify the class that has the @XmlRootElement annotation.
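For reference, a sketch of what that configuration might look like (MyRootElement is a placeholder for whatever class in my.package.path carries @XmlRootElement):
<!-- hypothetical: bind the JAXB context to the root element class instead of a package path -->
<bean id="unmarshaller" class="org.springframework.oxm.jaxb.Jaxb2Marshaller">
    <property name="classesToBeBound">
        <list>
            <value>my.package.path.MyRootElement</value>
        </list>
    </property>
</bean>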