I'm tracking down why analytics data is not being logged. I see the error messages described below; they occur when an attempt is made to log analytics data. Are these messages significant? What do I need to fix?
I have the Operational Analytics EAR deployed to a dedicated Liberty server.
product = WebSphere Application Server 8.5.5.0 (wlp-1.0.3.20130510-0831)
wlp.install.dir = /home/virtuser/IBM/WebSphere/Liberty/
java.home = /home/virtuser/IBM/WebSphere/Liberty/java/java_1.7_64/jre
java.version = 1.7.0
java.runtime = Java(TM) SE Runtime Environment (pxa6470sr4fp1ifx-20130423_02 (SR4 FP1+IV38579+IV38399+IV40208))
os = Linux (2.6.32-358.el6.x86_64; amd64) (en_GB)
When I attempt to log some analytics data, I see this error in messages.log:
Uncaught.init.exception.thrown.by.servlet
data
worklight-analytics-ear
javax.xml.transform.TransformerFactoryConfigurationError: Provider for
javax.xml.transform.TransformerFactory cannot be found
and
[30/01/15 05:24:20:853 EST] 00000022 org.apache.wink.server.internal.log.Resources I The server has registered the JAX-RS resource class com.ibm.elasticsearch.servlet.DataReceiver with @Path(/).
[30/01/15 05:24:20:854 EST] 00000022 org.apache.wink.server.internal.log.Providers I There are no custom JAX-RS providers defined in the application.
[30/01/15 05:24:20:860 EST] 00000022 pache.wink.common.internal.application.ApplicationFileLoader E The runtime environment failed to instantiate the org.apache.wink.common.internal.providers.entity.SourceProvider$StreamSourceProvider class. Ensure that the class is not abstract, has a valid constructor, has the right visibility, and is not an inner member class.
java.lang.NoClassDefFoundError: org.apache.wink.common.internal.providers.entity.SourceProvider$StreamSourceProvider (initialization failure)
at java.lang.J9VMInternals.initialize(J9VMInternals.java:176)
at java.lang.Class.forNameImpl(Native Method)
at java.lang.Class.forName(Class.java:219)
at org.apache.wink.common.internal.utils.ClassUtils$1.run(ClassUtils.java:73)
at java.security.AccessController.doPrivileged(AccessController.java:229)
at org.apache.wink.common.internal.utils.ClassUtils.loadClass(ClassUtils.java:66)
and
[30/01/15 05:25:09:342 EST] 0000001e
com.ibm.ws.logging.internal.impl.IncidentImpl I FFDC1015I: An FFDC Incident has been created: "org.elasticsearch.action.search.SearchPhaseExecutionException:
Failed to execute phase [query], all shards failed; shardFailures {[beBqcnofT52DPVzgChWm_Q][worklight][1]: SearchParseException[[worklight][1]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"facets":{"termsFacet":{"date_histogram":{"field":"gadgets.HybridApp.firstAccess","interval":"1h"},"facet_filter":{"bool":{"must":{"range":{"gadgets.HybridApp.firstAccess":{"from":1422527109262,"to":1422613509262,"include_lower":true,"include_upper":true}}}}}}}}]]]; nested: FacetPhaseExecutionException[Facet [termsFacet]: (key) field [gadgets.HybridApp.firstAccess] not found]; }{[beBqcnofT52DPVzgChWm_Q][worklight][2]: SearchParseException[[worklight][2]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"facets":{"termsFacet":{"date_histogram":{"field":"gadgets.HybridApp.firstAccess","interval":"1h"},"facet_filter":{"bool":{"must":{"range":{"gadgets.HybridApp.firstAccess":{"from":1422527109262,"to":1422613509262,"include_lower":true,"include_upper":true}}}}}}}}]]]; nested: FacetPhaseExecutionException[Facet [termsFacet]: (key) field [gadgets.HybridApp.firstAccess] not found]; }{[beBqcnofT52DPVzgChWm_Q][worklight][3]: SearchParseException[[worklight][3]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"facets":{"termsFacet":{"date_histogram":{"field":"gadgets.HybridApp.firstAccess","interval":"1h"},"facet_filter":{"bool":{"must":{"range":{"gadgets.HybridApp.firstAccess":{"from":1422527109262,"to":1422613509262,"include_lower":true,"include_upper":true}}}}}}}}]]]; nested: FacetPhaseExecutionException[Facet [termsFacet]: (key) field [gadgets.HybridApp.firstAccess] not found]; }{[beBqcnofT52DPVzgChWm_Q][worklight][4]: SearchParseException[[worklight][4]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"facets":{"termsFacet":{"date_histogram":{"field":"gadgets.HybridApp.firstAccess","interval":"1h"},"facet_filter":{"bool":{"must":{"range":{"gadgets.HybridApp.firstAccess":{"from":1422527109262,"to":1422613509262,"include_lower":true,"include_upper":true}}}}}}}}]]]; nested: FacetPhaseExecutionException[Facet [termsFacet]: (key) field [gadgets.HybridApp.firstAccess] not found]; }{[beBqcnofT52DPVzgChWm_Q][worklight][0]: SearchParseException[[worklight][0]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"facets":{"termsFacet":{"date_histogram":{"field":"gadgets.HybridApp.firstAccess","interval":"1h"},"facet_filter":{"bool":{"must":{"range":{"gadgets.HybridApp.firstAccess":{"from":1422527109262,"to":1422613509262,"include_lower":true,"include_upper":true}}}}}}}}]]]; nested: FacetPhaseExecutionException[Facet [termsFacet]: (key) field [gadgets.HybridApp.firstAccess] not found]; } com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters 1105" at ffdc_15.01.30_05.25.09.0.log
Note the "Parse Failure [Failed to parse source" text in that last message.
My server.xml has this:
<featureManager>
<feature>jsp-2.2</feature>
<feature>jndi-1.0</feature>
<feature>appSecurity-1.0</feature>
</featureManager>
<httpEndpoint id="defaultHttpEndpoint"
host="localhost"
httpPort="9082"
httpsPort="9445" />
<jndiEntry jndiName="analytics/httpport" value="9500" />
<jndiEntry jndiName="analytics/transportport" value="9600" />
<jndiEntry jndiName="analytics/masternodes" value="localhost:9600,anotherhost:9600" />
<jndiEntry jndiName="analytics/shards" value="12" />
<application location="worklight-analytics.ear"
name="worklight-analytics-ear"
type="ear">
... registry and role entries here ...
<classloader delegation="parentLast"/>
</application>
Can you try removing the following from the server.xml entry for analytics?
<classloader delegation="parentLast"/>
After you remove this, try restarting the analytics console to see if it begins to work.
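With that change, the application entry in server.xml would look like this (a sketch based on your excerpt above, keeping your existing registry and role entries in place):
<application location="worklight-analytics.ear"
    name="worklight-analytics-ear"
    type="ear">
    ... registry and role entries here ...
</application>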
Related
I had a Maven JSF/JPA web application connected to MySQL 5.x, developed using NetBeans 12. The application was running fine until I updated MySQL from version 5.x to 8.x. Since that update, I cannot get the JSF application to connect to the database. The connection to MySQL 8.x works from within NetBeans, but not when the application is deployed.
The current configuration includes EclipseLink 2.7.7, MySQL 8.0.23, and GlassFish 5 (5.0.1) / Payara 5 (5.2021.1). It is not possible to make a successful connection to MySQL. I also failed to establish a connection from the JDBC Connection Pool pages of the GlassFish and Payara admin consoles. Can someone please direct me to a resource that shows how to connect MySQL 8 to Payara or GlassFish?
The error displayed in the Payara admin console is as follows.
An error has occurred Ping Connection Pool failed for pooConnection.
Connection could not be allocated because: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server. Please check the server.log for more details.
The log file contains the following.
[javax.enterprise.resource.resourceadapter.com.sun.enterprise.connectors.service] [tid: _ThreadID=161 _ThreadName=admin-thread-pool::admin-listener(3)] [timeMillis: 1613549343463] [levelValue: 900] [[
RAR8054: Exception while creating an unpooled [test] connection for pool [ pooConnection ], Connection could not be allocated because: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.]]
[2021-02-17T13:39:03.472+0530] [Payara 5.2021.1] [SEVERE] [] [org.glassfish.admingui] [tid: _ThreadID=139 _ThreadName=admin-thread-pool::admin-listener(1)] [timeMillis: 1613549343472] [levelValue: 1000] [[
RestResponse.getResponse() gives FAILURE. endpoint = 'http://localhost:4848/management/domain/resources/ping-connection-pool.json'; attrs = '{id=pooConnection}']]
In order to connect to Payara Server, the only effective way I found was creating a Connection Pool directly in the Domain Admin Console.
It may seem like a lot of steps now, but after completing it once it will become second nature:
Download the MySQL 8 Java Connector, available at https://dev.mysql.com/downloads/connector/j/, and unzip it to any folder.
It will extract to something like:
mysql-connector-java-8.0.23 2/mysql-connector-java-8.0.23.jar
1) Make sure you have Payara Server up and running:
cd PATH_TO_PAYARA/bin
2) Start/Restart it
./asadmin start-domain
Note: This will start domain1 by default
3) Install the MySQL8 Connector
./asadmin add-library PATH_TO_MYSQL_CONNECTOR.jar
4) (required) Restart Payara
./asadmin restart-domain
5) Access Admin Console at http://localhost:4848/common/index.jsf
6) In the sidebar navigate to "JDBC" -> "JDBC Connection Pools" menu
7) From there, click "New" to add new Connection Pool
You can have as many pools as you want, so if you have tried this before, you may keep your existing pools.
8) In: New JDBC Connection Pool (Step 1 of 2)
Pool Name: MySQL8Pool (or whatever you want)
Resource Type: javax.sql.DataSource
Database Driver Vendor: MySQL8
Click "Next"
9) Scroll down to "Additional Properties"
Select all and "Delete Properties"
10) Add 6 Properties with the keys/values as:
DatabaseName YOUR_DB_NAME
User YOUR_DB_USER
Password YOUR_DB_PASSWORD
ServerName localhost
PortNumber 3306
UseSSL false
Click "Save"
11) In the sidebar navigate to "JDBC" -> "JDBC Resources" menu
12) From there, click "New" to add new JDBC Resource
And fill:
JNDI Name: jdbc/MySQL8App
PoolName: MySQL8Pool
Click "Save"
From now on I assume you are using Maven.
13) In your pom.xml, make sure you have the Eclipse Persistence (EclipseLink) dependencies.
Inside your <dependencies> tag:
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>org.eclipse.persistence.core</artifactId>
<version>2.7.7</version>
</dependency>
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>org.eclipse.persistence.asm</artifactId>
<version>2.7.7</version>
</dependency>
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>org.eclipse.persistence.antlr</artifactId>
<version>2.7.7</version>
</dependency>
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>org.eclipse.persistence.jpa</artifactId>
<version>2.7.7</version>
</dependency>
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>org.eclipse.persistence.jpa.jpql</artifactId>
<version>2.7.7</version>
</dependency>
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>org.eclipse.persistence.moxy</artifactId>
<version>2.7.7</version>
</dependency>
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>javax.persistence</artifactId>
<version>2.2.1</version>
</dependency>
14) Create a persistence unit in your persistence.xml file:
<persistence-unit name="mysql8PU" transaction-type="JTA">
<jta-data-source>jdbc/MySQL8App</jta-data-source>
<exclude-unlisted-classes>false</exclude-unlisted-classes>
<shared-cache-mode>NONE</shared-cache-mode>
<!-- Optional: uncomment this block to let the provider generate the schema.
     Use "drop-and-create" or "create" as the value, or remove the block entirely.
<properties>
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
</properties>
-->
</persistence-unit>
Note that jta-data-source points to jdbc/MySQL8App, the JNDI resource created above. From now on it can be used anywhere in your code, and after the build Payara knows how to inject it.
PersistenceService.java
package com.your.app.services;
import javax.enterprise.context.ApplicationScoped;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
@ApplicationScoped
public class PersistenceService {
@PersistenceContext(unitName = "mysql8PU")
EntityManager entityManager;
}
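As a quick usage sketch (purely illustrative and not part of the original answer: CustomerRepository and the Customer entity are hypothetical placeholders), another bean in the same package can inject PersistenceService and use its entity manager:
package com.your.app.services; // same placeholder package as PersistenceService above

import java.util.List;
import javax.enterprise.context.RequestScoped;
import javax.inject.Inject;

@RequestScoped
public class CustomerRepository {

    @Inject
    PersistenceService persistenceService;

    // "Customer" is a placeholder entity name; replace it with one of your own JPA entities
    public List<?> findAllCustomers() {
        return persistenceService.entityManager
                .createQuery("SELECT c FROM Customer c")
                .getResultList();
    }
}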
Now rebuild and rerun your project and everything should be fine!
Configure your connection in just four steps:
Copy the MySQL JDBC driver JAR to $PAYARA_HOME/glassfish/domains/$YOUR_DOMAIN/lib/, e.g.:
cp mysql-connector-java-8.0.22.jar /opt/payara5/glassfish/domains/domain1/lib/
Create an XML resource descriptor file, name it glassfish-resources.xml, and specify the appropriate parameters:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE resources PUBLIC "-//GlassFish.org//DTD GlassFish Application Server 3.1 Resource Definitions//EN" "http://glassfish.org/dtds/glassfish-resources_1_5.dtd">
<resources>
<jdbc-connection-pool connection-creation-retry-interval-in-seconds="30" connection-validation-method="auto-commit" datasource-classname="com.mysql.cj.jdbc.MysqlDataSource" wrap-jdbc-objects="false" res-type="javax.sql.DataSource" name="mysql_mydb_rootPool" is-connection-validation-required="true" connection-creation-retry-attempts="10" validate-atmost-once-period-in-seconds="60">
<property name="User" value="root"/>
<property name="Password" value="secret"/>
<property name="URL" value="jdbc:mysql://localhost:3306/voyager?zeroDateTimeBehavior=convertToNull&serverTimezone=UTC&useSSL=false"/>
<property name="driverClass" value="com.mysql.cj.jdbc.Driver"/>
<property name="characterEncoding" value="utf-8"/>
</jdbc-connection-pool>
<jdbc-resource enabled="true" jndi-name="jdbc/mydb" object-type="user" pool-name="mysql_mydb_rootPool"/>
</resources>
Add resources to Payara
$PAYARA_HOME/bin/asadmin add-resources glassfish-resources.xml
Restart your domain
$PAYARA_HOME/bin/asadmin restart-domain
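To verify the configuration afterwards, you can ping the new pool from the command line (the pool name is the one defined in glassfish-resources.xml above):
$PAYARA_HOME/bin/asadmin ping-connection-pool mysql_mydb_rootPool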
I'm getting a ClassNotFoundException for org/apache/avro/ipc/ByteBufferOutputStream when I run Apache Nutch with HSQLDB, although I have all the Avro-related JAR files under lib:
avro-1.7.6.jar
avro-compiler-1.7.6.jar
avro-ipc-1.7.6.jar
avro-mapred-1.7.6.jar
This is what I did:
Got HSQLDB up and running
root@elephant hsqldb# sudo java -cp /home/hsqldb/hsqldb-2.3.3/hsqldb/lib/hsqldb.jar org.hsqldb.server.Server --props /home/hsqldb/hsqldb-2.3.3/hsqldb/conf/server.properties
[Server@372f7a8d]: [Thread[main,5,main]]: checkRunning(false) entered
[Server@372f7a8d]: [Thread[main,5,main]]: checkRunning(false) exited
[Server@372f7a8d]: Startup sequence initiated from main() method
[Server@372f7a8d]: Loaded properties from [/home/hsqldb/hsqldb-2.3.3/hsqldb/conf/server.properties]
[Server@372f7a8d]: Initiating startup sequence...
[Server@372f7a8d]: Server socket opened successfully in 28 ms.
[Server@372f7a8d]: Database [index=0, id=0, db=file:/home/hsqldb/hsqldb-2.3.3/hsqldb/data/nutch, alias=nutchdb] opened sucessfully in 1406 ms.
[Server@372f7a8d]: Startup sequence completed in 1438 ms.
[Server@372f7a8d]: 2015-12-26 18:30:13.841 HSQLDB server 2.3.3 is online on port 9001
[Server@372f7a8d]: To close normally, connect and execute SHUTDOWN SQL
[Server@372f7a8d]: From command line, use [Ctrl]+[C] to abort abruptly
Configured ivy/ivy.xml
Uncommented the lines below in ivy.xml:
<dependency org="org.apache.gora" name="gora-core" rev="0.5" conf="*->default"/>
and
<dependency org="org.apache.gora" name="gora-sql" rev="0.1.1-incubating"
conf="*->default" />
Uncommented the lines below in conf/gora.properties:
###############################
# Default SqlStore properties #
###############################
gora.sqlstore.jdbc.driver=org.hsqldb.jdbc.JDBCDriver
gora.sqlstore.jdbc.url=jdbc:hsqldb:hsql://localhost/nutchdb
gora.sqlstore.jdbc.user=sa
gora.sqlstore.jdbc.password=
Ran ant build
ant runtime
Added configuration for nutch-site.xml
root@elephant conf# cat nutch-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>storage.data.store.class</name>
<value>org.apache.gora.sql.store.SqlStore</value>
</property>
<property>
<name>http.agent.name</name>
<value>NutchCrawler</value>
</property>
<property>
<name>http.robots.agents</name>
<value>NutchCrawler,*</value>
</property>
</configuration>
Created seed.txt under the urls folder.
Executed Nutch by injecting the URLs:
[root@elephant local]# bin/nutch inject urls/
InjectorJob: starting at 2015-12-26 19:11:24
InjectorJob: Injecting urlDir: urls
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/ipc/ByteBufferOutputStream
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:259)
at org.apache.nutch.storage.StorageUtils.getDataStoreClass(StorageUtils.java:93)
at org.apache.nutch.storage.StorageUtils.createWebStore(StorageUtils.java:77)
at org.apache.nutch.crawl.InjectorJob.run(InjectorJob.java:218)
at org.apache.nutch.crawl.InjectorJob.inject(InjectorJob.java:252)
at org.apache.nutch.crawl.InjectorJob.run(InjectorJob.java:275)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.nutch.crawl.InjectorJob.main(InjectorJob.java:284)
Caused by: java.lang.ClassNotFoundException: org.apache.avro.ipc.ByteBufferOutputStream
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 9 more
Gora SQL is not supported. Due to some licensing issues (if I am not wrong), it was disabled around Gora 0.2.
So I suggest you use another storage backend, for example HBase.
To get HBase up and running fast, read the answer at https://stackoverflow.com/a/39837926/582789
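As a rough sketch of what the switch to HBase looks like (assuming a Nutch 2.x build with Gora 0.5, as in your ivy.xml; the exact revision may differ, so check the Nutch 2.x tutorial), you would uncomment the gora-hbase dependency and point the storage class at the HBase store:
<!-- ivy/ivy.xml: enable the HBase backend instead of gora-sql -->
<dependency org="org.apache.gora" name="gora-hbase" rev="0.5" conf="*->default" />

<!-- conf/nutch-site.xml: use the HBase store -->
<property>
  <name>storage.data.store.class</name>
  <value>org.apache.gora.hbase.store.HBaseStore</value>
</property>

# conf/gora.properties
gora.datastore.default=org.apache.gora.hbase.store.HBaseStore
Then run ant runtime again and re-run the inject step.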
I'm trying to manually install IBM Worklight 6.2 on WebSphere Liberty Profile using a DB2 database. I already have the Application Center and the Worklight server running properly. But when I try to declare my server runtime (I've uploaded the WAR file and declared it in the server.xml file), the logs display the following message:
2015-01-19T12:39:43.32+0100 [App/0] ERR [ERROR ] FWLST0003E: ========= Failed starting project /worklight [project worklight]
2015-01-19T12:39:43.32+0100 [App/0] ERR Error creating bean with name 'txManager' defined in URL [wsjar:file:/home/vcap/app/.liberty/usr/shared/resources/worklight/lib/worklight-jee-library.jar!/conf/core.xml]: Cannot resolve reference to bean 'brokerSessionFactory' while setting bean property 'entityManagerFactory'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'brokerSessionFactory' defined in URL [wsjar:file:/home/vcap/app/.liberty/usr/shared/resources/worklight/lib/worklight-jee-library.jar!/conf/spring-server-core.xml]: Cannot resolve reference to bean 'rssBrokerDS' while setting bean property 'dataSource'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'rssBrokerDS' defined in URL [wsjar:file:/home/vcap/app/.liberty/usr/shared/resources/worklight/lib/worklight-jee-library.jar!/conf/spring-server-core.xml]: Cannot resolve reference to bean 'worklight-direct' while setting bean property 'targetDataSource'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'worklight-direct' defined in URL [wsjar:file:/home/vcap/app/.liberty/usr/shared/resources/worklight/lib/worklight-jee-library.jar!/conf/spring-server-core.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: FWLSE0206E: The project /worklight failed to initialize, because the project database schema for data source jdbc:db2://75.126.155.153:50000/SQLDB is from version N/A, which is not supported by the server from version 6.2.0.00.20140613-0730. Use the Worklight ant tasks to upgrade the project database schema. [project worklight]
2015-01-19T12:39:43.40+0100 [App/0] ERR [ERROR ] Error creating bean with name 'txManager' defined in URL [wsjar:file:/home/vcap/app/.liberty/usr/shared/resources/worklight/lib/worklight-jee-library.jar!/conf/core.xml]: Cannot resolve reference to bean 'brokerSessionFactory' while setting bean property 'entityManagerFactory'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'brokerSessionFactory' defined in URL [wsjar:file:/home/vcap/app/.liberty/usr/shared/resources/worklight/lib/worklight-jee-library.jar!/conf/spring-server-core.xml]: Cannot resolve reference to bean 'rssBrokerDS' while setting bean property 'dataSource'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'rssBrokerDS' defined in URL [wsjar:file:/home/vcap/app/.liberty/usr/shared/resources/worklight/lib/worklight-jee-library.jar!/conf/spring-server-core.xml]: Cannot resolve reference to bean 'worklight-direct' while setting bean property 'targetDataSource'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'worklight-direct' defined in URL [wsjar:file:/home/vcap/app/.liberty/usr/shared/resources/worklight/lib/worklight-jee-library.jar!/conf/spring-server-core.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: FWLSE0206E: The project /worklight failed to initialize, because the project database schema for data source jdbc:db2://75.126.155.153:50000/SQLDB is from version N/A, which is not supported by the server from version 6.2.0.00.20140613-0730. Use the Worklight ant tasks to upgrade the project database schema. [project worklight]
2015-01-19T12:39:43.40+0100 [App/0] ERR Error creating bean with name 'txManager' defined in URL [wsjar:file:/home/vcap/app/.liberty/usr/shared/resources/worklight/lib/worklight-jee-library.jar!/conf/core.xml]: Cannot resolve reference to bean 'brokerSessionFactory' while setting bean property 'entityManagerFactory'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'brokerSessionFactory' defined in URL [wsjar:file:/home/vcap/app/.liberty/usr/shared/resources/worklight/lib/worklight-jee-library.jar!/conf/spring-server-core.xml]: Cannot resolve reference to bean 'rssBrokerDS' while setting bean property 'dataSource'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'rssBrokerDS' defined in URL [wsjar:file:/home/vcap/app/.liberty/usr/shared/resources/worklight/lib/worklight-jee-library.jar!/conf/spring-server-core.xml]: Cannot resolve reference to bean 'worklight-direct' while setting bean property 'targetDataSource'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'worklight-direct' defined in URL [wsjar:file:/home/vcap/app/.liberty/usr/shared/resources/worklight/lib/worklight-jee-library.jar!/conf/spring-server-core.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: FWLSE0206E: The project /worklight failed to initialize, because the project database schema for data source jdbc:db2://75.126.155.153:50000/SQLDB is from version N/A, which is not supported by the server from version 6.2.0.00.20140613-0730. Use the Worklight ant tasks to upgrade the project database schema. [project worklight]
and this is my server.xml file:
<server description="new server">
<!-- Enable features -->
<featureManager>
<feature>jsp-2.2</feature>
<!-- Begin of features added by IBM Worklight installer. -->
<feature>ssl-1.0</feature>
<feature>servlet-3.0</feature>
<feature>jdbc-4.0</feature>
<feature>appSecurity-1.0</feature>
<feature>restConnector-1.0</feature>
<feature>jndi-1.0</feature>
<!-- End of features added by IBM Worklight installer. -->
</featureManager>
<httpEndpoint id="defaultHttpEndpoint" host="*" httpPort="9080" httpsPort="9443" >
<tcpOptions soReuseAddr="true"/>
</httpEndpoint>
<!-- Begin of configuration added by IBM Worklight installer. -->
<!-- Declare the IBM Application Center Console application. -->
<application id="appcenterconsole" name="appcenterconsole" location="appcenterconsole.war" type="war">
<application-bnd>
<security-role name="appcenteradmin">
<group name="appcentergroup"/>
</security-role>
</application-bnd>
</application>
<!-- Declare the IBM Application Center Services application. -->
<application id="applicationcenter" name="applicationcenter" location="applicationcenter.war" type="war">
<application-bnd>
<security-role name="appcenteradmin">
<group name="appcentergroup"/>
</security-role>
</application-bnd>
<classloader delegation="parentLast">
<commonLibrary>
<fileset dir="${wlp.install.dir}/lib" includes="com.ibm.ws.crypto.passwordutil_1.0.8.jar"/>
</commonLibrary>
</classloader>
</application>
<!-- Declare the user registry for the IBM Application Center. -->
<basicRegistry id="applicationcenter-registry" realm="ApplicationCenter">
<!-- Worklight user.
[Added by IBM Worklight <installWorklightAdmin> Ant task for context root '/wladmin']
-->
<user name="WorklightRESTUser" password="EKminBt6fHnE"/>
<!-- The users defined here are members of group "appcentergroup",
thus have role "appcenteradmin", and can therefore perform
administrative tasks through the IBM Application Center Console. -->
<user name="appcenteradmin" password="admin"/>
<user name="demo" password="demo"/>
<group name="appcentergroup">
<member name="appcenteradmin"/>
<member name="demo"/>
</group>
</basicRegistry>
<!-- Declare the JNDI properties for the IBM Application Center. -->
<!-- Define the AppCenter services endpoint in order for the AppCenter console to be able to invoke the REST service.
You need to enable this property if the server is behind a reverse proxy
or if the context root of the Application Center Services application is different from '/applicationcenter'. -->
<!-- <jndiEntry jndiName="ibm.appcenter.services.endpoint" value='"http://proxyhost:proxyport/applicationcenter"'/> -->
<!-- The directory with binaries of the 'aapt' program, from the Android SDK's platform-tools package. -->
<!--<jndiEntry jndiName="android.aapt.dir" value='"*******/android-sdk"'/>-->
<jndiEntry jndiName="android.aapt.dir" value="C******/android-sdk"/>
<!-- The protocol of the application resources URI. This property is optional. It is only needed if the protocol of the external and internal URI are different. -->
<!-- <jndiEntry jndiName="ibm.appcenter.proxy.protocol" value='"http"'/> -->
<!-- The hostname of the application resources URI. -->
<!-- <jndiEntry jndiName="ibm.appcenter.proxy.host" value='"proxyhost"'/> -->
<!-- The port of the application resources URI. This property is optional. -->
<!-- <jndiEntry jndiName="ibm.appcenter.proxy.port" value="proxyport"/> -->
<!-- Declare the jar files for DB2 access through JDBC. -->
<library id="DB2Lib">
<fileset dir="${shared.resource.dir}/db2" includes="*.jar"/>
</library>
<!-- Declare the IBM Application Center database. -->
<dataSource jndiName="jdbc/AppCenterDS" transactional="false">
<jdbcDriver libraryRef="DB2Lib"/>
<properties.db2.jcc databaseName='${Database.connection.db}' serverName='${Database.connection.host}' portNumber='${Database.connection.port}' user='${Database.connection.username}' password='${Database.connection.password}'/>
</dataSource>
<!-- End of configuration added by IBM Worklight installer. -->
<administrator-role>
<!-- Worklight JMX User.
[Added by IBM Worklight <installWorklightAdmin> Ant task for context root '/wladmin']
-->
<user>WorklightRESTUser</user>
</administrator-role>
<!--
IBM Worklight requires SSL and declared the "defaultKeyStore" default keystore.
[Added by IBM Worklight <installWorklightAdmin> Ant task for context root '/wladmin']
This configuration is the minimum one that you need to create an SSL configuration.
With this configuration, the Liberty server creates the keystore and the certificate,
if it does not exist yet, during the SSL initialization.
The created certificate is a self-signed certificate that is valid for 365 days.
Do not use the certificates that the Liberty server created for production use.
For more information see http://pic.dhe.ibm.com/infocenter/wasinfo/v8r5/topic/com.ibm.websphere.wlp.core.doc/ae/twlp_sec_ssl.html
-->
<keyStore id="defaultKeyStore" password="worklight"/>
<!-- Worklight JNDI property for JMX connection.
[Added by IBM Worklight <installWorklightAdmin> Ant task for context root '/wladmin']
-->
<jndiEntry jndiName="ibm.worklight.admin.jmx.host" value="localhost"/>
<!-- Worklight JNDI property for JMX connection.
[Added by IBM Worklight <installWorklightAdmin> Ant task for context root '/wladmin']
-->
<jndiEntry jndiName="ibm.worklight.admin.jmx.port" value="9443"/>
<!-- Worklight JNDI property for JMX connection.
[Added by IBM Worklight <installWorklightAdmin> Ant task for context root '/wladmin']
-->
<jndiEntry jndiName="ibm.worklight.admin.jmx.user" value="WorklightRESTUser"/>
<!-- Worklight JNDI property for JMX connection.
[Added by IBM Worklight <installWorklightAdmin> Ant task for context root '/wladmin']
-->
<jndiEntry jndiName="ibm.worklight.admin.jmx.pwd" value="EKminBt6fHnE"/>
<!-- Worklight JNDI property for JMX connection.
[Added by IBM Worklight <installWorklightAdmin> Ant task for context root '/wladmin']
-->
<jndiEntry jndiName="ibm.worklight.topology.platform" value="Liberty"/>
<!-- Worklight JNDI property for JMX connection.
[Added by IBM Worklight <installWorklightAdmin> Ant task for context root '/wladmin']
-->
<jndiEntry jndiName="ibm.worklight.topology.clustermode" value="Standalone"/>
<!-- Begin of configuration added by IBM Worklight <installWorklightAdmin> ant task for context root '/wladmin'. -->
<!-- Declare the Worklight Administration Service application. -->
<application id="wladmin" name="wladmin" location="worklightadmin.war" type="war">
<application-bnd>
<security-role name="worklightadmin">
<user name="demo"/>
</security-role>
<security-role name="worklightdeployer">
</security-role>
<security-role name="worklightmonitor">
</security-role>
<security-role name="worklightoperator">
</security-role>
</application-bnd>
<classloader delegation="parentLast">
<commonLibrary>
<fileset dir="${wlp.install.dir}/lib" includes="com.ibm.ws.crypto.passwordutil_1.0.8.jar"/>
</commonLibrary>
</classloader>
</application>
<!-- Declare web container custom properties for the Worklight Administration Service application. -->
<webContainer invokeFlushAfterService="false" deferServletLoad="false"/>
<!-- Declare the JNDI properties for the Worklight Administration Service. -->
<jndiEntry jndiName="wladmin/ibm.worklight.admin.environmentid" value="WLBluemix"/>
<!-- Declare the jar files for DB2 access through JDBC. -->
<library id="wladmin/DB2Lib">
<fileset dir="${shared.resource.dir}/wladmin/db2" includes="db2jcc4.jar,db2jcc_license_cu.jar"/>
</library>
<!-- Declare the IBM Worklight Administration database. -->
<dataSource jndiName="wladmin/jdbc/WorklightAdminDS" transactional="false">
<jdbcDriver libraryRef="wladmin/DB2Lib"/>
<properties.db2.jcc databaseName='${Database.connection.db}' serverName='${Database.connection.host}' portNumber='${Database.connection.port}' user='${Database.connection.username}' password='${Database.connection.password}' currentSchema='${Database.connection.username}'/>
</dataSource>
<!-- Declare the Worklight Administration Console application. -->
<application id="worklightconsole" name="worklightconsole" location="worklightconsole.war" type="war">
<application-bnd>
<security-role name="worklightadmin">
<user name="demo"/>
</security-role>
<security-role name="worklightdeployer">
</security-role>
<security-role name="worklightmonitor">
</security-role>
<security-role name="worklightoperator">
</security-role>
</application-bnd>
</application>
<!-- Declare web container custom properties for the Worklight Administration Console application. -->
<webContainer invokeFlushAfterService="false" deferServletLoad="false"/>
<!-- Declare the JNDI properties for the Worklight Administration Console. -->
<jndiEntry jndiName="worklightconsole/ibm.worklight.admin.endpoint" value="*://*:*/wladmin"/>
<!-- End of configuration added by IBM Worklight <installWorklightAdmin> ant task for context root '/wladmin'. -->
<!-- Begin of configuration added by IBM Worklight <configureApplicationServer> ant task for context root '/worklight'. -->
<!-- Declare the JNDI properties for the IBM Worklight project runtime. -->
<jndiEntry jndiName="worklight/publicWorkLightProtocol" value="http"/>
<jndiEntry jndiName="worklight/publicWorkLightPort" value="9080"/>
<jndiEntry jndiName="worklight/ibm.worklight.admin.environmentid" value="WLServer"/>
<!-- Declare the jar files for DB2 access through JDBC. -->
<library id="worklight/DB2Lib">
<fileset dir="${shared.resource.dir}/worklight/db2" includes="db2jcc4.jar,db2jcc_license_cu.jar"/>
</library>
<!-- Declare the IBM Worklight Server database. -->
<dataSource jndiName="worklight/jdbc/WorklightDS" transactional="false">
<jdbcDriver libraryRef="worklight/DB2Lib"/>
<properties.db2.jcc databaseName='${cloud.services.Worklight Database.connection.db}' serverName='${cloud.services.Worklight Database.connection.host}' portNumber='${cloud.services.Worklight Database.connection.port}' user='${cloud.services.Worklight Database.connection.username}' password='${cloud.services.Worklight Database.connection.password}' currentSchema='${cloud.services.Worklight Database.connection.username}'/>
</dataSource>
<!-- Declare the IBM Worklight Server reports database. -->
<dataSource jndiName="worklight/jdbc/WorklightReportsDS" transactional="false">
<jdbcDriver libraryRef="worklight/DB2Lib"/>
<properties.db2.jcc databaseName='${Database.connection.db}' serverName='$Database.connection.host}' portNumber='${Database.connection.port}' user='${Database.connection.username}' password='${Database.connection.password}' currentSchema='${Database.connection.username}'/>
</dataSource>
<!-- End of configuration added by IBM Worklight <configureApplicationServer> ant task for context root '/worklight'. -->
<!-- Declare the IBM Worklight project runtime application. -->
<application id="worklight" name="worklight" location="WLserver.war" type="war">
<classloader delegation="parentLast">
<privateLibrary>
<fileset dir="${shared.resource.dir}/worklight/lib" includes="worklight-jee-library.jar"/>
</privateLibrary>
</classloader>
</application>
</server>
It looks like you have not created the DB2 tables for the Administration Services, runtime, and reports databases.
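The FWLSE0206E message itself suggests the fix: run the Worklight Ant tasks to create (or upgrade) the project database schema before starting the runtime. Below is only a rough sketch of such an Ant target, assuming the <configuredatabase> task from worklight-ant-deployer.jar; the DB2 host, credentials, and driver directory are placeholders, so check the Worklight 6.2 documentation on creating and configuring the databases with Ant tasks for the exact syntax:
<taskdef resource="com/worklight/ant/defaults.properties">
  <classpath>
    <pathelement location="worklight-ant-deployer.jar"/>
  </classpath>
</taskdef>

<configuredatabase kind="Worklight">
  <db2 database="SQLDB" server="your-db2-host" user="your-db-user" password="your-db-password"/>
  <driverclasspath>
    <fileset dir="your-db2-driver-dir">
      <include name="db2jcc4.jar"/>
      <include name="db2jcc_license_cu.jar"/>
    </fileset>
  </driverclasspath>
</configuredatabase>
Repeat with kind="WorklightReports" (and kind="WorklightAdmin" for the Administration Services) so that all three schemas match the 6.2.0 server version shown in the error message.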
Running my script on a data node that has the Hive client tools installed works fine. But when I schedule the Hive script using Oozie, I get the error shown below.
I've set tez.lib.uris in tez-site.xml to hdfs:///apps/tez/,hdfs:///apps/tez/lib/
What am I missing here?
Hive script:
USE av_raw;
LOAD DATA INPATH '${INPUT}' INTO TABLE alarms_stg;
INSERT INTO TABLE alarms PARTITION (year, month)
SELECT * FROM alarms_stg WHERE job_id = '${JOBID}';
Workflow action:
<!-- load processed data and store in hive -->
<action name="load-data">
<hive xmlns="uri:oozie:hive-action:0.3">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<job-xml>hive-site.xml</job-xml>
<script>load_data.hive</script>
<param>INPUT=${complete}</param>
<param>JOBID=${wf:actionData('stage-data')['hadoopJobs']}</param>
</hive>
<ok to="end"/>
<error to="fail"/>
</action>
Error:
Log Type: stderr
Log Length: 3227
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/grid/5/hadoop/yarn/local/filecache/2418/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:ERROR Could not find value for key log4j.appender.CLA
log4j:ERROR Could not instantiate appender named "CLA".
log4j:ERROR Could not find value for key log4j.appender.CLA
log4j:ERROR Could not instantiate appender named "CLA".
Logging initialized using configuration in file:/grid/2/hadoop/yarn/local/usercache/hdfs/appcache/application_1417175595182_12259/container_1417175595182_12259_01_000002/hive-log4j.properties
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], main() threw exception, org.apache.tez.dag.api.TezUncheckedException: Invalid configuration of tez jars, tez.lib.uris is not defined in the configurartion
java.lang.RuntimeException: org.apache.tez.dag.api.TezUncheckedException: Invalid configuration of tez jars, tez.lib.uris is not defined in the configurartion
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:358)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: org.apache.tez.dag.api.TezUncheckedException: Invalid configuration of tez jars, tez.lib.uris is not defined in the configurartion
at org.apache.tez.client.TezClientUtils.setupTezJarsLocalResources(TezClientUtils.java:137)
at org.apache.tez.client.TezSession.start(TezSession.java:105)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:185)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:123)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:356)
... 19 more
Please try adding tez.lib.uris=hdfs:///apps/tez/,hdfs:///apps/tez/lib/ to the workflow.xml of your Oozie job,
e.g. workflow.xml:
<!-- load processed data and store in hive -->
<action name="load-data">
<hive xmlns="uri:oozie:hive-action:0.3">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<job-xml>hive-site.xml</job-xml>
<configuration>
<property>
<name>tez.lib.uris</name>
<value>hdfs:///apps/tez/,hdfs:///apps/tez/lib/</value>
</property>
</configuration>
<script>load_data.hive</script>
<param>INPUT=${complete}</param>
<param>JOBID=${wf:actionData('stage-data')['hadoopJobs']}</param>
</hive>
<ok to="end"/>
<error to="fail"/>
</action>
Alternatively, you can add the value of "tez.lib.uris" directly in "Workflow Settings" under "Hadoop Properties":
tez.lib.uris = hdfs:///apps/tez/,hdfs:///apps/tez/lib/
Before you add it, verify the correct value in tez-site.xml.
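To double-check before wiring the value into the workflow, you can confirm what tez-site.xml defines and make sure the directories actually exist in HDFS (a quick sketch; the tez-site.xml location varies by distribution, and the HDFS paths are the ones from your question):
# Show what tez-site.xml currently defines (path may differ on your cluster)
grep -A1 "tez.lib.uris" /etc/tez/conf/tez-site.xml

# Verify that the HDFS directories referenced by tez.lib.uris exist and contain the Tez jars
hdfs dfs -ls /apps/tez/
hdfs dfs -ls /apps/tez/lib/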
I would like to use Cargo to deploy my Maven-generated WAR file to a remote JBoss server that is already running.
I have configured my pom.xml like this:
...
<plugin>
<groupId>org.codehaus.cargo</groupId>
<artifactId>cargo-maven2-plugin</artifactId>
<dependencies>
<dependency>
<groupId>org.jboss.as</groupId>
<artifactId>jboss-as-controller-client</artifactId>
<version>7.1.0.Final</version>
</dependency>
</dependencies>
<configuration>
<!-- Container configuration -->
<container>
<timeout>300000</timeout> <!-- 5 minutes -->
<containerId>jboss71x</containerId>
<type>remote</type>
</container>
<!-- Configuration to use with the container -->
<configuration>
<type>runtime</type>
<properties>
<cargo.hostname><myIP></cargo.hostname>
<cargo.jboss.management.port>9999</cargo.jboss.management.port>
<cargo.remote.username><myUser></cargo.remote.username>
<cargo.remote.password><myPass></cargo.remote.password>
</properties>
</configuration>
<!-- Deployer configuration -->
<deployer>
<type>remote</type>
</deployer>
<!-- Deployables configuration -->
<deployables>
<deployable>
<groupId>de.<myGroup></groupId>
<artifactId><myArtifact></artifactId>
<type>war</type>
<pingURL>http://<myIP>:8080/<myContextPath></pingURL>
<pingTimeout>60000</pingTimeout>
</deployable>
</deployables>
</configuration>
</plugin>
...
Of course all variables of the form <my...> are filled out with the real values.
If I run this with the Maven command:
mvn -X cargo:deploy
The Maven console says:
...
Sep 12, 2012 5:06:12 PM org.xnio.nio.NioXnio <clinit>
INFO: XNIO NIO Implementation Version 3.0.3.GA
Sep 12, 2012 5:06:12 PM org.jboss.remoting3.EndpointImpl <clinit>
INFO: JBoss Remoting version 3.2.2.GA
[DEBUG] [swordCallbackHandler] Responded to a RealmCallback
[DEBUG] [swordCallbackHandler] Responded to a NameCallback
[DEBUG] [swordCallbackHandler] Responded to a PasswordCallback
[INFO] [Boss7xRemoteDeployer] The deployment has failed: org.codehaus.cargo.util.CargoException: Cannot deploy deployable org.codehaus.cargo.container.jboss.deployable.JBossWAR[myWar.war]
...
On the running JBoss server I can see a log message in the console like this:
17:07:24,197 ERROR [org.jboss.remoting.remote.connection] (Remoting "pc09:MANAGEMENT" read-1) JBREM000200: Remote connection failed: java.io.IOException: Eine vorhandene Verbindung wurde vom Remotehost geschlossen.
(Sorry for the German inside the log. It basically says: "An existing connection was closed by the remote host.")
Does anybody have an idea what could be wrong, or how I could get more debug info from Cargo to find out exactly what the problem is?
BTW: I have used JBoss CLI access to that JBoss server for Arquillian tests, so I am pretty confident that access to that JBoss server via the CLI should be OK.
EDIT:
It seems I have to undeploy first. To confirm that access via the CLI works, I connected to the server using jboss-cli.bat. Then, somewhat coincidentally, I undeployed the existing WAR, and after that the deployment via Cargo started to work.
(Can I close this question or mark it as resolved in any way?)
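For reference, a sketch of that undeploy-then-redeploy sequence (the WAR name myWar.war comes from the error above; replace <myIP> with your management host, and note that the CLI will prompt for credentials if required):
jboss-cli.bat --connect --controller=<myIP>:9999 --command="undeploy myWar.war"
mvn cargo:deploy
Cargo also offers the cargo:undeploy and cargo:redeploy goals if you prefer to do both steps through Maven.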