WebLogic: Cannot load driver: com.mysql.jdbc.Driver - datasource

I have a problem creating a data source in WebLogic Application Server 10.3. I am setting up a JDBC connection for a MySQL database. I placed the MySQL driver in the WebLogic lib folder, \wlserver_10.3\server\lib, and in the MySQL folder, wlserver_10.3\server\ext\jdbc\mysql, each on its own, but the error message still appears.
Note: there were already two jar files in wlserver_10.3\server\ext\jdbc\mysql, and I removed them to avoid any conflict. How can I solve this problem?

Are you using, or do you need, a different driver than the one that comes with the WebLogic install? If so, you will need to add it to the CLASSPATH in setDomainEnv.sh. Otherwise it will default to using:
\wlserver_10.3\server\lib\mysql-connector-java-commercial-5.x.x-bin.jar
We had to add additional Oracle jars to our CLASSPATH recently for secure connections. You can check into it more here: http://docs.oracle.com/cd/E13222_01/wls/docs100/jdbc_admin/third_party_drivers.html
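As a quick sanity check that the driver jar really ends up on the server classpath, you can compile and run a tiny class like the one below with the same CLASSPATH that setDomainEnv.sh builds. This is only a diagnostic sketch added here for illustration (the driver class name assumes the Connector/J 5.x jar), not part of the original answer:
// DriverCheck.java - hedged sketch: verifies that com.mysql.jdbc.Driver is loadable
// from the classpath exported by the WebLogic start scripts.
public class DriverCheck {
    public static void main(String[] args) {
        try {
            Class.forName("com.mysql.jdbc.Driver");
            System.out.println("Driver found on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("Driver NOT found - check the CLASSPATH in setDomainEnv.sh: " + e);
        }
    }
}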

If somebody is looking for a solution for how to add a JDBC driver (in my case PostgreSQL) to a WebLogic 14c installation attached to IntelliJ IDEA: the location of the jar file doesn't matter. It is not necessary to put it into the server's lib directory.

JavaMelody error - Monitoring of sql requests and of jdbc connections in GlassFish v4.1

I want to use JavaMelody to monitor the SQL requests issued by a GlassFish application server. There are step-by-step instructions at https://github.com/javamelody/javamelody/wiki/UserGuideAdvanced#monitoring-of-sql-requests-and-of-jdbc-connections-in-glassfish-v3
I followed the instructions (I didn't download javamelody-objectfactory.jar but used javamelody-core-1.54.0.jar instead) and I get this error when clicking the refresh button on the JavaMelody web page:
server.log:
exception while collecting data
java.lang.NoClassDefFoundError: org/jrobin/core/RrdException
at net.bull.javamelody.Collector.getCounterJRobin(Collector.java:836)
at net.bull.javamelody.Collector.collectJRobinValues(Collector.java:489)
...
Any idea how to resolve this?
jrobin-1.5.9.1.jar is installed in the lib folder of GlassFish (and in my EAR project).
Thanks!
javamelody-objectfactory.jar (Java source included in the jar) and the javamelody-core jar file are completely different things. The first makes the datasource monitorable in GlassFish; the second is the monitoring tool itself.
First, fix the exception. You should probably put the javamelody-core jar and the jrobin jar in your EAR project (and not one in the lib folder of GlassFish and the other in the EAR).
Then, if the monitoring reports don't include SQL monitoring for the datasource declared in GlassFish, use javamelody-objectfactory.jar, following all the steps described in the documentation.
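To see which classloader can (and cannot) load the missing JRobin class, a small check such as the one below can be called from code inside the deployed EAR (for example from a startup servlet). This is only a diagnostic sketch I am adding here, not something from JavaMelody itself:
// Hedged diagnostic sketch: call this from code packaged inside the EAR so the lookup
// goes through the application's classloader rather than the JVM system classloader.
public class JRobinVisibilityCheck {
    public static void check() {
        String[] names = { "org.jrobin.core.RrdException", "net.bull.javamelody.Collector" };
        for (String name : names) {
            try {
                Class.forName(name);
                System.out.println(name + " -> visible");
            } catch (ClassNotFoundException e) {
                System.out.println(name + " -> NOT visible from this classloader");
            }
        }
    }
}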

DBVisualizer and HIVE

I am using DbVisualizer 9.2 and Cloudera 5.4.1.
I want to set up DbVisualizer so that I can query a Hive database from the DbVisualizer tool.
I downloaded the JDBC driver for Hive from here:
http://www.cloudera.com/downloads/connectors/hive/jdbc/2-5-16.html
I extracted all the jar files into /Users/User1/.dbvis/jdbc
But now, when I start DbVisualizer, I get an error:
Ignored as there is no matching Default Driver for "com.cloudera.hive.jdbc41.HS1Driver", "com.cloudera.hive.jdbc41.HS2Driver"
/Users/User1/.dbvis/jdbc
HiveJDBC41.jar
TCLIServiceClient.jar
hive_metastore.jar
hive_service.jar
libfb303-0.9.0.jar
libthrift-0.9.0.jar
log4j-1.2.14.jar
ql.jar
slf4j-api-1.5.11.jar
slf4j-log4j12-1.5.11.jar
zookeeper-3.4.6.jar
So my question is: has anyone successfully configured DbVisualizer to connect to a Cloudera Hive server?
After several hours of troubleshooting, I was able to resolve the error and successfully connect to Hive from DbVisualizer using the Cloudera Hive JDBC driver.
These are the steps I took:
First, go to Tools -> Tool Properties -> Driver finder paths.
Register a new empty directory here; this is where you will put all your jars.
Into this directory, extract all the JAR files that come with the Cloudera JDBC Hive driver:
http://www.cloudera.com/downloads/connectors/hive/jdbc/2-5-4.html
Now go to Tools -> Driver Manager and select Hive. In the "user specified" tab, click the "folder icon" on the right-hand side and select all the jar files you just unzipped (not just the folder... select all the jars).
Make sure you select com.cloudera.hive.jdbc41.HS2Driver.
Now define a connection to Hive using these parameters:
url: jdbc:hive2://foo:10000/default
user: admin
password: admin
When I tried to connect, I still got an error:
"Type: java.lang.reflect.UndeclaredThrowableException"
In order to resolve this, you need to look at the error log (this was the most important step):
Tools -> Debug Window -> Error log
Here I saw that the mysterious "UndeclaredThrowableException" was occurring because a bunch of classes from jars like HTTP utils, HTTP core, Hadoop core, Hive core, and Hive CLI were missing. I downloaded these jars from Maven Central:
hadoop-core-0.20.2.jar
hive-exec-2.0.0.jar
hive-service-1.1.1.jar
httpclient-4.5.2.jar
httpcore-4.4.4.jar
Then I went back into Tools -> Driver Manager -> Hive -> "user specified", clicked the folder icon on the right-hand side, and selected each of these jars as well.
Now, when I restarted DbVisualizer, I connected to Hive just fine and I can query it using DbVisualizer.
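If you want to rule out DbVisualizer itself, the same driver, URL, and credentials can be exercised with a small standalone JDBC test. This is a hedged sketch added for illustration: the host name foo and the admin/admin credentials come from the example above and must be replaced with your own, and the Cloudera and helper jars listed earlier need to be on the classpath.
// HiveJdbcCheck.java - hedged sketch: connects with the Cloudera Hive JDBC 4.1 driver
// using the same URL and credentials configured in DbVisualizer and lists the tables.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcCheck {
    public static void main(String[] args) throws Exception {
        Class.forName("com.cloudera.hive.jdbc41.HS2Driver"); // driver class selected above
        try (Connection con = DriverManager.getConnection(
                "jdbc:hive2://foo:10000/default", "admin", "admin"); // placeholder host/credentials
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}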

Pentaho (5.0.5): Configuring for MySQL

I installed the Pentaho BA Suite 5.0.5 on a Linux platform. Everything works well with the PostgreSQL repository.
I referred to this link to configure MySQL as the repository.
But when I try to configure MySQL for Pentaho, I'm facing errors.
These are the changes I made:
1. Edited /home/pentaho/server/biserver-ee/pentaho-solutions/system/quartz/quartz.properties, line 300:
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.StdJDBCDelegate
2. Edited /home/pentaho/server/biserver-ee/pentaho-solutions/system/hibernate/hibernate-setting.xml, line 15:
system/hibernate/mysql5.hibernate.cfg.xml
3. Edited /home/pentaho/server/biserver-ee/pentaho-solutions/system/applicationContext-spring-security-hibernate.properties:
jdbc.driver=com.mysql.jdbc.Driver
jdbc.url=jdbc:mysql://localhost:3306/hibernate
jdbc.username=hibuser
jdbc.password=password
hibernate.dialect=org.hibernate.dialect.MySQLDialect
4. Copied the audit_sql.xml file from /home/pentaho/server/biserver-ee/pentaho-solutions/system/dialects/mysql5 to /home/pentaho/server/biserver-ee/pentaho-solutions/system
5. Edited the /home/pentaho/server/biserver-ee/pentaho-solutions/system/jackrabbit/repository.xml file and uncommented the SQL configuration
6. Copied the mysql-connector-java-5.1.25-bin.jar file to the tomcat/lib folder
7. Made changes in the /home/pentaho/server/biserver-ee/tomcat/webapps/pentaho/META-INF/context.xml file:
in the jdbc/Hibernate section:
driverClassName="com.mysql.jdbc.Driver" url="jdbc:mysql://localhost:3306/hibernate"
validationQuery="select 1" />
in the jdbc/Quartz section:
driverClassName="com.mysql.jdbc.Driver" url="jdbc:mysql://localhost:3306/quartz"
validationQuery="select 1" />
I'm facing these errors:
1. In the pentaho.log file: EmbeddedQuartzSystemListener.ERROR_0001 - Scheduler was not properly initialized at startup
2. In the Pentaho User Console, the loading symbol remains forever without displaying any files.
3. I'm not able to save reports.
That might be very simple: the Pentaho platform does not ship with the MySQL JDBC driver by default, due to licensing issues.
So you have to obtain the MySQL JDBC driver manually and put it into the web server's lib folder.
For Tomcat (the default installation) that will be the /servers/.../tomcat/lib folder.
In this particular case that may be
/home/pentaho/server/biserver-ee/tomcat/lib
One more piece of advice is to check the full log under
/home/pentaho/server/biserver-ee/logs
That is the main place where the Pentaho platform keeps its log information.
Hope it helps.
By the way, there is a pretty good Pentaho info portal about configuring the Pentaho platform:
http://infocenter.pentaho.com/help/nav/2_3
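If in doubt whether Tomcat can actually reach MySQL with the configured URL and credentials, a quick standalone check outside Pentaho can narrow things down. The sketch below is only illustrative: it reuses the hibernate URL and user from the question, assumes mysql-connector-java-5.1.25-bin.jar is on the classpath, and the quartz schema can be tested the same way with its own credentials.
// MySqlCheck.java - hedged sketch: verifies that the MySQL driver loads and that the
// hibernate database from the question is reachable with the configured user.
import java.sql.Connection;
import java.sql.DriverManager;

public class MySqlCheck {
    public static void main(String[] args) throws Exception {
        Class.forName("com.mysql.jdbc.Driver");
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/hibernate", "hibuser", "password")) {
            System.out.println("Connected: " + con.getMetaData().getDatabaseProductVersion());
        }
    }
}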
Make sure you are pointing to Java 7 in your JAVA_HOME. When your JAVA_HOME points to Java 8, the BI Server will not start correctly.
This is a Pentaho bug. Create a table in the quartz database of MySQL like this PostgreSQL table:
CREATE TABLE "QRTZ"
(
name character varying(200) NOT NULL,
CONSTRAINT "QRTZ_pkey" PRIMARY KEY (name)
)
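If you prefer to apply the workaround from code rather than from a MySQL client, a rough JDBC sketch of the equivalent MySQL statement could look like the following. The table shape is taken from the PostgreSQL definition above; the quartz URL and credentials are placeholders for whatever your quartz configuration actually uses.
// CreateQrtzTable.java - hedged sketch: creates a MySQL equivalent of the PostgreSQL
// "QRTZ" table shown above inside the quartz database.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateQrtzTable {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/quartz", "quartz_user", "quartz_pass"); // placeholders
             Statement st = con.createStatement()) {
            st.executeUpdate("CREATE TABLE QRTZ (name VARCHAR(200) NOT NULL, PRIMARY KEY (name))");
        }
    }
}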

How to define MySQL data source in TomEE?

Platform: TomEE Web Profile 1.5.0.
I am trying to do a very basic thing: set up a data source for MySQL. I have read the official guide (http://openejb.apache.org/configuring-datasources.html). It asks us to enter a Resource element in openejb.xml, but I cannot find that file anywhere in tomee-webprofile-1.5.0. I read in other places that I could use tomee.xml for the same purpose, so I added this to my conf/tomee.xml:
<Resource id="TestDS" type="DataSource">
    JdbcDriver com.mysql.jdbc.Driver
    JdbcUrl jdbc:mysql://localhost/test
    UserName root
    Password some_pass
</Resource>
I copied the MySQL driver JAR to the tomee/lib folder.
I wrote this code (showing snippets here):
@Resource(name="TestDS")
DataSource ds;
Connection con = ds.getConnection();
PreparedStatement ps = con.prepareStatement("select * from UserProfile");
The prepareStatement() call is throwing this exception:
java.sql.SQLSyntaxErrorException: user lacks privilege or object not found: USERPROFILE
at org.hsqldb.jdbc.Util.sqlException(Unknown Source)
at org.hsqldb.jdbc.Util.sqlException(Unknown Source)
Why is the system using the HSQLDB driver? In fact, no matter what I use as the name for @Resource, I get the same exception.
What am I doing wrong? I am starting TomEE from Eclipse, if that makes any difference.
I have tracked down the root cause. The problem happens only when I start TomEE from Eclipse. If I start it from the command line, my data source definition works just fine.
It appears that when I run TomEE from Eclipse, it uses configuration files from /.metadata/.plugins/org.eclipse.wst.server.core/tmp0/conf. To change this, I had to take these steps in Eclipse:
Remove all deployed projects from the server.
Open the server settings and, under "Server Locations", choose "Use Tomcat installation". This section is greyed out if at least one project is still deployed to the server, so make sure you have completed step 1.
Restart the server and redeploy the application. Now my application finds the data source.
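To confirm which database the injected DataSource actually resolves to (the MySQL one from tomee.xml or TomEE's default HSQLDB), a small check like the one below can be dropped in next to the injection point. It is only a diagnostic sketch added here for illustration:
import java.sql.Connection;
import java.sql.SQLException;
import javax.sql.DataSource;

// Hedged diagnostic sketch: prints the product and URL behind the injected DataSource,
// e.g. "MySQL" for TestDS from tomee.xml versus "HSQL Database Engine" for the fallback.
public class DataSourceInfo {
    public static void print(DataSource ds) throws SQLException {
        try (Connection con = ds.getConnection()) {
            System.out.println(con.getMetaData().getDatabaseProductName()
                    + " at " + con.getMetaData().getURL());
        }
    }
}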
Normally, the installation is explained here: http://tomee.apache.org/tomee-and-eclipse.html
[I would make this a comment on RajV's answer, but I do not have enough reputation to do so.]
Platform: TomEE 1.6.0 Web Profile, eclipse-jee-kepler-SR2-linux-gtk-x86_64 and OpenJDK 1.7.0_51.
After doing the steps in http://tomee.apache.org/tomee-and-eclipse.html (including "Workspace Metadata Installation") I got the same error, "user lacks privilege or object not found".
My solution was to run:
$ ln -s [workspace_path]/Servers/tomee.xml \
[workspace_path]/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/conf/
The advantage of this solution is that TomEE in Eclipse always uses the current version of Workspace/Servers/tomee.xml without any further manual steps.
For me, a better solution is to put the tomee.xml file in your WTP server directory (/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/conf) and define your datasource there.

Maven deploy fails for Apache Archiva

I have a Maven project which generates a 413.06 KB jar file. I have to deploy it to an Apache Archiva based managed repository. I have tried to deploy different versions, and it created the required layout and structure and uploaded some files; it even uploaded the jar at around 200 KB. Every time, the uploaded jar size changes, but it always fails to upload the full 413.06 KB jar file.
Information:
I am running standalone Archiva.
I have given the guest account the Global Repository Manager & "Repository Manager - MYREPO" roles.
I have also tried a separate account in Archiva with "Repository Manager - MYREPO" rights and configured it in Maven's settings.xml file to set a custom timeout.
I am getting the following error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.7:deploy
(default-deploy) on project SharedshelfRepository: Error deploying artifact: Transfer error:
The server did not respond within the configured timeout. -> [Help 1]
That might be a maven-deploy-plugin issue; the resources plugin itself needs several dependencies. Try manually jar nad p
What version of Maven are you using? You might try 3.0.4, as it has a different HTTP library. I'm also not sure if there's more context for what was happening when it timed out (it seems more request-oriented than deploy-oriented, and deploy does request some metadata).
I can't see why you'd need to alter the timeout, as none of the defaults should apply to such a small file. How long does it take to fail?