How to connect to Odoo data with the JDBC API via Talend?

I want to extract data from an Odoo database hosted in a cloud environment. I tried this JDBC URL:
jdbc:odoo:User=MyUser;Password=MyPassword;URL=http://MyOdooSite/;Database=MyDatabase;
But I didn't get any results. Do you have any idea, please?
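One way to narrow this down is to test the URL outside Talend first. Below is a minimal Java sketch of such a test; it assumes a third-party Odoo JDBC driver (for example the CData one) whose JAR is on the classpath and which accepts the URL format shown above, and the table name used is only an illustrative placeholder.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class OdooJdbcTest {
    public static void main(String[] args) throws Exception {
        // URL format copied from the question; support for it depends on the Odoo JDBC driver you use
        String url = "jdbc:odoo:User=MyUser;Password=MyPassword;"
                   + "URL=http://MyOdooSite/;Database=MyDatabase;";
        try (Connection con = DriverManager.getConnection(url);
             Statement st = con.createStatement();
             // "res_partner" is a common Odoo model, used here only as an example table name
             ResultSet rs = st.executeQuery("SELECT id, name FROM res_partner")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1) + " " + rs.getString(2));
            }
        }
    }
}

If a standalone test like this works, the same driver JAR and URL can then be configured in Talend's generic JDBC components (tJDBCConnection / tJDBCInput); if it fails, the problem is in the driver or URL rather than in Talend.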

Related

Connecting Pentaho to Apache Druid

Can anybody let me know the steps to connect to Apache Druid using Pentaho, please?
I tried the following steps:
1. Copy the Avatica JAR file to tomcat/lib.
2. Create a new Generic Database (JDBC) connection using the following parameters:
MSTR_JDBC_JAR_FOLDER=/opt/mstr/MicroStrategy/install/JDBC;
DRIVER=org.apache.calcite.avatica.remote.Driver;
URL={jdbc:avatica:remote:url=http://localhost:8082/druid/v2/sql/avatica/};
Notes:
1. User ID/password authentication is not enabled, i.e. Druid can be accessed without login credentials.
2. I am able to access the Druid console at http://localhost:8888/unified-console.html.
3. I am able to connect to Druid from Apache Superset and Tableau.
In your URL
jdbc:avatica:remote:url=http://localhost:8082/druid/v2/sql/avatica/
try replacing the http scheme with druid, like I did below; maybe it works:
jdbc:avatica:remote:url=druid://localhost:8082/druid/v2/sql/avatica/
Also, please post an update on whether you were able to connect Druid to Pentaho.
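Independently of the URL scheme, a plain Java program using the same Avatica driver can confirm whether the broker endpoint is reachable at all. A minimal sketch, assuming the Avatica client JAR is on the classpath and that a datasource named "wikipedia" exists (that table name is only a placeholder):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DruidAvaticaTest {
    public static void main(String[] args) throws Exception {
        // Same driver class used in the Pentaho connection parameters above
        Class.forName("org.apache.calcite.avatica.remote.Driver");
        String url = "jdbc:avatica:remote:url=http://localhost:8082/druid/v2/sql/avatica/";
        try (Connection con = DriverManager.getConnection(url);
             Statement st = con.createStatement();
             // "wikipedia" is only a placeholder datasource name
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM \"wikipedia\"")) {
            while (rs.next()) {
                System.out.println("row count: " + rs.getLong(1));
            }
        }
    }
}

If this works from the command line but the Pentaho connection still fails, the issue is likely in how the connection parameters are entered in Pentaho rather than in Druid itself.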

simple-jndi/jdbc.properties is ignored in pentaho-server 8.2

If we need to perform queries in Pentaho Data Integration (the IDE), we need to manually add the datasource in simple-jndi/jdbc.properties:
MyDatabase/type=javax.sql.DataSource
MyDatabase/driver=org.hsqldb.jdbcDriver
MyDatabase/url=jdbc:hsqldb:hsql://localhost/sampledata
MyDatabase/user=pentaho_admin
MyDatabase/password=password
This works as expected in the IDE, known as Pentaho Data Integration, Spoon, or Kettle.
But the same thing does not work in Pentaho Server 8.2.
Steps to reproduce the error:
1. Deploy or upload the transformation (.ktr) to Pentaho Server 8.2.
2. Manually add the datasource on the server in /../pentaho-server/pentaho-solutions/system/simple-jndi/jdbc.properties.
3. Execute the transformation using the Pentaho Server web console (run in background or schedule options).
Result: error: datasource not found.
Alternative
Create the datasource manually using the Pentaho Server web console instead of manually modifying the file /../pentaho-server/pentaho-solutions/system/simple-jndi/jdbc.properties.
Question
Does simple-jndi/jdbc.properties work on the server, or is it just for development purposes (PDI)?
Are the settings present in the jdbc.properties file on the server? You can verify this by comparing it against the Pentaho Data Integration > simple-jndi folder.
Also, are you able to connect to the database from the server, perhaps using a database client, to confirm?
NOTE: Whenever you work with databases, make sure you have the relevant driver library files needed to connect to the respective database.
From my personal experience, I was not able to make the server pick up JNDI connection definitions from the simple-jndi/jdbc.properties file when the Pentaho Server was running with Tomcat.
I was only able to use JNDI on the Pentaho Server by defining JNDI Datasources in the Tomcat configuration files.
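For reference, this is a minimal sketch of the kind of Resource entry I mean, placed in the Tomcat context configuration (for example tomcat/webapps/pentaho/META-INF/context.xml). The names and values simply mirror the jdbc.properties example from the question, and the exact file and attributes may vary by Tomcat and Pentaho version:

<Resource name="jdbc/MyDatabase" auth="Container" type="javax.sql.DataSource"
    driverClassName="org.hsqldb.jdbcDriver"
    url="jdbc:hsqldb:hsql://localhost/sampledata"
    username="pentaho_admin" password="password" />

Depending on the version, a matching resource-ref entry in the web application's web.xml may also be needed before the JNDI name becomes visible to Pentaho, so check the documentation for your specific release.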

Google Apps Script - MySQL 8 - JDBC Connection Failed

I'm trying to connect to a self-hosted MySQL 8 instance from a Google AdWords script (effectively an Apps Script) using the Jdbc utility over SSL. My code follows the specifications in these answers here and here.
No matter what I do, I still get the following error:
ERROR: Failed to establish a database connection. Check connection string, username and password.
It turns out that the Jdbc utility only works with MySQL versions up to 5.7.
I thought I would post this here for posterity, in case it saves someone else some time.
The issue has been raised here and was marked as fixed recently.
Other related issues can be searched for here.
It seems like Google has updated the JDBC connector. It is working for me now.

Schema loads in Pentaho BI Server 5 but cube is not displayed

I'm running Schema Workbench 3.6.1, Pentaho BI Server 5.0.1, and Saiku Analytics installed from the marketplace, and I successfully publish from Schema Workbench. (I also have the datasource present and working in both Schema Workbench and the BI Server.)
When I go to the BI Server, the analysis file created by the workbench is present, but the cube does not appear in Saiku Analytics; when I refresh, only the two test cubes are shown. What am I doing wrong?
I appreciate your help.
Check your pentaho.log and catalina.out files; there is most likely an error in the schema (a bad column or table name, a wrong data type on a level, or something similar).

How to access database of JasperReports Server via API?

Can we access the database of a JasperReports Server through its API?
And does that allow us to generate reports at run time?
If so, how can we do that?
You can access JasperServer's resources using its SOAP API. A Java client can be found in jasperserver-common-ws-[VERSION].jar.
Examples of usage are in the doc folder of JasperServer as well. The document you need is the "JasperServer Web Services Guide".
Regards.
Here's a link to a page which has a full example of how to write a C program to call an existing Jasper report from a remote client.
http://community.ingres.com/wiki/Jaspersoft_WebServices_C_API
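For more recent JasperReports Server versions, the REST v2 API is usually simpler than the SOAP web services for running a report at request time. A minimal Java sketch, assuming the server runs at http://localhost:8080/jasperserver and that /reports/samples/AllAccounts is an existing report unit (the host, path, and credentials are all placeholders):

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.Base64;

public class JasperRestExample {
    public static void main(String[] args) throws Exception {
        // Run an existing report unit and download it as PDF via the REST v2 reports service
        String reportUri = "/reports/samples/AllAccounts"; // placeholder report path
        URL url = new URL("http://localhost:8080/jasperserver/rest_v2/reports" + reportUri + ".pdf");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        // Placeholder credentials, sent as HTTP Basic auth
        String auth = Base64.getEncoder().encodeToString("jasperadmin:jasperadmin".getBytes("UTF-8"));
        con.setRequestProperty("Authorization", "Basic " + auth);
        int status = con.getResponseCode();
        if (status == 200) {
            try (InputStream in = con.getInputStream()) {
                Files.copy(in, Paths.get("AllAccounts.pdf"), StandardCopyOption.REPLACE_EXISTING);
            }
        }
        System.out.println("HTTP status: " + status);
    }
}

Report input controls can be passed as query parameters on the same URL, which is what makes run-time report generation possible through this interface.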
Have a look at the following wiki.
It shows you how to set up a Microsoft SQL Server as a JDBC DataSource on a Jasper Server.