When I use Tableau's "Other Databases (JDBC)" connector to connect to HDP 3.1 Hive I get this error, but creating an extract works:
An error occurred while communicating with the data source.
Bad Connection: Tableau could not connect to the data source.
com.tableausoftware.jdbc.TableauJDBCException: Error reading metadata for prepared query: SELECT *
FROM (
select * from dim_boxinfo
) Custom_SQL_Query
LIMIT 1
Method not supported
There was a Java error.
When connecting to a data source, Tableau attempts many types of queries in cascading fashion to determine the capabilities of the connection. This looks like a case where one of those query types fails, yet that query is not necessary for creating an extract.
Tableau's documentation on customizing JDBC connections may help, though I do not know the settings well enough to say which might suppress that failure. (ODBC connection customization has been around for a long time and might offer some clues as to what is possible.)
When creating a BigQuery data connector for Google Data Studio, my query works until I attempt to parameterize some fields. As soon as I add parameters, I get the unhelpful and unspecific error:
The query returned an error.
Error ID: xyz
How can I figure out what the underlying issue is that is causing this problem?
1. Check BigQuery Logs in Cloud Logging
If there is an error executing a query in BigQuery, the underlying cause will likely be visible in Cloud Logging. In Cloud Logging, execute this query to show these errors, and hopefully get insight into the underlying problem:
resource.type="bigquery_resource"
severity=ERROR
It's possible these logs will show that the query is failing because the format of certain data is invalid. In that case, the likely cause is that the parameters have no default values, which prevents the BigQuery query from succeeding. If so:
2. Give Parameters Default Values
The connector passes the query to BigQuery, which executes it. In order for this to work correctly, the parameters need to have some values. Provide them in the form of parameter default values that will result in a valid query.
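As a sketch of step 2 (the project, dataset, field, and parameter names here are made up for illustration), a parameterized custom query references each parameter with an `@` prefix, and each parameter needs a default value in the connector's parameter settings so the generated query is valid:

```sql
-- @min_date is a hypothetical report parameter; give it a default
-- value such as '2024-01-01' in the data source's parameter settings
SELECT order_id, order_total
FROM `my_project.my_dataset.orders`
WHERE order_date >= PARSE_DATE('%Y-%m-%d', @min_date)
```

With a default in place, the connector can validate the query even before a report viewer supplies a value.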
Somebody tell me I'm not crazy. I have SAS on a server, and I'm running the following code:
data wtf;
    a = ".123456 1 1";
    b = input(a, anydtdtm.);
run;
If I run this on my local computer, no problem. If I run this on the server, I get:
ERROR: An exception has been encountered.
Please contact technical support and provide them with the following traceback information:
The SAS task name is [DATASTEP]
ERROR: Read Access Violation DATASTEP
Exception occurred at (04E0AB8C)
Task Traceback
Address Frame (DBGHELP API Version 4.0 rev 5)
0000000004E0AB8C 0000000009C4EC20 sasxdtu:tkvercn1+0x9B4C
0000000004E030D9 0000000009C4F100 sasxdtu:tkvercn1+0x2099
0000000005FF14BE 0000000009C4F108 uwianydt:tkvercn1+0x47E
0000000002438026 0000000009C4F178 tkmk:tkBoot+0x162E6
Does anyone else get this error?
This is an internal bug that cannot be resolved by the user. You'll need to send this information, a description of your environment, and the exact steps to reproduce the bug to SAS Technical Support so they can open an investigation and determine a workaround.
If your server is a database rather than a set of .sas7bdat files, the error might be due to the SAS/ACCESS engine attempting to translate the function into something the server's language can understand but failing to do so properly; that is, it might think it is translating correctly, but it isn't. There are special cases where this can occur, and you may have discovered one.
If you are in fact querying some other database, try adding this before running the data step:
options sastrace=',,,d' sastraceloc=saslog;
This will show all of the steps as SAS sends data & functions to and from the server, and may help give some insight.
I am getting the same error on a Linux system running SAS 9.4:
AUTOMATIC SYSSCP LIN X64
AUTOMATIC SYSSCPL Linux
AUTOMATIC SYSVER 9.4
AUTOMATIC SYSVLONG 9.04.01M3P062415
AUTOMATIC SYSVLONG4 9.04.01M3P06242015
Until SAS can fix the informat, you probably need to add additional testing in your code to exclude strange values like that.
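A defensive sketch of that idea (the leading-period check is just one possible guard, chosen because it matches the value that triggered the crash here):

```sas
data safe;
    a = ".123456 1 1";
    /* guard: skip values starting with a bare period, which crashed
       the anydtdtm. informat on this host; otherwise try the informat */
    if prxmatch('/^\s*\./', a) then b = .;
    else b = input(a, anydtdtm.);
run;
```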
While refreshing Webi report I am getting an error:
A database error occured. The database error text is: (CS) "Unexpected behavior" . (WIS 10901)
All the objects parse in the universe and the server is also responding. What could be the possible reason?
We are also able to run the query in the database using a database client tool.
If the error message appears after a long time, it might just be a timeout issue.
Otherwise, you could try importing a version of the report that works from the CMS to your local drive, renaming it, and running it again.
It can be caused by a special character in the data combined with server language settings that do not foresee such a character, so Business Objects cannot parse it for presentation.
If that is the case, you might need to set an environment variable on the server (such as NLS_LANG) to a value that lets Business Objects handle those special characters in your data.
In my situation, the error appears when some object from the database has changed or no longer exists. You need to delete this object from the universe, or make sure the field exists in the database with the same name and type.
I had the same problem with my reports. After a couple of hours of investigation, I found the cause.
I had created an object in my universe and given it the data type Number when the value in the database was of type Character.
This threw an Oracle error (ORA-01722) and a Business Objects error (WIS 10901), even though the SQL copied from the report-creation interface and executed directly against the database returned the proper data.
I'm looking for a way to programmatically execute arbitrary SQL commands against my DB.
(Hibernate, JPA, HSQL)
EntityManager.createNativeQuery() doesn't work for things like CREATE TABLE.
After a lot of searching, I thought I could use Hibernate's Session.doWork().
Using the deprecated Configuration.buildSessionFactory() seems to show that doWork won't work:
I get "user lacks privilege or object not found" for all the CREATE TABLE statements.
So, what other technique is there for executing arbitrary SQL statements?
There were some notes on using the underlying JDBC Statement, but I haven't figured out how to get a JDBC Connection object from Hibernate to try that.
Note that the hibernate.hbm2ddl.auto=create setting will NOT work for me, as I have ARRAY[] columns which it chokes on.
I don't think there is any problem executing a CREATE TABLE statement with a Hibernate native query. Just make sure to use Query.executeUpdate(), not Query.list() or Query.uniqueResult().
If it doesn't work, please tell us what happens when you execute it, and attach the full stack trace of the exception and the SQL query you're executing.
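For reference, a minimal sketch of running DDL through Session.doWork() (Hibernate 5+, where Work is a functional interface; the table definition is made up):

```java
// 'session' is an open org.hibernate.Session obtained from your
// SessionFactory (or by unwrapping the JPA EntityManager)
session.doWork(connection -> {
    try (java.sql.Statement st = connection.createStatement()) {
        // plain JDBC, so DDL that Hibernate's schema tooling rejects
        // (e.g. HSQLDB ARRAY columns) can be issued directly
        st.executeUpdate(
            "CREATE TABLE scores (id INT PRIMARY KEY, vals INT ARRAY)");
    }
});
```

doWork() hands you the Session's underlying java.sql.Connection, which also answers the question of how to get a JDBC Connection object from Hibernate.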
"user lacks privilege or object not found" in HSQLDB may mean almost anything, for example the existence of a table with the same name. Error messages in HSQLDB are often misleading. Try listing your tables using DatabaseMetaData; you have probably already created the table.
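A quick sketch of that check over a plain JDBC connection (the open connection variable is assumed):

```java
// list the tables HSQLDB actually knows about
java.sql.DatabaseMetaData meta = connection.getMetaData();
try (java.sql.ResultSet rs =
         meta.getTables(null, null, "%", new String[] {"TABLE"})) {
    while (rs.next()) {
        System.out.println(rs.getString("TABLE_NAME"));
    }
}
```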
I have a big database on an MS SQL Server that contains data indexed by a web crawler.
Every day I want to update the SOLR search-engine index using the DataImportHandler, which is located on another server in another network.
Solr's DataImportHandler uses a query to get data from SQL Server. For example, this query:
SELECT * FROM DB.Table WHERE DateModified > Config.LastUpdateDate
The ImportHandler does 8 selects of this type. Each select gets around 1000 rows from the database.
To connect to SQL Server I am using com.microsoft.sqlserver.jdbc.SQLServerDriver.
The parameters I can add for connection are:
responseBuffering="adaptive/all"
batchSize="integer"
So my question is:
What can go wrong while running these queries every day? (except network errors)
I want to know how SQL Server behaves in this context.
Furthermore, I have to make a decision about how I will implement this import and how to handle errors, but first I need to know what errors can arise.
Thanks!
Later edit
My problem is that I don't know how these SQL queries can fail. When I call this importer every day it runs 10 queries against the database. If the 5th query fails I have two options:
roll back the entire transaction and do it again, or commit the data I got from the first 4 queries and somehow redo queries 5 to 10. But if those queries always fail because of some other problem, I need to think of another way to import this data.
Can these SQL queries over the internet fail because of timeouts or something like that?
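The second option above ("commit what succeeded, redo the rest") can be sketched as a small helper. The query ids and the runner callback are made up for illustration, with the runner standing in for "execute delta query N and commit its rows":

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

class ImportRetry {
    // Runs each delta query id; returns the ids still failing after
    // maxAttempts rounds. Queries that succeed are committed (by the
    // runner) and never retried, mirroring the per-query-commit option.
    static List<Integer> runWithRetry(List<Integer> queryIds,
                                      Predicate<Integer> runner,
                                      int maxAttempts) {
        List<Integer> pending = new ArrayList<>(queryIds);
        for (int attempt = 0; attempt < maxAttempts && !pending.isEmpty(); attempt++) {
            List<Integer> stillFailing = new ArrayList<>();
            for (int id : pending) {
                if (!runner.test(id)) {      // false = this query failed
                    stillFailing.add(id);
                }
            }
            pending = stillFailing;          // only retry the failures
        }
        return pending;  // non-empty => fall back to another strategy
    }
}
```

Whatever ids come back non-empty after the retry budget is exhausted are the ones that need a different import strategy (or human attention).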
The only problem I identified after working with this type of import is:
Network problems: if the network connection fails, SOLR rolls back any changes and the commit doesn't take place. In my program I identify this as an error and don't log the changes in the database.
Thanks @GuidEmpty for providing his comment and clarifying this for me.
There could be issues with permissions (not sure if you control these).
It might be a good idea to catch the exceptions you can think of and include a catch-all (Exception exp).
Then take the overall one as the worst case, roll back (where you can), and log the exception to review later on.
You don't say what types you are selecting either; keep in mind that text/blob columns take a lot more space and could cause issues internally if you buffer the data.
Though on a quick re-read, you don't need to roll back if you are only selecting.
I think you would be better off having a think about what you are hoping to achieve and whether knowing all possible problems will help.
HTH