Tableau data source with Athena custom query

I'm getting the error below in Tableau Desktop when using a custom query. I can connect successfully and see the table's contents when I drag the table directly into the query builder section of Tableau Desktop.
Datasource used: AWS Athena
Driver version: AthenaJDBC42_2.0.2
Tableau Desktop version: 10.4
com.tableausoftware.jdbc.TableauJDBCException: Error reading metadata for executed query: SELECT * FROM ( select * from tablename ) "TableauSQL" LIMIT 0 [Simba][AthenaJDBC](100071) An error has been thrown from the AWS Athena client. Only one sql statement is allowed. Got: SELECT * FROM ( select * from tablename ;) "TableauSQL" LIMIT 0
There was a Java error.
Unable to connect to the server "Athena.us-east-1.amazonaws.com". Check that the server is running and that you have access privileges to the requested database.

Try removing the ; from the subquery; that always throws errors for me.

I was having the same issue.
To provide some clarity to the above: the outer query is being added by Tableau around my own simple query, which produced the same error.
I was able to resolve it by changing my query to refer to my table in the form "database"."table" and removing the closing ;.
A custom query of that form then worked for me.
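A minimal sketch of the working form, with placeholder database and table names:

select * from "mydatabase"."mytable"

Note there is no trailing semicolon: Tableau wraps the custom query in an outer SELECT, so an embedded ; triggers the "Only one sql statement is allowed" error.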

Related

Hive SQL Select is not working with multiple AND criteria, showing error: The operator 'AND' accepts at least 2 argument

I am trying to run a very simple query that just selects all rows matching multiple criteria, all of them inclusive, i.e. I am using AND in the HiveQL SELECT statement. The table is an external table in Hive whose storage handler is Phoenix. I checked the same query in Phoenix as well and it works fine there, but in Hive it throws a Java IO exception, and I cannot figure out where I am going wrong. The query I am using is:
SELECT * FROM msme_api.msme_jk WHERE gender='Male' AND socialcategory='General';
The complete error message is:
Error: java.io.IOException: org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException: The operator 'AND' accepts at least 2 argument. (state=,code=0)
I have tried both external and internal Hive tables; the issue is the same in both. Surprisingly, though, the query works when I add an ORDER BY clause or use OR instead:
SELECT * FROM msme_api.msme_jk WHERE gender='Male' AND socialcategory='General' order by aid;
SELECT * FROM msme_api.msme_jk WHERE gender='Male' OR socialcategory='General';
Both work fine, but with AND I get the error.
I am still confused about how Hive is processing these queries and why I cannot execute a simple SELECT statement. Any help will be appreciated.
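If the Phoenix storage handler is mishandling the pushed-down AND predicate, one untested workaround sketch is to disable predicate pushdown so that Hive evaluates the filter itself:

-- Untested sketch: turn off predicate pushdown for this session so the
-- AND filter is evaluated by Hive rather than by the storage handler.
SET hive.optimize.ppd=false;
SELECT * FROM msme_api.msme_jk WHERE gender='Male' AND socialcategory='General';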

Unable to access temp tables in Azure SQL Database

Using the following code, I have created a temp table in Azure SQL Database:
CREATE TABLE ##UpsertTempTable (
eno varchar(25),
ename varchar(25)
);
and I want to check the data using the query below:
select * from ##UpsertTempTable
Ideally it should run without any issue, as it does throughout the Azure documentation, but unfortunately it is not working and gives the error below.
Error: Failed to execute query. Error: Invalid object name '##UpsertTempTable'.
I tried looking for a solution everywhere on the internet but could not find any relevant documentation for this issue.
I tried this in the Query Editor (Preview) in the Portal, and the create-temporary-table code doesn't work there. I tried both ##UpsertTempTable and #UpsertTempTable.
For example, when we run the CREATE TABLE code, no error happens, but when you then run select * from ##UpsertTempTable, the Query Editor gives the 'Invalid object name' error.
I also tried with SSMS v17.9 and SSMS v18.1, and everything is OK there.
What I think is that the Query Editor doesn't support temporary tables well.
I have asked Azure Support and am waiting for their reply; I will update this answer when I hear back.
Update:
Azure Support replied:
"This is by design; the temp tables exist as long as the connection is open. The way the portal Query Editor is currently designed, the connection is killed, resulting in the temp table being deleted."
Hope this helps.
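Given that explanation, one workaround sketch (untested, and assuming the editor keeps a single connection for a single batch) is to create, populate, and read the temp table in the same batch:

-- Run all three statements together as one batch on one connection;
-- the temp table only lives as long as that connection. Sample row is made up.
CREATE TABLE ##UpsertTempTable (
    eno varchar(25),
    ename varchar(25)
);
INSERT INTO ##UpsertTempTable (eno, ename) VALUES ('1001', 'Alice');
SELECT * FROM ##UpsertTempTable;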

Query runs but fails to create a view in BigQuery

I would like to create a view using multiple tables. While creating the view I get "Failed to save view: column not found, do you mean that column?", but if I run the query itself, I get data. Can you please help resolve this issue?
Most likely you are missing the project(s) that should be specified in your query.
You should fully qualify tables as
[project:dataset.table] if you are using Legacy SQL or
`project.dataset.table` if you are using Standard SQL
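For example, with placeholder project, dataset, and table names:

#legacySQL
SELECT * FROM [myproject:mydataset.mytable]

#standardSQL
SELECT * FROM `myproject.mydataset.mytable`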

Why is Spark SQL adding WHERE 1=0 during load?

I am pretty new to Spark. I have a task to fetch 3M records from SQL Server through the Denodo data platform and write them to S3. On the SQL Server side it is a view over a join of two tables, and the view is time-consuming to execute.
Now I am trying to run a spark command as:
val resultDf = sqlContext.read.format("jdbc")
  .option("driver", "com.denodo.vdp.jdbc.Driver")
  .option("url", url)
  .option("dbtable", "myview")
  .option("user", user)
  .option("password", password)
  .load() // .load() is required to actually build the DataFrame
I can see that Spark is sending a query like:
SELECT * FROM myview WHERE 1=0
And this portion alone is taking more than an hour.
Can anyone please tell me why this WHERE clause is being appended?
Thanks.
If I'm understanding your issue correctly, Spark is sending SELECT * FROM myview WHERE 1=0 to the Denodo Server.
If that is the case, that query should be detected by Denodo as a query with no results due to the incompatible condition in the WHERE clause, and its execution should be instantaneous. You can try executing the same query in Denodo's VQL Shell (available in version 6), Denodo's Administration Tool, or any other ODBC/JDBC client to validate that the query is not even sent to the data source. Maybe Spark is executing that query in order to obtain the output schema first?
What version of Denodo are you using?
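For context, the WHERE 1=0 pattern is a common way for JDBC clients to fetch a result set's column metadata without transferring any rows. A minimal sketch of the idea (assumed behavior, not Spark's exact internals, reusing the url, user, and password values from the question):

import java.sql.DriverManager

// Run the query with a contradictory predicate and read only the
// ResultSet metadata; no rows are ever transferred.
val conn = DriverManager.getConnection(url, user, password)
val meta = conn.createStatement()
  .executeQuery("SELECT * FROM myview WHERE 1=0")
  .getMetaData
for (i <- 1 to meta.getColumnCount)
  println(s"${meta.getColumnName(i)}: ${meta.getColumnTypeName(i)}")
conn.close()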
I see this is an old thread; however, we are experiencing the same issue, though it does not occur all of the time, nor on all connections/queries.
A SQOOP command is sent, and the AND (1=0) context ('i18n' = 'us_est') is added somewhere. We are using Denodo 7 with JDBC driver com.denodo.vdp.jdbc.Driver:
select
  BaseCurrencyCode, BaseCurrencyName, TermCurrencyCode, TermCurrencyName,
  ExchangeAmount, AskRate, BidRate, MidMarketRate, ExchangeRateStartDate,
  ExchangeRateEndDate, RecCreateDate, LastChangeDate
from
  CurrencyExchange
WHERE
  LastChangeDate > '2020-01-21 23:20:15'
  And LastChangeDate <= '2020-01-22 03:06:19'
  And (1 = 0) context ('i18n' = 'us_est')

Specifying database other than default with Impala JDBC driver

I'm using the Impala JDBC driver (or I guess it's actually the Hive Server 2 JDBC driver). I have a view created in another database -- let's call it "store55".
Let's say my view is defined as follows:
CREATE VIEW good_customers AS
SELECT * from customers WHERE good = true;
When I try to query this view over JDBC as follows:
SELECT * FROM store55.good_customers LIMIT 10
I get an error such as:
java.sql.SQLException: AnalysisException: Table does not exist: default.customers
Ideally, I'd like to specify the database name somewhere in the JDBC URL or as a parameter but when I try to use this JDBC url, I still get the same error:
jdbc:hive2://<host>:<port>/store55;auth=noSasl
Does the Hive2 JDBC driver just ignore the database part of the URL and assume all queries are executed against the default database?
The only way I was able to get the query to return was to change the view definition itself to include the database name:
CREATE VIEW good_customers AS
SELECT * from store55.customers WHERE good = true;
However, I'd like to keep the view definition free of database names.
Thanks!
You might want to issue a "use xxxxx;" statement (with your database name) through JDBC before querying.
Also, if you are already using the database, try the "invalidate metadata" statement.
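A quick sketch of those suggestions, using the names from the question:

-- Select the database for the session, then query unqualified names.
USE store55;
SELECT * FROM good_customers LIMIT 10;

-- If the table metadata is stale (Impala only):
INVALIDATE METADATA;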
Is the URL jdbc:hive2://<host>:<port>/store55;auth=noSasl correct?
Can you run a few diagnostics, such as:
SHOW TABLES, to ensure that the view was created in store55?
Are you using the USE <database> command in the DDLs?