I'm using GROUP_CONCAT successfully in one of my sheets in Tableau. It performs as needed and everything works.
When I publish the workbook to Tableau Server, however, I get this error:
An unexpected error occurred. If you continue to receive this error
please contact your Tableau Server Administrator.
The Google BigQuery service was unable to compile the query. Function
not found: GROUP_CONCAT at [1:408] 2017-02-06 11:50:35.854,
(WJhi5wrG0e4AACIU#woAAAHo,0,0)
According to this SO post, I should use STRING_AGG instead.
However, if I try to use this in Tableau, it isn't recognized as a valid function.
How could I fix this?
You need to use STRING_AGG in your Tableau -> BigQuery data source (e.g. in a Custom SQL query), not as a calculated field.
BigQuery Standard SQL will then understand the query, and the result can be consumed in Tableau.
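As a rough sketch, a Custom SQL query along these lines (table and column names are placeholders) lets BigQuery do the string aggregation before Tableau ever sees the result:
-- Placeholder names: one row per customer, with product names
-- joined into a single comma-separated string by BigQuery.
SELECT customer_id,
       STRING_AGG(product_name, ', ') AS products
FROM `project.dataset.orders`
GROUP BY customer_id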
Related
I have developed some SQL that reads from a Redshift table, does some manipulation (especially LISTAGG on some fields), and then writes to another Redshift table.
When I run the SQL using SQLWorkbench it executes successfully. When I embed it in a Tableau Prep flow (as "Complex SQL") I get several of these errors: "System error: AqlProcessor evaluation failed: [Amazon][Support] (40550) Invalid character value for cast specification." Presumably these relate to my treatment of data types. What I don't understand is what is so different about the environment that it would cause different results like this. Is it because SQLWorkbench and Tableau Prep use different SQL interpreters? Or is my question too broad to even speculate on without going through the actual code?
Best guess is that Tableau, which has knowledge of the DDL, is adding some CAST() operations to the SQL. SQLWorkbench is simpler and is pushing the SQL to Redshift as written. This is based on there being no explicit CASTs in your SQL but an error message that identifies a CAST().
Look at stl_querytext for these two queries and see if they are being given to Redshift differently by the two benches. I suspect this will give you some clues to go on.
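A minimal sketch of that check, assuming you have already looked up the two query IDs (placeholders below) in stl_query:
-- Long statements are stored as 200-character chunks, ordered by sequence.
-- Replace 12345 and 67890 with the actual query IDs from stl_query.
SELECT query, sequence, text
FROM stl_querytext
WHERE query IN (12345, 67890)
ORDER BY query, sequence;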
If there are no differences in the SQL, then the issue may lie in user or connection differences, and more information will likely be needed.
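If injected CASTs do turn out to be the culprit, making your own casts explicit may sidestep the conflict. A hedged sketch with placeholder table and column names, not your actual code:
-- Hypothetical example: cast the LISTAGG input explicitly so any
-- CAST added around it operates on a known VARCHAR type.
SELECT id,
       LISTAGG(CAST(some_field AS VARCHAR(256)), ', ')
         WITHIN GROUP (ORDER BY some_field) AS some_fields
FROM source_table
GROUP BY id;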
I'm using the SQL Connector in Azure Logic Apps to connect to our Azure SQL Database to perform queries, update tables, and execute stored procedures. After several months of developing logic apps, I'm seeing the below error message when using the SQL Connector "Execute Stored Procedure (V2)". I'm only encountering this error with this specific connector. All the other connectors work just fine. Has anyone encountered this error before and had success troubleshooting?
Error:
Could not retrieve values. Error code: 'BadRequest', Message: 'The value's length for key 'application name' exceeds it's limit of '128'.
clientRequestId: 'XXXXX'. More diagnostic information: x-ms-client-request-id is 'XXXXX'.
Just had a call with Microsoft Support and got confirmation: this is a global issue in the Logic Apps SQL connector. They are working on a fix; no ETA for now.
Update: should be fixed from around 29.03.2022 03:00 CET
Azure Data Explorer supposedly supports T-SQL queries:
The Kusto.Explorer tool supports T-SQL queries to Kusto. To instruct Kusto.Explorer to execute a query, begin the query with an empty T-SQL comment line (--).
However, I can't get this to work in a Log Analytics Workspace.
For instance, this Kusto query works fine and returns results:
ContainerInstanceLog_CL
| where Message has "Hamlet"
| limit 500
But any attempt to use T-SQL (with a leading empty comment line) ...
--
SELECT * FROM ContainerInstanceLog_CL
...fails with
Query could not be parsed at '-' on line [1,1]
Token: -
Line: 1
Position: 1
Are T-SQL queries not supported in Log Analytics Workspaces?
Unfortunately, you cannot run T-SQL queries in Azure Log Analytics Workspaces.
I would suggest providing feedback on this here:
https://feedback.azure.com/forums/267889-azure-monitor-log-analytics
All of the feedback you share in these forums will be monitored and reviewed by the Microsoft engineering teams responsible for building Azure.
T-SQL queries do run on Azure Data Explorer:
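For example, in Kusto.Explorer connected to an ADX cluster that had the same table, the query from the question, with its leading empty comment line, would run as-is:
--
SELECT * FROM ContainerInstanceLog_CL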
Writing my comment as an answer as suggested.
Log Analytics Workspaces support only Kusto as of now. You can further integrate them with Power BI for better analytics options.
I am trying to get data from a legacy SQL view in BigQuery using the Excel ODBC Simba connection, but it is saying "Cannot reference a legacy SQL view in a standard SQL query". However, I can't seem to be able to write the query as legacy SQL: adding #legacySQL at the start of the query gives another error saying that legacy SQL is not allowed, even though their documentation says it is used by default.
What can I do?
Thanks,
Benji
I need to get a copy of a SQL Server 2008 table into an Oracle RDBMS. I have a database link to the SQL Server, and the database has a table which contains a LONG BINARY column.
When I issue
create table test_ora as select * from mssqltable@dblink
I get the error
Can't convert LONG
I tried to use to_lob, to_char, hextoraw, and a number of other Oracle conversion functions, but still haven't defeated the issue. Do you have any ideas?
P.S. I'm away from work now, so I can't give the exact ORA- error number.
There is a way to do that with an undocumented Oracle package:
http://tonguc.wordpress.com/2008/08/28/how-to-transfer-long-datatype-over-dblink/
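If that post ever goes away, one commonly cited alternative (not the blog's method) is the SQL*Plus COPY command, which can move LONG columns that CREATE TABLE ... AS SELECT cannot pull over a link. A rough sketch with a placeholder connection; note that COPY supports LONG but not LONG RAW, so this only applies if the column arrives as LONG, and whether the USING query can traverse a heterogeneous gateway link is something to verify:
-- SQL*Plus command (not SQL); a trailing '-' continues the command
-- onto the next line. scott/tiger@orcl is a placeholder for the
-- Oracle connection that owns the database link.
COPY FROM scott/tiger@orcl -
CREATE test_ora -
USING SELECT * FROM mssqltable@dblink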
I would recommend a tool called Pentaho Data Integration. It is a free, small, and superb ETL tool.
Download page: community.pentaho.com
It will recreate all tables and types for you. How to do it:
pldwh.blogspot.co.uk/2013/03/pentaho-data-integration-create-tables_1.html