Data Studio BigQuery connector: The query returned an error - google-bigquery

When creating a BigQuery data connector for Google Data Studio, my query works until I attempt to parameterize some fields. As soon as I add parameters, I get the unhelpful and unspecific error:
The query returned an error.
Error ID: xyz
How can I figure out what the underlying issue is that is causing this problem?

1. Check BigQuery Logs in Cloud Logging
If a query fails in BigQuery, the underlying cause is usually recorded in Cloud Logging. Run this query there to surface the errors and, hopefully, get insight into the underlying problem:
resource.type="bigquery_resource"
severity=ERROR
It's possible these logs will show that the query is failing because the format of certain data is invalid. In that case, the likely cause is that the parameters have no default values, which prevents the BigQuery query from succeeding. If so:
2. Give Parameters Default Values
The connector passes the query to BigQuery, which executes it. For this to work, every parameter needs a value. Provide default values that result in a valid query, as in the sketch below.
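For illustration only (the project, table, and parameter names below are hypothetical), a parameterized custom query for the BigQuery connector might look like this, with defaults such as 'US' and 0 entered in the connector's parameter settings:
-- Hypothetical custom query; @country and @min_amount are connector
-- parameters, and each needs a default value so the query is valid
-- when Data Studio first runs it.
SELECT order_id, amount
FROM `my_project.sales.orders`
WHERE country = @country
  AND amount >= @min_amount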

Related

com.tableausoftware.jdbc.TableauJDBCException: Error reading metadata for prepared query

When I use Tableau's "Other Databases (JDBC)" connector to connect to HDP 3.1 Hive I get this error, but creating an extract works.
An error occurred while communicating with the data source.
An error occurred while communicating with the data source.
Bad Connection: Tableau could not connect to the data source.
com.tableausoftware.jdbc.TableauJDBCException: Error reading metadata for prepared query: SELECT *
FROM (
select * from dim_boxinfo
) Custom_SQL_Query
LIMIT 1
Method not supported
There was a Java error.
When connecting to a data source, Tableau attempts several types of queries in cascading fashion to determine the capabilities of the connection. This looks like a case where one of those query types fails yet is not necessary for the creation of an extract.
This link discusses the customization of JDBC connections; I do not know the settings well enough to say which might suppress that warning. (ODBC connection customization has been around for a long time and might offer some clues as to what is possible.)

Deleting rows in BigQuery fails with "Invalid schema update"

I'm trying to delete some rows from a BigQuery table (using standard SQL dialect):
DELETE FROM ocds.releases
WHERE
ocid LIKE 'ocds-b5fd17-%'
However, I get the following error:
Query Failed
Error: Invalid schema update. Field packageInfo has changed mode from REQUIRED to NULLABLE
Job ID: ocds-172716:bquijob_2f60927_15d13c97149
It seems as though BigQuery doesn't like deleting rows with a REQUIRED column. Is there any way around this?
It has been a known limitation that BigQuery DML doesn't work on tables with required fields (see https://cloud.google.com/bigquery/docs/reference/standard-sql/data-manipulation-language#known_issues).
We are in the process of removing this limitation. We whitelisted your project today. Please try running your query again in the same project. Let us know if the problem is still there, or if you want to have more projects whitelisted.
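If whitelisting is not an option, one commonly suggested workaround is to rewrite the table without the offending rows. This is a sketch rather than an official fix, and note that materializing query results this way may relax REQUIRED columns to NULLABLE:
-- Keep only the rows that should survive the delete, overwriting
-- the original table with the result.
CREATE OR REPLACE TABLE ocds.releases AS
SELECT *
FROM ocds.releases
WHERE ocid NOT LIKE 'ocds-b5fd17-%';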

Syntax Error While Calling DB2 Stored Procedure Through JMeter

I am trying to call a DB2 stored procedure that has two input parameters (both TIMESTAMP) and one output parameter (INTEGER). I am calling it from a JMeter JDBC Sampler and getting a SQL syntax exception.
Response code: 42884 -440
Response message: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-440, SQLSTATE=42884, SQLERRMC=PROCEDURE;DEVSCHEMA.GET_ROW_COUNT, DRIVER=4.19.26
Response headers:
1272084586, URL=jdbc:db2://<db2IP>:<port>/DB2T, UserName=<someUserName>, IBM Data Server Driver for JDBC and SQLJ
From the IBM documentation I learned that this error occurs when the stored procedure is not present (not the case here), the schema name is incorrect (also not the case), or the number of parameters does not match. I verified the parameter count, but I am doubtful on this point because JMeter provides separate fields to fill in, and I might be passing an incorrect value in one of them.
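One hedged way to double-check the signature on the DB2 side, assuming access to the SYSCAT catalog views (the schema and procedure name are taken from the error message):
-- Confirm the procedure exists and how many parameters it declares.
SELECT ROUTINENAME, PARM_COUNT
FROM SYSCAT.ROUTINES
WHERE ROUTINESCHEMA = 'DEVSCHEMA'
  AND ROUTINENAME = 'GET_ROW_COUNT';
-- List each parameter's name, type, and direction.
SELECT PARMNAME, TYPENAME, ROWTYPE, ORDINAL
FROM SYSCAT.ROUTINEPARMS
WHERE ROUTINESCHEMA = 'DEVSCHEMA'
  AND ROUTINENAME = 'GET_ROW_COUNT'
ORDER BY ORDINAL;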
I don't know JMeter well, but with help from the Apache JMeter documentation I have set the sampler fields to the values below.
Query Type: Callable Statement
Query: CALL DEVSCHEMA.GET_ROW_COUNT(?,?,?)
Parameter Values: ${__time(yyyy-MM-dd HH:mm:ss,)},${__time(yyyy-MM-dd HH:mm:ss,)},0
Parameter Types: IN TIMESTAMP,IN TIMESTAMP,OUT INTEGER
Variable Names: VARCOUNT
Handle ResultSet: Store as a String
Can anyone see where I am making a mistake? Many thanks.
The issue has been resolved, and it was something I had never thought about. When I got the same error trying to access the same stored procedure from Java code, I contacted the DB2 team that wrote it. The issue was with the stored procedure itself: although execute access had supposedly been granted, after drilling into it they preferred to create a new stored procedure, which worked without any issues. Everything was correct on the JMeter side.
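For reference, had execute access actually been missing, granting it in DB2 would look like this (a sketch; the user name is hypothetical):
-- Hypothetical grant; replace APPUSER with the actual connecting user.
GRANT EXECUTE ON PROCEDURE DEVSCHEMA.GET_ROW_COUNT TO USER APPUSER;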

"Invalid snapshot time" error without table decorator usage

We got the error {"message":"Invalid snapshot time 1472342794519, unable to read before 1472342794785","reason":"invalid"}. Other Q&As describe this error happening when a table decorator's parameters are invalid; however, our query does not use table decorators.
The query uses TABLE_DATE_RANGE, but its arguments are day-level timestamps, so the lower digits should be zeros, unlike the timestamps in the error above.
Retrying the same query succeeded.
I could provide the job ID, but it includes internal company information, so I apologize that I cannot write it here directly.
The tables that the TABLE_DATE_RANGE wildcard evaluates to are resolved as of the start of the query. Judging by the timestamps, a table was deleted right after the job started executing, which causes table resolution to throw that error.
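For context, a minimal sketch of the query shape involved (legacy SQL; the dataset and table prefix are hypothetical):
-- The wildcard expands to a set of daily tables that is fixed at the
-- start of the query, so a table deleted mid-query triggers the
-- snapshot error.
SELECT COUNT(*)
FROM TABLE_DATE_RANGE([mydataset.events_],
                      TIMESTAMP('2016-08-01'),
                      TIMESTAMP('2016-08-27'))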

Pentaho Execute SQL Statements variable conversion to null

I am using PDI to delete and insert some data in a DB, and I have the following issue. I create two variables, START_DATE and END_DATE, that are used to select the data to be deleted from my DB. I can read them and run my transformation with no errors in the log file, but when I checked whether data was actually deleted, I found it wasn't. I then checked my "DeleteProcedure" step, and it says "Conversion error: null". I have tried different approaches to read the variables and pass them as strings, but I haven't been able to solve the issue. It cannot be a SQL mistake, as I tested the statement with a constant and it works.
Any ideas? I attached some pics. Thanks!
As the documentation of the Execute SQL script step says:
Note: When you have an issue, that the SQL is started at the initialization phase of the transformation and not for each row, make sure to check the option "Execute for each row" (see description below).
In your case the SQL executes during the initialization phase of the transformation, which is why it gets null values instead of the values from the previous step.
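A hedged sketch of how this typically looks once "Execute for each row" is checked (the table and column names are hypothetical; START_DATE and END_DATE are listed as parameter fields of the step and bound to the ? placeholders in order):
-- Executed once per incoming row; the two ? markers are bound to the
-- row's START_DATE and END_DATE fields.
DELETE FROM my_table
WHERE event_date BETWEEN ? AND ?;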