BigQuery Data Transfer Service Error - google-bigquery

I ran into the errors below with the BigQuery Data Transfer Service. Has anyone encountered these errors before?
Error 1
Error 2

Did you request a custom schema for your data transfer files? The field should be "Campaign ID" per the documentation:
https://developers.google.com/doubleclick-advertisers/dtv2/reference/file-format
If so, it can be resolved by asking the customer support contact who helped with the original setup to switch to the standard values.

Related

Azure Data Factory error on Sink "UserErrorSchemaMappingCannotInferSinkColumnType"

I am using Azure Data Factory to read data from Application Insights via the REST API by passing a Kusto query, and I am trying to write the results to an Azure SQL database.
Unfortunately when I execute my pipeline I get the following error:
UserErrorSchemaMappingCannotInferSinkColumnType,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Data type of column '$['Month']' can't be inferred from 1st row of data, please specify its data type in mappings of copy activity or structure of DataSet.,Source=Microsoft.DataTransfer.Common
It seems like an error in the mapping, but from the mapping tab I am unable to specify the data type of the columns.
Can you provide me a hint?
Update: I am using the Copy Data activity with a REST source.
As I understand it, the copy activity runs with no error, but the data is not inserted.
Glad to hear that you have resolved the issue. I am posting this as an answer so the question can be closed out:
In the end I managed to solve the issue by following this blog: https://www.ben-morris.com/using-azure-data-factory-with-the-application-insights-rest-api/
This can be beneficial to other community members.
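For reference, the error itself points at the fix: declare the column types explicitly in the copy activity's mapping instead of letting ADF infer them from the first row of data. Below is a rough sketch of such a TabularTranslator mapping, written as a Python dict that mirrors the copy activity JSON; the column names and types are hypothetical, and the exact translator schema should be checked against the current ADF documentation.

# Sketch of an explicit copy-activity mapping (ADF TabularTranslator), expressed
# as a Python dict mirroring the JSON. Column names and types are hypothetical.
translator = {
    "type": "TabularTranslator",
    "mappings": [
        {
            # JSON path into the REST response row, plus an explicit type
            "source": {"path": "$['Month']", "type": "String"},
            # target column in the Azure SQL sink
            "sink": {"name": "Month", "type": "String"},
        },
        {
            "source": {"path": "$['RequestCount']", "type": "Int64"},
            "sink": {"name": "RequestCount", "type": "Int32"},
        },
    ],
}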

Data Fusion does not allow STRUCT type from BigQuery

I'm trying to create a pipeline in Data Fusion to read a BigQuery table that has a STRUCT column, but I received this error:
2021-06-01 19:13:53,818 - WARN [service-http-executor-1321:i.c.w.s.c.AbstractWranglerHandler#210] - Error processing GET /v3/namespaces/system/apps/dataprep/services/service/methods/contexts/interoper_prd/connections/interoper_bq_prd/bigquery/raw_deep/tables/deep_water_report/read?scope=plugin-browser-source, resulting in a 500 response.
java.lang.RuntimeException: BigQuery type STRUCT is not supported
I was able to reproduce your issue. For this reason, I have opened an issue on the GCP Issue Tracker, here.
The product team is already aware of the complaint, as you can see in the case comments. Please follow the thread for any updates.
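Until the connector supports STRUCT columns, one workaround that is often suggested is to point Data Fusion at a view that flattens the STRUCT into ordinary columns. Here is a minimal sketch using the google-cloud-bigquery Python client; the project ID and the STRUCT field names are placeholders.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Create a view that exposes the STRUCT fields as ordinary top-level columns,
# so tools that reject BigQuery STRUCT types can read the data as flat columns.
view = bigquery.Table("my-project.raw_deep.deep_water_report_flat")
view.view_query = """
SELECT
  id,                               -- hypothetical scalar column
  report.depth  AS report_depth,    -- hypothetical STRUCT fields
  report.status AS report_status
FROM `my-project.raw_deep.deep_water_report`
"""
client.create_table(view, exists_ok=True)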

BigQuery Error 6034920 when using UPDATE FROM

We are trying to perform an ELT in BigQuery. When using UPDATE FROM, it fails on some tables with the following error:
"An internal error occurred and the request could not be completed.
Error: 6034920"
Moreover, both the source and destination tables consist of data from a single partition.
We are unable to find any details for error code 6034920. Any insights or solutions would be really appreciated.
This is a transient, internal error in BigQuery. The behavior is related to a BigQuery shuffling component (in the BQ service backend), and engineers are working to solve it. At the moment there is no ETA for a fix.
In the meantime, as a workaround, retry the query whenever you hit this error. You can keep tracking the related logs in Stackdriver by using the following filter:
resource.type="bigquery_resource"
protoPayload.serviceData.jobCompletedEvent.job.jobStatus.additionalErrors.message="An internal error occurred and the request could not be completed. Error: 6034920"
You can also try to stop putting values into the partitioning column; that may stop the job failures. I hope you find the above information useful.
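Since the error is reported as transient, the practical workaround is to wrap the statement in a small retry loop. A minimal sketch with the google-cloud-bigquery Python client follows; the table and column names are placeholders, and it assumes the 6034920 failure surfaces as an internal-server error.

import time

from google.cloud import bigquery
from google.api_core.exceptions import InternalServerError

client = bigquery.Client()

# Hypothetical UPDATE ... FROM statement; table and column names are placeholders.
UPDATE_SQL = """
UPDATE `my_project.my_dataset.destination` AS d
SET d.value = s.value
FROM `my_project.my_dataset.source` AS s
WHERE d.id = s.id
"""

# Retry a few times with exponential backoff, since the error is transient.
for attempt in range(5):
    try:
        client.query(UPDATE_SQL).result()  # blocks until the job finishes
        break
    except InternalServerError as exc:  # assumes the failure maps to a 500
        print(f"Attempt {attempt + 1} hit an internal error: {exc}")
        time.sleep(2 ** attempt)
else:
    raise RuntimeError("UPDATE still failing after retries")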

BigQuery data transfer

I am trying to use Google Data Transfer to import DFP data into BQ. When I create the transfer, I get an error without any details:
Error in creating a new transfer
Does anyone know why I get this error message? I am using the correct Network ID.

Getting error from bq tool when uploading and importing data on BigQuery - 'Backend Error'

I'm getting the error BigQuery error in load operation: Backend Error when I try to upload and import data into BQ. I have already reduced the file size and increased the time between imports, but nothing helps. The strange thing is that if I wait a while and retry, it just works.
In the BigQuery browser tool it shows up as an error in some line/field, but I checked and there is none. The message is clearly misleading, because if I wait and retry uploading/importing the same file, it works.
Thanks
I looked up your failing jobs in the BigQuery backend, and I couldn't find any jobs that terminated with 'backend error'. I did find several that failed because there were ASCII nulls in the data (it can be helpful to look at the error stream errors, not just the error result). It is possible that the data got garbled on the way to BigQuery; are you certain the data did not change between the failing import and the successful one on the same data?
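The suggestion to look at the full error stream, and not just the error result, is easy to follow from the google-cloud-bigquery Python client; here is a small sketch with a placeholder job ID.

from google.cloud import bigquery

client = bigquery.Client()

# Fetch a finished load job and print both the top-level error result and the
# full error stream; the stream often pinpoints the offending row or field.
job = client.get_job("bqjob_r1234_placeholder")  # hypothetical job ID

print("error_result:", job.error_result)
for err in job.errors or []:
    print("stream error:", err.get("reason"), err.get("location"), err.get("message"))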
I've found that exporting from a BigQuery table to CSV in Cloud Storage hits the same error when certain characters are present in one of the columns (in this case, a column storing the raw results of a prediction analysis). Removing that column from the export resolved the issue.
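One way to apply that workaround programmatically is to stage the table without the problematic column and export the staging table instead. A minimal sketch with the google-cloud-bigquery Python client follows; the project, dataset, table, column, and bucket names are all placeholders.

from google.cloud import bigquery

client = bigquery.Client()

# Stage the table without the problematic column, then export the staging
# table to CSV in Cloud Storage. All names below are placeholders.
staging_table = "my_project.my_dataset.results_for_export"
job_config = bigquery.QueryJobConfig(
    destination=staging_table,
    write_disposition="WRITE_TRUNCATE",
)
client.query(
    """
    SELECT * EXCEPT (raw_prediction_output)  -- drop the offending column
    FROM `my_project.my_dataset.results`
    """,
    job_config=job_config,
).result()

client.extract_table(staging_table, "gs://my-bucket/results-*.csv").result()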