BigQuery data transfer - google-bigquery

I am trying to use the BigQuery Data Transfer Service to import DFP (DoubleClick for Publishers) data into BigQuery. When I create the transfer I get an error without any details:
Error in creating a new transfer
Does anyone know why I get this error message? I am using the correct Network ID.
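For anyone hitting the same opaque failure: creating the transfer programmatically often surfaces a concrete error message where the UI does not. Below is a minimal sketch using the Python Data Transfer client, assuming the DFP/Ad Manager data source id is `dfp_dt` and a `network_code` parameter, with placeholder project, dataset, and network values; verify the id and its required params with `list_data_sources` first.

```python
# Hedged sketch: create a DFP (Ad Manager) transfer config and let the
# API report a detailed error instead of the UI's generic one.
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = client.common_project_path("my-project")  # placeholder project

# Optional: confirm the data source id and its required parameters.
for source in client.list_data_sources(parent=parent):
    print(source.data_source_id)

config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="dfp_imports",      # placeholder dataset
    display_name="DFP import",
    data_source_id="dfp_dt",                   # assumed id for DFP/Ad Manager
    params={"network_code": "12345678"},       # your DFP network code
)

created = client.create_transfer_config(parent=parent, transfer_config=config)
print("Created transfer:", created.name)
```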

Related

Azure Data Factory error on Sink "UserErrorSchemaMappingCannotInferSinkColumnType"

I am using Azure Data Factory to read data from Application Insights via the REST API by passing a Kusto query, and I am trying to write the results to an Azure SQL database.
Unfortunately when I execute my pipeline I get the following error:
UserErrorSchemaMappingCannotInferSinkColumnType,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Data type of column '$['Month']' can't be inferred from 1st row of data, please specify its data type in mappings of copy activity or structure of DataSet.,Source=Microsoft.DataTransfer.Common
It seems like an error in the mapping, but from the mapping tab I am unable to specify the data type of the columns.
Can you give me a hint?
Update: I am using the Copy Data activity with a REST source.
As I understand it, the copy activity completes without any error, but the data is not inserted.
Glad to hear that you have resolved the issue. I am posting it as an answer to close out this question:
In the end, the issue was solved by following this blog: https://www.ben-morris.com/using-azure-data-factory-with-the-application-insights-rest-api/
This can be beneficial to other community members.
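For readers who hit the schema-mapping error itself, it can usually be addressed the way the message suggests: declare the sink column's type explicitly in the copy activity's mapping instead of letting ADF infer it from the first row. A hedged fragment of what that might look like in the activity JSON (TabularTranslator; the column name and `String` type are placeholders for your own schema):

```json
"translator": {
  "type": "TabularTranslator",
  "mappings": [
    {
      "source": { "path": "$['Month']" },
      "sink": { "name": "Month", "type": "String" }
    }
  ]
}
```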

BigQuery - Error creating scheduled query: Cannot create a transfer in JURISDICTION_US when destination dataset is located in JURISDICTION_EU

BigQuery has started throwing up this error
"Error creating scheduled query: Cannot create a transfer in
JURISDICTION_US when destination dataset is located in
JURISDICTION_EU".
My datasets are all in the EU but I don't understand why it is trying to create a transfer in the US.
Has anyone had a similar issue and been able to resolve it?
I ran into this issue and did the following: in the UI, under More -> Query Settings, set the Processing Location (down the bottom, before Advanced) to match the destination dataset's location, EU in this case.
Then I had to reload the page for the change to take effect.
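If you create the scheduled query programmatically, you can pin the transfer's location explicitly so it always matches the destination dataset. A minimal sketch with the Python Data Transfer client, assuming an EU dataset and placeholder names:

```python
# Hedged sketch: create the scheduled query under an explicit EU
# location parent to avoid the JURISDICTION_US/JURISDICTION_EU mismatch.
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = "projects/my-project/locations/eu"  # placeholder project, EU location

config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="my_eu_dataset",   # placeholder EU dataset
    display_name="Nightly rollup",
    data_source_id="scheduled_query",
    params={
        "query": "SELECT CURRENT_DATE() AS run_day",  # placeholder query
        "destination_table_name_template": "rollup_{run_date}",
        "write_disposition": "WRITE_TRUNCATE",
    },
    schedule="every 24 hours",
)

created = client.create_transfer_config(parent=parent, transfer_config=config)
print("Created scheduled query:", created.name)
```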

BigQuery Error 6034920 when using UPDATE FROM

We are trying to perform an ELT in BigQuery. When using UPDATE FROM, it fails on some tables with the following error:
"An internal error occurred and the request could not be completed.
Error: 6034920"
Moreover, both the source and destination tables consist of data from a single partition.
We are unable to find any details for error code 6034920. Any insights or solutions would be greatly appreciated.
It is a transient, internal error in BigQuery. This behavior is related to a BigQuery shuffling component (in the BQ service backend), and engineers are working to solve it. At the moment there is no ETA for a fix.
In the meantime, the workaround is simply to retry the query whenever you hit this behavior. You can keep tracking the issue in the Stackdriver logs by using the following filter:
resource.type="bigquery_resource"
protoPayload.serviceData.jobCompletedEvent.job.jobStatus.additionalErrors.message="An internal error occurred and the request could not be completed. Error: 6034920"
You can also try to stop putting values into the partitioning column; that may stop the job failures. I hope you find the above information useful.
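Since the recommended remedy is retrying, it may help to automate that. A small sketch, assuming the google-cloud-bigquery Python client and placeholder table names; error 6034920 surfaces as a generic internal (HTTP 500) error, so the retry is keyed on that:

```python
# Hedged sketch: retry an UPDATE ... FROM statement on transient
# internal errors, with exponential backoff between attempts.
import time
from google.api_core.exceptions import InternalServerError
from google.cloud import bigquery

client = bigquery.Client()
sql = """
UPDATE `my_project.my_dataset.target` AS t
SET t.value = s.value
FROM `my_project.my_dataset.source` AS s
WHERE t.id = s.id
"""

for attempt in range(5):
    try:
        client.query(sql).result()  # wait for the DML job to finish
        break
    except InternalServerError:
        time.sleep(2 ** attempt)    # back off, then retry
else:
    raise RuntimeError("UPDATE kept failing with an internal error")
```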

BigQuery Data Transfer Service Error

I had the below errors with the BigQuery Data Transfer Service. Has anyone encountered these errors before?
Error 1
Error 2
Did you request a custom schema for your data transfer files? The field should be "Campaign ID" per the documentation:
https://developers.google.com/doubleclick-advertisers/dtv2/reference/file-format
If so, it can be resolved by asking the customer support contact who helped with the original setup to switch back to the standard values.

Getting an error from the bq tool when uploading and importing data into BigQuery - 'Backend Error'

I'm getting the error BigQuery error in load operation: Backend Error when I try to upload and import data into BigQuery. I have already reduced the file size and increased the time between imports, but nothing helps. The strange thing is that if I wait a while and retry, it just works.
In the BigQuery browser tool it shows up as an error in some line/field, but I checked and there is none. This is clearly a spurious message, because if I wait and retry the upload/import of the same file, it works.
Thanks
I looked up your failing jobs in the BigQuery backend, and I couldn't find any that terminated with 'Backend Error'. I found several that failed because there were ASCII nulls in the data (it can be helpful to look at the full error stream, not just the error result). It is possible that the data got garbled on the way to BigQuery... are you certain the data did not change between the failing import and the successful one?
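To act on that advice, you can pull the full error stream for a failed load job with the Python client; `errors` lists every row/field-level problem, while `error_result` is only the terminal one. The job id below is a placeholder:

```python
# Hedged sketch: inspect a load job's full error stream, not just the
# single error_result that the CLI surfaces.
from google.cloud import bigquery

client = bigquery.Client()
job = client.get_job("bqjob_r1234_000056789abc_1")  # placeholder job id

print("error_result:", job.error_result)  # the terminal error, if any
for err in job.errors or []:              # the full error stream
    print(err.get("reason"), "-", err.get("message"), "-", err.get("location"))
```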
I've found that exporting from a BigQuery table to CSV in Cloud Storage hits the same error when certain characters are present in one of the columns (in this case, a column storing the raw results from a prediction analysis). Removing that column from the export resolved the issue.
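One way to drop the offending column without touching the table is to export through a query instead of a straight table extract. A sketch using GoogleSQL's EXPORT DATA with SELECT * EXCEPT, with placeholder bucket, table, and column names:

```python
# Hedged sketch: export to CSV in Cloud Storage while excluding the
# column whose characters break the extract.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
EXPORT DATA OPTIONS(
  uri='gs://my-bucket/export/results-*.csv',  -- placeholder bucket/path
  format='CSV',
  overwrite=true,
  header=true
) AS
SELECT * EXCEPT(raw_prediction_output)        -- placeholder column name
FROM `my_project.my_dataset.results`
"""
client.query(sql).result()
```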