Problem with recurring data fetch in BigQuery - google-bigquery

We are trying to save data to BigQuery on a daily basis. When we run this process manually via the save function, it works without any problems.
In detail, we load some data from a Google Cloud SQL instance into BigQuery.
But if we use the same settings with the scheduling function, we receive an error without any details about the cause.
Does anybody know about this problem, or know where we can get more details about the error message?
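If the scheduled load runs through the BigQuery Data Transfer Service (which backs the scheduling function), the per-run logs usually carry more detail than the UI surfaces. Below is a minimal sketch of pulling them with the Python client; the project, location, and config IDs are placeholders:

    # Fetch run states and detailed log messages for a failing transfer config.
    # All resource IDs below are placeholders.
    from google.cloud import bigquery_datatransfer

    client = bigquery_datatransfer.DataTransferServiceClient()
    config_name = "projects/my-project/locations/us/transferConfigs/my-config-id"

    for run in client.list_transfer_runs(parent=config_name):
        print(run.run_time, run.state, run.error_status.message)
        # Each run keeps per-step log messages that often explain the failure.
        for log in client.list_transfer_logs(parent=run.name):
            print(f"  [{log.severity.name}] {log.message_text}")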

Related

Problems with Transfer Service from Google Ads to BigQuery

I am having a problem with a data transfer from Google Ads. When I schedule the backfill I get the following error for some dates:
Invalid value: Load configuration must specify at least one source URI
When I check the log in the execution details I get the following message:
Failed to start job for table p_ClickStats_5419416216$20201117 with error INVALID_ARGUMENT: Invalid value: Load configuration must specify at least one source URI
The weird part is that this happens for random dates which I had transferred before in a previous transfer. Did anyone have a similar problem?
I had the very same problem with the same error while using a free trial account. I shared my project with another account that was upgraded with billing set up (but still on the promotional credits), and so far I have not had the issue again. Try upgrading to an account with billing configured and run the data transfer again rather than using the purely free trial account; you can share the project with the other account and set up the data transfer there. Backfilling has also worked for me since, and there appear to be no more duplicate runs either. Maybe the free trial is a limited version.
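As a follow-up, once the account and billing side is sorted out, the failed dates can also be re-requested programmatically rather than through the UI. A small sketch with the Python client; the config path is a placeholder, and the date mirrors the failing partition above:

    # Re-request a backfill over a date range so failed dates can be retried.
    # The config path and the dates are placeholders.
    import datetime
    from google.cloud import bigquery_datatransfer

    client = bigquery_datatransfer.DataTransferServiceClient()
    config_name = "projects/my-project/locations/us/transferConfigs/my-ads-config"

    response = client.start_manual_transfer_runs(
        request={
            "parent": config_name,
            "requested_time_range": {
                "start_time": datetime.datetime(2020, 11, 17, tzinfo=datetime.timezone.utc),
                "end_time": datetime.datetime(2020, 11, 18, tzinfo=datetime.timezone.utc),
            },
        }
    )
    for run in response.runs:
        print(run.name, run.state)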

Error when creating scheduled query on Bigquery "Error creating scheduled query: er"

I just started a new project on Google Cloud and set up some BigQuery datasets and tables. I now want to set up some scheduled queries, and I have already enabled the BigQuery Data Transfer API. My query is valid (it's just SELECT * FROM table). I can't find anything about this error online.
UPDATE: I've experimented a bit and it seems to be an organization-wide issue. All projects, new and old, within my organization get this same error when trying to schedule a query. I tried a project in a different organization and did not have the issue. What could be causing this error for ALL projects in an organization?
UPDATE 2:
By querying a table that is not empty, the error changes to "Error creating scheduled query: Yn" instead of "Error creating scheduled query: er" (which appears when the scheduled query would query an empty table).
I faced the same issue as you, and basically I just needed to run the query once before creating the scheduled query... And that did the trick.
From the BQ FAQs:
"Scheduled queries use features of BigQuery Data Transfer Service. Verify that you have completed all actions required in Enabling BigQuery Data Transfer Service."
Basically, what this means is that you need to enable the Data Transfer API in your project, AND give the user who creates the scheduled query a BigQuery Admin role so that they have the right permissions to access the transfer service.
If done right, you should get a popup when creating the scheduled query asking you to confirm that the Data Transfer Service may access your user account (if you block popups you might not see this message and get stuck).
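For comparison, creating the same kind of scheduled query through the Python client sometimes surfaces a more explicit error than the UI's truncated one. A sketch assuming the Data Transfer API is enabled and the caller has the permissions above; project, dataset, and table names are placeholders:

    # Create a scheduled query via the Data Transfer Service client.
    # All project/dataset/table names below are placeholders.
    from google.cloud import bigquery_datatransfer

    client = bigquery_datatransfer.DataTransferServiceClient()
    parent = client.common_location_path("my-project", "us")

    transfer_config = bigquery_datatransfer.TransferConfig(
        destination_dataset_id="my_dataset",
        display_name="daily-copy",
        data_source_id="scheduled_query",
        params={
            "query": "SELECT * FROM `my-project.my_dataset.my_table`",
            "destination_table_name_template": "my_table_copy",
            "write_disposition": "WRITE_TRUNCATE",
        },
        schedule="every 24 hours",
    )

    created = client.create_transfer_config(
        parent=parent, transfer_config=transfer_config
    )
    print("Created scheduled query:", created.name)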
If this error only occurs in your organisation, I believe it might be caused by an organisation policy on Google Cloud. I would encourage you to double-check whether any org policy is causing this error. If that's not the case, open a support ticket with GCP.
What worked for me was signing in through Incognito Mode with just my account and attempting to save the scheduled query. I have multiple Google Accounts signed in at one time, and for whatever reason BigQuery throws this generic error even after authorization is successful and BigQuery is granted the access it requested.
You need to make sure that you are creating the query under the targeted project, not any other project, because it won't appear otherwise.
You also need to enable the API, as mentioned in one of the answers above.
This eventually worked for me when I ran it in an incognito window.

How can I move data from BigQuery or DataPrep to Firestore?

I just cleaned up my Firestore collection data using DataPrep and verified the data via BigQuery. I now want to move the data back to Firestore. Is there a way to do this?
I have used the manual method of exporting to JSON and then uploading it with code provided by AngularFirebase, but that is not automated, and this data needs to be cleaned up periodically.
I am looking for a process within the Google Cloud console. Any help will be appreciated.
This is not a full answer, more like a partial one; I could not add a comment as I don't have 50 reputation yet. I am in a similar boat, though not exactly the same situation: I want to take a subset of BigQuery data and add it to Firestore. My thinking is to do the following:
Use the BigQuery API to query the data periodically using BigQuery Jobs' Load (in your case) or Query (in my case)
Convert it to JSON in code
Use batch commit in Firestore's API to update the firestore database
This is my idea and I am not sure whether it will work, but I will let you know once I am done with it, unless someone else has better insights to help me and the person asking this question.
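To make that concrete, here is a rough, untested sketch of those three steps with the Python clients; the table, column, and collection names are all hypothetical:

    # Query BigQuery, convert rows to dicts, and batch-write them to Firestore.
    # Firestore batches are capped at 500 writes each, so flush periodically.
    from google.cloud import bigquery, firestore

    bq = bigquery.Client()
    db = firestore.Client()

    rows = bq.query(
        "SELECT id, name, score FROM `my-project.my_dataset.my_table`"
    ).result()

    batch = db.batch()
    count = 0
    for row in rows:
        doc_ref = db.collection("my_collection").document(str(row["id"]))
        batch.set(doc_ref, dict(row.items()))  # Row converts cleanly to a dict
        count += 1
        if count % 500 == 0:  # flush before hitting the 500-write batch limit
            batch.commit()
            batch = db.batch()
    if count % 500 != 0:
        batch.commit()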

Cannot Export a Table from BigQuery to Google Cloud Storage

I am trying to export a table from BigQuery to Google Cloud Storage from the console/command line. The console job runs for a few minutes and errors out without any error code, and the command-line job, after running for some time, gives the error below:
BigQuery error in extract operation: Error processing job 'data-flow-experiment:bqjob_r308ff0f73d1820a6_00000157f77e8ab9_1': Backend error. Job aborted.
The job ID of the command-line run is given above.
Billing is enabled for the project, and the BigQuery service is also enabled.
I also get the error below when I try to create a bucket in Google Cloud Storage:
AccessDeniedException: 403 The account for the specified project is read only.
This is despite the IAM user I am using having owner access; I have created buckets with this account previously and have also extracted tables in the past.
Please guide.
For the BigQuery issue:
Do you happen to have a timestamp column with out-of-range values (say, far, far into the future)?
If so, you can just wait two more days, as the fix is on its way.
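To check whether a table is affected, here is a quick sketch that counts suspiciously far-future timestamp values; the table and column names are placeholders and the cutoff date is arbitrary:

    # Count rows whose timestamp is implausibly far in the future.
    # Table and column names are placeholders; adjust the cutoff as needed.
    from google.cloud import bigquery

    client = bigquery.Client()
    query = """
        SELECT COUNT(*) AS bad_rows
        FROM `my-project.my_dataset.my_table`
        WHERE my_timestamp_col > TIMESTAMP '9000-01-01 00:00:00 UTC'
    """
    for row in client.query(query).result():
        print("rows with far-future timestamps:", row["bad_rows"])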

Cannot process data in separate locations

I am trying to load a CSV file from Google Cloud Storage into BigQuery via the web UI, but sometimes the load fails.
The error message is "Cannot process data in separate locations".
What does it mean?
And how can I fix it?
This was an unintended consequence of an update to the BigQuery service. We'll provide additional followup on this bug:
https://code.google.com/p/google-bigquery/issues/detail?id=270
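Outside of that bug, this message normally means the destination dataset and the source bucket live in different locations, so they need to be co-located. For reference, a minimal sketch of the same CSV load via the Python client, with placeholder names:

    # Load a CSV from GCS into BigQuery; the bucket and the dataset must be
    # in the same location. All names below are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    )
    load_job = client.load_table_from_uri(
        "gs://my-bucket/my_file.csv",          # bucket location must match dataset
        "my-project.my_dataset.my_table",
        job_config=job_config,
    )
    load_job.result()  # waits for completion and raises on error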