I have a dataset located in europe-west3, and I'm trying to set up scheduled queries on that dataset. However, when setting up the scheduled query, the "processing location" option doesn't include europe-west3. Leaving it as "default" makes the processing location US, and the query is then unable to run. There are only about 7 processing locations available; I tried both EU and europe-west2, but neither works.
I don't really know what to do to get my queries to run on schedule. I can run the queries just fine normally, but when I try to schedule them, the processing location selector simply won't let me pick the correct location.
Any ideas?
Currently, Scheduled Queries does not support the region europe-west3. Follow (star) this public issue tracker to stay updated.
Right now, if you need to implement scheduled queries, you should create a replica of that dataset in a supported region and run them there. I would suggest copying the dataset to another region; however, cross-region dataset copy is also not available for europe-west3 right now.
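If you do go the dataset-copy route for a supported region, here is a rough sketch using the BigQuery Data Transfer Service Python client; all project and dataset IDs below are placeholders, the destination dataset must already exist in the target region, and availability of the copy service still depends on the regions involved:

    from google.cloud import bigquery_datatransfer

    transfer_client = bigquery_datatransfer.DataTransferServiceClient()

    # All IDs below are placeholders for illustration.
    destination_project_id = "my-project"
    destination_dataset_id = "my_dataset_copy_eu"  # must already exist in the target region
    source_project_id = "my-project"
    source_dataset_id = "my_dataset_europe_west3"

    transfer_config = bigquery_datatransfer.TransferConfig(
        destination_dataset_id=destination_dataset_id,
        display_name="Copy dataset to a supported region",
        data_source_id="cross_region_copy",
        params={
            "source_project_id": source_project_id,
            "source_dataset_id": source_dataset_id,
        },
        schedule="every 24 hours",
    )

    transfer_config = transfer_client.create_transfer_config(
        parent=transfer_client.common_project_path(destination_project_id),
        transfer_config=transfer_config,
    )
    print(f"Created copy config: {transfer_config.name}")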
I hope you can achieve what you desire without many headaches.
Related
A few days ago, I started receiving an error in my Scheduled Queries dashboard: "Error loading location europe-west8: BigQuery Data Transfer Service does not yet support location: europe-west8".
I'm in the US, so all 4 of my storage buckets are set to US or a US region, and I have confirmed their locations.
My datasets are all in US.
My scheduled queries are all in region "us".
Since this error started, my BigQuery Scheduled Queries that append data to tables have stopped running.
Where can I change the setting that seems to be calling europe-west8?
You need to check the region of the dataset you are using. The destination table for your scheduled query must be in the same region as the data being queried.
You can see the locations where scheduled queries are supported here.
You specify a location for storing your BigQuery data when you create a dataset. After you create the dataset, the location cannot be changed, but you can copy the dataset to a different location, or manually move (recreate) the dataset in a different location.
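For example, with the BigQuery Python client the location is fixed at dataset creation time (the project and dataset IDs below are placeholders):

    from google.cloud import bigquery

    client = bigquery.Client()

    # Placeholder IDs; the location is chosen here and cannot be changed later.
    dataset = bigquery.Dataset("your-project.your_dataset_us")
    dataset.location = "US"

    dataset = client.create_dataset(dataset, timeout=30)
    print(f"Created {dataset.full_dataset_id} in {dataset.location}")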
You can see more information about how locations work in BigQuery here.
EDIT
This is a known issue with the BigQuery UI; the engineering team is aware of it and is working towards a solution, although so far there isn't a specific ETA. Feel free to star the issue to raise further awareness of it.
There are two possible workarounds you can try to circumvent this. More specifically:
Workaround #1
Use the old UI, which you can switch to by clicking on "Disable editor tabs".
Workaround #2
In the Scheduled Query Editor, click the SCHEDULE dropdown and choose "Enable scheduled queries".
An overlay shows up with the "Enable scheduled queries" message box.
Click anywhere on the screen to close the overlay.
Click the SCHEDULE dropdown again, and the create/update options will be there.
If you are running scheduled queries, check that the processing location is set to the location of your data source and that the destination table is also correct.
See the docs about setting a query location and scheduling queries:
https://cloud.google.com/bigquery/docs/scheduling-queries
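As a rough illustration with the BigQuery Python client (the table reference is a placeholder), you can pin the processing location to the dataset's region explicitly:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Placeholder table; the location must match where the dataset is stored.
    sql = "SELECT COUNT(*) AS n FROM `your-project.your_dataset.your_table`"

    job = client.query(sql, location="EU")  # explicit processing location
    for row in job.result():
        print(row.n)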
I'm trying to create a scheduled job based on a saved query in BigQuery. I'm following the steps in the documentation here.
After entering all the fields, I'm getting the response "Error creating scheduled query: Aq" with no additional details.
I didn't find anything about this in the BigQuery documentation either.
The query I'm trying to create a schedule for runs fine on its own and gives appropriate results.
It turned out that, when a scheduled query is created for the first time, the console asks for permission by taking the user to a new browser tab. My browser settings were suppressing it.
After acknowledging the required permissions, I could create the scheduled query successfully.
I just started a new project on Google Cloud and set up some BigQuery datasets and tables. I now want to set up some scheduled queries. I have already enabled the BigQuery Data Transfer API. My query is valid (it's just SELECT * FROM table). I can't find anything about this error online.
(Screenshot of the error: "Error creating scheduled query: er".)
UPDATE: I've experimented a bit, and it seems to be an organization-wide issue. All projects, new and old, within my organization get this same error when trying to schedule a query. I tried a project in a different organization and did not have the issue. What could be causing this error for ALL projects in an organization?
UPDATE 2:
When querying a table that is not empty, the error changes to "Error creating scheduled query: Yn" instead of "Error creating scheduled query: er" (the error when the scheduled query would have queried an empty table).
I faced the same issue as you, and basically I just needed to run the query first before creating the scheduled query... and that did the trick.
From the BQ FAQs:
"Scheduled queries use features of BigQuery Data Transfer Service. Verify that you have completed all actions required in Enabling BigQuery Data Transfer Service."
Basically, what this means is that you need to enable the Data Transfer API in your project, AND give the user who creates the scheduled query a BigQuery Admin role so they have the right permissions to access that transfer service.
If done right, you should get a popup when creating the scheduled query to confirm that the Data Transfer Service has access to your user account (if you block popups, you might not see this message and get stuck).
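As a quick sanity check, a sketch like the one below (project ID and location are placeholders) tries to list transfer configs with the Data Transfer Service client; a PermissionDenied error usually points to the missing API enablement or missing role described above:

    from google.api_core.exceptions import PermissionDenied
    from google.cloud import bigquery_datatransfer

    client = bigquery_datatransfer.DataTransferServiceClient()

    # Placeholder project and location.
    parent = "projects/your-project-id/locations/us"

    try:
        configs = list(client.list_transfer_configs(parent=parent))
        print(f"Data Transfer Service reachable; {len(configs)} transfer config(s) found.")
    except PermissionDenied as exc:
        print(f"API not enabled or missing permissions: {exc}")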
If this error only occurs in your organisation, I believe it might be caused by an organisation policy on Google Cloud. I would encourage you to double-check whether there is any org policy causing this error. If that's not the case, open a support ticket with GCP.
What worked for me was signing in through Incognito Mode with just my account and attempting to save the scheduled query. I have multiple Google Accounts signed in at one time, and for whatever reason BigQuery throws this generic error even after authorization is successful and BigQuery is granted the access it requested.
You need to make sure that you are creating the query under the targeted project, not any other project, because otherwise it won't appear.
You also need to enable the API, as mentioned in one of the above answers.
This eventually worked for me when I ran it in an incognito window.
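One way to keep the target project explicit when working outside the UI is to pass it to the clients directly; this is just a sketch, and the project ID is a placeholder:

    from google.cloud import bigquery, bigquery_datatransfer

    # Placeholder project ID; pin it explicitly so nothing is created under
    # whatever default project the credentials happen to use.
    project_id = "your-target-project"

    bq_client = bigquery.Client(project=project_id)
    dts_client = bigquery_datatransfer.DataTransferServiceClient()

    # Scheduled queries created under this parent belong to the target project.
    parent = dts_client.common_project_path(project_id)
    print(bq_client.project, parent)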
All my datasets, tables, and ALL items inside BQ are in EU. When I try to set up a 15-minute scheduled query from a view to a table, I get an error regarding my location, which is incorrect, because both source and destination are in EU...
Does anyone know why?
There is a known transient issue matching your situation; the GCP support team needs more time for troubleshooting. There may be a potential issue in the UI. I would ask you to try the following steps:
First, try the same operation in Chrome's incognito mode.
Another possible workaround is to follow this official guide using an approach other than the UI (the bq CLI or the client libraries, for instance).
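A minimal sketch of that non-UI approach with the Data Transfer Service Python client is below, using placeholder project, dataset, view, and table names; the location in the parent must match the region where your data lives (eu in your case):

    from google.cloud import bigquery_datatransfer

    client = bigquery_datatransfer.DataTransferServiceClient()

    # Placeholder IDs for illustration.
    project_id = "your-project-id"
    location = "eu"  # must match where the queried data is stored
    destination_dataset_id = "your_dataset"

    transfer_config = bigquery_datatransfer.TransferConfig(
        destination_dataset_id=destination_dataset_id,
        display_name="View to table every 15 minutes",
        data_source_id="scheduled_query",
        params={
            "query": "SELECT * FROM `your-project-id.your_dataset.your_view`",
            "destination_table_name_template": "your_table",
            "write_disposition": "WRITE_TRUNCATE",
        },
        schedule="every 15 minutes",
    )

    transfer_config = client.create_transfer_config(
        parent=f"projects/{project_id}/locations/{location}",
        transfer_config=transfer_config,
    )
    print(f"Created scheduled query: {transfer_config.name}")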
I hope it helps.
So, we have this one project which uses Cloud Storage and BigQuery as services. All has been well.
Then I wanted to add Cloud SQL to this project to try it out. It asked for a unique Project ID, so I gave it one. (The Project ID is different from the Project Number.)
Ever since then, I've been having a difficult time accessing my BigQuery tables. When I go to the BigQuery web interface, the URL contains the Project ID instead of the original Project Number. It shows the list of datasets, but now shows the Project Number before each dataset name and the datasets are greyed out and inaccessible. If I manually change the URL to contain the Project Number instead of the Project ID, it appears to work although it shows the list of datasets in the left nav twice, one set greyed out and inaccessible and the other set seemingly accessible.
At the same time, some code in Apps Script that I've been successfully using to access BigQuery is now regularly failing with a generic "We're sorry, a server error occurred. Please wait a bit and try again." I'm not sure if this is related to the Project ID/Project Number confusion, or if it's just a red herring.
Since we actively use the Cloud Storage service of this project, I am trying to be cautious with further experimentation with this project. I'm not sure if I should delete the Cloud SQL service in this project to get it back to the way it was, or if this is a known issue with some back-end solution. Please advise.
After setting the project ID, there can be a delay before BigQuery picks up the change. It should happen within 15 minutes or so, but sometimes it takes longer.
If you send the project ID, I can make sure it has been updated.