BigQuery web interface saying unable to find destination table

Occasionally, when using the BigQuery web interface I receive an error banner at the top of the page saying "Unable to find destination table: <table_name>", where <table_name> is something like anon_somerandomgibberish. I have also had this happen when specifying a destination table for a query. Is this due to a temporary issue in the BigQuery service? If so, is there anything I can do to retrieve the results of the query without having to re-execute it?

This error appears when attempting to show the results of a query and a BigQuery API call to get the table details fails or times out. This could happen for a number of reasons, such as the table having been deleted or flakiness in the BigQuery service.
If this happens again, you can use the network tab in Chrome's (or another browser's) dev tools to see which API call is failing and with what error. The URL of the failing request should contain the name of the table that the error message complains about. If you can share those details with us, it can help us determine whether there's a bug in the UI or whether something is going wrong with the API service.
If you would like to retrieve the results of a previously run query, you can click on the Query History in the left nav, open the relevant query, and click the "Show Previous Results" button. Note that this button will only be present if the destination table still exists. Another option, if you know the full job ID of the query, is to navigate to
https://bigquery.cloud.google.com/results/<project_id>:<job_id>
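If you would rather do this outside the web UI, a minimal sketch with the google-cloud-bigquery Python client is shown below; the project name, job ID, and location are placeholders, so substitute the values from your own query history.

from google.cloud import bigquery

# Placeholder project, job ID, and location; take the real values from Query History.
client = bigquery.Client(project="my-project")
job = client.get_job("bquxjob_1234abcd_5678efgh", location="US")

# For a completed query job, result() reads the rows back from the job's
# (anonymous) destination table without re-running the query.
for row in job.result():
    print(dict(row))

As with the "Show Previous Results" button, this only works while the query's destination table still exists.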

Related

Can't save query in Scheduled Query: "Scheduled Query Error"

I have created a view in BigQuery and want it to update on a schedule. When trying to save a query as a scheduled query, an error occurs: "Scheduled Query Error".
This error is about permissions. You need these permissions to create a query scheduler with BigQuery:
bigquery.transfers.update or (bigquery.jobs.create and bigquery.transfers.get)
bigquery.jobs.create
bigquery.datasets.update
Another option is to grant the roles/bigquery.admin role, which includes all the permissions you need to schedule or modify a query.
You can read more about the required permissions in the BigQuery documentation.
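If you want to check which of these permissions your account actually holds on the project, one option is the Resource Manager testIamPermissions call. Below is a rough sketch; the project ID is a placeholder and the permission list mirrors the one above.

from googleapiclient import discovery

# Placeholder project ID.
project_id = "my-project"
needed = [
    "bigquery.transfers.update",
    "bigquery.transfers.get",
    "bigquery.jobs.create",
    "bigquery.datasets.update",
]

# testIamPermissions returns the subset of the listed permissions the caller has.
crm = discovery.build("cloudresourcemanager", "v1")
response = crm.projects().testIamPermissions(
    resource=project_id, body={"permissions": needed}
).execute()
granted = set(response.get("permissions", []))
print("missing:", sorted(set(needed) - granted))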
When I run a query, unless I manually specify the data location, BigQuery defaults to the US multi-region and uses resources in the US.
But then the BigQuery Data Transfer Service throws an error, since I am now trying to export the data from a query that was executed in the US to a table that sits in the EU.
So, to fix that, before scheduling the query I have to go into the query's detailed settings and change the Data Location to EU,
then save these settings and finish scheduling the query.
Before that, make sure you have enabled billing.
The error may also be because the BigQuery Data Transfer Service needs to be trusted through a pop-up window, which bothered me a lot.
If the pop-up window does not appear, check the address bar to make sure it is not being blocked. If that is not the cause, you can try changing the data location (it relates to where you store your data) and submit again.
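To avoid the US/EU location mismatch described above altogether, you could also create the scheduled query programmatically and pin the location in the request's parent path. A minimal sketch with the BigQuery Data Transfer Python client follows; the project, dataset, query, table template, and schedule are all placeholders.

from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder project; note the explicit "eu" location in the parent path,
# which must match the location of the destination dataset.
parent = "projects/my-project/locations/eu"

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="my_eu_dataset",
    display_name="Nightly scheduled query",
    data_source_id="scheduled_query",
    params={
        "query": "SELECT CURRENT_TIMESTAMP() AS run_at",
        "destination_table_name_template": "nightly_{run_date}",
        "write_disposition": "WRITE_TRUNCATE",
    },
    schedule="every 24 hours",
)

created = client.create_transfer_config(parent=parent, transfer_config=transfer_config)
print("Created scheduled query:", created.name)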
I had the same problem and it turned out to be the pop-up blocker in my Firefox browser. The pop-up blocker in Chrome also stopped the scheduled query from working, but Chrome was a bit more visible about the fact that pop-ups were blocked.
On Firefox, click the pop-up blocker icon in the address bar to see blocked pop-ups.
Change the pop-up permission to 'Allow'.
I had exactly this generic error message 'Scheduled query error', and it was driving me mad. I checked all permissions and locations etc.
It turned out that the 'Time Travel Window' was set to less than 48 hours for the dataset I was trying to write into, which apparently BigQuery doesn't like.
To fix this, I executed this query:
ALTER SCHEMA `PROJECTNAME.DATASETNAME`
SET OPTIONS(
max_time_travel_hours = 72);
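If you want to confirm what the dataset is currently set to before (or after) running the statement above, a small sketch with the Python client is below; the dataset name is the same placeholder, and the max_time_travel_hours property is only exposed in newer google-cloud-bigquery releases, hence the getattr guard.

from google.cloud import bigquery

client = bigquery.Client()
dataset = client.get_dataset("PROJECTNAME.DATASETNAME")  # same placeholder as above

# Newer google-cloud-bigquery versions expose max_time_travel_hours on Dataset;
# older versions will simply print None here.
print(getattr(dataset, "max_time_travel_hours", None))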

BigQuery scheduled query creation resulting in error

I'm trying to create a scheduled job based on a saved query in BigQuery. I'm following the steps in the documentation here.
After entering all the fields, I get the response "Error creating scheduled query: Aq" with no additional details.
I didn't find anything about this in the BigQuery documentation either.
The query I'm trying to create a schedule for runs fine on its own and gives appropriate results.
It turned out that, when a scheduled query is created for the first time, the console asks for permission by taking the user to a new browser tab, and my browser settings were suppressing it.
After acknowledging the required permissions, I could create the scheduled query successfully.
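Once the authorization has been granted, one way to double-check that the scheduled query really was created is to list the project's transfer configs, since scheduled queries are stored as transfer configs with data_source_id "scheduled_query". A small sketch with placeholder project and location:

from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder project and location; use the location your data lives in.
parent = "projects/my-project/locations/us"
for config in client.list_transfer_configs(parent=parent):
    print(config.display_name, config.data_source_id, config.state)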

Error when creating scheduled query on BigQuery: "Error creating scheduled query: er"

I just started a new project on Google Cloud and set up some BigQuery datasets and tables. I now want to set up some scheduled queries. I have already enabled the BigQuery Data Transfer API. My query is valid (it's just SELECT * FROM table). I can't find anything about this error online.
UPDATE: I've experimented a bit and it seems to be an organization-wide issue. All projects, new and old, within my organization get this same error when trying to schedule a query. I tried a project in a different organization and did not have the issue. What could be causing this error for ALL projects in an organization?
UPDATE 2:
By querying a table that is not empty, the error changes to "Error creating scheduled query: Yn" instead of "Error creating scheduled query: er" (which appears when the scheduled query would have queried an empty table).
I faced the same issue as you, and basically I just needed to run the query once before creating the scheduled query... and that did the trick.
From the BQ FAQs:
"Scheduled queries use features of BigQuery Data Transfer Service. Verify that you have completed all actions required in Enabling BigQuery Data Transfer Service."
Basically, what this means is that you need to enable the Data Transfer API in your project, AND give the user who creates the scheduled query a BigQuery admin role in order to have the right permissions to access that transfer service.
If done right, you should get a pop-up when creating the scheduled query to confirm that the Data Transfer Service has access to your user account (if you block pop-ups you might not see this message and get stuck).
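If you would rather confirm that the BigQuery Data Transfer API is actually enabled on the project than trust the console, a quick sketch against the Service Usage API is below (the project ID is a placeholder):

from googleapiclient import discovery

# Placeholder project ID.
name = "projects/my-project/services/bigquerydatatransfer.googleapis.com"

serviceusage = discovery.build("serviceusage", "v1")
service = serviceusage.services().get(name=name).execute()
print(service["state"])  # expect "ENABLED"; "DISABLED" means the API still needs enabling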
If this error only occurs in your organisation, I believe it might be caused by an organisation policy on Google Cloud. I would encourage you to double-check whether there is any org policy causing this error. If that's not the case, open a support ticket with GCP.
What worked for me was signing in through Incognito Mode with just my account and attempting to save the scheduled query. I have multiple Google Accounts signed in at one time, and for whatever reason BigQuery throws this generic error even after authorization is successful and BigQuery is granted the access it requested.
You need to make sure that you are creating the query under the targeted project, not any other project, because otherwise it won't appear.
You also need to enable the API, as mentioned in one of the answers above.
This eventually worked for me when I ran it in an incognito window.

Liferay 6.2 - No UserNotificationEvent exists with the primary key?

I'm facing an issue regarding Notifications on my portal (Liferay 6.2).
After I had the idea to clean old (and useless) notifications out of the DB table USERNOTIFICATIONEVENT, my notification portlet started crashing.
Every time I open the notifications I get the following error:
Caused by: com.liferay.portal.NoSuchUserNotificationEventException: No UserNotificationEvent exists with the primary key 115765
Although my table is empty, when I log in with a user the notification count shows 20 (for example), and when I click on them I get the error. When I create a new notification with Java code, the table updates and inserts the new notification, so after that the count shows 21.
How is it possible to see 21 notifications when USERNOTIFICATIONEVENT contains only 1 record?
How is it possible? It's because you manipulated the database without fully understanding it, a common recipe for disaster. See the question "Check where liferay site will store in which table details will fetch?" for an argument to not bother: if you do anything with the data, do it through the API, never through direct database manipulation. Also check the link contained in that answer.
There are typically additional data structures you'd need to update as well, for example metadata for permission checks or the full-text index. And that's not a complete list.
Restoring your backup is the safest way to recover, because even if you otherwise get it to work now, the upgrade routines for the next version might find unexpected data, and then it's too late.

Problems with BigQuery and Cloud SQL in same project

So, we have this one project which uses Cloud Storage and BigQuery as services. All has been well.
Then, I wanted to add Cloud SQL to this project to try it out. It asked for a unique Project ID so I gave it one. (The Project ID is different than the Project Number.)
Ever since then, I've been having a difficult time accessing my BigQuery tables. When I go to the BigQuery web interface, the URL contains the Project ID instead of the original Project Number. It shows the list of datasets, but now shows the Project Number before each dataset name and the datasets are greyed out and inaccessible. If I manually change the URL to contain the Project Number instead of the Project ID, it appears to work although it shows the list of datasets in the left nav twice, one set greyed out and inaccessible and the other set seemingly accessible.
At the same time, some code in Apps Script that I've been using successfully to access BigQuery is now regularly failing with a generic "We're sorry, a server error occurred. Please wait a bit and try again." I'm not sure if this is related to the Project ID/Project Number confusion, or if it's just a red herring.
Since we actively use the Cloud Storage service of this project, I am trying to be cautious with further experimentation with this project. I'm not sure if I should delete the Cloud SQL service in this project to get it back to the way it was, or if this is a known issue with some back-end solution. Please advise.
After setting the project ID, there can be a delay before BigQuery picks up the change. It should happen within 15 minutes or so, but sometimes it takes longer.
If you send me the project ID, I can make sure it has been updated.
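One quick way to check whether BigQuery has picked up the new Project ID is to list the datasets against that ID directly with the Python client; the ID below is a placeholder.

from google.cloud import bigquery

# Placeholder: use the new Project ID you chose, not the Project Number.
client = bigquery.Client(project="my-new-project-id")
for dataset in client.list_datasets():
    print(dataset.dataset_id)

If this lists your datasets without error, the rename has propagated and the web UI should follow shortly.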