Getting an error while creating a table in BigQuery - google-bigquery

I am getting the following error while creating a table:
Unexpected error
Tracking number: c122671418813749

Related

Getting error while using Google Data Transfer to transfer multiple CSV table references from GCS to BigQuery

I'm getting an error while trying to transfer files from Google Cloud Storage to Google BigQuery. This is the error:
Error while reading data, error message: CSV table references column position 101, but line starting at position:2611 contains only 101 columns
A new field was recently added, so we believe this may be the issue, because out of many loads only 3 per day are working.
When I read this error, I understand it to mean that the line starts at the wrong column position, but correct me if I am wrong.
Can this be corrected?
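If the failing files are older CSVs that predate the newly added field, one hedged workaround is to let BigQuery treat missing trailing columns as NULLs. This is a sketch using the bq CLI rather than the transfer service, and the project, dataset, table, and bucket names are placeholders:
# Sketch: accept rows that are missing optional trailing columns (placeholder names).
$ bq load --source_format=CSV --allow_jagged_rows --skip_leading_rows=1 "my-project:my_dataset.my_table" gs://my-bucket/path/*.csv
Alternatively, regenerating the older files with the new column present (even if empty) keeps the schema and the data in step.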

BigQuery returns Unknown error after creating table name with '_ads' suffix

I tried both the API and the GUI to create this empty table and they both failed.
I have created many tables via the API just fine, but only this name, organizes_ads, has a problem.
The same create process and schema can create organizes_ads_0 but not organizes_ads.
If I try to get this table via the API, it returns:
{"error":{"code":-1,"message":"A network error occurred, and the request could not be completed."}}
I want to use this name because it is a table name replicated from another source, so it would be awkward to hard-code a different name as a workaround.
[UPDATE] I also found that any table name with the suffix _ads is broken (so nothing is wrong with the schema).
This error could be caused by an AdBlocker.
I created a table with the _ads suffix, and when I enabled the AdBlocker I got the same error: Unknown error response from the server.
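Since an ad blocker only interferes with the web UI's browser requests, one hedged way to confirm this is to create the table outside the browser, for example with the bq CLI; the project, dataset, and schema below are placeholders:
# Sketch: create the _ads table from the command line, bypassing any browser ad blocker.
$ bq mk --table "my-project:my_dataset.organizes_ads" id:STRING,created_at:TIMESTAMP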

BigQuery Error: 8822097

On trying to load a JSON file to BigQuery, I get the following error: "An internal error occurred and the request could not be completed. Error: 8822097". Is this error related to hitting the BigQuery daily load limit? It would be amazing if someone could point me to a glossary of errors.
{Location: ""; Message: "An internal error occurred and the request could not be completed. Error: 8822097"; Reason: "internalError"
Thanks!
Are you trying to load different types of files in a single command?
It can happen when you try to load from a Google Cloud Storage path that contains both compressed and uncompressed files:
$ gsutil ls gs://bucket/path/
gs://bucket/path/a.txt
gs://bucket/path/b.txt.gz
$ bq load --autodetect --noreplace --source_format=NEWLINE_DELIMITED_JSON "project-id:dataset_name.table_name" gs://bucket/path/*
Waiting on bqjob_id_1 ... (0s) Current status: DONE
BigQuery error in load operation: Error processing job 'project-id:bqjob_id_1': An internal error occurred and the request could not be completed. Error: 8822097
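If that is the cause, a hedged workaround is to load the compressed and the uncompressed objects in separate jobs so that each job sees only one file type; the bucket, path, and table names below reuse the placeholders from the example above:
# Sketch: split the load into one job per file type (placeholder names).
$ bq load --autodetect --noreplace --source_format=NEWLINE_DELIMITED_JSON "project-id:dataset_name.table_name" gs://bucket/path/*.txt
$ bq load --autodetect --noreplace --source_format=NEWLINE_DELIMITED_JSON "project-id:dataset_name.table_name" gs://bucket/path/*.txt.gz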
This error can also occur when you hit the BigQuery limit of 10,000 columns per table.
To verify this, you can check the number of distinct columns in the table:
bq --format=json show project:dataset.table | jq . | grep "type" | grep -v "RECORD" | wc -l
Reducing the number of columns is probably the best and quickest way to work around this issue.
We got the same error "An internal error occurred and the request could not be completed. Error: 8822097" when running a standard SQL query. Running the corresponding legacy SQL query gave us an error message that was actually actionable:
Error while reading table: ABC, error message: The reference schema
differs from the existing data: The required field 'XYZ' is
missing.
Fixing the underlying error, exposed by the legacy SQL query, also fixed the error for the standard SQL query.
In our case we have Avro files, and the table was created from them. Newer Avro files no longer contain a certain field, but the table still contained that field. Rebuilding the table from the new Avro files solved the issue. We also have views on top of the table, which may or may not change the resulting error message.
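A hedged way to rebuild such a table so that its schema matches the current Avro files is to replace it in a single load job; the project, dataset, table, and bucket names below are placeholders:
# Sketch: recreate the table with the schema taken from the current Avro files (placeholder names).
$ bq load --replace --source_format=AVRO "my-project:my_dataset.my_table" gs://my-bucket/avro/*.avro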

Google BigQuery error 404 Not Found for dataset which exists

I have a dataset named "data". I am getting a 404 Not Found error when trying to delete this dataset using the Java API.
I am able to create and delete tables under this dataset, but deleting the dataset itself returns an error.
I also tried this in the Google BigQuery web console, where I get the error "Unable to find dataset:data".
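A hedged first check is to confirm the fully qualified dataset reference outside the Java code, for example with the bq CLI; the project id below is a placeholder:
# Sketch: verify the dataset reference, then delete it together with any remaining tables.
$ bq show --format=prettyjson my-project:data
$ bq rm -r -d my-project:data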

Query Failed Error: Property had unexpected type. Google BigQuery

I have an external table linked to our GCS account in Google BigQuery. I am attempting to run a simple SELECT query (select * from [.] LIMIT 10) in the Google BigQuery Web UI. The table has around 13 GB of data. The schema looks as if it is able to see the data correctly.
The error I received is:
Query Failed
Error: Property had unexpected type.
Job ID: csgapi:bquijob_3e7a58f7_155e4baea88
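One hedged way to narrow this down is to inspect the external table definition the query relies on and compare the declared column types against the files in GCS; the dataset and table names below are placeholders:
# Sketch: dump the external table definition to check the declared schema and source format (placeholder names).
$ bq show --format=prettyjson my_dataset.my_external_table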