I keep getting an "Failed to create table: Unexpected error" every time I try to upload my CSV file(30,395KB) to Bigquery.
I've tried to see if zipping the file helped but it doesn't. I've never had this problem before with all of the other CSV files
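When the web UI only reports "Unexpected error", loading the same file through the client library (or the bq CLI) usually surfaces a more specific message. A minimal sketch, assuming placeholder names for the table (my_dataset.my_table) and the file (data.csv):

    # Load the CSV with the Python client; job.errors gives the per-field
    # details that the web UI hides behind "Unexpected error".
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # assumes the first row is a header
        autodetect=True,       # let BigQuery infer the schema
    )

    with open("data.csv", "rb") as f:
        job = client.load_table_from_file(f, "my_dataset.my_table", job_config=job_config)

    try:
        job.result()           # wait for the load to finish
    except Exception:
        print(job.errors)      # detailed error list for the load job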
Related
I keep getting an error when I try to upload data from my own documents in CSV format under the table destination. How do I fix this so I can write queries on my data?
I tried changing the name of the file I was uploading and following the instructions from my course exactly, without any luck. I was expecting the data to be uploaded into my project so I could write queries to analyze it.
I want to create a table for my dataset in BigQuery by uploading a CSV file. When I upload it and click "Create table", it says:
unexpected error. Tracking number c986854671035387
What is this error and how can I solve it? (I also upgraded my BigQuery to the 90-day free trial.)
You need to check the data inside the CSV: make sure it has column names and no faulty records.
You can also download a sample CSV file from here and try:
http://www.mytrapture.com/sampledata/
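A quick way to do that check, as a sketch (the file name is a placeholder): confirm the first row looks like a header and that every record has the same number of columns.

    import csv

    with open("data.csv", newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)
        print("Header:", header)

        # Flag any record whose column count differs from the header.
        for line_no, row in enumerate(reader, start=2):
            if len(row) != len(header):
                print(f"Line {line_no}: expected {len(header)} columns, got {len(row)}")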
I clicked a table on the BigQuery dashboard and got this error:
However, I can get data when I run a SELECT on this table (so the table does exist).
I already have the highest admin privilege, so it shouldn't be a permission issue.
I created this table with a Python script that collects data, writes it into a CSV file, and uploads the CSV file to BigQuery every day. After I created the table, I changed the schema once, both in the script and on the dashboard. I'm not sure if that's the cause, but the table loading error started several days after I changed the schema.
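As an aside, the table's existence and its current schema can be confirmed outside the dashboard with the client library; a sketch with placeholder names:

    from google.cloud import bigquery

    client = bigquery.Client()
    # Raises NotFound if the table really is gone; otherwise prints the schema.
    table = client.get_table("my_project.my_dataset.my_table")
    for field in table.schema:
        print(field.name, field.field_type)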
If you have ad-blocking extensions, they might be the root cause of this issue. Try disabling them, then run your query again.
Hope it helps.
I have successfully loaded a large number of AVRO files (of the same schema type, into the same table), stored on Google Storage, using the bq CLI utility.
However, for some of the AVRO files I am getting a very cryptic error while loading into BigQuery. The error says:
The Apache Avro library failed to read data with the following error: EOF reached (error code: invalid)
With avro-tools I validated that the AVRO file is not corrupted; the repair report output:
java -jar avro-tools-1.8.1.jar repair -o report 2017-05-15-07-15-01_48a99.avro
Recovering file: 2017-05-15-07-15-01_48a99.avro
File Summary:
Number of blocks: 51 Number of corrupt blocks: 0
Number of records: 58598 Number of corrupt records: 0
I tried creating a brand-new table with one of the failing files, in case it was due to a schema mismatch, but that didn't help; the error was exactly the same.
I need help figuring out what could be causing the error here.
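As a complementary check (a sketch; the file name is the one from the report above), the file can be read end-to-end with fastavro, which fails loudly if the container is truncated or a block cannot be decoded:

    import fastavro

    with open("2017-05-15-07-15-01_48a99.avro", "rb") as f:
        reader = fastavro.reader(f)
        count = sum(1 for _ in reader)   # iterating forces every block to be decoded

    print("records read:", count)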
No way to pinpoint the issue without more information, but I ran into this error message and filed a ticket here.
In my case, a number of files in a single load job were missing columns, which was causing the error.
Explanation from the ticket:
BigQuery uses the alphabetically last file from the directory as the avro schema to read the other Avro files. I suspect the issue is with schema incompatibility between the last file and the "problematic" file. Do you know if all the files have the exact same schema or differ? One thing you could try to help verify this is to copy the alphabetically last file of the directory and the "problematic" file to a different folder and try to load those two files in one BigQuery load job and see if the error reproduces.
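One way to do that comparison locally, as a sketch (the first file name is a placeholder for the alphabetically last file; the second is the failing file from the question): extract the writer schema of each file with fastavro and diff them.

    import fastavro

    def writer_schema(path):
        with open(path, "rb") as f:
            return fastavro.reader(f).writer_schema

    last = writer_schema("2017-05-15-23-59-59_zzz.avro")       # alphabetically last file (placeholder)
    problem = writer_schema("2017-05-15-07-15-01_48a99.avro")  # the failing file

    print("schemas match" if last == problem else "schemas differ")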
I'm getting this error:
Error SQL query… #2006 - MySQL server has gone away
I suspect the CSV file might be the source of the problem, as other CSVs work. I've tried changing the following values on the SQL server:
key_buffer_size = 900M
max_allowed_packet = 900M
But that doesn't seem to fix the problem. I've also tried converting the file to SQL and XML, but it just doesn't want to import.
Any advice?
Here are the files:
CSV I'm trying to upload
CSV that works
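One thing worth confirming before blaming the file (a sketch; the connection details are placeholders): that the changed max_allowed_packet actually took effect on the running server, since #2006 often keeps happening when my.cnf was edited but the server was not restarted.

    import pymysql

    conn = pymysql.connect(host="localhost", user="root", password="secret")
    with conn.cursor() as cur:
        cur.execute("SHOW VARIABLES LIKE 'max_allowed_packet'")
        print(cur.fetchone())   # e.g. ('max_allowed_packet', '943718400') for 900M
    conn.close()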
With phpMyAdmin 4.4.13.1 and the CSV import, I was able to import your file, after I changed the first line to:
Username,Client Name,Date,Time,Published App,x
The only way I managed to import the file was to make smaller files and upload them one by one... which is strange, as the CSV wasn't that big: 4 MB and a few thousand records.
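A sketch of that "smaller files" workaround (chunk size and file names are arbitrary): split the CSV into pieces that each keep the header row, so every piece can be imported on its own.

    import csv

    CHUNK_ROWS = 1000  # rows per output file; tune to whatever the import tolerates

    with open("big.csv", newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)

        part, writer, out = 0, None, None
        for i, row in enumerate(reader):
            if i % CHUNK_ROWS == 0:          # start a new piece every CHUNK_ROWS rows
                if out:
                    out.close()
                part += 1
                out = open(f"part_{part}.csv", "w", newline="", encoding="utf-8")
                writer = csv.writer(out)
                writer.writerow(header)      # repeat the header in each piece
            writer.writerow(row)
        if out:
            out.close()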