BigQuery unexpected error when creating a table - google-bigquery

I am trying to create a table for a project. I have two CSV files saved on my desktop. I am following the course instructions for creating the table. Every time I hit 'Create table' I get "an unexpected error has occurred". Any help?

I encountered exactly the same issue. I moved the desired file to Google Drive, and when creating the table I used the option to link files from Drive. Since you are linking the file from Drive rather than loading it, you won't be able to preview the data, but it's there: run a query and you'll see it returns the data. Ciao.
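For example, once the table is linked you can sanity-check it with a simple query. A minimal sketch, assuming placeholder names (swap in your own project, dataset, and table):

```sql
-- Placeholder names: my-project, my_dataset, and my_drive_table are
-- hypothetical. A Drive-linked external table shows no preview in the
-- console, but it is queryable like any other table.
SELECT *
FROM `my-project.my_dataset.my_drive_table`
LIMIT 10;
```

If this returns rows, the link is working and the missing preview is just a limitation of Drive-backed external tables.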

Related

Azure Data Factory Failing with Bulk Load

I am trying to extract data from an Azure SQL Database; however, I'm getting the following error:
Operation on target Copy Table to EnrDB failed: Failure happened on 'Source' side. ErrorCode=SqlOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed with the following error: 'Cannot bulk load because the file "https://xxxxxxx.dfs.core.windows.net/dataverse-xxxxx-org5a2bcccf/appointment/2022-03.csv" could not be opened. Operating system error code 12(The access code is invalid.).
You might be thinking this is a permission issue, but if you take a look at error code 12 you will see the issue is related to bulk load. A related answer can be found here:
https://learn.microsoft.com/en-us/answers/questions/988935/cannot-bulk-load-file-error-code-12-azure-synapse.html
I thought I might be able to fix the issue by selecting bulk lock (see image).
But I still get the error.
Any thoughts on how to resolve this issue?
As the error is referring to the source side (2022-03.csv), I am not sure why you are making changes on the sink side. As explained in the thread you referred to, it appears the CSV file is getting updated by some other process once your pipeline starts executing. Referring back to the same thread: https://learn.microsoft.com/en-us/answers/questions/988935/cannot-bulk-load-file-error-code-12-azure-synapse.html
The changes suggested below should be made on the pipeline/process which is writing to 2022-03.csv.
![suggested change to the writing pipeline][1]
HTH
[1]: https://i.stack.imgur.com/SSzwt.png

BigQuery - Data transfer "Detected that no changes will be made to the destination table"

I use a script to generate files from an API and store them on Google Cloud Storage. Following this documentation, https://cloud.google.com/bigquery/docs/cloud-storage-transfer?hl=en_US#limitations, I created a BigQuery table with the corresponding schema in advance and then created a Data Transfer with the following configuration:
When I run the Data Transfer the following error shows up in the logs:
Detected that no changes will be made to the destination table
I've updated some of the files, added files, deleted files, etc., and every time I get the same message. I also have other Data Transfers that work just fine with the same BigQuery instance and Cloud Storage bucket.
The only issue I found on SO, "Not able to update Big query table with Transfer from a Storage file", says you need to wait 1 hour, but even after a day I get the same error.
Any idea as to what triggers BigQuery to determine changes have been made (or not)?

SQL Server - insufficient memory (mscorlib) / 'the operation could not be completed'

I have been working on building a new database. I began by building the structure within the database it is replacing, populating it as I created each set of tables. Once I had made additions, I would drop what had been created and execute the code to build the structure again, along with a separate file to insert the data. I repeated this until the structure and content were complete, to ensure each stage was as I intended.
The insert file is approximately 30 MB with 500,000 lines of code (I appreciate this is not the best way to do this, but for various reasons I cannot use alternative options). The final insert completed and took approximately 30 minutes.
A new database was created for me; the structure executed successfully, but the data would not insert. I received the first error message shown below. I have looked into this and it appears I need to use the sqlcmd utility to get around it, although I find it odd, as the script worked in the other database, which is on the same server and has the same autogrow settings.
However, when I attempted to save the file after this error, I received the second error message seen below. When I selected OK it took me to my file directory, as it would if I had selected Save As. I tried saving in a variety of places but received the same error.
I attempted to copy the code into Notepad to save my changes, but the code would not copy to the clipboard. I accepted I would lose my changes and rebooted my system. If I reopen this file and attempt to save it, I receive the second error message again.
Does anyone have an explanation for this behaviour?
Hm. This looks more like an issue with SSMS and not the SQL Server DB/engine.
If you've been doing this a few times, possibly Management Studio ran out of RAM?
Have you tried breaking the INSERT into batches/smaller files? A sketch of what that could look like is below.
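A minimal sketch, assuming a hypothetical table dbo.MyTable and script file insert_data.sql: terminate each group of inserts with GO so the script runs as many small batches instead of one huge one.

```sql
-- insert_data.sql (hypothetical file): split the 500,000 rows into
-- batches; each GO ends a batch, so the client never has to parse
-- and execute the entire script as a single unit.
INSERT INTO dbo.MyTable (Id, Name) VALUES
    (1, N'alpha'),
    (2, N'beta');
GO

INSERT INTO dbo.MyTable (Id, Name) VALUES
    (3, N'gamma'),
    (4, N'delta');
GO
```

You can then run the file with sqlcmd instead of opening it in the SSMS editor at all, which sidesteps the editor's memory limits: `sqlcmd -S YourServer -d YourDatabase -E -i insert_data.sql` (-E uses Windows authentication; use -U/-P for a SQL login).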

Open Refine Error Uploading Data?

I'm trying Google Refine out to address name disambiguation in my data.
Whenever I upload a CSV, however, I keep getting this error.
I've been following the tutorial at this link: Tutorial
Error uploading data
.import-temp/1405348781604/raw-data/spreadsheet/ccc (No such file or directory)
I also came across this in my Google search, naming a similar problem to the one I'm facing.
https://github.com/OpenRefine/OpenRefine/issues/670
But I don't know how to fix the issue. Am I supposed to go into the source code and edit the lines mentioned? If so, can someone please give me some directions about how to do that?
The reason for this error message is that you don't have write access for the folder that contains the OpenRefine files.
I first had the files in C:\Program Files (x86)\OpenRefine\. Then I moved the folder to D:\Documents\OpenRefine and the error disappeared.

Error message: Impossible to create ACCDE, MDE or ADE

Upon trying to create an execution-only front-end database file in *.accde format, I receive the error message "Impossible to create ACCDE, MDE, or ADE." After doing a bit of research, I am informed that this message happens when the dimensions of the database are too big. Looking at my database, which is only 10 KB with about 40+ linked tables linking to a back-end database of 45 KB, I am confused as to why I am not allowed to perform this action.
This error is common when there are errors compiling the database. If you open the database and go into the VBA window, are you able to compile the database without a problem (Debug menu -> Compile)? If you run into compile errors, fix them; once the file compiles successfully, try creating the ACCDE file again.