BigQuery backend error - no more details available - google-bigquery

Job No: swift-sphinx-624:job_Ja1iYkl8OdF83J9xU5CIQJFlomM
The job is failing; I tried making the dataset public, to no avail. Any info more descriptive than 'backend error' would be greatly appreciated.
Really sorry, but I just don't have anything more to give; the error message is that undescriptive.

You should not upload a file that big directly to BigQuery. The proper way is to upload the file to Google Cloud Storage and then import it from there; this process is much more reliable.
Google's services can fail, and we have to account for that.
Load from Cloud Storage
https://developers.google.com/bigquery/loading-data-into-bigquery#loaddatagcs
Resumable uploads to Cloud Storage
https://developers.google.com/storage/docs/gsutil/commands/cp#resumable-transfers
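To make that concrete, here is a minimal sketch (the bucket, dataset, table, and schema names are all placeholders): gsutil cp resumes interrupted transfers of large files automatically, and bq load then imports from the Cloud Storage URI instead of uploading directly:

# Stage the big file in Cloud Storage; gsutil retries/resumes large transfers.
gsutil cp ./big_export.csv gs://my-staging-bucket/big_export.csv
# Load into BigQuery from the gs:// URI (inline schema shown for brevity).
bq load --source_format=CSV my_dataset.my_table gs://my-staging-bucket/big_export.csv name:string,value:integer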
I hope this helps!

Related

WhatsApp Cloud API Save Session Attributes

I am creating a component in AWS Lambda that is responsible for receiving a WhatsApp message, retrieving the text and sending it to another system.
This other system can connect to multiple cognitive engines, resolve the user's intent, and generate an appropriate response.
Before getting into the trouble of saving information in DynamoDB, I wanted to find out whether it is possible to save a field in the WhatsApp session.
I've read the documentation and done a lot of research in the Postman collection provided by Meta, and I don't see how to do it, or whether it is possible at all.
Basically, I need to save a session ID from the other system so I can keep track of the conversation.
I've read a lot of the documentation and I don't see anything that can help me.
WhatsApp Cloud API
Thank you very much for the help.

Azure Data Factory save CSV from URL

I need to download a CSV file from a URL using Azure Data Factory v2.
The URL is: https://api.worldtradingdata.com/api/v1/history?symbol=SNAP&output=csv&sort=newest&api_token=demo
Do you know how to do this? I was thinking about downloading it to Blob Storage, but I'm unsure which connector to use.
Thanks,
Bob
This is easy thanks to the HTTP connector; here is a tutorial: https://learn.microsoft.com/en-us/azure/data-factory/connector-http
The tutorial walks you through creating a linked service, a dataset on top of that linked service, and finally a Copy activity that uses the dataset.
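As a sketch of the first step (the linked service name here is made up; the type and properties follow the connector documentation linked above), an anonymous HTTP linked service pointing at that API could look like:

{
  "name": "WorldTradingDataHttp",
  "properties": {
    "type": "HttpServer",
    "typeProperties": {
      "url": "https://api.worldtradingdata.com",
      "authenticationType": "Anonymous"
    }
  }
}

A dataset on this linked service would then carry the relative URL (/api/v1/history?symbol=SNAP&output=csv&sort=newest&api_token=demo), and the Copy activity would use a Blob Storage dataset as the sink.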
It should be fairly easy to follow, but if you have any questions, be sure to reply and ask away!
Hope this helped!

BigQuery API [HELP]

My spreadsheet was working normally, but then this error started to appear. My account is a business account, so I had not enabled billing.
I've made several spreadsheets before and none of them needed it.
Can someone help me?
[Screenshot of the error]
The error message states that you do not have a billing account attached to this project, and therefore you will not be able to query your own data in BigQuery until you attach one. If everything works from the UI of the Google Cloud Platform Console, you are probably using a different project there.
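As a quick way to check (the project and billing-account IDs below are placeholders), the gcloud CLI can show whether billing is attached to the project and link an account if it is not:

# Show the billing status of the project.
gcloud beta billing projects describe my-project-id
# Attach an existing billing account to the project.
gcloud beta billing projects link my-project-id --billing-account=0X0X0X-0X0X0X-0X0X0X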

Upload of Large Files to Google Drive with the Google Drive API for Android (GDAA)

I realize that similar questions have been asked before. However, none of them was answered.
My problem is the following:
To upload a file to Google Drive, you need to create DriveContents.
You either do this by creating them out of thin air:
// Create brand-new, empty DriveContents.
DriveApi.DriveContentsResult driveContentsResult = Drive.DriveApi.newDriveContents(getGoogleApiClient()).await();
DriveContents contents = driveContentsResult.getDriveContents();
Or you do this by opening an already existing file:
// Open an existing DriveFile for writing and grab its DriveContents.
DriveApi.DriveContentsResult driveContentsResult = driveFileResult.getDriveFile().open(getGoogleApiClient(), DriveFile.MODE_WRITE_ONLY, null).await();
DriveContents contents = driveContentsResult.getDriveContents();
You are now ready to fill the DriveContents with data. You do this by obtaining an OutputStream and by writing to this OutputStream:
// Wrap the contents' file descriptor in an OutputStream and write the payload to it.
FileOutputStream fileOutputStream = new FileOutputStream(driveContentsResult.getDriveContents().getParcelFileDescriptor().getFileDescriptor());
Now this is where the problem starts: by filling this OutputStream, Google Play services simply copies the file I want to upload and creates a local copy. If you have 0.5 GB of free space on your phone and you want to upload a 1.3 GB file, this is not going to work: there is not enough storage space.
So how is it done? Is there a way to directly upload to Google Drive via the GDAA that does not involve creating a local copy first, and THEN uploading it?
Does the Google REST API handle these uploads any different? Can it be done via the Google REST API?
EDIT:
It seems this cannot be done via the GDAA. For people looking for a way to do resumable uploads with the Google REST API, have a look at my example here on StackOverflow.
I'm not sure whether it can, but you can certainly try to use the Google REST API to upload your file.
You could use a multipart upload or a resumable upload:
Multipart upload
If you have metadata that you want to send along with the data to upload, you can make a single multipart/related request. This is a good choice if the data you are sending is small enough to upload again in its entirety if the connection fails.
Resumable upload
To upload data files more reliably, you can use the resumable upload protocol. This protocol allows you to resume an upload operation after a communication failure has interrupted the flow of data. It is especially useful if you are transferring large files and the likelihood of a network interruption or some other transmission failure is high, for example, when uploading from a mobile client app. It can also reduce your bandwidth usage in the event of network failures because you don't have to restart large file uploads from the beginning.
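To illustrate, here is a sketch only (the file path and MIME type are made up, an already-authorized Drive client is assumed, and the v3 Java client library is used): a resumable media upload streams the file from disk in chunks rather than materializing a second on-device copy, and after an interruption the protocol continues from the last confirmed chunk instead of restarting:

import com.google.api.client.googleapis.media.MediaHttpUploader;
import com.google.api.client.http.InputStreamContent;
import com.google.api.services.drive.Drive;
import java.io.BufferedInputStream;
import java.io.FileInputStream;

// Assumes `drive` is an authorized com.google.api.services.drive.Drive instance.
java.io.File localFile = new java.io.File("/path/to/big_video.mp4"); // hypothetical path
com.google.api.services.drive.model.File metadata = new com.google.api.services.drive.model.File();
metadata.setName("big_video.mp4");

// Stream straight from disk; nothing is duplicated locally.
InputStreamContent content = new InputStreamContent("video/mp4", new BufferedInputStream(new FileInputStream(localFile)));
content.setLength(localFile.length()); // a known length enables progress reporting

Drive.Files.Create create = drive.files().create(metadata, content);
MediaHttpUploader uploader = create.getMediaHttpUploader();
uploader.setDirectUploadEnabled(false); // use the resumable protocol
uploader.setChunkSize(MediaHttpUploader.DEFAULT_CHUNK_SIZE); // upload in ~10 MB chunks
com.google.api.services.drive.model.File uploaded = create.execute();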
You must remember as discussed in this SO question:
The GDAA's main identifier, the DriveId lives in GDAA (GooPlaySvcs) only and does not exist in the REST Api.
The ResourceId can be obtained from the DriveId only after GDAA has committed (uploaded) the file/folder.
You will run into a lot of timing issues caused by the fact that GDAA 'buffers' network requests on its own schedule (system-optimized), whereas the REST API lets your app control the waiting for the response.
Lastly, you can check this related SO question regarding tokens and authentication in HTTP requests on Android. There are also some examples by seanpj for both GDAA and the REST API that might help you.
Hope this helps.

Is there a way to resume a Google BigQuery export after it reached the 500 URI quota cap?

I'm exporting a ~500 GB table to Google Cloud Storage as AVRO files, and I hit the 500 URI quota cap. It looks like a hard limit, even though I've given them payment information and would gladly toss them a few bucks to finish getting my data :-).
Is there any way to resume an export from where it left off? Maybe an undocumented offset parameter in the API? I didn't find one, but figured it was worth asking the community.
BigQuery Quota Policy for reference.
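For context, this is roughly the shape of the export job in question (bucket, dataset, and table names are placeholders); assuming the cap applies to explicitly enumerated destination URIs, a single wildcard URI, which BigQuery expands into as many file shards as it needs, may be worth trying:

bq extract --destination_format=AVRO my_dataset.my_big_table 'gs://my-bucket/export/part-*.avro'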