BigQuery result export to Google Drive is blocked for projects with VPC-SC enabled - google-bigquery

Conditions
BigQuery project inside a VPC-SC perimeter
User has the Project Owner role via IAM
Steps
Run a BigQuery SQL query in the developer console.
Click SAVE RESULTS.
Select CSV (Google Drive).
The export fails with: Export to Google Drive is blocked for projects with VPC-SC enabled
What I tried
Added VPC-SC ingress and egress settings as described in https://cloud.google.com/bigquery/docs/connected-sheets
Google has not officially published the project number for Google Drive, so I set the egress rule to allow all projects.
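Roughly, the egress policy I experimented with looked like this (a sketch only; the perimeter name is a placeholder, and the exact schema and identity type may need adjusting for your organization):
# egress.yaml
- egressFrom:
    identityType: ANY_IDENTITY
  egressTo:
    operations:
    - serviceName: bigquery.googleapis.com
      methodSelectors:
      - method: '*'
    resources:
    - '*'
gcloud access-context-manager perimeters update MY_PERIMETER --set-egress-policies=egress.yaml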

Related

How to upload a 9GB file with extension sql.gz to SQL BigQuery Sandbox

I want to upload a 9.6 GB file with the extension .sql.gz to my SQL BigQuery Sandbox (free) account. I received a message that the file is too big and that I need to upload it from the cloud. When trying to upload it from the cloud, I am asked to create a bucket, and if I want to create a bucket, I get the message: "billing must be enabled". Is there any alternative, specifically for an sql.gz file?
As of now, there is no alternative but to upload the .gz file to a bucket in Cloud Storage and use the bq command-line tool to create a new table.
You may enable billing for your existing project to use Cloud Storage.
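As a rough sketch of that flow, assuming the archive actually contains delimited data (for example gzip-compressed CSV) rather than a SQL dump, and using placeholder bucket, dataset, and table names:
gsutil cp data.csv.gz gs://my-bucket/data.csv.gz
bq load --source_format=CSV --autodetect mydataset.mytable gs://my-bucket/data.csv.gz
Keep in mind that BigQuery limits the size of compressed files it will load, so a file this large may need to be split or decompressed first.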

Where is the root Azure Storage instance?

I am trying to access logs from my Databricks notebook, which is run as a job. I would like to see these logs in an Azure storage account.
From the documentation: https://learn.microsoft.com/en-us/azure/databricks/administration-guide/workspace/storage#notebook-results
According to this, my results are stored in the workspace's root Azure Storage instance. However, I can't find any reference to this elsewhere online. How would I access this?
The documentation says:
Notebook results are stored in workspace system data storage, which is not accessible by users.
But you can retrieve these results via the UI, via the runs get-output endpoint of the Jobs REST API, or via the runs get-output command of the databricks CLI.
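For example, roughly like this (the run ID, workspace URL, and token are placeholders, and the API version may differ in your workspace):
databricks runs get-output --run-id 12345
curl -H "Authorization: Bearer $DATABRICKS_TOKEN" "https://<workspace-url>/api/2.1/jobs/runs/get-output?run_id=12345"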

How to create symbolic link for Google Cloud SDK gcloud directory on a NAS drive?

The problem: we have several servers that need to reference the same Google Cloud SDK credentials and we want to reference those credentials from a central location. What is the easiest way to share these credentials between several servers?
What we tried: we tried to create soft and hard symbolic links to the gcloud directory, but we did not have success; in both cases we received the following error message:
C:\Users\Redacted\AppData\Local\Google\Cloud SDK> bq ls
WARNING: Could not open the configuration file: [C:\Users\Redacted\AppData\Roaming\gcloud\configurations\config_default].
ERROR: (bq) You do not currently have an active account selected.
Found the answer: instead of linking the gcloud directory itself, create soft symbolic links for the two credential files. Run something like the following from the local gcloud config directory (the UNC path is a placeholder for the shared location on the NAS):
mklink access_tokens.db \\NAS\share\gcloud\access_tokens.db
mklink credentials.db \\NAS\share\gcloud\credentials.db

Copy files between cloud storage providers

I need to upload a large number of files to one cloud storage provider and then copy those files to another cloud storage provider using software that I will write. I have looked at several cloud storage providers and I don't see an easy way to do what I need to do unless I first download the files and then upload them to the second storage provider. I want to copy directly using the cloud storage providers' APIs. Any suggestions or links to storage providers whose APIs allow copying from one provider to another would be most welcome.
There are several options you could choose from. The first is a cloud transfer service such as MultCloud; I've used it to transfer from AWS S3 and Egnyte to Google Drive.
MultCloud https://www.multcloud.com is free for up to 30 GB of data traffic per month.
Mountain Duck https://mountainduck.io/ lets you mount each cloud service as a local drive (where a connector is available) and move files easily.
I hope this helps.
If you want to write code for it, use Google's gsutil:
The gsutil cp command allows you to copy data between your local file system and the cloud, copy data within the cloud, and copy data between cloud storage providers.
You will find detailed info at this link:
https://cloud.google.com/storage/docs/gsutil/commands/cp
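As a rough example (bucket names and paths are placeholders, and your S3 credentials need to be configured in the boto configuration that gsutil reads):
gsutil -m cp -r s3://my-source-bucket/path gs://my-destination-bucket/path
Note that for cross-provider copies gsutil still routes the data through the machine running the command; it is not a server-side copy.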
If you want an off-the-shelf tool, use MultCloud: https://www.multcloud.com/
It can download directly from the web, and it can also transfer files from one cloud storage service, such as Dropbox, to another, such as Google Drive.
cloudHQ, which is also available as a Chrome extension, is another good solution for syncing your data between clouds. You can check it out.

Migrate s3 data to google cloud storage

I have a python web application deployed on Google App Engine.
I need to grab a log file stored on Amazon S3 and load it into Google Cloud Storage. Once it is in Google Cloud Storage I may need to perform some transformations and eventually import the data into BigQuery for analysis.
I tried using gsutil as some sort of proof of concept, since boto is under the hood of gsutil and I'd like to use boto in my project. This did not work.
I'd like to know if anyone has managed to transfer files directly between the two clouds. If possible, I'd like to see a simple example. In the end this task has to be accomplished through code executing on GAE.
Per this thread, you can stream data from S3 to Google Cloud Storage using gsutil but every byte still has to take two hops: S3 to your local computer and then your computer to GCS. Since you're using App Engine, however, you should be able to pull from S3 and deposit into GCS. It's the same progression as above except App Engine is the intermediary, i.e. every byte travels from S3 to your app and then to GCS. You could use boto for the pull side and the Google Cloud Storage API for the push side.
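A rough sketch of that pull-and-push approach in Python (bucket names and the object key are placeholders; this uses boto3 and the google-cloud-storage client rather than the older boto-based APIs mentioned above, so treat it as illustrative only):
import boto3
from google.cloud import storage

# Pull the object from S3 into memory (fine for small log files;
# stream it in chunks for anything large).
s3 = boto3.client("s3")
data = s3.get_object(Bucket="my-s3-bucket", Key="logs/app.log")["Body"].read()

# Push the bytes to Google Cloud Storage.
gcs = storage.Client()
gcs.bucket("my-gcs-bucket").blob("logs/app.log").upload_from_string(data)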
Google allows you to import entire buckets from S3 to the storage service:
https://cloud.google.com/storage/transfer/getting-started
You can set file filters on the source bucket to only import the file you want, or a "directory" (i.e. anything with a certain prefix).
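If you prefer the command line over the console, a transfer job can be created roughly like this (bucket names and the credentials file are placeholders, and the available flags may vary by gcloud version):
gcloud transfer jobs create s3://my-source-bucket gs://my-destination-bucket --source-creds-file=aws-creds.json --include-prefixes=logs/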
I'm not aware of any cloud provider that provides an API for transferring data to a competing cloud provider. Cloud providers have no incentive to help you move your data to the competition. You will almost certainly have to read the data to an intermediate machine then write it to Google.
GCP supports not only transfers from S3; it also supports any storage that exposes an S3-compatible API.
https://cloud.google.com/storage-transfer/docs/create-transfers
https://cloud.google.com/storage-transfer/docs/s3-compatible
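For an S3-compatible source, the transfer job can point at a custom endpoint, roughly like this (the endpoint, bucket names, and agent pool are placeholders, and S3-compatible transfers generally require transfer agents to be set up first):
gcloud transfer jobs create s3://my-source-bucket gs://my-destination-bucket --source-endpoint=s3.example-provider.com --source-agent-pool=my-agent-pool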