Is it possible to change the region of a Google Cloud Platform project?

If I go to the Google Developers Console, I can see all my Cloud Platform projects, but not their regions.
How do I see the region of each project? And is it possible to change the region once it has been set?
Thanks for any help.

There is no such thing as a region of a GCP project.
In other words, region/location is specific to resources, and a GCP project is not permanently tied to a single region/location.
For example, you can have a project with multiple BigQuery datasets in different regions.
That same project can have many Compute Engine instances running, each one in a different region.
There is a default region that can be set per GCP project, but it can always be overridden when creating resources, and it is mainly used to infer a default location when none is specified in API calls.
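As a concrete illustration, here is a minimal sketch using the google-cloud-bigquery Python client (the project and dataset names are hypothetical) that puts two datasets of the same project in different locations:

from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# Two datasets in the same project, pinned to different locations.
ds_us = bigquery.Dataset("my-project.sales_us")
ds_us.location = "US"
client.create_dataset(ds_us)

ds_eu = bigquery.Dataset("my-project.sales_eu")
ds_eu.location = "EU"
client.create_dataset(ds_eu)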

Regarding the BigQuery aspect of this question:
Data Locations on a table are immutable once set.
To change the location, the easiest solution is to export the data to Google Cloud Storage, delete the table, re-create it in a dataset in the desired region, and then import the data.
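A minimal sketch of that sequence with the google-cloud-bigquery Python client, with hypothetical names; note that the location actually lives on the dataset that contains the table, and the GCS bucket must share the location of whichever dataset it is exchanging data with:

from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# 1. Export the table to GCS (bucket colocated with the old dataset).
client.extract_table(
    "my-project.old_dataset.events",
    "gs://my-us-bucket/events-*.avro",
    job_config=bigquery.ExtractJobConfig(destination_format="AVRO"),
).result()

# 2. Delete the old table.
client.delete_table("my-project.old_dataset.events")

# 3. If moving regions, copy the files to a bucket in the target region first,
#    then load them into a table in a dataset created in the desired location.
client.load_table_from_uri(
    "gs://my-eu-bucket/events-*.avro",
    "my-project.eu_dataset.events",
    job_config=bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.AVRO),
).result()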

https://cloud.google.com/appengine/docs/python/console/#server-location
Setting the server location
When you create your project, you can specify the location from which it will be served. In the new project dialog, click on the link to Show Advanced Options, and select a location from the pulldown menu:
us-central
us-east1
europe-west
If you select us-east1, your project will be served from a single region in South Carolina. The us-central and europe-west locations contain multiple regions in the United States and western Europe, respectively. Projects deployed to either us-central or europe-west may be served from any one of the regions they contain. If you want to colocate your App Engine instances with other single-region services, such as Google Compute Engine, you should select us-east1.

Related

How to query a BigQuery table in one GCP project and one location and write results to a table in another project and another location with Airflow?

I need to query a BigQuery table in one GCP project (say #1) and one location (EU) and write results to a table in another project (say #2) and another location (US) with Airflow.
The Composer/Airflow instance itself runs in project #2 and location US.
Airflow uses a GCP connection configured with a service account from project #2, which also has most of the rights in project #1.
I realise that this might involve multiple extra steps such as storing data temporarily in GCS, and this is fine as long as the end result is achieved.
How should I approach this problem? I saw quite a few articles, but none suggests a strategy for dealing with this situation, which I suppose is fairly common.
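No answer is recorded here, but one plausible approach, sketched below purely as an assumption-laden example (apache-airflow-providers-google operators on Airflow 2.4+; all project, dataset, and bucket names are hypothetical), is: materialize the query result in the EU, export it to an EU bucket, copy the files to a US bucket, then load them into the US dataset:

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator
from airflow.providers.google.cloud.transfers.gcs_to_gcs import GCSToGCSOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
import pendulum

with DAG(
    dag_id="eu_to_us_bq_copy",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule=None,
) as dag:
    # 1. Materialize the query result into a staging table in project #1 (EU).
    run_query = BigQueryInsertJobOperator(
        task_id="run_query",
        configuration={
            "query": {
                "query": "SELECT * FROM `project-1.eu_dataset.source_table`",
                "destinationTable": {
                    "projectId": "project-1",
                    "datasetId": "eu_dataset",
                    "tableId": "staging",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
        location="EU",
    )

    # 2. Export the staging table to a bucket in the EU.
    to_gcs = BigQueryToGCSOperator(
        task_id="to_gcs",
        source_project_dataset_table="project-1.eu_dataset.staging",
        destination_cloud_storage_uris=["gs://eu-bucket/staging-*.avro"],
        export_format="AVRO",
    )

    # 3. Copy the exported files across regions into a US bucket.
    copy = GCSToGCSOperator(
        task_id="copy",
        source_bucket="eu-bucket",
        source_object="staging-*.avro",
        destination_bucket="us-bucket",
    )

    # 4. Load from the US bucket into the destination table in project #2 (US).
    load = GCSToBigQueryOperator(
        task_id="load",
        bucket="us-bucket",
        source_objects=["staging-*.avro"],
        destination_project_dataset_table="project-2.us_dataset.result_table",
        source_format="AVRO",
        write_disposition="WRITE_TRUNCATE",
    )

    run_query >> to_gcs >> copy >> load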

Is it possible to extract job from big query to GCS across project ids?

Hey guys, I'm trying to export a BigQuery table to Cloud Storage à la this example. It's not working for me at the moment, and I'm worried that the reason is that the Cloud Storage project is different from the BigQuery table's project. Is this actually doable? I can't see how using the template above.
Confirming:
You CAN have your table in Project A exported/extracted to a GCS bucket in Project B. You just need to make sure you have the proper permissions on both sides. At a minimum:
READ on the respective dataset in Project A, and
WRITE on the respective bucket in Project B.
Please note: the dataset in Project A and the bucket in Project B MUST be in the same location (US, EU, etc.).
Simply put: source and destination must be in the same location.
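For illustration, a minimal sketch of such a cross-project extract with the google-cloud-bigquery Python client (project, dataset, and bucket names are hypothetical):

from google.cloud import bigquery

# The client's project (Project B here) is the one that runs and pays for the job.
client = bigquery.Client(project="project-b")

extract_job = client.extract_table(
    "project-a.my_dataset.my_table",          # source table in Project A
    "gs://project-b-bucket/my_table-*.csv",   # destination bucket in Project B
)
extract_job.result()  # waits for the job; raises on permission/location errors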

Google BigQuery, unable to load data into shared datasets

I created a project on Google BigQuery and enabled billing.
Went on to create a few datasets that were shared with my team members (Can EDIT permissions).
However, my teammates are unable to load data into the respective datasets shared with them. Whenever they try, it says billing is not enabled for this project.
I am able to load data into the datasets, but my team is not.
It's been more than 24 hours.
Thanks in advance
Note that in order to load data, they need to run a load job, and that load job needs to be run in a project. Perhaps billing is not enabled on the project they are using?
You can give your team members read access to the project (or greater) to allow them to run jobs in your own billing-enabled project.
You can share BigQuery access at the project level and at the dataset level.
See https://developers.google.com/bigquery/access-control.
I assume you are sharing at the dataset level. Can you try sharing the project instead with your team members? (here: https://cloud.google.com/console/project)
Please report back!
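As background, a load job always runs in (and is billed to) one specific project; a minimal sketch with the google-cloud-bigquery Python client, using hypothetical names, that makes the billing project explicit:

from google.cloud import bigquery

# Jobs run in, and are billed to, the client's project, even when the
# destination dataset has merely been shared with this user.
client = bigquery.Client(project="billing-enabled-project")

load_job = client.load_table_from_uri(
    "gs://some-bucket/data.csv",
    "owner-project.shared_dataset.my_table",  # dataset shared with Can EDIT
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        autodetect=True,  # infer the schema from the file
    ),
)
load_job.result()  # fails if billing is not enabled on the client's project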

BigQuery Data Location Setting

Is there a way to determine "BigQuery Data Location Setting", similar to "Cloud Storage Data Location Setting" or “Datastore Data Location Setting”?
Apparently there are some legal & tax issues for companies operating outside of the US when using services hosted in the US. Our legal guys have asked me to configure the BigQuery location to be in the EU, but I couldn't find where to configure this.
Thanks
There isn't currently a way to locate your BigQuery data in the EU. Right now, all of it is located in the United States.
That said, one of the reasons why this hasn't been done yet is lack of customer interest in EU datacenters. If you have a relationship with Google Cloud support and want this feature, please let them know. Alternatively, vote up the question and we'll take that into account when we prioritize new features.
This appears to have changed now, so you can actually select the EU datacenter:
http://techcrunch.com/2015/04/16/google-opens-cloud-dataflow-to-all-developers-launches-european-zone-for-bigquery/
Another issue arises when you want to copy datasets from one region to the other, which is not currently possible (at least directly). Here is how you can check the location of your dataset. Open up a Google Cloud Shell and enter this command:
bq show --format=prettyjson {PROJECT_ID}:{DATASET_NAME} | grep location
However, note that you cannot edit the location. You will need to back up/export all your tables, delete the dataset, and recreate the dataset with the desired location.
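The same location check can also be done with the google-cloud-bigquery Python client; a minimal sketch with a hypothetical dataset name:

from google.cloud import bigquery

client = bigquery.Client()

# Read the dataset's immutable location field.
dataset = client.get_dataset("my-project.my_dataset")
print(dataset.location)  # e.g. "US" or "EU"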

Problems with BigQuery and Cloud SQL in same project

So, we have this one project which uses Cloud Storage and BigQuery as services. All has been well.
Then, I wanted to add Cloud SQL to this project to try it out. It asked for a unique Project ID so I gave it one. (The Project ID is different than the Project Number.)
Ever since then, I've been having a difficult time accessing my BigQuery tables. When I go to the BigQuery web interface, the URL contains the Project ID instead of the original Project Number. It shows the list of datasets, but now shows the Project Number before each dataset name and the datasets are greyed out and inaccessible. If I manually change the URL to contain the Project Number instead of the Project ID, it appears to work although it shows the list of datasets in the left nav twice, one set greyed out and inaccessible and the other set seemingly accessible.
At the same time, some code that I've been successfully using in Apps Script to access BigQuery is now regularly failing with a generic "We're sorry, a server error occurred. Please wait a bit and try again." I'm not sure if this is related to the Project ID/Project Number confusion, or if it's just a red herring.
Since we actively use the Cloud Storage service of this project, I am trying to be cautious with further experimentation with this project. I'm not sure if I should delete the Cloud SQL service in this project to get it back to the way it was, or if this is a known issue with some back-end solution. Please advise.
After setting the project ID, there can be a delay before BigQuery picks up the change. It should happen within 15 minutes or so, but sometimes it takes longer.
If you send the project ID I can make sure it has been updated.