Copy Datasets - Between different Organizations in GCP - google-bigquery

We have two GCP Organizations: Company X in the EU and another company in the US. I know we can move datasets across regions.
But is it also possible to move resources across other Organizations?

Related

Cross-region movement - Transferring BigQuery and GCS data from US to EU locations

For compliance reasons, we would like to move all our BigQuery and GCS data from the US region to the EU region.
From my understanding, multi-region is either within the US or within the EU; there is no cross-region option as such.
Question 1: To move the data from US to EU (or vice versa), our understanding is that we need to explicitly move it, for example with the Storage Transfer Service. Is there a cost associated with this movement even though it stays within Google Cloud?
Question 2: We are also considering maintaining copies at both locations. In that case, is there a provision for cross-region replication? If so, what would be the associated cost?
Question 1:
You are moving data from one part of the world to another, so yes, you will pay the egress cost of the source location.
Sadly, today (November 28th, 2023), I can't 100% commit to that cost. I reached out to Google Cloud about a very similar question, and my Google Cloud contact told me that the pricing page was out of date: the Cloud Storage egress cost should apply, instead of the Compute Engine networking egress cost currently shown in the documentation.
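To get a rough order of magnitude anyway, the arithmetic is simple. A minimal sketch; the $/GiB rate below is a placeholder assumption, not the actual price, so check the current Cloud Storage data transfer pricing page for your source region:

```python
# Back-of-envelope egress cost for a cross-region move.
# The rate is a PLACEHOLDER assumption, not a real published price.

def egress_cost_usd(data_gib: float, rate_per_gib: float) -> float:
    """Estimate egress cost: you pay per GiB leaving the source region."""
    return data_gib * rate_per_gib

# Example: 5 TiB moved at an assumed $0.12/GiB inter-continental rate.
cost = egress_cost_usd(5 * 1024, 0.12)
print(f"${cost:,.2f}")  # → $614.40
```

Storage cost at the destination then applies on top of this one-time transfer cost.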
Question 2:
You copy the data, so you end up with the volume duplicated across two datasets, and your storage cost is doubled.
Every time you want to sync the data, you perform a copy. It is a plain copy, not a smart delta update, so be careful if you update data directly in the target dataset: the next copy will overwrite it!
Instead, use the target dataset only as a base to query, and duplicate the data (again) into an independent dataset where you can add your region-specific data.
According to the docs, once a dataset is created its location cannot be changed, but you can copy the dataset to a different location, or manually move (recreate) it in a different location.
The easier approach is copying; you can learn more about the requirements, quotas, and limitations here: https://cloud.google.com/bigquery/docs/copying-datasets
So:
There is no need for the Storage Transfer Service; you can copy datasets to a different location directly.
There is no mechanism for automatic replication across regions. Even a disaster recovery policy will require cross-region dataset copies.
BigQuery does not automatically provide a backup or replica of your data in another geographic region. You can create cross-region dataset copies to enhance your disaster recovery strategy.
https://cloud.google.com/bigquery/docs/availability#:%7E:text=cross%2Dregion%20dataset%20copies
So in both cases you need to work with dataset copies, and in the second scenario you also have to deal with data freshness.
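For reference, a scheduled dataset copy is created through the BigQuery Data Transfer Service using its cross_region_copy data source. A sketch of what the transfer-config payload might look like, under assumptions: the project and dataset names are hypothetical, and the field names follow the Data Transfer Service API:

```python
# Sketch of a transfer-config body for a scheduled cross-region dataset
# copy via the BigQuery Data Transfer Service ("cross_region_copy").
# Project/dataset names are hypothetical placeholders.

def cross_region_copy_config(src_project: str, src_dataset: str,
                             dest_dataset: str) -> dict:
    """Build a transfer-config payload for a recurring dataset copy."""
    return {
        "destination_dataset_id": dest_dataset,
        "display_name": f"copy {src_dataset} -> {dest_dataset}",
        "data_source_id": "cross_region_copy",
        "params": {
            "source_project_id": src_project,
            "source_dataset_id": src_dataset,
            # Each run overwrites destination tables, which is why direct
            # edits in the target dataset are lost on the next sync.
            "overwrite_destination_table": True,
        },
        "schedule": "every 24 hours",
    }

cfg = cross_region_copy_config("my-us-project", "sales_us", "sales_eu")
```

The destination location itself is determined by where the destination dataset was created, not by a field in this payload.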

What are the charges incurred for keeping Azure App Insights in one region and the Log Analytics Workspace in another?

I have my App Insights (Classic) in West US 2. I want to move it to workspace-based mode. My Log Analytics Workspace is in another region (West US). Is it OK to migrate? What are the pricing impacts of the resources being in different regions?
If you already have a workspace in a different region, that is fine. But if you want to move the workspace itself to a different region when it already has instances in it, that is unfortunately not possible yourself; Microsoft can move the region manually on request, and there is no extra charge for changing regions.
There is also Azure Resource Mover, which came into the picture in September 2020, but unfortunately Application Insights doesn't support it.

Can I specify default labels to apply to BigQuery jobs?

We have end users issuing queries against our BigQuery tables from multiple places: from BI tools, from the GCP console, from bash scripts that use bq, from Python scripts that call the API, etc. We would like to track the cost of queries so we can compare the cost of those different querying methods.
Within GCP, the way to differentiate costs is to put labels on resources. Is there a way to mandate the labels that queries from any (all?) of those querying mechanisms must carry? I suspect this is impossible for ad hoc bash/Python scripts, but perhaps we can mandate that a particular BI tool passes labels on all the queries it issues.
Any advice on this subject would be appreciated.
I don't think you can mandate labels, but you can create separate service accounts:
One for CLI
One for GCP console
One for BI
and track your costs based on that.
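As a sketch of how that tracking could then work: with one service account (or one label value) per access path, you can total on-demand cost per path from job metadata such as total_bytes_billed. The job records and the $/TiB rate here are hypothetical placeholders:

```python
# Sketch: group estimated on-demand query cost by a job label.
# Job records and the $/TiB rate are HYPOTHETICAL -- check current
# BigQuery on-demand pricing for the real rate.

ON_DEMAND_USD_PER_TIB = 6.25  # placeholder rate

def cost_by_label(jobs: list[dict], label_key: str) -> dict[str, float]:
    """Sum estimated on-demand cost, grouped by one job label."""
    totals: dict[str, float] = {}
    for job in jobs:
        group = job.get("labels", {}).get(label_key, "unlabeled")
        tib = job["total_bytes_billed"] / 2**40
        totals[group] = totals.get(group, 0.0) + tib * ON_DEMAND_USD_PER_TIB
    return totals

jobs = [
    {"labels": {"client": "bi_tool"}, "total_bytes_billed": 2 * 2**40},
    {"labels": {"client": "bq_cli"}, "total_bytes_billed": 2**40},
    {"labels": {}, "total_bytes_billed": 2**39},
]
print(cost_by_label(jobs, "client"))
# → {'bi_tool': 12.5, 'bq_cli': 6.25, 'unlabeled': 3.125}
```

The same grouping works per service account if you key on the job's user email instead of a label.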

Is it possible to specify the AWS region for the storage location?

Not sure if this is even the correct place to ask, but I couldn't find any relevant information on this topic (apart from an old forum post that was last answered a year ago).
Like the question says, does anyone know if it's possible to specify an AWS or GCP region that FaunaDB will use?
I saw that on the Pricing page, the Business plan offers Data locality* and Premium regions*, but they are marked as future features, with no further information like a roadmap or planned release quarter.
Many of my clients are Canadian or from Europe and they are already asking me about hosting their data in their own country. I know that AWS and Google offer data center locations in Canada (for example), so I'm just looking for any further information on this and if/when it will be possible.
I really, really don't want to have to host my own database on a private server.
Thanks in advance!
It is not possible to specify an AWS region. Fauna database transactional replication involves all deployed regions.
We are working towards the data locality feature, but it is not available yet, nor does it have a finalized definition.
When the data locality feature is closer to completion, we'll be able to say more.
Hi ouairz. As eskwayrd mentions, it's not possible to select individual regions in which to locate your data. We do plan to provide a set of individual replication zones across the globe which you can select to control your data residency needs. For example, there would be an EU zone which you would use for data that must stay resident in EU member states. Other zones may include, for instance, Asia, Australia, North America, etc. We are considering a Canadian zone. Please feel free to reach out to product#fauna.com for more details.

Can I Have Two Regions Under One Amazon S3 Account?

I have two WordPress blogs, and I am planning to use Amazon S3 with one blog and Amazon S3 + CloudFront with the other.
I read that we need to choose a location when we start our AWS account.
However, for one site (the one using CloudFront and Amazon S3) my target market is the US and UK, and for the other site (using Amazon S3 alone) my target market is India.
In this case, should I use two separate accounts? Or can I have one single account with two locations (US and Asia)?
The one I am using CloudFront for will have video streaming, and the one using S3 alone will be heavy on images.
Thank you in advance
You can have multiple locations within the same account. When creating a bucket, you are given a choice of which region to create it in; e.g., you can have different buckets within the same account located in different regions.
Thanks,
Andy
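A minimal sketch of how this looks programmatically, assuming boto3-style create_bucket parameters and hypothetical bucket names. Note the real S3 quirk that buckets in us-east-1 (the default) must omit the LocationConstraint:

```python
# Sketch: per-region bucket creation parameters in one AWS account,
# shaped like kwargs for boto3's s3_client.create_bucket.
# Bucket names are hypothetical.

def create_bucket_params(bucket: str, region: str) -> dict:
    """Build kwargs for creating an S3 bucket in a given region."""
    params = {"Bucket": bucket}
    # us-east-1 is the default region and must NOT be passed as a
    # LocationConstraint; every other region must be.
    if region != "us-east-1":
        params["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return params

# One account, two buckets in two regions:
us_bucket = create_bucket_params("myblog-videos", "us-east-1")
in_bucket = create_bucket_params("myblog-images", "ap-south-1")
```

CloudFront then sits in front of whichever bucket you choose as its origin, regardless of that bucket's region.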