Need to try out bigquery and google cloud sql free package, am asked to enable billing before creating an instance - google-bigquery

Members, I have been trying to learn how to use Google BigQuery and Cloud SQL, but I have run into a challenge with enabling billing; all I need is a free access package.
Question:
Is there a free package that would let me practice Google Cloud SQL and BigQuery? If yes, please share the link.
Also, is anyone else experiencing the same problem?

This topic is not a programming question, so I will close it.
FYI:
BigQuery offers a free query tier for all users (the first 100 GB of data processed per month is at no charge). If you plan on using your own data, and not just test BigQuery with our sample public datasets, then you must enable billing, as there is no free storage tier. See: https://developers.google.com/bigquery/pricing
The D0 tier of Cloud SQL is less than a dollar a day, see:
https://developers.google.com/cloud-sql/docs/billing
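To make the free-tier arithmetic concrete, here is a minimal sketch of what a month of querying would cost under the answer's numbers. The 100 GB free tier comes from the answer above; the $5-per-TB on-demand rate is an illustrative assumption, so check the pricing page for current figures.

```python
# Back-of-the-envelope estimate of a monthly BigQuery on-demand query bill.
# FREE_TIER_GB comes from the answer above; RATE_PER_TB_USD is an assumed
# illustrative rate, not an official price.

FREE_TIER_GB = 100          # first 100 GB of data processed per month is free
RATE_PER_TB_USD = 5.0       # assumed on-demand rate per TB processed

def estimate_query_cost(gb_processed_per_month: float) -> float:
    """Return the estimated monthly query charge in USD."""
    billable_gb = max(0.0, gb_processed_per_month - FREE_TIER_GB)
    return billable_gb / 1024 * RATE_PER_TB_USD

print(estimate_query_cost(50))     # within the free tier, so 0.0
print(estimate_query_cost(1124))   # exactly 1 TB over the free tier, so 5.0
```

Note that this only models query processing; storage, as the answer says, has no free tier and is billed separately.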

Related

Can we set a schedule for availability of an Azure SQL database?

Brand new to Azure, so please bear with me if this is obvious.
I've set up an SQL database for testing purposes. As the service is charged per hour, and it's currently only going to be used by me during my working hours, I would like to know if it's possible to have it running only during those hours.
I realise that the cost difference this will make isn't large, but I might as well not spend the money when I know it's not needed, and I'll want to know how it's done for when we start adding more services.
As of now, the simple and direct answer is no. Azure doesn't allow you to start/stop/pause an Azure SQL Database; billing starts as soon as you create it. As a workaround, you can export the database and then delete it, which stops billing.
But, as @DavidBrown mentioned in the comments, the serverless compute tier is an option you can go with.
The serverless compute tier for single databases in Azure SQL Database is parameterized by a compute autoscaling range and an auto-pause delay. The configuration of these parameters shapes the database performance experience and compute cost.
But even in the serverless compute tier, you still pay for storage while the database is paused.
In addition, the serverless compute tier supports auto-pausing and auto-resuming based on certain conditions.
Please go through the Auto-pausing and auto-resuming documentation for more details.
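A rough model makes the trade-off visible: with auto-pause, compute is billed per vCore-second only while the database is active, but storage is billed for the whole month either way. Both rates below are illustrative assumptions, not official Azure prices.

```python
# Rough model of Azure SQL serverless billing with auto-pause: compute is
# billed only while the database is running; storage is billed the whole
# month even while paused. Both rates are assumed, not official prices.

VCORE_SECOND_USD = 0.000145   # assumed serverless compute rate per vCore-second
STORAGE_GB_MONTH_USD = 0.115  # assumed storage rate per GB per month

def monthly_cost(active_hours_per_day, vcores, storage_gb, days=30):
    compute = active_hours_per_day * 3600 * days * vcores * VCORE_SECOND_USD
    storage = storage_gb * STORAGE_GB_MONTH_USD
    return compute + storage

# A 1-vCore database used 8 hours a day vs. running around the clock:
print(round(monthly_cost(8, 1, 10), 2))
print(round(monthly_cost(24, 1, 10), 2))
```

The gap between the two numbers is what auto-pausing saves for a database that is only used during working hours, which is exactly the asker's scenario.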

Loading data from RDBMS to Bigquery

I have an App Engine scheduled job which runs every day and looks for rows in a PostgreSQL table (hosted on GCP, not Cloud SQL) that meet a criterion for archiving. If the criterion is met, it connects to BigQuery and streams the data there. Every day only a few records qualify for archiving, and we write them to BigQuery. Is this the cost-effective way, or should we try loading the data using Cloud Functions? https://cloud.google.com/solutions/performing-etl-from-relational-database-into-bigquery
App Engine and Cloud Functions have different purposes. You should use App Engine if you want to deploy a full application in a serverless environment. If you need to integrate services in the cloud, use Cloud Functions. In your case, Cloud Functions seems to fit better.
It's important to remember that Cloud Functions has a time limitation: the maximum time your code can run is 9 minutes.
You can find this and other limitations here
Furthermore, you can find here a pricing calculator for GCP products.
If you have any further questions, please let me know.
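The data-shaping part of the archiving job can be sketched without any cloud dependencies. The retention threshold and field names below are hypothetical; the list-of-dicts output is the shape accepted by the BigQuery client's `insert_rows_json` streaming call, shown commented out since it needs `google-cloud-bigquery` and credentials.

```python
# Hedged sketch of the archiving step: filter rows that meet a (hypothetical)
# age criterion and shape them as JSON-compatible dicts for streaming insert.
from datetime import date, timedelta

ARCHIVE_AFTER_DAYS = 90  # assumed retention threshold

def rows_to_archive(rows, today):
    """Return rows older than the threshold, ready for insert_rows_json."""
    cutoff = today - timedelta(days=ARCHIVE_AFTER_DAYS)
    return [
        {"id": r["id"], "payload": r["payload"], "created": r["created"].isoformat()}
        for r in rows
        if r["created"] < cutoff
    ]

rows = [
    {"id": 1, "payload": "old", "created": date(2020, 1, 1)},
    {"id": 2, "payload": "new", "created": date(2020, 6, 1)},
]
print(rows_to_archive(rows, today=date(2020, 6, 15)))

# To actually stream (requires google-cloud-bigquery and credentials):
# client = bigquery.Client()
# errors = client.insert_rows_json("project.dataset.archive",
#                                  rows_to_archive(rows, date.today()))
```

Whether this runs in App Engine or a Cloud Function, the logic is the same; only the trigger and runtime limits differ.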

Working out which BigQuery query I am paying for?

I am new to BigQuery and I have a question regarding billing. There is a recurring (almost daily) charge on my account, and I think it is related to a query I have embedded in a published Tableau report: people are viewing the report and I am being charged. However, the charge is higher than I am expecting. How can I trace the charge back to a specific query to confirm which one is raising it?
Thank you for your help,
Ben
I would start by enabling audit logs and inspecting the logs.
Audit logs are available via Google Cloud Logging, where they can be immediately filtered to provide insights on specific jobs or queries, or exported to Google Cloud Pub/Sub, Google Cloud Storage, or BigQuery.
To analyze your aggregated BigQuery usage using SQL, set up export of audit logs back to BigQuery. For more information about setting up exports from Cloud Logging, see Overview of Logs Export in the Cloud Logging documentation.
Analyzing Audit Logs Using BigQuery: https://cloud.google.com/bigquery/audit-logs
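Once the audit logs are exported to BigQuery, each query-job record carries the bytes billed, so attributing cost is a group-by over the query text. The sketch below does that grouping over hypothetical job records in plain Python; the field names and the $5-per-TB rate are illustrative assumptions, not the exact audit-log schema.

```python
# Aggregate estimated cost per distinct query string from (hypothetical)
# audit-log job records to surface the most expensive query.
from collections import defaultdict

RATE_PER_TB_USD = 5.0  # assumed on-demand rate

def cost_by_query(jobs):
    """Sum estimated cost in USD per distinct query string."""
    totals = defaultdict(float)
    for job in jobs:
        totals[job["query"]] += job["total_billed_bytes"] / 1024**4 * RATE_PER_TB_USD
    return dict(totals)

# Hypothetical audit-log rows, one per job run:
jobs = [
    {"query": "SELECT * FROM sales", "total_billed_bytes": 2 * 1024**4},
    {"query": "SELECT * FROM sales", "total_billed_bytes": 1 * 1024**4},
    {"query": "SELECT id FROM users", "total_billed_bytes": 1024**3},
]
costs = cost_by_query(jobs)
print(max(costs, key=costs.get))   # the query driving the recurring charge
```

Running the equivalent GROUP BY directly in BigQuery over the exported log table would identify the Tableau-embedded query the same way.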

Azure SQL service in Germany

Although Azure seems to have datacenters in Germany, I cannot select that region when creating a new SQL server on Azure Portal.
The Azure pricing page shows prices for this region, so why is it not listed in the available options? Are there any restrictions?
There are restrictions on some Azure regions, based on your Azure account. A US-based Azure account will typically not be able to use these regions, for tax and legal reasons.
Full details on the German data-center GA is at this blog https://azure.microsoft.com/en-us/blog/microsoft-azure-germany-now-available-via-first-of-its-kind-cloud-for-europe/
Customers in the EU and EFTA can continue to use Microsoft cloud options as they do today, or, for those who want the option, they’re able to use the services from German datacenters
Unfortunately, nothing has changed from 2016 until now (2023). The Germany West Central region (Frankfurt/Main) is listed for Azure SQL, but you can't actually create an instance there. Last year I spent a few months setting up our infrastructure on Azure in that region, until I tried to create an Azure SQL instance. You will get an error message when selecting the server location: "This location is not available". This has nothing to do with a free or pay-as-you-go subscription. The answer from dev support after opening a ticket (October 2022):
"Unfortunately, due to high demand for Azure SQL in this region, we
are not able to approve your quota request at this time. To ensure
that all customers can access the services they need, we are working
through approving quota requests as we bring additional capacity
online. We are continually investing in additional infrastructure to
expand our available resources. Apologies for the delay in being able
to increase the quota on your Azure subscription. No additional
details are needed from you at this time, your request will stay
pending. Thank you for your patience until we report back."
The problem is that Microsoft is not being transparent, and you lose a lot of time and money. They simply have no spare capacity anymore.
People are asking this but they don't get answers and their questions get locked:
https://social.technet.microsoft.com/forums/en-US/94278a11-c5ac-4239-b092-a256bb5c4488/why-germany-west-central-location-is-not-available-for-subscription?forum=ssdsgetstarted
https://social.msdn.microsoft.com/Forums/en-US/ac0376cb-2a0e-4dc2-a52c-d986989e6801/unable-to-create-sql-database-server?forum=ssdsgetstarted#00a598f2-5fd4-4c7e-ab91-913fae5ba7cc
https://github.com/MicrosoftDocs/azure-docs/issues/52606
I am wondering whether Azure SQL resources in this region will become available after some time or not.

What happens to Cloud SQL if you reach max queries with pay per use?

I'm currently looking into using Google App Engine for a project.
I understand that the main instance will scale by creating a clone of itself.
I understand that Cloud Storage is basically a big bucket for holding static files.
I understand that Cloud SQL is where the data goes.
Now, let's say I use the smallest SQL instance, which allows 25 concurrent connections, with a pay-per-use plan. If I exceed 25 connections, will Google App Engine create an additional database and split requests between them?
No, App Engine and Cloud SQL are totally separate things. One's an application server and the other's a relational database. App Engine will never create additional databases (presumably you mean servers?). If you hit the quotas, you will get an exception related to that quota.
On the other hand, Cloud SQL doesn't have to be "where the data goes". There is also Cloud Datastore, with an API set much more integrated with App Engine. It also scales without you having to worry about things like concurrent connections. If you are starting a new project from scratch, I'd highly recommend checking out Datastore.