What are 10 database transaction units in the Azure free trial?

I am looking for a cloud service provider to host a SQL database that I can access through API calls. After looking through multiple providers, I have seen that Azure has a 12-month free trial, but it only includes a 250 GB S0 instance with 10 database transaction units.
Could anyone explain to me what they mean by 10 database transaction units? Any help is greatly appreciated.
For reference, our database would not be large in scale; it just holds candidate and judge applications, and we get a maximum of 600 candidates per year.
I tried looking up transaction units online and saw that one may be a single REST API call, which seems absurd to me.

Please examine the output of the following query:
SELECT * FROM sys.dm_user_db_resource_governance
That will tell you the following information about the current service tier:
min_cores (cores available on the service)
max_dop (the MAX_DOP value for the user workload)
max_sessions (the maximum number of sessions allowed)
max_db_max_size_in_mb (the maximum max_size value for a data file, in MB)
log_size_in_mb (the transaction log size, in MB)
instance_max_worker_threads (worker thread limit for the SQL Server instance)
The above information will give you the details of what 10 DTUs means in terms of the resources available. You can run this query every time you change the service tier of the database.
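As a concrete illustration, here is a minimal Python sketch that runs the query above and prints each limit it reports, assuming the pyodbc package and placeholder connection details (server, database, and credentials are hypothetical):

```python
import pyodbc

# Placeholder connection details for an Azure SQL database (hypothetical values).
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=yourserver.database.windows.net;"
    "DATABASE=yourdb;UID=youruser;PWD=yourpassword"
)

cursor = conn.cursor()
cursor.execute("SELECT * FROM sys.dm_user_db_resource_governance")
row = cursor.fetchone()

# Print every column/value pair, e.g. max_dop, max_sessions, log_size_in_mb.
for column, value in zip([d[0] for d in cursor.description], row):
    print(f"{column}: {value}")
```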

Related

Kusto Query limitation gets timeout error

I am running a Kusto query in Azure Diagnostics, querying the logs of the last week, and the query times out after 10 minutes. Is there a way I can increase the timeout limit? If yes, can someone please guide me through the steps? I downloaded Kusto Explorer but couldn't see an easy way of connecting to my Azure cluster. I need help with how to increase this timeout duration from inside the Azure portal for the query I am running.
It seems that 10 minutes is the maximum timeout value.
https://learn.microsoft.com/en-us/azure/azure-monitor/service-limits
Query API

| Category | Limit | Comments |
| --- | --- | --- |
| Maximum records returned in a single query | 500,000 | |
| Maximum size of data returned | ~104 MB (~100 MiB) | The API returns up to 64 MB of compressed data, which translates to up to 100 MB of raw data. |
| Maximum query running time | 10 minutes | See Timeouts for details. |
| Maximum request rate | 200 requests per 30 seconds per Azure AD user or client IP address | See Log queries and language. |
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/api/timeouts
Timeouts
Query execution times can vary widely based on:
The complexity of the query
The amount of data being analyzed
The load on the system at the time of the query
The load on the workspace at the time of the query
You may want to customize the timeout for the query; the service-side timeout of an API call can be raised by setting the Prefer: wait=<seconds> request header. The default timeout is 3 minutes, and the maximum timeout is 10 minutes.
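If you are querying programmatically, the same timeout can be raised up to the 10-minute ceiling from client code. A minimal sketch, assuming the azure-monitor-query and azure-identity Python packages, a placeholder workspace ID, and an example KQL query:

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Query the last 7 days of logs; server_timeout raises the service-side
# timeout (in seconds) from the 3-minute default up to the 10-minute maximum.
response = client.query_workspace(
    workspace_id="<your-workspace-id>",  # placeholder
    query="AzureDiagnostics | summarize count() by Category",  # example query
    timespan=timedelta(days=7),
    server_timeout=600,
)
```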

How to interpret query process GB in BigQuery?

I am using a free trial of Google BigQuery. This is the query that I am using:
select *
from `test`.events
where subject_id = 124
  and id = 256064
  and time >= '2166-01-15T14:00:00'
  and time <= '2166-01-15T14:15:00'
  and id_1 in (3655, 223762, 223761, 678, 211, 220045, 8368, 8441, 225310, 8555, 8440)
This query is expected to return at most 300 records.
However, before the query runs, I see a message reporting how many GB it will process (screenshot not shown here).
The table on which this query operates is really huge. Does this number indicate the table size? I ran this query multiple times a day,
and as a result it produced the error below:
Quota exceeded: Your project exceeded quota for free query bytes scanned. For more information, see https://cloud.google.com/bigquery/troubleshooting-errors
How long do I have to wait for this error to go away? Is the daily limit 1 TB? If yes, then I did not use anywhere close to 400 GB.
How can I view my daily usage?
If I can edit the quota, can you let me know which option I should be editing?
Can you help me with the above questions?
According to the official documentation,
"BigQuery charges for queries by using one metric: the number of bytes processed (also referred to as bytes read)", regardless of how large the output is. What this means is that a query that scans a full 1 TB table costs about $5 at on-demand pricing, even if the final output is very small.
Note that due to storage optimizations that BigQuery performs internally, the bytes processed might not equal the raw size of the table when you created it.
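One way to see exactly how many bytes a query will scan, without spending any quota, is a dry run. A minimal sketch using the google-cloud-bigquery Python client (the table and filter are taken from the question):

```python
from google.cloud import bigquery

client = bigquery.Client()

# A dry run validates the query and reports the bytes it would scan,
# without actually running it or counting against the free-tier quota.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    "SELECT * FROM `test`.events WHERE subject_id = 124",
    job_config=job_config,
)

print(f"This query would process {job.total_bytes_processed / 1024**3:.2f} GB.")
```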
For the error you're seeing, browse in the Google Cloud Console to "IAM & admin" and then "Quotas", where you can search for quotas specific to the BigQuery service.
Hope this helps!
Flavien

BigQuery GUI - CPU Resource Limit

Is there a way to set the CPU resource limit on BigQuery with Python or the GUI?
I'm getting an error of:
Query exceeded resource limits. 2147706.163729571 CPU seconds were used, and this query must use less than 46300.0 CPU seconds.
Looking at the BigQuery's Python reference page: http://google-cloud-python.readthedocs.io/en/latest/bigquery/reference.html
It looks like there are two related settings:
1. maximum_billing_tier
2. maximum_bytes_billed
These can be set, but there is no option for CPU seconds.
You can no longer set maximum_billing_tier; it is obsolete. As long as you stay below tier 100 you are billed as if it were tier 1, and if you exceed tier 100 the query simply fails.
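maximum_bytes_billed still works and is the closest available guardrail: it fails the query up front rather than capping CPU. A minimal sketch with the google-cloud-bigquery Python client, assuming a placeholder table name:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Fail any query that would bill more than 10 GB rather than letting it
# run away; there is no equivalent knob for CPU seconds.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 1024**3)
job = client.query(
    "SELECT * FROM `your_dataset.your_table`",  # placeholder table
    job_config=job_config,
)
rows = job.result()
```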
As for CPU, check the concept of slots:
Maximum concurrent slots per project for on-demand pricing — 2,000
The default number of slots for on-demand queries is shared among all queries in a single project. As a rule, if you're processing less than 100 GB of queries at once, you're unlikely to be using all 2,000 slots.
To check how many slots you're using, see Monitoring BigQuery Using Stackdriver. If you need more than 2,000 slots, contact your sales representative to discuss whether flat-rate pricing meets your needs.
See more at https://cloud.google.com/bigquery/quotas#query_jobs
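Besides Stackdriver, slot consumption can also be inspected per job from SQL. A sketch assuming the INFORMATION_SCHEMA.JOBS_BY_PROJECT view, with `region-us` as an assumed job location:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Average slot usage per query job over the last day: total_slot_ms spread
# across each job's wall-clock duration. `region-us` is an assumed location.
sql = """
    SELECT
      job_id,
      total_slot_ms / NULLIF(TIMESTAMP_DIFF(end_time, start_time, MILLISECOND), 0) AS avg_slots
    FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
      AND job_type = 'QUERY'
    ORDER BY avg_slots DESC
    LIMIT 10
"""
for row in client.query(sql).result():
    print(row.job_id, row.avg_slots)
```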

BigQuery: "Error running query: Query exceeded resource limits for tier 1. Tier 29 or higher required." in Redash

I would like to know how to increase the billing tier for only one BigQuery query in Redash (not for the whole project).
I am getting this error, while trying to refresh the query in Redash:
"Error running query: Query exceeded resource limits for tier 1. Tier 29 or higher required".
According to BigQuery's documentation, there are three ways to increase this limit (https://cloud.google.com/bigquery/pricing#high-compute). However, I am not sure which of them is applicable to queries written directly in Redash's query editor. It would be great if you could provide an example.
Thanks for your help.

Azure SQL high wait time on "VDI_CLIENT_OTHER"

We're benchmarking our app against different scales of an Azure SQL database, and we're having a hard time saturating the database. Among other things, we've executed this query:
SELECT *
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC
The top row of the result was something like
| wait_type | waiting_tasks_count | wait_time_ms | max_wait_time_ms | signal_wait_time_ms |
| --- | --- | --- | --- | --- |
| VDI_CLIENT_OTHER | 19560 | 409007428 | 60016 | 37281 |
What is this wait type? What exactly have we been waiting for during those 409,000 seconds (almost 5 days)? Google doesn't seem to know what VDI_CLIENT_OTHER is.
VDI_CLIENT_OTHER is used in the case of new replica seeding or any other user-initiated workflow that triggers copies, such as updating the service tier or setting up a geo-replication link. A high wait time here likely just means that seeding finished and the task remained running, waiting for additional work items that aren't arriving.