SQL (MariaDB) SUM times limitation

I have a query where I SUM times by group:
SELECT category,
       SEC_TO_TIME(SUM(TIME_TO_SEC(TIMEDIFF(ter_ura_zakljucek, ter_ura_zacetek)))) AS total_time
FROM my_table  -- table name not shown in the question
GROUP BY category
This seems to work fine on smaller amounts of data, but when I execute it on a larger number of rows, total_time always caps out at 838:59:59.
Is there some limitation in SQL on the amount of time you can sum up?

The range of the TIME datatype is '-838:59:59.999999' to '838:59:59.999999'. See the documentation.
To handle larger time values you have to convert them to something else, depending on your needs.
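For example, one option that avoids the cap is to sum the raw seconds and format them yourself. A minimal sketch against the question's columns (the table name my_table is assumed):
-- Sum raw seconds to dodge the 838:59:59 TIME cap, then format manually.
SELECT category,
       total_seconds,
       CONCAT(FLOOR(total_seconds / 3600), ':',
              LPAD(FLOOR(total_seconds / 60) MOD 60, 2, '0'), ':',
              LPAD(total_seconds MOD 60, 2, '0')) AS total_time
FROM (
  SELECT category,
         SUM(TIME_TO_SEC(TIMEDIFF(ter_ura_zakljucek, ter_ura_zacetek))) AS total_seconds
  FROM my_table  -- assumed table name
  GROUP BY category
) AS t;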

Related

Sum over partition but only as a running total in SQL Server

I have the following problem.
The correct values should be like this: the 9/13 rows should have orig_target_qty minus the sum of the 9/13 exec_qty values.
I want the difference between the 700,000 and the sum of the exec_qty totals. I am partitioning the values by ticker, and 700k + 100k minus the sum of exec_qty should give me the correct result, but I am getting a value 100k short because of the previous set of records that carry 100k.
Any thoughts on what I'm doing wrong? I must have the syntax wrong somewhere.
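A minimal sketch of the usual fix, with hypothetical column names (ticker, trade_date, trade_id, exec_qty, orig_target_qty) standing in for the ones in the question: partition the window by the day as well as the ticker, so a previous day's executions don't bleed into the running total.
SELECT ticker,
       trade_date,
       exec_qty,
       -- running total restarts per ticker and per day,
       -- so the 9/13 rows subtract only 9/13 executions
       orig_target_qty
         - SUM(exec_qty) OVER (PARTITION BY ticker, trade_date
                               ORDER BY trade_id
                               ROWS UNBOUNDED PRECEDING) AS remaining_qty
FROM trades;  -- assumed table name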

DATETIME_DIFF throwing error when using safe divide

I've created a query that I'm hoping to use to fill a table with daily budgets at the end of every day. To do this, I just need some simple maths:
monthly budget / number of days left in the month.
Then, at the end of the month, I can SUM all of the rows to calculate an accurate budget.
The query is as follows:
SELECT *,
ROUND(SAFE_DIVIDE(budget, DATETIME_DIFF(CURRENT_DATE(), LAST_DAY(CURRENT_DATE()), DAY)),2) AS daily_budget
FROM `mm-demo-project.marketing_hub.budget_manager`
When executing the query, the results come back as negative numbers, which according to the documentation for this function is likely caused by the result overflowing the result type.
I've made a fool's guess at rounding the calculation. Needless to say, it did not work at all.
How can I stop my query from returning negative numbers?
Use the query below; the arguments to DATETIME_DIFF are swapped so the day count comes out positive:
SELECT *,
ROUND(SAFE_DIVIDE(budget, DATETIME_DIFF(LAST_DAY(CURRENT_DATE()), CURRENT_DATE(), DAY)),2) AS daily_budget
FROM `mm-demo-project.marketing_hub.budget_manager`
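The reason this works: DATETIME_DIFF(a, b, DAY) returns a minus b, so the original argument order counted from today to the end of the month backwards and came out negative. A quick illustration with hardcoded dates (values assumed for the example):
-- DATETIME_DIFF(a, b, DAY) computes a - b, so argument order matters.
SELECT DATETIME_DIFF(DATETIME '2023-06-30 00:00:00', DATETIME '2023-06-20 00:00:00', DAY) AS days_left,    -- 10
       DATETIME_DIFF(DATETIME '2023-06-20 00:00:00', DATETIME '2023-06-30 00:00:00', DAY) AS wrong_order   -- -10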

SQL - Finding Percent Of a Total and subtracting to get two new totals

In my SQL course I'm trying to answer a problem in which I'm asked to find X% of a total, subtract that result from the original total to produce another result, and then put these results (the X% of the total, and the total minus that X%) in two new columns.
Example: we need to know how much money we owe Tom and Ted. We have totaled up sales to $1,000,000.00. We owe Tom 75% of that total; the remainder goes to Ted.
I can't seem to find anything in my readings/videos about this, nor a Google search result that isn't an answer producing ratios or comparisons to other records in the table. I'm also not sure how to get the results into their own columns. Thanks for any advice!
Example of what I got so far:
SELECT SUM(Sale_Amount) FROM Order_Table
Now I have to find the X% of that SUM, then subtract it from the SUM, and push both results into two new columns: one for the percent of SUM(Sale_Amount) and one for the remainder.
Given it's an SQL course (and it's not 100% clear what's being asked), I'm not going to give you the complete answer, but I'll give you the components; you'll need to understand them to put them together.
In SQL, you can
Get totals using SUM and GROUP BY
Do normal maths, e.g. SELECT 10000 * 60/100, to get percentages of totals
'Save' results by a) having columns/fields to save them in, and b) UPDATE those fields with relevant data
Note that if you're not saving the data and are simply reporting it, you can just add those calculations to a SELECT statement, e.g. SELECT 100000 AS Total, 100000 * 0.75 AS Toms_Share, 100000 * 0.25 AS Teds_Share.
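For illustration only (putting the pieces together is the point of the exercise), the reporting form of the above with an aggregate instead of a literal, using the table and column names from the question:
SELECT SUM(Sale_Amount)        AS Total,
       SUM(Sale_Amount) * 0.75 AS Toms_Share,  -- Tom gets 75%
       SUM(Sale_Amount) * 0.25 AS Teds_Share   -- Ted gets the remainder
FROM Order_Table;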

How to Calculate Time Remaining - Based on Estimated Time x Rows Available

For security reasons, we're using a front-end application where we upload a txt file that deletes 2,500 records from a SQL database.
I'm using SQL Server 2008 Management Studio to query the progress of that delete, but this just shows me how many rows I have left, not how much time is left.
How can I add into my query a calculation of the estimated minutes remaining on the record deletes?
While there is no fixed amount of time per delete, it's been averaging 1.5 minutes to delete each record.
I figured simple math (2,500 x 1.5 min = estimated time remaining); I just don't know how to write it into the query as a new column. Here is where my query is at now:
SELECT COUNT (UNITS) AS Orders_Remaining
FROM ORDERS
WHERE UNITS BETWEEN '0001' and '2500'
Do you mean this?
SELECT COUNT (UNITS) AS Orders_Remaining, COUNT(UNITS) * 1.5 AS Minutes_Remaining
FROM ORDERS
WHERE UNITS BETWEEN '0001' and '2500'

BigQuery gives Response Too Large error for whole dataset but not for equivalent subqueries

I have a table in BigQuery with the following fields:
time,a,b,c,d
time is a string in ISO 8601 format but with a space instead of the 'T', a is an integer from 1 to 16000, and the other columns are strings. The table contains one month's worth of data, and there are a few million records per day.
The following query fails with "response too large":
select UTC_USEC_TO_DAY(PARSE_UTC_USEC(time)) as day,b,c,d,count(a),count(distinct a, 1000000)
from [myproject.mytable]
group by day,b,c,d
order by day,b,c,d asc
However, this query works (the data starts at 2012-01-01):
select UTC_USEC_TO_DAY(PARSE_UTC_USEC(time)) as day,
b,c,d,count(a),count(distinct a)
from [myproject.mytable]
where UTC_USEC_TO_DAY(PARSE_UTC_USEC(time)) = UTC_USEC_TO_DAY(PARSE_UTC_USEC('2012-01-01 00:00:00'))
group by day,b,c,d
order by day,b,c,d asc
This looks like it might be related to this issue. However, because of the group by clause, the top query is equivalent to repeatedly calling the second query. Is the query planner not able to handle this?
Edit: To clarify my test data:
I am using fake test data that I generated. I originally used several fields and tried to get hourly summaries for a month (grouping by an hour value defined with AS in the SELECT part of the query). When that failed I switched to daily summaries, and when that failed I reduced the columns involved. That also failed when using count(distinct xxx, 1000000), but it worked when I only did one day's worth. (It also works if I remove the 1000000 parameter, but since that does work in the one-day query, it seems the query planner is not separating things as I would expect.)
The column checked with count(distinct) has cardinality 16,000, and the GROUP BY columns have cardinalities 2 and 20, for a total of just 1,200 expected rows. Column values are quite short, around ten characters.
How many results do you expect? There is currently a limitation of about 64 MB on the total size of results that are allowed. If you're expecting millions of rows as a result, then this may be an expected error.
If the number of results isn't extremely large, it may be that the size problem is not in the final response but in the internal calculation. Specifically, if there are too many intermediate results from the GROUP BY, the query can run out of memory. One possible solution is to change "GROUP BY" to "GROUP EACH BY", which alters the way the query is executed. This feature is currently experimental and, as such, is not yet documented.
For your query, since you reference fields named in the SELECT in the GROUP BY, you might need to do this:
select day, b, c, d, count(a), count(distinct a, 1000000)
FROM (
  select UTC_USEC_TO_DAY(PARSE_UTC_USEC(time)) as day, b, c, d
  from [myproject.mytable]
)
group EACH by day, b, c, d
order by day, b, c, d asc