SQL multiple sum by PARTITION

I have the following PostgreSQL table stock, where the structure is the following:
| column | pk  |
+--------+-----+
| date   | yes |
| id     | yes |
| type   | yes |
| qty    |     |
| fee    |     |
The table looks like this:
| date       | id  | type | qty  | fee |
+------------+-----+------+------+-----+
| 2015-01-01 | 001 | CB04 | 500  | 2   |
| 2015-01-01 | 002 | CB04 | 1500 | 3   |
| 2015-01-01 | 003 | CB04 | 500  | 1   |
| 2015-01-01 | 004 | CB04 | 100  | 5   |
| 2015-01-01 | 001 | CB02 | 800  | 6   |
| 2015-01-02 | 002 | CB03 | 3100 | 1   |
I want to create a view or query, so that the result looks like this.
| date       | type | t_qty | total_weighted_fee |
+------------+------+-------+--------------------+
| 2015-01-01 | CB04 | 2600  | 2.5                |
| 2015-01-02 | CB03 | 3100  | 1                  |
What I did is this:
http://sqlfiddle.com/#!17/39fb8a/18
But this is not the output that I want.
The subquery result looks like this, where:
% of total Qty = qty / t_qty
weighted fee = fee * % of total Qty
| date       | id  | type | qty  | fee | t_qty | % of total Qty | weighted fee |
+------------+-----+------+------+-----+-------+----------------+--------------+
| 2015-01-01 | 001 | CB04 | 500  | 2   | 2600  | 0.19           | 0.38         |
| 2015-01-01 | 002 | CB04 | 1500 | 3   | 2600  | 0.58           | 1.73         |
| 2015-01-01 | 003 | CB04 | 500  | 1   | 2600  | 0.19           | 0.192        |
| 2015-01-01 | 004 | CB04 | 100  | 5   | 2600  | 0.04           | 0.192        |
| 2015-01-02 | 002 | CB03 | 3100 | 1   | 3100  | 1              | 1            |

You can use aggregation . . . I don't think you are far off:
select date, type, sum(qty),
       sum(fee * qty * 1.0) / nullif(sum(qty), 0)
from t
group by date, type;
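If you specifically want a view rather than an ad-hoc query, the same aggregation can be wrapped in CREATE VIEW. A sketch, assuming the table is named stock as in the question; the view name and output aliases are just illustrative:
CREATE VIEW stock_weighted_fee AS   -- view name is illustrative
SELECT date,
       type,
       SUM(qty) AS t_qty,
       SUM(fee * qty * 1.0) / NULLIF(SUM(qty), 0) AS total_weighted_fee
FROM stock
GROUP BY date, type;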

Related

Properly using PERCENTILE_CONT - Oracle SQL

I am trying to calculate the following:
The average of the dataset
The median of the dataset
The top 20% of the dataset
The bottom 20% of the dataset
My dataset looks like this:
| Part | Step | Step_Start | Part_Finish | TheTime |
|:----:|:----:|:----------:|:-----------:|:-----------:|
| 1 | 200 | 15-Aug-18 | 19-Jun-19 | 307.4926273 |
| 2 | 200 | 7-Jun-19 | 19-Jun-19 | 11.4434375 |
| 3 | 200 | 17-Sep-18 | 4-Feb-19 | 139.4360417 |
| 4 | 200 | 30-Jan-19 | 4-Feb-19 | 4.356666667 |
| 5 | 200 | 1-Oct-18 | 18-Feb-19 | 139.4528009 |
| 6 | 200 | 13-Feb-19 | 18-Feb-19 | 4.50375 |
| 7 | 200 | 17-Oct-18 | 28-Mar-19 | 161.7007176 |
| 8 | 200 | 12-Nov-18 | 28-Mar-19 | 135.630625 |
| 9 | 200 | 25-Oct-18 | 26-Feb-19 | 123.6026968 |
| 10 | 200 | 22-Feb-19 | 26-Feb-19 | 3.628090278 |
| 11 | 200 | 30-Oct-18 | 3-Jan-19 | 64.51466435 |
| 12 | 200 | 12-Dec-18 | 3-Jan-19 | 21.48703704 |
| 13 | 200 | 15-Nov-18 | 14-Jan-19 | 59.41373843 |
| 14 | 200 | 7-Jan-19 | 14-Jan-19 | 6.621828704 |
| 15 | 200 | 15-Nov-18 | 12-Jan-19 | 57.62283565 |
| 16 | 200 | 8-Jan-19 | 12-Jan-19 | 3.264398148 |
| 17 | 200 | 15-Nov-18 | 7-Mar-19 | 111.5082523 |
| 18 | 200 | 4-Mar-19 | 7-Mar-19 | 2.153587963 |
| 19 | 200 | 16-Nov-18 | 23-May-19 | 187.6931481 |
| 20 | 200 | 16-Nov-18 | 3-Jan-19 | 47.47916667 |
| 21 | 200 | 17-Dec-18 | 3-Jan-19 | 16.62722222 |
| 22 | 200 | 20-Nov-18 | 14-Feb-19 | 85.6115625 |
| 23 | 200 | 9-Feb-19 | 14-Feb-19 | 4.520787037 |
| 24 | 200 | 19-Nov-18 | 14-Jan-19 | 55.53342593 |
| 25 | 200 | 9-Jan-19 | 14-Jan-19 | 4.721400463 |
| 26 | 200 | 26-Nov-18 | 9-Jan-19 | 43.50748843 |
| 27 | 200 | 4-Jan-19 | 9-Jan-19 | 4.417164352 |
| 28 | 200 | 26-Nov-18 | 21-Jan-19 | 55.59988426 |
| 29 | 200 | 13-Jan-19 | 21-Jan-19 | 7.535 |
| 30 | 200 | 16-Jan-19 | 21-Jan-19 | 4.618796296 |
| 31 | 200 | 26-Nov-18 | 11-Jan-19 | 45.42148148 |
| 32 | 200 | 4-Jan-19 | 11-Jan-19 | 6.316921296 |
| 33 | 200 | 4-Dec-18 | 24-Jan-19 | 50.3669213 |
| 34 | 200 | 18-Jan-19 | 24-Jan-19 | 5.589467593 |
| 35 | 200 | 4-Dec-18 | 31-Jan-19 | 57.26877315 |
| 36 | 200 | 22-Jan-19 | 31-Jan-19 | 8.240034722 |
| 37 | 200 | 5-Dec-18 | 28-Jun-19 | 204.5283912 |
| 38 | 200 | 26-Jun-19 | 28-Jun-19 | 1.508252315 |
| 39 | 200 | 9-Feb-19 | 19-Feb-19 | 9.532893519 |
| 40 | 200 | 7-Dec-18 | 14-Feb-19 | 68.51900463 |
| 41 | 200 | 5-Feb-19 | 14-Feb-19 | 8.641076389 |
| 42 | 200 | 11-Dec-18 | 25-Jan-19 | 44.50501157 |
| 43 | 200 | 22-Jan-19 | 25-Jan-19 | 2.511435185 |
| 44 | 200 | 13-Dec-18 | 17-Jan-19 | 34.43806713 |
| 45 | 200 | 14-Jan-19 | 17-Jan-19 | 2.210972222 |
| 46 | 200 | 13-Dec-18 | 24-Jan-19 | 41.38921296 |
| 47 | 200 | 17-Jan-19 | 24-Jan-19 | 6.444664352 |
| 48 | 200 | 10-Jan-19 | 7-Feb-19 | 27.43130787 |
| 49 | 200 | 1-Feb-19 | 7-Feb-19 | 5.349189815 |
| 50 | 200 | 18-Dec-18 | 4-Feb-19 | 47.50416667 |
| 51 | 200 | 29-Jan-19 | 4-Feb-19 | 5.481979167 |
| 52 | 200 | 3-Jan-19 | 30-Jan-19 | 26.46112269 |
| 53 | 200 | 23-Jan-19 | 30-Jan-19 | 6.712175926 |
| 54 | 200 | 4-Jan-19 | 5-Feb-19 | 31.49590278 |
| 55 | 200 | 30-Jan-19 | 5-Feb-19 | 5.385798611 |
| 56 | 200 | 23-Jan-19 | 20-Mar-19 | 55.296875 |
| 57 | 200 | 21-Feb-19 | 20-Mar-19 | 26.06854167 |
| 58 | 200 | 22-Jan-19 | 14-Mar-19 | 50.57989583 |
| 59 | 200 | 8-Mar-19 | 14-Mar-19 | 5.147303241 |
| 60 | 200 | 22-Jan-19 | 21-Feb-19 | 29.46405093 |
| 61 | 200 | 14-Feb-19 | 21-Feb-19 | 6.701724537 |
| 62 | 200 | 24-Jan-19 | 23-Apr-19 | 88.50689815 |
| 63 | 200 | 17-Apr-19 | 23-Apr-19 | 5.725405093 |
| 64 | 200 | 28-Jan-19 | 21-Feb-19 | 23.50082176 |
| 65 | 200 | 13-Feb-19 | 21-Feb-19 | 7.115717593 |
| 66 | 200 | 31-Jan-19 | 28-Feb-19 | 27.55881944 |
| 67 | 200 | 25-Feb-19 | 28-Feb-19 | 2.633738426 |
| 68 | 200 | 31-Jan-19 | 27-Feb-19 | 26.46105324 |
| 69 | 200 | 23-Feb-19 | 27-Feb-19 | 3.531423611 |
| 70 | 200 | 1-Feb-19 | 28-Feb-19 | 26.45835648 |
| 71 | 200 | 27-Feb-19 | 28-Feb-19 | 0.471296296 |
| 72 | 200 | 6-Feb-19 | 27-Feb-19 | 20.54436343 |
| 73 | 200 | 23-Feb-19 | 27-Feb-19 | 3.598854167 |
| 74 | 200 | 6-Feb-19 | 5-Mar-19 | 26.54347222 |
| 75 | 200 | 28-Feb-19 | 5-Mar-19 | 4.303773148 |
| 76 | 200 | 12-Feb-19 | 6-Mar-19 | 21.56993056 |
| 77 | 200 | 1-Mar-19 | 6-Mar-19 | 4.597615741 |
| 78 | 200 | 12-Feb-19 | 14-Mar-19 | 29.50417824 |
| 79 | 200 | 7-Mar-19 | 14-Mar-19 | 6.083541667 |
| 80 | 200 | 28-Feb-19 | 28-Mar-19 | 27.5291088 |
| 81 | 200 | 25-Mar-19 | 28-Mar-19 | 2.637824074 |
| 82 | 200 | 29-Jan-19 | 28-Feb-19 | 29.34280093 |
| 83 | 200 | 21-Feb-19 | 28-Feb-19 | 6.233831019 |
| 84 | 200 | 19-Feb-19 | 30-Apr-19 | 69.51832176 |
| 85 | 200 | 7-Feb-19 | 5-Mar-19 | 25.74865741 |
| 86 | 200 | 27-Feb-19 | 5-Mar-19 | 5.380034722 |
| 87 | 200 | 21-Feb-19 | 21-Mar-19 | 27.56310185 |
| 88 | 200 | 19-Mar-19 | 21-Mar-19 | 1.161828704 |
| 89 | 200 | 26-Feb-19 | 28-Mar-19 | 29.41315972 |
| 90 | 200 | 22-Mar-19 | 28-Mar-19 | 5.673703704 |
| 91 | 200 | 26-Feb-19 | 28-Mar-19 | 29.5131713 |
| 92 | 200 | 20-Mar-19 | 28-Mar-19 | 7.073414352 |
| 93 | 200 | 28-Feb-19 | 15-Apr-19 | 45.63513889 |
| 94 | 200 | 5-Apr-19 | 15-Apr-19 | 9.479456019 |
| 95 | 200 | 1-Mar-19 | 29-Mar-19 | 27.54568287 |
| 96 | 200 | 25-Mar-19 | 29-Mar-19 | 3.044340278 |
| 97 | 200 | 4-Mar-19 | 27-Mar-19 | 22.52392361 |
| 98 | 200 | 21-Mar-19 | 27-Mar-19 | 5.074421296 |
| 99 | 200 | 14-Feb-19 | 19-Mar-19 | 32.54349537 |
| 100 | 200 | 13-Mar-19 | 19-Mar-19 | 5.265266204 |
My current SQL query looks like this:
SELECT
    Step,
    ROUND(MEDIAN(Part_Finish - Step_Start), 2) AS "The_Median",
    ROUND(AVG(Part_Finish - Step_Start), 2) AS "The_Average",
    PERCENTILE_CONT(0.20) WITHIN GROUP (ORDER BY (Part_Finish - Step_Start) ASC) AS "Best_Time",
    PERCENTILE_CONT(0.80) WITHIN GROUP (ORDER BY (Part_Finish - Step_Start) ASC) AS "Worst_Time"
FROM
    myTbl
GROUP BY
    Step
However, I am not sure if my results are correct, because I don't think I am using PERCENTILE_CONT() correctly. How can I use PERCENTILE_CONT() (or another method) to find the average or median (whichever is easier) "time to complete" based on the best 20% of the data, and the worst 20% of the data?
I would expect some results to look like this:
| Step | The_Average | The_Median | Best_Time | Worst_Time |
|:----:|:-----------:|:----------:|:---------:|:----------:|
| 200 | < value > | < value > | < value > | < value > |
where the < value > fields are the properly calculated average, median, and best and worst of the dataset, with best and worst calculated by taking the average or median of the top 20% of the data (i.e., the smallest times) and the worst 20% of the data (i.e., the largest times).
PERCENTILE_CONT is a window function, so if you just want a result set consisting of a single record with scalar values, you may try selecting distinct:
SELECT DISTINCT
    PERCENTILE_CONT(0.5) WITHIN GROUP (ORDER BY Part_Finish - Step_Start) OVER () AS "The_median",
    ROUND(AVG(Part_Finish - Step_Start) OVER (), 2) AS "The_Average",
    PERCENTILE_CONT(0.20) WITHIN GROUP (ORDER BY Part_Finish - Step_Start) OVER () AS "Best_Time",
    PERCENTILE_CONT(0.80) WITHIN GROUP (ORDER BY Part_Finish - Step_Start) OVER () AS "Worst_Time"
FROM myTbl;
The reason for the above approach is that selecting PERCENTILE_CONT, a window function, over your entire table would just return the entire table as the result set. But, as you are using it, the values would always be the same for each record. Therefore, we can just take the distinct value to get a single result.
If you instead expect a different row for each Step value, then you should add PARTITION BY Step to the OVER clause of each PERCENTILE_CONT call, e.g.
PERCENTILE_CONT(0.5) WITHIN GROUP (ORDER BY Part_Finish - Step_Start)
    OVER (PARTITION BY Step) AS "The_median"
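If what is wanted is instead the average (or median) time of the best 20% and the worst 20% of the rows, rather than the 20th/80th percentile values themselves, one option is to compute the percentile cutoffs per Step first and then average only the rows on either side of them. This is just a sketch under that reading, reusing the column names from the question (myTbl, Step, Part_Finish, Step_Start); the CTE names are made up:
-- Sketch: per-Step averages of the fastest 20% and slowest 20% of times (Oracle)
WITH times AS (
    SELECT Step, Part_Finish - Step_Start AS t
    FROM myTbl
),
cutoffs AS (
    SELECT Step,
           PERCENTILE_CONT(0.20) WITHIN GROUP (ORDER BY t) AS p20,
           PERCENTILE_CONT(0.80) WITHIN GROUP (ORDER BY t) AS p80
    FROM times
    GROUP BY Step
)
SELECT t.Step,
       ROUND(AVG(t.t), 2)                                 AS The_Average,
       ROUND(MEDIAN(t.t), 2)                              AS The_Median,
       ROUND(AVG(CASE WHEN t.t <= c.p20 THEN t.t END), 2) AS Best_Time,   -- mean of the fastest 20%
       ROUND(AVG(CASE WHEN t.t >= c.p80 THEN t.t END), 2) AS Worst_Time   -- mean of the slowest 20%
FROM times t
JOIN cutoffs c ON c.Step = t.Step
GROUP BY t.Step;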

SQL subcategory total is not properly placed at the end of every parent category

I am having trouble with an SQL query. The query result should look like this:
+------------+------------+-----+------+-------+
| District   | Tehsil     | yes | no   | Total |
+------------+------------+-----+------+-------+
| ABBOTTABAD | ABBOTTABAD | 377 | 5927 | 6304  |
| ABBOTTABAD | HAVELIAN   | 112 | 2276 | 2388  |
| ABBOTTABAD | Overall    | 489 | 8203 | 8692  |
| CHARSADDA  | CHARSADDA  | 289 | 3762 | 4051  |
| CHARSADDA  | SHABQADAR  | 121 | 1376 | 1497  |
| CHARSADDA  | TANGI      | 94  | 1703 | 1797  |
| CHARSADDA  | Overall    | 504 | 6841 | 7345  |
+------------+------------+-----+------+-------+
The overall total should be shown at the end of every parent category, but right now it shows like this:
+------------+------------+-----+------+-------+
| District   | Tehsil     | yes | no   | Total |
+------------+------------+-----+------+-------+
| ABBOTTABAD | ABBOTTABAD | 377 | 5927 | 6304  |
| ABBOTTABAD | HAVELIAN   | 112 | 2276 | 2388  |
| ABBOTTABAD | Overall    | 489 | 8203 | 8692  |
| CHARSADDA  | CHARSADDA  | 289 | 3762 | 4051  |
| CHARSADDA  | Overall    | 504 | 6841 | 7345  |
| CHARSADDA  | SHABQADAR  | 121 | 1376 | 1497  |
| CHARSADDA  | TANGI      | 94  | 1703 | 1797  |
+------------+------------+-----+------+-------+
My query is sorting the second column within the first column, even though the ORDER BY is applied only to my first column. This is my query:
select District as 'District', tName as 'tehsil', [1] as 'yes', [0] as 'no', ISNULL([1]+[0], 0) as "Total"
from
(
    select d.Name as 'District',
           case when grouping(t.Name) = 1 then 'Overall' else t.Name end as tName,
           BoundaryWallAvailable,
           count(*) as total
    from School s
    INNER JOIN SchoolIndicator i ON (i.refSchoolID = s.SchoolID)
    INNER JOIN Tehsil t ON (t.TehsilID = s.refTehsilID)
    INNER JOIN district d ON (d.DistrictID = t.refDistrictID)
    group by
        GROUPING SETS ((d.Name, BoundaryWallAvailable), (d.Name, t.Name, BoundaryWallAvailable))
) B
PIVOT
(
    max(total) for BoundaryWallAvailable in ([1],[0])
) as Pvt
order by District
P.S.: BoundaryWallAvailable is a single column; through pivoting I am breaking it into the Yes and No columns.
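One way to force the Overall row to the bottom of each District group is to make the ORDER BY explicit about it, for example by sorting on a CASE expression over tName. A sketch of just the changed ORDER BY (it assumes no real Tehsil is literally named 'Overall'):
order by District,
         case when tName = 'Overall' then 1 else 0 end,  -- push the Overall row last
         tName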

SQL percent of total and weighted average

I have the following PostgreSQL table stock, where the structure is the following:
| column | pk  |
+--------+-----+
| date   | yes |
| id     | yes |
| type   | yes |
| qty    |     |
| fee    |     |
The table looks like this:
| date       | id  | type | qty  | cost |
+------------+-----+------+------+------+
| 2015-01-01 | 001 | CB04 | 500  | 2    |
| 2015-01-01 | 002 | CB04 | 1500 | 3    |
| 2015-01-01 | 003 | CB04 | 500  | 1    |
| 2015-01-01 | 004 | CB04 | 100  | 5    |
| 2015-01-01 | 001 | CB02 | 800  | 6    |
| 2015-01-02 | 002 | CB03 | 3100 | 1    |
I want to create a view or query, so that the result looks like this.
The table will show the t_qty, % of total Qty, and weighted fee for each day and each type:
% of total Qty = qty / t_qty
weighted fee = fee * % of total Qty
| date       | id  | type | qty  | cost | t_qty | % of total Qty | weighted fee |
+------------+-----+------+------+------+-------+----------------+--------------+
| 2015-01-01 | 001 | CB04 | 500  | 2    | 2600  | 0.19           | 0.38         |
| 2015-01-01 | 002 | CB04 | 1500 | 3    | 2600  | 0.58           | 1.73         |
| 2015-01-01 | 003 | CB04 | 500  | 1    | 2600  | 0.19           | 0.192        |
| 2015-01-01 | 004 | CB04 | 100  | 5    | 2600  | 0.04           | 0.192        |
I could do this in Excel, but the dataset is too big to process.
You can use SUM as a window function and some calculation to do it.
SELECT *,
       SUM(qty) OVER (PARTITION BY date ORDER BY date) t_qty,
       qty::numeric / SUM(qty) OVER (PARTITION BY date ORDER BY date),
       fee * (qty::numeric / SUM(qty) OVER (PARTITION BY date ORDER BY date))
FROM T
If you want rounding, you can use the ROUND function.
SELECT *,
       SUM(qty) OVER (PARTITION BY date ORDER BY date) t_qty,
       ROUND(qty::numeric / SUM(qty) OVER (PARTITION BY date ORDER BY date), 3) "% of total Qty",
       ROUND(fee * (qty::numeric / SUM(qty) OVER (PARTITION BY date ORDER BY date)), 3) "weighted fee"
FROM T
sqlfiddle
[Results]:
| date | id | type | qty | fee | t_qty | % of total Qty | weighted fee |
|------------|-----|------|------|-----|-------|----------------|--------------|
| 2015-01-01 | 001 | CB04 | 500 | 2 | 2600 | 0.192 | 0.385 |
| 2015-01-01 | 002 | CB04 | 1500 | 3 | 2600 | 0.577 | 1.731 |
| 2015-01-01 | 003 | CB04 | 500 | 1 | 2600 | 0.192 | 0.192 |
| 2015-01-01 | 004 | CB04 | 100 | 5 | 2600 | 0.038 | 0.192 |
| 2015-01-02 | 002 | CB03 | 3100 | 1 | 3100 | 1 | 1 |
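One hedged side note: the expected output in the question shows t_qty = 2600 for the CB04 rows even though 2015-01-01 also has a CB02 row with qty 800, which suggests the total is meant per date and type rather than per date alone. If that is the intent, partition by both columns, e.g.:
SELECT *,
       SUM(qty) OVER (PARTITION BY date, type) AS t_qty,
       ROUND(qty::numeric / SUM(qty) OVER (PARTITION BY date, type), 3) AS "% of total Qty",
       ROUND(fee * qty::numeric / SUM(qty) OVER (PARTITION BY date, type), 3) AS "weighted fee"
FROM T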

How to check dates condition from one table to another in SQL

Which way can we use to check and compare dates from one table against another?
Table : inc
+--------+---------+-----------+-----------+-------------+
| inc_id | cust_id | item_id   | serv_time | inc_date    |
+--------+---------+-----------+-----------+-------------+
| 1      | john    | HP        | 40        | 17-Apr-2015 |
| 2      | John    | HP        | 60        | 10-Jan-2016 |
| 3      | Nick    | Cisco     | 120       | 11-Jan-2016 |
| 4      | samanta | EMC       | 180       | 12-Jan-2016 |
| 5      | Kerlee  | Oracle    | 40        | 13-Jan-2016 |
| 6      | Amir    | Microsoft | 300       | 14-Jan-2016 |
| 7      | John    | HP        | 120       | 15-Jan-2016 |
| 8      | samanta | EMC       | 20        | 16-Jan-2016 |
| 9      | Kerlee  | Oracle    | 10        | 2-Feb-2017  |
+--------+---------+-----------+-----------+-------------+
Table: Contract:
+-----------+---------+----------+------------+
| item_id   | con_id  | Start    | End        |
+-----------+---------+----------+------------+
| Dell      | DE2015  | 1/1/2015 | 12/31/2015 |
| HP        | HP2015  | 1/1/2015 | 12/31/2015 |
| Cisco     | CIS2016 | 1/1/2016 | 12/31/2016 |
| EMC       | EMC2016 | 1/1/2016 | 12/31/2016 |
| HP        | HP2016  | 1/1/2016 | 12/31/2016 |
| Oracle    | OR2016  | 1/1/2016 | 12/31/2016 |
| Microsoft | MS2016  | 1/1/2016 | 12/31/2016 |
| Microsoft | MS2017  | 1/1/2017 | 12/31/2017 |
+-----------+---------+----------+------------+
Result:
+-------+---------+---------+--------------+
| Calls | Cust_id | Con_id  | Tot_Ser_Time |
+-------+---------+---------+--------------+
| 2     | John    | HP2016  | 180          |
| 2     | samanta | EMC2016 | 200          |
| 1     | Nick    | CIS2016 | 120          |
| 1     | Amir    | MS2016  | 300          |
| 1     | Kerlee  | OR2016  | 40           |
+-------+---------+---------+--------------+
My query:
select count(inc_id) as Calls, inc.cust_id, contract.con_id,
       sum(inc.serv_time) as tot_serv_time
from inc
inner join contract on inc.item_id = contract.item_id
where inc.inc_date between '2016-01-01' and '2016-12-31'
group by inc.cust_id, contract.con_id
I want the result from the inc table, filtered between 1-Jan-2016 and 31-Dec-2016, with the count of inc_id based on the items and their contract start and end dates.
If I understand your problem correctly, this query will return the desired result:
select
    count(*) as Calls,
    inc.cust_id,
    contract.con_id,
    sum(inc.serv_time) as tot_serv_time
from
    inc inner join contract
        on inc.item_id = contract.item_id
        and inc.inc_date between contract.start and contract.end
where
    inc.inc_date between '2016-01-01' and '2016-12-31'
group by
    inc.cust_id,
    contract.con_id
The question is a little vague, so you might need some adjustments to this query.
select
      Calls     = count(*)
    , Cust      = i.Cust_id
    , Contract  = c.con_id
    , Serv_Time = sum(Serv_Time)
from inc as i
inner join contract as c
    on i.item_id = c.item_id
    and i.inc_date >= c.[start]
    and i.inc_date <= c.[end]
where c.[start] >= '20160101'
group by i.Cust_id, c.con_id
order by i.Cust_Id, c.con_id
returns:
+-------+---------+----------+-----------+
| Calls | Cust    | Contract | Serv_Time |
+-------+---------+----------+-----------+
| 1     | Amir    | MS2016   | 300       |
| 2     | John    | HP2016   | 180       |
| 1     | Kerlee  | OR2016   | 40        |
| 1     | Nick    | CIS2016  | 120       |
| 2     | samanta | EMC2016  | 200       |
+-------+---------+----------+-----------+
test setup: http://rextester.com/WSYDL43321
create table inc(
inc_id int
, cust_id varchar(16)
, item_id varchar(16)
, serv_time int
, inc_date date
);
insert into inc values
(1,'john','HP', 40 ,'17-Apr-2015')
,(2,'John','HP', 60 ,'10-Jan-2016')
,(3,'Nick','Cisco', 120 ,'11-Jan-2016')
,(4,'samanta','EMC', 180 ,'12-Jan-2016')
,(5,'Kerlee','Oracle', 40 ,'13-Jan-2016')
,(6,'Amir','Microsoft', 300 ,'14-Jan-2016')
,(7,'John','HP', 120 ,'15-Jan-2016')
,(8,'samanta','EMC', 20 ,'16-Jan-2016')
,(9,'Kerlee','Oracle', 10 ,'02-Feb-2017');
create table contract (
item_id varchar(16)
, con_id varchar(16)
, [Start] date
, [End] date
);
insert into contract values
('Dell','DE2015','20150101','20151231')
,('HP','HP2015','20150101','20151231')
,('Cisco','CIS2016','20160101','20161231')
,('EMC','EMC2016','20160101','20161231')
,('HP','HP2016','20160101','20161231')
,('Oracle','OR2016','20160101','20161231')
,('Microsoft','MS2016','20160101','20161231')
,('Microsoft','MS2017','20170101','20171231');

sum qty with different date SQL Server

I have 2 tables.
Table 1 is the transaction table:
+----------+-----------+---------+------------+-----+
| IDOutlet | IDProduct | TrxType | TrxDate    | Qty |
+----------+-----------+---------+------------+-----+
| 101      | ASD11     | 2       | 11/11/2015 | 15  |
| 101      | ASD11     | 3       | 11/14/2015 | -3  |
| 101      | ASD11     | 3       | 11/17/2015 | -6  |
| 101      | ASD11     | 2       | 11/22/2015 | 7   |
| 101      | ASD11     | 3       | 11/26/2015 | -2  |
| 101      | ASD11     | 2       | 12/3/2015  | 1   |
| 101      | ASD11     | 3       | 12/9/2015  | -3  |
| 101      | ASD11     | 3       | 12/11/2015 | -2  |
| 101      | ASD11     | 2       | 12/12/2015 | 5   |
| 101      | FFD34     | 2       | 11/11/2015 | 9   |
| 101      | FFD34     | 3       | 11/14/2015 | -3  |
| 101      | FFD34     | 2       | 11/16/2015 | 3   |
| 101      | FFD34     | 3       | 11/19/2015 | -4  |
| 101      | FFD34     | 3       | 11/23/2015 | -3  |
| 102      | FFD34     | 2       | 11/26/2015 | 2   |
| 102      | FFD34     | 2       | 11/28/2015 | 4   |
| 102      | FFD34     | 3       | 11/29/2015 | -5  |
| 102      | FFD34     | 3       | 12/1/2015  | -1  |
+----------+-----------+---------+------------+-----+
Table 2 is the opnametable:
+----------+-----------+------------+-----------+
| IDOutlet | IDProduct | OpnameDate | QtyOpname |
+----------+-----------+------------+-----------+
| 101      | ASD11     | 11/20/2015 | 5         |
| 101      | FFD34     | 11/30/2015 | 5         |
| 102      | FFD34     | 11/30/2015 | 1         |
+----------+-----------+------------+-----------+
And I want the result to look like this:
+----------+-----------+------------+---------+
| IDOutlet | IDProduct | OpnameDate | Sum Qty |
+----------+-----------+------------+---------+
| 101      | ASD11     | 11/20/2015 | 6       |
| 101      | FFD34     | 11/20/2015 | 5       |
| 102      | FFD34     | 11/30/2015 | 1       |
+----------+-----------+------------+---------+
You can use a date comparison in your JOIN criteria:
SELECT T2.IDOutlet, T2.IDProduct, T2.OpnameDate, SUM(T1.Qty) AS Sum_Qty
FROM opnametable T2
LEFT JOIN transaction T1
    ON T2.IDOutlet = T1.IDOutlet
    AND T2.IDProduct = T1.IDProduct
    AND T1.TrxDate <= T2.OpnameDate
GROUP BY T2.IDOutlet, T2.IDProduct, T2.OpnameDate
I'm assuming the dates are stored in an appropriate date datatype, and that you want to include OpnameDate.
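One small, optional addition: with the LEFT JOIN, an opname row that has no earlier transactions would get a NULL Sum_Qty; COALESCE can turn that into 0 if that is what you want (and remember to bracket the table name if it really is called transaction, since that is a reserved word in SQL Server). A sketch of that variant:
SELECT T2.IDOutlet, T2.IDProduct, T2.OpnameDate,
       COALESCE(SUM(T1.Qty), 0) AS Sum_Qty   -- 0 instead of NULL when nothing matches
FROM opnametable T2
LEFT JOIN [transaction] T1
    ON T2.IDOutlet = T1.IDOutlet
    AND T2.IDProduct = T1.IDProduct
    AND T1.TrxDate <= T2.OpnameDate
GROUP BY T2.IDOutlet, T2.IDProduct, T2.OpnameDate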
Try this
select o.IDOutlet, o.IDProduct, sum(t.Qty) as [Sum Qty]
from opnametable o
left outer join transaction1 t
    on o.IDOutlet = t.IDOutlet
    and o.IDProduct = t.IDProduct
    and t.trxdate < o.OpnameDate
group by o.IDOutlet, o.IDProduct