I'm trying to sum all sales of a selling vehicle over a period of time. The problem is that every product sold is one row with its amount and price, plus the bill total and the bill number.
So I have 2 options: multiply every sold product's amount by its price and sum that, or take the bill totals, remove the duplicate rows and sum those. I chose the second option.
So now I have a [Location Code] (selling vehicle), [Bill No] and a [Total Price].
So I get:
0001 0001/00277343 10,26000000000000000000
0001 0001/00277343 10,26000000000000000000
0001 0001/00277343 10,26000000000000000000
0001 0001/00277343 10,26000000000000000000
0001 0001/00277345 10,33000000000000000000
0001 0001/00277345 10,33000000000000000000
0001 0001/00277345 10,33000000000000000000
0001 0001/00277347 24,35000000000000000000
0001 0001/00277348 30,31000000000000000000
0001 0001/00277348 30,31000000000000000000
0001 0001/00277349 2,69000000000000000000
As you see there are duplicate entries, because one bill contains more than one item. Now I just want to sum the unique prices so that I get
0001 1822,50
At the moment I have only gotten as far as this:
select [Location Code], [Bill No_] , Price from [Item Ledger Entry]
where [Location Code] = '0001' and [Document Date] = '01.04.2015'
I have tried several queries but none of them works. The best result comes from this one, but it isn't summed:
select distinct[Bill No_], [Location Code] , Price from [Item Ledger Entry]
where [Location Code] = '0001' and [Document Date] = '01.04.2015'
I think you are looking for this:
SELECT [Location Code], [Bill No_], SUM(Price) AS Price
FROM (SELECT DISTINCT [Location Code], [Bill No_] , Price from [Item Ledger Entry]
WHERE [Location Code] = '0001' and [Document Date] = '01.04.2015') t
GROUP BY [Location Code], [Bill No_]
select [Location Code], [Bill No_] , SUM(Price) from [Item Ledger Entry]
where [Location Code] = '0001' and [Document Date] = '01.04.2015'
group by [Location Code], [Bill No_]
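If the goal is the single per-location total shown earlier (0001 1822,50), here is a sketch that first de-duplicates to one row per bill and then sums per location (assuming Price repeats the bill total on every item line):
-- one row per (location, bill, price) first, then one total per location
SELECT [Location Code], SUM(Price) AS Total
FROM (SELECT DISTINCT [Location Code], [Bill No_], Price
      FROM [Item Ledger Entry]
      WHERE [Location Code] = '0001' AND [Document Date] = '01.04.2015') t
GROUP BY [Location Code]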
I am trying to create an openAR file and I am stuck trying to group this data by Customer and Invoice. The file will get created daily.
[FILE DATE]  [CUSTOMER ID]  [INVOICE NUMBER]  [INVOICE TYPE]  [INVOICE DATE]  [OPEN INVOICE AMOUNT]
01/22/2021   00100000       INV1000           INV             06/08/2020      1000
01/22/2021   00100000       INV1001           INV             06/15/2020      50
01/22/2021   00100000       INV1002           INV             08/20/2020      50
01/22/2021   00100000       INV1005           CM              10/18/2020      -100
01/22/2021   00100000       PAY1000           PAY             06/15/2020      -750
01/22/2021   00100000       PAY1000           PAY             06/15/2020      820
I need to group this data and sum the open invoice amounts per invoice. The file will be exported automatically to another company to process the AR info, so the column headers need to be exactly as shown below. I usually use aliases to group, but with fixed two-word column headers I am stuck on how to group this query. Also, how would you group on GETDATE() and that CASE statement?
SELECT
CONVERT (nvarchar(30), GETDATE(), 101) as [FILE DATE],
GACC.BPR_0 as [CUSTOMER ID],
GACC.NUM_0 as [INVOICE NUMBER],
GACC.TYP_0 as [INVOICE TYPE],
Case
When GACC.TYP_0 in ('INV', 'CM') Then CONVERT (nvarchar(30), SI.BPRDAT_0 , 101)
Else CONVERT (nvarchar(30), PAY.ACCDAT_0 , 101)
End as [INVOICE DATE],
(GACC.AMOUNT_0 * GACC.SNS_0) as [OPEN INVOICE AMOUNT] --- want to SUM and group this column for each INV#
FROM dbo.GACCDUDATE as GACC
left join dbo.SINVOICE as SI --- Invoice Table
on GACC.NUM_0 = SI.NUM_0
left join dbo.PAYMENT as PAY -- Payment Table
on PAY.NUM_0 = GACC.NUM_0
Thank you so much for helping me group this by customer and invoice with the sum of the open amount.
Edit - Desired output
[FILE DATE]  [CUSTOMER ID]  [INVOICE NUMBER]  [INVOICE TYPE]  [INVOICE DATE]  [OPEN INVOICE AMOUNT]
01/22/2021   00100000       INV1000           INV             06/08/2020      1000
01/22/2021   00100000       INV1001           INV             06/15/2020      50
01/22/2021   00100000       INV1002           INV             08/20/2020      50
01/22/2021   00100000       INV1005           CM              10/18/2020      -100
01/22/2021   00100000       PAY1000           PAY             06/15/2020      70
Use a CTE so the aliases can be referenced in the GROUP BY. Alternatively, as in my comment, repeat the underlying expressions in the GROUP BY (e.g. the CASE for [INVOICE DATE]) and wrap (GACC.AMOUNT_0 * GACC.SNS_0) in SUM(); a sketch of that variant follows the CTE below.
With MyInvoices as
(
SELECT
CONVERT (nvarchar(30), GETDATE(), 101) as [FILE DATE],
GACC.BPR_0 as [CUSTOMER ID],
GACC.NUM_0 as [INVOICE NUMBER],
GACC.TYP_0 as [INVOICE TYPE],
Case
When GACC.TYP_0 in ('INV', 'CM') Then CONVERT (nvarchar(30), SI.BPRDAT_0 , 101)
Else CONVERT (nvarchar(30), PAY.ACCDAT_0 , 101)
End as [INVOICE DATE],
(GACC.AMOUNT_0 * GACC.SNS_0) as [OPEN INVOICE AMOUNT]
FROM dbo.GACCDUDATE as GACC
left join dbo.SINVOICE as SI
on GACC.NUM_0 = SI.NUM_0
left join dbo.PAYMENT as PAY
on PAY.NUM_0 = GACC.NUM_0
)
select [FILE DATE], [CUSTOMER ID], [INVOICE NUMBER], [INVOICE TYPE], [INVOICE DATE], SUM([OPEN INVOICE AMOUNT]) as [OPEN INVOICE AMOUNT] from MyInvoices
group by [FILE DATE], [CUSTOMER ID], [INVOICE NUMBER], [INVOICE TYPE],[INVOICE DATE]
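For completeness, here is a sketch of the variant without the CTE, repeating the expressions in the GROUP BY and summing the amount (same assumed tables and columns as above):
SELECT
    CONVERT(nvarchar(30), GETDATE(), 101) as [FILE DATE], -- references no column, so it needs no grouping
    GACC.BPR_0 as [CUSTOMER ID],
    GACC.NUM_0 as [INVOICE NUMBER],
    GACC.TYP_0 as [INVOICE TYPE],
    Case
        When GACC.TYP_0 in ('INV', 'CM') Then CONVERT(nvarchar(30), SI.BPRDAT_0, 101)
        Else CONVERT(nvarchar(30), PAY.ACCDAT_0, 101)
    End as [INVOICE DATE],
    SUM(GACC.AMOUNT_0 * GACC.SNS_0) as [OPEN INVOICE AMOUNT]
FROM dbo.GACCDUDATE as GACC
left join dbo.SINVOICE as SI on GACC.NUM_0 = SI.NUM_0   -- Invoice Table
left join dbo.PAYMENT as PAY on PAY.NUM_0 = GACC.NUM_0  -- Payment Table
GROUP BY
    GACC.BPR_0,
    GACC.NUM_0,
    GACC.TYP_0,
    Case
        When GACC.TYP_0 in ('INV', 'CM') Then CONVERT(nvarchar(30), SI.BPRDAT_0, 101)
        Else CONVERT(nvarchar(30), PAY.ACCDAT_0, 101)
    End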
Is there any way to do this in SQL? Right now I have to run the same query 30 times, joined with UNION ALL, to get a monthly total, which takes a lot of execution time.
If it is possible it would be a huge help.
Example: the cumulative sum of item quantity on each day of a given month.
1 July: the total item quantity from the day transactions started up to 1 July. 2 July: that total plus 1 July's quantity. 3 July: that total plus 1 July's and 2 July's quantities.
Thanks in advance
Declare @DateTo DateTime
Set @DateTo='2018-07-19'
Select @DateTo DateTo, [Item No_], sum(Quantity) from
[Snowman Logistics Limited$Item Ledger Entry]
where [Posting Date]<=@DateTo and [Item No_]='H1023038'
Group by [Item No_]
union all
Select @DateTo+1 DateTo, [Item No_], sum(Quantity) from
[Snowman Logistics Limited$Item Ledger Entry]
where [Posting Date]<=@DateTo+1 and [Item No_]='H1023038'
Group by [Item No_]
union all
Select @DateTo+2 DateTo, [Item No_], sum(Quantity) from
[Snowman Logistics Limited$Item Ledger Entry]
where [Posting Date]<=@DateTo+2 and [Item No_]='H1023038'
Group by [Item No_]
Result below
DateTo Item No_ (Quantity)
2018-07-19 00:00:00.000 H1023038 0.00000000000000000000
2018-07-20 00:00:00.000 H1023038 20100.00000000000000000000
2018-07-21 00:00:00.000 H1023038 12500.00000000000000000000
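A set-based alternative that avoids repeating the query 30 times is to aggregate once per day and take a running total with a windowed SUM. This is only a sketch against the same table and item, assuming SQL Server 2012+ for the window frame; days with no postings will not appear unless a calendar table is joined in:
Declare @From date = '2018-07-19', @To date = '2018-08-17'
;WITH Daily AS (
    -- one row per posting day with that day's total quantity
    SELECT CAST([Posting Date] AS date) AS DateTo, SUM(Quantity) AS DayQty
    FROM [Snowman Logistics Limited$Item Ledger Entry]
    WHERE [Item No_] = 'H1023038' AND [Posting Date] <= @To
    GROUP BY CAST([Posting Date] AS date)
)
SELECT DateTo, 'H1023038' AS [Item No_], Quantity
FROM (
    -- running total from the first transaction up to each day
    SELECT DateTo,
           SUM(DayQty) OVER (ORDER BY DateTo ROWS UNBOUNDED PRECEDING) AS Quantity
    FROM Daily
) r
WHERE DateTo BETWEEN @From AND @To
ORDER BY DateTo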
I have used DATEDIFF to relate the date the first unit rate was created ([Starting Date]) to the [Posting Date], which is when the first transaction of that item was posted.
I have the result that I need; however, the DATEDIFF conditions give me NULL values in [Starting Date] for some rows.
SELECT DISTINCT b.[Entry No_] ,
a.[Starting Date],
b.[Posting Date],
b.[Item No_],
b.[Invoiced Quantity],
a.[Litre Conversion Factor],
a.[Unit Rate] ,
b.[Location Code],
a.[Excise Location],
a.[Excise Type Code],
a.[Unit Of Measure Code]
FROM [Spier Live$Value Entry] b
LEFT JOIN [Transfer Excise Tbl] a
ON a.[No_] = b.[Item No_]
AND b.[Location Code] = a.[Location Code]
AND DateDiff(y,b.[Posting Date],a.[Starting Date]) > -365 --DateDiff Year -365 for starting date
AND DateDiff(y,b.[Posting Date],a.[Starting Date]) < 0 --DateDiff Year < 0 for posting date
WHERE b.[Posting Date] > '2013-02-26' --This is when the unit rate was entered
AND b.[Gen_ Bus_ Posting Group] IN ('LOCA','EXSA')
AND b.[Invoiced Quantity] <>0 --Removing all zero values
AND b.[Item No_] = 'F00335'
ORDER BY b.[Posting Date]
My result (screenshot omitted)
Transfer Excise Tbl, the table I am joining on (screenshot omitted)
As alex points out in the comments, the y datepart in DATEDIFF is dayofyear, not year, so your conditions compare differences in days rather than years. Did you mean <= 0, by the way? As written, the criteria probably aren't what you really want, and where they don't match they make the LEFT JOIN find no row, which is why the table a columns come back as NULL.
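A quick way to see the difference between the two dateparts:
-- 'y' is the abbreviation for dayofyear, 'yy' is the abbreviation for year
SELECT DATEDIFF(y,  '2015-01-01', '2015-12-31') AS DayOfYearDiff, -- 364 day boundaries crossed
       DATEDIFF(yy, '2015-01-01', '2015-12-31') AS YearDiff       -- 0 year boundaries crossed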
Good day
I have two tables I need to join: Transfer Excise Tbl and Value Entry.
Transfer Excise Tbl: No_ must match the Item No_ in the Value Entry table. I did compare for items that are in Value Entry but not in Transfer Excise and found a few.
Transfer Excise Tbl:
Starting Date No_ Excise Location Location Code Unit Rate Excise Type Code Unit Of Measure Code Litre Conversion Factor
----------------------- -------------------- --------------- ------------- --------------------------------------- ---------------- -------------------- ---------------------------------------
2013-02-28 00:00:00.000 600011263 NONBOND ~DUTY PAID 2.70000000000000000000 UWNEPACK LITRES 1.33333000000000000000
2014-02-27 00:00:00.000 600011263 NONBOND ~DUTY PAID 2.87000000000000000000 UWNEPACK LITRES 1.33333000000000000000
2015-02-26 00:00:00.000 600011263 NONBOND ~DUTY PAID 3.07000000000000000000 UWNEPACK LITRES 1.33333000000000000000
2016-02-25 00:00:00.000 600011263 NONBOND ~DUTY PAID 3.31000000000000000000 UWNEPACK LITRES 1.33333000000000000000
Value Entry Table:
Item No_ Location Code Gen_ Bus_ Posting Group Invoiced Quantity
-------------------- ------------- ----------------------- ---------------------------------------
F00330 VINI EXSA -10.00000000000000000000
F00331 VINI EXSA -30.00000000000000000000
F00332 VINI EXSA -40.00000000000000000000
I want to write the query to exclude duplicates, as the script below still creates duplicates. The PK is Item No and the FK is Location Code. You will see in the Transfer Excise table that for each year a new unit rate was supplied for a specific item and location.
SELECT DISTINCT a.[Starting Date],
b.[Posting Date],
b.[Item No_],
b.[Invoiced Quantity],
a.[Litre Conversion Factor],
a.[Unit Rate] ,
a.[Location Code],
a.[Excise Location],
a.[Excise Type Code],
a.[Unit Of Measure Code]
FROM [Transfer Excise Tbl] a JOIN [Spier Live$Value Entry] b
ON a.[No_] = b.[Item No_]
WHERE b.[Posting Date] > '2013-02-26 '
AND b.[Location Code] = a.[Location Code]
AND b.[Gen_ Bus_ Posting Group] IN ('LOCA','EXSA')
AND b.[Posting Date] >= a.[Starting Date]
AND b.[Invoiced Quantity] <>0
First of all, there is something wrong with your [Value Entry] sample: your query refers to a [Posting Date] column, but there is no such column in your example data.
Now, if I have understood the scenario correctly, I think your problem is related to how you join lines from the two tables.
You get more lines than you expect because you JOIN each line in [Value Entry] with ALL the lines in [Transfer Excise Tbl] that have an older [Starting Date], not only the LAST (valid) one.
To solve the problem you should pre-calculate the period of validity of each [Transfer Excise Tbl] line by finding its [End Date], and then join on
JOIN b.[Posting Date] BETWEEN a.[Starting Date] AND a.[End Date]
The final query will be something like:
;WITH
EndDates as (-- add [End Date] to [Transfer Excise Tbl]
select t1.*, ISNULL([End Date], CONVERT(date, '9999-12-31', 121)) [End Date]
from [Transfer Excise Tbl] t1
outer apply (
select MIN([Starting Date]) [End Date]
from [Transfer Excise Tbl]
where [No_] = t1.[No_] and [Location Code] = t1.[Location Code] and [Starting Date] > t1.[Starting Date]
) T2
)
SELECT DISTINCT a.[Starting Date],
b.[Posting Date],
b.[Item No_],
b.[Invoiced Quantity],
a.[Litre Conversion Factor],
a.[Unit Rate] ,
a.[Location Code],
a.[Excise Location],
a.[Excise Type Code],
a.[Unit Of Measure Code]
FROM [EndDates] a JOIN [Spier Live$Value Entry] b ON a.[No_] = b.[Item No_] AND b.[Posting Date] BETWEEN a.[Starting Date] AND a.[End Date]
WHERE b.[Posting Date] > '2013-02-26 '
AND b.[Location Code] = a.[Location Code]
AND b.[Gen_ Bus_ Posting Group] IN ('LOCA','EXSA')
AND b.[Invoiced Quantity] <> 0
It should return only the number of rows you expect
I hope this helps
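On SQL Server 2012 or later, the same [End Date] can also be derived with LEAD() instead of the OUTER APPLY; a sketch reusing the column names above:
;WITH EndDates as (
    -- the next [Starting Date] for the same item/location becomes this row's [End Date]
    select t1.*,
           ISNULL(LEAD(t1.[Starting Date]) OVER (PARTITION BY t1.[No_], t1.[Location Code]
                                                 ORDER BY t1.[Starting Date]),
                  CONVERT(date, '9999-12-31', 121)) [End Date]
    from [Transfer Excise Tbl] t1
)
SELECT DISTINCT a.[Starting Date], b.[Posting Date], b.[Item No_], b.[Invoiced Quantity],
       a.[Litre Conversion Factor], a.[Unit Rate], a.[Location Code],
       a.[Excise Location], a.[Excise Type Code], a.[Unit Of Measure Code]
FROM EndDates a
JOIN [Spier Live$Value Entry] b
  ON a.[No_] = b.[Item No_]
 AND b.[Location Code] = a.[Location Code]
 AND b.[Posting Date] BETWEEN a.[Starting Date] AND a.[End Date]
WHERE b.[Posting Date] > '2013-02-26'
  AND b.[Gen_ Bus_ Posting Group] IN ('LOCA','EXSA')
  AND b.[Invoiced Quantity] <> 0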
I have this table called myTable
Posting Date|Item No_|Entry Type|
2015-01-13|1234|1
2015-01-13|1234|1
2015-01-12|1234|1
2015-01-12|5678|1
2015-02-12|4567|1
What I want is to only return results where an [Item No_] is in the table 1 time.
So in this example, I only want to return [Item No_] 5678 and 4567, because there is only one record for each of them, and ignore [Item No_] 1234.
This is the SQL I have tried, but something is wrong. Can anyone help me?
SELECT [Item No_], [Posting Date], COUNT([Item No_]) AS Antal
FROM myTable
GROUP BY [Entry Type], [Posting Date], [Item No_]
HAVING ([Entry Type] = 1) AND (COUNT([Item No_]) = 1)
ORDER BY [Posting Date] DESC
select [Item No_]
from myTable
group by [Item No_]
having count(*)=1
Remove Posting Date from group by
SELECT [Item No_], [Entry Type], COUNT([Item No_]) AS Antal
FROM myTable
GROUP BY [Entry Type], [Item No_]
HAVING COUNT([Item No_]) = 1
or if you want other details use a subquery
SELECT [Item No_],
       [Entry Type],
       [Posting Date]
FROM myTable a
WHERE EXISTS (SELECT 1
FROM myTable b
where a.[Item No_]=b.[Item No_]
GROUP BY [Entry Type],
[Item No_]
HAVING Count(1) = 1)
ORDER BY [Posting Date] DESC
or window function
;WITH cte
AS (SELECT [Item No_],
[Posting Date],
[Entry Type],
Row_number()OVER (Partition BY [Entry Type], [Item No_] ORDER BY [Item No_]) RN
FROM myTable)
SELECT *
FROM cte a
WHERE NOT EXISTS (SELECT 1
FROM cte b
WHERE a.[Item No_] = b.[Item No_]
AND rn > 1)
ORDER BY [Posting Date] DESC
You can use ROW_Number() in Sql Server
select * from (
SELECT [Item No_], [Posting Date],
Row_Number() over (Partition by [Item No_]
order by [Item No_]) RN
FROM myTable
)D
where D.RN=1
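Note that filtering on RN = 1 keeps the first row of every item rather than only the items that occur once. If whole rows for single-occurrence items are wanted, a windowed COUNT(*) per [Item No_] does it; a sketch against the same myTable:
select [Posting Date], [Item No_], [Entry Type]
from (
    SELECT *,
           COUNT(*) OVER (Partition by [Item No_]) AS Cnt -- how many rows share this Item No_
    FROM myTable
) D
where D.Cnt = 1
ORDER BY [Posting Date] DESC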