Error converting data type nvarchar to numeric but I have no idea why - SQL

I ran a query I created, but I am getting an "Error converting data type nvarchar to numeric" error along with "Warning: Null value is eliminated by an aggregate or other SET operation." I have no idea why, as I am not explicitly converting anything.
Here is my query:
SELECT DISTINCT TOP 1000
    O.Date_Entered
    ,O.Company_Code
    ,O.Division_Code
    ,O.Customer_Purchase_Order_Number
    ,O.Control_Number
    ,O.Customer_Number
    ,P.PickTicket_Number
    ,sh.PACKSLIP
    ,Accellos_Download
    ,Accellos_Allocated
    ,Accellos_Waved
    ,Accellos_Label
    ,Accellos_Last_Pick
    ,Accellos_Rating
    ,Accellos_Shipped
    ,Accellos_Upload
FROM [JMNYC-AMTDB].[AMTPLUS].[dbo].Orders O (nolock)
LEFT JOIN [JMNYC-AMTDB].[AMTPLUS].[dbo].PickTickets P (nolock)
    ON O.Company_Code = P.Company_Code
    AND O.Division_Code = P.Division_Code
    AND O.Control_Number = P.Control_Number
LEFT JOIN [JMDNJ-ACCELSQL].[A1WAREHOUSE].[dbo].SHIPHIST sh (nolock)
    ON O.Customer_Purchase_Order_Number = sh.cust_po
LEFT JOIN (
    SELECT
        Packslip
        ,MAX(CASE WHEN Action LIKE 'DNLOAD' THEN Date_Time END) AS Accellos_Download
        ,MAX(CASE WHEN Action LIKE 'ALLOC' THEN Date_Time END) AS Accellos_Allocated
        ,MAX(CASE WHEN Action LIKE 'WAVEORDER' THEN Date_Time END) AS Accellos_Waved
        ,MAX(CASE WHEN Action LIKE 'NEWLABEL' THEN Date_Time END) AS Accellos_Label
        ,MAX(CASE WHEN Action LIKE 'EOL_LSTP' THEN Date_Time END) AS Accellos_Last_Pick
        ,MAX(CASE WHEN Action LIKE 'RATED' THEN Date_Time END) AS Accellos_Rating
        ,MAX(CASE WHEN Action LIKE 'SHIPPED' THEN Date_Time END) AS Accellos_Shipped
        ,MAX(CASE WHEN Action LIKE 'UPLOAD' THEN Date_Time END) AS Accellos_Upload
    FROM (
        SELECT DISTINCT
            Packslip
            ,Date_Time
            ,Action
        FROM [JMDNJ-ACCELSQL].[A1Warehouse].[dbo].[RF_LOG2] RL (nolock)
    ) RLTS
    GROUP BY Packslip
) RLTSS ON Coalesce(sh.PACKSLIP, P.PickTicket_Number) = RLTSS.PACKSLIP
Here is a sample of the RF_LOG2 table:
+--------------------------------------+----------+----------+---------------------------------------------------------------------------------------------------------+--------+----------+----------+----------+----------+-----------+---------------------+----------------------+----------------------+--------------------+------------+----------+--------+--------+----------+------------------+------------+----------+----------+
| ROWID | PACKSLIP | BINLABEL | EXTENDED | TERMID | USERID | ACTION | QUANTITY | Q_SCALER | TOTLABEL | REFERENCE2 | REFERENCE3 | DATE_TIME | DATE_CREAT | CLIENTNAME | TENANTID | PO_NUM | SERIAL | LOCATION | LICENSE_PLATE | PURGE_FLAG | PACKSIZE | UPLOADED |
+--------------------------------------+----------+----------+---------------------------------------------------------------------------------------------------------+--------+----------+----------+----------+----------+-----------+---------------------+----------------------+----------------------+--------------------+------------+----------+--------+--------+----------+------------------+------------+----------+----------+
| BC5A92B0-F347-4E27-80C5-49798E1B6B75 | 90214801 | PICK | | 0 | | DNLOAD | 0.000000 | 0 | | E. Keith DuBose | l:1 u:1 | 20190726 13:15:29.87 | 0x00000000207E9F1E | 09 | | | | | | 1 | 1.000000 | 0 |
| 3564B24F-1AA9-42A4-83A4-D14151395CED | 90214801 | | | 0 | jsac | ALLOCORD | 0.000000 | 0 | | Allocated | READY TO WAVE | 20190726 13:25:54.51 | 0x00000000207E4672 | 09 | | | | | | 1 | 1.000000 | 0 |
| 0E5B3952-2BD4-4035-A645-1C024B8D3F10 | 90214801 | | | 0 | jsac | ALLOC | 0.000000 | 0 | | Release SWOG | | 20190726 13:25:54.54 | 0x00000000207F14C6 | 09 | | | | | | 1 | 1.000000 | 0 |
| 09575559-EB27-4CDB-8B35-56F741F779E1 | 90214801 | | | 0 | jsac | WAVEORDR | 0.000000 | 0 | | Wave:2392 | RF Picking | 20190726 15:05:31.71 | 0x00000000207EFE60 | 09 | | | | | | 1 | 1.000000 | 0 |
| 61B21B11-D638-4AA2-A94A-25B54650EBAD | 90214801 | | | 0 | | EOL_PRNT | 0.000000 | 0 | | New Carton | 00008139850296299650 | 20190726 15:06:03.79 | 0x00000000207E5A7D | 09 | | | | | | 1 | 1.000000 | 0 |
| 7B46FD91-A30D-4D92-A9E9-6024630D2710 | 90214801 | | | 0 | RFBASE | NEWLABEL | 0.000000 | 0 | 109629965 | 029629965 | | 20190726 15:06:03.80 | 0x00000000207E480E | 09 | | | | | | 1 | 1.000000 | 0 |
| 042D7D42-1D08-4926-AF5B-005868924302 | 90214801 | 3F88082A | 910B2307NSZ99000 /09 | 0 | LSAB | PICK_LP | 1.000000 | 1 | 109629965 | LP picking | | 20190726 15:55:58.92 | 0x00000000207F04F4 | 09 | | | | | 910B2307NSZ99000 | 1 | 1.000000 | 0 |
| 21711DE4-6119-47C0-B3F0-1A0AB816A679 | 90214801 | 3F88082A | 910B2307NSZ99000 /09 | 0 | LSAB | MOVE-OUT | 1.000000 | -1 | | 1 Packs of 1.000000 | via PICKING | 20190726 15:55:58.94 | 0x00000000207E32CC | 09 | | | | | | 1 | 1.000000 | 0 |
| E0D5C819-DC3C-4E21-9857-25476432A057 | 90214801 | 3F88082A | 910B2307NSZ99000 /09 | 0 | LSAB | PICKDETL | 1.000000 | -1 | 109629965 | | | 20190726 15:55:58.95 | 0x00000000207E239A | 09 | | | | | | 1 | 1.000000 | 0 |
| 20D981C1-CE83-459F-9D7A-1784CC215856 | 90214801 | | | 0 | LSAB | EOL_LSCP | 0.000000 | 0 | | Last Pick In Carton | 00008139850296299650 | 20190726 15:55:58.97 | 0x00000000207E07FE | 09 | | | | | | 1 | 1.000000 | 0 |
| CDBCBD5B-9DC7-4FE5-91C9-7C409EA4C2D9 | 90214801 | | | 0 | LSAB | PICKORDR | 0.000000 | 0 | | | | 20190726 15:55:58.97 | 0x00000000207F1CEE | 09 | | | | | | 1 | 1.000000 | 0 |
| DD637317-640E-4A8D-A8DB-9C2C587BA217 | 90214801 | 3F88082A | 910B2307NSZ99000 /09 | 0 | LSAB | PICKLINE | 1.000000 | -1 | | 1 | | 20190726 15:55:58.97 | 0x00000000207E8F55 | 09 | | | | | | 1 | 1.000000 | 0 |
| EE4D734C-8CCE-4C73-B133-C024D79A6054 | 90214801 | | | 0 | LSAB | EOL_LSTP | 0.000000 | 0 | | LAST PICK COMPLETED | 2 | 20190726 15:55:58.97 | 0x00000000207F516E | 09 | | | | | | 1 | 1.000000 | 0 |
| 06204BC1-87B1-4340-9712-C8996388B550 | 90214801 | | | 0 | BACKGRND | RATED | 0.000000 | 0 | 109629965 | 109629965 ACT99 | SHP1563345 | 20190729 08:30:39.86 | 0x000000002089F080 | 09 | | | | | | 1 | 1.000000 | 0 |
| 48759371-8B78-4901-8BE4-749FA55E1D40 | 90214801 | | | 0 | BACKGRND | EOL_SSYS | 0.000000 | 0 | | ShipSys Confirm | | 20190729 08:30:39.89 | 0x0000000020896EF1 | 09 | | | | | | 1 | 1.000000 | 0 |
| 904BF8C6-794D-4288-A594-22BA93A31095 | 90214801 | | | 0 | BACKGRND | SHIPPED | 0.000000 | 0 | | USPS PM | SHP1563345 | 20190729 08:30:39.90 | 0x000000002087F9F3 | 09 | | | | | | 1 | 1.000000 | 0 |
| ECA102C8-B7C4-46D3-A844-FBD0CFE79413 | 90214801 | | | 0 | sdob | SUSPEND | 0.000000 | 0 | | | | 20190729 09:45:40.12 | 0x00000000208922D8 | 09 | | | | | | 1 | 1.000000 | 0 |
| 867A7B87-5AB2-4EE7-8FDC-7175D406C0F0 | 90214801 | | | 0 | sdob | UNSUSPND | 0.000000 | 0 | | | | 20190729 10:07:56.88 | 0x00000000208A0AF5 | 09 | | | | | | 1 | 1.000000 | 0 |
| E5FB157B-9837-4DA8-B5D0-9A605603FD60 | 90214801 | | | 0 | sdob | SHIPCOMP | 0.000000 | 0 | | | ship_order() | 20190729 11:42:20.30 | 0x000000002089D1FD | 09 | | | | | | 1 | 1.000000 | 0 |
| 37D4B782-1184-4F91-913B-F1BA251740DF | 90214801 | | | 0 | sdob | SHIPPED | 0.000000 | 0 | | USPS PM | SHP1563345 | 20190729 11:42:20.32 | 0x000000002088482F | 09 | | | | | | 1 | 1.000000 | 0 |
| 4FDE75F7-D98B-451E-A106-0C9F29BADEE1 | 90214801 | | | 0 | sdob | EOL_EXTN | 0.000000 | 0 | | External Process | | 20190729 11:42:20.33 | 0x0000000020897E4A | 09 | | | | | | 1 | 1.000000 | 0 |
| C41D73C8-385A-4547-A684-7CEA1B7CE9DB | 90214801 | PICK | | 0 | C# | UPLOAD | 0.000000 | 0 | | E. Keith DuBose | | 20190729 11:43:45.66 | 0x00000000208A1D2F | | | | | | | 1 | 1.000000 | 0 |
+--------------------------------------+----------+----------+---------------------------------------------------------------------------------------------------------+--------+----------+----------+----------+----------+-----------+---------------------+----------------------+----------------------+--------------------+------------+----------+--------+--------+----------+------------------+------------+----------+----------+
What I am trying to do is get the timestamp for each stage of the order: when it was created, when it was picked, and so on. I want this information displayed horizontally, one row per order. Also, the query only fails after running for about five minutes.

The warning "Null value is eliminated by an aggregate or other SET operation" happens because some of the Date_Time values you feed into max() are NULL: each CASE expression returns NULL for every row whose Action does not match.
For the error, I'm afraid the cause is Coalesce(sh.PACKSLIP, P.pickticket_number). You should check the types of the two columns and convert one of them to match the other. COALESCE returns the data type with the highest precedence among its arguments, so if pickticket_number is numeric and PACKSLIP is nvarchar, every PACKSLIP value gets implicitly converted to numeric, and any non-numeric string makes the whole query fail. From the table you attached the values look numeric, but the column may still be stored as nvarchar.
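For example, a minimal sketch of the fix, assuming sh.PACKSLIP is nvarchar and P.pickticket_number is numeric (the nvarchar(50) length and numeric(18, 0) precision below are assumptions; match them to your schema):
-- Compare as strings so COALESCE does not promote the join key to numeric:
ON Coalesce(sh.PACKSLIP, CAST(P.PickTicket_Number AS nvarchar(50))) = RLTSS.PACKSLIP

-- To hunt down the non-numeric values (SQL Server 2012+):
SELECT PACKSLIP
FROM [JMDNJ-ACCELSQL].[A1WAREHOUSE].[dbo].SHIPHIST
WHERE PACKSLIP IS NOT NULL
  AND TRY_CONVERT(numeric(18, 0), PACKSLIP) IS NULL;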

To avoid the warning, you can use the option below.
SET ANSI_WARNINGS OFF
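Keep the scope of that setting small, though; some features (filtered indexes, indexed views, indexes on computed columns) require ANSI_WARNINGS to be ON, so a common pattern is:
SET ANSI_WARNINGS OFF;
-- ... run the reporting query here ...
SET ANSI_WARNINGS ON;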

Related

How can I add a count to rank null values in SQL Hive?

This is what I have right now:
| time | car_id | order | in_order |
|-------|--------|-------|----------|
| 12:31 | 32 | null | 0 |
| 12:33 | 32 | null | 0 |
| 12:35 | 32 | null | 0 |
| 12:37 | 32 | 123 | 1 |
| 12:38 | 32 | 123 | 1 |
| 12:39 | 32 | 123 | 1 |
| 12:41 | 32 | 123 | 1 |
| 12:43 | 32 | 123 | 1 |
| 12:45 | 32 | null | 0 |
| 12:47 | 32 | null | 0 |
| 12:49 | 32 | 321 | 1 |
| 12:51 | 32 | 321 | 1 |
I'm trying to rank orders, including those that have null values, in this case by car_id.
This is the result I'm looking for:
| time | car_id | order | in_order | row |
|-------|--------|-------|----------|-----|
| 12:31 | 32 | null | 0 | 1 |
| 12:33 | 32 | null | 0 | 1 |
| 12:35 | 32 | null | 0 | 1 |
| 12:37 | 32 | 123 | 1 | 2 |
| 12:38 | 32 | 123 | 1 | 2 |
| 12:39 | 32 | 123 | 1 | 2 |
| 12:41 | 32 | 123 | 1 | 2 |
| 12:43 | 32 | 123 | 1 | 2 |
| 12:45 | 32 | null | 0 | 3 |
| 12:47 | 32 | null | 0 | 3 |
| 12:49 | 32 | 321 | 1 | 4 |
| 12:51 | 32 | 321 | 1 | 4 |
I just don't know how to manage a count for the null values.
Thanks!
You can count the number of non-NULL values before each row and then use dense_rank():
select t.*,
       dense_rank() over (partition by car_id order by grp) as `row`
from (select t.*,
             count(`order`) over (partition by car_id order by time) as grp
      from t
     ) t;
-- `order` and `row` are reserved words in Hive, hence the backticks.
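Note that count(`order`) increments on every non-NULL row, so with the sample data grp becomes 0,0,0,1,2,3,4,5,5,5,6,7 and the five 123 rows would each get their own rank. A variant that yields exactly the expected output starts a new group whenever the value changes, sketched here with Hive's null-safe comparison <=> (the alias prev_order is just illustrative):
select t.*,
       dense_rank() over (partition by car_id order by grp) as `row`
from (select t.*,
             -- start a new group whenever `order` differs from the previous row
             sum(case when `order` <=> prev_order then 0 else 1 end)
                 over (partition by car_id order by time) as grp
      from (select t.*,
                   lag(`order`) over (partition by car_id order by time) as prev_order
            from t
           ) t
     ) t;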

Reformat an existing data table into single line entries?

I would like to get all entries from an existing SQL Server table for the same job_number field on a single row. For example, I want to take the following table:
+--------------+------------+-------------------+-------------+------------+--------------+
| Company_Code | Job_Number | User_Def_Sequence | Alpha_Field | Date_Field | Amount_Field |
+--------------+------------+-------------------+-------------+------------+--------------+
| ABC | 02-0294-00 | 000001 | | NULL | 0.000000 |
| ABC | 02-0294-00 | 000003 | | NULL | 0.000000 |
| ABC | 02-0294-00 | 000006 | | NULL | 0.000000 |
| ABC | 02-0418-00 | 000001 | | NULL | 0.000000 |
| ABC | 02-0418-00 | 000002 | | NULL | 0.000000 |
| ABC | 02-0418-00 | 000003 | 15-02-0065 | NULL | 0.000000 |
| ABC | 02-0424-00 | 000003 | 15-02-0095 | NULL | 0.000000 |
| ABC | 02-0431-00 | 000003 | 15-02-0095 | NULL | 0.000000 |
| ABC | 02-0435-00 | 000003 | 15-02-0102 | NULL | 0.000000 |
+--------------+------------+-------------------+-------------+------------+--------------+
and convert it into something like this:
+--------------+------------+--------+------------+--------+----------+--------+---------+--------+----------+--------+------------+--------+----------+
| Company_Code | Job_Number | UDS_1 | Alpha_1 | Date_1 | Amount_1 | UDS_2 | Alpha_2 | Date_2 | Amount_2 | UDS_3 | Alpha_3 | Date_3 | Amount_3 |
+--------------+------------+--------+------------+--------+----------+--------+---------+--------+----------+--------+------------+--------+----------+
| ABC | 02-0294-00 | 000001 | | NULL | 0.000000 | 000003 | | NULL | 0.000000 | 000006 | | NULL | 0.000000 |
| ABC | 02-0418-00 | 000001 | | NULL | 0.000000 | 000002 | | NULL | 0.000000 | 000003 | 15-02-0065 | NULL | 0.000000 |
| ABC | 02-0424-00 | 000003 | 15-02-0065 | NULL | 0.000000 | | | | | | | | |
| ABC | 02-0431-00 | 000003 | 15-02-0095 | NULL | 0.000000 | | | | | | | | |
| ABC | 02-0435-00 | 000003 | 15-02-0102 | NULL | 0.000000 | | | | | | | | |
+--------------+------------+--------+------------+--------+----------+--------+---------+--------+----------+--------+------------+--------+----------+
Edit: Each job number may have multiple User_Def_Sequence variations, which need to be appended to the same row along with their values.
What is the best method of accomplishing this?
I think you want aggregation:
select Company_Code, Job_Number,
       min(User_Def_Sequence),
       min(Alpha_Field),
       min(Date_Field),
       min(Amount_Field),
       . . .
from t
group by Company_Code, Job_Number;
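min() alone returns a single set of values per job, though. If each job's sequences must land in separate column sets (UDS_1, Alpha_1, ..., UDS_2, ...), a common pattern is to number the rows per job and pivot with conditional aggregation. A sketch, assuming at most three sequences per job (extend the CASE lines for more) and the table name t from above:
with numbered as (
    select Company_Code, Job_Number, User_Def_Sequence,
           Alpha_Field, Date_Field, Amount_Field,
           -- number each job's rows in sequence order
           row_number() over (partition by Company_Code, Job_Number
                              order by User_Def_Sequence) as seq
    from t
)
select Company_Code, Job_Number,
       max(case when seq = 1 then User_Def_Sequence end) as UDS_1,
       max(case when seq = 1 then Alpha_Field end) as Alpha_1,
       max(case when seq = 1 then Date_Field end) as Date_1,
       max(case when seq = 1 then Amount_Field end) as Amount_1,
       max(case when seq = 2 then User_Def_Sequence end) as UDS_2,
       max(case when seq = 2 then Alpha_Field end) as Alpha_2,
       max(case when seq = 2 then Date_Field end) as Date_2,
       max(case when seq = 2 then Amount_Field end) as Amount_2,
       max(case when seq = 3 then User_Def_Sequence end) as UDS_3,
       max(case when seq = 3 then Alpha_Field end) as Alpha_3,
       max(case when seq = 3 then Date_Field end) as Date_3,
       max(case when seq = 3 then Amount_Field end) as Amount_3
from numbered
group by Company_Code, Job_Number;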

How to apply status on an account whose running total reaches zero

Below is sample data that I am trying to manipulate.
+----------------+------------------+---------+------+--------------+-------------+--+--+
| ACCOUNT_NUMBER | TRANSACTION_DATE | bal | Row# | RunningTotal | status | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 31/03/2015 | 82.61 | 4 | 82.61 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 31/03/2015 | 85.25 | 5 | 167.86 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 31/03/2015 | 93.61 | 6 | 261.47 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 30/04/2015 | 78.95 | 7 | 340.42 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 22/05/2015 | -62.04 | 8 | 278.38 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 31/05/2015 | 98.95 | 9 | 377.33 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 30/06/2015 | 79.5 | 10 | 456.83 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 15/07/2015 | -345.76 | 11 | 111.07 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 12/05/2016 | -111.07 | 12 | 0 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 31/03/2015 | 2.5 | 13 | 2.5 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 31/03/2015 | 2.5 | 14 | 5 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 31/03/2015 | 2.5 | 15 | 7.5 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 30/04/2015 | 2.5 | 16 | 10 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 31/05/2015 | 2.5 | 17 | 12.5 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 30/06/2015 | 0.67 | 18 | 13.17 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 30/07/2015 | -0.81 | 19 | 12.36 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 31/05/2018 | 5.08 | 20 | 17.44 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 30/11/2018 | 1.02 | 21 | 18.46 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 31/05/2019 | 1.48 | 22 | 19.94 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 31/03/2015 | 8.38 | 23 | 8.38 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 31/03/2015 | 10.65 | 24 | 19.03 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 31/03/2015 | 25.07 | 25 | 44.1 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 30/04/2015 | 12.21 | 26 | 56.31 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 30/04/2015 | -20 | 27 | 36.31 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 20/05/2015 | -36.31 | 28 | 0 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 31/05/2015 | -3.69 | 29 | -3.69 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 31/05/2015 | 13.17 | 30 | 9.48 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 30/06/2015 | 9 | 31 | 18.48 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 25/07/2015 | -18.48 | 32 | 0 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
Below is the script used to apply the status to the invoices. Essentially I want to be able to determine whether a certain account has knocked off all its invoices. I have two conditions:
If the SUM of the balance equals zero, then apply CLEARED.
The second option, which is what I am trying to figure out, is this: if the final sum of the running total isn't zero, but the running total reaches zero at some point, then mark all the invoices up to that point as cleared.
select *,
       (CASE WHEN sum(bal) OVER (PARTITION BY ACCOUNT_NUMBER) = 0 THEN 'CLEARED'
             WHEN sum(bal) OVER (PARTITION BY ACCOUNT_NUMBER ORDER BY Row#, TRANSACTION_DATE) = 0 THEN 'CLEARED'
             ELSE 'NOT_CLEARED'
        END) as status
from #running_totals
order by Row#, TRANSACTION_DATE
Can someone assist me with how to apply this?
Expected Results
+----------------+------------------+---------+------+--------------+-------------+--+--+
| ACCOUNT_NUMBER | TRANSACTION_DATE | bal | Row# | RunningTotal | status | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 31/03/2015 | 82.61 | 4 | 82.61 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 31/03/2015 | 85.25 | 5 | 167.86 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 31/03/2015 | 93.61 | 6 | 261.47 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 30/04/2015 | 78.95 | 7 | 340.42 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 22/05/2015 | -62.04 | 8 | 278.38 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 31/05/2015 | 98.95 | 9 | 377.33 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 30/06/2015 | 79.5 | 10 | 456.83 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 15/07/2015 | -345.76 | 11 | 111.07 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 155 | 12/05/2016 | -111.07 | 12 | 0 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 31/03/2015 | 2.5 | 13 | 2.5 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 31/03/2015 | 2.5 | 14 | 5 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 31/03/2015 | 2.5 | 15 | 7.5 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 30/04/2015 | 2.5 | 16 | 10 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 31/05/2015 | 2.5 | 17 | 12.5 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 30/06/2015 | 0.67 | 18 | 13.17 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 30/07/2015 | -0.81 | 19 | 12.36 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 31/05/2018 | 5.08 | 20 | 17.44 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 30/11/2018 | 1.02 | 21 | 18.46 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 953 | 31/05/2019 | 1.48 | 22 | 19.94 | NOT_CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 31/03/2015 | 8.38 | 23 | 8.38 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 31/03/2015 | 10.65 | 24 | 19.03 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 31/03/2015 | 25.07 | 25 | 44.1 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 30/04/2015 | 12.21 | 26 | 56.31 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 30/04/2015 | -20 | 27 | 36.31 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 20/05/2015 | -36.31 | 28 | 0 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 31/05/2015 | -3.69 | 29 | -3.69 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 31/05/2015 | 13.17 | 30 | 9.48 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 30/06/2015 | 9 | 31 | 18.48 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
| 961 | 25/07/2015 | -18.48 | 32 | 0 | CLEARED | | |
+----------------+------------------+---------+------+--------------+-------------+--+--+
Give this a try. Given the table of data you provided, I assume you will already have the RowNum column. You can probably improve on this, but it is a start.
;WITH CTE AS (
    select distinct t1.ACCOUNT_NUMBER AN3, t1.RowNum RN3  -- distinct, so accounts that hit zero more than once aren't duplicated
    from #temp t1
    CROSS APPLY (select ACCOUNT_NUMBER AN2, RowNum RN2
                 from #temp
                 where RunningTotal = 0) t2
    where t1.ACCOUNT_NUMBER = t2.AN2 and t1.RowNum <= t2.RN2
)
select t1.ACCOUNT_NUMBER, t1.TRANSACTION_DATE, t1.bal, t1.RowNum, t1.RunningTotal,
       CASE WHEN t2.AN3 IS NOT NULL THEN 'CLEARED'
            ELSE 'NOT CLEARED' END Status
from #temp t1
LEFT JOIN CTE t2 on t1.ACCOUNT_NUMBER = t2.AN3 and t1.RowNum = t2.RN3
Try the following logic:
SELECT *,
       CASE
           WHEN (SELECT MIN(ACCOUNT_NUMBER) FROM your_table) = ACCOUNT_NUMBER THEN 'CLEARED'
           -- I have considered the MIN ACC_NUMBER as per your data,
           -- but you can also use a fixed ACC_NUMBER if required, like
           -- WHEN 199 = ACCOUNT_NUMBER THEN 'CLEARED'
           WHEN SUM(bal) OVER (
                    PARTITION BY ACCOUNT_NUMBER ORDER BY ACCOUNT_NUMBER
                    ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
                ) = 0 THEN 'CLEARED'
           ELSE 'NOT_CLEARED'
       END Status
FROM your_table
I think you can do this just by comparing, per account, the row number of the last row with status cleared and balance 0 to that of the last row with balance 0:
select rt.*,
       (case when sum(bal) over (partition by account_number) = 0
             then 'CLEARED'
             when max(case when status = 'CLEARED' and bal = 0 then Row# end) over (partition by account_number) =
                  max(case when bal = 0 then Row# end) over (partition by account_number)
             then 'CLEARED'
             else 'NOT_CLEARED'
        end) as status
from #running_totals rt
order by Row#, TRANSACTION_DATE
The first branch clears every row of an account whose balances sum to zero overall; the second clears accounts whose last zero-balance row was already marked cleared.

SQL Performance multiple exclusion from the same table

I have a table containing a list of people; let's say I have 100 people listed in it.
I need to filter the people using different criteria and put them in groups. The problem is that when I start excluding at the 4th or 5th level, performance issues come up and the query becomes slow.
with lst_tous_movements as (
    select
        t1.refid_eClinibase
        ,t1.[dthrfinmouvement]
        ,t1.[unite_service_id]
        ,t1.[unite_service_suiv_id]
    from sometable t1
)
,lst_patients_hospitalisés as (
    select distinct
        t1.refid_eClinibase
    from lst_tous_movements t1
    where
        t1.[dthrfinmouvement] = '4000-01-01'
)
,lst_patients_admisUIB_transferes as (
    select distinct
        t1.refid_eClinibase
    from lst_tous_movements t1
    left join lst_patients_hospitalisés t2 on t1.refid_eClinibase = t2.refid_eClinibase
    where
        t1.[unite_service_id] = 4
        and t1.[unite_service_suiv_id] <> 0
        and t2.refid_eClinibase is null
)
,lst_patients_admisUIB_nonTransferes as (
    select distinct
        t1.refid_eClinibase
    from lst_tous_movements t1
    left join lst_patients_admisUIB_transferes t2 on t1.refid_eClinibase = t2.refid_eClinibase
    left join lst_patients_hospitalisés t3 on t1.refid_eClinibase = t3.refid_eClinibase
    where
        t1.[unite_service_id] = 4
        and t1.[unite_service_suiv_id] = 0
        and t2.refid_eClinibase is null
        and t3.refid_eClinibase is null
)
,lst_patients_autres as (
    select distinct
        t1.refid_eClinibase
    from lst_patients t1
    left join lst_patients_admisUIB_transferes t2 on t1.refid_eClinibase = t2.refid_eClinibase
    left join lst_patients_hospitalisés t3 on t1.refid_eClinibase = t3.refid_eClinibase
    left join lst_patients_admisUIB_nonTransferes t4 on t1.refid_eClinibase = t4.refid_eClinibase
    where
        t2.refid_eClinibase is null
        and t3.refid_eClinibase is null
        and t4.refid_eClinibase is null
)
As you can see, I have multi-level filtering going on here:
1st, I get the people where t1.[dthrfinmouvement] = '4000-01-01'.
2nd, I get the people matching another criterion, EXCLUDING the 1st group.
3rd, I get the people matching yet another criterion, EXCLUDING the 1st and the 2nd groups.
etc.
When I get to the 4th level, my query takes 6-10 seconds to complete.
Is there any way to speed this up?
This is the dataset I'm working with:
+------------------+-------------------------------+------------------+------------------+-----------------------+
| refid_eClinibase | nodossierpermanent_eClinibase | dthrfinmouvement | unite_service_id | unite_service_suiv_id |
+------------------+-------------------------------+------------------+------------------+-----------------------+
| 25611 | P0017379 | 2013-04-27 | 58 | 0 |
| 25611 | P0017379 | 2013-05-02 | 4 | 2 |
| 25611 | P0017379 | 2013-05-18 | 2 | 0 |
| 85886 | P0077918 | 2013-04-10 | 58 | 0 |
| 85886 | P0077918 | 2013-05-06 | 6 | 12 |
| 85886 | P0077918 | 4000-01-01 | 12 | 0 |
| 91312 | P0083352 | 2013-07-24 | 3 | 14 |
| 91312 | P0083352 | 2013-07-24 | 14 | 3 |
| 91312 | P0083352 | 2013-07-30 | 3 | 8 |
| 91312 | P0083352 | 4000-01-01 | 8 | 0 |
| 93835 | P0085879 | 2013-04-30 | 58 | 0 |
| 93835 | P0085879 | 2013-05-07 | 4 | 2 |
| 93835 | P0085879 | 2013-05-16 | 2 | 0 |
| 93835 | P0085879 | 2013-05-22 | 58 | 0 |
| 93835 | P0085879 | 2013-05-24 | 4 | 0 |
| 93835 | P0085879 | 2013-05-31 | 58 | 0 |
| 93836 | P0085880 | 2013-05-20 | 58 | 0 |
| 93836 | P0085880 | 2013-05-22 | 4 | 2 |
| 93836 | P0085880 | 2013-05-31 | 2 | 0 |
| 97509 | P0089576 | 2013-04-09 | 58 | 0 |
| 97509 | P0089576 | 2013-04-11 | 4 | 0 |
| 102787 | P0094886 | 2013-04-08 | 58 | 0 |
| 102787 | P0094886 | 2013-04-11 | 4 | 2 |
| 102787 | P0094886 | 2013-05-21 | 2 | 0 |
| 103029 | P0095128 | 2013-04-04 | 58 | 0 |
| 103029 | P0095128 | 2013-04-10 | 4 | 1 |
| 103029 | P0095128 | 2013-05-03 | 1 | 0 |
| 103813 | P0095922 | 2013-07-02 | 58 | 0 |
| 103813 | P0095922 | 2013-07-03 | 4 | 6 |
| 103813 | P0095922 | 2013-08-14 | 6 | 0 |
| 105106 | P0097215 | 2013-08-09 | 58 | 0 |
| 105106 | P0097215 | 2013-08-13 | 4 | 0 |
| 105106 | P0097215 | 2013-08-14 | 58 | 0 |
| 105106 | P0097215 | 4000-01-01 | 4 | 0 |
| 106223 | P0098332 | 2013-06-11 | 1 | 0 |
| 106223 | P0098332 | 2013-08-01 | 58 | 0 |
| 106223 | P0098332 | 4000-01-01 | 1 | 0 |
| 106245 | P0098354 | 2013-04-02 | 58 | 0 |
| 106245 | P0098354 | 2013-05-24 | 58 | 0 |
| 106245 | P0098354 | 2013-05-29 | 4 | 1 |
| 106245 | P0098354 | 2013-07-12 | 1 | 0 |
| 106280 | P0098389 | 2013-04-07 | 58 | 0 |
| 106280 | P0098389 | 2013-04-09 | 4 | 0 |
| 106416 | P0098525 | 2013-04-19 | 58 | 0 |
| 106416 | P0098525 | 2013-04-23 | 4 | 0 |
| 106444 | P0098553 | 2013-04-22 | 58 | 0 |
| 106444 | P0098553 | 2013-04-25 | 4 | 0 |
| 106609 | P0098718 | 2013-05-08 | 58 | 0 |
| 106609 | P0098718 | 2013-05-10 | 4 | 11 |
| 106609 | P0098718 | 2013-07-24 | 11 | 12 |
| 106609 | P0098718 | 4000-01-01 | 12 | 0 |
| 106616 | P0098725 | 2013-05-09 | 58 | 0 |
| 106616 | P0098725 | 2013-05-09 | 4 | 1 |
| 106616 | P0098725 | 2013-07-27 | 1 | 0 |
| 106698 | P0098807 | 2013-05-16 | 58 | 0 |
| 106698 | P0098807 | 2013-05-22 | 4 | 6 |
| 106698 | P0098807 | 2013-06-14 | 6 | 1 |
| 106698 | P0098807 | 2013-06-28 | 1 | 0 |
| 106714 | P0098823 | 2013-05-20 | 58 | 0 |
| 106714 | P0098823 | 2013-05-21 | 58 | 0 |
| 106714 | P0098823 | 2013-05-24 | 58 | 0 |
| 106729 | P0098838 | 2013-05-21 | 58 | 0 |
| 106729 | P0098838 | 2013-05-23 | 4 | 1 |
| 106729 | P0098838 | 2013-06-03 | 1 | 0 |
| 107038 | P0099147 | 2013-06-25 | 58 | 0 |
| 107038 | P0099147 | 2013-06-28 | 4 | 1 |
| 107038 | P0099147 | 2013-07-04 | 1 | 0 |
| 107038 | P0099147 | 2013-08-13 | 58 | 0 |
| 107038 | P0099147 | 2013-08-15 | 4 | 6 |
| 107038 | P0099147 | 4000-01-01 | 6 | 0 |
| 107082 | P0099191 | 2013-06-29 | 58 | 0 |
| 107082 | P0099191 | 2013-07-04 | 4 | 6 |
| 107082 | P0099191 | 2013-07-19 | 6 | 0 |
| 107157 | P0099267 | 4000-01-01 | 13 | 0 |
| 107336 | P0099446 | 4000-01-01 | 6 | 0 |
+------------------+-------------------------------+------------------+------------------+-----------------------+
Thanks.
It is hard to understand exactly what all your rules are from the question, but the general approach should be to add a "grouping" column in a single query that uses a CASE expression to categorize the people.
The conditions in a CASE are evaluated in order, so if the first criterion is met, the subsequent criteria are not even evaluated for that row.
Here is some code to get you started:
select t1.refid_eClinibase
       ,t1.[dthrfinmouvement]
       ,t1.[unite_service_id]
       ,t1.[unite_service_suiv_id]
       ,CASE WHEN [dthrfinmouvement] = '4000-01-01' THEN 'Group1 Label'
             WHEN condition2 = something THEN 'Group2 Label'
             ....
             WHEN conditionN = something THEN 'GroupN Label'
             ELSE 'Catch All Label'
        END as person_category
from sometable t1
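For the specific groups in the question, the CASE can be combined with conditional aggregation so each person gets exactly one label, decided by the first matching rule, in a single scan of the table (which is where the speed-up comes from). A sketch, assuming one label per refid_eClinibase; the group labels are placeholders:
select refid_eClinibase,
       case when max(case when dthrfinmouvement = '4000-01-01' then 1 else 0 end) = 1
                 then 'hospitalisé'                  -- still admitted
            when max(case when unite_service_id = 4
                           and unite_service_suiv_id <> 0 then 1 else 0 end) = 1
                 then 'admis UIB, transféré'         -- admitted to unit 4, then transferred
            when max(case when unite_service_id = 4
                           and unite_service_suiv_id = 0 then 1 else 0 end) = 1
                 then 'admis UIB, non transféré'     -- admitted to unit 4, not transferred
            else 'autres'
       end as person_category
from sometable
group by refid_eClinibase;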

How can I do this in SQL in a Single Statement?

I have the following MySQL table:
+---------+------------+------+--------+------+---------+------------+-------+---------+----------+------------+------------+
| Version | Yr_Varient | FY | Period | CoA | Company | Item | Mvt | Ptnr_Co | Investee | GC | LC |
+---------+------------+------+--------+------+---------+------------+-------+---------+----------+------------+------------+
| 201 | 1 | 2010 | 1 | 11 | 23 | 1110105000 | 60200 | | | 450000 | 450000 |
| 201 | 1 | 2010 | 1 | 11 | 23 | 2110300000 | 60200 | | | -520000 | -520000 |
| 201 | 1 | 2010 | 1 | 11 | 23 | 1220221600 | | | | 78080 | 78080 |
| 201 | 1 | 2010 | 1 | 11 | 23 | 2130323000 | | | | 50000 | 50000 |
| 201 | 1 | 2010 | 1 | 11 | 23 | 2130322000 | | | | -58080 | -58080 |
| 201 | 1 | 2010 | 1 | 11 | 23 | 3100505000 | | | | -275000 | -275000 |
| 201 | 1 | 2010 | 1 | 11 | 23 | 3200652500 | | | | 216920 | 216920 |
| 201 | 1 | 2010 | 1 | 11 | 23 | 3900000000 | | | | 58080 | 58080 |
| 201 | 1 | 2010 | 1 | 11 | 26 | 1110105000 | 60200 | | | 376000 | 376000 |
| 201 | 1 | 2010 | 1 | 11 | 26 | 2110300000 | 60200 | | | -545000 | -545000 |
| 201 | 1 | 2010 | 1 | 11 | 26 | 1220221600 | | | | 452250 | 452250 |
| 201 | 1 | 2010 | 1 | 11 | 26 | 2130323000 | | | | -165000 | -165000 |
| 201 | 1 | 2010 | 1 | 11 | 26 | 2130322000 | | | | -118250 | -118250 |
| 201 | 1 | 2010 | 1 | 11 | 26 | 3100505000 | | | | -937750 | -937750 |
| 201 | 1 | 2010 | 1 | 11 | 26 | 3200652500 | | | | 819500 | 819500 |
| 201 | 1 | 2010 | 1 | 11 | 26 | 3900000000 | | | | 118250 | 118250 |
| 201 | 1 | 2010 | 1 | 11 | 37 | 1110105000 | 60200 | | | 777000 | 777000 |
| 201 | 1 | 2010 | 1 | 11 | 37 | 2110308000 | 60200 | 43 | | -255000 | -255000 |
| 201 | 1 | 2010 | 1 | 11 | 37 | 2130321500 | | | | 180000 | 180000 |
| 201 | 1 | 2010 | 1 | 11 | 37 | 2130322000 | | | | -77000 | -77000 |
| 201 | 1 | 2010 | 1 | 11 | 37 | 2310407001 | | 1 | | -625000 | -625000 |
| 201 | 1 | 2010 | 1 | 11 | 37 | 3100505000 | | | | -2502500 | -2502500 |
| 201 | 1 | 2010 | 1 | 11 | 37 | 3200652500 | | | | 2425500 | 2425500 |
| 201 | 1 | 2010 | 1 | 11 | 37 | 3900000000 | | | | 77000 | 77000 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 1110105000 | 60200 | | | 2600000 | 2600000 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 1140161000 | 60200 | | 23 | 430000 | 430000 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 1140161000 | 60200 | | 26 | 505556 | 505556 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 1140160000 | 60200 | 37 | | 255000 | 255000 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 1160163000 | 60200 | 99999 | 48 | 49428895 | 49428895 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 1160163000 | 60200 | 99999 | 49 | 188260175 | 188260175 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 2310405500 | | | | -237689070 | -237689070 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 2110300000 | 60200 | | | -1000 | -1000 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 2110300500 | 60200 | | | -3999000 | -3999000 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 1220221600 | | | | 1571112 | 1571112 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 2130321500 | | | | -805556 | -805556 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 2130322000 | | | | -556112 | -556112 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 3100505000 | | | | -836000 | -836000 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 3200652500 | | | | 781000 | 781000 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 3300715700 | | 99999 | 32 | -440000 | -440000 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 3300715700 | | 99999 | 26 | -61112 | -61112 |
| 201 | 1 | 2010 | 1 | 11 | 43 | 3900000000 | | | | 556112 | 556112 |
+---------+------------+------+--------+------+---------+------------+-------+---------+----------+------------+------------+
I need to take all rows with Mvt = 60200, multiply the GC and LC values in those rows by 1.1, and add the changed rows back into the same table with FY set to 2011.
How can I do all this in one statement?
Is it even possible to do all this in one statement (I know very little about SQL)?
Can this be done in standard SQL, as the database will be ported to another database server? I don't know which server it will be.
In standard SQL (there may be better ways in vendor-specific implementations, but I tend to prefer standard stuff where possible):
insert into mytable (
    Version, Yr_Varient, Period, CoA, Company, Item, Mvt, Ptnr_Co, Investee,
    FY, GC, LC
)
select
    Version, Yr_Varient, Period, CoA, Company, Item, Mvt, Ptnr_Co, Investee,
    2011, GC * 1.1, LC * 1.1
from mytable
where Mvt = 60200
-- and FY = 2010
You may also want to limit your select statement a little more depending on the results of your testing, such as uncommenting the and FY = 2010 line above to stop copying all your 2009 and 2008 data as well, if any. I assume you only wanted to carry forward the previous year's rows with a 10% increase on GC and LC.
The way this works is to run the select, which produces the modified FY, GC and LC values as per your request, and pump all those rows back into the insert.
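Since the select half is an ordinary query, you can also run it on its own first as a dry run to preview exactly which rows would be inserted (mytable is the assumed table name from above):
-- Dry run: inspect the rows the INSERT would add before actually adding them.
select Version, Yr_Varient, 2011 as FY, Period, CoA, Company, Item, Mvt,
       Ptnr_Co, Investee, GC * 1.1 as GC, LC * 1.1 as LC
from mytable
where Mvt = 60200
  and FY = 2010;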
insert into mytable (
    Version, Yr_Varient, FY, Period, CoA, Company, Item, Mvt, Ptnr_Co, Investee, GC, LC
)
SELECT Version, Yr_Varient, 2011 as FY, Period, CoA, Company, Item, Mvt,
       Ptnr_Co, Investee, GC * 1.1 as GC, LC * 1.1 as LC
FROM <table Name>
WHERE Mvt = 60200
INSERT INTO _table_
    (Version, Yr_Varient, FY, Period, CoA, Company, Item, Mvt, Ptnr_Co, Investee, GC, LC)
SELECT
    Version, Yr_Varient, 2011, Period, CoA, Company, Item, Mvt, Ptnr_Co, Investee,
    GC * 1.1, LC * 1.1
FROM _table_
WHERE Mvt = 60200
  AND FY <> 2011
This statement should work in any SQL database.