I have a table with values pasted in, but they start out as varchar. I need to convert them to numeric, so I used
convert(decimal(10,3), cw.col7)
But this returns Error 8114: Error converting data type varchar to numeric. The reason I'm asking is that the same conversion does not give this error on a similar data set. Are there sometimes strange anomalies when using convert() or decimal()? Or should I maybe convert to float first?
The data:
col7
490.440
2
934
28,108.000
33,226.000
17,347.000
1,561.000
57
0
421.350
64
1,100.000
0
0
3,584
202.432
0
3,280
672.109
1,150
0
104
411.032
18,016
40
510,648
443,934.000
18,705
322,254
301
9,217
18,075
16,100
395
706,269
418,313
7,170
40,450
2,423
1,300
2,311
94,000.000
17,463
0
228
884
557
153
13
0
0
212.878
45,000.000
152
24,400
3,675
11,750
987
23,725
268,071
4,520.835
286,000
112,912.480
9,000
1,316
1,020
215,244
123,967
6,911
1,088.750
138,644
16,924
7,848
33,017
464,463
618
72,391
9,367
507,635.950
588,087
92,890
17,266
0
1,414,547
89,080
664
101,635
1,552,992
175
356
7,000
0
0
445
507,381
24,016
469,983
0
0
147,737
3,521
88,210
18,433.000
21,775
3,607
34,774
7,642
42,680
1,255
10,880
350,409.800
19,394.520
2,476,257.400
778.480
1,670.440
9,710
24,931.600
3,381.800
2,900
18,000
4,121
3,750
62,200
952
29.935
17.795
11.940
902
36,303
1,240
1,020
617
817
620
92,648
70,925
82,924
19,162.200
1,213.720
2,871
3,180
91,600
645
607
155,100
6
840
1,395
112
6,721
3,850
40
4,032
5,912
1,040
872
56
1,856
179
Try_Convert(money, ...) will handle the comma, while Try_Convert to decimal(10, 3) will return NULL for any value that contains one.
Example
Select col7
,AsMoney = Try_Convert(money,col7)
,AsDecimal = Try_Convert(decimal(10, 3),col7)
from YourTable
Returns NULL in the AsDecimal column for every value containing a comma (e.g. 28,108.000), while AsMoney converts all of the listed values successfully.
Try using CAST and removing the comma:
SELECT CAST(REPLACE(cw.col7, ',', '') AS DECIMAL(10,3))
from your_table
and, as suggested by Jhon Cappelleti, you may need more precision than DECIMAL(10, 3) allows, so you should use
SELECT CAST(REPLACE(cw.col7, ',', '') AS DECIMAL(12,4))
from your_table
Run this query:
select cw.col7
from cw
where try_convert(decimal(10, 3), cw.col7) is null;
This will show you the values that do not convert successfully. (If cw.col7 can be NULL, then add "and cw.col7 is not null" to make the output more meaningful.)
Related
my table looks like
id total avg test_no
1 445 89
2 434 85
3 378 75
4 421 84
I'm working on matillion-snowflake
I need my result to look like
id total avg test_no
1 445 89 1
2 434 85 1
3 378 75 1
4 421 84 1
Just use a Calculator component and set the value of the calculated column to 1
In Snowflake, you would modify the table using:
update t
set test_no = 1;
I assume that Matillion supports this as well.
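If the constant is only needed in the query output rather than persisted in the table, a plain SELECT works as well (a minimal sketch, assuming the table is named t):

```sql
-- Add a constant test_no column to the result set without modifying the table
select id, total, avg, 1 as test_no
from t;
```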
I have a situation where I have a table that has a single varchar(max) column, called dbo.JsonData. It has some number of rows, each containing a JSON string with a varying number of properties.
How can I create a query that will allow me to turn the result set from a select * query into a row/column result set?
Here is what I have tried:
SELECT *
FROM JSONDATA
FOR JSON Path
But it returns a single row of the json data all in a single column:
JSON_F52E2B61-18A1-11d1-B105-00805F49916B
[{"Json_Data":"{\"Serial_Number\":\"12345\",\"Gateway_Uptime\":17,\"Defrost_Cycles\":0,\"Freeze_Cycles\":2304,\"Float_Switch_Raw_ADC\":1328,\"Bin_status\":2304,\"Line_Voltage\":0,\"ADC_Evaporator_Temperature\":0,\"Mem_Sw\":1280,\"Freeze_Timer\":2560,\"Defrost_Timer\":593,\"Water_Flow_Switch\":3328,\"ADC_Mid_Temperature\":2560,\"ADC_Water_Temperature\":0,\"Ambient_Temperature\":71,\"Mid_Temperature\":1259,\"Water_Temperature\":1259,\"Evaporator_Temperature\":1259,\"Ambient_Temperature_Off_Board\":0,\"Ambient_Temperature_On_Board\":0,\"Gateway_Info\":\"{\\\"temp_sensor\\\":0.00,\\\"temp_pcb\\\":82.00,\\\"gw_uptime\\\":17.00,\\\"winc_fw\\\":\\\"19.5.4\\\",\\\"gw_fw_version\\\":\\\"0.0.0\\\",\\\"gw_fw_version_git\\\":\\\"2a75f20-dirty\\\",\\\"gw_sn\\\":\\\"328\\\",\\\"heap_free\\\":11264.00,\\\"gw_sig_csq\\\":0.00,\\\"gw_sig_quality\\\":0.00,\\\"wifi_sig_strength\\\":-63.00,\\\"wifi_resets\\\":0.00}\",\"ADC_Ambient_Temperature\":1120,\"Control_State\":\"Bin Full\",\"Compressor_Runtime\":134215680}"},{"Json_Data":"{\"Serial_Number\":\"12345\",\"Gateway_Uptime\":200,\"Defrost_Cycles\":559,\"Freeze_Cycles\":510,\"Float_Switch_Raw_ADC\":106,\"Bin_status\":0,\"Line_Voltage\":119,\"ADC_Evaporator_Temperature\":123,\"Mem_Sw\":113,\"Freeze_Timer\":0,\"Defrost_Timer\":66,\"Water_Flow_Switch\":3328,\"ADC_Mid_Temperature\":2560,\"ADC_Water_Temperature\":0,\"Ambient_Temperature\":71,\"Mid_Temperature\":1259,\"Water_Temperature\":1259,\"Evaporator_Temperature\":54,\"Ambient_Temperature_Off_Board\":0,\"Ambient_Temperature_On_Board\":0,\"Gateway_Info\":\"{\\\"temp_sensor\\\":0.00,\\\"temp_pcb\\\":82.00,\\\"gw_uptime\\\":199.00,\\\"winc_fw\\\":\\\"19.5.4\\\",\\\"gw_fw_version\\\":\\\"0.0.0\\\",\\\"gw_fw_version_git\\\":\\\"2a75f20-dirty\\\",\\\"gw_sn\\\":\\\"328\\\",\\\"heap_free\\\":10984.00,\\\"gw_sig_csq\\\":0.00,\\\"gw_sig_quality\\\":0.00,\\\"wifi_sig_strength\\\":-60.00,\\\"wifi_resets\\\":0.00}\",\"ADC_Ambient_Temperature\":1120,\"Control_State\":\"Defrost\",\"Compressor_Runtime
\":11304}"},{"Json_Data":"{\"Seri...
What am I missing?
I can't specify the columns explicitly because the json strings aren't always the same.
This what I expect:
Serial_Number Gateway_Uptime Defrost_Cycles Freeze_Cycles Float_Switch_Raw_ADC Bin_status Line_Voltage ADC_Evaporator_Temperature Mem_Sw Freeze_Timer Defrost_Timer Water_Flow_Switch ADC_Mid_Temperature ADC_Water_Temperature Ambient_Temperature Mid_Temperature Water_Temperature Evaporator_Temperature Ambient_Temperature_Off_Board Ambient_Temperature_On_Board ADC_Ambient_Temperature Control_State Compressor_Runtime temp_sensor temp_pcb gw_uptime winc_fw gw_fw_version gw_fw_version_git gw_sn heap_free gw_sig_csq gw_sig_quality wifi_sig_strength wifi_resets LastModifiedDateUTC Defrost_Cycle_time Freeze_Cycle_time
12345 251402 540 494 106 0 98 158 113 221 184 0 0 0 1259 1259 1259 33 0 0 0 Freeze 10833 0 78 251402 19.5.4 0.0.0 2a75f20-dirty 328.00000000 10976 0 0 -61 0 2018-03-20 11:15:28.000 0 0
12345 251702 540 494 106 0 98 178 113 517 184 0 0 0 1259 1259 1259 22 0 0 0 Freeze 10838 0 78 251702 19.5.4 0.0.0 2a75f20-dirty 328.00000000 10976 0 0 -62 0 2018-03-20 11:15:42.000 0 0
...
Thank you,
Ron
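One possible direction (a sketch, not a complete answer for fully dynamic columns): on SQL Server 2016+, OPENJSON can shred each stored JSON string. With its default schema it returns key/value pairs per row, which sidesteps hard-coding column names; the column name Json_Data below is an assumption taken from the output shown above:

```sql
-- Break each row's JSON string into key/value pairs.
-- Turning those pairs into named columns would still require
-- dynamic SQL, since the keys vary from row to row.
SELECT j.[key], j.[value]
FROM dbo.JsonData AS d
CROSS APPLY OPENJSON(d.Json_Data) AS j;
```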
I have a table called airports in a SQL Server database, with a column declared as nvarchar(255). I had to declare it as nvarchar otherwise SSIS failed to import the data from a .csv file generated by an API.
I have approx 25k records in this table, of which, from what I can tell, 763 have Unicode characters in them, found by running this query:
select cast(name as varchar), name
from airports
where cast(name as varchar) <> name
The first row shows the following two values returned in column 1 and 2
Harrisburg Capital City Airpor
Harrisburg Capital City Airport
The first value from column 1 has had the last t stripped off it, which I assume means there is one unicode character in the string. Please let me know if I am wrong, as I am a bit useless with unicode characters.
My question is: how can I find the unicode characters in the column, and is there a safe / recommended way to remove them?
I did try this to see if I could find it, but it didn't do what I thought it would do.
set nocount on
DECLARE @nstring NVARCHAR(100)
SET @nstring = (select name from airports where fs = 'HAR')
DECLARE @position INT
SET @position = 1
DECLARE @CharList TABLE (Position INT, UnicodeChar NVARCHAR(1), UnicodeValue INT)
WHILE @position <= DATALENGTH(@nstring)
BEGIN
    INSERT @CharList
    SELECT
        @position as Position,
        CONVERT(nchar(1), SUBSTRING(@nstring, @position, 1)) as UnicodeChar,
        UNICODE(SUBSTRING(@nstring, @position, 1)) as UnicodeValue
    SET @position = @position + 1
END
SELECT *
FROM @CharList
ORDER BY UnicodeValue
The output is as follows
32 NULL
33 NULL
34 NULL
35 NULL
36 NULL
37 NULL
38 NULL
39 NULL
40 NULL
41 NULL
42 NULL
43 NULL
44 NULL
45 NULL
46 NULL
47 NULL
48 NULL
49 NULL
50 NULL
51 NULL
52 NULL
53 NULL
54 NULL
55 NULL
56 NULL
57 NULL
58 NULL
59 NULL
60 NULL
61 NULL
62 NULL
11 32
19 32
24 32
25 A 65
20 C 67
12 C 67
1 H 72
2 a 97
13 a 97
17 a 97
7 b 98
10 g 103
15 i 105
5 i 105
21 i 105
26 i 105
18 l 108
29 o 111
28 p 112
14 p 112
9 r 114
3 r 114
4 r 114
30 r 114
27 r 114
6 s 115
16 t 116
22 t 116
31 t 116
8 u 117
23 y 121
However, if you want to first find the records which contain some Unicode characters, then follow the approach below, with the help of a CASE expression:
;WITH CTE
AS (
SELECT DATA,
CASE
WHEN(CAST(DATA AS VARCHAR(MAX)) COLLATE SQL_Latin1_General_Cp1251_CS_AS) = DATA
THEN 0
ELSE 1
END HasUnicodeChars,
ROW_NUMBER() OVER (ORDER BY (SELECT 1)) RN
FROM <table_name>)
SELECT * FROM CTE where HasUnicodeChars = 1
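As an alternative sketch (treating "Unicode characters" as anything outside printable ASCII), PATINDEX with a binary collation can locate the first offending character in each value:

```sql
-- Rows containing a character outside the printable ASCII range
-- (space through tilde), plus the position of the first such character.
SELECT name,
       PATINDEX(N'%[^ -~]%' COLLATE Latin1_General_BIN, name) AS FirstNonAscii
FROM airports
WHERE PATINDEX(N'%[^ -~]%' COLLATE Latin1_General_BIN, name) > 0;
```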
I've got a table like that,
id PID VID Type PriceA PriceB
41 297 2 128 70.000 80.000
42 297 3 256 90.000 100.000
43 297 4 300 110.000 120.000
44 297 5 400 130.000 140.000
45 294 2 128 10.000 50.000
46 294 3 256 20.000 60.000
47 294 4 300 30.000 70.000
48 294 5 400 40.000 80.000
49 294 6 450 50.000 85.000
50 294 7 470 45.000 75.000
What I want to do is a query with PID parameter and get a result like that
PID | 128 | 256 | 300 | 400
297 | 70.000 / 80.0000 | 90.000 / 100.000 | 110.000 / 120.000 | 130.000 / 140.000
I have tried several different options (PIVOT, subqueries, etc.), but I could not make it work.
This is a full working example:
CREATE TABLE DataSource
(
[ID] TINYINT
,[PID] SMALLINT
,[VID] TINYINT
,[Type] SMALLINT
,[PriceA] VARCHAR(32)
,[PriceB] VARCHAR(32)
)
INSERT INTO DataSource ([ID],[PID],[VID],[Type],[PriceA],[PriceB])
VALUES (41,297,2,128,70.000,80.000)
,(42,297,3,256,90.000,100.000)
,(43,297,4,300,110.000,120.000)
,(44,297,5,400,130.000,140.000)
,(45,294,2,128,10.000,50.000)
,(46,294,3,256,20.000,60.000)
,(47,294,4,300,30.000,70.000)
,(48,294,5,400,40.000,80.000)
,(49,294,6,450,50.000,85.000)
,(50,294,7,470,45.000,75.000)
SELECT *
FROM
(
SELECT [PID]
,[Type]
,[PriceA] + ' / ' + [PriceB] AS [Price]
FROM DataSource
) AS DataSource
PIVOT
(
MAX([Price]) FOR [Type] IN ([128],[256],[300],[400],[450], [470])
) PVT
The output is like this:
PID  128              256               300                400                450              470
294  10.000 / 50.000  20.000 / 60.000   30.000 / 70.000    40.000 / 80.000    50.000 / 85.000  45.000 / 75.000
297  70.000 / 80.000  90.000 / 100.000  110.000 / 120.000  130.000 / 140.000  NULL             NULL
The idea is to build the column [PriceA] + ' / ' + [PriceB] and then to make the pivot.
Note that I have hardcoded the possible [Type] values. If you need to make this dynamic, you can build a dynamic PIVOT by constructing the SQL string and then executing it with the sp_executesql procedure, as is done here.
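For reference, a minimal sketch of that dynamic variant against the DataSource table above (STRING_AGG requires SQL Server 2017+; on older versions build the list with FOR XML PATH instead):

```sql
-- Build the pivot column list from the distinct [Type] values,
-- then assemble and execute the PIVOT query.
DECLARE @cols NVARCHAR(MAX), @sql NVARCHAR(MAX);

SELECT @cols = STRING_AGG(QUOTENAME([Type]), ',') WITHIN GROUP (ORDER BY [Type])
FROM (SELECT DISTINCT [Type] FROM DataSource) AS t;

SET @sql = N'SELECT *
FROM (SELECT [PID], [Type], [PriceA] + '' / '' + [PriceB] AS [Price]
      FROM DataSource) AS src
PIVOT (MAX([Price]) FOR [Type] IN (' + @cols + N')) AS pvt;';

EXEC sp_executesql @sql;
```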
Hi, here is my SQL code:
SELECT a."Date", a."Missed", b."Total Client Schedules",
       cast(100 - ((a."Missed" * 100) / b."Total Client Schedules") AS decimal) as "Pct Completed"
FROM
(
    SELECT DATE(scheduled_start) as "Date", count(*) as "Missed"
    FROM events
    WHERE node_name IS NOT NULL AND status IN ('Missed')
    GROUP BY DATE(scheduled_start)
) as a,
(
    SELECT DATE(scheduled_start) as "Date", count(*) as "Total Client Schedules"
    FROM events
    WHERE node_name IS NOT NULL
    GROUP BY DATE(scheduled_start)
) as b
WHERE a."Date" = b."Date"
ORDER BY "Date" desc
and Here is the output
Date Missed Total Client Schedules Pct Completed
----------- ------------ ----------------------- --------------
2013-02-20 2 805 100
2013-02-19 14 805 99
2013-02-18 29 805 97
2013-02-17 59 805 93
2013-02-16 29 806 97
2013-02-15 49 805 94
2013-02-14 33 805 96
2013-02-13 57 805 93
2013-02-12 21 805 98
2013-02-11 35 805 96
2013-02-10 34 805 96
It always seems to round to a whole number, when I want it to be like 99.99% or 97.2%, etc.
You don't specify what database you are using. However, some databases do integer arithmetic, so 1/2 is 0 not 0.5.
To fix this, make the constants you are using numeric rather than integer, and give the cast an explicit scale (a bare decimal defaults to scale 0, which would discard the fraction again):
cast(100.0 - ((a."Missed" * 100.0) / b."Total Client Schedules") AS decimal(5, 2))
It will then do the arithmetic in a non-integer type and keep two decimal places.
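A quick illustration of the difference (the displayed scale of the second column varies by engine):

```sql
-- Integer division truncates; making one operand decimal
-- promotes the whole expression to a decimal type.
SELECT 1 / 2   AS int_div,   -- 0
       1.0 / 2 AS dec_div;   -- 0.5
```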