Parsing JSON data from a SELECT query in SQL Server

I have a table, dbo.JsonData, with a single varchar(max) column. It has an arbitrary number of rows, and each row holds a JSON document with an arbitrary number of properties.
How can I create a query that will allow me to turn the result set from a select * query into a row/column result set?
Here is what I have tried:
SELECT *
FROM JSONDATA
FOR JSON Path
But it returns all of the JSON data as a single array in one row and one column:
JSON_F52E2B61-18A1-11d1-B105-00805F49916B
[{"Json_Data":"{\"Serial_Number\":\"12345\",\"Gateway_Uptime\":17,\"Defrost_Cycles\":0,\"Freeze_Cycles\":2304,\"Float_Switch_Raw_ADC\":1328,\"Bin_status\":2304,\"Line_Voltage\":0,\"ADC_Evaporator_Temperature\":0,\"Mem_Sw\":1280,\"Freeze_Timer\":2560,\"Defrost_Timer\":593,\"Water_Flow_Switch\":3328,\"ADC_Mid_Temperature\":2560,\"ADC_Water_Temperature\":0,\"Ambient_Temperature\":71,\"Mid_Temperature\":1259,\"Water_Temperature\":1259,\"Evaporator_Temperature\":1259,\"Ambient_Temperature_Off_Board\":0,\"Ambient_Temperature_On_Board\":0,\"Gateway_Info\":\"{\\\"temp_sensor\\\":0.00,\\\"temp_pcb\\\":82.00,\\\"gw_uptime\\\":17.00,\\\"winc_fw\\\":\\\"19.5.4\\\",\\\"gw_fw_version\\\":\\\"0.0.0\\\",\\\"gw_fw_version_git\\\":\\\"2a75f20-dirty\\\",\\\"gw_sn\\\":\\\"328\\\",\\\"heap_free\\\":11264.00,\\\"gw_sig_csq\\\":0.00,\\\"gw_sig_quality\\\":0.00,\\\"wifi_sig_strength\\\":-63.00,\\\"wifi_resets\\\":0.00}\",\"ADC_Ambient_Temperature\":1120,\"Control_State\":\"Bin Full\",\"Compressor_Runtime\":134215680}"},{"Json_Data":"{\"Serial_Number\":\"12345\",\"Gateway_Uptime\":200,\"Defrost_Cycles\":559,\"Freeze_Cycles\":510,\"Float_Switch_Raw_ADC\":106,\"Bin_status\":0,\"Line_Voltage\":119,\"ADC_Evaporator_Temperature\":123,\"Mem_Sw\":113,\"Freeze_Timer\":0,\"Defrost_Timer\":66,\"Water_Flow_Switch\":3328,\"ADC_Mid_Temperature\":2560,\"ADC_Water_Temperature\":0,\"Ambient_Temperature\":71,\"Mid_Temperature\":1259,\"Water_Temperature\":1259,\"Evaporator_Temperature\":54,\"Ambient_Temperature_Off_Board\":0,\"Ambient_Temperature_On_Board\":0,\"Gateway_Info\":\"{\\\"temp_sensor\\\":0.00,\\\"temp_pcb\\\":82.00,\\\"gw_uptime\\\":199.00,\\\"winc_fw\\\":\\\"19.5.4\\\",\\\"gw_fw_version\\\":\\\"0.0.0\\\",\\\"gw_fw_version_git\\\":\\\"2a75f20-dirty\\\",\\\"gw_sn\\\":\\\"328\\\",\\\"heap_free\\\":10984.00,\\\"gw_sig_csq\\\":0.00,\\\"gw_sig_quality\\\":0.00,\\\"wifi_sig_strength\\\":-60.00,\\\"wifi_resets\\\":0.00}\",\"ADC_Ambient_Temperature\":1120,\"Control_State\":\"Defrost\",\"Compressor_Runtime\":11304}"},{"Json_Data":"{\"Seri...
What am I missing?
I can't specify the columns explicitly because the json strings aren't always the same.
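The closest I've gotten is a key/value shape with OPENJSON (a rough sketch below, assuming SQL Server 2016+ and that the single column is actually named Json_Data), but that still isn't one column per property:
SELECT j.[key]   AS Property_Name,   -- property name pulled from each JSON document
       j.[value] AS Property_Value
FROM dbo.JsonData AS d
CROSS APPLY OPENJSON(d.Json_Data) AS j;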
This is what I expect:
Serial_Number Gateway_Uptime Defrost_Cycles Freeze_Cycles Float_Switch_Raw_ADC Bin_status Line_Voltage ADC_Evaporator_Temperature Mem_Sw Freeze_Timer Defrost_Timer Water_Flow_Switch ADC_Mid_Temperature ADC_Water_Temperature Ambient_Temperature Mid_Temperature Water_Temperature Evaporator_Temperature Ambient_Temperature_Off_Board Ambient_Temperature_On_Board ADC_Ambient_Temperature Control_State Compressor_Runtime temp_sensor temp_pcb gw_uptime winc_fw gw_fw_version gw_fw_version_git gw_sn heap_free gw_sig_csq gw_sig_quality wifi_sig_strength wifi_resets LastModifiedDateUTC Defrost_Cycle_time Freeze_Cycle_time
12345 251402 540 494 106 0 98 158 113 221 184 0 0 0 1259 1259 1259 33 0 0 0 Freeze 10833 0 78 251402 19.5.4 0.0.0 2a75f20-dirty 328.00000000 10976 0 0 -61 0 2018-03-20 11:15:28.000 0 0
12345 251702 540 494 106 0 98 178 113 517 184 0 0 0 1259 1259 1259 22 0 0 0 Freeze 10838 0 78 251702 19.5.4 0.0.0 2a75f20-dirty 328.00000000 10976 0 0 -62 0 2018-03-20 11:15:42.000 0 0
...
Thank you,
Ron

Related

SQL: Increment a row when value in another row changes

I have the following table:
Sequence Change
100 0
101 0
103 0
106 0
107 1
110 0
112 1
114 0
115 0
121 0
126 1
127 0
134 0
I need an additional column, Group, whose value increments each time a 1 occurs in Change. How is that done? I'm using Microsoft SQL Server 2012.
Sequence Change Group
100 0 0
101 0 0
103 0 0
106 0 0
107 1 1
110 0 1
112 1 2
114 0 2
115 0 2
121 0 2
126 1 3
127 0 3
134 0 3
You want a cumulative sum:
select t.*, sum(change) over (order by sequence) as grp
from t;
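Applied to the sample data, a fuller sketch looks like this (the table name YourTable is an assumption, and Group is bracketed because it is a reserved word):
select t.Sequence,
       t.Change,
       sum(t.Change) over (order by t.Sequence
                           rows unbounded preceding) as [Group] -- reserved word, so bracketed
from dbo.YourTable as t;
The rows unbounded preceding frame just makes the running sum explicit; the result matches the expected output above.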

Why can't I convert this varchar to numeric?

I have a table with values pasted in, but they initially start as varchar. I need to convert them to numeric, so I did
convert(decimal(10,3), cw.col7)
But this returns Error 8114: Error converting data type varchar to numeric. I'm asking because it does not give this error for a similar data set. Are there sometimes strange anomalies when using convert() or decimal()? Or should I maybe convert to float first?
The data:
col7
490.440
2
934
28,108.000
33,226.000
17,347.000
1,561.000
57
0
421.350
64
1,100.000
0
0
3,584
202.432
0
3,280
672.109
1,150
0
104
411.032
18,016
40
510,648
443,934.000
18,705
322,254
301
9,217
18,075
16,100
395
706,269
418,313
7,170
40,450
2,423
1,300
2,311
94,000.000
17,463
0
228
884
557
153
13
0
0
212.878
45,000.000
152
24,400
3,675
11,750
987
23,725
268,071
4,520.835
286,000
112,912.480
9,000
1,316
1,020
215,244
123,967
6,911
1,088.750
138,644
16,924
7,848
33,017
464,463
618
72,391
9,367
507,635.950
588,087
92,890
17,266
0
1,414,547
89,080
664
101,635
1,552,992
175
356
7,000
0
0
445
507,381
24,016
469,983
0
0
147,737
3,521
88,210
18,433.000
21,775
3,607
34,774
7,642
42,680
1,255
10,880
350,409.800
19,394.520
2,476,257.400
778.480
1,670.440
9,710
24,931.600
3,381.800
2,900
18,000
4,121
3,750
62,200
952
29.935
17.795
11.940
902
36,303
1,240
1,020
617
817
620
92,648
70,925
82,924
19,162.200
1,213.720
2,871
3,180
91,600
645
607
155,100
6
840
1,395
112
6,721
3,850
40
4,032
5,912
1,040
872
56
1,856
179
Try_Convert(money, ...) will handle the comma, while Try_Convert(decimal(10, 3), ...) will return NULL.
Example
Select col7
,AsMoney = Try_Convert(money,col7)
,AsDecimal = Try_Convert(decimal(10, 3),col7)
from YourTable
Try using CAST and remove the commas:
SELECT CAST(REPLACE(cw.col7, ',', '') AS DECIMAL(10,3))
from your_table
and, as suggested by Jhon Cappelleti, you need more than 3 decimals, so you should use
SELECT CAST(REPLACE(cw.col7, ',', '') AS DECIMAL(12,4))
from your_table
Run this query:
select cw.col7
from cw
where try_convert(decimal(10, 3), cw.col7) is null;
This will show you the values that do not convert successfully. (If cw.col7 can be NULL, then add "and cw.col7 is not null" to make the output more meaningful.)
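If the failing values turn out to be the ones with thousands separators, a combined sketch (assuming the commas are the only offenders and that decimal(12, 4) is wide enough for this data) would be:
select try_convert(decimal(12, 4), replace(cw.col7, ',', '')) as col7_numeric
from cw;
-- replace() strips the thousands separators; try_convert() still returns null
-- for anything else that cannot be converted, instead of raising error 8114.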

Teradata: how to select the first occurrence

I have a table similar to the sample below. There are some duplicates in SESS_KEY. I only want rows that have no duplicates, or, where duplicates exist, only the rows with CALL_TRNSF_FLG set to 1. I manually created the INCLUDE field to mark the rows I want (1). How can I achieve this?
Thank you for your help!
Here is the sample data:
INCLUDE SESS_KEY SESS_CALL_ST_DT_TS CONN_ID TLK_DUR HLD_DUR AFT_CALL_WRK_DUR TRNSF_TLK_TM TRNSF_HLD_TM TRNSF_ACW_TM CALL_TRNSF_FLG
0 24067A16-A24A-45BE-E3AA-7E0BFE7ECDA5 7/25/2016 9:07 0141028541267da5 918 57 26 ? ? ? 0
1 24067A16-A24A-45BE-E3AA-7E0BFE7ECDA5 7/25/2016 9:07 0521028304ed75f8 236 0 3 918 57 26 1
0 49FFAB03-C19C-4291-6BAB-267CC95E27CF 7/6/2016 17:25 014102854125f060 278 0 130 ? ? ? 0
1 49FFAB03-C19C-4291-6BAB-267CC95E27CF 7/6/2016 17:25 0521028304e98111 391 0 8 278 0 130 1
0 7CCBBF2F-6FBC-4812-BAB1-4E258B88C20A 7/12/2016 11:34 05200282b0814531 269 0 190 406 0 124 1
1 7CCBBF2F-6FBC-4812-BAB1-4E258B88C20A 7/12/2016 11:34 013b028225ed6484 406 0 124 ? ? ? 0
0 CA32F05E-5C8A-4849-63A4-15B2342081B8 7/6/2016 11:38 02420282b06776f9 256 0 114 297 0 67 1
1 CA32F05E-5C8A-4849-63A4-15B2342081B8 7/6/2016 11:38 014102854125ea06 297 0 67 ? ? ? 0
0 E75EF405-1C0D-45E4-EC97-88D3CD7B5E55 7/5/2016 15:03 1.41E+214 2,691 0 255 ? ? ? 0
1 E75EF405-1C0D-45E4-EC97-88D3CD7B5E55 7/5/2016 15:03 0243028304ee14a5 314 0 9 2,691 0 255 1
1 04F8CC43-710B-4E4D-D8A1-DAC45FB3FF24 7/19/2016 16:49 1.41E+14 123 100 43 ? ? ? 0
1 0AFB6070-9D95-47B0-B0AF-D34ED70FCE8E 7/22/2016 14:20 0243028304f1ffca 335 239 79 ? ? ? 0
1 13581E6A-A568-4993-098C-05233CF293AE 7/15/2016 11:22 014102854126375a 196 150 258 ? ? ? 0
1 1A6AE4BE-1858-4CB3-83B1-CFF7A9E88EF9 7/8/2016 19:09 02420282b068325e 120 0 0 ? ? ? 0
1 24CE6C11-AF85-4770-53B4-FE20200339DF 7/28/2016 12:47 0243028304f3401b 181 0 0 107 0 48 1
1 293F85F4-34BC-44B1-43B5-A6B3B8886FC8 7/1/2016 8:33 0521028304e8778e 70 0 21 149 0 1 1
1 2BD0216A-B3F3-4597-1CBD-095F8D291736 7/7/2016 8:41 0243028304ee83b2 1,037 0 187 ? ? ? 0
1 2C774BE2-5B26-47C0-B69F-69B04A63F879 7/25/2016 18:26 013b028225edd637 1,481 0 110 ? ? ? 0
1 3F43720B-B6AE-4335-4FB5-9275A952989F 7/11/2016 11:08 013b028225ed5830 155 0 0 ? ? ? 0
1 41B056DC-8D3F-425D-BD9E-10A3EB0E944D 7/27/2016 11:13 05200282b084c5d5 34 0 0 ? ? ? 0
1 420483AD-8586-45C7-68AB-675E50EF2B92 7/5/2016 11:03 013b028225ed2765 1,320 0 283 ? ? ? 0
1 43A14051-6EAA-4251-3FA1-F2FBAE6DB643 7/23/2016 12:16 05200282b083f410 359 0 143 ? ? ? 0
1 494F3EA9-EA47-4F7B-C795-61B8B23DA0FA 7/21/2016 9:27 02420282b06ac6c3 0 0 0 ? ? ? 0
1 4D743557-DE09-4007-D58C-EFB09EF6713C 7/29/2016 17:19 05200282b085844a 951 361 240 ? ? ? 0
1 546C0FD0-5445-44F8-0789-1FA62BB57CDB 7/15/2016 18:14 1.41E+59 686 0 60 ? ? ? 0
1 5487C587-D37C-4E5C-9A88-87A3978996CD 7/28/2016 18:51 014102854126a96d 833 0 534 ? ? ? 0
1 5AB8D65A-28C7-4CAD-5796-3A7B720A47F7 7/20/2016 8:56 0141028541265a9f 274 111 381 ? ? ? 0
1 6866B3F8-F953-43BF-9089-B1FE699DEE07 7/19/2016 16:25 05200282b0830349 35 0 180 ? ? ? 0
1 6A4566B3-71B9-47BB-75BC-37B6E644D704 7/19/2016 10:14 02420282b06a3d7b 0 0 0 ? ? ? 0
1 72D17A78-FA5D-42DA-E39A-F7B950C15E22 7/5/2016 18:05 02420282b0675679 606 0 167 ? ? ? 0
1 73657A2A-34B7-4921-E691-49827E46128D 7/20/2016 11:02 02420282b06a8ae8 31 0 264 ? ? ? 0
1 7520F825-DA7B-4D5F-7AA9-3ADD9AAC5BE7 7/5/2016 18:53 05200282b07fd5df 354 0 20 ? ? ? 0
1 76DA5FB6-3EDD-45E1-B8BB-C70EA1CB4E53 7/1/2016 10:07 0243028304ed7c74 132 0 20 105 0 66 1
1 810B9E66-AA32-4BB0-128D-8E3FFC86EB0E 7/22/2016 13:37 013b028225edc13d 1,621 109 34 ? ? ? 0
1 81402352-DE71-45E4-4EAD-C1FFE20F8288 7/11/2016 9:28 0521028304ea456a 38 0 0 71 0 0 1
1 81EA3AD7-B721-4718-9AB0-6FB005252F64 7/12/2016 17:15 013b028225ed6ad5 812 129 60 ? ? ? 0
1 870632C0-4D80-41DC-AD84-12972DBC5AF2 7/23/2016 14:20 0243028304f229ee 1,084 0 5 ? ? ? 0
1 886919E7-80DB-4E2C-D5B2-8B83420F4D27 7/26/2016 19:22 0243028304f2da42 533 465 155 ? ? ? 0
1 8A18B8A2-1405-446B-71BA-A3FBAC816C12 7/8/2016 16:13 013b028225ed4f72 318 237 0 ? ? ? 0
1 8A54DAD7-2745-4BFB-22BE-BF479C1A8710 7/7/2016 15:25 02420282b067da6c 42 0 94 104 0 38 1
1 8D5EB433-2D50-4A67-00AC-E768A549B56E 7/26/2016 14:35 0521028304edf692 55 0 0 ? ? ? 0
1 8F222904-EC4E-4395-D496-A25FB408AD95 7/29/2016 17:09 0243028304f3a5f0 88 0 137 ? ? ? 0
1 9310922F-D545-4E78-42B2-E1B508F5A436 7/7/2016 12:23 02420282b067c625 155 2 15 ? ? ? 1
1 A605BF7A-50E6-4114-1981-7B3988079B7E 7/6/2016 16:56 02420282b0679dfa 89 0 293 ? ? ? 0
1 AA23384F-C4DA-4357-3DAF-7CD8337831DD 7/9/2016 11:20 014102854126082a 138 0 210 ? ? ? 0
1 AF5AD7E2-7584-4ACD-B28E-1AB2DB87BDAA 7/21/2016 17:36 0243028304f1cda6 0 0 0 ? ? ? 0
1 B66D3851-83BE-4E0E-7D9B-1719E378905D 7/19/2016 12:41 0243028304f122cd 81 0 0 ? ? ? 0
1 BB2FA3CD-AB6D-42BD-3CB7-EEC27E3403BF 7/15/2016 14:38 0243028304f0753e 65 0 195 92 3 29 1
1 BBA4031A-7876-4614-F9BC-718A6D8A16A7 7/13/2016 17:42 0521028304eb1a47 163 0 85 ? ? ? 0
1 BCF2B7D8-CBD0-497F-EEA7-FDEC46EFEEBE 7/7/2016 12:09 0521028304e9acaa 44 0 8 ? ? ? 0
1 BE9386B6-424E-40F9-67A1-A56EF6C18B77 7/27/2016 20:03 013b028225edecc5 1 0 0 ? ? ? 0
1 C0F0EF71-F52B-4D10-E9B4-DA1AF4343CC7 7/11/2016 15:21 05200282b08111ee 49 0 61 368 0 597 1
1 C84FCA28-2372-4F8B-52B1-4BC5E9AD128B 7/19/2016 13:06 013b028225eda06c 59 0 32 162 0 0 1
1 C8B3CC50-DEC3-4F24-D0A2-E32A03AFA786 7/13/2016 13:22 05200282b0819c4d 126 0 0 ? ? ? 0
1 CC119F61-A70F-4DB3-7C8C-DCE9A1C3BCB5 7/27/2016 9:48 02420282b06c1330 0 0 0 ? ? ? 0
1 D57D43C7-F9F0-42B9-C6B6-23B1414D9F12 7/14/2016 15:04 05200282b081ee2a 36 0 17 ? ? ? 0
1 E438B480-8F98-469C-3899-E6F10DD1F755 7/5/2016 20:12 02420282b0675cea 3,874 163 7 ? ? ? 0
1 F223F1F4-F50D-41F6-DA9F-46EA2972F394 7/27/2016 20:13 05200282b084f966 417 0 6 ? ? ? 0
1 FB3B0CB1-89D8-4B47-E4BA-465E57D52B0D 7/14/2016 14:21 02420282b0695b07 138 0 2 ? ? ? 0
SELECT *
FROM tab
QUALIFY
-- rows that do not have duplicates
COUNT(*)
OVER (PARTITION BY SESS_KEY) = 1
-- the ones with CALL_TRNSF_FLG set to 1
OR CALL_TRNSF_FLG = 1
If there might be multiple rows with CALL_TRNSF_FLG = 1 and you only want one row per session:
QUALIFY
ROW_NUMBER()
OVER (PARTITION BY SESS_KEY
ORDER BY CALL_TRNSF_FLG DESC) = 1
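Written out in full against the same table as the first query (tab), that would be:
SELECT *
FROM tab
QUALIFY
ROW_NUMBER()
OVER (PARTITION BY SESS_KEY
ORDER BY CALL_TRNSF_FLG DESC) = 1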
You can filter the rows you want like this:
WHERE
CONN_ID NOT IN (
SELECT
MIN(CONN_ID)
FROM
<<your_table>>
GROUP BY
SESS_KEY
HAVING
count(*) > 1
)
This assumes that the pair (SESS_KEY, CONN_ID) is unique. Otherwise, you should find a unique set of columns and filter by it.
You can write the query as follows.
Using JOINS
select t1.* from table_name t1
join
(select count(*), SESS_KEY from table_name
group by SESS_KEY having count(*) > 1 ) t2
on t1.SESS_KEY=t2.SESS_KEY;
Using WHERE clause
select t1.* from table_name t1,
(select count(*), SESS_KEY from table_name
group by SESS_KEY having count(*) > 1 ) t2
where t1.SESS_KEY=t2.SESS_KEY;
This gives you all the rows with duplicates in the SESS_KEY column.
To Update
merge into table_name t1
using
(select count(*), SESS_KEY from table_name
group by SESS_KEY having count(*) > 1 ) t2
on t1.SESS_KEY=t2.SESS_KEY
when matched then
update set
CALL_TRNSF_FLG=1;
Use a MERGE statement to update the table.

VB: import text file into Excel

I have the following text file which I'm trying to turn into a line graph in Excel. It logs every 5 minutes from 08:00 until 18:00, so there are quite a few rows.
TIME Rec-Created Rec-Deleted Rec-Updated Rec-read Rec-wait Committed Bi-writes Bi-reads DB-Writes DB-READ db-access Checkpoints Flushed
08:09:00 37 0 5 21276 0 1894 33 3 109 43 47691 1 0
08:14:00 49 0 144 20378 0 1225 143 0 88 192 53377 0 0
08:19:00 44 0 237 19902 0 1545 283 6 317 120 49668 2 0
08:24:00 51 0 129 12570 0 626 191 3 164 58 37811 1 0
08:29:00 61 0 49 14138 0 541 86 3 116 77 36836 1 0
08:34:00 59 0 144 58536 0 1438 209 3 143 3753 135427 1 0
08:39:00 85 0 178 28309 0 1822 209 6 209 80 70950 2 0
08:44:00 57 0 157 17940 0 554 132 3 170 92 47561 1 0
08:49:00 115 0 217 29961 0 1867 186 3 333 193 76057 1 0
08:54:00 111 0 225 23320 0 540 198 6 275 246 64138 2 0
08:59:00 41 0 152 15638 0 359 187 3 368 103 44558 1 0
I'm not too concerned about the line graph part; it's more about getting the data into Excel in the correct format.
I'm assuming I would need to use an array, but that is a little advanced for me at the moment, as I'm still getting to grips with VB and this is really my first venture into this world (as you can see from my previous post).
Any help or guidance would be appreciated.
(I'm studying VB for Dummies and Visual Basic Fundamentals: Development for Absolute Beginners from Channel 9 on MSDN.)
Thanks in advance
I would probably create a typed dataset with all of your columns; let's call it YourDataset.
Then read the file in and add a row to your table for each line in the file. The code below is not fully functional, but it is an outline of a solution:
Dim typedDataset As New YourDataset()
Using reader As StreamReader = New StreamReader("file.txt") ' requires Imports System.IO
    reader.ReadLine() ' skip the header line
    Dim line As String = reader.ReadLine()
    While line IsNot Nothing
        ' split the line on whitespace into its column values
        Dim rowData = line.Split(New Char() {" "c}, StringSplitOptions.RemoveEmptyEntries)
        ' add a new row to the typed dataset based on rowData here
        line = reader.ReadLine()
    End While
End Using
That is how you would get your data into VB.NET; it would be sitting in a table just like the Excel table. At that point, if you didn't care about Excel, you could use a charting control and view the data with a DataGridView: https://msdn.microsoft.com/en-us/library/dd489237(v=vs.140).aspx
But to get it into Excel you would need to follow a guide like the one in the following link; you need to use Microsoft.Office.Interop.Excel:
http://www.codeproject.com/Tips/669509/How-to-Export-Data-to-Excel-in-VB-NET

MySQL: How to select the UTC offset and DST for all timezones?

I want a list of all timezones in the mysql timezone tables, and need to select:
1) Their current offset from GMT
2) Whether DST is used by that timezone (not whether it's currently in use, just whether DST is considered at some point in the year for that timezone)
Reason:
I need to build a web form and match the user's time zone information (which I can generate from JavaScript) to the correct time zone stored in the MySQL DB. I can find the UTC offset and get a DST flag from JavaScript functions.
Try this query. The offsettime column is the offset in hours (Offset / 60 / 60):
SELECT tzname.`Time_zone_id`,(`Offset`/60/60) AS `offsettime`,`Is_DST`,`Name`,`Transition_type_id`,`Abbreviation`
FROM `time_zone_transition_type` AS `transition`, `time_zone_name` AS `tzname`
WHERE transition.`Time_zone_id`=tzname.`Time_zone_id`
ORDER BY transition.`Offset` ASC;
The results are
501 -12.00000000 0 0 PHOT Pacific/Enderbury
369 -12.00000000 0 0 GMT+12 Etc/GMT+12
513 -12.00000000 0 1 KWAT Pacific/Kwajalein
483 -12.00000000 0 1 KWAT Kwajalein
518 -11.50000000 0 1 NUT Pacific/Niue
496 -11.50000000 0 1 SAMT Pacific/Apia
528 -11.50000000 0 1 SAMT Pacific/Samoa
555 -11.50000000 0 1 SAMT US/Samoa
521 -11.50000000 0 1 SAMT Pacific/Pago_Pago
496 -11.44888889 0 0 LMT Pacific/Apia
528 -11.38000000 0 0 LMT Pacific/Samoa
555 -11.38000000 0 0 LMT US/Samoa
521 -11.38000000 0 0 LMT Pacific/Pago_Pago
518 -11.33333333 0 0 NUT Pacific/Niue
544 -11.00000000 0 3 BST US/Aleutian
163 -11.00000000 0 3 BST America/Nome
518 -11.00000000 0 2 NUT Pacific/Niue
496 -11.00000000 0 2 WST Pacific/Apia
544 -11.00000000 0 0 NST US/Aleutian
163 -11.00000000 0 0 NST America/Nome
528 -11.00000000 0 4 SST Pacific/Samoa
528 -11.00000000 0 3 BST Pacific/Samoa
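If the goal is one row per named zone, a hedged sketch (assuming the standard time zone tables in the mysql schema are populated, so that CONVERT_TZ recognizes 'UTC') that returns the current offset in hours plus a flag for whether the zone ever observes DST:
SELECT n.Name,
       TIMESTAMPDIFF(MINUTE,
                     UTC_TIMESTAMP(),
                     CONVERT_TZ(UTC_TIMESTAMP(), 'UTC', n.Name)) / 60 AS current_offset_hours,
       MAX(t.Is_DST) AS uses_dst -- 1 if any transition type for the zone is marked as DST
FROM mysql.time_zone_name AS n
JOIN mysql.time_zone_transition_type AS t
  ON t.Time_zone_id = n.Time_zone_id
GROUP BY n.Name;
The current offset comes from CONVERT_TZ rather than from the transition tables, which sidesteps having to pick the transition type that is in effect right now.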