Date arguments are not working as expected in Hive - hive

I am passing date parameters to a Hive script, but it's not working.
SET yrmonth=concat(substr(to_date(${hiveconf:runningdate}),1,4),substr(to_date(${hiveconf:runningdate}),6,2));
SET fom=TRUNC(${hiveconf:runningdate},'MONTH');
SET lom=LAST_DAY(${hiveconf:runningdate});
USE cust_db;
SELECT saleid,podid,pname
FROM product
WHERE productln_yrmo=${hiveconf:yrmonth};
--productln_yrmo is int column
SELECT cid,cname,cloc
FROM customer
WHERE customer_createddt >='${hiveconf:fom}'
AND customer_createddt <='${hiveconf:lom}'
AND cloc = 'AUS';
--customer_createddt is date column
hive -hiveconf runningdate='2016-05-18' -f cust.hql

From your query, 'customer_createddt' seems to be a DATE column, since you use the '>=' and '<=' operators to get the range of values between 'fom' and 'lom'. Lexicographic comparison for a date range in 'yyyy-MM-dd' format should work fine, but I'm not sure it will if the table DDL was created with a partition on the 'customer_createddt' column.
So please try casting the string to a date (and please post your table DDL for more clarity):
SELECT cid,cname,cloc
FROM customer
WHERE customer_createddt >=CAST(${hiveconf:fom} AS DATE)
AND customer_createddt <=CAST(${hiveconf:lom} AS DATE)
AND cloc = 'AUS';
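Before changing the query, it can also help to check what the variables actually expand to, since Hive substitutes hiveconf variables as plain text and does not evaluate the expressions at SET time. A minimal sketch of that check, assuming the variables were set as in the question:
-- each SET prints the stored text; if fom still shows TRUNC(...), the date
-- functions were never evaluated and the WHERE clause compares against that literal text
SET yrmonth;
SET fom;
SET lom;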

Related

timestamp vs TIMESTAMP_NTZ in Snowflake SQL

I am using an SQL script to parse a JSON file into a Snowflake table using dbt.
One of the cols contain this datetime value: '2022-02-09T20:28:59+0000'.
What's the correct way to define the data type for an ISO datetime in Snowflake?
I tried date, timestamp and TIMESTAMP_NTZ like this in my dbt sql script:
JSON_DATA:",my_date"::TIMESTAMP_NTZ AS MY_DATE
but clearly, these aren't the correct ones, because later on, when I test it in Snowflake with select *, I get this error:
SQL Error [100040] [22007]: Date '2022-02-09T20:28:59+0000' is not recognized
or
SQL Error [100035] [22007]: Timestamp '2022-02-13T03:32:55+0100' is not recognized
so I need to know which Snowflake time/date data type best suits this value.
EDIT:
This is what I am trying now.
SELECT
JSON_DATA:"date_transmission" AS DATE_TRANSMISSION
, TO_TIMESTAMP(DATE_TRANSMISSION:text, 'YYYY-MM-DDTHH24:MI:SS.FFTZH:TZM') AS DATE_TRANSMISSION_TS_UTC
, JSON_DATA:"authorizerClientId"::text AS AUTHORIZER_CLIENT_ID
, JSON_DATA:"apiPath"::text API_PATH
, MASTERCLIENT_ID
, META_FILENAME
, META_LOAD_TS_UTC
, META_FILE_TS_UTC
FROM {{ source('INGEST_DATA', 'TABLENAME') }}
I get this error:
000939 (22023): SQL compilation error: error line 6 at position 4
10:21:46 too many arguments for function [TO_TIMESTAMP(GET(DATE_TRANSMISSION, 'text'), 'YYYY-MM-DDTHH24:MI:SS.FFTZH:TZM')] expected 1, g
However, if I comment out the first 2 lines (related to the timestamp types), the other two work perfectly fine. What's the correct syntax for parsing JSON with TO_TIMESTAMP?
Note that JSON_DATA:"apiPath"::text API_PATH gives the correct value in my Snowflake tables.
Did some testing and it seems you have 2 options.
You can either get rid of the +0000 at the end: left(column_date, len(column_date)-5)
or use try_to_timestamp with an explicit format:
try_to_timestamp('2022-02-09T20:28:59+0000','YYYY-MM-DD"T"HH24:MI:SS+TZHTZM')
TZH and TZM are TimeZone Offset Hours and Minutes
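Putting both options side by side as a quick sketch (column_date and my_table are hypothetical names, not from the question):
SELECT
  try_to_timestamp(left(column_date, len(column_date) - 5)) AS opt1_trimmed, -- drop the trailing '+0000'
  try_to_timestamp(column_date, 'YYYY-MM-DD"T"HH24:MI:SS+TZHTZM') AS opt2_formatted -- parse with an explicit format
FROM my_table;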
So there are 2 main points here.
First, when getting data from JSON to pass to any of the timestamp functions: those functions want a ::TEXT value, but the values pulled from JSON are still ::VARIANT, so they need to be cast. This is the cause of the error you quote:
(22023): SQL compilation error: error line 6 at position 4
10:21:46 too many arguments for function [TO_TIMESTAMP(GET(DATE_TRANSMISSION, 'text'), 'YYYY-MM-DDTHH24:MI:SS.FFTZH:TZM')] expected 1, g
Also, your SQL is wrong there; it should have been
TO_TIMESTAMP(DATE_TRANSMISSION::text,
Second, how you handle the timezone format. As others have noted (and as I did on your last question), you need to decide whether you want to ignore the timezone values or read them. I forgot about the TZHTZM formatting. Given you have timezone data, you should use TO_TIMESTAMP_TZ / TRY_TO_TIMESTAMP_TZ to make sure the time zone data is kept, given your second example shows +0100.
Putting those together (assuming you didn't want an extra date_transmission as a variant in your data):
SELECT
TO_TIMESTAMP_TZ(JSON_DATA:"date_transmission"::text, 'YYYY-MM-DDTHH24:MI:SS+TZHTZM') AS DATE_TRANSMISSION_TS_UTC
, JSON_DATA:"authorizerClientId"::text AS AUTHORIZER_CLIENT_ID
, JSON_DATA:"apiPath"::text AS API_PATH
, MASTERCLIENT_ID
, META_FILENAME
, META_LOAD_TS_UTC
, META_FILE_TS_UTC
FROM {{ source('INGEST_DATA', 'TABLENAME') }}
You should use timestamp (not date, which does not store the time information), but the format you are using is probably not autodetected. You can specify the input format as YYYY-MM-DD"T"HH24:MI:SSTZHTZM as shown here. The autodetected one has a : between TZH and TZM.
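For example, a minimal sketch of setting that input format at the session level (TIMESTAMP_INPUT_FORMAT is the Snowflake session parameter for this; the literal below is taken from the question):
-- once the session knows the input format, the plain cast succeeds
ALTER SESSION SET TIMESTAMP_INPUT_FORMAT = 'YYYY-MM-DD"T"HH24:MI:SSTZHTZM';
SELECT '2022-02-09T20:28:59+0000'::TIMESTAMP_TZ AS my_ts;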

PgSQL insert into select after cast

I have a table with many columns stored as text. These columns hold information that I am planning to cast and copy into a fresh table with the same column names but with the correct data types (float8).
I tested the SELECT with the CAST operator "::" and it works fine. All the columns are converted, as you can see in the picture.
However, when I uncomment the INSERT statement above it to start writing into the target table, it throws an error. The target table has identical column names and only float8 column types.
The expressions from the source table are indeed of type text, but I am using the cast operator, so why does it not work like it did when only running the SELECT statement?
My query below:
INSERT INTO "PM"."new_VM_gcell_evolution_hourly_BSC"
(select
"CR3120:Channel_Assignment_Failures_All_Channels_Busy_or_Channel"::float8,
"DL_Mean_Quality"::float8,
"A312Ca:Failed_Assignments_during_MTC_on_the_A_Interface_Includi"::float8,
"nsp_Urban_TA_4number"::float8,
"A3129C:Failed_Assignments_First_Assignment,_Assignment_Timed_Ou"::float8,
"R3120C:Channel_Assignment_Failures_All_Channels_Busy_or_Channel"::float8,
"TSs_Interf_B5"::float8,
"R3120D:Channel_Assignment_Failures_All_Channels_Busy_or_Channel"::float8,
"HO_Out_Internal_Succ_Rate"::float8,
"CH_Req_Protocol_Undefinednbnumber"::float8,
"UL_Drop_Congnumber"::float8,
"Timing_Adv"::float8,
"DL_Qual"::float8,
"A3100C:Assignment_Requests_TCHH_Only"::float8,
"MS_to_BTS_max_distance"::float8,
"TSs_Interf_B2"::float8,
"UL_Qual"::float8,
"UL_Drop_EDGE_Rate"::float8,
"Avg_BTS_Power_Level_AMR"::float8,
"DL_Drop_N3105number"::float8,
"nsp_Urban_TA_2_R"::float8,
"CH_Req_CallReestabnbnumber"::float8,
"TCH_Drops_cause_Timing_Advance"::float8,
"TCH_Drops_per_Erlang"::float8,
"UL_Drop_Flushnumber"::float8,
"A312F:Number_of_Assignment_Failures_No_Abis_Resource_Available"::float8,
"UL_Drop_GPRS_Rate"::float8,
"nsp_Urban_TA_3number"::float8,
"Call_Drop_rate"::float8,
"DL_Fail_Assingnumber"::float8,
"SDCCH_Radio_Failures"::float8,
"HO_Cmd_UL_Qual"::float8,
"UL/DL_RxQualnumber"::float8,
"UL/DL_RxLevnumber"::float8,
"UL_Fail_EDGE_Rate"::float8,
"Call_Drop_Total"::float8,
"HO_Out_Succ_Rate"::float8,
"Date"::text,
"A3127E:Failed_Assignments_during_Call_Reestablishment_on_the_Um"::float8,
"UL_Drop_N3103number"::float8,
"Configured_TCH"::float8,
"SDCCH_Drop_Call_Rate_only_LU"::float8,
"A312M:Failed_Assignments_Reconnection_to_Old_Channels,_No_Chann"::float8,
"SDCCH_Blocking_Rate"::float8,
"TCH_Drops_cause_DL_FER"::float8,
"TCH_Fail_rate"::float8,
"DL_Drop_Preemnumber"::float8,
"FR_Traffic_totalErl"::float8,
"DL_Congnumber"::float8,
"TCH_Assign_Unsuccnumber"::float8,
"nsp_Urban_TA_0_R"::float8,
"DL_Drop_GPRS_Rate"::float8,
"HR_TraficErl"::float8,
"SDCCH_Cong_Rate"::float8,
"Abis_and_Ater_Interface_Anlysistimes"::float8,
"A3100B:Assignment_Requests_TCHF_Only"::float8,
"AS4300D:Mean_Uplink_Level_during_Radio_Link_Failure_SDCCHdB"::float8,
"nsp_Urban_TA_89number"::float8,
"TCH_Drops_cause_DownQual"::float8,
"SDCCH_Drop_RxLev"::float8,
"Max_UL_Pwr_Duration"::float8,
"nsp_Channel_Req_LUnumber"::float8,
"TCH_HR_Initialy_Config"::float8,
"HO_Inc_Cong_Rate"::float8,
"DL_RxLev_avgdBm"::float8,
"TCH_Assign_Requestsnumber"::float8,
"A312K:Failed_Assignments_First_Assignment,_No_Channel_Available"::float8,
"Call_Drop_TCH_Quality_Rate"::float8,
"AMR_HR_TraficErl"::float8,
"nsp_Urban_TA_67_R"::float8,
"UL_Fail_MSnumber"::float8,
"Time"::text,
"TCH_Drops_cause_UpLevel"::float8,
"AMR_FR_TraficErl"::float8,
"TCH_FR_Initialy_Config"::float8,
"UL_Drop_N3101number"::float8,
"DL_FERnumber"::float8,
"SDCCH_Drop_Call_Rate_wo_LU"::float8,
"DL_Mean_StrengthdB"::float8,
"Call_Drop_Abis"::float8,
"SDCCH_Initialy_Confignumber"::float8,
"Call_Drop_TCH_RxLev_Rate"::float8,
"TA_Meannumber"::float8,
"nsp_Urban_TA_67number"::float8,
"AS3240NA:Average_MS_Power_Level_of_NonAMR_Call"::float8,
"Load"::float8,
"A3129Q:Failed_Assignments_Reconnection_to_Old_Channels,_Timer_E"::float8,
"CH_Request_11_bit"::float8,
"AS4340D:Mean_TA_during_Radio_Link_Failure_SDCCH"::float8,
"S4210A:Uplink_Interference_Indication_Messages_SDCCH"::float8,
"UL_FERnumber"::float8,
"A3129P:Failed_Assignments_Reconnection_to_Old_Channels,_Timer_E"::float8,
"K3003A:Successful_SDCCH_Seizures_Call_Type"::float8,
"CH_Req_LAUnbnumber"::float8,
"SDCCH_Drop_Rate"::float8,
"GBSC"::text,
"CS_CSSR_with_SDCCH_blocks_wo_LU"::float8,
"A312Aa:Failed_Assignments_during_MOC_on_the_A_Interface_Includi"::float8,
"Avg_BTS_Power_Level"::float8,
"R3120A:Channel_Assignment_Failures_All_Channels_Busy_or_Channel"::float8,
"A312S:Failed_Assignments_Signaling_Channel"::float8,
"SD_Fail_MoC"::float8,
"UL_RxQual_avg"::float8,
"Call_Drop_HO"::float8,
"AS4330D:Mean_Downlink_Quality_during_Radio_Link_Failure_SDCCH"::float8,
"CH_Req_Emerg_Callsnbnumber"::float8,
"UL_RxQualnumber"::float8,
"UL_Fail_OtherCausenumber"::float8,
"A312Da:Failed_Assignments_during_Emergency_Call_on_the_A_Interf"::float8,
"UL_Drop_Preemnumber"::float8,
"TCH_Drops_cause_UpDown_FER"::float8,
"AS4320D:Mean_Downlink_Level_during_Radio_Link_Failure_SDCCHdB"::float8,
"A3129O:Failed_Assignments_First_Assignment,_Directed_Retry_Time"::float8,
"Max_DL_Pwr_Duration"::float8,
"UL_Drop_Suspendnumber"::float8,
"TCH_Assign_Congestionnumber"::float8,
"Transmission_Resource_Analysistimes"::float8,
"HO_Out_RxQual_rate"::float8,
"nsp_Urban_TA_1013number"::float8,
"A3129I:Failed_Assignments_Invalid_State"::float8,
"DL_Fail_OtherCausenumber"::float8,
"CR3129:Channel_Assignment_Failures_All_Channels_Busy_or_Channel"::float8,
"HO_Inc_Successnumber"::float8,
"A3129B:Failed_Assignments_First_Assignment,_Terrestrial_Resourc"::float8,
"Trafic_totalErl"::float8,
"nsp_urban_TA_1463all_R"::float8,
"HO_Out_RxQualnumber"::float8,
"DL_RxQualnumber"::float8,
"HO_Inc_Succ_Rate"::float8,
"SDCCH_Fail_Rate"::float8,
"SDCCH_Non_Radio_Drops"::float8,
"nsp_Urban_TA_0number"::float8,
"A3129E:Failed_Assignments_CIC_Unavailable"::float8,
"nsp_CH_Request_PSnumber"::float8,
"TCH_Assign_Unsucc_rate"::float8,
"Call_Drop_Radio"::float8,
"EFR_TraficErl"::float8,
"A312L:Failed_Assignments_Reconnection_to_Old_Channels,_No_Chann"::float8,
"TCH_Drops_cause_DownLevel"::float8,
"nsp_Urban_TA_5number"::float8,
"Abnormal_Terminals_Analysistimes"::float8,
"HO_Inc_Congnumber"::float8,
"CSSR_Rate"::float8,
"M3020C:Call_Drops_on_SDCCHQuality"::float8,
"RH333:Handover_Drop_Rate_of_TCH"::float8,
"UL_Drop_EDGE_Abisnumber"::float8,
"A3129J:Failed_Assignments_Invalid_Message"::float8,
"TCH_Availability"::float8,
"UL_Level"::float8,
"nsp_Urban_TA_4_R"::float8,
"UL_Drop_OtherCausenumber"::float8,
"HR_Traffic_Rate"::float8,
"DL_Level"::float8,
"CH_Request_8_bit"::float8,
"UL_Fail_MS_Assingnumber"::float8,
"DL_Fail_BSC_Commandnumber"::float8,
"A3129R:Failed_Assignments_Reconnection_to_Old_Channels,_Reconne"::float8,
"UL_Fail_BSC_Commandnumber"::float8,
"DL_Drop_Suspendnumber"::float8,
"nsp_urban_TA_1463allnumber"::float8,
"Direct_Retry"::float8,
"nsp_Urban_TA_89_R"::float8,
"CH_Req_PSnbnumber"::float8,
"A3129N:Failed_Assignments_Reconnection_to_Old_Channels,_Terrest"::float8,
"A312A:Failed_Assignments_First_Assignment,_No_Channel_Available"::float8,
"nsp_Urban_TA_2number"::float8,
"DL_RxQual_avg"::float8,
"UL_Mean_StrengthdB"::float8,
"DL_Drop_EDGE_Rate"::float8,
"A3129T:Failed_Assignments_No_Ater_Resource_Available"::float8,
"TSs_Interf_B1"::float8,
"A3129D:Failed_Assignments_Reconnection_to_Old_Channels,_Reconne"::float8,
"nsp_Channel_Req_MOCnumber"::float8,
"UL_Fail_GPRS_Rate"::float8,
"S4210B:Downlink_Interference_Indication_Messages_SDCCH"::float8,
"R3120E:Channel_Assignment_Failures_All_Channels_Busy_or_Channel"::float8,
"HO_Inc_Failnumber"::float8,
"DL_Drop_Flushnumber"::float8,
"S3655:Number_of_configured_TRXs_in_a_cell"::float8,
"TCH_Drops_cause_UpDown_Qual"::float8,
"CR3005:Number_of_Initially_Configured_Channels_Static_PDTCH_Sup"::float8,
"UL/DL_FERnumber"::float8,
"SDCCH_Blocking_"::float8,
"HO_Out_InterRAT_Succ_Rate"::text,
"CM30E:Call_Drops_on_SDCCH_Location_Updating"::float8,
"AS4310D:Mean_Uplink_Quality_during_Radio_Link_Failure_SDCCH"::float8,
"UL_Congnumber"::float8,
"TAnumber"::float8,
"Configured_SDCCH"::float8,
"Othernumber"::float8,
"TCH_Assign_Fail_Radionumber"::float8,
"HO_Out_Internal_Req_Nb"::float8,
"SDCCH_Cong_Nbnumber"::float8,
"UL_Mean_Quality"::float8,
"HO_Out_External_Req_Nb"::float8,
"nsp_Urban_TA_1013_R"::float8,
"SDCCH_Drops_wo_LU"::float8,
"Call_Drop_TCH_Qualitynumber"::float8,
"A3100A:Assignment_Requests_Signaling_Channel_TCH"::float8,
"Better_Cell"::float8,
"Call_Drop_TCH_RxLevnumber"::float8,
"nsp_Channel_Req_MTCnumber"::float8,
"DL_Fail_EDGE_Rate"::float8,
"ZTR104B:Call_Drop_Rate_on_SDCCH_Call_Type"::float8,
"Call_Drop_no_MR"::float8,
"DL_Drop_Congnumber"::float8,
"HO_Inc_InterRAT_reqnumber"::float8,
"TCH_Assign_Cong_rate"::float8,
"HO_Inc_Fail_Rate"::float8,
"TCH_Assign_Fail_sp_ver_unav"::float8,
"HO_Inc_InterRAT_unsucc"::float8,
"A3170A:Number_of_Completed_TCH_Assignments_CSFB_MOC"::float8,
"HO_Inc_Unsucc_Rate"::float8,
"TSs_Interf_B4"::float8,
"A3100K:Assignment_Requests_Signaling_Channel_SDCCH"::float8,
"TCH_Drops_cause_UpDown_Level"::float8,
"SDCCH_Fail_Nbnumber"::float8,
"TCH_Non_Radio_Drops"::float8,
"DL_Fail_MSnumber"::float8,
"HO_Cmd_DL_Qual"::float8,
"DL_Drop_OtherCausenumber"::float8,
"SMS_on_SDCCH"::float8,
"FERnumber"::float8,
"HO_Inc_Reqnumber"::float8,
"nsp_Urban_TA_3_R"::float8,
"M3020D:Call_Drops_on_SDCCHOther"::float8,
"HO_Out_External_Succ_Rate"::float8,
"M3020A:Call_Drops_on_SDCCHTA"::float8,
"TCH_Assign_Fail_Radio_rate"::float8,
"CR3001:Number_of_Initially_Configured_Channels_Static_PDCH"::float8,
"HR_Traffic_totalErl"::float8,
"CH_Req_MTCnbnumber"::float8,
"SDCCH_Drop_RxQual"::float8,
"UL_RxLev_avgdBm"::float8,
"CH_Req_MOCnbnumber"::float8,
"SDCCH_Dropnumber"::float8,
"A3169A:Failed_Assignments_Um_Cause"::float8,
"A3129H:Failed_Assignments_Clear_Commands_Sent_By_MSC"::float8,
"TCH_Radio_Drops"::float8,
"HO_Out_InterRAT_Req_Nb"::float8,
"HO_Inc_InterRAT_succnumber"::float8,
"DL_Drop_Abisnumber"::float8,
"TCH_Drops_cause_UL_FER"::float8,
"UL_RxLevelnumber"::float8,
"nsp_CH_Request_CSnumber"::float8,
"A3170B:Number_of_Completed_TCH_Assignments_CSFB_MTC"::float8,
"A3129F:Failed_Assignments_CIC_Allocated"::float8,
"DL_Fail_GPRS_Rate"::float8,
"M3020B:Call_Drops_on_SDCCHReceived_Level"::float8,
"nsp_Urban_TA_1number"::float8,
"SDCCH_Availability"::float8,
"Call_Drop_forced_HO"::float8,
"SDCCH_Drop_Others"::float8,
"Call_Drop_Equip."::float8,
"CH_Req_PS"::float8,
"SDCCH_Drop_Call_Rate"::float8,
"TCH_Drops_cause_UpQual"::float8,
"CH_Req_LMU_Reservednbnumber"::float8,
"DL_RxLevnumber"::float8,
"nsp_Urban_TA_1_R"::float8,
"A312Ea:Failed_Assignments_during_Call_Reestablishment_on_the_A_"::float8,
"TSs_Interf_B3"::float8,
"SDCCH_Drop_TimingAdvance"::float8,
"Avg_BTS_Power_Level_NAMR"::float8,
"TCH_Drops_cause_Other"::float8,
"FR_TraficErl"::float8,
"Avg_MS_Power_Level"::float8,
"HO_Outgoing_Requestsnumber"::float8,
"A3129G:Failed_Assignments_A_Interface_Failure"::float8,
"CH_Req_LAU"::float8,
"nsp_Urban_TA_5_R"::float8,
"HO_Inc_Unsuccessnumber"::float8,
"AS3240A:Average_MS_Power_Level_of_AMR_Call"::float8
from "PM"."VM_gcell_evolution_hourly_BSC_recovered")
It says "column is of type double but expression is of type text" so what's really happening is that it's trying to insert one of the expressions where you cast to ::text into a column that is of type double.
If the problem was converting from text to double, you'd get a different message. Besides, the SELECT worked fine, which means it didn't encounter any text data that it couldn't convert to double.
Since you don't specify the target table columns in your INSERT, and you have so many of them, most likely you missed a column or got the order wrong.
Honestly if the table has 270 columns and they have the same name in the source and destination tables, you should really generate the query using something like python from a list of columns, that will be faster than proofreading the 270 lines...
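If you'd rather stay in SQL, here is a minimal sketch of generating the cast list from the catalog (it assumes the source table name from the query; the handful of text-typed columns such as "Date", "Time" and "GBSC" would still need to be adjusted by hand):
-- builds the quoted, cast column list so it doesn't have to be typed out 270 times
SELECT string_agg(format('%I::float8', column_name), E',\n' ORDER BY ordinal_position)
FROM information_schema.columns
WHERE table_schema = 'PM'
  AND table_name = 'VM_gcell_evolution_hourly_BSC_recovered';
Paste the generated list into both the INSERT column list and the SELECT, or better, list the target columns explicitly in the INSERT so the order can't drift.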

Writing SAS dates to SQL Server database

How to write SAS dates to Microsoft SQL Server 2016 Date data type in database?
I've got SAS data with a SAS date DataEndDay and I want to write it into a database. The following bit is in use (the buffer is just to speed up the testing/failing):
libname valu oledb provider=sqloledb schema="dbo" INSERTBUFF=100
properties=("User ID"="&username." Password="&pw."
"data source" = &database.
"initial catalog"=&catalog.);
proc sql noprint;
insert into valu.Data_upload_from_me
( <some_columns...>,
<more-columns...>
,DataEndDay
)
select
<some_columns_source...>,
<more-columns_source...>
,DataEndDay
from work.SAS_data_to_publish
;quit;
Of course because SAS dates are numbers, direct writing is going to fail. What works is if I hard-code this as:
select
<some_columns_source...>,
<more-columns_source...>
,'2018-12-12'
from work.SAS_data_to_publish
;quit;
But if I convert the SAS date to a string in a SAS data step:
data SAS_data_to_publish ;
set SAS_data_to_publish ;
dataEndday0 = put(DataEndDay, yymmddd10.);
DataEndDay1 = quote(dataEndday0, "'") ;
run;
and try to write either of these, I get a conversion error:
ERROR: ICommand::Execute failed. : Conversion failed when converting date and/or time from character string.
When I select the string it looks pretty ok:
proc sql; select DataEndDay1 from SAS_data_to_publish; quit;
'2018-12-12'
Previously I've managed to write datetimes with a similar trick, which works:
proc format;
picture sjm
. = .
other='%Y-%0m-%0d %0H:%0M:%0S:000' (datatype=datetime)
;run;
data to_be_written;
set save.raw_data_to_be_written;
DataEndDay0 = put(dhms(DataEndDay,0,0,0), sjm. -L);
run;
Has anyone run into similar issues? How could I write the dates?
I could ask them to change the column to datetime, maybe....
Thank you in advance.
Edit:
I managed to develop a work-around, which works but is ugly and, frankly, I don't like it. It so happens that my date is the same for all rows, so I can assign it to a macro variable and then use it when writing to the database.
data _NULL_;
set SAS_data_to_publish;
call symput('foobar', quote( put (DataEndDay , yymmddd10. -L), "'") ) ;
run;
....
select
<some_columns_source...>,
<more-columns_source...>
,&foobar.
from work.SAS_data_to_publish
;quit;
Of course this would fail immediately should DataEndDay vary, but maybe it demonstrates that something is off in Proc SQL's SELECT clause....
Edit 2: Posted the question to the SAS forums.
I finally managed to crack the issue. The issue was with missing values. As I am passing the values as strings into the database, the parser interpreted missing values as literal dots instead of empty strings. The following works:
data upload;
set upload;
CreatedReportdate2 = PUT(CreatedReportdate , yymmddn8.);
run;
libname uplad_db odbc noprompt =
"DRIVER=SQL Server; server=&server.; Uid=&user.;Pwd=&pw.; DATABASE=&db.;"
INSERTBUFF=32767;
proc sql;
insert into uplad_db.upload_table
(.... )
select
case when CreatedReportdate2 ='.' then '' else CreatedReportdate2 end,
...
from upload;
quit;
SAS does not really support the SQL Server DATE data type properly. I imagine this is because it's newer, but for whatever reason you have to pass the data as strings.
For missing values, it's important to have a blank string, not a . character. The easiest workaround here is to set:
options missing=' ';
That will allow you to insert data properly. You can then return it to . if you wish. In a production application that might be used by others, I'd consider storing the option value aside temporarily and then restoring it, in order to do no harm.
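A minimal sketch of that pattern, reusing the question's libname and dataset names (the column list in the insert is simplified; dataEndday0 is the string date created in the earlier data step):
/* remember the current MISSING option, blank it for the insert, then restore it */
%let old_missing = %sysfunc(getoption(missing));
options missing=' ';
proc sql noprint;
  insert into valu.Data_upload_from_me (DataEndDay)
  select dataEndday0
  from work.SAS_data_to_publish;
quit;
options missing="&old_missing.";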
Normally I just use PROC APPEND to insert observations into a remote database.
proc append base=valu.Data_upload_from_me force
data=work.SAS_data_to_publish
;
run;
Make sure the date variables in your SAS dataset use the same data type as the corresponding variables in your target database table. So if your MS SQL database uses TIMESTAMP fields for date values, then make sure your SAS dataset uses DATETIME values.
If you want to use constants, then make sure to use SAS syntax in your SAS code and MS SQL syntax in any pass-through code.
data test;
date = '01JAN2017'd ;
datetime = '01JAN2017:00:00'dt ;
run;
proc sql ;
connect to oledb .... ;
execute ( ... date = '2017-01-01' .... datetime='2017-01-01 00:00' ...)
by oledb;
quit;

How to store the output of a query in a variable in HIVE

I want to store current_day - 1 in a variable in Hive. I know there are already previous threads on this topic, but the solutions provided there first recommend defining the variable outside Hive in a shell environment and then using that variable inside Hive.
Storing result of query in hive variable
I first got current_date - 1 using
select date_sub(FROM_UNIXTIME(UNIX_TIMESTAMP(),'yyyy-MM-dd'),1);
Then I tried two approaches:
1. set date1 = ( select date_sub(FROM_UNIXTIME(UNIX_TIMESTAMP(),'yyyy-MM-dd'),1);
and
2. set hivevar:date1 = ( select date_sub(FROM_UNIXTIME(UNIX_TIMESTAMP(),'yyyy-MM-dd'),1);
Both approaches throw an error:
"ParseException line 1:82 cannot recognize input near 'select' 'date_sub' '(' in expression specification"
When I print the variable from approach (1), the select query text itself is saved in the variable in place of yesterday's date. Approach (2) throws "{hivevar:dt_chk} is undefined".
I am new to Hive and would appreciate any help. Thanks.
Hive doesn't offer a straightforward way to store a query result in a variable. You have to use the shell option along with hiveconf.
date1=$(hive -e "set hive.cli.print.header=false; select date_sub(from_unixtime(unix_timestamp(),'yyyy-MM-dd'),1);")
hive -hiveconf "date1"="$date1" -f hive_script.hql
Then in your script you can reference the newly created variable date1:
select '${hiveconf:date1}'
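For instance, hive_script.hql could use it like this (my_table and its event_date column are hypothetical, just to illustrate the substitution):
-- date1 arrives as plain text, so quote it wherever a string or date literal is expected
SELECT *
FROM my_table
WHERE event_date = '${hiveconf:date1}';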
After lots of research, this is probably the best way to set a variable from the output of an SQL query:
INSERT OVERWRITE LOCAL DIRECTORY '<home path>/config/date1'
select CONCAT('set hivevar:date1=',date_sub(FROM_UNIXTIME(UNIX_TIMESTAMP(),'yyyy-MM-dd'),1)) from <some table> limit 1;
source <home path>/config/date1/000000_0;
You will then be able to use ${date1} in your subsequent SQLs.
Here we had to use <some table> limit 1 because Hive has a bug with INSERT OVERWRITE if we don't specify a table name.
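For example, after sourcing the generated file, a later query could pick the variable up like this (daily_sales and sale_date are hypothetical names):
select *
from daily_sales
where sale_date = '${date1}';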

Issues handling YYYY-MM-DD"T"HH:MM:SS date format

I have a date column in my DB table; the value of two rows is "01/01/2017 23:59:59". For my needs, I have to show this column in the "YYYY-MM-DD"T"HH:MM:SS" format.
BDEX CAQ *01/01/2017 23:59:59*
RBCP CAQ *01/01/2017 23:59:59*
When I execute this query:
SELECT CODE_TCT, LIB_TCT,
To_char(D_FIN,'yyyy-MM-dd"T"HH:mm:ss') AS D_FIN,
FROM MY_TABLE;
I get this result:
BDEX CAQ *2017-01-01T11:12:59*
RBCP CAQ *2017-01-01T11:01:59*
Why are the result values (2017-01-01T11:12:59 and 2017-01-01T11:01:59) different, given that the rows hold the same value?
I'm using Oracle 11g.
Oracle's TO_CHAR date masks are case-insensitive, so 'mm' always means month (not minutes) and 'HH' is the 12-hour clock; you need MI for minutes and HH24 for the 24-hour clock. You can write your date mask like this:
SELECT CODE_TCT, LIB_TCT,
To_char(D_FIN,'yyyy-MM-dd"T"HH24:MI:SS') AS D_FIN,
FROM MY_TABLE;
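As a quick self-contained check of the corrected mask (the literal comes from the question):
-- MI gives minutes and HH24 the 24-hour clock, so 23:59:59 survives the round trip
SELECT TO_CHAR(TO_DATE('01/01/2017 23:59:59','DD/MM/YYYY HH24:MI:SS'),
               'yyyy-MM-dd"T"HH24:MI:SS') AS d_fin
FROM dual;
-- 2017-01-01T23:59:59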