I am trying to find last Saturday's date in Hive in YYYY-MM-DD format using:
SET DATE_DM2=date_sub(from_unixtime(unix_timestamp(),'yyyy-MM-dd'),cast(((from_unixtime(unix_timestamp(), 'u') % 7)+1) as int));
But this is giving an error.
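For reference, one hedged way to compute last Saturday directly in Hive (assuming Hive 1.2+ so that next_day and current_date are available; when run on a Saturday this returns the current date):
SELECT date_sub(next_day(current_date, 'SAT'), 7);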
Change this line:
c = "hive -e 'use Data; SELECT * FROM table1 WHERE partitiondate='${DATE_D}';'"
into:
c = "hive -e \"use Data; SELECT * FROM table1 WHERE partitiondate='{DATE_D}';\"".format(DATE_D=DATE_D)
The call to format is what @Mai was mentioning. As for printing c, print c will show you the value of c at runtime, so you'll know whether the value is what you expect.
P.S. Calling commands.getoutput will fetch not only the rows but all of the standard output from running hive on the command line, and store it all in a single string, meaning you'll probably need to do some parsing if you want to work with those rows. Or better yet, check out HiveClient.
I want all the records with a wrong date from my database. There are some records dated like 0645-14-10. Please note the data type of the column is VARCHAR.
I have tried this query:
SELECT * from LTRECT_JOURNALS_T
where DATE_PART (YEAR,CREATE_DATE like '06%')
So how can I find these kinds of records?
You could use a simple test to find dates in the seventh century CE:
SELECT * from LTRECT_JOURNALS_T
where CREATE_DATE < date '0700-01-01'
/
You should cast the string representation of the date first rather than compare it to a date constant. The direct comparison doesn't work in Db2, as you can see if you uncomment the commented-out line and comment out the last one.
WITH LTRECT_JOURNALS_T (CREATE_DATE) AS
(
VALUES '0645-14-10', '2003-14-10', '2002-14-10'
)
SELECT *
FROM LTRECT_JOURNALS_T
WHERE
--CREATE_DATE < date('2003-01-01')
YEAR(TO_DATE(CREATE_DATE, 'YYYY-DD-MM')) < 2003
;
You can use a UDF that attempts to convert the string to a date, but captures the error generated if the conversion fails and returns 0 instead.
E.g.
CREATE OR REPLACE FUNCTION IS_DATE(i VARCHAR(64)) RETURNS INTEGER
    CONTAINS SQL
    ALLOW PARALLEL
    NO EXTERNAL ACTION
    DETERMINISTIC
BEGIN
    DECLARE NOT_VALID CONDITION FOR SQLSTATE '22007';
    DECLARE EXIT HANDLER FOR NOT_VALID RETURN 0;
    RETURN CASE WHEN CAST(i AS DATE) IS NOT NULL THEN 1 END;
END
Change the statement terminator when creating the above. E.g. use # not ;
On Db2 11.1 or lower, remove the ALLOW PARALLEL line from the above SQL
Then, e.g.
VALUES IS_DATE('0645-14-10')
will return 0, but
VALUES IS_DATE('0645-12-10')
will return 1
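Putting that together against the original table, a minimal sketch (assuming CREATE_DATE holds ISO-style yyyy-mm-dd strings, so malformed values such as 0645-14-10 fail the cast and IS_DATE returns 0):
SELECT *
FROM LTRECT_JOURNALS_T
WHERE IS_DATE(CREATE_DATE) = 0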
I am facing problems concatenating the value of a variable with a string.
My script contains the below:
set hivevar:tab_dt= substr(date_sub(current_date,1),1,10);
CREATE TABLE default.udr_lt_bc_${hivevar:tab_dt}
(
trans_id double
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ",";
In the above, the variable tab_dt gets assigned yesterday's date correctly in the format yyyymmdd.
But when I try to concatenate this variable with a static string in the table name, the script fails; it is not doing the concatenation.
Kindly provide a solution.
Note: I tried the below too, which also errors out.
set hivevar:tab_dt= substr(date_sub(current_date,1),1,10);
set hivevar:tab_nm1= default.udr_lt_bc_;
set hivevar:tab_name= concat(${hivevar:tab_dt},${hivevar:tab_nm1})
CREATE TABLE ${hivevar:tab_name}
(
trans_id double
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ",";
This too is returning an error.
Hive does not evaluate expressions in variables; it substitutes them as-is.
Your create table expression results in this:
CREATE TABLE default.udr_lt_bc_substr(date_sub(current_date,1),1,10)...
Your second expression results in this:
CREATE TABLE concat(substr(date_sub(current_date,1),1,10),default.udr_lt_bc_)
Unfortunately Hive does not support such expressions in DDL.
I recommend calculating this variable in a shell script and passing it as a --hivevar to the Hive script.
For example, in the shell script:
table_name=udr_lt_bc_$(date +'%Y_%m_%d' --date "-1 day")
#table_name is udr_lt_bc_2017_10_31 now
#call your script
hive -hivevar table_name="$table_name" -f your_script.hql
And then in your_script you can use variable:
CREATE TABLE default.${hivevar:table_name}
Note that '-' is not allowed in table names, which is why I used '_' instead.
For better understanding how Hive substitutes variables, try this:
hive> set hivevar:tab_dt= substr(date_sub(current_date,1),1,10);
hive> select ${hivevar:tab_dt};
OK
2017-10-31
Time taken: 1.406 seconds, Fetched: 1 row(s)
hive> select '${hivevar:tab_dt}';
OK
substr(date_sub(current_date,1),1,10)
Time taken: 0.087 seconds, Fetched: 1 row(s)
Note that in the first select statement the variable was substituted as-is before execution and then evaluated in the SQL. The second select statement prevents evaluation because the variable is quoted and remains as-is: substr(date_sub(current_date,1),1,10).
Another way in Hive:
select concat("table_",date_sub(from_unixtime(unix_timestamp(current_date,'yyyy-MM-dd'),'yyyy-MM-dd'),0));
Here, we can capture the above in a variable and use it as per our needs.
I wrote a basic update query:
Update WA SET WA.Time_Updated = Replace(Time_Updated, 'PM', ' ');
to which I don't get any real error message other than
Microsoft can't update 251 records etc due to type conversion error
There are 5000 records in there. I have the date column as Date/Time and all my other columns (non-dates) as Short Text. The query just does not update anything in the table; everything stays as it previously was. Any ideas?
Just convert your text times to Date values:
Select *, TimeValue([Time_Updated]) As TimeUpdated From WA
Then, when you display TimeUpdated, format the value as you like.
Alternatively, you can deal with the imported structure as it is.
Consider:
Hour("12:03:00 PM") + Minute("12:03:00 PM")/60 + Second("12:03:00 PM")/3600
This calculates to 12.05
So don't change the raw data; calculate in the query. Just use your field name in place of the static value in the expression.
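A minimal sketch of that query (assuming the text column is Time_Updated in table WA, as in the question; HoursDecimal is just an illustrative alias):
SELECT WA.*,
       Hour([Time_Updated]) + Minute([Time_Updated])/60 + Second([Time_Updated])/3600 AS HoursDecimal
FROM WA;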
Dates are stored as varchars in another table, like this:
select wkdocre from works;
wkdocre
-------------
+1654/12/31
+1706/12/31
+1667/12/31
-0332/12/31
-0332/12/31
-1295/12/31
And I want to insert these dates into another table with an attribute that is of type date like this
update ns_works set wor_workcreationdate=(select wkdocre from works where wor_workcreationdate=wkdocre);
I get this error
ERROR: operator does not exist: ns_workcreationdate = dateofcreation
LINE 1: ...lect wkdocre from works where wor_workcreationdate=wkdocre);
^
HINT: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Thank you.
Desired results
select wor_creationdate from ns_works;
wor_creationdate
-------------
1654/12/31
1706/12/31
1667/12/31
-0332/12/31
-0332/12/31
-1295/12/31
You need an explicit conversion; try something like this:
... SET wor_workcreationdate =
to_date(
(select wkdocre
from works
where wor_workcreationdate = to_date(wkdocre, 'YYYY/MM/DD')),
'YYYY/MM/DD'
)
Writing years BC with a minus sign is incorrect though; PostgreSQL will interpret -1295 as 1296 BC, since year 0 is actually 1 BC. You might want to fix your works table and use YYYY/MM/DD BC as format specifier.
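For example, a hedged sketch of parsing such a value once the minus sign has been rewritten as an era marker (the literal is illustrative, and the display depends on your DateStyle setting):
select to_date('1295/12/31 BC', 'YYYY/MM/DD BC');
-- should yield the date 1295-12-31 BC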
I have a query of the form
select id, extract(xml, 'Foo/Bar/text()') value
from mytable;
This query runs fine in SQL Developer, and returns what appears to be a char column.
However, the software that is calling this query isn't playing well with it, and is throwing the error:
java.lang.NoClassDefFoundError - oracle/xdb/XMLType
So in order to debug this, I want to know what exact datatype that extract statement is returning.
Is this possible?
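One hedged way to check, assuming you are allowed to create objects in the schema (the view and column names here are purely illustrative): wrap the query in a view and describe it; the reported column type is whatever extract returns.
create view v_extract_check (id, extracted) as
select id, extract(xml, 'Foo/Bar/text()')
from mytable;
-- then DESCRIBE v_extract_check in SQL Developer or SQL*Plus shows the column's datatype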
As said in the comments, extractValue() is deprecated. And if you use the .getStringVal() method on your extract-ed xmltype, you'll end up with &quot; instead of ", &amp; instead of &, and so on. Therefore, now, to the rescue ...
... the xmlcast() function!
Example 1 - not what we want
select xmltype('<xx><value>&amp;"</value></xx>').extract('/xx/value')
from dual;
... yields a result of xmltype type.
Example 2 - still not what we want
select xmltype('<xx><value>&amp;"</value></xx>').extract('/xx/value').getStringVal()
from dual;
... yields a string result of ...
<value>&amp;"</value>
Example 3 - close, but still not what we want
select xmltype('<xx><value>&amp;"</value></xx>').extract('/xx/value/text()').getStringVal()
from dual;
... yields a string result of ...
&amp;"
... whereas ...
Example 4 - the ultimate way
select xmlcast(xmltype('<xx><value>&amp;"</value></xx>').extract('/xx/value') as varchar2(4000))
from dual;
... yields the correct string result of ...
&"
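Applied to the original query, a hedged sketch (mytable, the xml column, and the Foo/Bar path all come from the question):
select id,
       xmlcast(extract(xml, 'Foo/Bar') as varchar2(4000)) as value
from mytable;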