Hive syntax -- Comparison in CASE WHEN - hive

I am using Hive to do a comparison in a CASE WHEN THEN statement. Can you please check whether my syntax is correct?
${hiveconf:Test Metric} METRIC_ID,
CASE
WHEN ((A.X, A.Y, A.Z) IN (SELECT X, Y, Z FROM HIVE_TPCE_TEMP.TESTTABLE))
THEN CASE
WHEN MODE IN ('A','N')
THEN ${hiveconf:SOME_CONSTANT}
ELSE ${hiveconf:SOME_CONSTANT}
END

I'm guessing your snippet of code is from the SELECT clause of your query? According to the Hive Language Manual: "Hive supports subqueries only in the FROM clause".
Your CASE WHEN statement includes a subquery. Seems like that is not supported, so your syntax is not correct (in Hive).
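Since the subquery can only appear in the FROM clause, one common workaround is to turn the IN test into a join and check for a match inside the CASE. This is only a sketch: SOURCE_TABLE and RESULT_COL are assumed names, not from the original query.
SELECT
CASE
WHEN T.X IS NOT NULL  -- the row (A.X, A.Y, A.Z) was found in TESTTABLE
THEN CASE
WHEN A.MODE IN ('A','N')
THEN ${hiveconf:SOME_CONSTANT}
ELSE ${hiveconf:SOME_CONSTANT}
END
END RESULT_COL
FROM SOURCE_TABLE A
LEFT OUTER JOIN (SELECT DISTINCT X, Y, Z FROM HIVE_TPCE_TEMP.TESTTABLE) T
ON A.X = T.X AND A.Y = T.Y AND A.Z = T.Z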


HIVE: cannot recognize input near 'distinct' '('

I am trying to execute the below query in Hive:
SELECT
regexp_replace('2016-08-05_11:29:46', '\\_', ' ') as tmstmp,
distinct(P.name)
FROM table P;
It throws an exception saying cannot recognize input near 'distinct' '(' 'P' in selection target.
whereas when I run the query interchanging the columns, like:
SELECT
distinct(P.name),
regexp_replace('2016-08-05_11:29:46', '\\_', ' ') as tmstmp
FROM table P;
It works fine. Any idea on the issue?
To my knowledge, this is a restriction imposed by Hive in its SELECT syntax.
As per the SELECT syntax in the Hive language manual, DISTINCT should come first, followed by the other expressions.
Reference:
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Select
I guess the reason is that DISTINCT is a row-level operation (even if it is written as a function call on a single column), and in Hive specifically it is implemented as a MapReduce operation.
Similar behavior can be observed in ANSI-SQL-compliant database engines such as MySQL as well.
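For completeness, a sketch of the reordered form of the failing query; note that DISTINCT applies to every expression in the select list, not just P.name (here the second expression is a constant, so the result is unchanged):
SELECT DISTINCT
P.name,
regexp_replace('2016-08-05_11:29:46', '\\_', ' ') as tmstmp
FROM table P;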

Update from statement in hive

I have 2 tables with identical schemas in Hive.
tbl1(a,b,c)
tbl2(a,b,c)
I want to update tbl1 as follows
update x from tbl1 x,tbl2 y
set x.c=y.c
where x.a=y.a
and x.b=y.b
But this didn't run on Hive.
What's the best way to achieve this?
This is a bit long for a comment.
Hive does not support updating one table using values from another (at least with the update statement).
The syntax for the statement is:
UPDATE tablename
SET column = value [, column = value ...]
[WHERE expression]
In particular, value cannot be a subquery:
The value assigned must be an expression that Hive supports in the
select clause. Thus arithmetic operators, UDFs, casts, literals, etc.
are supported. Subqueries are not supported.
Do note that subqueries are allowed in the where clause.
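A common workaround (only a sketch; it assumes tbl1 is not a transactional table and can simply be rewritten, and that (a, b) is unique in tbl2) is to rebuild the table with INSERT OVERWRITE and a join, taking c from tbl2 when the keys match and keeping the old value otherwise:
INSERT OVERWRITE TABLE tbl1
SELECT
x.a,
x.b,
COALESCE(y.c, x.c) AS c  -- tbl2's value when a match exists, otherwise tbl1's own
FROM tbl1 x
LEFT OUTER JOIN tbl2 y
ON x.a = y.a
AND x.b = y.b;
If (a, b) is not unique in tbl2, the join would multiply rows, so de-duplicate tbl2 first.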

MSSQL Regular expression

I have the following REGEX: ^[-A-Za-z0-9/.]+$
This currently checks whether the value entered into a textbox matches this. If not, it throws an error.
I need to check whether anything has already gone into the database that doesn't match this.
I have tried:
SELECT * FROM *table* WHERE ([url] NOT LIKE '^[-A-Za-z0-9/.]+$')
SELECT * FROM *table* WHERE PATINDEX ('^[-A-Za-z0-9/.]+$', [url])
UPDATE
So after a bit of research I've realised I don't think I can use REGEXP.
I thought I could do something like this? It's not giving me the expected results, but it does run, unlike anything else I've tried. Can anyone spot anything wrong with it?
SELECT *,
CASE WHEN [url] LIKE '^[-A-Za-z0-9/.]+$'
THEN 'Match'
ELSE 'No Match'
END Validates
FROM
*table*
This is what I have used in the end:
SELECT *,
CASE WHEN [url] NOT LIKE '%[^-A-Za-z0-9/.+$]%'
THEN 'Valid'
ELSE 'No valid'
END [Validate]
FROM
*table*
ORDER BY [Validate]
Disclaimer: The original question was about MySQL. The SQL Server answer is below.
MySQL
In MySQL, the regex syntax is the following:
SELECT * FROM YourTable WHERE (`url` NOT REGEXP '^[-A-Za-z0-9/.]+$')
Use the REGEXP operator instead of LIKE. The latter is for pattern matching using the % and _ wildcards.
SQL Server
Since the MySQL tag was a typo and you're actually using SQL Server, you'll have to create a user-defined CLR function to expose regex functionality.
Take a look at this article for more details.
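If a full regex isn't required, LIKE's bracket character classes can approximate the check natively in T-SQL. A sketch, assuming the allowed set is exactly the one in the question's regex (the + and $ are regex metacharacters, so they don't belong inside the LIKE class, and the hyphen goes last so it is read literally rather than as a range):
SELECT *,
CASE WHEN [url] NOT LIKE '%[^A-Za-z0-9/.-]%'  -- no character outside the allowed set
THEN 'Valid'
ELSE 'Not valid'
END AS [Validate]
FROM *table*
ORDER BY [Validate]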
As noted above, the question was originally about MySQL.
Use REGEXP, not LIKE:
SELECT * FROM `table` WHERE (`url` NOT REGEXP '^[-A-Za-z0-9/.]+$')

BigQuery equivalent of COALESCE()?

I'm in the process of converting some aggregate queries from Postgres to our new architecture in BigQuery. Is there an equivalent of COALESCE() in BigQuery?
Currently, I am converting a Postgres query statement like
coalesce(column1,'DEFAULT')
to
CASE
WHEN column1 IS NOT NULL
THEN column1
ELSE 'DEFAULT'
END AS column1
which seems easy enough.
However, converting a Postgres query statement with nested coalesce calls like
count(distinct coalesce(
coalesce(
coalesce(column1, column2),
column3),
column4))
would get much more messy if I used CASE statements all over the place, and also seems like the wrong thing to do.
Does BigQuery have a method equivalent to COALESCE(), or am I stuck writing the whole CASE statement equivalent?
You can use the IFNULL function in BigQuery, which can be nested like this:
select ifnull(column1,
ifnull(column2,'DEFAULT'))
from
(select string(NULL) as column1, 'y' as column2)
P.S. The omission of COALESCE in BigQuery is an oversight; I will fix it.
Update: As of 4/16/2015, COALESCE is available in BigQuery.
https://cloud.google.com/bigquery/docs/reference/standard-sql/functions-and-operators#conditional_expressions
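With COALESCE available (and, as in Postgres, accepting more than two arguments), the nested expression from the question can be flattened. A sketch, where your_dataset.your_table is a placeholder:
SELECT COUNT(DISTINCT COALESCE(column1, column2, column3, column4)) AS distinct_values
FROM your_dataset.your_table;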

Oracle SQL Syntax: With clause

I'm currently using the Java version of General SQL Parser for Oracle on some relatively complex Oracle SQL queries.
Since I have no access to an Oracle DB and only have the SQL statements in a file, I run into some statements where the parser fails; one in particular boils down to the following.
select id from (
with foo as (
select bar from sometable
)
select *
from foo
)
The WITH clause can be parsed without problem if it is not nested.
with foo as (
select bar from sometable
)
select *
from foo
So do I have a bug in the parser or in the statement?
Best,
Will
The SQL statement is valid, so I guess the parser just can't handle it.
To be sure, try running the SQL in SQL*Plus.
This is a perfectly valid statement in Oracle (I just tried it).
But it might not be valid ANSI SQL and that might be the reason why the parser doesn't understand it.
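If the parser only accepts the stricter ANSI form, an equivalent rewrite (just a sketch mirroring the question's boiled-down statement) is to hoist the WITH clause to the top level of the query:
with foo as (
select bar from sometable
)
select id from (
select *
from foo
)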