Using date functions in the WHERE clause of SQL

My question is about a particular use case of SQL.
I am trying out Querybook and Presto-style SQL for a work project.
I worked out that Querybook follows a different dialect of SQL, namely Presto.
I want to write a SQL query which can:
give all entries from a table where created_at_unix_timestamp > unix_timestamp_of_past_hour
There are functions like FROM_UNIXTIME and TO_UNIXTIME which I'm experimenting with, but every example I've found uses these functions in the SELECT list, i.e. the part of the query where we describe the fields we want from a table.
Is it possible to write a query like this?
select *
from table
where to_unixtime(field_in_table) > some_unix_timestamp_value_calculated_at_runtime
I am not able to find any documentation around this.
An update:
When I try this, it gives an error.
Final Update:
It's working with this syntax:
select *
from events.table
where event_name = 'some_event_name'
  and consumer_timestamp > (CAST(to_unixtime(CAST(LOCALTIMESTAMP AS timestamp)) AS BIGINT) - 3590)
order by consumer_timestamp DESC
limit 10
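For reference, a shorter way to express "one hour ago" in Presto is to subtract an interval before converting to a unix timestamp. A sketch, reusing the events.table and consumer_timestamp names from the query above (which are placeholders from the question, not a real schema):

```sql
-- Presto: unix timestamp of exactly one hour ago, via interval arithmetic
SELECT *
FROM events.table
WHERE consumer_timestamp > to_unixtime(now() - INTERVAL '1' HOUR)
ORDER BY consumer_timestamp DESC
LIMIT 10
```

This avoids the nested CASTs: to_unixtime() accepts a timestamp directly, and subtracting an interval keeps the "one hour" explicit instead of encoding it as 3590 seconds.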

Related

SparkSQL cannot run a simple SQL query

I am working with a simple SparkSQL query:
SELECT
*,
(DATE + DURATION) AS EXPIRY_MONTH
FROM
loan
WHERE
EXPIRY_MONTH >= 12
where the first 10 lines of loan table are the following:
"loan_id";"account_id";"date";"amount";"duration";"payments";"status"
5314;1787;930705;96396;12;8033.00;"B"
5316;1801;930711;165960;36;4610.00;"A"
6863;9188;930728;127080;60;2118.00;"A"
5325;1843;930803;105804;36;2939.00;"A"
7240;11013;930906;274740;60;4579.00;"A"
6687;8261;930913;87840;24;3660.00;"A"
7284;11265;930915;52788;12;4399.00;"A"
6111;5428;930924;174744;24;7281.00;"B"
7235;10973;931013;154416;48;3217.00;"A"
This query works as intended with SQLite (meaning that the column EXPIRY_MONTH is added and the data are filtered on the condition EXPIRY_MONTH >= 12) but not with SparkSQL (Spark 3.1.0).
Specifically, the SparkSQL engine throws an error saying the EXPIRY_MONTH column does not exist.
How can I fix this query without resorting to subqueries?
What is the reason for this behaviour, and for the difference between SparkSQL and more standard SQL?
You are not able to run this query because in standard SQL the WHERE clause is evaluated before the SELECT list, so an alias defined in SELECT is not yet visible in WHERE; SQLite allows this as an extension, SparkSQL does not.
What you can do is repeat the same expression you use to create the column inside the WHERE clause, which lets you run the query without a subquery.
SELECT
*,
(DATE + DURATION) AS EXPIRY_MONTH
FROM
loan
WHERE
(DATE + DURATION) >= 12
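If the asker can relax the "no subqueries" constraint slightly, a common table expression also works in Spark SQL and defines the alias only once. A sketch over the same loan table:

```sql
-- Spark SQL: define EXPIRY_MONTH once in a CTE, then filter on it by name
WITH loan_with_expiry AS (
  SELECT *, (date + duration) AS EXPIRY_MONTH
  FROM loan
)
SELECT *
FROM loan_with_expiry
WHERE EXPIRY_MONTH >= 12
```

The trade-off is purely stylistic here; Spark's optimizer will typically produce the same plan for both forms.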

WHERE condition on new created Column in Impala

In my table, I have time information in UNIX time that I have converted to the proper time format using the following function in Impala:
cast(ts DIV 1000 as TIMESTAMP) as NewTime
Now I want to apply a WHERE condition on the newly created column "NewTime" to select data from a particular time period, but I am getting the following error:
"Could not resolve column/field reference: NewTime".
How can I apply a WHERE condition to a newly created column in Impala?
Thanks.
You can calculate it in an inner subquery and then use it for filtering.
select NewTime
from
(select cast(ts DIV 1000 as TIMESTAMP) as NewTime, ... from table) subq
where
subq.NewTime > now()
You can also use CTE like Gordon said.
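The CTE variant mentioned in the answer would look roughly like this (a sketch; my_table stands in for the actual table, and ts is the millisecond unix-time column from the question):

```sql
-- Impala: name the converted column in a CTE, then filter on it directly
WITH converted AS (
  SELECT CAST(ts DIV 1000 AS TIMESTAMP) AS NewTime
  FROM my_table
)
SELECT NewTime
FROM converted
WHERE NewTime > now()
```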

Get Data between two dates using Number as Identifier

I am trying to write an SQL query in MS SQL Server where I pass an account number to search for, and it gives me the data between two dates.
The code looks like this:
select *
from transactions
where accountNo1 = '2005457846' transaction_date between '15-01-2018' and '18-01-2018'
Apparently I am not doing something correctly; it tells me:
syntax error near transaction_date
You forgot to add AND in the query.
select *
from transactions
where accountNo1 = '2005457846' and transaction_date between '15-01-2018' and '18-01-2018'
Use "and", and also check the date format:
select * from transactions where accountNo1 ='2005457846' and transaction_date between '15-01-2018' and '18-01-2018'
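One more caveat worth noting: '15-01-2018' is parsed according to the session's DATEFORMAT/language settings in SQL Server, so it can fail or silently mean the wrong date. The unambiguous format for datetime literals is 'yyyymmdd'. A sketch of the same query using it:

```sql
-- SQL Server: 'yyyymmdd' literals are parsed the same regardless of DATEFORMAT
SELECT *
FROM transactions
WHERE accountNo1 = '2005457846'
  AND transaction_date BETWEEN '20180115' AND '20180118'
```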

Calculating difference of dates In Postgresql

I'm trying to find out the time between certain fields in my tables. However, because I'm using PostgreSQL, I can't use the DATEDIFF function, and I can't find any clear guides or tutorials showing how to do a similar thing in Postgres, so I need help doing the same thing there.
I'm assuming this query would work if I was using an RDBMS that supported DATEDIFF, so basically my question is: how can I change it so it works using the features provided by PostgreSQL?
SELECT Question.ID,
Question.Status, COUNT (qUpdate.ID) AS NumberofUpdates,
DATEDIFF (Question.LoggedTime,MIN(qUpdate.UpdateTime)) AS TimeBeforeFirstUpdate,
DATEDIFF(Question.LoggedTime, MAX(qUpdate.UpdateTime)) AS TimeBeforeLastUpdate
FROM qUpdate
LEFT JOIN Question ON qUpdate.qID=Question.ID
WHERE Question.Status = 'closed' AND qUpdate.Update IS NOT NULL
GROUP BY Question.Status, Question.ID, Question.LoggedTime;
If you need more info or any clarification, I'll respond ASAP.
You don't need a "datediff" function.
Just subtract the two dates:
Question.LoggedTime - MIN(qUpdate.UpdateTime)
In case you don't know, all of this is documented online:
http://www.postgresql.org/docs/current/static/functions-datetime.html
You can use the age(<date1>, <date2>) function (instead of DATEDIFF).
This should work -
SELECT Question.ID,
Question.Status, COUNT (qUpdate.ID) AS NumberofUpdates,
age(Question.LoggedTime,MIN(qUpdate.UpdateTime)) AS TimeBeforeFirstUpdate,
age(Question.LoggedTime, MAX(qUpdate.UpdateTime)) AS TimeBeforeLastUpdate
FROM qUpdate
LEFT JOIN Question ON qUpdate.qID=Question.ID
WHERE Question.Status = 'closed' AND qUpdate.Update IS NOT NULL
GROUP BY Question.Status, Question.ID, Question.LoggedTime;
Note, if psql gives you this error - ERROR: date/time field value out of range, then you would need to choose an appropriate datestyle.
SELECT extract(year from age('2014-01-23', '1985-08-27'));
-- returns 28 years, my current age.
This gives you the time difference in seconds:
select extract(epoch from to_date::timestamp - from_date::timestamp)
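To make the mechanics concrete: subtracting two timestamps in PostgreSQL yields an interval, and extract(epoch from ...) converts that interval into seconds. A minimal illustration with literal timestamps:

```sql
-- PostgreSQL: timestamp subtraction yields an interval;
-- extract(epoch ...) converts the interval to seconds
SELECT EXTRACT(epoch FROM TIMESTAMP '2014-01-23 12:00:00'
                        - TIMESTAMP '2014-01-23 11:00:00') AS seconds;
-- seconds = 3600
```

The to_date/from_date names in the answer above are the asker's column names, not built-in functions.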

How to get data whose expiry falls within 45 days?

Hi all,
I have one SQL table, and its fields are:
id
name
expireydate
Now I want only those records which expired within 45 days or 30 days.
How can I do this with an SQL query?
I don't have much experience with SQL.
Thanks in advance.
If you are using MySQL then try DATEDIFF.
For 45 days:
select * from `table` where DATEDIFF(now(),expireydate)<=45;
For 30 days:
select * from `table` where DATEDIFF(now(),expireydate)<=30;
In Oracle, plain date subtraction will do the trick instead of DATEDIFF, and SYSDATE instead of now(). [not sure]
In SQL Server, DATEDIFF is quite different: you have to provide the unit in which the difference between the two dates is taken:
DATEDIFF(datepart, startdate, enddate)
To get the current date, try one of these: CURRENT_TIMESTAMP, GETDATE(), or {fn NOW()}.
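Putting those pieces together, a SQL Server sketch of the 45-day filter (mytable and expireydate taken from the question; reading "within 45 days" as expiring in the next 45 days is an assumption on my part):

```sql
-- SQL Server: rows whose expiry date falls within the next 45 days
SELECT *
FROM mytable
WHERE DATEDIFF(day, GETDATE(), expireydate) BETWEEN 0 AND 45
```

Swap 45 for 30 for the 30-day variant, and flip the DATEDIFF arguments if you actually want rows that already expired in the last 45 days.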
You can use a simple SELECT * FROM yourtable WHERE expireydate < "some formula calculating today + 30 or 45 days".
A simple comparison will work there; the tricky part is writing that last bit, the date you want to compare against. It depends on your environment and on how "expireydate" is stored in the database.
Try the below (pseudocode):
SELECT * FROM MYTABLE WHERE (expireydate in days) < ((CURRENTDATE in days) + 45)
Do not execute this directly! Depending on your database, the way of obtaining a date in days will differ. Look at your database manual, or please specify which database you are using.