ActiveRecord, select on string compare - SQL

I have a table called Basic, where start_time is a field of type VARCHAR(5) that actually stores 5 bytes of time data: bytes 0 and 1 encode the year, month, and day, and bytes 2 to 4 encode the hour, minute, and second. So it is possible that bytes 2, 3, and 4 are all 0. I want to run the following query:
Basic.find(:all, :conditions => ["start_time > ? AND end_time < ?", start_time, end_time])
Here are the questions:
Suppose that in the VARCHAR(5) format the start time is [214, 222, 0, 0, 0] (Jun 24th, 2009) and the end time is [214, 223, 0, 0, 0] (Jun 25th, 2009).
As ActiveRecord maps VARCHAR(5) to String, the start_time and end_time in the above query should also be Strings. What is the correct way to convert the VARCHAR(5) time above to a String?
I did it this way, but it fails to produce the correct result:
tmp = [214, 222, 0, 0, 0].map { |t| t.to_s(16) }; start_time = tmp.to_s
I was using the sqlite3 adapter for ActiveRecord.
Thanks for your help.
I have found where the problem is: start_time must not contain "\000" when running the following query:
Basic.find(:all, :conditions => ["start_time > ? AND end_time < ?", start_time, end_time])
So, I need to do two steps:
[214, 222, 0, 0, 0] -> [214, 222]
[214, 222] -> "\326\336"
The first step can be done with:
a = [214, 222, 0, 0, 0]
a.pop while a.last == 0
The second step can be done with:
a = [214, 222]
a.pack("C*") # "C*" packs each element as an unsigned byte
However, I still cannot run the query when start_time = [214, 222, 0, 23, 0], because the corresponding string contains "\000" in the middle. Fortunately, this is not a big problem in our application, as we never query at the hour level, which means the last two numbers will always be 0.
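Putting the two steps together (a small sketch; the time_key helper name is mine):
# Convert the 5-byte time array to a comparable string, dropping trailing
# zero bytes so the result contains no "\000".
def time_key(bytes)
  bytes = bytes.dup
  bytes.pop while bytes.last == 0
  bytes.pack("C*")
end
start_time = time_key([214, 222, 0, 0, 0]) # => "\326\336"
end_time   = time_key([214, 223, 0, 0, 0]) # => "\326\337"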

Is this what you're after?
[214, 222, 0, 0, 0].inject(""){|s,c| s << c.chr; s} # => "\326\336\000\000\000"
As to why I know this... I wanted to say MLM on Twitter once, without actually saying it, because if you actually say it you get auto-followed by all these MLM spammers. So instead I said
[77, 76, 77].inject(""){|s,c| s << c.chr; s}
In your case, though, it seems like a lot of work just to avoid using a timestamp.
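For completeness, pack (already used in the question) does the same bytes-to-string step in one call, and unpack reverses it:
bytes = [214, 222, 0, 0, 0]
s = bytes.pack("C*") # => "\326\336\000\000\000"
s.unpack("C*")       # => [214, 222, 0, 0, 0]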

Metaprogramming in DolphinDB: how to pass a function in sqlCol

I'm trying something with the function sqlUpdate, where the where condition I pass in is generated with metaprogramming. I got an error:
::evaluate(sqlUpdate(t1, u, , w)) => Unrecognized column name date(date)
Here’s my full script:
t1=table(`A`A`B`B as symbol, 2021.04.15 2021.04.16 2021.04.15 2021.04.16 as date, 12 13 21 22 as price)
updCol = `price
dateCol = `date
u = sqlColAlias(expr(sqlCol(updCol), +, -1), updCol)
w = expr(sqlCol("date("+dateCol+")"), ==, 2021.04.15)
sqlUpdate(table=t1, updates=u, where=w).eval()
What is the problem here?
If you execute sqlCol("date("+dateCol+")") independently, the result is:
< date(date) >
It may seem correct at first glance. However, since the first argument of sqlCol is a column name, the whole string date(date) is parsed as a column name.
Remember that the second argument of sqlCol is a function, so we can write sqlCol(dateCol, date). This way, "date" is recognized as the column name and the second argument is treated as the function to be applied to the "date" column.
The parsing result:
< date(date) as date >
Full script:
t1=table(`A`A`B`B as symbol, 2021.04.15 2021.04.16 2021.04.15 2021.04.16 as date, 12 13 21 22 as price)
updCol = `price
dateCol = `date
u = sqlColAlias(expr(sqlCol(updCol), +, -1), updCol)
w = expr(sqlCol(dateCol, date), ==, 2021.04.15)
sqlUpdate(table=t1, updates=u, where=w).eval()

SQL: Store and query `cron` field

I'm building a cron-as-a-service that allows users to input a cron expression and have some task run periodically.
Here's a simple version of my table:
create table users (
  id serial not null primary key,
  -- `text` seems the most straightforward data type, any other suggestions?
  cron text not null,
  constraint valid_cron_expression CHECK (cron ~* '{some_regex}'),
  -- snip --
  -- maybe some other fields, like what to do on each cron call
)
My backend runs a SQL query every minute. How can I query all the rows whose cron field matches the current timestamp (rounded to the nearest minute)?
Edit: users input the cron field as a cron expression, e.g. 5 4 * * *.
Edit 2: corrected the fact that cron time resolution is minute, not second.
First of all, you don't need to query every second, because cron has only one-minute resolution.
Next, comparing a cron scheduler expression to a timestamp is not a trivial task.
I'm not aware of any PostgreSQL module that would be able to parse the cron expressions.
There are two options: either you write your own function to do the comparison, or you use an external library in the programming language you are using and do the comparison outside of the database.
Here you will find an example implementation of such a function for Oracle that could easily be ported to PostgreSQL: SQL Query to convert cron expression to date/time format
It is incomplete, because it doesn't handle complex expressions like */5 or 5,10,15 for the individual fields of the cron expression, but it is where I would start.
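For the second option, doing the comparison in the backend, here is a minimal Ruby sketch assuming the fugit gem for cron parsing (the gem choice and the row layout are illustrative, not part of the question):
require 'fugit'
# Rows as [id, cron_expression], e.g. loaded from the users table.
rows = [[1, '5 4 * * *'], [2, '*/5 * * * *']]
now = Time.now
due = rows.select do |_id, expression|
  cron = Fugit::Cron.parse(expression) # nil if the expression is invalid
  cron && cron.match?(now)             # true when the expression matches this minute
end
due.each { |id, _| puts "run job for user #{id}" }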
You might need to adjust some functions to match your SQL dialect, but this seems to work. Your only input here is the value of ETL_CRON, and the output is a boolean RUN_ETL.
The example returns TRUE at every 12th minute from 3 through 59, past every hour from 2 through 22, on Monday through Friday:
select
'3/12 2-22 * * 1,2,3,4,5' as etl_cron
, split_part(etl_cron,' ',1) as etl_minute
, split_part(etl_cron,' ',2) as etl_hour
, split_part(etl_cron,' ',3) as etl_daymonth
, split_part(etl_cron,' ',4) as etl_month
, split_part(etl_cron,' ',5) as etl_dayweek
, convert_timezone('Europe/Amsterdam',current_timestamp) as etl_datetime
, case
when etl_minute = '*' then true
when contains(etl_minute,'-') then minute(etl_datetime) between split_part(etl_minute,'-',1) and split_part(etl_minute,'-',2)
when contains(etl_minute,'/') then mod(minute(etl_datetime),split_part(etl_minute,'/',2)) - split_part(etl_minute,'/',1) = 0
else array_contains(minute(etl_datetime)::varchar::variant, split(etl_minute,','))
end as run_minute
, case
when etl_hour = '*' then true
when contains(etl_hour,'-') then hour(etl_datetime) between split_part(etl_hour,'-',1) and split_part(etl_hour,'-',2)
when contains(etl_hour,'/') then mod(hour(etl_datetime),split_part(etl_hour,'/',2)) - split_part(etl_hour,'/',1) = 0
else array_contains(hour(etl_datetime)::varchar::variant, split(etl_hour,','))
end as run_hour
, case
when etl_daymonth = '*' then true
when contains(etl_daymonth,'-') then day(etl_datetime) between split_part(etl_daymonth,'-',1) and split_part(etl_daymonth,'-',2)
when contains(etl_daymonth,'/') then mod(day(etl_datetime),split_part(etl_daymonth,'/',2)) - split_part(etl_daymonth,'/',1) = 0
else array_contains(day(etl_datetime)::varchar::variant, split(etl_daymonth,','))
end as run_daymonth
, case
when etl_month = '*' then true
when contains(etl_month,'-') then month(etl_datetime) between split_part(etl_month,'-',1) and split_part(etl_month,'-',2)
when contains(etl_month,'/') then mod(month(etl_datetime),split_part(etl_month,'/',2)) - split_part(etl_month,'/',1) = 0
else array_contains(month(etl_datetime)::varchar::variant, split(etl_month,','))
end as run_month
, case
when etl_dayweek = '*' then true
when contains(etl_dayweek,'-') then dayofweek(etl_datetime) between split_part(etl_dayweek,'-',1) and split_part(etl_dayweek,'-',2)
when contains(etl_dayweek,'/') then mod(dayofweek(etl_datetime),split_part(etl_dayweek,'/',2)) - split_part(etl_dayweek,'/',1) = 0
else array_contains(dayofweek(etl_datetime)::varchar::variant, split(etl_dayweek,','))
end as run_dayweek
, run_minute and run_hour and run_daymonth and run_month and run_dayweek as run_etl;
This addresses the original version of the question.
Assuming that cron is a timestamp of some sort, you would use:
where cron >= date_trunc('second', now()) and
      cron < date_trunc('second', now()) + interval '1 second'

Convert "YYYYMMDD" format string to date in MDX?

I have an issue with applying date-related functions to a "YYYYMMDD"-format string in MDX. For example, take this query:
with
member foo as WEEKDay("2013-03-21")
select
foo on 0
from
[Some Cube]
It will correctly output "5" for foo in SSMS. But if I change the second line to:
member foo as WEEKDay("20130321")
Unfortunately, it throws a "type mismatch" error.
So what I want to do is convert the string to some recognizable date format and then apply the function to it. Any ideas about the easiest way, e.g. using existing functions?
Please note that the string actually comes from members of whatever cube the MDX is running on, so the format could already be recognizable, e.g. "YYYY-MM-DD". A hard-coded string conversion algorithm may therefore not work.
The topic is old, but maybe this will help someone. The technique is quite brute-force, but it scales.
with
member foo_false as WeekDay("20130321")
member foo_true as WeekDay("2013-03-21")
member foo_brute as
case when IsError(WeekDay("20130321"))=False then WeekDay("20130321") else
case
/* YYYYMMDD */
when
IsError(WeekDay("20130321"))=True AND IsNumeric("20130321")=True
and IsError(WeekDay(Left("20130321",4)+'-'+Right(Left("20130321",6),2)+'-'+Right("20130321",2)))=False
then WeekDay(Left("20130321",4)+'-'+Right(Left("20130321",6),2)+'-'+Right("20130321",2))
/* DDMMYYYY */
when
IsError(WeekDay("20130321"))=True AND IsNumeric("20130321")=True
and IsError(WeekDay(Right("20130321",4)+'-'+Right(Left("20130321",4),2)+'-'+Left("20130321",2)))=False
then WeekDay(Right("20130321",4)+'-'+Right(Left("20130321",4),2)+'-'+Left("20130321",2))
/* MMDDYYYY */
when
IsError(WeekDay("20130321"))=True AND IsNumeric("20130321")=True
and IsError(WeekDay(Right("20130321",4)+'-'+Left("20130321",2)+'-'+Right(Left("20130321",4),2)))=False
then WeekDay(Right("20130321",4)+'-'+Left("20130321",2)+'-'+Right(Left("20130321",4),2))
/* Unsupported Message */
else "Unsupported Format" end
end
select
{foo_false,foo_true,foo_brute} on 0
from
[DATA Cube]
Add more supported formats at the end, before the "Unsupported" branch, and use any input string in place of "20130321".
But the easiest way, if possible, is to convert the value in another layer (e.g. with the SQL function CONVERT) before it is inserted into the MDX.
The VBA function IsDate can be used to check whether the passed date is well formatted. If it is not, build the date first using DateSerial and Mid, then use it:
with
member foo as "20130321"
member bar as
iif(vba!isdate(foo) = TRUE,
WEEKDay(foo), // if the date is already well formatted, it can be used directly
WEEKday(vba!dateserial(vba!mid(foo, 1, 4), vba!mid(foo, 5, 2), vba!right(foo, 2))))
select
bar on 0
from
[some cube]
EDIT
The above code can be modified to accommodate other checks such as MMDDYYYY or DDMMYYYY, but in a lot of cases it is impossible for the engine to know intuitively whether the passed value is YYYYMMDD, DDMMYYYY, or MMDDYYYY. Take for example the string 11111111.
This could be in any of these formats, as it is a valid date however you break it up.
I would suggest having another member that stores the date format as well. That way, by looking at this member, the string can be parsed.
e.g.
with
member foo as
// "20130321" //YYYYMMDD
// "03212013"//MMDDYYYY
"21032013"//DDMMYYYY
MEMBER dateformat as "ddmmyyyy"
member bar as
iif(vba!isdate(foo) = TRUE,
WEEKDay(foo),
IIF(dateformat = "yyyymmdd", //YYYYMMDD
WEEKday(vba!dateserial(vba!mid(foo, 0, 4), vba!mid(foo, 5, 2), vba!right(foo, 2))),
IIF(dateformat = "mmddyyyy", //MMDDYYYY
WEEKday(vba!dateserial(right(foo, 4), vba!mid(foo, 0, 2), vba!mid(foo, 3, 2))),
IIF(dateformat = "ddmmyyyy", //DDMMYYYY
WEEKday(vba!dateserial(right(foo, 4), vba!mid(foo, 3, 2), vba!mid(foo, 0, 2))),
null
)
)
)
)
select
bar on 0
from
[aw cube]
With FooMember being an int here, representing a yyyyMMdd date, I could convert it to a date using the following code:
=Format(CDate(Left(CSTR(Fields!FooMember.Value),4)
+ "-" + Mid(CSTR(Fields!FooMember.Value), 5, 2)
+ "-" + Right(CSTR(Fields!FooMember.Value), 2)),
"dd/MM/yyyy")
and use it in a cube with the following code:
Format(CDate(Left([Measures].[FooMember],4)
+ "-" + Mid([Measures].[FooMember], 5, 2)
+ "-" + Right([Measures].[FooMember], 2)),"yyyy/MM/dd")

Difference between PostgreSQL time and Rails 3 app time

Scenario: I want to get the nearest time in relation to my system time.
I have written the following method inside my model:
def self.nearest_class(day, teacher_id, hour_min)
  day_of_week = '%' + day + '%'
  if hour_min == "06:00:00"
    where('frequency like ? AND teacher_id = ? AND (start_time >= ?)', day_of_week, teacher_id, hour_min).order('start_time ASC')
  else
    where('frequency like ? AND teacher_id = ? AND (start_time >= localtime(0))', day_of_week, teacher_id).order('start_time ASC')
  end
end
When I tested the above method in the rails console with Lecture.nearest_class('Monday', 1, 'localtime(0)'), the following query was generated, and it returned some data:
SELECT "lectures".* FROM "lectures" WHERE (frequency like '%Monday%' AND teacher_id = 1 AND (start_time >= localtime(0))) ORDER BY start_time ASC
But I was expecting no records, because my system time is later than any start_time recorded in the database. I ran the query from the console in pgAdmin III to test whether the results are the same; pgAdmin III showed no results, as expected.
Are there differences between PostgreSQL time and Rails app time? How can I check these differences? I tried Time.now in the console and it is the same as SELECT LOCALTIME(0) in pgAdmin III. What is the proper way to get the records nearest to the system time?
To further set the context, here's my controller method:
def showlectures
  @teacher = Teacher.login(params[:id]).first
  @lecture_by_course = nil
  if @teacher
    WEEKDAY.each do |day|
      hour_min = "06:00:00"
      if day == Time.now.strftime("%A") # check if the day of the week is the same, e.g. 'Monday'
        hour_min = "localtime(0)" # system time with format HH:MM:SS
      end
      @lecture_by_course = Lecture.nearest_class(day, @teacher.id, hour_min).first
      if @lecture_by_course
        break
      end
    end
  end
  render json: @lecture_by_course, include: { course: { only: [:name, :id] } }
end
I solved this by just passing the server's Time.now instead of having SQL check the time. (d'oh)
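In other words, the model always binds a plain time string and the controller passes Time.now (a minimal sketch of that fix, reusing the model above):
# Model: no more branching on a SQL time function.
def self.nearest_class(day, teacher_id, hour_min)
  where('frequency like ? AND teacher_id = ? AND start_time >= ?',
        "%#{day}%", teacher_id, hour_min).order('start_time ASC')
end
# Controller side: pass the app server's current time instead of "localtime(0)".
hour_min = day == Time.now.strftime('%A') ? Time.now.strftime('%H:%M:%S') : '06:00:00'
@lecture_by_course = Lecture.nearest_class(day, @teacher.id, hour_min).first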

AVG is not taking null values into consideration

I have loaded the following test data:
name,age,gender
"John",33,m
"Sam",33,m
"Julie",33,f
"Jimbo",,m
with schema name:STRING, age:INTEGER, gender:STRING, and I have confirmed that the Jimbo row shows a null in the "age" column in the BigQuery Browser Tool > mydataset > Details > Preview section.
When I run this query:
SELECT AVG(age) FROM [peterprivatedata.testpeople]
I get 24.75, which is incorrect. I expected 33, because the documentation for AVG says "Rows with a NULL value are not included in the calculation." (24.75 is exactly (33 + 33 + 33 + 0) / 4, so the NULL age appears to be treated as 0.)
Am I doing something wrong, or is this a known bug? (I don't know if there's a public issues list to check.) What's the simplest workaround?
This is a known bug where we coerce null numeric values to 0 on import. We're currently working on a fix. These values do, however, show up as not defined (which for various reasons is different from null), so you can check for IS_EXPLICITLY_DEFINED. For example:
SELECT sum(if(is_explicitly_defined(numeric_field), numeric_field, 0)) /
sum(if(is_explicitly_defined(numeric_field), 1, 0))
AS my_avg FROM your_table
Alternatively, you could use another column to represent is_null. Then the query would look like:
SELECT sum(if(numeric_field_is_null, 0, numeric_field)) /
sum(if(numeric_field_is_null, 0, 1))
AS my_avg FROM your_table