Not getting results even with long datetime formats in SQL

I am trying to get some data that has a datetime set, using the long format (e.g. 2019-04-26 18:02:42).
When I use the following query, I expect to find the entry below:
SELECT ip, cam_id
FROM test_table
WHERE ( date_time >= '2019-04-26 20:00:00' AND date_time <'2019-04-26 20:59:59' );
Entry:
id | ip | cam_id | date_time
-----+----+--------+---------------------
1 | 13 | 2 | 2019-04-26 20:46:06
However, I am not getting any results. What am I doing wrong?
EDIT: Table schema
Column | Type | Collation | Nullable | Default | Storage | Stats target | Description
-----------+------------------------+----------------------+---------------------+----------------------------------------+----------+--------------+----------
id | integer | | not null | nextval('test_table_id_seq'::regclass) | plain | |
ip | integer | | | | plain | |
cam_id | integer | | | | plain | |
date_time | character varying(255) | | | | extended | |
Indexes:
"test_table_pkey" PRIMARY KEY, btree (id)

If you are new to PostgreSQL, you should start by reading the PostgreSQL manual and its examples. Don't use third-party SQL generators or code unrelated to PostgreSQL; they will only confuse you.
Currently, your query is comparing strings, not datetimes.
If you run the following statement, it will change the date_time column from character varying(255) to timestamp, and your query will then run properly:
alter table test_table alter column date_time TYPE timestamp without time zone using date_time::timestamp without time zone;
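Once the column is a real timestamp, the original range predicate matches the sample row. A quick check, reusing the literals from the question (with the upper bound nudged to a half-open 21:00:00 so the final second of the hour is not skipped):
SELECT ip, cam_id
FROM test_table
WHERE date_time >= '2019-04-26 20:00:00'
  AND date_time <  '2019-04-26 21:00:00';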

You have to convert the string to a datetime before comparing. Note that CONVERT(datetime, YOUR_COLUMN, style) is SQL Server syntax; the PostgreSQL equivalent is a cast or to_timestamp():
YOUR_COLUMN::timestamp
to_timestamp(YOUR_COLUMN, 'YYYY-MM-DD HH24:MI:SS')

Related

SQL issue with specific timestamp

I am currently trying to optimize some workflows here. One of our workflows involves calculating a time offset in hours from a given date, and that involves selecting from a number of tables and applying some business logic. That part of the problem is fairly well solved. What I am trying to do is to calculate a final timestamp based upon a timestamp value and an offset (in hours).
My source table looks like:
MariaDB [ingest]> describe tmp_file_3;
+---------------+---------------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+---------------+---------------------+------+-----+---------+-------+
| mci_idx | bigint(20) unsigned | YES | | NULL | |
| mcg_idx | bigint(20) unsigned | YES | | NULL | |
| ingested_time | timestamp | YES | | NULL | |
| hours_persist | int(11) | YES | | NULL | |
| active | tinyint(1) | YES | | NULL | |
+---------------+---------------------+------+-----+---------+-------+
And I am populating my new table with the following SQL:
MariaDB [ingest]> insert into master_expiration_index (select mci_idx, TIMESTAMPADD(HOUR, hours_persist, ingested_time) as expiration_time from tmp_file_3 where active=1);
ERROR 1292 (22007): Incorrect datetime value: '2023-03-12 02:20:15' for column `ingest`.`master_expiration_index`.`expiration_time` at row 347025
The SQL is correct to my understanding, since if I add a LIMIT 10 the query executes without any issues. The questions I have are:
What is wrong with that datetime value? It appears to be in the correct format.
How do I figure out which row is causing the issue?
How do I fix this in the general case?
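One relevant detail, offered as a hedged observation: 2023-03-12 02:20:15 falls inside the US daylight-saving spring-forward gap (02:00 to 03:00 local time on 2023-03-12 does not exist), and MariaDB rejects such local times when converting them for storage into a TIMESTAMP column. Assuming the session time zone observes that transition, a sketch for locating the offending source rows:
SELECT mci_idx, ingested_time, hours_persist,
       TIMESTAMPADD(HOUR, hours_persist, ingested_time) AS expiration_time
FROM tmp_file_3
WHERE active = 1
  AND TIMESTAMPADD(HOUR, hours_persist, ingested_time) >= '2023-03-12 02:00:00'
  AND TIMESTAMPADD(HOUR, hours_persist, ingested_time) <  '2023-03-12 03:00:00';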

Cast VARCHAR columns to int, bigint, time, etc (PL/pgSQL)

Problem
(This is for an open-source analytics library.)
Here's our query results from events_view:
id | visit_id | name | prop0 | prop1 | url
------+----------+--------+----------------------------+-------+------------
2004 | 4 | Magnus | 2021-10-26 02:25:55.790999 | 142 | cnn.com
2007 | 4 | Hartis | 2021-10-26 02:26:37.773999 | 25 | fox.com
Currently all columns are VARCHAR.
Column | Type | Collation | Nullable | Default
----------+-------------------+-----------+----------+---------
id | bigint | | |
visit_id | character varying | | |
name | character varying | | |
prop0 | character varying | | |
prop1 | character varying | | |
url | character varying | | |
They should be something like
Column | Type | Collation | Nullable | Default
----------+------------------------+-----------+----------+---------
id | bigint | | |
visit_id | bigint | | |
name | character varying | | |
prop0 | time without time zone | | |
prop1 | bigint | | |
url | character varying | | |
Desired result
Hardcoding these castings as in SELECT visit::bigint, name::varchar, prop0::time, prop1::integer, url::varchar FROM tbl won't do; the column names are only known at run time.
To simplify things we could cast each column into only three types: boolean, numeric, or varchar. Use regexps below for matching types:
boolean: ^(true|false|t|f)$
numeric: ^-?[0-9]+(\.[0-9]+)?$
varchar: every result that does not match boolean and numeric above
What should the SQL be that discovers what type each column is and dynamically casts the columns?
These are a few ideas rather than a true solution for this tricky job. A slow but very reliable function can be used instead of regular expressions.
create or replace function can_cast(s text, vtype text)
returns boolean language plpgsql immutable as
$body$
begin
    execute format('select %L::%s', s, vtype);
    return true;
exception when others then
    return false;
end;
$body$;
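For illustration, a couple of probes using values from the sample rows above (results noted in comments):
select can_cast('142', 'bigint');                       -- true
select can_cast('2021-10-26 02:25:55.790999', 'time');  -- true
select can_cast('cnn.com', 'bigint');                   -- false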
Data may be presented like this (partial list of columns from your example)
create or replace temporary view tv(id, visit_id, prop0, prop1) as
values
(
    2004::bigint,
    4::bigint,
    case when can_cast('2021-10-26 02:25:55.790999', 'time') then '2021-10-26 02:25:55.790999'::time end,
    case when can_cast('142', 'bigint') then '142'::bigint end
), -- this row determines the column types
(2007, 4, '2021-10-26 02:26:37.773999', 25);
-- the rest of the data here
I believe that it is possible to generate the temporary view DDL dynamically as a select from events_view too.
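As a small, illustrative probe (column names taken from the view above), one could first verify that every value of a column survives a given cast before generating that DDL:
select bool_and(can_cast(prop0, 'time'))   as prop0_is_time,
       bool_and(can_cast(prop1, 'bigint')) as prop1_is_bigint
from events_view;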

Error in condition in where clause in timescale db while visualising in grafana

I am trying to visualise data in Grafana from TimescaleDB with the following query:
SELECT $__timeGroup(timestamp,'30m'), sum(error) as Error
FROM userCounts
WHERE serviceid IN ($Service) AND ciclusterid IN ($CiClusterId)
AND environment IN ($environment) AND filterid IN ($filterId)
AND $__timeFilter("timestamp")
GROUP BY timestamp;
However, it gives an error and no data shows when I add the filterid IN ($filterId) part.
I have checked the variable names a thousand times but am not sure what the error is. Logically, if the filters for the other variables work in their conditions, this one should work as well. I am not sure what is going wrong. Can anyone give input?
Edit:
The schema is like
timestamp   | timestamp without time zone |  | not null
measurement | character varying(150)      |  |
filterid    | character varying(150)      |  |
environment | character varying(150)      |  |
iscanary    | boolean                     |  |
servicename | character varying(150)      |  |
serviceid   | character varying(150)      |  |
ciclusterid | character varying(150)      |  |
--more--
In Grafana, it gives the error:
pq: column "in_orgs_that_have_had_an_operational_connector" does not exist
This happens when filterId = IN_ORGS_THAT_HAVE_HAD_AN_OPERATIONAL_CONNECTOR is selected. It is a value, not a column, so I am not sure why the error mentions a column; also, the error shows it in lower case while the value is in upper case.
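That error is consistent with the variable being interpolated without quotes: PostgreSQL then parses the value as an identifier, folds it to lower case, and looks for a column of that name. A minimal illustration of the difference:
SELECT count(*) FROM userCounts
WHERE filterid IN (IN_ORGS_THAT_HAVE_HAD_AN_OPERATIONAL_CONNECTOR);
-- ERROR:  column "in_orgs_that_have_had_an_operational_connector" does not exist
SELECT count(*) FROM userCounts
WHERE filterid IN ('IN_ORGS_THAT_HAVE_HAD_AN_OPERATIONAL_CONNECTOR');
-- runs: the value is now a string literal compared against filterid
If $filterId is indeed interpolated unquoted, a Grafana formatting option such as ${filterId:singlequote} (if available in your Grafana version) should add the quotes.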

How can I calculate time difference (HH:MM:SS) from date_time (ISO format) in Postgres

I have a table T1 in Postgres which is as follows:
| event | Date_Time |
|-------|--------------------------|
| start | 2018-04-30T06:09:30.986Z |
| run | 2018-04-30T10:37:38.044Z |
| end | 2018-04-30T11:39:38.044Z |
The Date_Time is in ISO format (stored as varchar) and I need to calculate the difference in Date_Time so that my output is as follows:
| event | Date_Time | Time_Difference |
|-------|--------------------------|-----------------|
| start | 2018-04-30T06:09:30.986Z | 4:28:08 |
| run | 2018-04-30T10:37:38.044Z | 1:02:00 |
| end | 2018-04-30T11:39:38.044Z | |
(10:37:38 - 06:09:30 = 4:28:08)
How can I do this using SQL?
Unrelated to the question, but: you should never store timestamp (or date or number) values in a varchar.
You first have to convert the varchar value to a timestamp. If the values are indeed formatted correctly, you can simply cast them: Date_Time::timestamp - or maybe to a timestamptz.
As far as I can tell, you want the difference to the next row in your result. This can be achieved with the window function lead():
select event,
       Date_Time,
       lead(date_time::timestamp) over (order by date_time::timestamp) - date_time::timestamp as time_difference
from the_table
order by date_time;
The result of subtracting one timestamp from another is an interval you can format if you want.
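For example, a short sketch formatting that interval as HH:MM:SS with to_char():
select event,
       Date_Time,
       to_char(lead(date_time::timestamp) over (order by date_time::timestamp)
               - date_time::timestamp, 'HH24:MI:SS') as time_difference
from the_table
order by date_time;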

How to remove the UTC info from a time stamp

I have a postgresql table that has a time value but it's stored without time zone information. I'd like to convert it to UTC time, but format it a specific way.
I have a syntax error somewhere and I don't know how to fix it. I have the following SQL query:
testdb=# select id, starttime at TIME ZONE 'UTC',endtime AT TIME ZONE 'UTC', dtc, etcd from fre order by id;
And it returns data like this:
id | timezone | timezone | dtc | etcd
-------------+-------------+-------------+-----------+-------------------------
143322 | 13:00:00+00 | 00:00:00+00 | 0000000 | 8899703
990222 | 05:00:00+00 | 05:00:00+00 | 0000000 | 45007
452256 | 05:00:00+00 | 05:00:00+00 | 0000000 | 33303
123118 | 05:08:00+00 | 00:00:00+00 | 1111100 | 8899701
I'd like to remove the "+00" reference in the starttime / endtime fields.
I found another post here on stackoverflow that suggested using the to_char() method, and used this example:
testdb=# select to_char(now(), 'HH24:MI:SS');
to_char
----------
09:55:48
(1 row)
So I adapted it and tried this:
testdb=# select to_char(now() AT TIME ZONE 'UTC', 'HH24:MI:SS');
to_char
----------
14:55:58
(1 row)
Now I'm trying to apply this to my original query like so:
testdb=# select to_char(starttime AT TIME ZONE 'UTC', 'HH24:MI:SS') from fre;
ERROR: function to_char(time with time zone, unknown) does not exist
LINE 1: select to_char(starttime AT TIME ZONE 'UTC', 'HH24:MI:SS') f...
^
HINT: No function matches the given name and argument types. You might need to add explicit type casts.
I'm not sure what I'm doing wrong.
EDIT 1
In case it helps...
testdb=# \d+ fre;
Table "public.fre"
Column | Type | Modifiers | Storage | Description
-----------+------------------------+-----------+----------+-------------
id | character varying(40) | not null | extended |
etcd | character varying(40) | not null | extended |
starttime | time without time zone | | plain |
endtime | time without time zone | | plain |
startdate | date | | plain |
enddate | date | | plain |
dtc | bit(7) | | extended |
Cast it back to time without time zone:
(starttime at TIME ZONE 'UTC')::time without time zone
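Applied to the original query, a sketch using the same columns might look like:
select id,
       (starttime at time zone 'UTC')::time without time zone as starttime,
       (endtime at time zone 'UTC')::time without time zone as endtime,
       dtc,
       etcd
from fre
order by id;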