Getting free(): invalid pointer Aborted (core dumped) while using the Python interpreter in Ubuntu [duplicate] - python-3.8

I've compiled and installed Python 3.6.1 from source code, and run sudo pip3 install readline to install the readline module. But when I start the Python shell, it crashes on whatever I enter:
Python 3.6.1 (default, Mar 25 2017, 13:40:56)
[GCC 5.4.0 20160609] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> print("Hello")
*** Error in `python3': munmap_chunk(): invalid pointer: 0x00007fa3c64960a0 ***
======= Backtrace: =========
/lib/x86_64-linux-gnu/libc.so.6(+0x777e5)[0x7fa3c565e7e5]
/lib/x86_64-linux-gnu/libc.so.6(cfree+0x1a8)[0x7fa3c566aae8]
python3(PyOS_Readline+0xec)[0x5c3bcc]
python3[0x447cd0]
python3[0x449788]
python3(PyTokenizer_Get+0x9)[0x44a659]
python3[0x44617e]
python3(PyParser_ASTFromFileObject+0xa3)[0x428803]
python3(PyRun_InteractiveOneObject+0x122)[0x428a42]
python3(PyRun_InteractiveLoopFlags+0x6e)[0x428dce]
python3(PyRun_AnyFileExFlags+0x3c)[0x428efc]
python3(Py_Main+0xe4f)[0x43ba3f]
python3(main+0x162)[0x41dc52]
/lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7fa3c5607830]
python3(_start+0x29)[0x41dd29]
======= Memory map: ========
00400000-00663000 r-xp 00000000 08:05 16779642 /usr/local/bin/python3.6
00862000-00863000 r--p 00262000 08:05 16779642 /usr/local/bin/python3.6
00863000-008c7000 rw-p 00263000 08:05 16779642 /usr/local/bin/python3.6
008c7000-008f7000 rw-p 00000000 00:00 0
00d60000-00e31000 rw-p 00000000 00:00 0 [heap]
7fa3c4a96000-7fa3c4aac000 r-xp 00000000 08:05 528788 /lib/x86_64-linux-gnu/libgcc_s.so.1
7fa3c4aac000-7fa3c4cab000 ---p 00016000 08:05 528788 /lib/x86_64-linux-gnu/libgcc_s.so.1
7fa3c4cab000-7fa3c4cac000 rw-p 00015000 08:05 528788 /lib/x86_64-linux-gnu/libgcc_s.so.1
7fa3c4cac000-7fa3c4cec000 rw-p 00000000 00:00 0
7fa3c4cec000-7fa3c4d11000 r-xp 00000000 08:05 528922 /lib/x86_64-linux-gnu/libtinfo.so.5.9
7fa3c4d11000-7fa3c4f10000 ---p 00025000 08:05 528922 /lib/x86_64-linux-gnu/libtinfo.so.5.9
7fa3c4f10000-7fa3c4f14000 r--p 00024000 08:05 528922 /lib/x86_64-linux-gnu/libtinfo.so.5.9
7fa3c4f14000-7fa3c4f15000 rw-p 00028000 08:05 528922 /lib/x86_64-linux-gnu/libtinfo.so.5.9
7fa3c4f15000-7fa3c4f52000 r-xp 00000000 08:05 16786336 /usr/local/lib/python3.6/site-packages/readline.cpython-36m-x86_64-linux-gnu.so
7fa3c4f52000-7fa3c5151000 ---p 0003d000 08:05 16786336 /usr/local/lib/python3.6/site-packages/readline.cpython-36m-x86_64-linux-gnu.so
7fa3c5151000-7fa3c5153000 r--p 0003c000 08:05 16786336 /usr/local/lib/python3.6/site-packages/readline.cpython-36m-x86_64-linux-gnu.so
7fa3c5153000-7fa3c515a000 rw-p 0003e000 08:05 16786336 /usr/local/lib/python3.6/site-packages/readline.cpython-36m-x86_64-linux-gnu.so
7fa3c515a000-7fa3c515c000 rw-p 00000000 00:00 0
7fa3c515c000-7fa3c55e7000 r--p 00000000 08:05 16253117 /usr/lib/locale/locale-archive
7fa3c55e7000-7fa3c57a6000 r-xp 00000000 08:05 524633 /lib/x86_64-linux-gnu/libc-2.23.so
7fa3c57a6000-7fa3c59a6000 ---p 001bf000 08:05 524633 /lib/x86_64-linux-gnu/libc-2.23.so
7fa3c59a6000-7fa3c59aa000 r--p 001bf000 08:05 524633 /lib/x86_64-linux-gnu/libc-2.23.so
7fa3c59aa000-7fa3c59ac000 rw-p 001c3000 08:05 524633 /lib/x86_64-linux-gnu/libc-2.23.so
7fa3c59ac000-7fa3c59b0000 rw-p 00000000 00:00 0
7fa3c59b0000-7fa3c5ab8000 r-xp 00000000 08:05 524562 /lib/x86_64-linux-gnu/libm-2.23.so
7fa3c5ab8000-7fa3c5cb7000 ---p 00108000 08:05 524562 /lib/x86_64-linux-gnu/libm-2.23.so
7fa3c5cb7000-7fa3c5cb8000 r--p 00107000 08:05 524562 /lib/x86_64-linux-gnu/libm-2.23.so
7fa3c5cb8000-7fa3c5cb9000 rw-p 00108000 08:05 524562 /lib/x86_64-linux-gnu/libm-2.23.so
7fa3c5cb9000-7fa3c5cbb000 r-xp 00000000 08:05 524647 /lib/x86_64-linux-gnu/libutil-2.23.so
7fa3c5cbb000-7fa3c5eba000 ---p 00002000 08:05 524647 /lib/x86_64-linux-gnu/libutil-2.23.so
7fa3c5eba000-7fa3c5ebb000 r--p 00001000 08:05 524647 /lib/x86_64-linux-gnu/libutil-2.23.so
7fa3c5ebb000-7fa3c5ebc000 rw-p 00002000 08:05 524647 /lib/x86_64-linux-gnu/libutil-2.23.so
7fa3c5ebc000-7fa3c5ebf000 r-xp 00000000 08:05 524547 /lib/x86_64-linux-gnu/libdl-2.23.so
7fa3c5ebf000-7fa3c60be000 ---p 00003000 08:05 524547 /lib/x86_64-linux-gnu/libdl-2.23.so
7fa3c60be000-7fa3c60bf000 r--p 00002000 08:05 524547 /lib/x86_64-linux-gnu/libdl-2.23.so
7fa3c60bf000-7fa3c60c0000 rw-p 00003000 08:05 524547 /lib/x86_64-linux-gnu/libdl-2.23.so
7fa3c60c0000-7fa3c60d8000 r-xp 00000000 08:05 524644 /lib/x86_64-linux-gnu/libpthread-2.23.so
7fa3c60d8000-7fa3c62d7000 ---p 00018000 08:05 524644 /lib/x86_64-linux-gnu/libpthread-2.23.so
7fa3c62d7000-7fa3c62d8000 r--p 00017000 08:05 524644 /lib/x86_64-linux-gnu/libpthread-2.23.so
7fa3c62d8000-7fa3c62d9000 rw-p 00018000 08:05 524644 /lib/x86_64-linux-gnu/libpthread-2.23.so
7fa3c62d9000-7fa3c62dd000 rw-p 00000000 00:00 0
7fa3c62dd000-7fa3c6303000 r-xp 00000000 08:05 524634 /lib/x86_64-linux-gnu/ld-2.23.so
7fa3c6339000-7fa3c64e2000 rw-p 00000000 00:00 0
7fa3c64f8000-7fa3c64f9000 rw-p 00000000 00:00 0
7fa3c64f9000-7fa3c6500000 r--s 00000000 08:05 16522416 /usr/lib/x86_64-linux-gnu/gconv/gconv-modules.cache
7fa3c6500000-7fa3c6502000 rw-p 00000000 00:00 0
7fa3c6502000-7fa3c6503000 r--p 00025000 08:05 524634 /lib/x86_64-linux-gnu/ld-2.23.so
7fa3c6503000-7fa3c6504000 rw-p 00026000 08:05 524634 /lib/x86_64-linux-gnu/ld-2.23.so
7fa3c6504000-7fa3c6505000 rw-p 00000000 00:00 0
7ffcb1700000-7ffcb1721000 rw-p 00000000 00:00 0 [stack]
7ffcb17ad000-7ffcb17af000 r--p 00000000 00:00 0 [vvar]
7ffcb17af000-7ffcb17b1000 r-xp 00000000 00:00 0 [vdso]
ffffffffff600000-ffffffffff601000 r-xp 00000000 00:00 0 [vsyscall]
Aborted (core dumped)
What's the matter here? I'm using Ubuntu 16.04.

Try pip install gnureadline instead. And remove readline: pip uninstall readline.
As explained by the gnureadline package page on pypi.org:
Some platforms, such as macOS, do not ship with GNU readline installed
This module [bundles] the standard Python readline module with the GNU readline source code, which is compiled and statically linked to it. The end result is a package which is simple to install and requires no extra shared libraries.
So the gnureadline package comes with the required code compiled and linked in, whereas the built-in Python readline module relies on GNU readline (or a compatible alternative) already being installed on the system.

As far as I can tell, this is the same problem encountered below. I've posted the answer that worked for me there.
python 3.6 crash after install readline

Same problem here.
I solved it by compiling the Python source with gcc-4.7 instead of gcc-5.4.

Mac:
pip install readline
Windows:
pip install pyreadline
Unix:
pip install gnureadline

Related

pandas (multi) index is wrong, need to change it

I have a DataFrame multiData that looks like this:
print(multiData)
Date Open High Low Close Adj Close Volume
Ticker Date
AAPL 0 2010-01-04 7.62 7.66 7.59 7.64 6.51 493729600
1 2010-01-05 7.66 7.70 7.62 7.66 6.52 601904800
2 2010-01-06 7.66 7.69 7.53 7.53 6.41 552160000
3 2010-01-07 7.56 7.57 7.47 7.52 6.40 477131200
4 2010-01-08 7.51 7.57 7.47 7.57 6.44 447610800
... ... ... ... ... ... ... ...
META 2668 2022-12-23 116.03 118.18 115.54 118.04 118.04 17796600
2669 2022-12-27 117.93 118.60 116.05 116.88 116.88 21392300
2670 2022-12-28 116.25 118.15 115.51 115.62 115.62 19612500
2671 2022-12-29 116.40 121.03 115.77 120.26 120.26 22366200
2672 2022-12-30 118.16 120.42 117.74 120.34 120.34 19492100
I need to get rid of the "Date 0, 1, 2, ..." level and make the actual "Date" column part of the (multi) index
How do I do this?
Use df.droplevel to delete level 1, and chain df.set_index to add the Date column to the index by setting the append parameter to True.
df = df.droplevel(1).set_index('Date', append=True)
df
Open High Low Close Adj Close Volume
Ticker Date
AAPL 2010-01-04 7.62 7.66 7.59 7.64 6.51 493729600
2010-01-05 7.66 7.70 7.62 7.66 6.52 601904800
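A minimal, self-contained sketch of the same transformation (the two-row frame below is a made-up stand-in for the full price data):

```python
import pandas as pd

# Toy frame mimicking the question's shape: MultiIndex (Ticker, integer range)
# with Date as an ordinary column.
df = pd.DataFrame(
    {"Date": ["2010-01-04", "2010-01-05"], "Close": [7.64, 7.66]},
    index=pd.MultiIndex.from_tuples([("AAPL", 0), ("AAPL", 1)],
                                    names=["Ticker", None]),
)

# Drop the unwanted integer level, then append Date to the index.
out = df.droplevel(1).set_index("Date", append=True)
print(list(out.index.names))  # ['Ticker', 'Date']
```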

Getting wrong(?) average when calculating values in a time range

I am working with AWS Redshift / PostgreSQL. I have two tables that can be joined on the interval_date (DATE data type) and interval_time_utc (VARCHAR data type) columns and/or the status and price_source columns. Source A is equivalent to the Y status and Source B is equivalent to the N status. I am trying to get the average price and the sum of mw_power for a given hour for each status / price_source. An hour is the set of timestamps from XX:05 to XX:00, so for 15:00 the values should come from the 14:05 through 15:00 timestamps. Even for an hour interval where all rows have the same status, I still need to calculate the average price for both price_source values, but the sum of mw_power for the missing status would be 0. I am passing the date and time intervals in through my application code. I am seeing a different average price for the 15:00 hour than I expect, so either I am bad at math or there is a bug in my query that I can't pin down. The 14:00 and 16:00 hour results come back as expected.
power_table
interval_date  interval_time_utc  mw_power  status
2022-05-09     13:00              92.25     N
2022-05-09     13:05              90.75     N
2022-05-09     13:10              91.25     N
2022-05-09     13:15              92.00     N
2022-05-09     13:20              92.00     N
2022-05-09     13:25              90.00     N
2022-05-09     13:30              93.00     N
2022-05-09     13:35              91.75     N
2022-05-09     13:40              90.25     N
2022-05-09     13:45              93.00     N
2022-05-09     13:50              91.00     N
2022-05-09     13:55              94.00     N
2022-05-09     14:00              91.00     N
2022-05-09     14:05              91.00     N
2022-05-09     14:10              94.00     N
2022-05-09     14:15              92.00     N
2022-05-09     14:20              91.00     N
2022-05-09     14:25              94.00     Y
2022-05-09     14:30              92.00     Y
2022-05-09     14:35              91.75     Y
2022-05-09     14:40              92.25     Y
2022-05-09     14:45              91.00     Y
2022-05-09     14:50              92.00     Y
2022-05-09     14:55              93.00     Y
2022-05-09     15:00              90.00     Y
price_table
interval_date  interval_time_utc  price   price_source
2022-05-09     13:00              54.20   Source A
2022-05-09     13:05              54.20   Source A
2022-05-09     13:10              54.20   Source A
2022-05-09     13:00              54.20   Source B
2022-05-09     13:05              54.20   Source B
2022-05-09     13:10              54.20   Source B
2022-05-09     13:15              34.11   Source A
2022-05-09     13:20              34.11   Source A
2022-05-09     13:25              34.11   Source A
2022-05-09     13:15              39.61   Source B
2022-05-09     13:20              39.61   Source B
2022-05-09     13:25              39.61   Source B
2022-05-09     13:30              2.81    Source A
2022-05-09     13:35              2.81    Source A
2022-05-09     13:40              2.81    Source A
2022-05-09     13:30              17.13   Source B
2022-05-09     13:35              17.13   Source B
2022-05-09     13:40              17.13   Source B
2022-05-09     13:45              1.58    Source A
2022-05-09     13:50              1.58    Source A
2022-05-09     13:55              1.58    Source A
2022-05-09     13:45              15.98   Source B
2022-05-09     13:50              15.98   Source B
2022-05-09     13:55              15.98   Source B
2022-05-09     14:00              4.60    Source A
2022-05-09     14:05              4.60    Source A
2022-05-09     14:10              4.60    Source A
2022-05-09     14:00              18.09   Source B
2022-05-09     14:05              18.09   Source B
2022-05-09     14:10              18.09   Source B
2022-05-09     14:15              2.46    Source A
2022-05-09     14:20              2.46    Source A
2022-05-09     14:25              2.46    Source A
2022-05-09     14:15              16.66   Source B
2022-05-09     14:20              16.66   Source B
2022-05-09     14:25              16.66   Source B
2022-05-09     14:30              3.36    Source A
2022-05-09     14:35              3.36    Source A
2022-05-09     14:40              3.36    Source A
2022-05-09     14:30              21.52   Source B
2022-05-09     14:35              21.52   Source B
2022-05-09     14:40              21.52   Source B
2022-05-09     14:45              4.55    Source A
2022-05-09     14:50              4.55    Source A
2022-05-09     14:55              4.55    Source A
2022-05-09     14:45              16.30   Source B
2022-05-09     14:50              16.30   Source B
2022-05-09     14:55              16.30   Source B
2022-05-09     15:00              -21.87  Source A
2022-05-09     15:00              4.96    Source B
-- query that i am using to get hourly values
SELECT pricet.price_source,
COALESCE(powert.volume, 0),
pricet.price,
powert.status
FROM (SELECT status,
SUM(mw_power) volume
FROM power_table
WHERE (interval_date || ' ' || interval_time_utc)::timestamp BETWEEN '2022-05-09 14:05:00.0' AND '2022-05-09 15:00:00.0'
GROUP BY status) powert
RIGHT JOIN (SELECT price_source,
AVG(price) price
FROM price_table
WHERE (interval_date || ' ' || interval_time_utc)::timestamp BETWEEN '2022-05-09 14:05:00.0' AND '2022-05-09 15:00:00.0'
GROUP BY price_source) pricet
ON pricet.price_source = CASE WHEN powert.status = 'Y' THEN 'Source A'
ELSE 'Source B'
END;
I am looking to get an expected output of the following for the 15:00 hour:
price_source  volume  price  status
Source A      736.00  0.54   Y
Source B      368.00  17.38  N
Result that I'm getting from the query:
price_source  volume  price  status
Source A      736.00  1.54   Y
Source B      368.00  17.05  N
db fiddle link of tables and query and results: https://dbfiddle.uk/?rdbms=postgres_14&fiddle=474b009c5cf5366961751a61c0f96c6c
I think you made a calculator error. I changed your fiddle to add a rolling sum and rolling average for the second part of your query. To get an average of 0.54 for Source A, your sum would need to be 12 less than the total of the values for that hour. 12 is also the count of values in the hour, so perhaps a slip of subtracting 12 before dividing by 12?
For the other source (B), the total would need to be off by 4 (an addition of 4 to the sum). Not sure how that could have happened, but ...
Anyway the fiddle is at https://dbfiddle.uk/?rdbms=postgres_14&fiddle=e65c38677f3ab92607bbff778bc0f69e
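The arithmetic can be double-checked outside the database. The lists below are the prices copied from price_table for the 14:05-15:00 window (12 five-minute intervals per source); this is just a sanity script, not part of the fix:

```python
# Source A prices for 14:05-15:00, copied from price_table.
src_a = [4.60, 4.60, 2.46, 2.46, 2.46, 3.36, 3.36, 3.36, 4.55, 4.55, 4.55, -21.87]
# Source B prices for the same window.
src_b = [18.09, 18.09, 16.66, 16.66, 16.66, 21.52, 21.52, 21.52, 16.30, 16.30, 16.30, 4.96]

avg_a = round(sum(src_a) / len(src_a), 2)
avg_b = round(sum(src_b) / len(src_b), 2)
print(avg_a, avg_b)  # 1.54 17.05 -- matching the query, not the expected 0.54 / 17.38

# The "expected" 0.54 only falls out if 12 is subtracted from the sum first:
print(round((sum(src_a) - 12) / 12, 2))  # 0.54
```

So the query's averages are consistent with the data; it is the hand-computed expectation that slipped.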

SQL Server: update query doesn't change anything, shows no error

I am trying to update the table with the values from the same table.
What I want is to change the Connection setup in the rows where the worker and client are the same and where that row's Connection setup started within 5 minutes after the other connection (with the same worker and client) ended.
I first created a SELECT query that returned me all the rows that needed to be changed
SELECT t.*
FROM Table1 t
WHERE EXISTS (SELECT 1 FROM Table1
WHERE worker = t.worker
AND client = t.client
AND t.SessionNo != SessionNo
AND t.[Connection setup] <= DATEADD(mi, 5, [Connection end])
AND t.[Connection setup] >= [Connection end])
Then I tried to fold this query into an UPDATE query, but it doesn't change anything and doesn't show me any errors.
UPDATE t
SET t.Start = t2.Start
FROM Table1 t
INNER JOIN Table1 t2 ON (t.SessionNo = t2.SessionNo)
WHERE t.worker = t2.worker
AND t.client = t2.client
AND t2.SessionNo <> t.SessionNo
AND t.[Connection setup] <= DATEADD(mi, 5, t2.[Connection end])
AND t.[Connection setup] >= t2.[Connection end]
Example:
The first table shows the rows that should be changed. As you can see, there is a "right_time" column that shows what value they should have after the update.
SessionNo worker Tag Start Ende Dauer Connection setup Connection end client right_time
1 424568 mh 09.01.2020 00:00:00 13:45 13:49 00:04 09.01.2020 13:45:00 09.01.2020 13:49:00 OBENAT1D0209 13:44
2 269650 mg 09.03.2020 00:00:00 10:25 10:47 00:21 09.03.2020 10:25:00 09.03.2020 10:47:00 OBENAT1D0117 10:24
3 280892 mg 09.03.2020 00:00:00 12:19 12:22 00:03 09.03.2020 12:19:00 09.03.2020 12:22:00 OBENAT1D0117 12:19
4 175250 mg 09.03.2020 00:00:00 13:12 13:13 00:01 09.03.2020 13:12:00 09.03.2020 13:13:00 ORTNERAT1D0001 13:04
5 332684 dg 09.05.2020 00:00:00 16:05 16:33 00:28 09.05.2020 16:05:00 09.05.2020 16:33:00 KILLYAT3D0102 15:57
but as you can see here, the Start column is still the same.
SessionNo worker Tag Start Ende Dauer Connection setup Connection end client right_time
1 317045 mh 09.01.2020 00:00:00 09:29 09:38 00:09 09.01.2020 09:29:00 09.01.2020 09:38:00 AUMAAT1D0124 09:29
2 144035 sb 09.01.2020 00:00:00 11:09 11:27 00:18 09.01.2020 11:09:00 09.01.2020 11:27:00 OBENAT1D0231 11:09
3 437704 mh 09.01.2020 00:00:00 13:44 13:44 00:00 09.01.2020 13:44:00 09.01.2020 13:44:00 OBENAT1D0209 13:44
4 424568 mh 09.01.2020 00:00:00 13:45 13:49 00:04 09.01.2020 13:45:00 09.01.2020 13:49:00 OBENAT1D0209 13:44
5 219640 mh 09.01.2020 00:00:00 15:16 15:26 00:10 09.01.2020 15:16:00 09.01.2020 15:26:00 OBENAT1D0209 15:16
6 201023 mh 09.01.2020 00:00:00 16:29 16:35 00:06 09.01.2020 16:29:00 09.01.2020 16:35:00 OBENAT1D0209 16:29
7 236114 mg 09.03.2020 00:00:00 08:55 09:08 00:12 09.03.2020 08:55:00 09.03.2020 09:08:00 NULL NULL
8 271379 mg 09.03.2020 00:00:00 10:24 10:25 00:00 09.03.2020 10:24:00 09.03.2020 10:25:00 OBENAT1D0117 10:24
9 269650 mg 09.03.2020 00:00:00 10:25 10:47 00:21 09.03.2020 10:25:00 09.03.2020 10:47:00 OBENAT1D0117 10:24
10 290765 mg 09.03.2020 00:00:00 12:19 12:19 00:00 09.03.2020 12:19:00 09.03.2020 12:19:00 OBENAT1D0117 12:19
11 280892 mg 09.03.2020 00:00:00 12:19 12:22 00:03 09.03.2020 12:19:00 09.03.2020 12:22:00 OBENAT1D0117 12:19
12 538583 mg 09.03.2020 00:00:00 12:30 12:58 00:28 09.03.2020 12:30:00 09.03.2020 12:58:00 RATTAYAT1D0107 NULL
13 697202 mg 09.03.2020 00:00:00 13:04 13:08 00:04 09.03.2020 13:04:00 09.03.2020 13:08:00 ORTNERAT1D0001 13:04
14 175250 mg 09.03.2020 00:00:00 13:12 13:13 00:01 09.03.2020 13:12:00 09.03.2020 13:13:00 ORTNERAT1D0001 13:04
15 330580 dg 09.05.2020 00:00:00 15:57 16:05 00:08 09.05.2020 15:57:00 09.05.2020 16:05:00 KILLYAT3D0102 15:57
16 332684 dg 09.05.2020 00:00:00 16:05 16:33 00:28 09.05.2020 16:05:00 09.05.2020 16:33:00 KILLYAT3D0102 15:57
NOTE: In this case, in order to test the values, I am changing the Start column instead of the Connection setup.
You are updating zero rows, because of:
ON (t.SessionNo = t2.SessionNo)
...
AND t2.SessionNo <> t.SessionNo
You want to find rows with another session number, but you have t.SessionNo = t2.SessionNo, so this is exactly what you don't want.
You seem to think that a join needs a comparison with = on a single column, but this is not true. A join condition can be any boolean expression.
This may work for you:
UPDATE t
SET t.Start = t2.Start
FROM Table1 t
INNER JOIN Table1 t2 ON t.worker = t2.worker
AND t.client = t2.client
AND t.SessionNo <> t2.SessionNo
AND t.[Connection setup] <= DATEADD(mi, 5, t2.[Connection end])
AND t.[Connection setup] >= t2.[Connection end];

Select Sum HH:MM for period between two dates

I would like to get the sum of hours (Duration) worked between, say, 16/11/2016 09:00 and 15/12/2016 17:30. To achieve this I need to calculate the Duration for each day worked between the two dates and then add those up to get the monthly hours worked.
The EventTime table contains a row every time a user clocks in when they start work, clocks out at lunchtime, clocks back in after lunch, and then clocks out at the end of the day.
I have managed to get this to work for a single day:
EMPLOYEE CLOCK IN CLOCK OUT DURATION
Tatjana 05/01/2017 08:33 05/01/2017 13:12 04:39
Harj 05/01/2017 10:59 05/01/2017 14:20 03:20
Tomasz 05/01/2017 09:55
John 05/01/2017 09:57
Sam 05/01/2017 08:11 05/01/2017 14:11 05:59
Paul 05/01/2017 09:39 05/01/2017 14:05 04:26
Adrian 05/01/2017 13:59
Sophie 05/01/2017 08:42
Meg 05/01/2017 07:56 05/01/2017 13:10 05:14
Anna 05/01/2017 07:59 05/01/2017 12:30 04:31
Adriana 05/01/2017 07:46 05/01/2017 12:44 04:58
Jacky 05/01/2017 09:01
Anna 05/01/2017 07:57 05/01/2017 12:29 04:32
Kelly 05/01/2017 07:56 05/01/2017 12:45 04:48
Ana 05/01/2017 07:41 05/01/2017 14:13 06:32
The above is achieved using the following query:
SELECT
u.Field14_50 AS EmployeeID,
u.Firstname,
u.Surname,
MIN(e.EventTime) AS [Clocked In],
CASE
WHEN MAX(e.EventTime) = MIN(e.EventTime) THEN NULL
WHEN MAX(e.EventTime) > MIN(e.EventTime) THEN MAX(e.EventTime)
END AS [Clocked Out],
CASE
WHEN MAX(e.EventTime) = MIN(e.EventTime) THEN NULL
WHEN MAX(e.EventTime) > MIN(e.EventTime) THEN FORMAT((DATEDIFF(SECOND, MIN(e.EventTime), MAX(e.EventTime)) / 3600) % 24, '00') + ':' + FORMAT((DATEDIFF(SECOND, MIN(e.EventTime), MAX(e.EventTime)) / 60) % 60, '00')
END AS [Duration]
FROM
UsersEx AS u
INNER JOIN
EventsEx AS e
ON
u.UserID = e.UserID
WHERE
u.Field14_50 <> ''
AND
u.DepartmentName IN (
'Production',
'Finance and Administration',
'Purchase',
'Sales',
'Warehouse'
)
AND
DAY(e.EventTime) = DAY(GETDATE()) AND MONTH(e.EventTime) = MONTH(GETDATE()) AND YEAR(e.EventTime) = YEAR(GETDATE())
AND
e.PeripheralName IN ('TIME AND ATTENDANCE OFFICE (In)', 'TIME AND ATTENDANCE OFFICE (Out)')
GROUP BY
u.Field14_50,
u.UserID,
u.FirstName,
u.Surname
ORDER BY
u.Surname ASC
I hope that is clear.
Thanks in advance.
Update: 06/01/2017
I am able to use the following query:
SELECT
u.FirstName,
e.EventTime
FROM
UsersEx AS u
INNER JOIN
EventsEx AS e
ON
u.UserID = e.UserID
WHERE
u.Field14_50 <> ''
AND
e.EventTime > '2016/11/16' AND e.EventTime < '2016/12/16'
AND
e.PeripheralName IN ('TIME AND ATTENDANCE OFFICE (In)', 'TIME AND ATTENDANCE OFFICE (Out)')
AND
u.DepartmentName IN ('Finance and Administration')
ORDER BY
u.Field14_50,
e.EventTime
ASC
to get this result:
Ram 16/11/2016 09:12
Ram 16/11/2016 12:59
Ram 16/11/2016 13:39
Ram 16/11/2016 17:47
Ram 17/11/2016 09:35
Ram 17/11/2016 12:45
Ram 17/11/2016 13:11
Ram 17/11/2016 17:43
Ram 21/11/2016 09:14
Ram 21/11/2016 12:24
Ram 21/11/2016 12:53
Ram 21/11/2016 17:36
Ram 22/11/2016 09:18
Ram 22/11/2016 13:32
Ram 22/11/2016 17:45
Ram 23/11/2016 09:10
Ram 23/11/2016 13:13
Ram 23/11/2016 13:51
Ram 24/11/2016 09:10
Ram 24/11/2016 13:15
Ram 24/11/2016 13:50
Ram 24/11/2016 17:41
Ram 25/11/2016 09:12
Ram 25/11/2016 17:36
Ram 28/11/2016 09:05
Ram 28/11/2016 12:32
Ram 28/11/2016 13:12
Ram 28/11/2016 17:40
Ram 29/11/2016 09:17
Ram 29/11/2016 12:45
Ram 29/11/2016 13:16
Ram 29/11/2016 17:50
Ram 30/11/2016 09:15
Ram 30/11/2016 12:51
Ram 30/11/2016 13:55
Ram 30/11/2016 17:31
Ram 01/12/2016 09:10
Ram 01/12/2016 12:44
Ram 01/12/2016 13:12
Ram 01/12/2016 17:36
Ram 02/12/2016 09:00
Ram 02/12/2016 12:19
Ram 02/12/2016 12:51
Ram 02/12/2016 17:38
Ram 05/12/2016 09:14
Ram 05/12/2016 12:45
Ram 05/12/2016 13:28
Ram 05/12/2016 17:45
Ram 06/12/2016 09:32
Ram 06/12/2016 12:15
Ram 06/12/2016 12:49
Ram 06/12/2016 17:51
Ram 07/12/2016 09:09
Ram 07/12/2016 12:43
Ram 07/12/2016 13:22
Ram 07/12/2016 17:51
Ram 08/12/2016 09:18
Ram 08/12/2016 12:54
Ram 08/12/2016 13:16
Ram 08/12/2016 17:39
Ram 09/12/2016 09:09
Ram 09/12/2016 18:02
Ram 12/12/2016 09:20
Ram 12/12/2016 12:55
Ram 12/12/2016 13:20
Ram 12/12/2016 17:47
Ram 13/12/2016 09:13
Ram 13/12/2016 13:10
Ram 13/12/2016 13:37
Ram 13/12/2016 18:01
Ram 15/12/2016 09:07
Ram 15/12/2016 12:37
Ram 15/12/2016 13:12
Ram 15/12/2016 17:53
Yuka 16/11/2016 08:52
Yuka 16/11/2016 19:05
Yuka 17/11/2016 09:02
Yuka 17/11/2016 18:25
Yuka 18/11/2016 08:23
Yuka 18/11/2016 18:26
Yuka 21/11/2016 08:12
Yuka 21/11/2016 17:59
Yuka 22/11/2016 08:51
Yuka 22/11/2016 17:44
Yuka 23/11/2016 08:43
Yuka 23/11/2016 18:07
Yuka 24/11/2016 08:42
Yuka 24/11/2016 18:24
Yuka 25/11/2016 08:37
Yuka 25/11/2016 17:34
Yuka 28/11/2016 08:44
Yuka 28/11/2016 18:03
Yuka 29/11/2016 08:11
Yuka 29/11/2016 16:58
Yuka 12/12/2016 08:51
Yuka 12/12/2016 17:57
Yuka 13/12/2016 07:51
Yuka 13/12/2016 18:30
Yuka 14/12/2016 08:32
Yuka 14/12/2016 18:04
Yuka 15/12/2016 08:40
Yuka 15/12/2016 18:09
Duncan 16/11/2016 07:25
Duncan 16/11/2016 18:28
Duncan 17/11/2016 07:25
Duncan 17/11/2016 17:48
Duncan 18/11/2016 07:29
Duncan 21/11/2016 07:33
Duncan 21/11/2016 17:48
Duncan 22/11/2016 07:31
Duncan 22/11/2016 18:14
Duncan 23/11/2016 07:43
Duncan 24/11/2016 07:21
Duncan 25/11/2016 07:32
Duncan 28/11/2016 07:35
Duncan 28/11/2016 18:11
Duncan 29/11/2016 07:34
Duncan 30/11/2016 07:35
Duncan 30/11/2016 18:21
Duncan 01/12/2016 07:27
Duncan 01/12/2016 17:57
Duncan 02/12/2016 07:38
Duncan 05/12/2016 07:29
Duncan 05/12/2016 18:12
Duncan 06/12/2016 07:28
Duncan 06/12/2016 17:37
Duncan 07/12/2016 07:13
Duncan 07/12/2016 07:19
Duncan 07/12/2016 18:01
Duncan 08/12/2016 07:22
Duncan 08/12/2016 17:56
Duncan 09/12/2016 07:24
Duncan 09/12/2016 17:30
Now how can I get the MIN and MAX for each user for each day from that result set (as below)?
Employee Clocked In Clocked Out Duration
Ram 16/11/2016 09:12 16/11/2016 17:47 08:35
Ram 17/11/2016 09:35 17/11/2016 17:43 08:08
...
Ram 13/12/2016 09:13 13/12/2016 18:01 08:48
Ram 15/12/2016 09:07 15/12/2016 17:53 08:46
and so on for each employee...
Convert each Duration field to seconds, sum them, and then convert the total back to HH:MM format.
This is an example:
IF OBJECT_ID('tempdb..#tmp') IS NOT NULL
DROP TABLE #tmp
CREATE TABLE #tmp (
Id INT IDENTITY
, duration VARCHAR(MAX)
)
INSERT INTO #tmp (duration)
VALUES ('04:39')
, ('03:20')
, ('05:59')
SELECT
Sum(Left(duration, 2) * 3600 + substring(duration, 4, 2) * 60) AS seconds
, CONVERT(VARCHAR, DATEADD(ms, (Sum(Left(duration, 2) * 3600 + substring(duration, 4, 2) * 60)) * 1000, 0), 114) AS 'seconds to HH:MM'
--http://stackoverflow.com/questions/1262497/how-to-convert-seconds-to-hhmmss-using-t-sql
FROM #tmp
DROP TABLE #tmp
RESULT
seconds seconds to HH:MM
----------- ------------------------------
50280 13:58:00:000
(1 row(s) affected)
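The same seconds arithmetic can be cross-checked in a few lines of Python (durations copied from the T-SQL example above):

```python
# The three sample durations inserted into #tmp above.
durations = ["04:39", "03:20", "05:59"]

# HH * 3600 + MM * 60, summed -- the same formula as the T-SQL expression.
total = sum(int(d[:2]) * 3600 + int(d[3:5]) * 60 for d in durations)
print(total)  # 50280

# Convert the total back to HH:MM.
hh, rem = divmod(total, 3600)
mm = rem // 60
print(f"{hh:02d}:{mm:02d}")  # 13:58
```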
One option to implement this solution in your query would be to use a CTE. In this case it would look something like this:
;WITH Base
AS (
SELECT u.Field14_50 AS EmployeeID
, u.Firstname
, u.Surname
, MIN(e.EventTime) AS [Clocked In]
, CASE
WHEN MAX(e.EventTime) = MIN(e.EventTime)
THEN NULL
WHEN MAX(e.EventTime) > MIN(e.EventTime)
THEN MAX(e.EventTime)
END AS [Clocked Out]
, CASE
WHEN MAX(e.EventTime) = MIN(e.EventTime)
THEN NULL
WHEN MAX(e.EventTime) > MIN(e.EventTime)
THEN FORMAT((DATEDIFF(SECOND, MIN(e.EventTime), MAX(e.EventTime)) / 3600) % 24, '00') + ':' + FORMAT((DATEDIFF(SECOND, MIN(e.EventTime), MAX(e.EventTime)) / 60) % 60, '00')
END AS [Duration]
FROM UsersEx AS u
INNER JOIN EventsEx AS e
ON u.UserID = e.UserID
WHERE u.Field14_50 <> ''
AND u.DepartmentName IN (
'Production'
, 'Finance and Administration'
, 'Purchase'
, 'Sales'
, 'Warehouse'
)
AND DAY(e.EventTime) = DAY(GETDATE())
AND MONTH(e.EventTime) = MONTH(GETDATE())
AND YEAR(e.EventTime) = YEAR(GETDATE())
AND e.PeripheralName IN (
'TIME AND ATTENDANCE OFFICE (In)'
, 'TIME AND ATTENDANCE OFFICE (Out)'
)
GROUP BY u.Field14_50
, u.UserID
, u.FirstName
, u.Surname
--ORDER BY u.Surname ASC --Don't include ORDER BY here.
)
SELECT CONVERT(VARCHAR, DATEADD(ms, (Sum(Left(Duration, 2) * 3600 + substring(Duration, 4, 2) * 60)) * 1000, 0), 114) AS 'sum of hours (Duration)'
FROM Base
UPDATED
According to your update, you need to query using the MIN and MAX aggregate functions while performing a GROUP BY on the timestamps converted to DATE. With this you get the highest and lowest value for each day.
CODE:
SELECT u.FirstName
, MIN(e.EventTime) AS 'Clocked In'
, CASE
WHEN MIN(e.EventTime) = MAX(e.EventTime)
THEN NULL
ELSE MAX(e.EventTime)
END AS 'Clocked Out'
, DATEDIFF(SECOND, min(e.EventTime), MAX(e.EventTime)) AS 'Duration in seconds'
, CONVERT(VARCHAR, DATEADD(ms, DATEDIFF(SECOND, min(e.EventTime), MAX(e.EventTime)) * 1000, 0), 114) AS 'Duration'
FROM UsersEx AS u
INNER JOIN EventsEx AS e
ON u.UserID = e.UserID
WHERE u.Field14_50 <> ''
AND e.EventTime > '2016/11/16'
AND e.EventTime < '2016/12/16'
AND e.PeripheralName IN (
'TIME AND ATTENDANCE OFFICE (In)'
, 'TIME AND ATTENDANCE OFFICE (Out)'
)
AND u.DepartmentName IN ('Finance and Administration')
GROUP BY u.FirstName
, cast(e.EventTime AS DATE)
ORDER BY u.FirstName
Then, you only need to sum the seconds and convert them to the HH:MM format that I showed you, using the CTE or a subquery.
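As a sanity check of the per-day MIN/MAX grouping, the same logic can be mirrored in plain Python against a couple of the raw clockings shown earlier (Ram's first two days):

```python
from datetime import datetime

# Ram's raw clockings for 16/11 and 17/11, copied from the result set above.
events = [
    ("Ram", "16/11/2016 09:12"), ("Ram", "16/11/2016 12:59"),
    ("Ram", "16/11/2016 13:39"), ("Ram", "16/11/2016 17:47"),
    ("Ram", "17/11/2016 09:35"), ("Ram", "17/11/2016 12:45"),
    ("Ram", "17/11/2016 13:11"), ("Ram", "17/11/2016 17:43"),
]

# Group by (employee, date), then take MIN/MAX -- the same idea as the
# SQL GROUP BY on the timestamp cast to DATE.
groups = {}
for name, ts in events:
    dt = datetime.strptime(ts, "%d/%m/%Y %H:%M")
    groups.setdefault((name, dt.date()), []).append(dt)

for (name, day), times in sorted(groups.items()):
    dur = max(times) - min(times)
    hh, mm = divmod(int(dur.total_seconds()) // 60, 60)
    print(name, day, f"{hh:02d}:{mm:02d}")  # 08:35 on 16/11, 08:08 on 17/11
```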

How to round Informix datetime by half hour, quarter and hour?

This is so easy to do in SQL Server and Informix is making me angry.
Showing the hour a datetime is in is easy:
select startdatetime,
startdatetime::DATETIME HOUR TO HOUR AS IntervalHour
from Contactcalldetail ccd
Gives:
startdatetime IntervalHour
04-05-2016 19:53:35 19
I want:
startdatetime IntervalHour IntervalHalfHour IntervalQuarterHour
04-05-2016 19:53:35 19 19:30 19:45
04-05-2016 19:56:57 19 19:30 19:45
04-05-2016 20:23:14 20 20:00 20:15
So far I've tried... cursing at my screen and telling Google to go to horrible places because it gives me results that don't include the word "informix".
Thoughts on what else to try?
Maybe it will help you.
select startdatetime
, startdatetime::DATETIME HOUR TO HOUR AS IntervalHour
, Case when DatePart(minute,startdatetime)>=30 then cast(DatePart(hour,startdatetime) as varchar)+':30'
else cast(DatePart(hour,startdatetime) as varchar)+':00'
End
as IntervalHalfHour
, Case when DatePart(minute,startdatetime)>=45 then cast(DatePart(hour,startdatetime) as varchar)+':45'
when DatePart(minute,startdatetime)>=15 then cast(DatePart(hour,startdatetime) as varchar)+':15'
else cast(DatePart(hour,startdatetime) as varchar)+':00'
End
as IntervalQuarterHour
from Contactcalldetail ccd
Whilst the torturous CASE statements shown in the other answers will obviously work, they're not the only solution. Even if you stick with that approach, encapsulating the logic inside a FUNCTION (SPL) would make the code much more reusable if it's going to appear in lots of places.
But I would suggest that set theory is what databases do best, and there's a better approach:
Create a table with all the ranges you require - this will be completely static, and if quarter-hour is the finest grain you need, you'll have 24 * 4 = 96 rows. It would look like this:
from_intvl | to_intvl | intvl_qtr | intvl_hlf | intvl_hr
-----------+----------+-----------+-----------+---------
00:00:00 | 00:14:59 | 00:00 | 00:00 | 0
00:15:00 | 00:29:59 | 00:15 | 00:00 | 0
...
08:30:00 | 08:44:59 | 08:30 | 08:30 | 8
...
22:45:00 | 22:59:59 | 22:45 | 22:30 | 22
...
The first two columns in that table are DATETIME HOUR TO SECOND, and the rest can be DATETIME of the appropriate granularity, or CHAR, INT or whatever suits your application.
Using it is as simple as:
SELECT ccd.startdatetime, it.intvl_qtr, it.intvl_hlf, it.intvl_hr
FROM contactcalldetail AS ccd
JOIN intvl AS it
ON ccd.startdatetime::DATETIME HOUR TO SECOND BETWEEN it.from_intvl AND it.to_intvl
If you later decide you need 5 minute ranges, or 20 minute ranges, or even 4 hour blocks, it's trivial to add either more rows or more columns to your lookup table. Even going down to individual minute records would still only produce a table with 1,440 rows.
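Populating the lookup table is mechanical. A hypothetical generator (emitting plain dict rows rather than Informix DDL; column names follow the sketch above):

```python
# Generate the 96 quarter-hour lookup rows described above:
# one row per 15-minute block, 24 hours * 4 blocks.
rows = []
for h in range(24):
    for q in range(4):
        start_min = q * 15
        rows.append({
            "from_intvl": f"{h:02d}:{start_min:02d}:00",
            "to_intvl":   f"{h:02d}:{start_min + 14:02d}:59",
            "intvl_qtr":  f"{h:02d}:{start_min:02d}",
            "intvl_hlf":  f"{h:02d}:{(start_min // 30) * 30:02d}",
            "intvl_hr":   h,
        })

print(len(rows))  # 96
print(rows[34])   # the 08:30:00 - 08:44:59 row from the sample above
```

Each dict could then be turned into an INSERT statement or loaded with your usual bulk-load tool.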
I'd create a stored procedure to do the job. While it could be done in an expression with enough casting, it would be unpleasant. The key trick is converting the number of minutes past the hour into a CHAR(2) string, which is then auto-convertible to a number. AFAIK, there isn't another way to convert an interval to a number.
DROP PROCEDURE IF EXISTS Multiple_Of_Quarter_Hour;
CREATE PROCEDURE Multiple_Of_Quarter_Hour(dtval DATETIME YEAR TO MINUTE
DEFAULT CURRENT YEAR TO MINUTE)
RETURNING DATETIME YEAR TO MINUTE AS rounded_value;
DEFINE mm CHAR(2);
LET mm = EXTEND(dtval, MINUTE TO MINUTE);
LET dtval = dtval - mm UNITS MINUTE;
RETURN dtval + (15 * (mm / 15)::INTEGER) UNITS MINUTE;
END PROCEDURE;
DROP PROCEDURE IF EXISTS Multiple_Of_N_Minutes;
CREATE PROCEDURE Multiple_Of_N_Minutes(dtval DATETIME YEAR TO MINUTE
DEFAULT CURRENT YEAR TO MINUTE,
n_min INTEGER DEFAULT 15)
RETURNING DATETIME YEAR TO MINUTE AS rounded_value;
DEFINE mm CHAR(2);
LET mm = EXTEND(dtval, MINUTE TO MINUTE);
LET dtval = dtval - mm UNITS MINUTE;
RETURN dtval + (n_min * (mm / n_min)::INTEGER) UNITS MINUTE;
END PROCEDURE;
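In other words, the procedures floor the minute component to a multiple of n. A Python analogue (an illustration of the rounding rule, not Informix code; it matches the truncating integer division the SPL relies on):

```python
from datetime import datetime

def multiple_of_n_minutes(dtval: datetime, n_min: int = 15) -> datetime:
    """Floor dtval's minute component to a multiple of n_min.
    Seconds are dropped, as DATETIME YEAR TO MINUTE implies."""
    dtval = dtval.replace(second=0, microsecond=0)
    return dtval.replace(minute=n_min * (dtval.minute // n_min))

dt = datetime(2016, 5, 27, 0, 35, 44)
print(multiple_of_n_minutes(dt, 6).strftime("%H:%M"))   # 00:30
print(multiple_of_n_minutes(dt, 15).strftime("%H:%M"))  # 00:30
```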
Test code:
CREATE TEMP TABLE t_times (dtval DATETIME YEAR TO SECOND PRIMARY KEY);
INSERT INTO t_times VALUES('2016-05-27 00:00:00');
INSERT INTO t_times VALUES('2016-05-27 00:04:59');
INSERT INTO t_times VALUES('2016-05-27 00:06:59');
INSERT INTO t_times VALUES('2016-05-27 00:14:59');
INSERT INTO t_times VALUES('2016-05-27 00:15:00');
INSERT INTO t_times VALUES('2016-05-27 00:16:57');
INSERT INTO t_times VALUES('2016-05-27 00:21:00');
INSERT INTO t_times VALUES('2016-05-27 00:24:36');
INSERT INTO t_times VALUES('2016-05-27 00:25:11');
INSERT INTO t_times VALUES('2016-05-27 00:29:59');
INSERT INTO t_times VALUES('2016-05-27 00:30:00');
INSERT INTO t_times VALUES('2016-05-27 00:35:44');
INSERT INTO t_times VALUES('2016-05-27 00:44:59');
INSERT INTO t_times VALUES('2016-05-27 00:45:00');
INSERT INTO t_times VALUES('2016-05-27 00:49:53');
INSERT INTO t_times VALUES('2016-05-27 00:50:30');
INSERT INTO t_times VALUES('2016-05-27 00:59:59');
INSERT INTO t_times VALUES('2016-05-27 01:16:07');
INSERT INTO t_times VALUES('2016-05-27 01:34:10');
INSERT INTO t_times VALUES('2016-05-27 02:24:46');
INSERT INTO t_times VALUES('2016-05-27 04:32:08');
INSERT INTO t_times VALUES('2016-05-27 11:52:09');
INSERT INTO t_times VALUES('2016-05-27 14:00:28');
INSERT INTO t_times VALUES('2016-05-27 16:10:31');
INSERT INTO t_times VALUES('2016-05-27 17:46:58');
INSERT INTO t_times VALUES('2016-05-27 19:25:35');
INSERT INTO t_times VALUES('2016-05-27 22:52:48');
SELECT dtval,
Multiple_Of_Quarter_Hour(dtval) AS m15a,
EXTEND(Multiple_Of_N_Minutes(dtval, 2), HOUR TO MINUTE) AS m02,
EXTEND(Multiple_Of_N_Minutes(dtval, 3), HOUR TO MINUTE) AS m03,
EXTEND(Multiple_Of_N_Minutes(dtval, 4), HOUR TO MINUTE) AS m04,
EXTEND(Multiple_Of_N_Minutes(dtval, 5), HOUR TO MINUTE) AS m05,
EXTEND(Multiple_Of_N_Minutes(dtval, 6), HOUR TO MINUTE) AS m06
FROM t_times
ORDER BY dtval;
SELECT dtval,
EXTEND(Multiple_Of_N_Minutes(dtval, 10), HOUR TO MINUTE) AS m10,
EXTEND(Multiple_Of_N_Minutes(dtval, 12), HOUR TO MINUTE) AS m12,
EXTEND(Multiple_Of_N_Minutes(dtval, 15), HOUR TO MINUTE) AS m15b,
EXTEND(Multiple_Of_N_Minutes(dtval, 20), HOUR TO MINUTE) AS m20,
EXTEND(Multiple_Of_N_Minutes(dtval, 30), HOUR TO MINUTE) AS m30,
EXTEND(Multiple_Of_N_Minutes(dtval), HOUR TO MINUTE) AS m15c
FROM t_times
ORDER BY dtval;
Sample outputs:
dtval m15a m02 m03 m04 m05 m06
2016-05-27 00:00:00 2016-05-27 00:00 00:00 00:00 00:00 00:00 00:00
2016-05-27 00:04:59 2016-05-27 00:00 00:04 00:03 00:04 00:00 00:00
2016-05-27 00:06:59 2016-05-27 00:00 00:06 00:06 00:04 00:05 00:06
2016-05-27 00:14:59 2016-05-27 00:00 00:14 00:12 00:12 00:10 00:12
2016-05-27 00:15:00 2016-05-27 00:15 00:14 00:15 00:12 00:15 00:12
2016-05-27 00:16:57 2016-05-27 00:15 00:16 00:15 00:16 00:15 00:12
2016-05-27 00:21:00 2016-05-27 00:15 00:20 00:21 00:20 00:20 00:18
2016-05-27 00:24:36 2016-05-27 00:15 00:24 00:24 00:24 00:20 00:24
2016-05-27 00:25:11 2016-05-27 00:15 00:24 00:24 00:24 00:25 00:24
2016-05-27 00:29:59 2016-05-27 00:15 00:28 00:27 00:28 00:25 00:24
2016-05-27 00:30:00 2016-05-27 00:30 00:30 00:30 00:28 00:30 00:30
2016-05-27 00:35:44 2016-05-27 00:30 00:34 00:33 00:32 00:35 00:30
2016-05-27 00:44:59 2016-05-27 00:30 00:44 00:42 00:44 00:40 00:42
2016-05-27 00:45:00 2016-05-27 00:45 00:44 00:45 00:44 00:45 00:42
2016-05-27 00:49:53 2016-05-27 00:45 00:48 00:48 00:48 00:45 00:48
2016-05-27 00:50:30 2016-05-27 00:45 00:50 00:48 00:48 00:50 00:48
2016-05-27 00:59:59 2016-05-27 00:45 00:58 00:57 00:56 00:55 00:54
2016-05-27 01:16:07 2016-05-27 01:15 01:16 01:15 01:16 01:15 01:12
2016-05-27 01:34:10 2016-05-27 01:30 01:34 01:33 01:32 01:30 01:30
2016-05-27 02:24:46 2016-05-27 02:15 02:24 02:24 02:24 02:20 02:24
2016-05-27 04:32:08 2016-05-27 04:30 04:32 04:30 04:32 04:30 04:30
2016-05-27 11:52:09 2016-05-27 11:45 11:52 11:51 11:52 11:50 11:48
2016-05-27 14:00:28 2016-05-27 14:00 14:00 14:00 14:00 14:00 14:00
2016-05-27 16:10:31 2016-05-27 16:00 16:10 16:09 16:08 16:10 16:06
2016-05-27 17:46:58 2016-05-27 17:45 17:46 17:45 17:44 17:45 17:42
2016-05-27 19:25:35 2016-05-27 19:15 19:24 19:24 19:24 19:25 19:24
2016-05-27 22:52:48 2016-05-27 22:45 22:52 22:51 22:52 22:50 22:48
dtval m10 m12 m15b m20 m30 m15c
2016-05-27 00:00:00 00:00 00:00 00:00 00:00 00:00 00:00
2016-05-27 00:04:59 00:00 00:00 00:00 00:00 00:00 00:00
2016-05-27 00:06:59 00:00 00:00 00:00 00:00 00:00 00:00
2016-05-27 00:14:59 00:10 00:12 00:00 00:00 00:00 00:00
2016-05-27 00:15:00 00:10 00:12 00:15 00:00 00:00 00:15
2016-05-27 00:16:57 00:10 00:12 00:15 00:00 00:00 00:15
2016-05-27 00:21:00 00:20 00:12 00:15 00:20 00:00 00:15
2016-05-27 00:24:36 00:20 00:24 00:15 00:20 00:00 00:15
2016-05-27 00:25:11 00:20 00:24 00:15 00:20 00:00 00:15
2016-05-27 00:29:59 00:20 00:24 00:15 00:20 00:00 00:15
2016-05-27 00:30:00 00:30 00:24 00:30 00:20 00:30 00:30
2016-05-27 00:35:44 00:30 00:24 00:30 00:20 00:30 00:30
2016-05-27 00:44:59 00:40 00:36 00:30 00:40 00:30 00:30
2016-05-27 00:45:00 00:40 00:36 00:45 00:40 00:30 00:45
2016-05-27 00:49:53 00:40 00:48 00:45 00:40 00:30 00:45
2016-05-27 00:50:30 00:50 00:48 00:45 00:40 00:30 00:45
2016-05-27 00:59:59 00:50 00:48 00:45 00:40 00:30 00:45
2016-05-27 01:16:07 01:10 01:12 01:15 01:00 01:00 01:15
2016-05-27 01:34:10 01:30 01:24 01:30 01:20 01:30 01:30
2016-05-27 02:24:46 02:20 02:24 02:15 02:20 02:00 02:15
2016-05-27 04:32:08 04:30 04:24 04:30 04:20 04:30 04:30
2016-05-27 11:52:09 11:50 11:48 11:45 11:40 11:30 11:45
2016-05-27 14:00:28 14:00 14:00 14:00 14:00 14:00 14:00
2016-05-27 16:10:31 16:10 16:00 16:00 16:00 16:00 16:00
2016-05-27 17:46:58 17:40 17:36 17:45 17:40 17:30 17:45
2016-05-27 19:25:35 19:20 19:24 19:15 19:20 19:00 19:15
2016-05-27 22:52:48 22:50 22:48 22:45 22:40 22:30 22:45
Obviously, Multiple_Of_Quarter_Hour() could be written as a simple cover for Multiple_Of_N_Minutes().
Single expression:
SELECT dtval,
(EXTEND(dtval, YEAR TO MINUTE) -
(EXTEND(dtval, MINUTE TO MINUTE)::CHAR(2)) UNITS MINUTE) +
(15 * ((EXTEND(dtval, MINUTE TO MINUTE)::CHAR(2)) / 15)::INTEGER) UNITS MINUTE
FROM t_times
ORDER BY dtval;
If you have to write that more than once, it would be ludicrous not to use a stored procedure.
It is also possible, though a bit trickier, to round to the nearest multiple of N minutes instead of always truncating (so, for example, with a 10-minute interval, times from 13:55:00 through 14:04:59 would all map to 14:00).
DROP PROCEDURE IF EXISTS Nearest_Multiple_Of_N_Minutes;
CREATE PROCEDURE Nearest_Multiple_Of_N_Minutes(dtval DATETIME YEAR TO SECOND
DEFAULT CURRENT YEAR TO SECOND,
n_min INTEGER DEFAULT 15)
RETURNING DATETIME YEAR TO MINUTE AS rounded_value;
DEFINE mm CHAR(2);
DEFINE dt_yy_mm DATETIME YEAR TO MINUTE;
LET dt_yy_mm = dtval + ((30 * n_min) UNITS SECOND);
LET mm = EXTEND(dt_yy_mm, MINUTE TO MINUTE);
LET dt_yy_mm = dt_yy_mm - mm UNITS MINUTE;
RETURN dt_yy_mm + (n_min * (mm / n_min)::INTEGER) UNITS MINUTE;
END PROCEDURE;
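The same add-half-the-interval-then-truncate trick, sketched in Python for comparison (names are mine; this is an illustration of the technique, not the procedure itself):

```python
from datetime import datetime, timedelta

def nearest_multiple_of_n_minutes(dtval: datetime, n_min: int = 15) -> datetime:
    # Shift forward by half the interval -- the same trick as adding
    # (30 * n_min) UNITS SECOND in the SPL procedure -- then truncate down.
    shifted = dtval + timedelta(seconds=30 * n_min)
    return shifted.replace(second=0, microsecond=0,
                           minute=(shifted.minute // n_min) * n_min)

print(nearest_multiple_of_n_minutes(datetime(2016, 5, 27, 13, 55, 0), 10))
# 2016-05-27 14:00:00
```

Note that 13:55:00 and 14:04:59 both land on 14:00, exactly the behaviour described above.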
SELECT dtval,
Nearest_Multiple_Of_N_Minutes(dtval, 10) AS m10,
Nearest_Multiple_Of_N_Minutes(dtval, 15) AS m15,
Nearest_Multiple_Of_N_Minutes(dtval, 20) AS m20,
Nearest_Multiple_Of_N_Minutes(dtval, 30) AS m30
FROM t_times
ORDER BY dtval;
Note the change of type of the datetime argument to the function (DATETIME YEAR TO SECOND instead of YEAR TO MINUTE).
I figured it out, thanks to shamim reza for giving me a couple of ideas on the logic:
TO_CHAR(startdatetime::DATETIME HOUR TO HOUR, '%H')||':00' AS IntervalHour,
CASE WHEN startdatetime::DATETIME MINUTE TO MINUTE::CHAR(2)::INT >= 30
     THEN TO_CHAR(startdatetime::DATETIME HOUR TO HOUR, '%H')||':30'
     ELSE TO_CHAR(startdatetime::DATETIME HOUR TO HOUR, '%H')||':00'
END AS IntervalHalfHour,
CASE WHEN startdatetime::DATETIME MINUTE TO MINUTE::CHAR(2)::INT >= 45
     THEN TO_CHAR(startdatetime::DATETIME HOUR TO HOUR, '%H')||':45'
     WHEN startdatetime::DATETIME MINUTE TO MINUTE::CHAR(2)::INT >= 30
     THEN TO_CHAR(startdatetime::DATETIME HOUR TO HOUR, '%H')||':30'
     WHEN startdatetime::DATETIME MINUTE TO MINUTE::CHAR(2)::INT >= 15
     THEN TO_CHAR(startdatetime::DATETIME HOUR TO HOUR, '%H')||':15'
     ELSE TO_CHAR(startdatetime::DATETIME HOUR TO HOUR, '%H')||':00'
END AS IntervalQuarterHour,
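For comparison, the 'HH:MM' labels those CASE expressions compute can be sketched in Python (a hypothetical helper of my own, not code from the answer):

```python
from datetime import datetime

def interval_labels(dt: datetime) -> tuple:
    # Build the hour, half-hour, and quarter-hour interval labels for a
    # timestamp, mirroring the CASE expressions above with integer floors.
    hh = f"{dt.hour:02d}"
    half = (dt.minute // 30) * 30
    quarter = (dt.minute // 15) * 15
    return f"{hh}:00", f"{hh}:{half:02d}", f"{hh}:{quarter:02d}"

print(interval_labels(datetime(2016, 5, 27, 14, 47, 0)))
# ('14:00', '14:30', '14:45')
```

The integer floor replaces the cascade of >= comparisons, which is the same simplification the next answer applies in SQL.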
Just another way of doing it.
SELECT
startdatetime,
IntervalHour,
IntervalHour + (minutes/30)::INT * 30 UNITS MINUTE,
IntervalHour + (minutes/15)::INT * 15 UNITS MINUTE
FROM (
SELECT
startdatetime,
EXTEND(TRUNC(startdatetime, 'HH'), HOUR TO MINUTE) AS IntervalHour,
TO_CHAR(startdatetime, '%M') AS minutes
FROM contactcalldetail
)
Basically, you get IntervalHour by using the TRUNC function to truncate the timestamp to the beginning of the hour, and the EXTEND function to adjust the precision of the DATETIME.
Do not use the ROUND function, because it rounds to the nearest hour or minute, which can round upward past the start of the interval.
Extract the minutes with the TO_CHAR function, then simple integer math with the UNITS operator yields the lower interval boundary; no CASE expression is needed.
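A rough Python rendering of this truncate-to-hour-plus-floored-minutes approach (the helper name and use of Python's datetime are my own illustration of the technique):

```python
from datetime import datetime, timedelta

def lower_interval(dt: datetime, n_min: int) -> datetime:
    # Equivalent of TRUNC(startdatetime, 'HH'): cut back to the hour.
    hour_start = dt.replace(minute=0, second=0, microsecond=0)
    # Equivalent of (minutes / n_min)::INT * n_min UNITS MINUTE:
    # add the largest multiple of n_min minutes that fits.
    return hour_start + timedelta(minutes=(dt.minute // n_min) * n_min)

print(lower_interval(datetime(2016, 5, 27, 0, 44, 59), 30))
# 2016-05-27 00:30:00
```

Because the minutes are floored with integer division, no rounding function (and no CASE) is involved.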