OVER clause for VARCHAR - sql

I can use the OVER clause with numeric and date columns via an aggregate function, but I'm stuck trying to use it with a VARCHAR column. In the example below, I can reproduce the FIRST_FILL_DT column with:
MIN(FILL_DATE) OVER(PARTITION BY ID) AS FIRST_FILL_DT
However, I'm not sure I can use similar syntax to produce the FIRST_BP_MED column, because I don't know whether aggregate functions work correctly with VARCHAR columns.
Can anyone offer insight or guidance on how to solve this?
My data is like this:
My desired data should look like this:

If your database supports the FIRST_VALUE window function, you can use something like this:
FIRST_VALUE(BP_MED) OVER (PARTITION BY ID ORDER BY FILL_DATE) AS first_bp_med
Docs for FIRST_VALUE:
MySQL, SQL Server, PostgreSQL, SQLite

This is pretty straightforward. Use FIRST_VALUE over your window to pick the first value emitted by each partition.
https://learn.microsoft.com/en-us/sql/t-sql/functions/first-value-transact-sql?view=sql-server-ver15
SELECT
    ID, FILL_DATE, BP_MED,
    MIN(FILL_DATE) OVER (PARTITION BY ID ORDER BY FILL_DATE) AS FIRST_FILL_DT,
    FIRST_VALUE(BP_MED) OVER (PARTITION BY ID ORDER BY FILL_DATE) AS FIRST_BP_MED
FROM
    YOURTABLE;
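To confirm that FIRST_VALUE works on character data, here is a small sketch using Python's sqlite3 module (SQLite 3.25+ ships window functions); the table name and sample medication values are invented to mirror the question's schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fills (ID INT, FILL_DATE TEXT, BP_MED TEXT)")
con.executemany(
    "INSERT INTO fills VALUES (?, ?, ?)",
    [(1, "2020-01-05", "lisinopril"),
     (1, "2020-03-10", "amlodipine"),
     (2, "2020-02-01", "losartan")],
)
rows = con.execute("""
    SELECT ID, FILL_DATE, BP_MED,
           MIN(FILL_DATE)      OVER (PARTITION BY ID) AS FIRST_FILL_DT,
           FIRST_VALUE(BP_MED) OVER (PARTITION BY ID ORDER BY FILL_DATE) AS FIRST_BP_MED
    FROM fills
    ORDER BY ID, FILL_DATE
""").fetchall()
for r in rows:
    print(r)
# Every row of ID 1 shows 'lisinopril' as FIRST_BP_MED: FIRST_VALUE
# handles the text column exactly like MIN handles the date column.
```

Note that MIN(FILL_DATE) needs no ORDER BY in its window, since the minimum over the whole partition is the first fill date either way.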

Related

SQL LIMIT clause based on input parameter

I have been trying to find a solution for a LIMIT clause based on an input parameter from a JSON file. The current code looks somewhat like this:
WITH myJsonTable (JsonText) AS (
    SELECT JsonText
)
SELECT * FROM Data
WHERE ...
LIMIT
    CASE
        WHEN (SELECT JSON_VALUE(JsonText, '$."Amount"') FROM myJsonTable) IS NOT NULL
        THEN (SELECT JSON_VALUE(JsonText, '$."Amount"') FROM myJsonTable)
        ELSE 10000000
    END
I can't seem to get this to work. The output I am getting is:
Non-negative integer value expected in LIMIT clause
Is there a way to cast the result of the SELECT? Trying different SELECTs anywhere in the CASE clause caused the same error.
Exasol only allows constant expressions in the LIMIT clause, so it's not directly possible to specify a SELECT statement that references myJsonTable there.
However, you can work around this issue by using an approach similar to SQL query for top 5 results without the use of LIMIT/ROWNUM/TOP
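A sketch of that workaround using Python's sqlite3 in place of Exasol (and json_extract in place of JSON_VALUE; it requires the JSON1 extension, which modern SQLite builds include): number the rows with ROW_NUMBER() and filter on that instead of LIMIT. The table and JSON contents are invented:

```python
import json
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE data (id INT)")
con.executemany("INSERT INTO data VALUES (?)", [(i,) for i in range(1, 11)])

json_text = json.dumps({"Amount": 3})  # the parameter from the JSON file
rows = con.execute("""
    WITH myJsonTable(JsonText) AS (SELECT ?),
    numbered AS (
        SELECT data.*, ROW_NUMBER() OVER (ORDER BY id) AS rn
        FROM data
    )
    SELECT id FROM numbered
    -- filter on the row number instead of using LIMIT, with the
    -- same fallback value as the question's ELSE branch
    WHERE rn <= COALESCE(
        (SELECT json_extract(JsonText, '$.Amount') FROM myJsonTable),
        10000000)
""", (json_text,)).fetchall()
print(rows)  # the first 3 rows
```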

SQL Server query to get a pre-position number

I am trying to write a SQL query that will display a 'pre-position number' in a new column, with each row's value taken from the previous row of the position number column.
I would appreciate any assistance.
Thank you.
You seem to be looking for LAG(). For this to work, you need a column that can be used to order the data, so your RDBMS can assess which record is the previous record to the current one. Assuming that this column is called id, then:
SELECT
id,
position_nr,
LAG(position_nr) OVER(ORDER BY id) pre_position_nr
FROM mytable
You want to use LAG(). To get the 0, use the three argument form:
SELECT position_no,
LAG(position_no, 1, 0) OVER (ORDER BY position_no) as pre_position_no
FROM mytable;
This assumes that you are ordering by position_no, as suggested by your sample code.
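A quick check of the three-argument form using Python's sqlite3 (SQLite supports LAG(expr, offset, default) since 3.25); the sample data is invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE positions (position_no INT)")
con.executemany("INSERT INTO positions VALUES (?)", [(10,), (20,), (30,)])
rows = con.execute("""
    SELECT position_no,
           LAG(position_no, 1, 0) OVER (ORDER BY position_no) AS pre_position_no
    FROM positions
""").fetchall()
print(rows)  # [(10, 0), (20, 10), (30, 20)]
# The first row has no previous row, so the third argument (0) is used.
```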

Generating row number without ordering any column

I want to generate row numbers in the same order the data are added.
The below query is working fine for SQL Server.
SELECT *,ROW_NUMBER() OVER (ORDER BY (SELECT 100)) AS SNO FROM TestTable
I need a standard query to achieve the same scenario in Firebird. Can anyone suggest how?
You can't use row_number() over (order by (select 100)) with Firebird, because Firebird - as required by the SQL standard - requires a from clause for a select. The equivalent in Firebird would be row_number() over (order by (select 100 from rdb$database)).
The best solution would be to use an actual column for the order by to ensure a deterministic order.
When looking at the SQL:2016 standard, then an order by is not required for row_number() (but it is for rank() and dense_rank()). Unfortunately, it looks like Microsoft applied that requirement for row_number() as well, possibly for uniformity with the rank-functions, and maybe because row_number() without an order does not make a lot of sense. Using row_number() over () with SQL Server yields an error "The function 'row_number' must have an OVER clause with ORDER BY.", but works with Firebird.
SQL Server also enforces that the order by in a window function is not a numeric column reference. Using row_number() over (order by 1) with SQL Server yields an error "Windowed functions, aggregates and NEXT VALUE FOR functions do not support integer indices as ORDER BY clause expressions.", but works with Firebird (although the 1 is taken as a literal 1, and not as column reference, contrary to an order by on select level).
SQL Server also does not support using constants or literals in the order by in a window function. Using row_number() over (order by '1') with SQL Server yields an error "Windowed functions, aggregates and NEXT VALUE FOR functions do not support constants as ORDER BY clause expressions.", but works with Firebird.
I did find a trick that worked for both Firebird 3 and SQL Server 2017, but it is a dirty hack:
row_number() over (order by current_user)
This works because SQL Server doesn't consider current_user as a constant, but as a function, which means it doesn't fall under the 'no constants allowed'-rule.
Be aware that this trick may yield inconsistent row numbers (e.g. in Firebird, multiple window functions evaluated with different constants will yield different values, and the window function is evaluated before an ORDER BY at select level), and you may want to consider whether you shouldn't simply track a row index in your application.
One way could be to use ORDER BY RAND():
CREATE TABLE TestTable(i INT);
INSERT INTO TestTable(i) VALUES (10);
INSERT INTO TestTable(i) VALUES (20);
INSERT INTO TestTable(i) VALUES (30);
SELECT TestTable.*, ROW_NUMBER() OVER (ORDER BY RAND()) AS SNO
FROM TestTable;
db<>fiddle demo - Firebird
db<>fiddle demo - SQL Server
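The same idea can be sanity-checked with Python's sqlite3, where the function is spelled random() rather than RAND(); each row still receives a unique number from 1 to n, just in an arbitrary assignment:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE TestTable (i INT)")
con.executemany("INSERT INTO TestTable VALUES (?)", [(10,), (20,), (30,)])
rows = con.execute("""
    SELECT i, ROW_NUMBER() OVER (ORDER BY random()) AS SNO
    FROM TestTable
""").fetchall()
# Each row gets a number 1..3, but which row gets which is arbitrary,
# so this only guarantees uniqueness, not insertion order.
print(rows)
```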

Can SQL SUM() function take an expression as argument?

I'm using SQLite database and I'm wondering whether I'm allowed to write queries as follows:
SELECT SUM(column1 * column2)
FROM my_table;
I googled, but the references say that the SUM function has the following format:
SUM([DISTINCT|ALL] column)
My question is: does column literally mean a column, or are expressions (like the one above) allowed too?
You can always use a table expression:
SELECT SUM(Calc)
FROM (
    SELECT Column1 * Column2 AS Calc
    FROM My_Table
) t
I don't have SQLite but checking the docs indicates this should work fine.
Yes, you can use an expression like the one you mentioned, as long as the datatypes of both columns allow it.
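As a quick check, SQLite itself accepts the expression directly, with no derived table needed; a sketch with Python's sqlite3 and invented sample values:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE my_table (column1 INT, column2 INT)")
con.executemany("INSERT INTO my_table VALUES (?, ?)", [(2, 3), (4, 5)])
# SUM accepts an arbitrary expression, not just a bare column name
total = con.execute("SELECT SUM(column1 * column2) FROM my_table").fetchone()[0]
print(total)  # 2*3 + 4*5 = 26
```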

SQL Framing Query Problem

I am running a SQL query that returns only a portion of the entire data; in other words, the data is 'framed.' I have an 'OVER(ORDER BY' part, and for whatever reason it isn't working all of the time. Sometimes I get what I expect, and other times I don't.
The query:
SELECT
*
FROM
(
SELECT
ROW_NUMBER() OVER(ORDER BY [datetime] DESC, timeMicroSec DESC) AS rowNum,
...
FROM
...
WHERE
...
) AS TempTbl
WHERE
rowNum BETWEEN #startRow AND #endRow;
The whole query works when it is not framed and I use an 'ORDER BY' clause at the end. In the image below, the [datetime] column and the [timeMicroSec] column are joined with string concatenation.
As you can see, the ordering is all messed up. Any help would be appreciated.
When you cast a DateTime as a Varchar, it changes the way that SQL Server will order the column. It will no longer order it chronologically, but instead as just a plain old string.
If the data type is a DateTime, you would get the following descending sort order:
01/11/2011
02/22/2010
The first date is later chronologically... but if the data type is a Varchar... it would be sorted as:
02/22/2010
01/11/2011
Because the string "02" comes after "01"... the actual date value doesn't matter at this point. When you concatenate your date with timeMicroSec, you change the sorting to a Varchar sort.
Like the others said, if you order by rowNum instead of by your concatenated string, you will get chronological order. Adding
ORDER BY rowNum
as the last part of your query will fix the problem.
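The VARCHAR-vs-date behaviour above is easy to reproduce outside SQL as well; a plain Python sketch with the two MM/DD/YYYY strings from the example:

```python
# Descending sort of MM/DD/YYYY dates stored as strings: the later
# date (01/11/2011) sorts last because "01" < "02" lexicographically,
# exactly the Varchar behaviour described above.
dates = ["01/11/2011", "02/22/2010"]
print(sorted(dates, reverse=True))  # ['02/22/2010', '01/11/2011']
```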