How to apply functions stored in a column to other values in the row in SQL?

I have a table, 'testTable', with a 'Function' column that stores the names of functions which already exist in my schema. I would like to apply the function named in each row to that row's 'Value 1' and 'Value 2' columns.
I have this code, but it obviously does not accept dbo.[Function]. I have tried using the function name as a variable and dropping the 'dbo' prefix, but it still doesn't work.
SELECT *,
       CASE
           WHEN [Function] IS NOT NULL
           THEN dbo.[Function]([Value 1], [Value 2])
           ELSE NULL
       END AS Result
FROM testTable t
Thanks for any help!
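For context, T-SQL cannot resolve a function name taken from a column value at run time, so the usual workaround is either dynamic SQL or enumerating the possible functions in a CASE expression. A minimal sketch of the latter, assuming SQL Server and two hypothetical scalar functions dbo.AddValues and dbo.MultiplyValues that could appear in the [Function] column:
SELECT t.*,
       CASE t.[Function]
            WHEN 'AddValues'      THEN dbo.AddValues(t.[Value 1], t.[Value 2])
            WHEN 'MultiplyValues' THEN dbo.MultiplyValues(t.[Value 1], t.[Value 2])
            -- one WHEN branch per function name that may appear in [Function]
       END AS Result
FROM testTable t;
The CASE returns NULL when the function name is NULL or not listed, which matches the ELSE NULL in the attempt above.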

Related

how to map column if exists otherwise empty

I'm trying to map a property with a Formula that selects the column's value if the column exists, otherwise a default value.
I tried the following
mapper.Map(x => x.GroupId).Formula("(select case when exists (select * from INFORMATION_SCHEMA.COLUMNS SYS_COLS_TBL WHERE SYS_COLS_TBL.TABLE_NAME ='Azure' AND SYS_COLS_TBL.COLUMN_NAME = 'GroupId') then this_.GroupId else '' end)");
I get an error
SqlException: Invalid column name 'GroupId'.
This SQL statement will return a row for every table named 'Azure' that has a 'GroupId' column:
select * from INFORMATION_SCHEMA.COLUMNS SYS_COLS_TBL
WHERE SYS_COLS_TBL.TABLE_NAME ='Azure'
AND SYS_COLS_TBL.COLUMN_NAME = 'GroupId'
But there could be, e.g., two such tables in different schemas. If that happens, we could be querying dbo.Azure while it is SomeOtherSchema.Azure that has the GroupId column. Adding a schema filter fixes it:
select * from INFORMATION_SCHEMA.COLUMNS SYS_COLS_TBL
WHERE SYS_COLS_TBL.TABLE_NAME ='Azure'
AND SYS_COLS_TBL.COLUMN_NAME = 'GroupId'
AND SYS_COLS_TBL.TABLE_SCHEMA = 'dbo'
Also, to support more complex querying (JOIN associations), remove the this_. prefix:
instead of:
then this_.GroupId else '' end)
use:
then GroupId else '' end)
NHibernate will provide the proper alias based on the context.
A SQL statement requires that every referenced table and column exists; that is why you're getting the Invalid column name 'GroupId' error.
The usual way to do something like that would be to use dynamic SQL (sp_executesql @sql):
DECLARE @sql NVARCHAR(MAX) = '
SELECT ' +
    (CASE WHEN EXISTS (SELECT *
                       FROM INFORMATION_SCHEMA.COLUMNS
                       WHERE TABLE_SCHEMA = 'dbo' AND  -- assuming the schema is dbo
                             TABLE_NAME = 'Azure' AND
                             COLUMN_NAME = 'GroupId')
          THEN 'GroupId'
          ELSE 'NULL AS GroupId'
     END) + '
FROM Azure';
EXEC sp_executesql @sql
But this would not work here, because you are already inside the context of a SQL statement and it is not possible to inject dynamic SQL there.
To solve the problem there are a few options:
Correct solution: create the missing columns in the database.
Painful solution: map the object to a stored procedure that executes dynamic SQL similar to the above (a sketch follows after this list).
Insanely painful solution: map your possibly missing columns as normal. Before building the session factory, introspect your model and either remove the mappings for the missing columns or change them to formulas. This would be similar to NHibernate's Configuration.ValidateSchema method, but instead of throwing errors you'd remove these columns from the mapping.
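For the painful solution, a minimal sketch of such a procedure (assuming SQL Server, that the table lives in the dbo schema, and that GroupId would be an NVARCHAR column; dbo.GetAzure is a hypothetical name) might look like:
CREATE PROCEDURE dbo.GetAzure
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX) = N'SELECT ' +
        CASE WHEN EXISTS (SELECT *
                          FROM INFORMATION_SCHEMA.COLUMNS
                          WHERE TABLE_SCHEMA = 'dbo'
                            AND TABLE_NAME  = 'Azure'
                            AND COLUMN_NAME = 'GroupId')
             THEN N'GroupId'
             ELSE N'CAST(NULL AS NVARCHAR(50)) AS GroupId'
        END + N' FROM dbo.Azure';

    -- NHibernate would map the entity to this procedure instead of the table
    EXEC sp_executesql @sql;
END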

SQL - conditionally set column values to NULL

I have a table, some_table, which has a number of columns; some of them contain an invalid value in some rows that needs to be transformed into NULL.
I cannot use the statement below, because mutating the original table is not allowed by my permissions, and it would also need to be repeated for every column name.
UPDATE some_table SET column_name = NULL WHERE column_name = 'invalid value';
So it needs to be a SELECT operation that creates a new table with the invalid values converted to NULL. Is there a quick way to do this?
Update with an answer from @Jonny below:
NULLIF is a good option. However, is there a way to apply it to all columns rather than having to do it for each column separately? Sometimes the number of columns is pretty huge.
You could use NULLIF.
Have a look at 9.16.3. NULLIF
https://www.postgresql.org/docs/current/static/functions-conditional.html
SELECT NULLIF(column_name, 'invalid value')
FROM some_table
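Applied to the original goal of producing a new table rather than updating in place, a minimal sketch (assuming PostgreSQL; col_a, col_b, col_c stand in for the real column names) could be:
CREATE TABLE some_table_clean AS
SELECT NULLIF(col_a, 'invalid value') AS col_a,
       NULLIF(col_b, 'invalid value') AS col_b,
       NULLIF(col_c, 'invalid value') AS col_c
FROM some_table;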
How about something like:
INSERT INTO some_table2 (column_name, ...) SELECT * FROM some_table WHERE column_name <> 'invalid value';
INSERT INTO some_table2 (column_name, ...) SELECT null, ... FROM some_table WHERE column_name = 'invalid value';
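To avoid writing one NULLIF per column when the column count is large, one further option (an assumption on my part, not from the thread: it needs a reasonably recent PostgreSQL and all affected columns to be text-typed) is to let information_schema generate the SELECT list and then run the generated statement:
SELECT 'SELECT '
       || string_agg(format('NULLIF(%I, ''invalid value'') AS %I', column_name, column_name), ', ')
       || ' FROM some_table;'
FROM information_schema.columns
WHERE table_name = 'some_table';
-- copy the generated statement and run it, or wrap it in CREATE TABLE ... AS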

How to do a conditional WHERE clause in PL/SQL within a procedure

I have a pretty simple stored procedure that I am having trouble writing because I'm new to SQL and PL/SQL. I have a table with a name column that is a VARCHAR(55).
I discovered that if the user executes my procedure with an empty string as a parameter, the LIKE statement brings back all rows from TABLE1:
SELECT *
FROM TABLE1
WHERE COLUMN LIKE VARIABLE || '%'
AND...
So I tried to change the query so that if VARIABLE is passed as an empty string, the query still applies the other conditions in the WHERE clause:
SELECT *
FROM TABLE1
WHERE (VARIABLE <> '' AND COLUMN LIKE VARIABLE || '%')
AND...
But now, whatever I pass as the variable ('', NULL, 'anystring'), I get no rows returned.
How can I build a query that checks whether the variable is different from an empty string and, if it is, applies the LIKE condition with the variable correctly?
If I understand you correctly, it is not a difficult thing to do. You can build a conditional WHERE clause using CASE WHEN, so your query will support the different scenarios, something like this:
SELECT *
FROM TABLE1
WHERE (CASE WHEN variable IS NULL AND column IS NULL THEN 1
            WHEN variable LIKE '%' AND column LIKE variable||'%' THEN 1
            ELSE 0
       END) = 1
AND...
Basically, it checks if the variable = ''; if so, it compares the column against ''. Otherwise, it compares the column against variable||'%'.
Notice that Oracle treats an empty string of type VARCHAR as NULL (this does not apply to CHAR). So in the first scenario we compare against NULL.
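Another common way to express "apply the LIKE only when a value was supplied" (not from the thread, but it relies on the same empty-string-is-NULL behaviour; TABLE1, COLUMN and VARIABLE are the placeholders from the question) is:
SELECT *
FROM TABLE1
WHERE (VARIABLE IS NULL OR COLUMN LIKE VARIABLE || '%')
AND...
When VARIABLE is NULL (or ''), the LIKE filter is skipped and only the remaining conditions apply.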
Hello, just a thought: for this we can use dynamic SQL too. You may try this approach; hope it helps.
CREATE OR REPLACE PROCEDURE SPS_TEST_OUT(
    p_input_in IN VARCHAR2
)
AS
    lv_sql   LONG;
    lv_where VARCHAR2(100);
BEGIN
    lv_where := CASE WHEN p_input_in IS NULL OR p_input_in = '' THEN
                     ''
                 ELSE
                     ' AND COLUMN1 LIKE '''||p_input_in||'%'''
                 END;
    lv_sql := 'SELECT * FROM TABLE
                WHERE 1 = 1 '||lv_where;
    dbms_output.put_line(lv_sql);
END;
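To try the procedure from SQL*Plus or SQL Developer (an illustrative call, not from the thread):
SET SERVEROUTPUT ON
EXEC SPS_TEST_OUT('ABC');
-- prints the generated statement, which ends in: AND COLUMN1 LIKE 'ABC%'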

How to assign a column to a dynamic variable

One column in one table of my database (say Table A, Column A) can be either of NUMERIC type or VARCHAR type. The datatype is decided dynamically and then the table gets created.
I need to create a dynamic variable (@Dynamic) which should check the datatype of this column and assign a different column (Column B or Column C) to it accordingly, i.e.
If Column A is NVARCHAR, assign Column B to @Dynamic
If Column A is NUMERIC, assign Column C to @Dynamic
I have to do this in both SQL Server and Oracle.
Any help to write a function for this would be greatly appreciated.
In SQL Server you can check the column data type:
SELECT DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE
TABLE_NAME = 'Table A'
AND COLUMN_NAME = 'Column A'
In Oracle:
SELECT DATA_TYPE
FROM user_tab_columns
WHERE table_name = 'Table A'
AND column_name = 'Column A';
SQL_VARIANT is a dynamic type in SQL Server. The equivalent in Oracle would be the ANYDATA type.
But if you are using dynamic SQL, why don't you store the data as NVARCHAR? It is big enough for the converted numeric values as well, and you can use it directly in your dynamic SQL statement.
Actually this was a generic question and did not need any sample data to answer.
The approach I am following is:
I wrote a function
CREATE FUNCTION [dbo].[GET_DATA_TYPE] (@input VARCHAR(255))
RETURNS VARCHAR(255) AS
BEGIN
    DECLARE @l_data_type VARCHAR(255)
    SELECT @l_data_type = DATA_TYPE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = @input AND
          COLUMN_NAME = 'AssetID'
    RETURN @l_data_type
END
I would call this function in my stored procedures as
SELECT @dataType = dbo.GET_DATA_TYPE(@input);
I declared another variable, i.e. @FinalType:
IF @dataType = 'numeric'
    SET @FinalType = 'columnC'
IF @dataType = 'nvarchar'
    SET @FinalType = 'columnD'
Then I'll use this @FinalType variable in all my dynamic SQL.
Is there any other, more efficient way to do this?
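For completeness, a minimal sketch of how these pieces could be combined (assuming SQL Server; TableA, ColumnB and ColumnC are the placeholder names from the question):
DECLARE @dataType VARCHAR(255),
        @FinalCol SYSNAME,
        @sql      NVARCHAR(MAX);

SELECT @dataType = dbo.GET_DATA_TYPE('TableA');

-- pick the substitute column based on the detected data type
SET @FinalCol = CASE WHEN @dataType = 'nvarchar' THEN 'ColumnB'
                     WHEN @dataType = 'numeric'  THEN 'ColumnC'
                END;

IF @FinalCol IS NOT NULL
BEGIN
    -- QUOTENAME guards the identifier when splicing it into dynamic SQL
    SET @sql = N'SELECT ' + QUOTENAME(@FinalCol) + N' FROM dbo.TableA';
    EXEC sp_executesql @sql;
END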

Select some value knowing column name

I have a table, for example, with fields (a:int, b:int, c:int, d:int).
I am writing a stored procedure. This stored procedure will be given a string ('a', 'b', 'c' or 'd') and should give back the sum of that column.
I am using SQL server 2005.
You can use a CASE expression to choose the appropriate column to summarize
SELECT SUM(
       CASE @col -- the param of the stored procedure
            WHEN 'a' THEN col_a
            WHEN 'b' THEN col_b
            WHEN 'c' THEN col_c
            ELSE col_d
       END) AS sum_of_column
FROM my_table
Alternative using dynamic SQL inside the procedure
DECLARE @sql AS VARCHAR(MAX)
SET @sql = 'SELECT SUM(' + @col + ') FROM my_table'
EXEC (@sql)
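Since the question mentions a stored procedure, a minimal sketch of wrapping the dynamic variant in one (assuming SQL Server 2005+; dbo.SumColumn and my_table are placeholder names) could be:
CREATE PROCEDURE dbo.SumColumn
    @col SYSNAME
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX);
    -- QUOTENAME protects against injection through the column-name parameter
    SET @sql = N'SELECT SUM(' + QUOTENAME(@col) + N') AS sum_of_column FROM my_table';
    EXEC sp_executesql @sql;
END
-- usage: EXEC dbo.SumColumn @col = 'a';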
Try to redesign your database so that you're not implementing the same operation against multiple columns. Just because SQL tables resemble a spreadsheet doesn't mean you should treat them as one.
It sounds like you may have some form of attribute splitting going on - where you have multiple columns representing the same "type" of data, when you should have multiple rows, and an additional column to distinguish these values (e.g. rather than having twelve columns to represent "values" from each month of the year, you should have two columns, storing month and "value").