How to pass an entire row (in SQL, not PL/SQL) to a stored function?

I have the following (pretty simple) problem. I would like to write an (Oracle) SQL query roughly like the following:
SELECT count(*), MyFunc(MyTable.*)
FROM MyTable
GROUP BY MyFunc(MyTable.*)
Within PL/SQL, one can use a RECORD type (and/or %ROWTYPE), but to my knowledge, these tools are not available within SQL. The function expects the complete row, however. What can I do to pass the entire row to the stored function?
Thanks!

I don't think you can.
Either create the function with all the arguments you need, or pass the id of the row and do a SELECT within the function.
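For example, here is a minimal sketch of the second suggestion, assuming MyTable has a primary key column id and some column some_column (both names are purely illustrative):
CREATE OR REPLACE FUNCTION MyFunc (p_id IN MyTable.id%TYPE)
  RETURN VARCHAR2
IS
  l_row MyTable%ROWTYPE;  -- fetch the whole row inside PL/SQL
BEGIN
  SELECT * INTO l_row FROM MyTable WHERE id = p_id;
  -- work with l_row.col1, l_row.col2, ... here
  RETURN l_row.some_column;
END MyFunc;
/

-- called from plain SQL with just the key:
SELECT MyFunc(t.id), COUNT(*)
FROM MyTable t
GROUP BY MyFunc(t.id);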

Related

How to parameterise a variable-length list of strings?

I am writing a query where 'batch_name' is the parameter. Sometimes I get only one batch name and sometimes I get two or more. How can I handle this in an Oracle BI Publisher query?
Here is my query:
Select * from pay_batch_headers pbh Where UPPER(pbh.batch_name) = UPPER(:p_batch_name)
As written, this query handles only one batch name. I want it to handle multiple batch names,
something like Where UPPER(pbh.batch_name) IN ('Batch1','Batch2','Batch3')
The problem with using an IN clause is that I can't predict the number of batches I will have to query. Can anyone help me with this, please?
You have two choices. One is to munge the variables together into a string and use some method, such as regexp_like():
where regexp_like(upper(pbh.batch_name), ??)
The parameter string should look like '^(abc|def|ghi|jkl)$' (the parentheses keep the anchors applying to every alternative). You can make it as long as you like.
Another method is to use execute immediate: dump the values into a SQL query as a string, using IN. The advantage of this method is that it can more easily use indexes.
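For instance, a rough sketch of the regexp_like() option, assuming the caller packs the batch names into :p_batch_name separated by | (e.g. 'Batch1|Batch2|Batch3'); the added anchors and parentheses make the match exact rather than partial:
SELECT *
FROM pay_batch_headers pbh
WHERE REGEXP_LIKE(UPPER(pbh.batch_name), '^(' || UPPER(:p_batch_name) || ')$')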

How can I use the new UDF functionality to create a "dynamic SQL statement"?

Is there a way to use a UDF to construct a SQL statement from a template and input variables, and then run that query?
The documentation https://cloud.google.com/bigquery/user-defined-functions?hl=en says:
A UDF is similar to the "Map" function in a MapReduce: it takes a single row as input and produces zero or more rows as output. The output can potentially have a different schema than the input.
So your UDF receives just a single row.
Therefore, no: a UDF is not meant for the purpose you described in your question.
You might take a look at views - maybe that will suit you better:
https://cloud.google.com/bigquery/querying-data#views
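For what it's worth, a view is just a saved query; in today's BigQuery standard SQL it can be created with DDL, although legacy BigQuery at the time created views through the UI or API instead. The dataset and table names below are made up:
CREATE VIEW mydataset.active_events_v AS
SELECT user_id, COUNT(*) AS event_count
FROM mydataset.events
WHERE status = 'active'
GROUP BY user_id;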

Avoiding writing the same block of statements in different places in SQL Server 2008

There are three views, one of which is a Time view. All three views use the same code to derive the time factors (financial period, financial year, quarter), so at the moment I have to write the same code in all three views.
Please suggest how to do this without writing out the whole block each time.
Create an inline table-valued function which accepts a parameter and returns a variety of date functions applied to that parameter. Then use CROSS APPLY to fetch the ones of interest in any given query.
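For instance, a minimal sketch with made-up object names and a deliberately simplified fiscal calendar (calendar quarters, YYYYMM periods), just to show the shape:
-- inline table-valued function: the period logic lives in one place
CREATE FUNCTION dbo.fn_TimeFactors (@d DATE)
RETURNS TABLE
AS
RETURN
(
    SELECT YEAR(@d)                           AS FinancialYear,
           DATEPART(QUARTER, @d)              AS FinancialQuarter,
           LEFT(CONVERT(CHAR(8), @d, 112), 6) AS FinancialPeriod  -- YYYYMM
);
GO

-- each of the three views can then reuse it via CROSS APPLY
CREATE VIEW dbo.vw_SalesWithTime
AS
SELECT s.*, tf.FinancialYear, tf.FinancialQuarter, tf.FinancialPeriod
FROM dbo.Sales AS s
CROSS APPLY dbo.fn_TimeFactors(s.SaleDate) AS tf;
GO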
You can also use a cursor to achieve the desired result: for example, pass the parameter into the cursor, do the calculation inside the cursor, and then return it.
CROSS APPLY will also work.

SQL: Update table column passed as variable

Any idea how to create this function in t-sql?
Pseudo-code:
function( #table, #table_column )
{
update #table
set #table_column = replace(#table_column,',','')
where #table_column like '%,%'
}
Ideas I've tried:
Procedures: table parameters are read-only (http://technet.microsoft.com/en-us/library/ms187926.aspx)
Functions: cannot do updates...
Any suggestions? Thanks everyone!
Update: I had a database with about 40 tables, each with columns from which I needed to remove special characters (e.g., ","). Although it would have been nice to create a function/procedure I could point at a table and column, I decided instead (based on the comments) to just write each update out. Perhaps I was looking for too fancy a solution to a relatively simple problem.
The only way to do this is dynamic SQL. Unless you are writing database tools, you rarely need to do this kind of thing. Are you sure your database is designed correctly? What is the problem you are trying to solve?
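For completeness, a sketch of the dynamic-SQL route: the table and column names have to be spliced into the statement text, so QUOTENAME is used to guard against injection. All object names here are hypothetical.
CREATE PROCEDURE dbo.usp_StripCommas
    @table_name  SYSNAME,
    @column_name SYSNAME
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX);

    SET @sql = N'UPDATE ' + QUOTENAME(@table_name) +
               N' SET ' + QUOTENAME(@column_name) +
               N' = REPLACE(' + QUOTENAME(@column_name) + N', '','', '''')' +
               N' WHERE ' + QUOTENAME(@column_name) + N' LIKE ''%,%'';';

    EXEC sys.sp_executesql @sql;   -- executes the generated UPDATE
END;
GO

-- usage (hypothetical table/column):
EXEC dbo.usp_StripCommas @table_name = 'Customers', @column_name = 'Phone';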
Why do you need a function?
Usually functions are applied to a column inside the select list, such as SELECT MYFUNC(COL1) FROM TAB1;
This can definitely be done in a Stored Procedure with dynamic SQL. You can even look at a return value for the number of rows updated.
I guess the main question is: what are your business requirements?

How best to sum multiple boolean values via SQL?

I have a table that contains, among other things, about 30 columns of boolean flags that denote particular attributes. I'd like to return them, sorted by frequency, as a recordset along with their column names, like so:
Attribute Count
attrib9 43
attrib13 27
attrib19 21
etc.
My efforts thus far can achieve something similar, but I can only get the attributes in columns using conditional SUMs, like this:
SELECT SUM(IIF(a.attribIndex=-1,1,0)), SUM(IIF(a.attribWorkflow =-1,1,0))...
Plus, the query is already getting a bit unwieldy with all 30 SUM/IIFs and won't handle any changes in the number of attributes without manual intervention.
The first six characters of the attribute column names are the same (attrib) and unique within the table. Is it possible to use wildcards in column names to pick up all the applicable columns?
Also, can I pivot the results to give me a sorted two-column recordset?
I'm using Access 2003 and the query will eventually be via ADODB from Excel.
This depends on whether or not you have the attribute names anywhere in the data. If you do, then birdlips' answer will do the trick. However, if the names are only column names, you've got a bit more work to do, and I'm afraid you can't do it with simple SQL.
No, you can't use wildcards for column names in SQL. You'll need procedural code to do this (i.e., a VB module in Access; you could do it within a stored procedure if you were on SQL Server). Use that code to build the SQL.
It won't be pretty. I think you'll need to do it one attribute at a time: select a string whose value is that attribute name and the count-where-True, then either A) run that and store the result in a new row in a scratch table, or B) append all those selects together with "Union" between them before running the batch.
My Access VB is more than a bit rusty, so I don't trust myself to give you anything like executable code....
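To make the UNION idea concrete, here is a small sketch using the (made-up) flag names from the question and a placeholder table name; in Access/Jet SQL, True is stored as -1, so the comparison below matches the original IIF(...=-1,...) test:
SELECT 'attribIndex' AS Attribute, COUNT(*) AS CountOfTrue
FROM MyTable
WHERE attribIndex = True
UNION ALL
SELECT 'attribWorkflow', COUNT(*)
FROM MyTable
WHERE attribWorkflow = True
ORDER BY CountOfTrue DESC;
Extend the pattern (or generate it from VBA) for all 30 flag columns.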
Just a simple count and group by should do it
Select attribute_name
, count(*)
from attribute_table
group by attribute_name
To answer your comment, use analytic functions for that:
Select attribute_table.*
, count(*) over(partition by attribute_name) cnt
from attribute_table
In Access, Cross Tab queries (the traditional tool for transposing datasets) need at least 3 numeric/date fields to work. However since the output is to Excel, have you considered just outputting the data to a hidden sheet then using a pivot table?