Get number of changed rows in a postgres sql or plpgsql function

How can I get the total number of rows that were updated, inserted, and deleted in a transaction called by a function?
I can get this information with pg_recvlogical. But can the postgres server be configured in any way to return this information for each postgres function that's called (both sql and plpgsql), or would it require a change to every function, if it's even possible?
Is there some kind of metadata that the driver can pass back along with the actual function results where this could somehow be included? Or would it be possible to write a generic postgres stored procedure that calls functions and adds this information?
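As far as I know there is no server-wide setting that reports this per function call; inside a plpgsql function the per-statement counts can be collected with GET DIAGNOSTICS ... ROW_COUNT and summed, which does mean touching each function. A minimal sketch, with made-up function, table, and column names:

-- Minimal sketch, assuming each function is edited: GET DIAGNOSTICS reads the
-- row count of the immediately preceding statement, and the counts are summed
-- per call. Function, table, and column names are made up for illustration.
CREATE OR REPLACE FUNCTION apply_price_changes()
RETURNS integer
LANGUAGE plpgsql
AS $$
DECLARE
    affected integer;
    total    integer := 0;
BEGIN
    UPDATE products SET price = price * 1.1 WHERE active;
    GET DIAGNOSTICS affected = ROW_COUNT;
    total := total + affected;

    DELETE FROM products WHERE discontinued;
    GET DIAGNOSTICS affected = ROW_COUNT;
    total := total + affected;

    RETURN total;  -- total rows updated + deleted in this call
END;
$$;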

Related

SQL Server: Materialized view based on stored proc with dynamic sql - how to

My client wants a pivot table showing the performance of each month (column headers) per department (row headers). It has to be possible to pass an 'as-of date' as a parameter, so the user (via PHP) can supply that date and the pivot only shows months after it. My first thought was to write a function, but the pivot also has to show a "Totals" column (and a "Totals" row, and a grand total as well). So I wrote a stored procedure which dynamically puts the pivot together.
The proc works fine, but takes too long to process (which is unsurprising given its dynamic nature). So I figured I should base an mview on it, or as Microsoft calls it, an indexed view. My approach is to first create a view based on the proc, and then figure out how to materialize it.
It seems that for the first step I need to call the proc inside my view using openquery. That only works if data access is enabled though. So I ran:
SELECT
name,
is_data_access_enabled
FROM sys.servers;
and it turns out is_data_access_enabled = FALSE on our local server (A), but it is TRUE on another server we use (B). Oddly, I can use openquery on B referring to A, something I don't understand but which is probably irrelevant to my question.
I know it's folly (or at least bad practice) to use openquery on server A referring to that same server A, so that's how I got to the point where I ask the community (you people). Would you know a better approach for achieving what I'm trying to do? I use SQL Server 2014.
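For completeness, should you still decide to go the openquery-on-the-same-server route despite that caveat, data access can be switched on per server entry and the proc wrapped like this; server, procedure, and parameter names below are placeholders, not your actual objects:

-- Sketch only: enable data access for the local server entry in sys.servers,
-- then call the proc through OPENQUERY. All names here are placeholders.
EXEC sp_serveroption @server = 'ServerA', @optname = 'DATA ACCESS', @optvalue = 'TRUE';

SELECT *
FROM OPENQUERY([ServerA], 'EXEC dbo.DepartmentPivot @AsOfDate = ''2016-01-01''');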

Call a SQL function in Nifi using ExecuteSQL or another processor

I am currently using a function in SQL Server to get the max value of a certain column. I need this value to generate a specific number of dummy entries for flowfiles that are created and inserted later on.
Is there a way of calling this function via a nifi-processor?
With ExecuteSQL I always get errors like "unable to execute SQL select query" or "the column "ab" was not found" when using select ab.functionname() (ab is the login name of the db).
In SQL Server I can just use select ab.functionname() and get the desired results.
If there is no possible way of calling this function, is there another way to create as many dummy entries as there are flowfiles, to reserve their place in the DB so that no one else can insert or use these ids (not autoincrement, because that is not possible here) while the flowfiles are being processed?
I tried using $flowfile.count and the Counter processor, but this did not solve the problem.
It should look like INSERT INTO table (id,nr) VALUES (max(id)+1, anynumber) for every flowfile; unfortunately ExecuteSQL is not able to do this.
I think this conversation can help you:
https://community.hortonworks.com/questions/26170/does-executesql-processor-allow-to-execute-stored.html
Gist:
You can use ExecuteScript or ExecuteProcess to call an appropriate script. For example, with ExecuteProcess just call the sqlplus command: choose "sqlplus" as the command and set the command arguments to something like user_id/password@dbname @"script_path/someScript.sql". In someScript.sql you put something like:
execute spname(param)
You can also write your own processor :) Of course that's more difficult and often unnecessary.
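Two notes on that gist, since the question is about SQL Server rather than Oracle: the sqlplus equivalent there would be sqlcmd, and for the INSERT pattern described in the question the PutSQL processor (rather than ExecuteSQL, which is meant for SELECT queries) is the one that runs DML. A rough sketch of the kind of statement PutSQL could execute, with table and column names assumed:

-- Sketch only; table and column names are assumptions. Computing max(id)+1 and
-- inserting in one statement (with locking hints) keeps the reservation atomic,
-- so a concurrent writer cannot grab the same id.
INSERT INTO dbo.flowfile_slots (id, nr)
SELECT ISNULL(MAX(id), 0) + 1, 42
FROM dbo.flowfile_slots WITH (UPDLOCK, HOLDLOCK);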

Is it possible to return different table variables from a SQL Server table-valued function based on the param value passed to it?

I have written a set of statements which returns a table with some static columns. But I'm also required to provide a CSV download using the same function with different columns. (The reason is that I created the static columns to display Highcharts, and we are using custom code to export the chart data, so a different format is required for the download.)
In a table-valued function, I don't know how to return dynamic columns based on the param passed to the function.
How do I write the table-valued function for this scenario? If this is not possible, what would be the alternative for this task (a stored procedure can't be used in my existing scenario due to some code limitations)?
Any suggestions please.
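For context, the column list of a table-valued function is fixed at creation time by its RETURNS clause, so a parameter can change the data but not the shape. A sketch with made-up names showing the fixed declaration, plus the closest workaround along these lines (returning a fixed superset of columns and letting the CSV export pick what it needs):

-- Sketch with made-up names: the RETURNS ... TABLE clause pins the columns when
-- the function is created, so @ForCsv can only influence the data, not the shape.
CREATE FUNCTION dbo.GetChartData (@ForCsv bit)
RETURNS @Result TABLE (Id int, Label varchar(100), CsvExtra varchar(100) NULL)
AS
BEGIN
    INSERT INTO @Result (Id, Label, CsvExtra)
    SELECT Id,
           Label,
           CASE WHEN @ForCsv = 1 THEN ExtraInfo ELSE NULL END
    FROM dbo.ChartSource;
    RETURN;
END;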

Efficiency executing query multiple times inside a function

I have a stored procedure that outputs about 40 columns. I saved all the headers in one table.
Inside the sp that generates the report, I use a function to get all the headers (dynamic sql).
The function contains a query against the header table with the parameters. This function executes 40 times, so the query runs 40 times.
Instead of running the query inside the function 40 times, is there a solution to do it more efficiently?
EDIT: Changed the story a little, maybe it's clearer now.
If only the headers differ, I would create different views and query the views, which would call a table function (you would need to rewrite your report as a table function), or I would write the stored proc's output to a table.
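As an alternative, one way to avoid the 40 calls is to read the header table once into a temp table at the start of the report proc and join against that; a sketch with assumed table, column, and parameter names:

-- Sketch only; table, column, and parameter names are assumptions. Load the
-- headers once, then reuse them instead of calling the header function for
-- each of the 40 columns.
DECLARE @ReportId int = 1;  -- placeholder for the report proc's parameter

SELECT HeaderName, ColumnPosition
INTO #Headers
FROM dbo.HeaderTable
WHERE ReportId = @ReportId;

-- Later steps (including the dynamic SQL) select from #Headers rather than
-- invoking the header function per column.
SELECT HeaderName FROM #Headers ORDER BY ColumnPosition;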

How to check the number of inserted/modified records in TADOCommand?

I am using a SQL Server database, and after calling a simple SQL script I would like to know how many records were affected by the last (or only) executed statement in the script.
I cannot find a reference for how to achieve this with Delphi's TADOCommand, and I know SQL Server gives this information to the provider. I am aware of workarounds like getting @@ROWCOUNT in another query, yet this adds overhead and unnecessary complexity.
Thanks.
Do you use the
function Execute(var RecordsAffected: Integer; const Parameters: OleVariant): _Recordset;
version of the Execute method?
From the doc:
RecordsAffected indicates the number of records, if the command operates on data, that are affected by the command after execution.
So that should give you what you need.
Disclaimer: I cannot test this against SQL Server (don't have it).
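One thing worth checking on the SQL side: if the script switches NOCOUNT on, the provider no longer receives the rows-affected counts and RecordsAffected typically comes back as -1. A sketch (the UPDATE is just an example statement against a made-up table):

-- Sketch only; the table is made up. Leaving NOCOUNT off lets SQL Server send
-- the rows-affected count back to ADO, where it surfaces in RecordsAffected.
SET NOCOUNT OFF;
UPDATE dbo.Orders SET Shipped = 1 WHERE OrderId = 42;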