SQL statement instrumentation?

I was wondering if there is a reliable way to instrument SQL query statements such as
SELECT * from table where 1=1;
into a new statement like the following, which stores the result relation in a temporary table:
SELECT * into temp result from table where 1=1;
For this simple statement, I can parse it and add the INTO clause before FROM. I am wondering whether there are libraries that can do this for more complicated statements (with WITH clauses, etc.), so that the end result of a set of query statements is stored in a result table.
BTW, I am using PHP/JavaScript with PostgreSQL 9.3.
Thanks.
Clarification:
This question is not a duplicate of Creating temporary tables in SQL or Creating temporary tables in SQL, because those two questions are about SQL grammar/usage issues around creating temporary tables. This question is not about SQL usage but about program instrumentation: whether and how a sequence of SQL query statements can be analyzed and transformed to achieve a certain effect. It is not about how to manually write new statements from scratch, but about how to transform an existing set of statements.
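One transformation that sidesteps parsing entirely is to wrap the whole statement rather than inject an INTO clause: PostgreSQL (and most other engines) accepts CREATE TEMP TABLE name AS followed by the original query, including queries that begin with WITH. A minimal sketch of that idea, using Python with SQLite purely for illustration (the asker's stack is PHP/JavaScript with PostgreSQL; the table and query here are made up):

```python
import sqlite3

def materialize(conn, query, result_table):
    # Instead of parsing the statement to inject an INTO clause,
    # wrap the whole query: CREATE TEMP TABLE <name> AS <query>.
    # The wrapped query is used verbatim, so WITH clauses still work.
    conn.execute(f"CREATE TEMP TABLE {result_table} AS {query}")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b")])

materialize(conn, "SELECT * FROM t WHERE 1=1", "result")
rows = conn.execute("SELECT * FROM result ORDER BY id").fetchall()
print(rows)  # [(1, 'a'), (2, 'b')]

# A statement starting with WITH needs no special handling:
materialize(conn,
            "WITH big AS (SELECT * FROM t WHERE id > 1) SELECT name FROM big",
            "result2")
print(conn.execute("SELECT * FROM result2").fetchall())  # [('b',)]
```

Because the original query text is left untouched, no SQL parsing library is needed for this particular transformation; only the result table name has to be generated by the instrumenting program.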

Related

How to get CREATE TABLE Statement from existing table in db2?

I am using DB2 and Oracle SQL Developer.
How to get the CREATE TABLE Statements from the existing tables?
There are too many tables and it will be a very lengthy process to do manually.
There is a special db2look utility for DDL extraction in Db2. You may refer to its options and their meaning at this link.
If you want SQL access to its capabilities, you may use the SYSPROC.DB2LK_GENERATE_DDL stored procedure, which supports most of the utility's options. The routine has an output parameter that receives an "invocation number" INT value after the call.
For a single table:
CALL SYSPROC.DB2LK_GENERATE_DDL ('-e -noview -t MY_SCHEMA.MY_TABLE', ?);
SELECT SQL_STMT
FROM SYSTOOLS.DB2LOOK_INFO_V
WHERE OP_TOKEN = <value_of_output_parameter_from_call_above>
ORDER BY OP_SEQUENCE;
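The overall pattern — call a routine or query a catalog view per table, then collect the generated statements — scripts easily across many tables. As a rough illustration only (using SQLite's catalog table from Python, not DB2; in DB2 the route is the db2look call above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (id INTEGER PRIMARY KEY, name TEXT)")

# DB2 exposes the generated DDL through SYSTOOLS.DB2LOOK_INFO_V after the
# DB2LK_GENERATE_DDL call; SQLite keeps each table's original CREATE
# statement in its sqlite_master catalog, so one query covers all tables.
ddl = {name: sql for name, sql in conn.execute(
    "SELECT name, sql FROM sqlite_master WHERE type = 'table'")}
print(ddl["my_table"])
```

Looping over the catalog like this avoids the "lengthy manual process" the question mentions, whatever the engine's specific catalog view is called.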
In SQLDeveloper if you can see the table there's the initial Create Table Statement in the SQL Tab
You should do that for each table, this is a way to do it but I'm not sure it's fast enough for you.

Oracle is dropping my table before select statement finishes

At work we have a large Oracle SQL query designed to output two SELECT statements based on analysis and table combining in prior scripts. At the end of these SELECT statements we truncate the temp tables that were created. My issue is that the tables are getting truncated before the SELECT statement has time to run, resulting in zero output for both queries and empty tables, so the whole process has to be run again to populate the tables correctly. This is something I'm trying to help automate, but I'm stuck on how to get Oracle to wait for the SELECT statement to finish processing before triggering the TRUNCATE. Very simply, it looks like:
Select * from temp;
Truncate Table temp;
commit;
TRUNCATE is a DDL statement in Oracle, which means it can modify the structure of tables and databases. Instead of using TRUNCATE, why not change it to a simple DELETE, which is a plain DML statement in Oracle that only changes the data?
What you describe can only be the case if the two SELECT statements (one followed by a TRUNCATE) are running in separate sessions. Run them in the same session and the problem is solved, although you may lose the performance benefit of running them in parallel.
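The same-session ordering point can be sketched in a few lines. This uses Python with SQLite purely to show the sequencing (Oracle specifics differ: there TRUNCATE is DDL and issues an implicit commit, which is exactly why DELETE in the same session and transaction is safer here); table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE temp_results (val INTEGER)")
conn.executemany("INSERT INTO temp_results VALUES (?)", [(1,), (2,), (3,)])
conn.commit()

# Same session: the SELECT is fully consumed before the cleanup runs,
# so the cleanup cannot race ahead of it.
rows = conn.execute("SELECT val FROM temp_results ORDER BY val").fetchall()
conn.execute("DELETE FROM temp_results")  # DML cleanup, unlike TRUNCATE (DDL)
conn.commit()

print(rows)  # [(1,), (2,), (3,)]
print(conn.execute("SELECT COUNT(*) FROM temp_results").fetchone()[0])  # 0
```

When the select and the cleanup run on one connection in this order, the "empty output" symptom from the question cannot occur.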

Stored procedure for generic MERGE

I have a set of 10 tables in a database (DB1). And there are 10 tables in another database (DB2) with exact same schema on the same SQL Server 2008 R2 database server machine.
The 10 tables in DB1 are frequently updated with data.
I intend to write a stored procedure that would run once every day for synchronizing the 10 tables in DB1 with DB2. The stored procedure would make use of the MERGE statement.
Now, my aim is to make this as generic and parameterized as possible, that is, to accommodate more tables down the line and to accommodate different source and target database names. Definitely no hard coding is intended.
This is my algorithm so far:
Have the database names as parameters
Have the first query within the stored procedure return the names of the 10 tables from a lookup table (this can be 10, 20 or however many)
Have a generic MERGE statement that does the sync for each of the above set of tables (based on primary key?)
This is where I need more inputs on.
What is the best way to achieve this stored procedure? SQL syntax would be helpful.
I had to do something similar. To do that, I used a string with a "skeleton" for the MERGE statement, then retrieved the list of columns and the primary keys with a simple query on the sys views.
You could do something similar to build your MERGE statement. Here's a sketch I wrote as an example (it's rough, but it should give you a hint anyway :P )
SQLFiddle
Then you just need to execute it with the usual sp_executesql stored procedure.
By the way, always be careful when building command strings this way: concatenating identifiers into dynamic SQL is a SQL injection risk, so validate the table names against your lookup table.
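The skeleton-plus-catalog idea can be illustrated compactly. This sketch uses Python with SQLite (so the catalog query is PRAGMA table_info and the upsert is SQLite's ON CONFLICT rather than T-SQL MERGE; on SQL Server 2008 R2 you would read sys.columns/sys.indexes and run the built string through sp_executesql). The table and column names are invented:

```python
import sqlite3

def build_upsert(conn, table):
    # Read column names and primary-key columns from the catalog,
    # then fill a skeleton statement with them.
    info = conn.execute(f"PRAGMA table_info({table})").fetchall()
    cols = [row[1] for row in info]          # row[1] = column name
    pks = [row[1] for row in info if row[5]]  # row[5] = pk position (0 if none)
    placeholders = ", ".join("?" for _ in cols)
    updates = ", ".join(f"{c} = excluded.{c}" for c in cols if c not in pks)
    # ON CONFLICT ... DO UPDATE plays the role MERGE plays on SQL Server.
    return (f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders}) "
            f"ON CONFLICT ({', '.join(pks)}) DO UPDATE SET {updates}")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, name TEXT)")
stmt = build_upsert(conn, "target")
conn.execute(stmt, (1, "first"))
conn.execute(stmt, (1, "updated"))  # same key: row is updated, not duplicated
print(conn.execute("SELECT * FROM target").fetchall())  # [(1, 'updated')]
```

Because the statement is generated from the catalog, adding an eleventh table to the lookup requires no code change, which is the genericity the question asks for.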

SQL stored procedure failing in large database

I have a particular SQL file in which I copy all contents from one table in a database to another table in another database.
Traditional INSERT statements are used to perform the operation. However, this table has 8.5 million records and the operation fails; the queries succeed against a smaller database.
Also, when I run a SELECT * query on that particular table, SQL Server Express shows an out-of-memory exception.
In particular, there is one table that has that many records, and this table alone I want to copy from the old DB to the new DB.
What are alternate ways to achieve this?
Is there any quick workaround by which we can avoid this exception and make the queries succeed?
Let me put it this way. Why would this operation fail when there are a lot of records?
I don't know if this counts as a "traditional INSERT", but have you tried SELECT INTO?
http://www.w3schools.com/sql/sql_select_into.asp
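Another common way out of the memory failure is to copy in batches rather than in one giant statement, committing after each batch so neither the client nor the log has to hold millions of rows at once. A rough sketch of keyset-paginated batching, in Python with SQLite for illustration (the question is SQL Server, where BCP, SSIS, or a batched INSERT...SELECT loop would be the native options; table names and the batch size are invented):

```python
import sqlite3

def copy_in_batches(src, dst, table, batch_size=100_000):
    # Copy keyset-paginated batches: each round fetches the next slice
    # ordered by the primary key, so no statement touches all rows.
    last_id = -1
    while True:
        rows = src.execute(
            f"SELECT * FROM {table} WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch_size)).fetchall()
        if not rows:
            break
        dst.executemany(
            f"INSERT INTO {table} VALUES ({', '.join('?' * len(rows[0]))})",
            rows)
        dst.commit()          # commit per batch keeps memory and log small
        last_id = rows[-1][0]

src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for c in (src, dst):
    c.execute("CREATE TABLE big (id INTEGER PRIMARY KEY, payload TEXT)")
src.executemany("INSERT INTO big VALUES (?, ?)",
                [(i, f"row{i}") for i in range(250)])
src.commit()

copy_in_batches(src, dst, "big", batch_size=100)
print(dst.execute("SELECT COUNT(*) FROM big").fetchone()[0])  # 250
```

This is why the operation fails only at 8.5 million rows: a single INSERT of the whole table must be materialized and logged as one unit, and batching removes exactly that bottleneck.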

SQL CE parameter passing

I have a number of queries which check for a record in a table and add it if there is no such record. There are 2 queries involved in this process:
1) select id from table where <conditions>
If any ids corresponding to the conditions are returned, I do nothing (or update the matching records as I want); if there are no ids, I execute the second query:
2) insert into table(<columns>) values(<values>)
Of course <conditions>, <columns> and <values> are correct strings for their context.
What I want is to join these 2 queries into one:
insert into table(<columns>)
select <parameter list>
where not exists(select id from table where <conditions>)
So there will be only one query involved instead of 2.
The example of this query is:
insert into persons(name) select #Name where not exists (select id from persons where name = #Name)
The problem is that I use parameters for queries and when I execute this composite query I get an exception saying "A parameter is not allowed in this location. Ensure that the '#' sign is in a valid location or that parameters are valid at all in this SQL statement."
Is there a way to bypass parameters and don't get an exception for this query?
I'm using C# and SqlCe-related classes.
What's the benefit of merging the queries into a composite? Even if the query is supported (and I actually doubt that the SQL Compact query parser supports this), the SQL engine still has to essentially do both queries anyway. I find the two separate queries more readable.
As separate queries, I'm also willing to bet you could make it substantially faster than the composite query by using TableDirect. Take a look at the answer to this question and benchmark it in your app. It's likely 10x faster than your existing query pair because it forgoes the query parser altogether.
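The two-statement approach the answer favors is simple to structure. A sketch in Python with SQLite (the question's environment is C# with SQL CE, where the same shape would use two parameterized SqlCeCommand objects; TableDirect has no equivalent here, and the table is the question's hypothetical persons):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE persons (id INTEGER PRIMARY KEY, name TEXT)")

def insert_if_missing(conn, name):
    # Two parameterized statements: an existence check, then a plain
    # INSERT only when the check finds nothing. Both take parameters
    # in positions every engine's parser accepts.
    row = conn.execute("SELECT id FROM persons WHERE name = ?",
                       (name,)).fetchone()
    if row is None:
        conn.execute("INSERT INTO persons (name) VALUES (?)", (name,))
        conn.commit()

insert_if_missing(conn, "alice")
insert_if_missing(conn, "alice")  # second call is a no-op
print(conn.execute("SELECT COUNT(*) FROM persons").fetchone()[0])  # 1
```

Note that this check-then-insert pair is not atomic under concurrent writers; SQL CE's typical single-writer usage makes that a minor concern, but on a server engine you would want a unique constraint on the checked column as a backstop.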