SQL query executed through Windows batch: missing table

I execute a SQL query (against PostgreSQL) via psql.exe inside a Windows batch file. I get an error I can't explain, saying that a FROM clause is missing for a table that is never referenced in the query (see below). When I search the batch file for the table name geo_c3_0_3_mo, the string is not found...
Any idea on this kind of issue?
EDIT :
If I copy-paste the query from the batch file into a pgAdminIII SQL query window, the query runs perfectly and no error message is returned.
When I remove one of the subqueries, the error either disappears or mentions another badly written table name (for instance: missing FROM-clause for table "geoc__0_3_mo")... It increasingly looks like the issue comes from the length of the line (19,413 characters!). As far as I can tell, it is not possible to split the query across several lines within a batch file the way you can in a pgAdmin III SQL query window. The solution would be to keep the query inside a *.sql file and to call that file from the batch file.

Write the query to a temporary .sql file from your batch, then execute it with psql -f. This bypasses the command-line length issue.
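A minimal sketch of that approach in the batch file, assuming psql.exe is on the PATH; the connection options, file name and query text here are placeholders:

@echo off
rem Write the long query to a temporary .sql file, one clause per line.
set "SQLFILE=%TEMP%\long_query.sql"
echo SELECT t.col_a, t.col_b > "%SQLFILE%"
echo FROM my_schema.my_table t >> "%SQLFILE%"
echo WHERE t.col_a = 'some_value'; >> "%SQLFILE%"
rem Run the file with -f so the statement never hits the command-line length limit.
psql.exe -h localhost -p 5432 -U postgres -d my_db -f "%SQLFILE%"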

Related

How to run thousands of queries in Ms Access?

I've discovered the SQL View in MS Access to execute some queries, but I need to execute about 20,000 UPDATE queries I have in a .sql file.
When I paste them into the SQL View it says "Text is too long to modify".
How can I run those UPDATEs?
The limit on the number of characters in an Access SQL query is about 64,000 - see https://support.office.com/en-us/article/Access-2010-specifications-HA010341462.aspx. And unfortunately you cannot execute multiple statements in a single query. I think this will mean quite a bit of work for you in VBA. Here is an example approach (a VBA sketch; the file path is a placeholder):
Dim fileNum As Integer
Dim sqlLine As String
fileNum = FreeFile
Open "C:\path\to\updates.sql" For Input As #fileNum   ' path is a placeholder; one statement per line assumed
Do While Not EOF(fileNum)
    Line Input #fileNum, sqlLine
    If Len(Trim$(sqlLine)) > 0 Then   ' skip blank lines
        CurrentDb.Execute sqlLine, dbFailOnError
    End If
Loop
Close #fileNum
Probably a nasty surprise for you if you are used to executing huge batches of statements using other RDBMS!
An alternative suggestion: we don't know exactly what your file looks like or where it comes from, but if it is generated from another RDBMS which you have access to, then I would very strongly recommend that you set up an ODBC connection to it and query out the data you need (either by linking the tables or writing a pass-through query), then insert it into your local Access tables. This will be many orders of magnitude faster than executing thousands of individual statements.
If your only source of the data is the SQL statements, then you may still be better off if you can parse the SQL text into the relevant columns (for example the PK and the value to be updated, or, if inserting, all column values), save the result as a CSV file, import it into Access, add keys as necessary, and then run a single UPDATE statement as an updatable query against the imported data and the existing tables (a sketch follows below). Dumping the file into Excel and using the various string functions may enable you to parse the data quite quickly.
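A sketch of that single UPDATE in Access SQL, assuming the parsed data was imported into a local table named imported_updates with columns pk and new_value (all names here are placeholders):

UPDATE existing_table INNER JOIN imported_updates
    ON existing_table.pk = imported_updates.pk
SET existing_table.value_col = imported_updates.new_value;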
There may be an easier way but you could write VBA code that reads the text file line by line and then uses DoCmd.RunSQL to run each query.

SQL INSERT sp_cursor Error

I have a pair of linked SQL servers: ServerA and ServerB. I want to write a simple INSERT INTO SELECT statement which will copy a row from ServerA's database to ServerB's database. ServerB's database was copied directly from ServerA's, and so they should have the exact same basic structure (same column names, etc.)
The problem is that when I try to execute the following statement:
INSERT INTO [ServerB].[data_collection].[dbo].[table1]
SELECT * FROM [ServerA].[data_collection].[dbo].[table1]
I get the following error:
Msg 16902, Level 16, State 48, Line 1
sp_cursor: The value of the parameter 'value' is invalid.
On the other hand, if I try to execute the following statement:
INSERT INTO [ServerB].[data_collection].[dbo].[table1] (Time)
SELECT Time FROM [ServerA].[data_collection].[dbo].[table1]
The statement works just fine, and the code is executed as expected, regardless of which or how many columns I specify in the insert.
So my question here is why would my INSERT INTO SELECT statement function properly when I explicitly specify which columns to copy, but not when I tell it to copy everything using "*"? My second question would then be: how do I fix the problem?
Googling around to follow up on my initial hunch, I found a source I consider reliable enough to cite in an answer.
The 'value' parameter specified isn't one of your columns, it is the optional argument to sp_cursor that is called implicitly via your INSERT INTO...SELECT.
From SQL Server Central...
I have an ssis package that needs to populate a sql table with data
from a pipe-delimited text file containing 992 (!) columns per record.
...Initially I'd set up the package to contain a data flow task to use
an ole db destination control where the access mode was set to Table
or view mode. For some reason though, when running the package it
would crash, with an error stating the parameter 'value' was not valid
in the sp_cursor procedure. On setting up a trace in profiler to see
what this control actually does it appears it tries to insert the
records using the sp_cursor procedure. Running the same query in SQL
Server Management Studio gives the same result. After much testing and
pulling of hair out, I've found that by replacing the sp_cursor
statement with an insert statement the record populated fine which
suggests that sp_cursor cannot cope when more than a certain number
of parameters are attempted. Not sure of the figure.
Note the common theme here between your situation and the one cited - a bazillion columns.
That same source offers a workaround as well.
I've managed to get round this problem however by setting the access
mode to be "Table or view - fast load". Viewing the trace again
confirms that SSIS attempts this via a "insert bulk" statement which
loads fine.
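As an aside, if typing out an explicit column list for a very wide table is the sticking point, the list can be generated from the catalog views on the source side and pasted into the INSERT. A sketch, reusing the three-part names from the question (the rest is an assumption):

-- Build a comma-separated, bracket-quoted column list for the source table.
SELECT STUFF((
    SELECT ', ' + QUOTENAME(c.name)
    FROM [data_collection].sys.columns AS c
    WHERE c.object_id = OBJECT_ID(N'[data_collection].[dbo].[table1]')
    ORDER BY c.column_id
    FOR XML PATH('')), 1, 2, '') AS column_list;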

capture executed sql from input table in pentaho pdi

I am using Pentaho for data migration testing. I have set up a "Table input" step where many parts of the query inside the table input are variables. I have been looking for a way to capture that query after it gets executed at runtime.
I was wondering if there is any specific system log variable for SQL, or whether it has to do with metadata. Need help! Thanks
Maybe the following approach will help:
We assume a transformation reading a CSV file to get the dynamic portion of the SELECT statement (e.g. the columns) and setting the variable columns with it.
The second transformation uses this variable to generate the SELECT statement and store it into the variable sql_statement.
In the main transformation we use ${sql_statement} as the SELECT statement of the table input and write the data to an output file (that's the business process, so to speak). From the same input we copy the output to another path. There we add the current time as a field (using the "Get system data" step) and the generated SQL statement, join them as a cartesian product, and group the result by sql_statement. That way we can compute the first time and the last time the statement was used. These results are written to a text file.
The last thing we need is a job calling the three transformations sequentially.
This is a sample output:
sql_statement;min_time;max_time
SELECT my_column FROM test_table;2014/05/08 00:41:21.143;2014/05/08 00:41:21.144
Thank you Marcus! I did something similar and it works, awesome.
I gathered the query parts from the table field where they were kept and formed the full query in JavaScript. After that, the full query is sent as a parameter to a transformation that runs and logs the query.
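For reference, the part done in a "Modified Java Script Value" step could look roughly like this; the incoming field names are assumptions:

// select_part, from_part and where_part are incoming fields holding the query fragments.
var full_query = select_part + ' ' + from_part + ' ' + where_part;
// Add full_query to the step's output fields, then hand it to the transformation
// that runs and logs the query.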

Looping over SQL files in directory

I've always thought that looping over SQL files in a directory in SSIS is easy... But I've got a problem today:
Execute SQL Task isn't executing statements that are in the sql file.
In the sql file I've got delete statement and then insert statement.
The SSIS Execute SQL Task component finishes after about 2 seconds, while executing the same script manually usually takes about 2 minutes, and of course in SSIS it doesn't insert anything.
I checked the variable value that is coming from the Foreach Loop (the full file path) and it is OK.
I've got a File Connection parameterized (by Expression) with this variable.
What am I doing wrong? Thanks for help.
Do you have GO after each call within your SQL file?
Example:
-- Your INSERT T-SQL code here
GO
-- Your DELETE T-SQL code here
GO
-- Etc...
It will not continue if you don't have this.
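For illustration, a file laid out that way might look like this (table and column names are placeholders):

DELETE FROM dbo.target_table WHERE batch_id = 42;
GO
INSERT INTO dbo.target_table (col_a, col_b)
SELECT col_a, col_b FROM dbo.staging_table;
GO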

How do I programmatically run a complex query on an as400?

I'm new to working on an AS400, and I have a query that joins across 4 tables. The query itself is fine; it runs in STRSQL and displays the results.
What I am struggling with is getting the query to run programmatically (it will eventually be run from a scheduled CL script).
I tried creating a physical file that contains the query and running it with RUNQRY, but it simply displays the query itself, not the actual result set.
Does anyone know what I am doing wrong?
UPDATE
Thanks everyone for the direction and the resources; with them I was able to reach my goal. In case it helps anyone, this is what I ended up doing (all of this was done in its own library, ALLOCATE):
Created a source physical file (using CRTSRCPF): QSQLSRC, and created a member named SQLLEAGSEA, with the type of TXT, that contains the SQL statement.
Created another source physical file: QCLSRC, and created a member named POPLEAGSEA, with the type of CLP, that changes the current library to ALLOCATE then runs the query using RUNSQLSTM (more detail on this below). Here is the actual command:
RUNSQLSTM SRCFILE(QSQLSRC) SRCMBR(SQLLEAGSEA) COMMIT(*NONE) NAMING(*SYS)
Added the CLP to the scheduled jobs (using ADDJOBSCDE), running the following command:
CALL PGM(ALLOCATE/POPLEAGSEA)
With regard to RUNSQLSTM, my research indicated that I wasn't going to be able to use it, because it doesn't support SELECT statements. What I didn't indicate in my question was what I needed to do with the result - I was going to be inserting the resultant data into another table (had I said that, I'm sure the helpers could have figured this out a lot quicker). So effectively, I wasn't going to be doing a SELECT; my end result is actually an INSERT. So my SQL statement (in SQLLEAGSEA) begins with:
INSERT INTO
ALLOCATE/LEAGSEAS
SELECT
...
BLAH BLAH BLAH
...
From my research, I gather that RUNSQLSTM doesn't support SELECT because it doesn't have a mechanism to do anything with the results. Once I stopped taking baby steps and realized I needed to SELECT AND INSERT in the same statement, it solved my main problem.
Thanks again everyone!
The command is RUNSQLSTM to run a static SQL statement in a physical file member or stream file.
It is a non-interactive command so it will not execute sql statements that attempt to return a result set.
If you want more control, including the ability to run interactive statements, see the Qshell db2 utility.
For example:
QSH CMD('db2 -f /QSYS.LIB/MYLIB.LIB/MYSRCFILE.FILE/MYSQL.MBR')
Note that the db2 utility only accepts the *SQL naming convention.
QM Query
If all the SQL you need is the single complex SQL statement, and this is what it sounds like, then your best bet is to use Query Management Query (see QM Query manual here).
The results can be directed to a display, a spool file, or a physical file (ie a DB2 table). The default output when run interactively is to the screen, but when run in a (scheduled) batch job it will default to a spool file report.
You can create the QM Query interactively via WRKQMQRY, in prompted mode (much like Query/400) or in SQL mode. Or you can compile the QM Query from source, with the CRTQMQRY command.
To run your QM Query, use the STRQMQRY command.
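For example, to run it from CL and send the result set to a physical file (the library and object names here are placeholders):

STRQMQRY QMQRY(MYLIB/MYQMQRY) OUTPUT(*OUTFILE) OUTFILE(MYLIB/RESULTS)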
RUNSQL cmd
If you are using a system that has IBM i 7.1 fully up-to-date, and has Technology Refresh 4 (TR4) installed, then you could also use the new RUNSQL command to execute a single statement. (see discussion in developerWorks)
SQL Scripting w/ RUNSQLSTM cmd
From CL you can run SQL scripts of multiple SQL statements from a source file member. There is no standard default source file name for this, but QSQLSRC is commonly used. The source member can contain multiple non-interactive SQL statements. This means you cannot use a SELECT statement (directly) since theoretically it will not know where to send the results. CL commands are even allowed if given a CL: prefix. Both SQL and CL statements should be terminated with a semicolon ;. While the SQL statements cannot display data directly to the screen, the same restriction does not apply to the scripted CL commands.
The STRQMQRY command can be embedded in the RUNSQLSTM script, by placing the prefix "CL: " in front of the command. Since STRQMQRY can direct output to the screen, a report, or an output table, this can come in very useful.
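Putting those pieces together, a RUNSQLSTM source member might look roughly like this (library, table and QM Query names are placeholders; *SYS naming is assumed; note the CL: prefix and the terminating semicolons):

CL: CHGCURLIB CURLIB(MYLIB);
DELETE FROM MYTABLE;
INSERT INTO MYTABLE
    SELECT * FROM MYLIB/SOURCETBL;
CL: STRQMQRY QMQRY(MYLIB/MYREPORT);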
Remember that to direct your output from a SELECT query to a file you can use either the INSERT or CREATE TABLE statements.
CREATE TABLE newtbl AS
( full-select )
WITH DATA;
Or, to put the results into a table you create in your job's QTEMP library:
DECLARE GLOBAL TEMPORARY TABLE temptbl AS
( full-select )
WITH DATA;
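Or, when the target table already exists (existingtbl is a placeholder name):

INSERT INTO existingtbl
    full-select;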
[Note: If you create the source to be used by CRTQMQRY, you are advised to create it as CRTSRCPF yourlib/QQMQRYSRC RCDLEN(91), since the compiler will only use 79 columns of your source data (adding 12 for sequence and change date =91). However for QM Forms, which can be used to provide additional formatting, the CRTQMFORM compiler will use 81 columns so RCDLEN(93) is advised for QQMFORMSRC.]
RUNQRY is a utility that lets you execute a query that was created by another utility named WRKQRY. If you really want to process SQL statements held in a file, try RUNSQLSTM. It uses a source physical file to store the statements, not a database file. The standard name for that source physical file is QQMQRYSRC. To create that file: CRTSRCPF yourlib/QQMQRYSRC. Then you can use PDM to work with that source PF: WRKMBRPDM yourlib/QQMQRYSRC. Use F6 to create a new source member, and make it source type TXT. Then use option 2, which will start an editor called SEU. Copy/paste your SQL statements into this editor and press F3 to save the source. Once the source is saved, use RUNSQLSTM to execute it.
It is (now) possible to run SQL directly in a CL program without using QM Query, RUNSQLSTM or QShell.
Here is an article that discusses the RUNSQL statement in CL programs...
http://www.mcpressonline.com/cl/the-cl-corner-introducing-the-new-run-sql-command.html
The article contains information on what OS levels are supported as well as clear examples of several ways to use the RUNSQL statement.
This will work in two steps:
RUNSQL SQL('CREATE TABLE QTEMP/REPORT AS (SELECT +
EXTRACT_DATE , SYSTEM, ODLBNM, SUM( +
OBJSIZE_MB ) AS LIB_SIZE FROM +
ZSYSCOM/DISKRPTHST WHERE ODLBNM LIKE +
''SIS%'' GROUP BY EXTRACT_DATE, SYSTEM, +
ODLBNM ORDER BY LIB_SIZE DESC) WITH +
DATA') COMMIT(*NONE) DATFMT(*USA) DATSEP(/)
RUNQRY QRYFILE((QTEMP/REPORT)) OUTTYPE(*PRINTER) +
OUTFORM(*DETAIL) PRTDFN(*NO) PRTDEV(*PRINT)
The first step creates a temporary result table in QTEMP, and the second step runs an ad hoc query over just that temporary table, directing the output to a spool file.
Thanks,
Michael Frilot
There is of course a totally different solution: you could write and compile a program containing the statement. It requires some longer reading up, especially if you are new to the platform, but it should give you the most flexibility over what you do with the results. You can use embedded SQL in C, C++, RPG, RPG/LE, REXX, PL (which I don't know anything about) and COBOL. Doing that, you can react in any processable way to the results of one query and start or create other queries based on what you get.
Although some old-fashioned RPG programmers try everything to deny that SQL in RPG exists, it is possible today, in many cases, to write RPG programs with SQL only and no direct file access (without F-specs, for those who know RPG).
If your solution works for you, perfect. If you need to do something else, have a look at this PDF: http://publib.boulder.ibm.com/infocenter/iseries/v5r3/topic/rzajp/rzajp.pdf
The integration into RPG is not too bad; it works with the normal program flow. It would look something like this (in free form):
/free
// init search values:
searchval = 'Someguy';
// do the SQL query:
exec sql
SELECT column1, column2
INTO :var1, :var2
FROM somelib/somefile
WHERE keycol=:searchval;
// now do something with the values:
some_proc(var1);
/end-free
In this, var1, var2, and searchval are ordinary RPG variables; no quoting is needed. It also works with data structures (externally described ones, e.g. the record format of the file itself, fit well). You can work with cursors and loops too, of course. I feel that RPG programs tend to be easier to read with this.