How does this SQL loader script work? - sql

I'm currently working with the Exasol database for the first time and came across a script that is responsible for running the SQL statements written in a .sql file.
Here is the script:
C:\Program Files\EXASOL\EXASolution\EXAplus\exaplusx64.exe -configDir EXASolutionConfig -profile profile_PROD_talend -q -f D:/Data/Customer/PROD/EXASolution_SQL/EXASOL_data_script.sql -- databaseName tableName /exasolution/StageArea/fileName.csv
I want to know how this script works and what it actually does. What I understood so far is below.
First, "C:\Program Files\EXASOL\EXASolution\EXAplus\exaplusx64.exe" starts EXAplus on the command line, and then it points to the location of the .sql script.
What I'm not getting:
1) What is this part doing: "-configDir EXASolutionConfig -profile profile_PROD_talend -q -f"?
2) What are these flags "-q -f" doing?
3) After launching exaplusx64.exe, does EXAplus connect to the database and table mentioned in the script? If so, how does the CSV file play its role in this script? I mean, the .sql file contains just an SQL statement; if it's taking data from a file, then how? I'm not getting this.
Please share your comments.

1) This is where you tell EXAplus to read the profile profile_PROD_talend from the folder EXASolutionConfig and execute the file D:/Data/Customer/PROD/EXASolution_SQL/EXASOL_data_script.sql in quiet mode (-q).
From the manual:
-configDir This is not actually in the EXASOL manual; I assume it's the folder containing the profiles, or maybe it does nothing.
-profile Name of connection profile defined in <configDir>/profiles.xml (profiles can be edited in the GUI version). You can use a profile instead of specifying all connection parameters.
-q Quiet mode which suppresses additional output from EXAplus.
-f Name of a text file containing a set of instructions that run and then stop EXAplus.
2) Quiet mode (-q) and the flag that names the file of instructions to execute (-f).
3) When you run this command, EXAplus connects to the database using the information provided in the profile and executes the .sql file you passed.
Now things become interesting: the -- allows you to pass arguments to the .sql file. So you are passing three parameters (databaseName, tableName, and /exasolution/StageArea/fileName.csv). If you open the SQL script you will find &1, &2, and &3; these are the placeholders for the parameters passed by your command.
From the manual again:
-- <args> SQL files can use arguments given over via the parameter "-- " by evaluating the variables &1, &2 etc.
For example, the file test.sql including the content
--test.sql
SELECT * FROM &1;
can be called in the following way:
exaplus -f test.sql -- dual
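Your EXASOL_data_script.sql is not shown, but since the third parameter is a CSV path, it most likely contains an IMPORT statement that uses those placeholders. A hedged sketch of what such a script could look like (the CSV options here are assumptions):
-- EXASOL_data_script.sql (illustrative sketch only)
IMPORT INTO &1.&2
FROM LOCAL CSV FILE '&3'
COLUMN SEPARATOR = ','
SKIP = 1;
EXAplus substitutes &1, &2, and &3 with databaseName, tableName, and /exasolution/StageArea/fileName.csv before executing the statement, which is how the CSV file feeds data into the table even though the command line only names a .sql file.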

Related

Replacing a word in a db2 sql file causes DSNC105I : End of file reached while reading the command error

I have a dynamic SQL file in which the value of TBCREATOR changes according to a given parameter.
I use a simple Python script to change TBCREATOR=<variable here> and write the result to an output SQL file.
Calling this file using db2 -td# -vf <generated sql file> gives
DSNC105I : End of file reached while reading the command
Here is the file in which I need the TBCREATOR variable replaced:
CONNECT to 204.90.115.200:5040/DALLASC user *** using ****#
select REMARKS from sysibm.SYSCOLUMNS WHERE TBCREATOR='table' AND NAME='LCODE'
#
Here is the python script:
#!/usr/bin/python3
# #------replace table value with schema name
# print(list_of_lines)
fin = open("decrypt.sql", "rt")
#output file to write the result to
fout = open("decryptout.sql", "wt")
for line in fin:
    fout.write(line.replace('table', 'ZXP214'))
fin.close()
fout.close()
After decryptout.sql is generated I call it using db2 -td# -vf decryptout.sql
and get the error given above.
What's irritating is that I have another SQL file that contains exactly the same data as decryptout.sql and runs smoothly with the db2 -td# -vf ... command. I used the Unix command cmp to compare the generated file with the one I wrote by hand (with ZXP214 already in place), and there are no differences. What is causing this error?
Here is the file (which executes without error) that I compare the generated output with:
CONNECT to 204.90.115.200:5040/DALLASC user *** using ****#
select REMARKS from sysibm.SYSCOLUMNS WHERE TBCREATOR='ZXP214' AND NAME='LCODE'
#
I found that, specifically on the https://ibmzxplore.influitive.com/ challenge, if you are using the Java db2 command and working in the Zowe USS system (Unix System Services of z/OS), there is a conflict of character sets. I believe the system will generally create files in EBCDIC format, whereas if you do
echo "CONNECT ..." > syscat.clp
the resulting file will be tagged as ISO8859-1 and will not be processed properly by db2. Instead, go to the USS interface and choose "create file", give it a folder and a name, and it will create the file untagged. You can use
ls -T
to see the tags. Then edit the file to give it the commands you need, and db2 will interoperate with it properly. Because you are creating the file with python, you may be running into similar issues. When you open the new file, use something like
open(input_file_name, mode="w", encoding="cp1047")
This makes sure the file is open as an EBCDIC file.
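Applied to the script from the question, a minimal sketch (assuming both files should be CP1047/EBCDIC; adjust the encoding to match how your files are actually tagged) could be:
#!/usr/bin/python3
# hedged sketch: read and write both files as EBCDIC (CP1047)
with open("decrypt.sql", "rt", encoding="cp1047") as fin:
    with open("decryptout.sql", "wt", encoding="cp1047") as fout:
        for line in fin:
            fout.write(line.replace('table', 'ZXP214'))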
If you are using the Db2-LUW CLP (command line processor), which is written in C/C++ and runs on Windows/Linux/Unix, then your syntax for CONNECT is not valid.
Unfortunately your question is ambiguously tagged, so we cannot tell which Db2-server platform you actually use.
For Db2-LUW with the classic C/C++ db2 command, the syntax for a type-1 CONNECT statement does not allow a connection string (or partial connection string) as shown in your question. For the Db2-LUW db2 CLP, the target database must be defined externally (i.e. not inside the script), either via the legacy actions catalog tcpip node ... combined with catalog database ..., or in the db2dsdriver.cfg configuration file as plain XML.
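As an illustration only, the legacy catalog approach could look like this (the node alias mynode and the credentials are placeholders, not values from your environment):
db2 catalog tcpip node mynode remote 204.90.115.200 server 5040
db2 catalog database DALLASC at node mynode
db2 connect to DALLASC user myuser using mypassword
After cataloging, the CONNECT statement inside the script would simply reference the cataloged alias instead of a host:port/database string.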
If you want to use connection strings then you can use the clpplus tool, which is available in some Db2-LUW client packages and is present on currently supported Db2-LUW servers. This lets you use Oracle-style scripting with Db2. Refer to the online documentation for details.
If you are not using the classic C/C++ db2 command, and are instead using the emulated CLP written in Java that is only available with z/OS USS, then you must open a ticket with IBM support for that component, as that is not a matter for Stack Overflow.

How to create a new text file dynamically using the BCP command

Currently, what I have is:
bcp "select * from TEST.dbo.HPA_1 WITH (NOLOCK)" queryout D:\Gift_Voucher\HPA\test1.txt -T -c -t
When I run the above code for the first time it works fine and creates the text file test1.txt, but when new data is added to the table I want to create a new text file, something like test2.txt, then test3.txt, and so on, without changing the bcp code. Is there anything we can do here?
Change your BAT file to replace the output file path with a command-line parameter:
bcp "select * from TEST.dbo.HPA_1 WITH (NOLOCK)" queryout "%1" -T -c -t
In the SSIS package, create a string variable for the desired output file path. Set the Execute Process Task Executable property to "cmd.exe". On the Expressions page, set the Arguments property to an expression that builds the command from the BAT file path plus the output file argument, enclosing both values in quotes to handle whitespace in the paths.
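A hedged sketch of such an expression (the variable names User::BatFilePath and User::OutputFilePath are illustrative, and the doubled outer quotes are there because of how cmd.exe /c handles quoted arguments) could be:
"/c \"\"" + @[User::BatFilePath] + "\" \"" + @[User::OutputFilePath] + "\"\""
This evaluates to something like /c ""D:\scripts\export.bat" "D:\output\test2.txt"", so the BAT file receives the output path as %1.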
Set the value of the output file path variable in your package prior to executing the task. This can be done in package code or via SSIS configuration.
Note that you could accomplish the same functionality in a Data Flow task instead of shelling out to BCP. That leverages the native export capability of SSIS.

Multiple SQL files in a directory which is in turn used in a Unix script

I have a script, say sql_result.sh, in the directory /tmp/SQL_QUERY which just calls an SQL script in the same location and executes the SQL commands.
Code:
sqlplus -S $MY_UN/$MY_PW@$MY_DB <<!
set serveroutput on;
@/tmp/SQL_QUERY/sql_file1
quit
!
However, say I have two SQL files, sql_file1.sql and sql_file1.sql_new, in that directory. Which of the two SQL scripts will my Unix script pick? How and why?
Thanks
Short answer: normally sql_file1.sql
The default extension is .SQL, as explained in the docs for @:
file_name[.ext]
Represents the script you wish to run. If you omit ext, SQL*Plus assumes the default
command-file extension (normally SQL). For information on changing the default extension,
see the SUFFIX variable of the SET command.
As it says, you can use the SET command to change which extension is used, or you can specify the extension in the script explicitly.
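For the script in the question, for example, giving the extension explicitly removes any ambiguity:
@/tmp/SQL_QUERY/sql_file1.sql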
The SET command says:
Sets the default file extension that SQL*Plus uses in commands that refer to scripts. SUFFIX does not control extensions for spool files.
Example
To change the default command-file extension from the default, .SQL to .UFI, enter
SET SUFFIX UFI
If you then enter
@EXAMPLE
SQL*Plus will look for a file named EXAMPLE.UFI instead of EXAMPLE.SQL.
(Note that a SET command might be present in your LOGIN file.)

Want to run multiple SQL script files in one go within SQLPLUS

I have to run multiple SQL script files in one go.
Right now, every time I have to write the commands in SQLPLUS:
SQL>@d:\a.txt
SQL>@d:\a2.txt
SQL>@d:\a3.txt
SQL>@d:\a4.txt
Is there any way to put all the files in one folder and run all the script files in one go, without missing any single file, like @d:\final.txt or @d:\final.bat?
There is no single SQL*Plus command to do that, but you can create a single script that calls all the others:
Put the following into a batch file
@echo off
echo.>"%~dp0all.sql"
for %%i in ("%~dp0*.sql") do if /i not "%%~nxi"=="all.sql" echo @"%%~fi" >> "%~dp0all.sql"
When you run that batch file it will create a new script named all.sql in the same directory where the batch file is located. It will look for all files with the extension .sql in the same directory where the batch file is located.
You can then run all scripts by using sqlplus user/pwd @all.sql (or extend the batch file to call sqlplus after creating the all.sql script).
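For example, appending a line like the following to the batch file (the credentials are placeholders) would run the generated script immediately:
sqlplus user/pwd @"%~dp0all.sql"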
If you're using GNU/Linux, you could use process substitution:
sqlplus USERNAME/PASSWORD@DOMAIN < <(cat a.txt a2.txt a3.txt a4.txt)
# ... or a for loop on input files, inside the process substitution
Alternatively, you can create a .pdc file and list your sql scripts:
-- pdc file
@a.txt;
@a2.txt;
@a3.txt;
@a4.txt;
and call sql plus:
sqlplus USERNAME/PASSWORD@DOMAIN < my_scripts.pdc
Some tricks and commands can help you generate a master.sql file and run everything from that location.
c:\directory_location\dir *.sql /-t /b >master.sql
Go to that directory, open master.sql in Notepad++, remove the master.sql line, and use a regular expression replace of
\n with \n@
(prefixing the first line with @ as well). Then from cmd:
C:\root_directory\sqlplus user/password @master.sql
I find this process very convenient when I have 30 to 40 scripts placed in a single directory.
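After the replace, master.sql ends up as a plain list of @ calls, something like (file names are illustrative):
@script_one.sql
@script_two.sql
@script_three.sql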
Use a *.PDC extension file like this.
install.pdc file content:
whenever sqlerror exit sql.sqlcode
prompt started!
prompt 1.executing script 1
@@install/01.script_1.sql
prompt 2.executing script 2
@@install/02.script_2.sql
prompt 3.executing script 3
@@install/03.script_3.sql
prompt finished!
where @@install/ indicates the directory in which the SQL script is located (@@ resolves the path relative to the calling script).
It might be worth the time to write a shell script that runs multiple files.
#!/bin/ksh
sqlplus user/password@instance <<EOF
@a.txt
@a1.txt
exit
EOF
For more on the syntax, look into Here Document
Here is a similar solution, but you do not have to iterate or use specially formatted SQL file names. You compose one SQL file and run it once.
cat table_animal.sql > /tmp/temp.sql
cat table_horse.sql >> /tmp/temp.sql
cat table_fish.sql >> /tmp/temp.sql
sqlplus USERNAME/PASSWORD@DOMAIN @/tmp/temp.sql
For Windows try
copy /b *.sql +x final.sql
sqlplus user/password @final.sql
Special Thanks to Joseph Torre
sqlplus login/password@server @filename

Is there a batch script command with which I can change variable values in a .sql file?

I am creating a batch file where I restore a database from an IP address and then execute a couple of .sql files against the database. In a couple of the .sql files there are variables declared and set, but this process has to be done on many machines, with different values for each variable on each machine.
So I'm able to restore the database through user input of the IP, but I'm not sure how to use the batch script command to change the variable values.
For example, in one of the .sql files, a variable @store was declared and set to some random number. I want to change that number through the batch file.
I am using Windows and SQL Server Express 2008 R2.
You can use "scripting variables" with SQLCMD.
Here's an example from that MSDN page:
You can also use the -v option to set a scripting variable that exists in a script. In the following script (the file name is testscript.sql), ColumnName is a scripting variable.
USE AdventureWorks2012;
SELECT x.$(ColumnName)
FROM Person.Person x
WHERE x.BusinessEntityID < 5;
You can then specify the name of the column that you want returned by using the -v option:
sqlcmd -v ColumnName ="FirstName" -i c:\testscript.sql
To return a different column by using the same script, change the value of the ColumnName scripting variable.
sqlcmd -v ColumnName ="LastName" -i c:\testscript.sql
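Applied to the @store example from the question, the .sql file could read the value from a scripting variable (the variable name StoreNumber and the file path below are just illustrations):
DECLARE @store INT = $(StoreNumber);
and the batch file would then pass the machine-specific value when it runs the script:
sqlcmd -S .\SQLEXPRESS -v StoreNumber=42 -i C:\scripts\set_store.sql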
If you are working on a Unix/Linux system, you can use sed to search and replace a string.
Example: assuming you need to replace 127.0.0.1 with 192.168.1.1, you can use the following instruction:
$ sed 's/127.0.0.1/192.168.1.1/g' script.sql > newScript.sql
This will replace the old IP in script.sql and save the modified script as newScript.sql.
On Windows, I don't know how to do it natively, but you can always download and install Cygwin and do exactly the same as above.
Hope this helps you.