How can I include multiple search paths in a psql command, so that multiple files can be run with different search_paths but all be run in one transaction?
psql
--single-transaction
--command="set search_path = 'a'; \i /sqlfile/a.sql; set search_path = 'b'; \i /sqlfile/b.sql;"
When I run this I get a syntax error at \i. The files are generated dynamically and must be included separately, so I'd rather run this with --command than generate a wrapper file and use --file, if possible.
The manual about the --command option:
command must be either a command string that is completely parsable by
the server (i.e., it contains no psql-specific features), or a single
backslash command. Thus you cannot mix SQL and psql meta-commands
within a -c option. To achieve that, you could use repeated -c options
or pipe the string into psql [...]
Try:
psql --single-transaction -c 'set search_path = a' -c '\i /sqlfile/a.sql' -c 'set search_path = b' -c '\i /sqlfile/b.sql'
Or use a here-document:
psql --single-transaction <<EOF
set search_path = a;
\i /sqlfile/a.sql
set search_path = b;
\i /sqlfile/b.sql
EOF
The search_path needs no quotes, btw.
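Since your files are generated dynamically, you can also build the whole script on the fly and pipe it into psql, which is the other option the manual mentions. A minimal sketch, where the schema/file pairs stand in for whatever your generator produces:
{
  for pair in 'a:/sqlfile/a.sql' 'b:/sqlfile/b.sql'; do
    printf 'set search_path = %s;\n' "${pair%%:*}"
    printf '\\i %s\n' "${pair#*:}"
  done
} | psql --single-transaction
Everything arrives on psql's stdin as one script, so --single-transaction wraps it all in a single BEGIN/COMMIT.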
Related
I have a shell script named test.sh which invokes another SQL file, 'table.sql'. 'table.sql' will create some tables, but I want the tables created in a particular schema, 'bird'.
Content of the SQL file:
create schema bird; -- bird should not be hardcoded, it should come from a variable
set search_path to 'bird';
create table bird.sparrow(id int, name varchar2(20));
Content of the shell script:
dbname=$1
cnport=$2
schemaname=$3
filename=$4
gsql -d ${dbname} -p ${cnport} -f ${filename} # how can the schema name be passed here so table.sql can use it without hardcoding?
I will execute my shell script like this:
sh test.sh db1 9999 bird table.sql
It is easier to do it in the shell, e.g.:
dbname=$1
cnport=$2
schemaname=$3
filename=$4
gsql -d ${dbname} -p ${cnport} <<EOF
create schema ${schemaname}; -- the schema name now comes from the shell variable
set search_path to '${schemaname}';
create table ${schemaname}.sparrow(id int, name varchar2(20));
EOF
Otherwise, use psql variables.
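A sketch of that approach with plain psql (an assumption here is that gsql, being psql-compatible, also supports -v; the :"schema" syntax interpolates the variable as a quoted identifier before the statement is sent to the server):
Contents of table.sql:
create schema :"schema";
set search_path to :"schema";
create table :"schema".sparrow(id int, name varchar2(20));
Invocation from the shell script:
psql -d ${dbname} -p ${cnport} -v schema=${schemaname} -f ${filename}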
I'm hoping someone can help with capturing the output of a db2 command into a variable to use later on in a script.
So far I am at...
db2 "connect to <database> user <username> using <password>"
while read HowMany ;
do
Counter=$HowMany
echo $HowMany
done < <(db2 -x "SELECT COUNT(1) FROM SYSCAT.COLUMNS WHERE TABNAME = 'TableA' AND TABSCHEMA='SchemaA' AND GENERATED = 'A'")
When I try to reference $Counter outside of the while loop, it returns SQL1024N A database connection does not exist. SQLSTATE=08003, as does the echo $HowMany.
I've tried another method using a pipe, which makes $HowMany show the correct value, but since the pipe runs in a subshell, the value is lost afterwards.
I'd rather not use temp files and have to remove them, as I don't like leftover files when scripts abort.
The DB2 CLP on Linux and UNIX can handle command substitution without losing its database connection context, making it possible to capture query results into a local shell variable or treat it as an inlined block of text.
#!/bin/sh
# This script assumes the db2profile script has already been sourced
db2 "connect to <database> user <username> using <password>"
# Backtick command substitution is permitted
HowMany=`db2 -x "SELECT COUNT(1) FROM SYSCAT.COLUMNS WHERE TABNAME = 'TableA' AND TABSCHEMA='SchemaA' AND GENERATED = 'A'"`
# This command substitution syntax will also work
Copy2=$(db2 -x "SELECT COUNT(1) FROM SYSCAT.COLUMNS WHERE TABNAME = 'TableA' AND TABSCHEMA='SchemaA' AND GENERATED = 'A'")
# One way to get rid of leading spaces
Counter=`echo $HowMany`
# A while loop that is fed by process substitution cannot use
# the current DB2 connection context, but combining a here
# document with command substitution will work
while read HowMany ;
do
Counter=$HowMany
echo $HowMany
done <<EOT
$(db2 -x "SELECT COUNT(1) FROM SYSCAT.COLUMNS WHERE TABNAME = 'TableA' AND TABSCHEMA='SchemaA' AND GENERATED = 'A'")
EOT
As you have found, a DB2 connection in one shell is not available to sub-shells. You could use a sub-shell, but you'd have to put the CONNECT statement in that sub-shell.
So it's more of a simple rewrite: don't use a sub-shell:
db2 "connect to <database> user <username> using <password>"
db2 -x "SELECT COUNT(1) FROM SYSCAT.COLUMNS WHERE TABNAME = 'TableA' AND TABSCHEMA='SchemaA' AND GENERATED = 'A'" | while read HowMany ; do
Counter=$HowMany
echo $HowMany
done
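Note that variables assigned inside a piped while loop are still lost in most shells, because the loop itself runs in a subshell; that is the problem the asker already hit. If the script runs under bash 4.2 or later, a hedged workaround is lastpipe, which runs the last segment of the pipeline in the current shell (it only takes effect when job control is off, the default for non-interactive scripts):
#!/bin/bash
shopt -s lastpipe
db2 "connect to <database> user <username> using <password>"
db2 -x "SELECT COUNT(1) FROM SYSCAT.COLUMNS WHERE TABNAME = 'TableA' AND TABSCHEMA='SchemaA' AND GENERATED = 'A'" | while read HowMany ; do
    Counter=$HowMany
done
# With lastpipe the loop ran in this shell, so $Counter is still set
echo $Counter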
I use mysqldump to save data from tables and move it to a Postgres db.
I make the dump with:
mysqldump --complete-insert --no-create-info --no-create-db --compatible=postgresql -uroot -p music files > music_files.sql
But when running this in psql:
=> \i music_files.sql
I get this error:
music_files.sql:29: ERROR: syntax error at or near "s"
LINE 1: ...,5,'Impossible',NULL),(33,4103,178,841,198,'Tifa\'s Theme [P...
Postgres doesn't understand this escaping. It wants the quote doubled ('') rather than preceded by a backslash.
How can I make it with mysqldump?
You can use this in the psql shell:
=> set backslash_quote = on;
=> set standard_conforming_strings = off;
=> select 'foo\'bar';
?column?
----------
foo'bar
=> \i ...whatever...
Do NOT make this persistent by setting it in some configuration file or via ALTER USER etc., since it may be security-relevant.
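Alternatively, you can rewrite the dump itself and leave the server settings alone. A rough sed sketch that converts the backslash-escaped quotes to doubled quotes (it assumes the dump contains no literal backslash directly before a quote, such as an escaped backslash followed by ', so check the result before loading it):
sed "s/\\\\'/''/g" music_files.sql > music_files_pg.sql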
I would like to perform a scalar database query and return the result into a variable in a batch file.
How would one do this? The closest example in our system that I can see returns an exit code based on a scalar query result.
Z:\SQL2005\90\Tools\Binn\sqlcmd -S servername -dCLASS -E -Q "EXIT(select case run_type when 'Q' then 200 else 100 end from cycle_date where cycle = '1')">NUL
if %errorlevel% == 200 call %SQLSERVER%QRTLY.BAT
if %errorlevel% == 100 call %SQLSERVER%MTHLY.BAT
Can someone help me with the syntax?
Here's some sqlcmd help info:
-v var = value[ var=value...]
Creates a sqlcmd scripting variable that can be used in a sqlcmd script. Enclose the value in quotation marks if the value contains spaces. You can specify multiple var="values" values. If there are errors in any of the values specified, sqlcmd generates an error message and then exits.
sqlcmd -v MyVar1=something MyVar2="some thing"
sqlcmd -v MyVar1=something -v MyVar2="some thing"
-x disable variable substitution
Causes sqlcmd to ignore scripting variables. This is useful when a script contains many INSERT statements that may contain strings that have the same format as regular variables, such as $(variable_name).
How about saving it to a file without headers then reading the contents back in?
sqlcmd -S(local)\SQLExpress -dMyDatabase -Umyuser -Pmypassword -W -h -1 -Q "SELECT Top 1 MyValue FROM MyTable" -o sqlcmdoutput.txt
set /p x= <sqlcmdoutput.txt
del sqlcmdoutput.txt
echo My scalar value is %x%
I use this in a batch file. It returns the LogicalFilename for a SQL Server Database data file. This only works if there is one data file in the DB.
The result is that the environment variable DATABASEFILENAME is set to, say, AdventureWorks_Data.
FOR /F "usebackq tokens=1" %%i IN (`sqlcmd -w200 -h-1 -E -Q"set nocount on; Select df.name From sysdatabases as d Inner Join sysaltfiles as df on d.dbid=df.dbid Where d.name ='$(DatabaseName)' and df.Fileid =1"`) DO set DATABASEFILENAME=%%i
Have you looked at sqlcmd?
I'm trying to load SQL from a file in bash and execute it. The SQL file needs to stay versatile, meaning it cannot be altered just to make it easier to run from bash (e.g. by escaping special characters like *).
So I have run into some problems:
If I read my sample.sql
SELECT * FROM SAMPLETABLE
to a variable with
ab=`cat sample.sql`
and execute it
db2 `echo $ab`
I receive an SQL error, because the unquoted expansion replaced the * with the names of all the files in the current directory.
An easy solution would be to replace * with \*. But I cannot do this, because the file needs to stay usable in programs like DB Visualizer etc.
Could someone give me a hint in the right direction?
The DB2 command line processor has options that accept a filename as input, so you shouldn't need to load statements from a text file into a shell variable.
This command will execute all SQL statements in the file, with newline treated as the statement terminator:
db2 -f sample.sql
This command will execute all SQL statements in the file, with semicolon treated as the statement terminator:
db2 -t -f sample.sql
Other useful CLP flags, combined in the example after this list, are:
-x : Suppress the column headings
-v : Echo the statement text immediately before execution
-z : Tee a copy of all CLP output to the filename immediately following this flag
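For example, combined into one invocation that runs a semicolon-terminated script, suppresses headings, echoes each statement, and tees all output to clp.log:
db2 -x -v -t -z clp.log -f sample.sql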
Redirect stdin from the file.
db2 < sample.sql
In case you have variables in your script and want the shell to replace them before DB2 executes it, use this approach:
Contents of File.sql:
cat <<xEOF
insert into ${MY_SCHEMA}.${MY_TABLE} values(1,2);
select * from ${MY_SCHEMA}.${MY_TABLE};
xEOF
In command prompt do:
export MY_SCHEMA='STAR'
export MY_TABLE='DIMENSION'
Then you are all good to get it executed in DB2:
sh File.sql | db2 +p -t
The shell will replace the exported variables and then DB2 will execute the resulting SQL.
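If the here-document wrapper feels awkward, envsubst from GNU gettext is a possible alternative, assuming it is installed: it expands exported ${VAR} references in a plain text file, so File.sql can stay pure SQL with no shell code in it:
Contents of a plain File.sql:
insert into ${MY_SCHEMA}.${MY_TABLE} values(1,2);
select * from ${MY_SCHEMA}.${MY_TABLE};
Then:
export MY_SCHEMA='STAR'
export MY_TABLE='DIMENSION'
envsubst < File.sql | db2 +p -t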
Hope it helps.