OUT parameter from sql file to shell script

I am executing a batch of SQL statements with the following command in a shell script:
mysql -u root -p"${PASSWORD}" -h 127.0.0.1 -P 3306 -B -e "set @readonly_user:='${readonly_user}', @readonly_user_password:='${readonly_user_password}'; source db.sql;"
The content of db.sql is:
SET @create_user := CONCAT("CREATE USER IF NOT EXISTS '", @readonly_user, "' IDENTIFIED BY '", @readonly_user_password, "';");
SET @grant_user := CONCAT("GRANT SELECT ON orion.* TO '", @readonly_user, "';");
SHOW DATABASES;
SELECT user FROM mysql.user;
PREPARE stmt FROM @create_user;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
PREPARE stmt FROM @grant_user;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
I want to get an OUT parameter from this db.sql file back to the shell script, to check whether the user already exists.
How can I pass an OUT parameter from the .sql file to the shell script, so that, based on that shell variable, I can do further processing in the shell script?
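A minimal sketch of one workaround, assuming the existence check can be issued as a separate query whose result the shell captures with command substitution (the check against mysql.user and the if/else branching are illustrative, not part of the original script):
#!/bin/bash
# Sketch: capture the result of a SELECT into a shell variable.
# -N suppresses the column header, -B produces plain batch output.
user_exists=$(mysql -u root -p"${PASSWORD}" -h 127.0.0.1 -P 3306 -N -B \
  -e "SELECT COUNT(*) FROM mysql.user WHERE user = '${readonly_user}';")
if [ "$user_exists" -gt 0 ]; then
  echo "User ${readonly_user} already exists"
else
  mysql -u root -p"${PASSWORD}" -h 127.0.0.1 -P 3306 -B \
    -e "set @readonly_user:='${readonly_user}', @readonly_user_password:='${readonly_user_password}'; source db.sql;"
fi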

Related

Execute BASH script from SQL script under ASSERT

The goal is to execute a script, hi.sh, after an assertion in a SQL script.
hi.sh contains:
#!/bin/bash
echo 'Hello World'
I'm trying to execute hi.sh within DO:
DO $$
BEGIN
ASSERT EXISTS (SELECT * FROM pg_catalog.pg_tables WHERE schemaname = 'public'), '"Public" schema is empty';
\! ./hi.sh
END$$;
It should raise the error '"Public" schema is empty' and not execute hi.sh, but once there are entries in the public schema, hi.sh must be executed.
Running the SQL script raises an error:
psql:assert.sql:5: ERROR: syntax error at or near "\"
LINE 4: \! ./hi.sh
^
However, it will execute the shell script once the '\!' line is outside the DO block, like this:
\! ./hi.sh
How can we create a simple SQL script that executes hi.sh based on the assertion?
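\! is a psql meta-command, so it cannot appear inside a DO block, which runs server-side as PL/pgSQL. One workaround is a small shell wrapper, sketched below, that keeps the assertion in SQL and runs hi.sh only when psql exits successfully (the database name mydb is a placeholder):
#!/bin/bash
# Sketch: with ON_ERROR_STOP=1 a failed ASSERT makes psql exit non-zero,
# so hi.sh runs only when the assertion passes.
psql -v ON_ERROR_STOP=1 -d mydb <<'SQL' && ./hi.sh
DO $$
BEGIN
ASSERT EXISTS (SELECT * FROM pg_catalog.pg_tables WHERE schemaname = 'public'), '"Public" schema is empty';
END$$;
SQL
Alternatively, on psql 10 or later the check can stay entirely in the .sql file by capturing the EXISTS result with \gset and wrapping the \! ./hi.sh line in \if ... \endif.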

assign DB name to some variable while executing script using CMD

I have more than one SQL script, and I want to execute them with a single SQLCMD line instead of writing a separate SQLCMD line for each script. There is a way to do this by creating a new .sql file and using :r, where I just put a script name after each :r, but the scripts I am going to execute refer to more than one DB. I prepared the following script, but it's not working as expected:
Script-1
INSERT INTO [$(dbuser1)].[dbo].[A](Name)
VALUES('Tod')
INSERT INTO [$(dbuser2)].[dbo].[B]( Name )
VALUES ('John')
Script-2
INSERT INTO [$(dbuser1)].[dbo].[A]( Email )
VALUES('tod@gmail.com')
INSERT INTO [$(dbuser2)].[dbo].[B]( Email )
VALUES ('john@gmail.com')
Script-3
SET NOCOUNT ON
SET NOEXEC OFF
-- Quit when error.
:On Error Exit
-- SQLCMD Variables
:setvar dbname1 TestDatabase1
:setvar dbname2 TestDatabase2
:r Script-1.SQL
:r Script-2.SQL
and the CMD line:
sqlcmd -S <Servername> -i <Script-3 path>
Is this a correct way to assign the DB name to a variable?
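The approach itself is sound, but the variable names have to line up: Script-1 and Script-2 reference $(dbuser1) and $(dbuser2), while Script-3 defines dbname1 and dbname2 with :setvar, so the values being set are never the ones the scripts use. As a sketch, once the scripts and the variable definitions use the same names (say dbname1/dbname2 everywhere), the values can also be supplied straight from the command line with sqlcmd's -v option instead of :setvar:
sqlcmd -S <Servername> -v dbname1="TestDatabase1" dbname2="TestDatabase2" -i <Script-3 path>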

Passing parameters to BCP command in SSIS execute SQL task

First of all, can I pass parameters for the path in a BCP command?
I wrote this query in the Execute SQL Task:
EXEC xp_cmdshell 'bcp "SELECT * FROM TLC.dbo.World_Hosts" queryout "C:\Users\akshay.kapila\Desktop\TLC\Foreachlearn\Dest\?.txt" -T -c -t '
I have given ? in the path as a placeholder. I want specific countries in its place; they are held in the variable "Country". I am using a Foreach loop, which should create a file (e.g. Aus.txt, In.txt) every time the loop runs for that specific value.
Can I do it this way? If not, how can I pass a variable value to the path in the BCP command?
You can use a variable as the SQLSourceType in your Execute SQL Task.
Create a variable to hold your bcp command; it may look like:
"EXEC xp_cmdshell 'bcp \"SELECT ''abc'' as output\" queryout \"" + @[User::strFileName] + "\" -T -c -t '"
Here @[User::strFileName] is the dynamic file name you want to generate.
Then, in the Execute SQL Task, change SQLSourceType to Variable and select the variable you just created.
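To connect this to the original question, the same expression pattern can splice the Foreach loop's current country into the path. A sketch, assuming the loop writes each value into a string variable named @[User::Country] (the folder path is the one from the question):
"EXEC xp_cmdshell 'bcp \"SELECT * FROM TLC.dbo.World_Hosts\" queryout \"C:\\Users\\akshay.kapila\\Desktop\\TLC\\Foreachlearn\\Dest\\" + @[User::Country] + ".txt\" -T -c -t '"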

DB2 Output to Variable via bash script

I'm hoping someone can help with capturing the output of a db2 command into a variable to use later on in a script.
So far I am at...
db2 "connect to <database> user <username> using <password>"
while read HowMany ;
do
Counter=$HowMany
echo $HowMany
done < <(db2 -x "SELECT COUNT(1) FROM SYSCAT.COLUMNS WHERE TABNAME = 'TableA' AND TABSCHEMA='SchemaA' AND GENERATED = 'A'")
When I try to reference $Counter outside of the while loop, it returns "SQL1024N A database connection does not exist. SQLSTATE=08003", as does the echo $HowMany.
I've tried another method using a pipe, which makes $HowMany show the correct value, but as that runs in a sub-shell, the value is lost afterwards.
I'd rather not use temp files (and have to remove them) if possible, as I don't like leftover files if a script aborts at any point.
The DB2 CLP on Linux and UNIX can handle command substitution without losing its database connection context, making it possible to capture query results into a local shell variable or treat it as an inlined block of text.
#!/bin/sh
# This script assumes the db2profile script has already been sourced
db2 "connect to <database> user <username> using <password>"
# Backtick command substitution is permitted
HowMany=`db2 -x "SELECT COUNT(1) FROM SYSCAT.COLUMNS WHERE TABNAME = 'TableA' AND TABSCHEMA='SchemaA' AND GENERATED = 'A'"`
# This command substitution syntax will also work
Copy2=$(db2 -x "SELECT COUNT(1) FROM SYSCAT.COLUMNS WHERE TABNAME = 'TableA' AND TABSCHEMA='SchemaA' AND GENERATED = 'A'")
# One way to get rid of leading spaces
Counter=`echo $HowMany`
# A while loop that is fed by process substitution cannot use
# the current DB2 connection context, but combining a here
# document with command substitution will work
while read HowMany ;
do
Counter=$HowMany
echo $HowMany
done <<EOT
$(db2 -x "SELECT COUNT(1) FROM SYSCAT.COLUMNS WHERE TABNAME = 'TableA' AND TABSCHEMA='SchemaA' AND GENERATED = 'A'")
EOT
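Once the count is sitting in an ordinary shell variable, the later processing is plain shell logic; a small illustrative check (the message text is made up):
# Sketch: act on the captured value
if [ "$Counter" -gt 0 ]; then
    echo "Table has $Counter generated columns"
fi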
As you have found, a DB2 connection in one shell is not available to sub-shells. You could use a sub-shell, but you'd have to put the CONNECT statement in that sub-shell.
So it's more of a simple rewrite that doesn't use a sub-shell:
db2 "connect to <database> user <username> using <password>"
db2 -x "SELECT COUNT(1) FROM SYSCAT.COLUMNS WHERE TABNAME = 'TableA' AND TABSCHEMA='SchemaA' AND GENERATED = 'A'" | while read HowMany ; do
Counter=$HowMany
echo $HowMany
done

My bcp hangs after creating an empty file

I am trying to drop a file into a directory on the local machine (same machine running SQL Instance). The content of the table I am trying to drop out is in xml format.
i.e. table = xmlOutFiles, field name = xmlContent; the field contains essentially varchar(max) data that is to become the XML file we need.
When the bcp command is executed, it seems to create the file in the @dest location, size = 0 bytes, and then the process running from within SSMS just sits there waiting for something!
I cannot do anything with that empty file, like delete it, unless I use Task Manager to kill the bcp.exe process.
I have tried multiple combinations of the bcp flags, etc.
Running the bcp command from a system prompt, replacing the "@vars", seems to work, but I really need it to be part of my SQL trigger script and function.
Assistance appreciated!!!!
Select @dest = (Select filename from xmlOutfiles)
Select @cmd = 'bcp "Select xmlContent from ProcureToPay.dbo.XmlOutFiles" queryout '+@dest+' -x -w -T -S' + @@servername
Exec xp_cmdshell @cmd
I have tried executing with -T and -Uusername -Ppassword parameters, etc.
This command works from the DOS prompt:
bcp "Select xmlContent from Procure.To.Pay.dbo.XmlOutFiles" queryout c:\temp\test.xml -x -w -T S<myservernameHere>