Using SQL within a shell script - sql

I am currently trying to integrate an SQL statement into a shell script, but I am facing a major syntax issue.
My statement in the script:
su - <sid>adm -c 'hdbsql -U SYSTEM export "'SCHEMA'"."'*'" as binary into "'Export Location'" with reconfigure'
I get the following error:
* 257: sql syntax error: incorrect syntax near "*": line 1 col 16 (at pos 16) SQLSTATE: HY000
I would really appreciate it if anyone could help me with this.
Thanks and Regards,
AK

Your command line doesn't make much sense to me. It starts with
su - <sid>adm
which means that you are redirecting the contents of a file named "sid" into "su" and then redirecting the output of that command into a file named "adm". If <sid>adm is meant to be the instance user (for example hdbadm), you have to spell it out literally instead of using the angle brackets.
The second problem is that in the command you are passing with -c, the single quotes end right before the "*", which means that the "*" will get interpreted by the shell as a file glob:
-c 'hdbsql -U SYSTEM export "'SCHEMA'"."'*'" as binary into "'Export Location'" with reconfigure'
You'll need to escape those single quotes like this: "\'".
But I think your problem-solving approach could be improved. Reduce the problem to its smallest form first, and only then start adding things back. So first try to execute the SQL statement from the "hdbsql" shell. Does it work?
$ hdbsql
> YOUR SQL STATEMENT HERE
Once that works, try to execute the SQL statement from the unix shell as a user:
$ hdbsql -U SYSTEM export ...
Once that works, try to execute it via su
$ su - ...
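Once you get that far, one way to sidestep the quoting problem entirely is to keep the statement in a file, so the outer shell never sees the quotes or the *. This is only a minimal sketch: the user name hdbadm, the export path and the EXPORT statement itself are carried over from your question as assumptions, not verified syntax:

# Keep the SQL in a file so su -c only has to quote a simple command.
cat > /tmp/export.sql <<'EOF'
export "SCHEMA"."*" as binary into '/path/to/export/location' with reconfigure
EOF
# The redirection is inside the quoted string, so it happens as the target user.
su - hdbadm -c 'hdbsql -U SYSTEM < /tmp/export.sql'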

Related

How to fix syntax error at or near "psql" in psql ubuntu

I am entirely new to psql and not particularly familiar with some terms. I am following instructions for an ETL process for MIMIC in the link here: https://github.com/chichukw/mimic-omop/blob/master/README-run-etl.md. When I run this code, it shows no output, only this error:
syntax error at or near "psql"
I have tried adding a semicolon, removing the psql part, and removing the quotes and dollar sign, but I still get this syntax error on the first character regardless.
psql "$MIMIC" postgres_create_mimic_id.sql;
I expect concept ids to be created after running this code on the server using the jupyter terminal.
The only way I can think of how you could get that output/error is if you did this:
[root@foo /]# psql
psql (11.5)
Type "help" for help.
postgres=# psql "$MIMIC" postgres_create_mimic_id.sql;
ERROR: syntax error at or near "psql"
LINE 1: psql "$MIMIC" postgres_create_mimic_id.sql;
^
postgres=#
Instead, I think you should be doing this:
[root@foo /]# export MIMIC='host=localhost dbname=postgres user=postgres options=--search_path=mimiciii'
[root@foo /]# psql "$MIMIC" -f postgres_create_mimic_id.sql;
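If you want to run this from the Jupyter terminal as a small script rather than typing it interactively, a minimal sketch could look like the following; the connection string is the one assumed above, and -v ON_ERROR_STOP=1 just makes psql stop at the first failing statement:

#!/bin/bash
# Minimal sketch: run the id-creation script from the OS shell, not from inside
# the psql prompt. Adjust the connection parameters to your setup.
export MIMIC='host=localhost dbname=postgres user=postgres options=--search_path=mimiciii'
psql "$MIMIC" -v ON_ERROR_STOP=1 -f postgres_create_mimic_id.sql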
Disclosure: I am an EnterpriseDB (EDB) employee

SQL Server :query for exporting to file

I'm trying to learn the basics of SQL programming; I am working with SQL Server 2014. I have managed to import a file into a table with this command:
BULK INSERT Db.dbo.Co2_table
FROM 'd:\dataset_co2.txt'
with
(
FIRSTROW =2,
ROWTERMINATOR ='\n'
)
GO
I would like to do the reverse operation, that is, export the content of a table to a file. I have tried:
SELECT *
INTO OUTFILE 'C:\datadump\sqldbdump.txt"
FROM dbo.alarms_2_2014
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
bcp Db.dbo.Co2_table out "C:\users\ws5.en-cre\desktop\prova.txt" -T –c
sqlcmd -S . -d Db -E -s, -W -Q "SELECT * FROM dbo.Co2_table" > ExcelTest.csv
But none of these seem to work (I get error messages). Any idea?
I suspect you are running those commands from Management Studio. You should run bcp from the console (a regular command prompt), not from a query window. This works for me. Also check that you have permissions on that folder.
bcp "select * from Db.dbo.Co2_table" queryout C:\users\ws5.en-cre\desktop\prova.txt -c -T
or
bcp Db.dbo.Co2_table out C:\users\ws5.en-cre\desktop\prova.txt -c -T
Also, you have a suspicious character in the -c parameter: in -T –c the second dash is an en dash (–), not a regular dash (-).
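If you want something you can save and re-run as a script rather than typing the command each time, a minimal batch-file sketch is shown below; the file name export_co2.bat, the output path and the server name "." (the local default instance) are assumptions:

@echo off
rem Minimal sketch: save as export_co2.bat and run it from a normal command
rem prompt, not from a Management Studio query window.
rem -c = character mode, -T = Windows authentication, -S . = local default instance
bcp "select * from Db.dbo.Co2_table" queryout "C:\users\ws5.en-cre\desktop\prova.txt" -c -T -S .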
Thank you for your answers and suggestions, and apologies for my lack of precision and my late reply (I missed the notifications from Stack Overflow).
Regarding the question of whether I use Management Studio or the console: what I do is click "New Query" in Management Studio, write the code and press Execute. So I guess the answer is that I use Management Studio.
If I try:
bcp "select * from Db.dbo.Co2_table" queryout
C:\users\ws5.en-cre\desktop\prova.txt -c –T
it says
Msg 102, Level 15, State 1, Line 1 Incorrect syntax near 'queryout'.
I guess in this case one of the problems is that the quotes are missing, but even adding them doesn't solve the problem.
I am looking for a solution that can be implemented as a script. I am familiar with Excel VBA macros; I would like to implement something like that.
Thanks,
Alex

How to escape apostrophe in a db2 sql query, running within a shell script?

I'm trying to run a query that will include static columns in its output. The SELECT statement works when I run it via the CLP, but not when I execute it within a shell script:
su - myid -c 'db2 connect to mydb;db2 -x -v "select COL1,'','',COL2,'','',COL3L from MYTABLE fetch first 10 rows only"; db2 connect reset;'
When I run this, the output error I get is:
SQL0104N An unexpected token "," was found following "select COL1,".
Expected tokens may include: "<select_sublist>". SQLSTATE=42601
SQL1024N A database connection does not exist. SQLSTATE=08003
I've even tried putting the select statement in a variable and inserting that within the statement, but still the same error. Any help would be greatly appreciated. -Thx
You should escape the single quotes with a backslash, as in:
su - myid -c 'db2 connect to mydb;db2 -x -v "select COL1,\'\',\'\',COL2,\'\',\'\',COL3L from MYTABLE fetch first 10 rows only"; db2 connect reset;'
Beware, I didn't test it... no shell at hand just now.
UPDATE:
Finally I got my hands on a DB2 instance... after a little testing I got it working.
It turns out that the previous syntax was faulty. The proper way of quoting the single quote is (in this case) '\'' as in:
su - myid -c 'db2 connect to mydb;db2 -x -v "select COL1,'\'','\'',COL2,'\'','\'',COL3L from MYTABLE fetch first 10 rows only"; db2 connect reset;'
That's because the single quote around the whole command must be closed (') in order to supply the escape for the single quote in the db2 query (\') and then reopened to resume the command quoting ('). Weird as it looks, it works....
This is the command I used to test it:
bash -c 'db2 connect to mydb;db2 -x -v "select 1,'\'','\'',2,'\'','\'',3 from SYSIBM.SYSDUMMY1 fetch first 10 rows only"; db2 connect reset;'
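If you want to sanity-check the quoting without a DB2 connection at hand, one small trick (just shell, nothing DB2-specific) is to swap the db2 invocation for printf and inspect the arguments the inner shell actually builds:

# Print each argument on its own line; the last one is exactly the SQL string
# that db2 would receive, with the '\'' sequences turned into literal quotes.
bash -c 'printf "<%s>\n" db2 -x -v "select 1,'\'','\'',2,'\'','\'',3 from SYSIBM.SYSDUMMY1 fetch first 10 rows only"'

The last printed line should contain the literal '' pairs, which is what the CLP needs to see.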

Execute SQL from file in bash

I'm trying to load SQL from a file in bash and execute it. The SQL file needs to stay usable elsewhere, meaning it cannot be altered just to make it easier to run from bash (for example by escaping special characters like *).
So I have run into some problems:
If I read my sample.sql
SELECT * FROM SAMPLETABLE
to a variable with
ab=`cat sample.sql`
and execute it
db2 `echo $ab`
I receive an SQL error because the unquoted expansion of $ab makes the shell replace the * with all the file names in the current directory.
An easy solution would be to replace "*" with "\*". But I cannot do this, because the file needs to stay usable in programs like DB Visualizer etc.
Could someone give me hint in the right direction?
The DB2 command line processor has options that accept a filename as input, so you shouldn't need to load statements from a text file into a shell variable.
This command will execute all SQL statements in the file, with newline treated as the statement terminator:
db2 -f sample.sql
This command will execute all SQL statements in the file, with semicolon treated as the statement terminator:
db2 -t -f sample.sql
Other useful CLP flags (combined in the example after this list) are:
-x : Suppress the column headings
-v : Echo the statement text immediately before execution
-z : Tee a copy of all CLP output to the filename immediately following this flag
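For example, a minimal combined invocation might look like this (sample.sql and run.log are just placeholder names):

# Semicolon-terminated statements, echo each statement, no column headings,
# and keep a copy of all output in run.log.
db2 -t -v -x -z run.log -f sample.sql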
Redirect stdin from the file.
db2 < sample.sql
In case you have variables in your script and want the shell to replace them before the SQL is executed by DB2, then use this approach:
Contents of File.sql:
cat <<xEOF
insert into ${MY_SCHEMA}.${MY_TABLE} values(1,2);
select * from ${MY_SCHEMA}.${MY_TABLE};
xEOF
In command prompt do:
export MY_SCHEMA='STAR'
export MY_TABLE='DIMENSION'
Then you are all good to get it executed in DB2:
sh File.sql | db2 +p -t
The shell will substitute the exported variables and then DB2 will execute the resulting statements.
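The same idea also works without the intermediate file, by piping a heredoc straight into the CLP; a minimal sketch using the sample values above:

export MY_SCHEMA='STAR'
export MY_TABLE='DIMENSION'
# The unquoted heredoc delimiter lets the shell expand the variables before
# db2 reads the statements (-t: semicolon terminator, +p: no interactive prompt).
db2 +p -t <<EOF
insert into ${MY_SCHEMA}.${MY_TABLE} values(1,2);
select * from ${MY_SCHEMA}.${MY_TABLE};
EOF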
Hope it helps.

loop sql query in a bash script

I need to loop an Oracle sqlplus query using bash.
My scenario is like this: I have a set of names in a text file and I need to find out details of those names using a sqlplus query.
textfile.txt content:
john
robert
samuel
chris
bash script
#!/bin/bash
while read line
do
/opt/oracle/bin/sqlplus -s user@db/password @query.sql $line
done < /tmp/textfile.txt
sql query: query.sql
set verify off
set heading off
select customerid from customers where customername like '%&1%';
exit
The problem is that when I run the script I get errors like:
SP2-0734: unknown command beginning
"robert..." - rest of line ignored.
Can someone tell me how to solve this?
The way I do this all the time is as follows:
#!/bin/bash
cat textfile.txt |while read Name
do
sqlplus -s userid/password@db_name > output.log <<EOF
set verify off
set heading off
select customerid from customers where customername like '%${Name}%'
/
exit
EOF
done
Bash will automagically expand ${Name} for each line and place it into the SQL command before sending it to sqlplus.
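If you would rather keep your original query.sql with its &1 substitution parameter, a minimal sketch of that variant is below. Note the < /dev/null: without it, sqlplus inside a while read loop may swallow the remaining lines of textfile.txt from the loop's stdin, which is one plausible cause of the "SP2-0734: unknown command beginning "robert..."" error (that diagnosis is an assumption, not something verified against your environment; the connect string is written in the usual user/password@db form as a placeholder):

#!/bin/bash
# Minimal sketch: pass each name as a positional parameter to query.sql.
# Redirecting /dev/null into sqlplus keeps it from reading textfile.txt
# as SQL*Plus commands through the loop's stdin.
while read -r name
do
    /opt/oracle/bin/sqlplus -s user/password@db @query.sql "$name" < /dev/null
done < /tmp/textfile.txt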
Do you have SET DEFINE ON? Is your substitution character &? You could check glogin.sql to find out.
And yes, establishing n connections to run n queries is probably not a good solution. It may be acceptable if this is a quick one-time task, but if not, you should think about writing a procedure instead.