How to export a PostgreSQL database table to a CSV file with Qt?

The connection is already established successfully.
With QSqlQuery it seems impossible to execute PostgreSQL's COPY statement.
For example, executing this query from Qt:
COPY table TO 'D:/table.csv' DELIMITER ',' CSV HEADER;

After studying this a little, I found a script in the PostgreSQL installation folder (.../scripts) and modified it to execute the COPY TO command, because QSqlQuery definitely does not execute the COPY command. Here is the script on Windows 7:
for /f "delims=" %%a in ('chcp ^|find /c "932"') do # SET CLIENTENCODING_JP=%%a
if "%CLIENTENCODING_JP%"=="1" SET PGCLIENTENCODING=SJIS
if "%CLIENTENCODING_JP%"=="1" SET /P PGCLIENTENCODING="Client Encoding [%PGCLIENTENCODING%]: "
REM Run psql
"D:/PostgreSQL\bin\psql.exe" -h localhost -U USER -d DATABASE -p 5432 --no-password --command="\copy (select * from TABLE) TO './EXPORT_TABLE.csv' CSV HEADER"

Related

How to import several SQL files into PostgreSQL using a batch script?

I have a set of sql files
file1.sql
file2.sql
....
that contain table inserts INSERT INTO <tablename> (...) VALUES (....); in a directory. I would like to add a command to an existing batch file so that it iterates over each file.
Currently I use the command
<postgreSQLCommandWithParameters> -U <user> -d <database> -f <pathToSQL>\complete.sql -q
I looked into a for loop, but I am missing the parameter to point to the correct directory.
for %%G in (*.sql) do <postgreSQL> -U <user> -d <database> -E -i"%%G"
Therefore, how can I use a for loop to iterate over all SQL files which contain insert commands?
As a side question: what is the syntax of the for loop in batch scripts?
So I used the following command
FOR /F "tokens=4 USEBACKQ" %%F IN (`dir <pathToSQLFiles> ^| findstr .sql `) DO (
<postgreSQL> %1 -U <user> -d <database> -f <pathToSQLFiles>\%%F -q
)
And here are some cliff notes:
"tokens=4 USEBACKQ" takes the fourth token (the file name) from each line of the dir output, and USEBACKQ allows the back-quoted command to be executed
^| findstr .sql filters the listing down to the SQL files
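If the directory path contains no spaces or other awkward characters, a simpler sketch (reusing the placeholders from the question, inside a batch file) is to change into the script directory first and let a plain for loop enumerate the files:
cd /d <pathToSQLFiles>
for %%G in (*.sql) do <postgreSQL> -U <user> -d <database> -f "%%G" -q
Typed directly at the command prompt, the loop variable would be a single %G instead of %%G.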

Can I specify an input sql file with bcp?

How can I specify an input sql file with a long query when using bcp? I tried using the -i option but it keeps complaining about a command-line error with no extra information. Is this possible?
I had this problem today and found a convenient workaround, at least in an ad-hoc situation.
Temporary tables can be created by any user with connect permissions. This means you can also create GLOBAL temporary tables.
Just run your query in Enterprise Manager (or sqlcmd or whatever) using SELECT ... INTO with a global temporary table, e.g.
SELECT *
INTO ##mytemptable
FROM SomeTable
WHERE [massive where clause, for example]
You can then use the temporary table in the BCP query with a simple
SELECT * FROM ##mytemptable
Then drop the temp table through Enterprise Manager:
DROP TABLE ##mytemptable
I fixed it another way.
I created a batch file which reads a file and sends its content to the bcp command. See:
@ECHO off
SETLOCAL EnableDelayedExpansion
SET queryFile=%1
SET outFileName=%2
FOR /F "delims=" %%i IN (%queryFile%) DO SET join=!join! %%i
ECHO %join%
bcp "%join%" queryout %outFileName% /S.\SQLE_CAESAR /d /c /t"|" /T
That script receives two parameters:
the filename which contains the query;
the filename for the exported data.
Execute the script in cmd like this:
export-query.bat query.sql export.txt
I hope that helped.
As far as I know, the BCP utility only supports Transact-SQL queries written directly on the command line. For example:
bcp "SELECT Name FROM AdventureWorks.Sales.Currency" queryout Currency.Name.dat -T -c
According to its reference the "-i" option:
Specifies the name of a response file, containing the responses to the command prompt questions for each data field when a bulk copy is being performed using interactive mode (-n, -c, -w, or -N not specified).
Notice that it differs from the sqlcmd Utility "-i" option:
Identifies the file that contains a batch of SQL statements or stored procedures. Multiple files may be specified that will be read and processed in order (...)
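So when the query is too long for the command line and formatted text output is acceptable, one alternative sketch (server, database, and file names are placeholders) is to hand the file to sqlcmd, which does read SQL from a file via -i, instead of to bcp:
sqlcmd -S SERVERNAME -d DATABASENAME -E -i longquery.sql -s"," -W -h -1 -o export.csv
Here -s sets the column separator, -W removes trailing spaces, and -h -1 suppresses the column headers; unlike bcp queryout, the result is formatted query output rather than a native bulk-copy data file.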
Try:
query=$( cat < /file.sql )
export query
bcp "${query}" queryout /home/file.csv
Multi-line queries can be given to bcp easily using PowerShell:
PS> $query = @'
select *
from <table>
'@
PS> bcp $query queryout <outfile> -d <database> -T -S <server> -c
I faced the same issue; it may not be a very good approach, but I did something like the following:
bcp "declare @query nvarchar(max) set @query = (SELECT * FROM OPENROWSET(BULK 'F:\tasks\report_v2.sql', SINGLE_CLOB) AS Contents) exec sp_executesql @query" queryout %outFileName% /c /C RAW -S . -U sa -P 123 -d blog /T
And I must say, if you use a global temp table, the global temp table is dropped automatically after the query executes, so you can't use that approach in some situations.
What really worked for me is this:
@ECHO off
setlocal enableextensions enabledelayedexpansion
SET "queryFile=%1"
SET "outFileName=%2"
SET RESULT=
FOR /F "delims=" %%i IN ('type %queryFile%') DO SET RESULT=!RESULT! %%i
echo %RESULT%
rem bcp "%RESULT%" queryout %outFileName% -t^ -r \n -T -k -c -d DB_NAME -S SERVER_NAME
type file is the equivalent of cat file in Unix.
What I did with complex queries was create a stored procedure with the desired statement and call it from BCP:
bcp "exec db.schema.stored_procedure" queryout "c:\file.txt" -T -S localhost -t "|" -c
This worked great for me. Greetings!
I made my own script (called bulk.sh) to do this (not optimal and not best practice; the script is ugly, but very functional).
#!/bin/bash
input="SQL_FILE.sql"
count=0
const=1000
lines=()
mkdir -p bulk
while IFS= read -r line
do
    lines+=("$line")
    count=$((count+1))
    check=$((count % const))
    if [[ $check -eq 0 ]]; then
        # join the accumulated lines and write them out as one chunk file
        bulk="${lines[*]}"
        unset lines
        number=$(printf "%010d" $count)
        echo "$bulk" > "bulk/bulk${number}.sql"
    fi
done < "$input"
# write out any leftover lines that did not fill a complete chunk
if [[ ${#lines[@]} -gt 0 ]]; then
    number=$(printf "%010d" $count)
    echo "${lines[*]}" > "bulk/bulk${number}.sql"
fi
FILES="bulk/*"
for f in $FILES
do
    echo "Processing $f file..."
    sqlcmd -S SERVER -d DATABASE -U USER -P "PASSWORD" -i "$f"
    sleep 2s
done
You can try it with:
$ docker run -v /path/to/your/sql/file/folder:/backup -it mcr.microsoft.com/mssql-tools
$ bash bulk.sh

How to run multiple SQL scripts using a batch file?

I have a case where I have 10+ SQL scripts.
I don't want to go and run all my scripts one by one.
Is there a way that I can run all my scripts in succession in SQL Server Management Studio?
I found this post. Creating a batch file seems easier.
This is all you need:
@echo off
ECHO %USERNAME% started the batch process at %TIME% >output.txt
for %%f in (*.sql) do (
(
sqlcmd.exe -S servername -E -d databasename -i %%f >>output.txt
)
pause
I am replacing servername and databasename, but it seems not to be working.
Any ideas?
You've got an unmatched parenthesis, there.
Try
for %%f in (*.sql) do sqlcmd.exe -S servername -E -d databasename -i %%f >>output.txt
I just saved it in a .cmd file and it appears to be working.
Yes, it's possible. You can do it with the :r command of SQLCMD.
I strongly recommend reading this article and doing it with SQLCMD:
http://www.mssqltips.com/sqlservertip/1543/using-sqlcmd-to-execute-multiple-sql-server-scripts/
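As a minimal sketch of that approach (the file and server names here are hypothetical), a master script lists one :r directive per file:
-- master.sql (processed by sqlcmd, or by SSMS in SQLCMD mode)
:r .\script1.sql
:r .\script2.sql
:r .\script3.sql
A single sqlcmd call then executes everything in order:
sqlcmd -S servername -E -d databasename -i master.sql -o output.txt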
You can create a stored procedure to call all your scripts. You could also create a schedule plan to run the scripts automatically.
http://msdn.microsoft.com/en-us/library/aa174792(v=sql.80).aspx
Here is an open source utility with source code: http://scriptzrunner.codeplex.com/
This utility was written in C# and allows you to drag and drop many SQL files and start running them against a database.
You can use the Batch Compiler add-in for SSMS; it lets you run multiple scripts at once, create SQLCMD scripts, or consolidate them into a *.sql file.
A batch trick:
REM use this if you use 'for xxx in'; it solved most of my problems
cd %~dp0
ECHO %USERNAME% started the batch process at %TIME% >output.txt
for %%f in (*.sql) do (
sqlcmd.exe -S servername -E -d databasename -i %%f >>output.txt
)
echo %errorlevel%
pause
If you want to run Oracle SQL files through a batch program, the code below will be useful. Just copy it and change the database credentials and DB names.
@echo off
for %%i in ("%~dp0"*.sql) do echo @"%%~fi" >> "%~dp0all.sql"
echo exit | sqlplus scott/tiger@orcl @"%~dp0all.sql"
pause
Basically, you need to put this batch file in the folder where you have all the SQL files. It first collects all the SQL file names in the directory together with their full paths and writes them into a file called all.sql; sqlplus then runs that all.sql, which executes every SQL file in the directory.

Why doesn't a prompt for input in a batch file allow a DASH character?

I have this simple little batch file program that I wrote, but it fails if I enter a database name that contains a "-" character. I'm not exactly sure why, but I wish I could figure out a way around this.
:: open DB batch file
#echo off
:: starts Sql Server Management Studio Express 2005
:: and opens it to a specific database with query
:: window already open
cls
:SHOWDBNAMES
echo Database names detected on this system:
echo.
"%PROGRAMFILES%\Microsoft SQL Server\90\Tools\Binn\OSQL.EXE" -h-1 -S . -E -Q "SELECT CAST(name AS VARCHAR(30)) FROM sysdatabases"
#echo.
set DBNAME=
set /P DBNAME=What database name would you like to open (choose from list)?
if "%DBNAME%" == "" (
echo.
echo I don't recognize your selection. Try again.
goto SELECTDB
)
:SHOWTABLES
cls
echo.
echo Tables that you can query from %DBNAME% are:
echo.
"%PROGRAMFILES%\Microsoft SQL Server\90\Tools\Binn\OSQL.EXE" -h-1 -S . -E -Q "use [%DBNAME%];SELECT CAST(name AS VARCHAR(30)) FROM sys.Tables ORDER BY name"
echo.
:RUNIT
sqlwb.exe -nosplash -S . -E -d %DBNAME%
pause
:EOF
Try enclosing the database name in square brackets:
[database-name]
EDIT
The following should work - you need to quote the database name in the call to sqlwb.exe:
:SHOWTABLES
cls
echo.
echo Tables that you can query from %DBNAME% are:
echo.
"%PROGRAMFILES%\Microsoft SQL Server\90\Tools\Binn\OSQL.EXE" -h-1 -S . -E -Q "SELECT CAST(name AS VARCHAR(30)) FROM [%DBNAME%].sys.Tables ORDER BY name"
echo.
:RUNIT
sqlwb.exe -nosplash -S . -E -d "%DBNAME%"
I've got to ask though - what's the point of this script? The built-in SSMS Object Explorer gives you all this information for free.
Also, your script doesn't take into account SQL Server instances other than the default - SQL Server Express is installed as <machine_name>\SQLEXPRESS by default.
Why don't you just try an underscore (_)?
Is it failing on the sqlwb.exe line when the dash is the first letter in the database name? If so, your problem is that sqlwb is misinterpreting the database name as a command line option. There should be some way to make it not do that; check the manual.

DOS command to execute all SQL script in a directory and subdirectories

I need a DOS command or a batch (.bat) file I can execute to run all the *.sql scripts in a directory and its subdirectories. What would the solution be?
The following will get you started
for /r %f in (*.sql) do echo %f
Run from the command line that will print the names of all the SQL files in the current directory and all sub directories.
Then substitute sqlcmd <connection args> -i%f for echo %f to execute the scripts.
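For example (servername and databasename are placeholders), the completed one-liner typed at the command prompt would be:
for /r %f in (*.sql) do sqlcmd -S servername -E -d databasename -i "%f"
Inside a batch file, each %f becomes %%f.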
Hope this helps.
Here you go. This batch file will execute all SQL files in a directory and its subdirectories. It will also create an output.txt file with the results so you can see errors and whatnot. Some notes on the batch file:
[YourDatabase] is the name of the database you want to execute the scripts against.
[YourPath] is the path of where you keep all the scripts.
[YourServerName\YourInstanceName] is the SQL server name and instance name, separated with a '\'
You'll want to replace the text after the '=' for each variable with whatever is appropriate for your server
Be sure NOT to put spaces around the '='
Do not put any quotes around [YourPath]
Make sure that [YourPath] has a '\' at the end
SET Database=[YourDatabase]
SET ScriptsPath=[YourPath]
SET ServerInstance=[YourServerName\YourInstanceName]
IF EXIST "%ScriptsPath%output.txt" del "%ScriptsPath%output.txt"
type NUL > "%ScriptsPath%output.txt"
FOR /R "%ScriptsPath%" %%G IN (*.sql) DO (
sqlcmd -d %Database% -S %ServerInstance% -i "%%G" -o "%%G.txt"
echo ..................................................................................... >> "%ScriptsPath%output.txt"
echo Executing: "%%G" >> "%ScriptsPath%output.txt"
echo ..................................................................................... >> "%ScriptsPath%output.txt"
copy "%ScriptsPath%output.txt"+"%%G.txt" "%ScriptsPath%output.txt"
del "%%G.txt"
)
for %f in ("c:\path\to\dir\*.sql") do sqlcmd -S [SERVER_NAME] -d [DATABASE_NAME] -i "%f" -b
Try a for loop. The options of this command have evolved and I'm not sure what version of DOS you are using, but assuming that DOS includes "cmd.exe from Windows XP", something like this could work:
for /r . %f in (*.sql) do @echo %f
Ok, this will only print the names of the files. I'm assuming you already have a program that you can run from the command line that will execute one SQL file, which you can use instead of echo.
For more information, try for /?.