I have a small VBScript to load data and process it using sqlldr and sqlplus. I have two questions about sqlplus usage though:
1) Can I exec a stored procedure without using an .sql script file?
e.g. sqlplus user/pass@server @exec proc_myname
2) Can I use .sql script files on a shared UNC path?
e.g. sqlplus user/pass@server @\\server\path\script.sql
I've tried playing around and at the moment am working around the problem by using a local temp directory to store the sql files. But I'm curious if there's another/better way.
Thanks
Can I exec a stored procedure without using an .sql script file?
Yes. I don't know VBScript well enough, but the basic principle is that you create a child process for sqlplus and then send the commands via stdin (i.e. you have your script write to the standard input of the child process).
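As a rough illustration, here is what that looks like from PowerShell (the connection string and procedure name are just the placeholders from the question); in VBScript the equivalent is writing to the StdIn stream of the object returned by WScript.Shell's Exec method:
"EXEC proc_myname;", "EXIT" | sqlplus -S user/pass@server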
Can I use .sql script files on a shared UNC path?
If the path is correct, then that should work. You can also try I/O redirection:
sqlplus user/pass@server < \\server\path\script.sql
The drawback of this approach is that the error messages won't include the .sql script name.
Re your second question:
Running a script from a UNC path works for me without drive mapping or I/O redirection:
sqlplus /nolog "#\\server\path\file.sql"
sqlplus version 12.2.0.1
The quotes and the '@' are important, apparently.
I am using Aqua Data Studio 7.0.39 for my Database Stuff.
I have 20 SQL files (all containing SQL statements, obviously).
I want to execute them all rather than copy-pasting the contents of each one.
Is there any way in Aqua to do such a thing?
Note: I am using Sybase
Thank you !!
I'm also not sure how to do this in Aqua, but it's very simple to create a batch/PowerShell script to execute .sql files.
You can use the SAP/Sybase isql utility to execute files, and just create a loop to cover all the files you wish to execute.
Check my answer here for more information:
Running bulk of SQL Scripts in Sybase through batch
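For instance, a short PowerShell loop along these lines would run every .sql file in a folder through isql (the folder, credentials and server name are placeholders):
Get-ChildItem -Path C:\scripts -Filter *.sql | Sort-Object Name | ForEach-Object {
    isql -U myuser -P mypass -S MYSERVER -i $_.FullName
}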
In the latest versions of ADS there is an integrated shell named FluidShell where you can achieve what you are looking for. See an overview here: https://www.aquaclusters.com/app/home/project/public/aquadatastudio/wikibook/Documentation15/page/246/FluidShell
The command you are looking for is source
source
NAME
source - execute commands or SQL statements from a file
SYNOPSIS
source [OPTION...] FILE [ARGUMENT...]
source [OPTION...]
DESCRIPTION
Read and execute commands or SQL statements from FILE in the current shell environment.
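Going by the synopsis above, running one of your files from FluidShell should look something like this (the path is a placeholder):
source C:\scripts\script1.sql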
I have not used Aquafold before so I can't tell you exactly. However I have tackled a similar problem once before.
I once created a PowerShell script. It opened an ODBC connection to my database and then executed stored procedures in a loop until the end of the file.
I suggest having a text document with each line being the name of a stored proc to run. Then, in your PowerShell script, read in a line from the file and concatenate it into the call that executes a stored procedure. After each execution completes you can delete the line from the text file and then read the next line, until the EOF (end of file) is reached.
Hope this helps. If I have some time this morning I will try and do a working example for you and post it.
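For illustration, a rough PowerShell sketch of that approach could look like the following (the DSN name, file path and procedure names are placeholders, not anything from the original post):
# Read one stored procedure name per line and execute each one over ODBC
$conn = New-Object System.Data.Odbc.OdbcConnection("DSN=MyDsn")
$conn.Open()
foreach ($procName in Get-Content "C:\scripts\procs.txt") {
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = "EXEC $procName"   # build the call from the line that was read in
    [void]$cmd.ExecuteNonQuery()
}
$conn.Close()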
I have an SQL script that creates a database, and want to know how to run this from a batch file at a command prompt.
Do I create a batch file with a few lines of code pointing to the location of the .sql file, or create a new batch file containing the contents of the .sql file?
I've had a look at a couple of related questions, but can't seem to see a clear answer.
Thanks :)
You could create a batch file and use the -i flag with sqlcmd.exe, where -i sets the path to the .sql file you want to run:
sqlcmd.exe -i F:\wherever\the\file\is
See http://msdn.microsoft.com/en-us/library/ms162773.aspx for a full list of the flags and this post, How to use sqlcmd to create a database.
It's quite easy, you only need 3 things:
SQL client path
Script path
SQL connection data
Then you can run it directly from the batch file without any complex script - let me know if you have the info and I can arrange it.
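For example, a one-line .bat combining those three pieces might look like this (the paths, server name and script name are placeholders; -E uses Windows authentication, use -U/-P for a SQL login):
"C:\Program Files\Microsoft SQL Server\100\Tools\Binn\sqlcmd.exe" -S MYSERVER -E -i "C:\scripts\create_database.sql"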
regards
Suppose I have 100 SQL files and I need to execute all the files one by one in sequence. Is there any approach to do this without executing the scripts manually?
You can write a .bat file to execute them using the sqlcmd utility.
Write a shell script or similar to run them sequentially.
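A minimal PowerShell sketch of such a loop, running the files in file-name order (the server and database names are placeholders):
Get-ChildItem -Filter *.sql | Sort-Object Name | ForEach-Object {
    sqlcmd -S MYSERVER -d MyDatabase -E -i $_.FullName
}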
We've had great success with the SQL Deploy tool by SSW Australia.
It's not free - but worth every penny, and saves you so much time, it pays for itself in no time at all!
(I have no affiliation with SSW Australia other than being a happy user of SQL Deploy)
Redirect a bare file listing to a file (dir /b > foo.txt), then add sqlcmd at the start of each line etc. using a decent text editor like Notepad++.
You can use PowerShell to do this. The following blog post describes such a script. As part of the foreach, a pipe is used to sort the files into the order you want to process them in. In this example they are sorted by descending alphabetical file name, but you can also sort by other attributes, such as the date the file was created.
Also, the following blog post describes how to run all the .sql files in a directory like the post linked above, but without the use of PowerShell.
Assuming your files are named something like this:
001_my_script.sql
002_another_script.sql
003_foo_script.sql
004_bar_script.sql
You can do the following at the command line:
copy *.sql /a my_big_script.sql
And then run the resulting file as one script (via sqlcmd or Management Studio).
Can I use T-SQL to operate on normal operating system files? Such as creating a .bat file at c:\test and writing some query results into that batch file?
Thanks.
For general tips on reading/writing files, you can check out this link.
You can also use SQLCMD, like this (input.sql would be your input sql, Results.txt would be your output):
SQLCMD -i Input.sql -o C:\Results.txt -e
Yes - use SQLCMD.
You could also use xp_cmdshell:
xp_cmdshell
Executes a given command string as an operating-system command shell and returns any output as rows of text. Grants nonadministrative users permissions to execute xp_cmdshell.
link to: xp_cmdshell - msdn reference
I have a SQL script which is extremely large (about 700 megabytes). I am wondering if there is a good way to reduce the size of the script?
I know there are code minimizers for JavaScript and am looking for one to use with SQL scripts.
I am not looking to improve the performance of the SQL script. I am trying to make the file size smaller by removing excess whitespace and keeping name-qualification down so that the script files can be smaller.
If I attempt to load the file in SQL Server Management Studio I get this error.
Not enough storage is available to process this command. (Exception from HRESULT: 0x80070008) (mscorlib)
What's in this script of 700MB?! I would hope that there are some similarities/repetitions that would allow you to shorten the file.
Just some guesses:
Instead of inserting a million records using Insert statements, use a bulk loading tool (see the bcp sketch after this list)
Instead of updating a number of individual records, try to batch updates to the same value into one (e.g. Update tab set col=1 where id in (..) instead of individual updates)
Long manipulations can be defined as a stored procedure (before running the script) and the script would only have to call the stored proc
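As a sketch of the first point, loading the rows from a flat file with bcp instead of running a million INSERT statements would look roughly like this (the database, table, data file and server names are placeholders):
bcp MyDatabase.dbo.MyTable in "C:\data\MyTable.dat" -c -T -S MYSERVER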
Of course, splitting the script up into smaller portions and calling each one from a simple batch file would work too. But I'd be a little worried about performance (how long does the execution take?!) and would look for some faster ways.
What about breaking your script into several small files, and calling those files from a single master script?
This link describes how to do it from a stored procedure.
Or you can do it from a batch file like this:
REM =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
REM Define widely-used variables up here so they can be changed in one place
REM Search for "sqlcmd.exe" and make sure this path is valid for you
REM =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
set sqlcmd="C:\Program Files\Microsoft SQL Server\100\Tools\Binn\sqlcmd.exe"
set uname="your_uname_here"
set pwd="your_pwd_here"
set database="your_db_name_here"
set server="your_server_name_here"
%sqlcmd% -S %server% -d %database% -U %uname% -P %pwd% -i "c:\script1.sql"
%sqlcmd% -S %server% -d %database% -U %uname% -P %pwd% -i "c:\script2.sql"
%sqlcmd% -S %server% -d %database% -U %uname% -P %pwd% -i "c:\script3.sql"
pause
I like the batch file approach myself, because it is easier to tinker with it, and you can schedule it as a windows job.
Make sure the .BAT file is in a folder with the appropriate security restrictions, since it has your credentials in a plain text .BAT file.
gzip should do.
SQL is much harder to shrink; the field and table names and the commands need to be what they are. Plus, you wouldn't want to rewrite the commands as something shorter because it could have implications on performance.
Depending on the DBMS that you use, it may allow short names for commands, and then there might be a converter.
(Answering this because it is the top item returned when I searched for "SQL script size")
I got the same error when trying to load a large script into Management Studio. In my case I was trying to downgrade a database from SQL Server 2008 R2 to SQL Server 2008 using the SQL Server script generator, which created a 700 MB structure-and-data .sql file.
To get around it I used the command line to run the script instead:
C:\>sqlcmd -S [SQLSERVER\INSTANCE] -i [FILELOCATION\FILENAME].sql
Hopefully this helps someone else.
Compressing the .sql file will give the biggest reduction in size.
Minifying the plain-text SQL will only save a few bytes or kilobytes per megabyte - it's not worth it.
The better approach is to create a routine that unzips and reads the file; that gives the biggest benefit, I guess.
Today, filesize shouldn't be a problem. Dial-up connection? Floppy disks?
pg-minify can do it, and not just for PostgreSQL, but for most SQL dialects, including MS SQL, MySQL, etc.