I have looked through the SQL Server questions and answers and I didn't see an answer to this one; if it is out there and I've missed it, please let me know.
Here's the situation:
I write stored procedures and views that are then run as reports (using Crystal) - this is not the problem. Before I can release the reports into Production, I need the end users to run the reports and check them for errors, etc. In a perfect world I would have a frozen test environment, but I don't live in a perfect world. Every night, everything I place into my test environment is wiped out, and every morning anything that is in end-user testing needs to be re-added. This means that when I come in, the first thing I do is run all of the stored procedures, along with a script that unhides the reports in the program we use.
What I'd like to be able to do is write a package that would find all of the stored procedures in a folder, execute them to add them to the database, and then run the script that unhides the reports.
I know how to set up an SSIS package to run a stored procedure, but I don't know how to set one up that would run an ever-changing list of stored procedures. Is this even possible? And if it is, how do I go about getting started?
I should note that while I have more than 10 years of query-writing experience, I haven't used VB since VB 6.0 and I'm very new to the SSIS and SSRS world.
Thanks in advance!
Good old NT shell (cmd.exe) will do the trick. Run this statement from a prompt in the folder containing the .sql files:
for %A in (*.sql) DO sqlcmd -i "%A" -S <myServer> -d <myDb> -E
If you want to include it in a batch file, it could look like this:
@echo off
for %%A in (*.sql) DO sqlcmd -i "%%A" -S <myServer> -d <myDb> -E
sqlcmd -i script_to_update_config.sql -S <myServer> -d <myDb> -E
This actually sounds like it may be more of a deployment issue than a SQL one. Take a look at Jenkins CI. I believe it's mostly used for code build and deployment, but it can also be used for any automated task.
If you have one SQL file that lists all the changed procs and their associated files, you can use that single script to run all the others (see http://www.devx.com/tips/Tip/15132). For that matter, you could just use a scheduled task to run it every morning.
Adding one more step, you could build that file automatically from the contents of a folder (using a little PowerShell script or the like), as in the sketch below.
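As a minimal sketch of the same idea in plain batch rather than PowerShell, using sqlcmd's :r include directive (the file name run_all.txt and the server and database placeholders are illustrative):

@echo off
rem Build an include list from every .sql file in the current folder.
rem The list uses a .txt extension so the wildcard loop never picks itself up.
if exist run_all.txt del run_all.txt
for %%A in (*.sql) do echo :r "%%A">> run_all.txt
rem :r is sqlcmd's include directive; one -i run executes every listed file.
sqlcmd -S <myServer> -d <myDb> -E -i run_all.txt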
I'm not sure an SSIS package is the right tool for this job.
Related
I have a 20 GB .sql dump file and I am trying to run it in MySQL Workbench using Run Script; after it executes successfully, I'll use SSMA to migrate the data from MySQL to SQL Server. I have migrated data this way many times successfully, but for a 20 GB file it seems very time-consuming. Please let me know if there is a faster alternative. I have followed this link:
Steps to migrate mysql tables to sql server using SSMA!
From your title, "unable to run .sql file in SSMS", and "I have a .sql dump file 20 gb": are you trying to open a 20 GB .sql file in SSMS? That's never going to work. SSMS is a 32-bit application, so the maximum addressable memory is 2 GB. If you want to run your .sql file, I suggest using sqlcmd.
Open up PowerShell, then run the command below, replacing the appropriate parts:
sqlcmd -S {Server Name/ServerIP} -U {Your Login} -i {Your full path to your script}
You'll be prompted for your password, and then the file will be run. So, as an example, you might run:
sqlcmd -S svSQL2017 -U Larnu -i \\svFileServer\SQLShare\Scripts\BigBatchFile.sql
If you are using integrated security, don't pass the -U parameter; sqlcmd uses Windows Authentication by default.
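For example, the same call with Windows Authentication would look like this (-E makes the trusted connection explicit):

sqlcmd -S svSQL2017 -E -i \\svFileServer\SQLShare\Scripts\BigBatchFile.sql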
Edit: This answer is not relevant to the OP's question, as they were using "SSMS" as a synonym for SQL Server, which it is not. I have left it here for the moment so the OP can review my comments, and I will likely remove it later.
I am trying to run a .sql script on a schedule. I have created a batch file to run the script. The script runs fine in sql server management studio and also when I run the batch file content through cmd.
Contents of the batch file:
sqlcmd -S omfmesql -U OMESRV -P orat -i "\\pvsrv-fsr14\data\Projects\Stat_Table_Creation_unique.sql"
The SQL script is supposed to update a stat table. When I run it through cmd and refresh the stat table, the numbers are updated. But when I run the batch file through Task Scheduler, the only action that seems to be performed is running C:\Windows\SYSTEM32\cmd.exe.
The task is reported as completed successfully, but the SQL query is simply not run.
I am not too experienced with Task Scheduler. Any help here would be very much appreciated. Thanks!
Note: I am not intending to use SQL Server Agent
If you have not done so, you need to set the task's start-in location in Task Scheduler (TS). In at least some versions of TS, this can only be done when you create a Basic Task, not from the more general "Create Task..." option. Ensure that all the paths in the batch file are absolute or are relative to this location.
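As a hedged sketch, the batch file could also call sqlcmd itself by an absolute path so nothing depends on PATH or the working directory (the install path below assumes a default SQL Server 2012 location, which is an assumption; adjust to your version):

rem Hypothetical fully-absolute version of the batch file.
"C:\Program Files\Microsoft SQL Server\110\Tools\Binn\SQLCMD.EXE" -S omfmesql -U OMESRV -P orat -i "\\pvsrv-fsr14\data\Projects\Stat_Table_Creation_unique.sql"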
I have created a SQL query that updates certain tables, taking a CSV file as input.
I want my co-workers to be able to execute this query as easily as possible. At first, I thought a batch file using sqlcmd was the best solution.
The end product works on my computer because I have SSMS installed, but no other computer is able to launch the batch file properly.
What is the best way for my end users to run a SQL query? I have thought about/researched these solutions:
-Install SSMS or the required tools(don't want each user to have to do this.)
-Install Psexec tools to allow for remote batch launching (also don't like this.)
Is there a better way?
Check SQLS*Plus from www.memfix.com - works the best.
Why don't you create a C# or VB.NET program that executes the proc and distribute the program to your users?
You don't have to install all of SSMS. You can just install SQLServer2008CmdLnUtilsx86.msi for SQL 2008, or get SQLCMD for SQL 2012 here: http://www.microsoft.com/en-us/download/details.aspx?id=36433. Just be aware that if you install SQLCMD from a bat file and then attempt to use SQLCMD later in that same bat file, you have to specify the full path to SQLCMD, because the PATH value was loaded when the bat started, and SQLCMD was not yet on it at that time.
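To illustrate that PATH caveat, a sketch of a bat file that installs the tools and then calls SQLCMD by its full path (the silent-install flags are standard msiexec; the install path is the SQL 2008 default and the server and script names are placeholders):

@echo off
rem Install the command-line utilities silently.
msiexec /i SQLServer2008CmdLnUtilsx86.msi /qn
rem PATH was captured when this bat started, so call SQLCMD by full path.
"C:\Program Files\Microsoft SQL Server\100\Tools\Binn\SQLCMD.EXE" -S <myServer> -E -i myScript.sql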
I'm new to T-SQL and I'm trying to back up my databases (using SQL Server 2008).
When I try to run the script via sqlcmd -i inputfile, I get this error message:
'DATE' Scripting variable not defined.
The problem is I have a line like this:
...TO DISK = "FileName_$(ESCAPE_NONE(DATE)).BAK" ...
Putting a date in the filename keeps it from overwriting my old backups.
If I run the script in Management Studio it works, but if I run it from the command line with sqlcmd -i, it doesn't.
EDIT:
I looked at the job history and I saw this error message:
"For SQL Server 2005 SP1 or later, you must use the appropriate ESCAPE_xxx
macro to update job steps containing tokens before the job can run"
I don't quite understand what that means. I've already used $(ESCAPE_NONE(DATE)); what's wrong?
Old question, I know, but this is one of the first results, and if anyone else has the same problem the answer isn't particularly easy to find.
Including the -x switch, which disables scripting-variable substitution, fixed the problem for me:
sqlcmd -x -i inputfile
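Alternatively, if you want sqlcmd itself to expand a date rather than ignore the token, you can define your own scripting variable with -v. A sketch, assuming the script references $(DATE) and that %DATE% is in the US default "Ddd MM/DD/YYYY" format (both are assumptions):

sqlcmd -i inputfile -v DATE=%DATE:~-4%%DATE:~4,2%%DATE:~7,2%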
If you're trying to back up your SQL Server databases and append the date to the filenames using sqlcmd, there's an easy thing you can try.
First, create the stored procedure called sp_BackupDatabases, which you can find here:
http://support.microsoft.com/kb/2019698
You can invoke it from sqlcmd with a command like this:
sqlcmd -U Damieh -P ilovechocolate -S (local) -Q "EXEC sp_BackupDatabases @backupLocation='C:\MyBackups\', @BackupType='F'"
I'm sure you know this already, but just in case: -U is the user, -P is the password, -S is the server, and -Q is the query. You can back up all of your databases or only some of them; there are parameters for that. You can find the details of the stored proc's parameters at the same link.
The date will be appended automatically, and you can play with the sp's code if you want it in a different place or format. I use this regularly, with great success, on servers that don't have a non-Express SQL Server (meaning I can't schedule backups without using a .bat and Task Scheduler).
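If you do go the .bat-plus-Task-Scheduler route, the pieces might look like this sketch (the file names, folder, task name, and schedule are all illustrative):

rem backup.bat -- nightly full backups via sp_BackupDatabases
sqlcmd -U Damieh -P ilovechocolate -S (local) -Q "EXEC sp_BackupDatabases @backupLocation='C:\MyBackups\', @BackupType='F'"

rem Register it once from an elevated prompt:
schtasks /Create /TN "NightlyBackups" /TR "C:\Jobs\backup.bat" /SC DAILY /ST 02:00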
I apologize if this wasn't the answer you were looking for =). Have a nice day!
I know I'm coming along late on this thread, but you can use the following:
SQLCMD -S YourServer -E -d YourDatabase -i YourScript 2> nul
That will send the standard error stream (stderr) to the bit bucket.
Lately I have been dumping relatively large tables using SSMS. The usual way is to set Query -> Results To -> Results to File, hit Execute, choose a file, and let the SQL query run. After it finishes, I usually zip the file and then transfer it to my local machine. This has the obvious problem of the host machine running out of space during overnight SQL queries.
I was wondering if there is a way to compress the output from SSMS directly without having to wait until it dumps the results from the entire query. Any suggestions? The host machine is pretty restricted in what it allows me to run on it so a suggestion that requires minimal third-party software would be great.
Run the queries from sqlcmd instead and pipe the output into a command-line zip utility (you'll need to install one; see "What's a good tar utility for Windows?"). Or you can use PowerShell, which can zip out of the box, including piped input; see "Compress Files with Windows PowerShell then package a Windows Vista Sidebar Gadget". That requires no additional tools, as PS is already on your host server (although on a second read, I think the PS solution in the link still requires a deflated file first and cannot compress on the fly).
Sample query using sqlcmd and 7-Zip:
sqlcmd -S <DATABASE> -s <COLUMNSEP> -Q "SELECT ..." | .\7za.exe a -si <FILENAME>
Remember to use -Q (run query and exit) and not -q (run query), or else this won't work.
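For instance, a filled-in version of that pipeline (the server, database, separator, query, and archive name are all placeholders) might look like:

sqlcmd -S MyServer -d MyDb -E -s "," -Q "SELECT * FROM dbo.BigTable" | .\7za.exe a -si BigTable.7z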