I am executing a couple of update and append scripts on my SQL Server, and the results SQL shows look like
(73558 row(s) affected)
or
column created successfully
So, how can I save any messages that pop up after an operation to a log/text file, like I do in Python? It will be really helpful to see my scripts' progress when I set them up in Task Scheduler to run automatically. Any help will be highly appreciated.
Run your scripts through sqlcmd and redirect output to a file.
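For example (the server and path names here are placeholders):
sqlcmd -S MYSERVER -i C:\scripts\MyScript.sql -o C:\logs\MyScript.log
The rows-affected messages and any PRINT output will then land in the log file.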
Modify your scripts to write a diagnostic message to a log table after each statement.
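A minimal sketch of the log-table approach (the log table, target table, and message text are placeholders, not from the question):

-- One-time setup: a simple log table
CREATE TABLE dbo.ScriptLog (
    LogId    int IDENTITY(1,1) PRIMARY KEY,
    LoggedAt datetime2      NOT NULL DEFAULT SYSDATETIME(),
    Message  nvarchar(4000) NOT NULL
);
GO

DECLARE @rows int;

UPDATE dbo.SomeTable             -- placeholder for one of your update statements
SET    SomeColumn = SomeColumn;
SET @rows = @@ROWCOUNT;          -- capture immediately; the next statement resets it

INSERT INTO dbo.ScriptLog (Message)
VALUES (N'SomeTable update affected ' + CAST(@rows AS nvarchar(20)) + N' rows');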
I am using the Ola Hallengren script for my maintenance solution. When I run just the database backup job for user databases I get the following error: "Unable to start execution of step 1 (reason: Variable SQLLOGDIR not found). The step failed."
I have checked the directory permissions and there is no issue there. The script creates the job with no problem; I get the error message when I try to run the job.
I had this same issue just the other day. I run a number of 2017 servers but the issue happened when I started running on a 2012 server.
I've dropped Ola a mail to confirm, but as best I can make out, the SQLLOGDIR parameter specified in the 'advanced' tab for the step (for logging outputs) is not compatible with 2012, and maybe versions below 2017, though I have not tested these.
HTH,
Adam.
You need to replace the job name token in the advanced tab. For example, replace
$(ESCAPE_SQUOTE(JOBNAME)) with CommandLogCleanup_$(ESCAPE_SQUOTE(JOBID)) so that it looks like this:
$(ESCAPE_SQUOTE(SQLLOGDIR))\CommandLogCleanup_$(ESCAPE_SQUOTE(JOBID))_$(ESCAPE_SQUOTE(STEPID))_$(ESCAPE_SQUOTE(DATE))_$(ESCAPE_SQUOTE(TIME)).txt
instead of this:
$(ESCAPE_SQUOTE(SQLLOGDIR))\$(ESCAPE_SQUOTE(JOBNAME))_$(ESCAPE_SQUOTE(STEPID))_$(ESCAPE_SQUOTE(DATE))_$(ESCAPE_SQUOTE(TIME)).txt
Do this for all the other jobs if you don't want to recreate them.
I had the same issue on my SQL Server 2012 instance; the error occurred during the database backup using Ola's scripts. As mentioned above, the issue is with the output file. I changed the location and the output file name in the SQL Agent job and reran the job successfully.
The error is related to the job output file.
When you create a maintenance job using the Ola script, it automatically assigns an output file to the step. Sometimes that location does not exist on the server.
I faced the same issue: I ran the integrity check script manually on the server and it completed without error, so I found that the problem was in the job configuration.
I changed the job output file location and now the job runs fine.
The trick is to build the string for the @output_file_name parameter element by element before calling the stored procedure. If you look into Ola's code you will see that is exactly what he is doing.
I have tried to describe this in more detail in the post Add SQL Agent job step with tokens in output file name.
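A minimal sketch of that idea (the job, directory, and command are placeholders, not Ola's actual code; note the literal log directory in place of the SQLLOGDIR token):

DECLARE @output_file_name nvarchar(max);

SET @output_file_name = N'C:\Logs'                 -- literal directory instead of $(ESCAPE_SQUOTE(SQLLOGDIR))
                      + N'\MyJob_'
                      + N'$(ESCAPE_SQUOTE(JOBID))_'
                      + N'$(ESCAPE_SQUOTE(STEPID))_'
                      + N'$(ESCAPE_SQUOTE(DATE))_'
                      + N'$(ESCAPE_SQUOTE(TIME)).txt';

EXEC msdb.dbo.sp_add_jobstep
    @job_name         = N'MyJob',                  -- assumes this job already exists
    @step_name        = N'Run backup',
    @subsystem        = N'TSQL',
    @command          = N'SELECT 1;',              -- placeholder command
    @output_file_name = @output_file_name;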
I am currently using Microsoft Visual Studio 2010. I have a .sql file on my computer which has a lot of SQL statements in it. Basically it has three create table statements, multiple insert into statements, multiple alter table statements adding foreign keys, etc.
I want to know whether there is a way to load that .sql file into an Execute SQL Task, or how it is otherwise possible, in SSIS, to execute this long .sql file. I feel like an Execute SQL Task is involved, but I don't know for sure. This was the Execute SQL Task I tried before, to no avail.
Any help would be appreciated.
I have some screenshots, basically to show how long a file I'm talking about... and it goes on longer than what is shown.
The Execute SQL Task is what runs the commands.
Change the SQL Source type from the default of "Direct Input" to "File Connection"
Then in the FileConnection property, specify a file connection manager that points to MyFile.sql
That said, you can also just run the file(s) directly in SSMS, or with sqlcmd if you prefer a command line.
Does the SQL file contain parameterised insert statements? If so, you need to map the parameters, and check that the source you are connecting to is accessible and that the structure of the tables is the same as well.
This is my first job creation task as a SQL DBA. First step of the job runs a query and sends the output to a .CSV. As a last step, I need the job to execute the query from the .CSV file (output of first step).
I have Googled all possible combinations but no luck.
Your question got lost somehow...
Your last two comments make it a little clearer.
If I understand it correctly, you create a SQL script which restores all the logins, roles and users, their rights, etc. into a newly created DB.
If this created script is executable within a query window, you can easily execute it with EXECUTE (https://msdn.microsoft.com/de-de/library/ms188332(v=sql.120).aspx).
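A minimal sketch (the statement text is a placeholder for your generated script):

DECLARE @sql nvarchar(max);
SET @sql = N'SELECT name FROM sys.databases;';   -- placeholder: put the generated script text here
EXEC (@sql);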
Another approach could be SQLCMD (http://blog.sqlauthority.com/2013/04/10/sql-server-enable-sqlcmd-mode-in-ssms-sql-in-sixty-seconds-048/)
If you need further help, please come back with more details: What does your "CSV" look like? What have you tried so far?
I am using Aqua Data Studio 7.0.39 for my Database Stuff.
I have 20 SQL files (all containing SQL statements, obviously).
I want to execute them all rather than copy-pasting the contents of each.
Is there any way in Aqua to do such a thing?
Note: I am using Sybase
Thank you !!
I'm also not sure how to do this in Aqua, but it's very simple to create a batch/PowerShell script to execute .sql files.
You can use the SAP/Sybase isql utility to execute files, and just create a loop to cover all the files you wish to execute.
Check my answer here for more information:
Running bulk of SQL Scripts in Sybase through batch
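A minimal PowerShell sketch of that loop (server, credentials, and paths are placeholders; it assumes isql is on the PATH):

# Run every .sql file in the folder through isql, writing each output next to its script
Get-ChildItem -Path 'C:\scripts' -Filter '*.sql' | ForEach-Object {
    isql -S MYSERVER -U myuser -P mypassword -i $_.FullName -o "$($_.FullName).out"
}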
In the latest versions of ADS there is an integrated shell named FluidShell where you can achieve what you are looking for. See an overview here: https://www.aquaclusters.com/app/home/project/public/aquadatastudio/wikibook/Documentation15/page/246/FluidShell
The command you are looking for is source
source
NAME
source - execute commands or SQL statements from a file
SYNOPSIS
source [OPTION...] FILE [ARGUMENT...]
source [OPTION...]
DESCRIPTION
Read and execute commands or SQL statements from FILE in the current shell environment.
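For example (the file name is a placeholder):
source C:\scripts\script01.sql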
I have not used AquaFold before, so I can't tell you exactly. However, I have tackled a similar problem once before.
I once created a PowerShell script. It opened an ODBC connection to my database and then executed stored procedures in a loop until the end of the file was reached.
I suggest having a text document with each line being the name of a stored procedure to run. Then, in your PowerShell script, read a line from the file and concatenate it into the call that executes the stored procedure. After each execution completes, delete that line from the text file and read the next one until the EOF (end of file) is reached.
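A minimal sketch of that loop (the ODBC driver name, server, database, and file path are placeholders; deleting processed lines is omitted for brevity):

# Read stored procedure names from a text file and execute each one over ODBC
$connStr  = 'Driver={ODBC Driver 17 for SQL Server};Server=MYSERVER;Database=MyDb;Trusted_Connection=yes;'
$listFile = 'C:\scripts\procs.txt'

$conn = New-Object System.Data.Odbc.OdbcConnection($connStr)
$conn.Open()
foreach ($procName in Get-Content $listFile) {
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = "EXEC $procName"   # each line in the file is a procedure name
    [void]$cmd.ExecuteNonQuery()
}
$conn.Close()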
Hope this helps. If I have some time this morning I will try and do a working example for you and post it.
I have a .sql file and I am trying to import it into SQL Server 2008. What is the proper way to do this?
If your file is a large file, 50MB+, then I recommend you use sqlcmd, the command line utility that comes bundled with SQL Server. It is easy to use and it handles large files well. I tried it yesterday with a 22GB file using the following command:
sqlcmd -S SERVERNAME\INSTANCE_NAME -i C:\path\mysqlfile.sql -o C:\path\output_file.txt
The command above assumes that your server name is SERVERNAME, that your SQL Server installation uses the instance name INSTANCE_NAME, and that Windows Authentication is the default auth method. After execution, output_file.txt will contain something like the following:
...
(1 rows affected)
Processed 100 total records
(1 rows affected)
Processed 200 total records
(1 rows affected)
Processed 300 total records
...
Use readfileonline.com if you need to see the contents of huge files.
UPDATE
This link provides more command line options and details such as username and password:
https://dba.stackexchange.com/questions/44101/importing-sql-server-database-from-a-sql-file
If you are talking about an actual database (an .mdf file), you would Attach it.
.sql files are typically run using SQL Server Management Studio. They are basically saved SQL statements, so they could be anything. You don't "import" them; more precisely, you "execute" them, even though the script may indeed insert data.
Also, to expand on Jamie F's answer, don't run a SQL file against your database unless you know what it is doing. SQL scripts can be as dangerous as unchecked .exe files.
Start SQL Server Management Studio
Connect to your database
File > Open > File and pick your file
Execute it
Try this process -
Open the Query Analyzer
Start --> Programs --> MS SQL Server --> Query Analyzer
Once it is opened, connect to the database that you wish to run the script on.
Next, open the SQL file using the File --> Open option and select the .sql file.
Once it is open, you can execute the file by pressing F5.
In order to import your .sql file, try the following steps:
Start SQL Server Management Studio
Connect to your Database
Open the Query Editor
Drag and Drop your .sql File into the editor
Execute the import
A .sql file is a set of commands that can be executed against the SQL server.
Sometimes the .sql file will specify the database, other times you may need to specify this.
You should talk to your DBA or whoever is responsible for maintaining your databases. They will probably want to give the file a quick look. .sql files can do a lot of harm, even inadvertently.
See the other answers if you want to plunge ahead.
Get the names of the server and database in SSMS:
Run the following command in PowerShell or CMD:
sqlcmd -S "[SERVER NAME]" -d [DATABASE NAME] -i .\[SCRIPT].sql
There is no such thing as importing in MS SQL Server. I understand what you mean, though; it is simple. Whenever you have a something.sql file, just double-click it and it will open directly in SQL Server Management Studio.