I need a T-SQL statement that will delete files older than a day in a directory, but only if they match a certain naming convention. There are multiple backup files in this directory, but only one set that we need to keep for a day. I'm unable to use a maintenance cleanup task because it will clear out all .bak files instead of only the ones I want.
Will using dbo.xp_delete_file allow the use of the wildcard character %?
You should be able to do this through a SQL Server Maintenance Plan. Look at the "Maintenance Cleanup Task". You can clean up files using a variety of criteria, including age and filename.
This task uses the EXEC xp_delete_file statement.
This feature doesn't seem well documented at all. From the comments here you can get the general idea of the function:
http://sqlblog.com/blogs/andy_leonard/archive/2009/03/11/xp-delete-file.aspx
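For reference, the commonly reported parameter order is shown in the sketch below; since the procedure is undocumented, treat the signature as an assumption and test it against a throwaway folder first. Note that it filters by folder and extension only; there is no filename-pattern parameter, so a % wildcard on the name is not supported (which answers the original question).

DECLARE @cutoff datetime;
SET @cutoff = DATEADD(day, -1, GETDATE());

EXEC master.sys.xp_delete_file
    0,               -- file type: 0 = backup files, 1 = report files
    N'D:\Backups\',  -- folder to clean (hypothetical path)
    N'bak',          -- extension filter only: no dot, no wildcards
    @cutoff,         -- delete matching files older than this
    0;               -- 0 = this folder only, 1 = include subfolders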
I would suggest you write a clean-up task outside of SQL Server. This can easily be done with a batch script or a PowerShell script. You can execute these scripts with Windows Task Scheduler or, in the case of a PowerShell script, even from within SQL Server itself.
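If you do want the schedule to live inside SQL Server, one option is a SQL Server Agent job whose single step shells out to the Windows forfiles utility. A minimal sketch, where the job name, folder, and MyDb_ filename pattern are all hypothetical (forfiles ships with Windows Server 2008 and later; older servers need the resource kit version, which uses slightly different switches):

USE msdb;
GO
-- One CmdExec step: delete matching .bak files older than one day (/d -1).
EXEC dbo.sp_add_job @job_name = N'Delete old MyDb backups';
EXEC dbo.sp_add_jobstep
    @job_name  = N'Delete old MyDb backups',
    @step_name = N'forfiles cleanup',
    @subsystem = N'CmdExec',
    @command   = N'forfiles /p "D:\Backups" /m "MyDb_*.bak" /d -1 /c "cmd /c del @path"';
-- Run it nightly at 01:00.
EXEC dbo.sp_add_jobschedule
    @job_name = N'Delete old MyDb backups',
    @name     = N'Nightly',
    @freq_type = 4,               -- daily
    @freq_interval = 1,
    @active_start_time = 010000;  -- HHMMSS
EXEC dbo.sp_add_jobserver @job_name = N'Delete old MyDb backups';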
I have an application that has been released for several years. I have updated the SQL Server database around 90 times and each time I have created an update SQL script, so when the user runs the new software version the database is updated.
I would like to amalgamate all the updates into a single SQL script to make deployment faster and simpler. Is there an easy way to do this? I presume I could simply grab the latest database after it has run through all 90 updates via SQL Server Management Studio?
EDIT
For clarification: the software I wrote automatically applies new database updates when the user downloads the latest version. This is done via C# / .NET; on startup it looks for embedded SQL scripts in the format XX_update.sql and calls each script one by one, i.e.
1_update.sql - this creates the tables and initial data etc. This was my initial release database.
2_update.sql - updates to the initial database, such as adding a new SP or changing a column data type, etc.
3_update.sql
4_update.sql
...
90_update.sql (4 years and lots of program updates later!).
Ideally, I would install my software and create a brand new database by running through all 90 update scripts. Then I would take this database and convert it into a single script that can replace all 90 scripts above.
This is too long for a comment.
There is no "easy" way to do this. It depends on what the individual updates are doing.
There is a process you can follow in the future, though. Whenever an update occurs, you should maintain scripts both for incremental updates and for complete updates. You might also want to periodically introduce major versions, and be able to upgrade to and from those.
In the meantime, you'll need to build the complete update by walking through the individual ones.
I use a similar system at work, and while I prefer to run the scripts separately, I have sometimes amalgamated several scripts with no problem when they had to be deployed by another person.
In SQL Server the rule is that as long as you separate the scripts with GO and use SQL Server Management Studio or another tool that processes the batch separator properly, there is no problem in amalgamating them, because the result looks like separate scripts to SQL Server (instead of being sent to SQL Server as one big script, the tool sends it in batches, using GO as the batch separator).
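For example, an amalgamated file containing two hypothetical update scripts behaves exactly as if they had been run one after the other; the GO separators also satisfy rules like CREATE PROCEDURE having to be the first statement in its batch:

-- Contents of 2_update.sql
ALTER TABLE dbo.Customer ADD Email nvarchar(256) NULL;
GO

-- Contents of 3_update.sql
CREATE PROCEDURE dbo.GetCustomer
    @CustomerId int
AS
    SELECT CustomerId, Name, Email
    FROM dbo.Customer
    WHERE CustomerId = @CustomerId;
GO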
The only problem is that if you have an error in the middle of the script, the tool will continue sending the rest of the batches to the server (at least by default; sqlcmd's -b switch, for instance, aborts on the first error). That is why I prefer, when possible, to use a tool to run them separately, just to err on the safe side: it stops if there is a problem and makes the problematic command easy to locate.
Also, for new deployments your best option is, as you say, to use a database that is already updated instead of taking the original one and applying all the updates. And to prevent being in this situation again, you can keep an empty template database that is not used for any testing and update it whenever you update the rest of the databases.
Are you running your updates manually on the server? Instead, you can create a .bat file, PowerShell script, or an exe. The update scripts in your folder can be numbered, and the script can just pick up the updates one by one and execute them over the database connection instead of you doing it.
If you have multiple script files and you want to combine them into one, rename them as .txt, combine them, and rename the resulting file as .sql.
https://www.online-tech-tips.com/free-software-downloads/combine-text-files/
This is easy. Download SQL Server Data Tools and use that to create the deployment script. If there's no existing database, it will create all the objects; if targeting an older version, it will perform a diff against the target database and create scripts that add the missing objects and alter the existing ones.
You can also generate scripts for the current version using SSMS, or use SSMS to take a backup and restore it as part of the install.
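For the backup/restore route, a minimal sketch looks like this (database name, paths, and logical file names are hypothetical):

-- Build the golden copy once, from a database with all 90 updates applied.
BACKUP DATABASE MyAppDb TO DISK = N'C:\Install\MyAppDb.bak' WITH INIT;

-- At install time on the customer's server:
RESTORE DATABASE MyAppDb
FROM DISK = N'C:\Install\MyAppDb.bak'
WITH MOVE N'MyAppDb'     TO N'D:\Data\MyAppDb.mdf',
     MOVE N'MyAppDb_log' TO N'D:\Data\MyAppDb_log.ldf';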
I'm looking for a way to copy stored procedures from one sql database to another on the same instance. It needs to be automatic, preferably through code (t-sql or anything), so using the generate scripts trick is not viable (plus I don't want to maintain that many scripts, and people forget to run them).
I've searched a bit on this and have not found a workable solution. Someone suggested a clever trick with generating all the stored procedure text into a sql field and then converting that and executing it on the destination db but unfortunately that had issues.
Has anyone got any other ideas on how this can be done, if it's at all possible?
If I can't do it programmatically, would there be a quick solution using ssis?
Thanks.
Edit: Using a mixture of SQL 2005 and 2008 versions.
You can do it programmatically in .NET using the SMO framework.
A free/easy implementation could be done via PowerShell.
I have a script that does just this - it generates the scripts for SQL objects, including Stored Procs, and executes the creation scripts on the target server.
It's a handy workaround when security concerns don't allow linked servers but you need to keep certain resources in sync across multiple servers.
If all you care about are the procs, it should be fairly straightforward to check sys.sql_modules on your source and target DBs and, for any that don't exist in the target, execute the definition column from that view.
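A minimal sketch of that idea, assuming hypothetical SourceDb and TargetDb databases on the same instance (it matches on procedure name only; a real version should compare schema names as well):

DECLARE @def nvarchar(max);

-- Procedures that exist in the source but not in the target.
DECLARE proc_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT m.definition
    FROM SourceDb.sys.sql_modules AS m
    JOIN SourceDb.sys.objects AS o ON o.object_id = m.object_id
    WHERE o.type = 'P'
      AND NOT EXISTS (SELECT 1
                      FROM TargetDb.sys.objects AS t
                      WHERE t.name = o.name AND t.type = 'P');

OPEN proc_cur;
FETCH NEXT FROM proc_cur INTO @def;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Calling the target database's sp_executesql runs the CREATE
    -- PROCEDURE batch in that database's context.
    EXEC TargetDb.sys.sp_executesql @def;
    FETCH NEXT FROM proc_cur INTO @def;
END
CLOSE proc_cur;
DEALLOCATE proc_cur;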
My application database has no project and is not in SourceSafe. I planned to turn my DB into a project and add it to TFS, but I have no idea how to script the stored procedures, triggers, views, and functions, or what the best practice is for making an update script for all my stored procedures, triggers, views, and functions for my customers' DBs.
The best procedure (IMHO) is to manually maintain a strict version of your schemas. Then, when you need to make changes, you write a delta script to move from one version to the next. I suggest you write the DDL scripts by hand -- you can make them concise and comment them.
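For instance, a hypothetical delta script that moves the schema from version 41 to version 42, with a one-row version table recording what has been applied:

-- 42_update.sql: upgrades schema version 41 -> 42
-- (table and column names are made up for illustration)
ALTER TABLE dbo.Orders ADD ShippedDate datetime NULL;
GO

UPDATE dbo.SchemaVersion SET Version = 42;  -- record the upgrade
GO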
You can use a tool like Visual Studio Team System for Database Architects. Take a look at Running static code analysis on SQL Server database with Visual Studio Team System for database architects; it will show you how to import the data. Disregard the static code analysis that comes later, as it does not apply to your question.
I've found a good way to get SQL scripts into SCM from an existing database is to use SSMS's "export all to script" option, or whatever it's called; I can't remember now.
Then, for every subsequent change, you add the change script to your SCM with a new version number in the file name.
Every release (or set cycle, depending on your development/release methodology) you apply all change scripts, then re-script the entire database, tag it, and start again.
The best way to do it is to save the database in TFS as a set of database creation scripts, i.e. the MyTable table should be added to TFS as a MyTable.sql file (CREATE TABLE...), etc. We are using SQL Examiner to do this -- see the following article: How to keep your database under version control
We are working with SVN, and I have never tested SQL Examiner with TFS, but I know that the tool supports TFS.
I am using the Database Publishing Wizard to generate a large create script containing both data and schema. The file it produces is ginormous, so opening the script to fix any syntax errors is next to impossible, and machines with less than 4 GB of RAM have a hard time getting it to run! What should I do and how should I do it? Thanks everyone for any help you can provide.
With the Database Publishing Wizard, you can have all the objects created as separate files instead of one big one. Then you can put all the files in source control and track all changes.
My current project uses a script to recreate a database for development. The script deletes all existing objects and then re-adds them, using the following statement for each object file.
REM -S: target server (passed as the first argument to the .bat file);
REM -E: Windows authentication; -b: abort on SQL errors; -i: input script
sqlcmd -S %1 -d THRIVEHQ -E -b -i "../Tables/Schema.sql"
if %ERRORLEVEL% NEQ 0 goto errors
Just want to add to Kevin's comment: break the scripts into separate files, THEN write a script to run all the files in order of execution.
Dumping a large database that has a lot of inter-dependencies into one large file won't do you much good, as in most cases the script won't execute without errors. In my world I use a naming convention that helps me quickly see which views, in this case, are dependent on other views. For example, if I have a view that just produces a dump of data, I'd use something like v_VIEW_NAME_ORIGINATING-TABLE_Dump; then I'd change the suffix to something like _weekly or _weekly_Summary for views that are derived from the main dump table.
I learned my lesson many years ago and have since followed this naming scheme in all my databases.
Try DBSourceTools.
http://dbsourcetools.codeplex.com
It will script your entire database to disk, one file per database object.
Using Deployment Targets, you can then re-create any database from file.
It's specifically designed to help developers get their databases under source code control.
I would do it in steps.
Generate all of your tables and views as 1 script.
Generate all of your stored procedures and grants as 1 script.
Use DTS or SSIS to migrate your data.
All of this can be achieved with MS SQL Server Management Studio.
Off the top of my head, I would say write a script to separate the file into multiple ones, with a break occurring after each "GO" statement. You could then write another script to execute each broken-out file in order.
I have a SQL job (in SQL Server 2005) that creates a backup every six (6) hours. The backup's filename is based on the timestamp so that it creates a unique filename (dbname_yyyymmddhhmmss.bak). Now my question is: using xp_cmdshell, how would I know if a file is three days old? Based on my script, I want to delete any backup (.bak) that is three days old. Can someone out there help me? Thanks in advance. Cheers!
I agree that xp_cmdshell is not the best alternative for the job. If you're like me and you don't like/trust maintenance plans, you can probably write a C# console application, where file system support is much stronger than what you can do in DOS (or using T-SQL to parse the output of xp_cmdshell 'DIR ...'), and then schedule that in a windows scheduled task so that you don't have to worry about escalation of privileges from the SQL Server service/proxy account. While it's nice to put everything in one package, you don't always want the guy who changes your oil to make you a quiche.
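That said, if xp_cmdshell is already enabled and you want to stay in T-SQL, the sortable timestamp embedded in the filename means you never have to ask the file system for dates. A rough sketch, where the folder and the dbname_ prefix are hypothetical and the fixed dbname_yyyymmddhhmmss.bak convention is assumed:

-- Capture a bare directory listing of the backup files.
DECLARE @files TABLE (name nvarchar(260));
INSERT @files EXEC master.dbo.xp_cmdshell 'dir /b "D:\Backups\dbname_*.bak"';

-- Anything stamped earlier than three days ago should go.
DECLARE @cutoff char(14);
SET @cutoff = CONVERT(char(8), DATEADD(day, -3, GETDATE()), 112) + '000000';

DECLARE @name nvarchar(260), @cmd nvarchar(400);
DECLARE file_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM @files
    WHERE name LIKE 'dbname[_]%.bak'
      AND SUBSTRING(name, 8, 14) < @cutoff;  -- the stamp after 'dbname_'

OPEN file_cur;
FETCH NEXT FROM file_cur INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'del "D:\Backups\' + @name + '"';
    EXEC master.dbo.xp_cmdshell @cmd, no_output;
    FETCH NEXT FROM file_cur INTO @name;
END
CLOSE file_cur;
DEALLOCATE file_cur;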
This is not really the answer to your questions, but you could do this directly in SQL Server 2005 with a Maintenance Plan (Object Explorer -> Management -> Maintenance Plans).
I usually create one Maintenance Plan including two tasks: One "Maintenance Cleanup Task" which deletes old backups after x days, followed by a "Back Up Database Task".
This is not really a task that is suited to xp_cmdshell. Enabling this feature within SQL Server also has security implications.
What you are looking to achieve would be much better suited to SQL Server Integration Services (SSIS). There are components available that can be used to manage and perform your backups, as well as File System Task components that can be used to move and delete data.
You could use a combination of a File System Task, variables and expressions in order to retrieve the backup filename, extract the date component and determine how old the file is. You can then take appropriate action on the file.
I hope this helps but please feel free to pose further questions if you require additional information.
Cheers, John
You could write a .NET assembly and call it from within SQL Server. It would be fairly easy to write it so that a table-valued function returns all of the files in a certain directory, with filename and file datestamp.
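For example, if that assembly exposed a hypothetical CLR table-valued function dbo.ListFiles returning each file's name and modified date, the age check becomes plain T-SQL:

-- dbo.ListFiles is hypothetical: a CLR table-valued function you would
-- write yourself, returning (name, modified_date) per file in the folder.
SELECT name
FROM dbo.ListFiles(N'D:\Backups')
WHERE name LIKE 'dbname[_]%.bak'
  AND modified_date < DATEADD(day, -3, GETDATE());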