Schedule creation of a stored procedure - sql-server-2012

I have a Test database which is overwritten each week by a fresh new production copy.
But we have changes in our Test environment which I script in manually each Monday morning after the copy is created.
Is there a way to schedule a script to run that can generate my object and data changes, e.g. new stored procedures, etc.?
The Job scheduler in SQL Server can import a SQL script, but it's not dynamic. I need something I can use in the future, where it will read in the script each time before it's run and pick up any changes.

I suggest you create an SSIS package and use SMO inside a Script Task to generate the DDL.
This link may help you a little bit; using SMO is very easy and straightforward:
SMO tutorial
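As a lighter-weight alternative to SSIS: a SQL Agent job step of type CmdExec that runs sqlcmd against the script file re-reads the file on every run, so edits made to it during the week are picked up automatically. A minimal sketch, assuming the changes live in a single file at a hypothetical path:

USE msdb;
GO

EXEC dbo.sp_add_job @job_name = N'Apply Test changes';

-- CmdExec step: sqlcmd reads the .sql file at run time,
-- so the job always executes the current version of it.
-- (-b stops the run at the first failing batch.)
EXEC dbo.sp_add_jobstep
    @job_name  = N'Apply Test changes',
    @step_name = N'Run change script',
    @subsystem = N'CmdExec',
    @command   = N'sqlcmd -S . -d TestDb -b -i "C:\Scripts\test_changes.sql"';

EXEC dbo.sp_add_jobserver @job_name = N'Apply Test changes';

Schedule the job for Monday morning, after the production copy lands.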

Related

SQL Server database: amalgamate 90 database update scripts into a single script

I have an application that has been released for several years. I have updated the SQL Server database around 90 times and each time I have created an update SQL script, so when the user runs the new software version the database is updated.
I would like to amalgamate all the updates into a single SQL script to make deployment faster and simpler. Is there an easy way to do this? I presume I could simply grab the latest database, after it has run through all 90 updates, via SQL Server Management Studio?
EDIT
For clarification: the software I wrote automatically applies new database updates when the user downloads the latest version. This is done via C# / .NET, which looks for embedded SQL scripts on startup in the format XX_update.sql, calling each script one by one, i.e.
1_update.sql - this creates the tables and initial data etc. This was my initial release database.
2_update.sql - updates to the initial database, such as adding a new stored procedure or changing a column's datatype, etc.
3_update.sql
4_update.sql
...
90_update.sql (4 years and lots of program updates later!).
Ideally, I would install my software and create a brand new database, running through all 90 update scripts. Then I would take this database and convert it into a single script which can replace all 90 scripts above.
This is too long for a comment.
There is no "easy" way to do this. It depends on what the individual updates are doing.
There is a process you can follow in the future, though. Whenever an update occurs, you should maintain scripts both for incremental updates and for complete updates. You might also want to periodically introduce major versions, and be able to upgrade to and from those.
In the meantime, you'll need to build the complete update by walking through the individual ones.
I use a similar system at work, and while I prefer to run the scripts separately, I have sometimes amalgamated several scripts when they had to be deployed by another person, with no problem.
In SQL Server, the rule is that as long as you separate the scripts with GO and use SQL Server Management Studio or another tool that processes the batch separator properly, there is no problem in amalgamating them, because the result still looks like separate scripts to SQL Server: instead of being sent to the server as one big script, the tool sends it in batches, using GO as the batch separator.
The only problem is that if you have an error in the middle of the script, the tool will continue sending the rest of the batches to the server (at least by default; there may be an option to change this). That is why, when possible, I prefer to run them separately, just to err on the safe side: it stops if there is a problem and makes it easy to locate the problematic command.
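For illustration, a minimal sketch of an amalgamated script with two batches (the object names are hypothetical); if the first batch fails, SSMS will by default still submit the second:

-- Batch 1: applied on its own; by default a failure here does
-- not stop the tool from sending the next batch.
ALTER TABLE dbo.Customer ADD Email nvarchar(256) NULL;
GO

-- Batch 2: sent to the server as a separate batch.
CREATE PROCEDURE dbo.GetCustomerEmail
    @CustomerId int
AS
    SELECT Email FROM dbo.Customer WHERE CustomerId = @CustomerId;
GO

When the combined file is run through sqlcmd instead of SSMS, the -b switch makes the run stop at the first failing batch.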
Also, for new deployments, your best option is what you say: use a database that is already updated instead of taking the original one and applying all the updates. And to avoid being in this situation again, you can keep an empty template database that is not used for any testing and update it whenever you update the rest of the databases.
Are you running your updates manually on the server? Instead, you can create a .bat file, a PowerShell script, or an exe. The update scripts in your folder can be numbered, and the script can pick up the updates one by one and execute them over the database connection instead of you doing it.
If you have multiple script files and you want to combine them into one,
rename them as .txt, combine them, and rename the resulting file as .sql.
https://www.online-tech-tips.com/free-software-downloads/combine-text-files/
This is easy. Download SQL Server Data Tools and use that to create the deployment script. If there's no existing database, it will create all the objects; if targeting an older version, it will perform a diff against the target database and create a script that creates the missing objects and alters the existing ones.
You can also generate scripts for the current version using SSMS, or use SSMS to take a backup and restore it as part of an install.
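A rough sketch of the backup/restore route (the database name, logical file names, and paths here are hypothetical):

-- Back up the fully updated reference database once...
BACKUP DATABASE MyAppDb
    TO DISK = N'C:\Deploy\MyAppDb_v90.bak'
    WITH INIT;

-- ...then restore it on each new install instead of
-- replaying all 90 update scripts.
RESTORE DATABASE MyAppDb
    FROM DISK = N'C:\Deploy\MyAppDb_v90.bak'
    WITH MOVE N'MyAppDb' TO N'D:\Data\MyAppDb.mdf',
         MOVE N'MyAppDb_log' TO N'D:\Data\MyAppDb_log.ldf';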

Process SSIS Package when DateModified Changes

I have built an SSIS Package for our HR Department.
The Data is manually downloaded from a web portal in .xls format.
The current process is not automated due to the inconsistent frequency of the data drop.
What I would like to do is alter the SSIS package so that it looks at the source data (they just save over the existing file every month) and have the job run as soon as the "Date Modified" has changed.
I am not familiar with C#, so I would like to avoid this option if possible; I am still a little new to all of this. I am hoping that there is a Loop Container option or something.
Additional data: the table that is loaded is truncated before the load. I don't know if this will factor in or not, but I wanted to put it out there.
There are a few approaches to accomplish this. Basically, you need to implement some sort of trigger mechanism to run the package when the date modified has changed on the source data.
Create a Windows service that uses WMI to detect a change in the date modified and launch the package.
You can create a package with an infinite loop and schedule it through an Agent job. But please note that SSIS is resource-intensive, so you have to plan the loop interval accordingly.
Set up a table and create an insert trigger on it that executes the Agent job with the sp_start_job system stored procedure, as sketched below. Alternatively, you could use Service Broker to add a new item to a queue; then the execution of the SSIS package would not be anywhere near as problematic as calling it directly from a trigger.
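A minimal sketch of the trigger route, assuming a log table named dbo.FileDropLog and an Agent job named 'Load HR Data' (both hypothetical):

CREATE TABLE dbo.FileDropLog
(
    FileDropLogId int IDENTITY(1, 1) PRIMARY KEY,
    DroppedAt     datetime NOT NULL DEFAULT GETDATE()
);
GO

CREATE TRIGGER trg_FileDropLog_StartJob
ON dbo.FileDropLog
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- sp_start_job is asynchronous: it only queues the job,
    -- so the insert is not blocked while the package runs.
    EXEC msdb.dbo.sp_start_job @job_name = N'Load HR Data';
END;
GO

Whatever detects the file change (the service, or a watcher script) then just inserts a row into dbo.FileDropLog to kick off the load.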
Some useful articles -
Execute SSIS package when a file is arrived at folder
Trigger SSIS package when files available in a Folder
Trigger SSIS package when files available in a Folder part#2
How to Check IF File Exists In Folder [Script Task]
Trigger SSIS package

Understanding SQL Database Project workflow

The build process for my SQL 2008 Database Project takes upwards of 15 minutes for 1 build. I only need to manage roughly 50 stored procedures. I created a database project and a server project.
The next thing I do is fix all the build errors. Now I modify a stored procedure. Then I have to build the entire database and script the entire database just to see if my stored procedure compiles.
Is there any way to test the stored procedure without going through a 15 minute build - then deploy the script? Can I build just changes instead of the entire DB?
First of all, you should look at why your builds take so long. Maybe you need more memory or something.
Second, why do you have to script the entire database to test the stored procedure? Just deploy to a test database, or even your local sandbox database.
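If all you need is to confirm that the procedure compiles, one quick local check (a sketch; run it against any sandbox copy of the database) is to let SQL Server compile the batch without executing it:

SET NOEXEC ON;   -- compile the batches that follow, but execute nothing
GO
-- paste the CREATE PROCEDURE script here
GO
SET NOEXEC OFF;
GO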
I decided to go with MSSCCI. Its simple UI, which plugs directly into SSMS and behaves like Team Explorer, is exactly what I've been searching for. Getting started.
I've never used Visual Studio to "build" and maintain databases. This, I'd guess, will quickly become unmanageable as the database starts getting bigger. And I'm assuming that when you "build" it, it's verifying and deploying all objects in the database.
I would suggest you not use Visual Studio in this fashion. Just maintain your SQL code independently, and store it in version control manually. That way you can update each stored procedure separately. In other words, keep each stored procedure as a file:
if object_id('<ProcName>', 'P') is not null
    drop proc <ProcName>
GO
create proc <ProcName>
as
    -- proc body
    ...
GO
Then, store that as ProcName.sql and let source control handle the rest. Sorry if this isn't helpful or if I'm just stating the obvious.

can a sql 2005 ssis package be scheduled?

I have a data dump that I manually initiate, and I want to automate things now that they are working well. I have a system that exports data into Excel that I ultimately want to import into a SQL table.
I have an SSIS package that I used for the import and saved for re-use later. I just manually ran it and it works well. Now I would like to have it run when invoked by a file watcher, a schedule, or some such thing, so that all I need to do is overwrite the Excel file and have it trigger the SSIS package to run its import.
Any ideas on how to make this happen?
SQL Server does its scheduling with SQL Agent, so try creating a schedule in that to do what you want.
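A rough T-SQL sketch of doing that (the job name, package path, and schedule are hypothetical; the same can be done through the SSMS UI under SQL Server Agent > Jobs):

USE msdb;
GO

EXEC dbo.sp_add_job @job_name = N'Import Excel Dump';

-- Step of type SSIS that runs the saved package.
EXEC dbo.sp_add_jobstep
    @job_name  = N'Import Excel Dump',
    @step_name = N'Run import package',
    @subsystem = N'SSIS',
    @command   = N'/FILE "C:\Packages\ImportExcel.dtsx"';

-- Run daily at 06:00.
EXEC dbo.sp_add_schedule
    @schedule_name     = N'Daily 6am',
    @freq_type         = 4,      -- daily
    @freq_interval     = 1,
    @active_start_time = 60000;  -- 06:00:00

EXEC dbo.sp_attach_schedule
    @job_name      = N'Import Excel Dump',
    @schedule_name = N'Daily 6am';

EXEC dbo.sp_add_jobserver @job_name = N'Import Excel Dump';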

SQL SERVER Project

My application database has no project and no source control. I planned to make my DB into a project and add it to TFS, but I have no idea how to script the stored procedures, triggers, views, and functions, or what the best practice is for making an update script for all my stored procedures, triggers, views, and functions for my customers' DBs.
The best procedure (IMHO) is to manually maintain a strict version of your schemas. Then, when you need to make changes, you write a delta script to move from one version to the next. I suggest you write the DDL scripts by hand -- you can make them concise and comment them.
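For example, a minimal hand-written delta script might look like this (the SchemaVersion table and the object names are hypothetical):

-- Delta: v12 -> v13
-- Adds the IsArchived flag used by the new reports.
ALTER TABLE dbo.Orders ADD IsArchived bit NOT NULL DEFAULT 0;
GO

UPDATE dbo.SchemaVersion SET Version = 13;
GO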
You can use a tool like Visual Studio Team System for Database Architects. Take a look at Running static code analysis on SQL Server database with Visual Studio Team System for database architects; it will show you how to import the data. Disregard the static code analysis that comes later, as it does not apply to your question.
I've found that a good way to get SQL scripts into SCM from an existing database is to use SSMS's "export all to script" option, or whatever it's called; I can't remember now.
Then, for every subsequent change, you add the change script to your SCM with a different version number in the file name.
Every release (or set cycle, depending on your development/release methodology), you apply all change scripts, then re-script the entire database, tag it, and start again.
The best way to do it is to save the database in TFS as a set of database creation scripts, i.e. the MyTable table should be added to TFS as a MyTable.sql file (CREATE TABLE...), etc. We are using SQL Examiner to do this; see the following article: How to keep your database under version control.
We are working with SVN, and I have never tested SQL Examiner with TFS, but I know that the tool supports TFS.