Adding SQL Scripts to TFS Build Process

Our team currently updates our production databases by deploying dacpacs/bacpacs, since we do not yet have any client data. That will not be the case in the near future, so I'm looking to change the process so that the database is only modified via SQL scripts run through build automation.
Is managing these SQL scripts in Team Foundation Server and executing them in the build doable? And if so, how have people approached this?
Thanks!

You should probably not be deploying DACPACs during the build process; it's more appropriate to push changes as part of a release/deployment workflow that occurs after a build.
That said, you can publish a DACPAC without dropping the database and recreating it; there are numerous options for controlling the database publishing behavior that can be set at the project level and overridden at the command line if necessary.
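To make "publish without dropping" concrete, here is a minimal sketch using the DacFx .NET API (the Microsoft.SqlServer.DacFx NuGet package), which is what sqlpackage and the project-level publish settings drive underneath. The server name, dacpac path, and database name are placeholders:

    using Microsoft.SqlServer.Dac;

    var options = new DacDeployOptions
    {
        BlockOnPossibleDataLoss = true,   // fail the publish rather than silently lose data
        DropObjectsNotInSource = false    // keep objects that exist only in the target database
    };

    var services = new DacServices("Server=myserver;Integrated Security=true;");
    using var package = DacPackage.Load(@"artifacts\MyDb.dacpac");

    // upgradeExisting: true applies an incremental diff instead of recreating the database
    services.Deploy(package, "MyDb", upgradeExisting: true, options: options);

The same two options correspond to /p:BlockOnPossibleDataLoss and /p:DropObjectsNotInSource on the sqlpackage command line.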

There are two main ways of managing SQL database changes when deploying to production.
DACPAC
Sequential SQL Scripts
Both have their own issues and benefits when deploying. If you control your entire DevOps pipeline, then DACPAC is a good option. If you have to hand changes to corporate DBAs, then SQL scripts can also work.
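To make the sequential-scripts option concrete, here is a minimal C# runner sketch: it applies *.sql files from a source-controlled folder in name order and records each one in a bookkeeping table so re-runs skip scripts that were already applied. The table name, folder, and argument conventions are assumptions for illustration, and note that a plain SqlCommand does not understand GO batch separators, which dedicated runners handle for you:

    using System;
    using System.IO;
    using System.Linq;
    using Microsoft.Data.SqlClient;

    class ScriptRunner
    {
        static void Main(string[] args)
        {
            var connectionString = args[0];   // supplied by the build task
            var scriptFolder = args[1];       // folder of numbered .sql files under source control

            using var conn = new SqlConnection(connectionString);
            conn.Open();

            // bookkeeping table (name is an assumption for this sketch)
            Exec(conn, @"if object_id('dbo.AppliedScripts') is null
                         create table dbo.AppliedScripts
                           (Name nvarchar(260) primary key, AppliedAt datetime2 not null)");

            foreach (var path in Directory.GetFiles(scriptFolder, "*.sql").OrderBy(p => p))
            {
                var name = Path.GetFileName(path);
                using (var check = new SqlCommand(
                    "select count(*) from dbo.AppliedScripts where Name = @n", conn))
                {
                    check.Parameters.AddWithValue("@n", name);
                    if ((int)check.ExecuteScalar() > 0) continue;   // already applied
                }

                Exec(conn, File.ReadAllText(path));   // note: does not split GO separators

                using var record = new SqlCommand(
                    "insert dbo.AppliedScripts values (@n, sysutcdatetime())", conn);
                record.Parameters.AddWithValue("@n", name);
                record.ExecuteNonQuery();
                Console.WriteLine($"applied {name}");
            }
        }

        static void Exec(SqlConnection conn, string sql)
        {
            using var cmd = new SqlCommand(sql, conn);
            cmd.ExecuteNonQuery();
        }
    }

Libraries such as DbUp (mentioned further down this page) implement exactly this pattern, plus transactions and GO handling.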

Related

How to handle environment specific security permissions when automating Azure SQL database deployment?

We are trying to integrate Azure SQL database deployment into our CI/CD process.
We have three environments: development, staging, and production.
The CI pipeline produces a DACPAC file.
The CD pipeline deploys the DACPAC into the next environment. However, it's failing because of conflicts between the roles and permissions assigned to the databases in each environment.
What's the best way to handle this situation?
I think you should use an exclude parameter while generating your dacpac or when publishing.
While extracting, add this parameter (see the SqlPackage Extract docs):
sqlpackage /Action:Extract /p:IgnorePermissions=True
While publishing, use this (see the SqlPackage Publish docs):
sqlpackage /Action:Publish /p:ExcludeObjectTypes="Users;Permissions"
Personally, I would choose the second option (excluding at publish time), since it gives more control over which objects you exclude (in the future you might want to exclude something else, such as RoleMembership or Logins).
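If you drive the deployment from .NET instead of the sqlpackage CLI, the same exclusion exists on DacDeployOptions in the DacFx API (Microsoft.SqlServer.DacFx NuGet package). A sketch, with placeholder server, path, and database names:

    using Microsoft.SqlServer.Dac;

    var options = new DacDeployOptions
    {
        // equivalent of /p:ExcludeObjectTypes="Users;Permissions"
        ExcludeObjectTypes = new[] { ObjectType.Users, ObjectType.Permissions }
    };

    var services = new DacServices("Server=myserver;Integrated Security=true;");
    using var package = DacPackage.Load(@"artifacts\MyDb.dacpac");
    services.Deploy(package, "MyDb", upgradeExisting: true, options: options);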

Create an oracle pl/sql package using a package on a different remote database

Is it possible to create a package or replace an existing package in a local database using a package from a different database without having to export it from the remote database?
Basically I have two environments/servers (DEV and QA).
The developers that work on the packages use the development environment, and I would like to update the same packages in the QA environment using the packages in DEV (ignore any possible issues for now, e.g. compilation failures).
Is it possible to frequently update the package in QA using the package in DEV as the source (instead of compiling from a .sql file)? Maybe a database link?
Yes, it's possible. You could create a process on your target system that uses the DBMS_METADATA package on the remote system to fetch the DDL for the desired package spec and body, and then uses dynamic SQL on the local system to compile the fetched code.
Alternatively, you could use a tool such as Oracle's SQL Developer to migrate the code, using either the database diff functionality to detect differences and prepare the appropriate DDL scripts, or the Cart functionality to pick and choose what gets migrated. However, I'm not sure how well the SQL Developer method can be automated.
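The first approach above goes through a database link; as a sketch of the same idea without a link, here is a small C# utility (using the Oracle.ManagedDataAccess driver) that fetches the spec and body DDL from DEV via DBMS_METADATA and compiles them in QA over two direct connections. Connection strings, owner, and package name are placeholders; DBMS_METADATA emits CREATE OR REPLACE, so the QA copy is simply overwritten:

    using System;
    using Oracle.ManagedDataAccess.Client;

    class PackageSync
    {
        static string FetchDdl(OracleConnection conn, string type, string name, string owner)
        {
            using var cmd = conn.CreateCommand();
            cmd.BindByName = true;
            cmd.CommandText = "select dbms_metadata.get_ddl(:t, :n, :o) from dual";
            cmd.Parameters.Add("t", type);
            cmd.Parameters.Add("n", name);
            cmd.Parameters.Add("o", owner);
            using var rdr = cmd.ExecuteReader();
            rdr.Read();
            return rdr.GetOracleClob(0).Value;   // full CREATE OR REPLACE statement
        }

        static void Main()
        {
            using var dev = new OracleConnection("User Id=app;Password=***;Data Source=DEV");
            using var qa  = new OracleConnection("User Id=app;Password=***;Data Source=QA");
            dev.Open();
            qa.Open();

            // spec first, then body, so the body compiles against the new spec
            foreach (var type in new[] { "PACKAGE_SPEC", "PACKAGE_BODY" })
            {
                var ddl = FetchDdl(dev, type, "MY_PKG", "APP_OWNER");
                using var compile = qa.CreateCommand();
                compile.CommandText = ddl;
                compile.ExecuteNonQuery();
            }
        }
    }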

proper application version update that includes database and code update

I have an application written in Yii that needs a version update from time to time. Currently, when we release a new update, we manually run a shell script that copies/overwrites the application source files from our git repo and sets the appropriate permissions, then at the end the script runs a Yii command to apply our database updates. We have versioning on our database updates, and we roll back the database changes if one of the SQL statements in a version fails. The issue occurs when a database update fails but the application code has already been updated: the application then breaks when it tries to access table fields, tables, or views that don't exist.
How do I best handle an application update with versioning? Much like the way WordPress handles its updates, or better.
I would like to ask for suggestions on the right approach; it may involve RPM, Git, or other tooling.
It would be good to have a detailed list of the process from you guys.
Thanks.
Database updates may include backups and running multiple scripts, and should be handled outside of RPM packaging. There are too many failure modes for RPM scripting to handle flawlessly.
You can always package the database schema update script in the package and then check for the correct schema version when your application starts, providing instructions (or a pointer to instructions) on how to upgrade the database, and how to reinstall the last-known-good application, when the wrong schema version is detected.
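The question is about a PHP/Yii application, but that startup check is the same idea in any stack. A sketch in C#, assuming a hypothetical one-row schema_version table that every database update script bumps:

    using System;
    using Microsoft.Data.SqlClient;

    class StartupCheck
    {
        const int ExpectedSchemaVersion = 42;   // baked into this build of the application

        static void Main()
        {
            using var conn = new SqlConnection("Server=.;Database=MyApp;Integrated Security=true;");
            conn.Open();
            using var cmd = new SqlCommand("select max(version) from schema_version", conn);
            var actual = (int)cmd.ExecuteScalar();

            if (actual != ExpectedSchemaVersion)
            {
                // refuse to start against the wrong schema, and tell the operator what to do
                Console.Error.WriteLine(
                    $"Database schema is v{actual} but this build expects v{ExpectedSchemaVersion}. " +
                    "Run the packaged upgrade script, or reinstall the last-known-good version.");
                Environment.Exit(1);
            }
            // ...normal application startup continues here...
        }
    }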

Need a .NET database versioning script runner

I'm looking at versioning databases and came across the usual articles on how to do this (Coding Horror, Ode to Code, etc.). It all makes perfect sense to me; however, I'm trying to find a script runner that will run the SQL scripts for me. All these articles mention having something to run them automatically, but none of them make any recommendations.
Does anybody know of any utilities for running these scripts? Ideally something that works in the following way:
Runs everything in a transaction so if any single update fails, the whole thing fails
I have control over the name of the schema version database table
Ability to have a series of scripts that are always run if an upgrade takes place
Can be run as part of an automated task
EDIT: it should also be open source.
We use DbUp as the script runner in our web project. It's a simple and nice open-source tool that helps you build your own script runner as a console application.
DbUp is a .NET library that helps you deploy changes to SQL Server databases. It tracks which SQL scripts have been run already, and runs the change scripts that are needed to get your database up to date.
You can run scripts from a folder in the filesystem, or you can embed them in your assembly and run them as embedded scripts.
You can find more information and samples in their repository on GitHub:
http://dbup.github.com
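For reference, core DbUp usage is only a few lines. This sketch (the connection string and journal-table name are placeholders) also shows the builder options that cover the requirements above: your own name for the version table, and one transaction for the whole upgrade:

    using System;
    using System.Reflection;
    using DbUp;

    class Program
    {
        static int Main()
        {
            var upgrader = DeployChanges.To
                .SqlDatabase("Server=.;Database=MyApp;Integrated Security=true;")
                .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                // or .WithScriptsFromFileSystem(@"path\to\scripts")
                .JournalToSqlTable("dbo", "SchemaVersions")   // your own name for the version table
                .WithTransaction()                            // one transaction for the whole upgrade
                .LogToConsole()
                .Build();

            var result = upgrader.PerformUpgrade();
            if (!result.Successful)
            {
                Console.Error.WriteLine(result.Error);
                return 1;   // non-zero exit fails the automated build/deploy task
            }
            return 0;
        }
    }

Hook the console app into the build or release pipeline, and a failed script fails the deployment.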
Check out SSW SQL Deploy - it would appear to do just about all you're asking for. It keeps track of already executed scripts, it'll run a whole batch of scripts at once and on multiple servers (if required), and so forth.
It's a pretty simple, but nifty tool - highly recommended!

How do I set up a build server on the cheap/free?

Currently I'm tasked with doing the daily build. We have an ASP.NET 2005 website with a SQL Server 2005 backend. Our current source control is Visual Source Safe 2005.
At this point, I use the brute-force method of daily builds.
Get Latest version of source code
Get Latest version of Database release script
Backup old website files to a directory
Publish new code to my local machine
Run on my server to keep the test/stage site working
Push newly created files to the website
Run SQL Script on test database (assuming updates, otherwise I don't bother)
Test website on the Test Server.
Looking at the idea of automated builds intrigues me since it means that I do less each morning. How would you recommend I proceed? I want to have a fully fleshed out idea before I present it to my boss.
Ditch VSS, move to Subversion, and check out CruiseControl.NET. Alternatively, if you have an MSDN developer license, you can run TFS Workgroup Edition and set up a build server on any old XP box. It's what we do at our shop.
As Assaf noted, you can use CC.NET with VSS directly. Nice.
TeamCity has worked well for me. It has a very simple setup. Combine it with an MSBuild script for your operations and you're automatic.
For build management I wholeheartedly recommend TeamCity. It doesn't require IIS6 (like CC.NET does) since it runs on its own copy of Tomcat, and the setup is all done through various forms. This is a big deal to me since the build server is just an XP Pro box. It integrates well with SVN, and there is no crazy XML file manipulation like I had to do with CruiseControl.NET. Big win for me.
For a build runner we use NAnt to send emails to various people, copy the packaged builds where they're supposed to go, run NUnit and NCover, and deploy the software to our web farm.
For automated testing we use WatiN.
http://www.nunit.org/index.php
http://www.jetbrains.com/teamcity
http://ncover.sourceforge.net/
http://subversion.tigris.org/
http://nant.sourceforge.net/
http://watin.sourceforge.net/
Try CruiseControl.Net. It's free, and whatever customized daily/continuous routine you want it to perform you can always add with scripts.
Remember, it's not just about daily (nightly) builds, but also about letting you catch build errors in time (since it continuously builds after every source commit/check-in). You don't necessarily test every code change on every possible platform and build configuration, but CC can do exactly that for you (in the background).
http://confluence.public.thoughtworks.org/display/CCNET/Visual+Source+Safe+Source+Control+Block
All of what you are doing can be performed by a set of batch files, depending on how automated your test environment is. The main batch file can be started as a 'scheduled task' at midnight or whatever. That's how we 'do it cheap' here and at other places I've worked. If you need help with a particular batch, I can provide a sample.
I second (or third) the recommendation for Subversion/CruiseControl.NET. Also, if it is appropriate, check out hosted services for SVN like CVSDude. You'll probably become well versed with MSBuild in the process too. Once you get it set up, it is great.
The cost doesn't come from licensing of the tools or even hardware necessarily, but from your time building and maintaining the system - and depending on what you are doing, that could become significant.
Start with the basics and incrementally improve it over time. Like anything else, if you try to come out of the gate with lots of automation and functionality you could find yourself mired in it fulltime for weeks.
Whatever tools you use, house them in a virtual machine (i.e., VMware).
When the equipment inevitably goes south, you can copy the image onto any other machine and not miss a beat just because your build server decided to take the day off, assuming, of course, that you back it up.