SSDT - Build Deployment Script without dacpac - sql

I've got a question about building a deployment script using SSDT.
Could anyone tell me if it's possible to build a deployment script using SQLPackage.exe where the source file is NOT a dacpac file, but uses the .sql files instead?
To give some background, I've created a project in Visual Studio 2012 for my database schema. This works great, and SSDT builds the folder structure without a problem (functions, stored procedures, etc., which contain all the .sql files).
Here's the problem - the database in question is from a legacy system and is riddled with errors. Most of these errors we don't care about anymore, and it's not practical or safe to fix them all, so for years we've basically ignored them. However, it means we can't build the project and therefore can't generate the dacpac file. This doesn't prevent us from doing the schema compare and syncing the database with the file system (a local mercurial repository), but it does seemingly prevent us from building a deployment script.
What I'm looking for is a way of building the deployment script using SQLPackage.exe without having to generate the dacpac file. I need to use the .sql files in the file system instead. Visual Studio will produce a script of the differences without building the dacpac, so this makes me think it must be possible to do it using SQLPackage.exe using one of the parameters.
Here's an example of SQLPackage.exe which I'd like to adapt to use the .sql files instead of the dacpac:
sqlpackage.exe /Action:Script /SourceFile:"E:\SourceControl\Project\Database\test_SSDTProject\bin\Debug\test_SSDTProject.dacpac" /TargetConnectionString:"Data Source=local;Initial Catalog=TestDB;User ID=abc;Password=abc" /OutputPath:"C:\temp\hbupdate.sql" /OverwriteFiles:true /p:IgnoreExtendedProperties=True /p:IgnorePermissions=True /p:IgnoreRoleMembership=True /p:DropObjectsNotInSource=True
This works fine because it uses the dacpac file. However I need to point it at the folder structure where the .sql files are instead.
Any help would be great.

As has been suggested in comments, I think that biting the bullet and fixing the errors is the way ahead. You say
it's not practical or safe to fix them all,
but I think you should give this a bit more thought. I have recently been in a similar situation to you, and the key to emerging from it is to realise that the operational risk associated with dropping procedures and functions that will throw an exception as soon as they are called is zero.
Note that this does not apply if the reason these objects won't build is that they contain cross-database or cross-server references that are present in production but not in your project; this is a separate problem altogether, but also a solvable one.
Nor am I in favour of "exclude from build" as an alternative to "delete"; a while ago I saw a project where this technique had been deployed extensively; it makes it harder to see what does what from the source files and I am now of the opinion that "Build Action=None" is simply "commenting out the bits that don't work" for the Snapchat generation.
The key to all of this, of course, is source control. This addresses the residual risk that one day you might indeed want to implement a working version of one of your currently non-working procedures, using the non-working code as a starting point. It also obviates the need to keep stuff hanging around in the solution using Build Action=None, as one can simply summon an earlier revision of the code that contained the offending objects.
If my experience is any guide, 60 build errors is nothing; these could easily be caused by references to three or four objects that no longer exist, and can be consigned to the dustbin of source control with some enthusiastic use of the "Delete" key.

Do you have a copy of SQL Compare at your disposal? If not, it might be worth downloading the trial to see if it will work in your scenario.
Here are the available switches:
http://documentation.red-gate.com/display/SC10/Switches+used+in+the+command+line
At the very least you'll need to specify the following (a combined example command follows the list):
/scripts1:
/server2:
/database2:
/ScriptFile:
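Putting those switches together, a sketch of a complete command might look like this (the sqlcompare.exe name, the paths, and the server name are assumptions based on the question, not a tested command):

sqlcompare.exe /scripts1:"E:\SourceControl\Project\Database" /server2:local /database2:TestDB /ScriptFile:"C:\temp\hbupdate.sql"

Here /scripts1 points at the folder of .sql files as the comparison source, /server2 and /database2 identify the live target database, and /ScriptFile is where the generated deployment script is written.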


Reference to system databases becomes duplicated in SSDT database project

In a huge SSDT database solution with lots of projects and references, I'm adding a reference from my project to the system databases (master, msdb). It works well, and the build is successful.
After some time, though, I start receiving errors about an incorrect reference. I go to the references section and see this: https://pasteboard.co/JqzDSDh.png
I tried removing the second reference and the errors were gone, but then the issue comes back and I see two identical references again.
Thank you!
Most probably something is wrong with your project's .sqlproj file. Search for the master.dacpac keyword there and make sure there are no duplicate entries. Also make sure the dacpac path is not hardcoded, but uses the $(DacPacRootPath) variable.
Here is an example of how the reference should look (make sure you have the right SQL version defined in the path; mine is 140 here):
<ArtifactReference Include="$(DacPacRootPath)\Extensions\Microsoft\SQLDB\Extensions\SqlServer\140\SqlSchemas\master.dacpac">
  <HintPath>$(DacPacRootPath)\Extensions\Microsoft\SQLDB\Extensions\SqlServer\140\SqlSchemas\master.dacpac</HintPath>
  <SuppressMissingDependenciesErrors>False</SuppressMissingDependenciesErrors>
  <DatabaseVariableLiteralValue>master</DatabaseVariableLiteralValue>
</ArtifactReference>
If that doesn't help, try running "Clean Solution", then delete all *.jfm and *.dbmdl files plus the bin and obj folders, and rebuild the project.
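From a command prompt, that cleanup is roughly the following (a sketch; MyDatabaseProject is a placeholder for your project folder, and Visual Studio should be closed first):

cd MyDatabaseProject
del /s *.dbmdl *.jfm
rd /s /q bin obj

Then reopen the solution and rebuild.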

Source control in SSIS and Concurrent work on dtsx file

I am working on building a new SSIS project from scratch. I want to work with a couple of my teammates, so I was hoping to get a suggestion on how we can set up source control so that a few of us can work concurrently on the same SSIS project (same dtsx file, building new packages).
Version:
SQL Server Integration Service v11
Microsoft Visual Studio 2010
It is my experience that there are two opportunities for any source control system and SSIS projects to get out of whack: adding new items to the project and concurrent changes to an existing package.
Adding new items
An SSIS project has the .dtproj extension. Inside there, it's "just" XML defining what all belongs to the project. At least for 2005/2008 and 2012+ on the package deployment model. The 2012+ project deployment model carries a good bit more information about the state of the packages in the project.
When you add new packages (or project-level connection managers or .biml files) the internal structure of the .dtproj file is going to change. Diff tools generally don't handle merging XML well. Or at all, really. So, to prevent the need for merging the project definition, you need to find a strategy that works for your team.
I've seen two approaches work well. The first is to define up front all the packages you think you'll need: DimFoo, DimDate, DimBar, FactBlee. Check that project and the associated empty packages in, and everyone works on what is out there. When the initial cut of packages is complete, you'll ensure everyone is synced up and then add more empty packages to the project. The idea here is that there is one person, usually the lead, who is responsible for changing the "master" project definition, and everyone consumes from their change.
The other approach requires communication between team members. If you discover a package needs to be added, communicate with your mates: "I need to add a new package - has anyone modified the project?" The answer should be No. Once you've notified them that a change to the project definition is coming, make it and immediately commit it. The idea here is that people commit and sync (or check in, whatever the terminology) with great frequency. If you as a developer don't keep your local repository up to date, you're going to be in for a bad time.
Concurrent edits
Don't. Really, that's about it. The general problem with concurrent changes to an SSIS package is that, in addition to the XML diff issue above, SSIS also stores layout data alongside the tasks. I can invert the layout and make things flow from bottom to top or right to left with no material change to the SSIS package, but as Siyual notes, "Merging changes in SSIS is nightmare fuel".
If you find your packages are so large and that developers need to make concurrent edits, I would propose that you are doing too much in there. Decompose your packages into smaller, more tightly focused units of work and then control their execution through a parent package. That would allow a better level of granularity to your development and debugging process in addition to avoiding the concurrent edit issue.
A dtsx file is basically just an XML file. Compare it to a bunch of people trying to write the same book. The solution I suggest is to use Team Foundation Server as source control. That way everyone can check in and out and merge packages. If you really don't have that option, try to split your ETL process into logical parts and at the end create a master package that calls each sub-package in the right order.
An example: let's say you need to import stock data from one source, branches and other company information from an internal server, and sale amounts from different external sources. After you have gathered all the information, you want to connect those and run some analyses.
You first design the target database entities that you need and the relations. One of your members creates a package that does all the import to staging tables. Another guy maybe handles external sources and parallelizes/optimizes the loading. You would build a package that then merges your staging and production tables, maybe historicizing and so on.
At the end you have a master package that calls each of the mentioned packages and maybe some additional logging or such.
In our multi-developer operation, we follow this rough plan:
Each dev has their own branch, separate from master branch
Once a week, devs push all their changes to remote
One of us pulls all changes, and merges all branches into master, manually resolving .dtproj conflicts as we go
Merge master in all dev branches - now all branches agree
Test in VS
Push all branches to remote, other devs can now pull and keep working
It's not a perfect solution, but it helps quarantine the amount of merge pain we have to experience.
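In git terms, that weekly merge session looks roughly like this (a sketch; branch names like dev-alice and dev-bob are placeholders, and the .dtproj conflicts get resolved by hand during the merges):

git fetch origin
git checkout master
git merge origin/dev-alice
git merge origin/dev-bob
git push origin master
git checkout dev-alice
git merge master

The last two steps are repeated on each dev branch before everyone pushes and resumes work.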
We have large SSIS solutions with 20+ packages in one solution, with TFS Git. One project required adding a bunch of new packages to the existing solution. We thought we were smart and knew to assign only one person to work on each new package; 2 people working on the same package would be suicide. That wasn't good enough.
When 2 people tried to add differently named, new packages at the same time, each showed dtproj as a file that had changed/needed to be checked in, and suddenly I found myself looking at the XML for dtproj and trying to figure out which lines to keep (Microsoft should never ask end users to manually edit their internal files, which only they wrote and understand). Billinkc's solutions here are very good, and the problem is very real. You may think that Microsoft is the great Wise One, and that your team can always add new packages to an existing solution without conflicts, but you'd be wrong.
It also doesn't work to put dtproj in .gitignore. If you do that, you won't see other people's new packages (actually the .dtsx file will come down in git, but you won't see that package in Solution Explorer, because dtproj is what feeds Solution Explorer). This is a current problem (2021) and we are using Visual Studio 2017 Enterprise with SSDT.
To explain this problem to people: git can obviously handle a group of independent, individual files in a directory (say, .bat files) and can add, change, and delete those files easily. The problem comes in when you have a file that names, describes, and counts all the files in a directory (which is what dtproj does). With a file like dtproj, you create a conflict on dtproj itself when 2 people try to add a new package at the same time: your dtproj file has a line that shows the package you added, mine shows the package I added, and TFS/git sees that as a conflict.
Some are suggesting ways to deal with this if you have to add a lot of new packages; my idea is a little different. The people who have to add new packages shouldn't work in the primary solution where this problem is; they should work somewhere else. It's probably best to work in the "Projects" directory you get when you install Visual Studio, outside of TFS/Git. Obviously, follow all the standards, variable naming, and Package Configuration conventions for the target solution. Then, when the new packages are ready, give the .dtsx files to your Solution Gatekeeper for them to check in. Only the Gatekeeper can check in new packages using Add From Existing, avoiding conflicts. Once a package is checked in, developers can work on it in the main solution.

Batch rename with MSBuild

I just joined a team that has no CI process in place (not even an overnight build) and some sketchy development practices. There's a desire to change that, so I've now been tasked with creating an overnight build. I've followed along with this series of articles to: create a master solution that contains all our projects (some web apps, a web service, some Windows services, and a couple of tools that compile to command-line executables); create an MSBuild script to automatically build, package, and deploy our products; and create a .cmd file to do it all in one click. Here's a task that I'm trying to accomplish now as part of all this:
The team currently has a practice of keeping the web.config and app.config files outside of source control, and to put into source control files called web.template.config and app.template.config. The intention is that the developer will copy the .template.config file to .config in order to get all of the standard configuration values, and then be able to edit the values in the .config file to whatever he needs for local development/testing. For obvious reasons, I would like to automate the process of renaming the .template.config file to .config. What would be the best way to do this?
Is it possible to do this in the build script itself, without having to stipulate within the script every individual file that needs to be renamed (which would require maintenance to the script any time a new project is added to the solution)? Or might I have to write some batch file that I simply run from the script?
Furthermore, is there a better development solution that I can suggest that will make this entire process unnecessary?
After a lot of reading about Item Groups, Targets, and the Copy task, I've figured out how to do what I need.
<ItemGroup>
  <FilesToCopy Include="..\**\app.template.config">
    <NewFilename>app.config</NewFilename>
  </FilesToCopy>
  <FilesToCopy Include="..\**\web.template.config">
    <NewFilename>web.config</NewFilename>
  </FilesToCopy>
  <FilesToCopy Include="..\Hibernate\hibernate.cfg.template.xml">
    <NewFilename>hibernate.cfg.xml</NewFilename>
  </FilesToCopy>
</ItemGroup>
<Target Name="CopyFiles"
        Inputs="@(FilesToCopy)"
        Outputs="@(FilesToCopy->'%(RootDir)%(Directory)%(NewFilename)')">
  <Message Text="Copying *.template.config files to *.config"/>
  <Copy SourceFiles="@(FilesToCopy)"
        DestinationFiles="@(FilesToCopy->'%(RootDir)%(Directory)%(NewFilename)')"/>
</Target>
I create an item group that contains the files that I want to copy. The ** operator tells it to recurse through the entire directory tree to find every file with the specified name. I then add a piece of metadata to each of those files called "NewFilename". This is what I will be renaming each file to.
This snippet adds every file in the directory structure named app.template.config and specifies that I will be naming the new file app.config:
<FilesToCopy Include="..\**\app.template.config">
  <NewFilename>app.config</NewFilename>
</FilesToCopy>
I then create a target to copy all of the files. This target was initially very simple, only calling the Copy task in order to always copy and overwrite the files. I pass the FilesToCopy item group as the source of the copy operation. I use transforms in order to specify the output filenames, as well as my NewFilename metadata and the well-known item metadata.
The following snippet will e.g. transform the file c:\Project\Subdir\app.template.config to c:\Project\Subdir\app.config and copy the former to the latter:
<Target Name="CopyFiles">
  <Copy SourceFiles="@(FilesToCopy)"
        DestinationFiles="@(FilesToCopy->'%(RootDir)%(Directory)%(NewFilename)')"/>
</Target>
But then I noticed that a developer might not appreciate having his customized web.config file overwritten every time the script is run. However, the developer probably should get his local file overwritten if the repository's web.template.config has been modified and now has new values in it that the code needs. I tried doing this a number of different ways (setting the Copy attribute "SkipUnchangedFiles" to true, using the "Exists()" condition function) to no avail.
The solution to this was building incrementally. This ensures that files will only be over-written if the app.template.config is newer. I pass the names of the files as the target input, and I specify the new file names as the target output:
<Target Name="CopyFiles"
        Inputs="@(FilesToCopy)"
        Outputs="@(FilesToCopy->'%(RootDir)%(Directory)%(NewFilename)')">
...
</Target>
This has the target check to see if the current output is up-to-date with respect to the input. If it isn't, i.e. the particular .template.config file has more recent changes than its corresponding .config file, then it will copy the web.template.config over the existing web.config. Otherwise, it will leave the developer's web.config file alone and unmodified. If none of the specified files needs to be copied, then the target is skipped altogether. Immediately after a clean repository clone, every file will be copied.
The above turned out to be a satisfying solution, as I've only started using MSBuild and I'm surprised by its powerful capabilities. The only thing I don't like about it is that I had to repeat the exact same transform in two places. I hate duplicating any kind of code, but I couldn't figure out how to avoid this. If anyone has a tip, it'd be greatly appreciated. Also, while I think the development practice that necessitates this totally sucks, this does help in mitigating that suck factor.
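One idea I haven't fully tested for removing the duplication: materialize the transformed destination paths into their own item and reference that item in both places. The Copy task pairs SourceFiles and DestinationFiles element by element, and a plain transform preserves item order, so the pairing should still line up:

<ItemGroup>
  <CopiedFiles Include="@(FilesToCopy->'%(RootDir)%(Directory)%(NewFilename)')"/>
</ItemGroup>

<Target Name="CopyFiles" Inputs="@(FilesToCopy)" Outputs="@(CopiedFiles)">
  <Copy SourceFiles="@(FilesToCopy)" DestinationFiles="@(CopiedFiles)"/>
</Target>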
Short answer:
Yes, you can (and should) automate this. You should be able to use the MSBuild Move task to rename files.
Long answer:
It is great that there is a desire to change from a manual process to an automatic one. There are usually very few real reasons not to automate. Your build script will act as living documentation of how build and deployment actually works. In my humble opinion, a good build script is worth a lot more than static documentation (although I am not saying you should not have documentation - they are not mutually exclusive after all). Let's address your questions individually.
What would be the best way to do this?
I don't have a full understanding of what configuration you are storing in those files, but I suspect a lot of that configuration can be shared across the development team.
I would suggest raising the following questions:
Which of the settings are developer-specific?
Is there any way to standardise local developer machines so that settings could be shared?
Is it possible to do this in the build script itself, without having to stipulate within the script every individual file that needs to be renamed?
Yes, have a look at MSBuild Move task. You should be able to use it to rename files.
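A rough sketch of how that could look, reusing the metadata idea from the other answer (untested; the Move task needs MSBuild 4.0+, and note that it relocates the template files rather than copying them):

<ItemGroup>
  <TemplatesToRename Include="..\**\app.template.config">
    <NewFilename>app.config</NewFilename>
  </TemplatesToRename>
  <TemplatesToRename Include="..\**\web.template.config">
    <NewFilename>web.config</NewFilename>
  </TemplatesToRename>
</ItemGroup>

<Target Name="RenameTemplates">
  <Move SourceFiles="@(TemplatesToRename)"
        DestinationFiles="@(TemplatesToRename->'%(RootDir)%(Directory)%(NewFilename)')"/>
</Target>

Because of the recursive wildcard, template files in newly added projects are picked up without editing this fragment.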
...which would require maintenance to the script any time a new project is added to the solution?
This is inevitable - your build scripts must evolve together with your solution. Accept this as a fact and include in your estimates time to make changes to your build scripts.
Furthermore, is there a better development solution that I can suggest that will make this entire process unnecessary?
I am not aware of all the requirements, so it is hard to recommend something very specific. I can suggest this:
Create a shared build script for your solution
Automate manual tasks as much as possible (within reason)
If you are struggling to automate something - it could be an indicator of an area that needs to be rethought/redesigned
Make sure your team mates understand how the build works and are able to make changes to it themselves - don't "own" the build and become a bottleneck
Bear in mind that going from no build script to full automation is not an overnight process. Be patient and first focus on automating areas that are causing the most pain.
If I have misinterpreted any of your questions, please let me know and I will update the answer.

Visual studio 2010 database project and code generation

I'm trying to use the database project in VS2010, but my setup is a bit different from standard and I can't find an easy way to get it to work.
I have a "model" project which contains some XML model definitions of the information supplied to a simple ETL process. As well as the schema for the supplied information, it contains other metadata, for example details of which columns need to be matched up with other tables, what to do in case of a non-match, etc.
Using T4 templates, I then generate sql scripts, views and tables to manage the whole thing - one sql file per xml file. There are around 30 xml definitions, but the number of parameters is small and the pattern very repetitive, so it works well.
I want to dump these sql files into the database project, in order to get it to generate the deploy scripts and identify database changes for me. I can arrange for the files to be combined into one script. Is there a way to get VS to analyse the scripts automatically, or do I need to import them every time?
EDIT: I originally asked about getting VS not to split my scripts up into individual components. I found a solution to this: copy the existing script into the project, and - crucially - change the "build action" for the script to "build" (for some reason, default is "not in build"). VS will then add the item into the model and it will be part of the generated scripts - yay! However, still no way to reference scripts in other projects...
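For reference, in the project file that change shows up as the script moving into the Build item group, roughly like this (a sketch; the file name is a placeholder and exact item names can differ between database project versions):

<ItemGroup>
  <Build Include="Scripts\GeneratedEtlViews.sql" />
</ItemGroup>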
I've read the MS how-to for database projects, but didn't find anything in it that seemed relevant.
Thanks for your help,
You can do this with T4 Toolbox. Here is how: http://www.olegsych.com/2010/03/t4-tutorial-integrating-generated-files-in-visual-studio-projects/. Specifically, you want to take advantage of the Template.Output.File and Template.Output.Project properties.
Oleg

WIX MSBuild automation help - solution best practices

I know there are many questions out there regarding this same information. I have read them all, but my brain is all turned around and I don't know which way to go. Plus the lack of documentation really hurts.
Here is my scenario. We are trying to use WiX to create an installer for our application that goes out to our dealers with our product information. The app includes about 2000 images and documents of our products, plus a SQL CE database, all updated via Microsoft Sync Framework. The data changes so often that keeping these 2000 files as content files in the app's project is very undesirable. The app relies on .NET Framework 3.5 SP1, SQL Server CE 3.5, Microsoft Sync Framework 1.0 and ADO.NET Sync Services 2.0.
Here are the requirements for the app:
The dealers will be given the app on a CD every year for any updates (app or data updates).
The app must update itself from the internet to get any new images, documents or data.
The prerequisites must be installed if they do not exist on the client machine.
The complete installer should be generated from an MSBuild script with as little human interaction as possible (we don't want to be manually updating the 2000+ file list).
What we have accomplished so far: we have a Votive project in our solution, and we have manually specified the binaries in a .wxs file. We have modified the .wixproj file to use the HeatDirectory task to gather our data (images, documents and the database) from a specified location (this is broken and giving an ICE38 error). This seems all right, but is still a lot of work. We also have to manually update our data by running the program in release mode and copying the output to the specified directory.
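For context, the HeatDirectory hookup in the .wixproj looks roughly like this (a simplified sketch with placeholder names and paths, not our exact code; attribute support varies by WiX version):

<Target Name="BeforeBuild">
  <HeatDirectory Directory="..\Data"
                 OutputFile="DataFiles.wxs"
                 ComponentGroupName="DataComponents"
                 DirectoryRefId="INSTALLLOCATION"
                 AutogenerateGuids="true"
                 SuppressRootDirectory="true"
                 ToolPath="$(WixToolPath)" />
</Target>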
I am looking to see what other people would do in this situation.
How would you arrange your solution with regards to the 2000+ data files? Would you create a custom build script that gets the current data from the server or would you include them as content files in the main project?
How would you get WIX to include all of the project output (including the referenced assemblies) and all of the data files? If you have any complete samples, that would be great. All I have found are little clips here and there and not an entire example from start to finish.
How would you deal with the version numbers? Would you put them as a constant in the build script and reference them through the $(var.VersionNumberName)? Would you have the version number automatically picked up from the project being deployed? If so, How?
If there is any better information than what I am finding, please include it. I have read numerous articles, blogs, Stack Overflow questions, the tutorial, the wiki, etc. Everything seems to be in bits and pieces. The tutorial is nice, but doesn't explain anything about MSBuild and Votive. I would like to see a start-to-finish tutorial on using MSBuild and Votive and all the WiX MSBuild targets. If no one knows of a tutorial like this, I may put one together. I have already spent the entire week gathering info and reading. I'm new to MSBuild as well, so if anyone has any great articles on MSBuild, please include them.
The key is to isolate the different types of complexity into separate merge modules and put them all together into an MSI as part of the build. That way things that change often can change without impacting things that hardly change at all.
1) For the data files:
We use Paraffin to generate the WiX and hence the merge modules for an html + Flash based help system consisting of thousands of files (I can't convince the customer to go to CHM).
Compile these into a merge module all by themselves.
2) Assemblies: assuming this is a set that changes less often, just make a merge module by hand or with WixEdit, with the correct files and dependencies.
3) For the version number, there are a lot of ways to manage this depending on your build system. The AssemblyInfoTask is a pretty straightforward way to make sure all your assemblies are versioned appropriately. The MSBuild Extension Pack has some versioning stuff if you are using TFS.
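As a sketch of the $(var.*) mechanism the question mentions (names are placeholders, not a tested setup), you can pass a version from MSBuild into the WiX preprocessor via DefineConstants in the .wixproj:

<PropertyGroup>
  <DefineConstants>ProductVersion=$(BuildVersion)</DefineConstants>
</PropertyGroup>

and then consume it in the .wxs:

<Product Id="*" Name="MyApp" Language="1033" Manufacturer="MyCompany"
         Version="$(var.ProductVersion)" UpgradeCode="PUT-GUID-HERE">

Candle turns each name=value pair in DefineConstants into a preprocessor variable.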
I had a similar scenario and was unable to find a drop-in solution, so I ended up with the following:
I wrote a custom command-line program called wixgen.exe for generating wxs manifest files. It is pretty specific to our implementation, in that it only knows how to create 2 types of wxs files: one for IIS website/virtual directory deployments and another for Windows service deployments.
Each time a build is triggered by our continuous integration server, a post-build task runs wixgen with the right args to generate a new manifest.wxs for the project being changed. It automatically includes all the files needed for the deployment. These builds also version the dlls using a variation of the technique at: http://richardsbraindump.blogspot.com/2007/07/versioning-builds-with-tfs-and-msbuild.html
A separate build, which is manually triggered, is then used to build the wixproj projects containing the generated wxs files and produce the MSIs.
I would ditch the CD delivery (so 90's) and go with ClickOnce. This solution seems to fit well since you already use the .NET Framework. With ClickOnce you should be able to just keep updating the content of your solution and make updates available to your heart's content. Let me know if you need sample ClickOnce deployment code.
You can find more ClickOnce information here.
Similar to dkackman's answer, you should separate your build into several components, isolating build components to be built separately.
I come from a mainly Java background; however, for building MSIs and .NET executables we use Maven, with the 'maven-wix-plugin' plugin for building the installers and the NMaven plugin for compiling any .NET code. However, as we're only performing very basic development in .NET, with most development in Java, we don't need too much complexity from the NMaven plugin (which is probably a 'good thing' (TM) as it's only at version 0.17).
If you're a purely .NET house, you could also look into Byldan (http://www.codeplex.com/byldan), which seems to be the focus of development there at the moment (it's the same team for NMaven and Byldan).
If you do want more information on NMaven or Byldan, raise another question and I'll give as much info as I can (which is not a huge amount; as stated, I only do very limited .NET development).