DB Script generation using fluentmigrator - msbuild

I'm using FluentMigrator and I'm stuck with a problem. I need to create a DB script using FluentMigrator each time I run the build script, and that works, but I want to rewrite the script only if the database has been altered. How can I achieve that? My current code is given below:
<Target Name="Migrate" >
<MakeDir Directories="$(OutputFolder)\DBScripts"></MakeDir>
<Migrate Database="sqlserver2008"
Connection="Data Source=ALen-PC;Initial Catalog=TestMigrator;User ID=user;Password=password"
Target="$(OutputFolder)\Release\bin\MigratorTest.dll"
Output="True"
OutputFilename="$(OutputFolder)\DBScripts\DBScript.sql">
</Migrate>
</Target>

At the moment, there is no support for this in FluentMigrator. As a workaround, you could add a timestamp to the output filename and then check the size of the file: if it is very small, say less than 200 bytes, the schema has not changed, so throw it away; if it is larger than that, the schema has changed, so rename the file to DBScript.sql, replacing the previous version.
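A minimal MSBuild sketch of that idea (assuming MSBuild 4.0 property functions, and assuming the Migrate task is first pointed at a temporary file such as DBScript.new.sql; the 200-byte threshold is the rough figure above):
<Target Name="KeepScriptIfChanged" AfterTargets="Migrate">
  <PropertyGroup>
    <!-- Rough size check: the string length of the file contents approximates
         its byte count for an ASCII script -->
    <ScriptSize>$([System.IO.File]::ReadAllText('$(OutputFolder)\DBScripts\DBScript.new.sql').Length)</ScriptSize>
  </PropertyGroup>
  <!-- Replace the previous DBScript.sql only when the new script has real content -->
  <Copy SourceFiles="$(OutputFolder)\DBScripts\DBScript.new.sql"
        DestinationFiles="$(OutputFolder)\DBScripts\DBScript.sql"
        Condition="'$(ScriptSize)' &gt; '200'" />
  <Delete Files="$(OutputFolder)\DBScripts\DBScript.new.sql" />
</Target>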
I would recommend submitting this as a feature request for FluentMigrator here.

Related

Fixed filename in Nlog incorporating date and time

I want to be able to set the target's filename to a fixed name that includes the process's start date and time, even if the configuration file is reloaded due to autoReload="true" on the <nlog> element.
I have tried the following, but it recreates the log file whenever the config file is altered.
<target
name="file"
xsi:type="File"
fileName="C:\Logs\${cached:cached=true:inner=${date:format=yyyyMMdd.HHmmss}}.${processid}.txt"
layout="${time} ${level}: ${message}">
</target>
Is there a way that I can get a time and date that is essentially fixed for the duration of the process no matter how many times the config file is updated and incorporate it into the filename?
You could pass a custom parameter to NLog, containing the process start date and time.
GlobalDiagnosticsContext.Set("ProcessStart", yourDateVariable);
logger.Debug("Hello world!");
And then use that as part of the filename.
${gdc:ProcessStart}
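For example, set the value once at application startup and use it in the target from the question (a sketch; the "ProcessStart" key and the date format are arbitrary choices):
GlobalDiagnosticsContext.Set("ProcessStart", DateTime.Now.ToString("yyyyMMdd.HHmmss"));
<target
name="file"
xsi:type="File"
fileName="C:\Logs\${gdc:item=ProcessStart}.${processid}.txt"
layout="${time} ${level}: ${message}">
</target>
Because the value lives in the process rather than in the configuration file, an autoReload of the config won't change it.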

Can MSBuild ItemGroups be chunked?

I've got an ItemGroup that includes source files from my project:
<ItemGroup>
<SourceFiles Include=".\**\*.h;.\**\*.cpp"/>
</ItemGroup>
There are a few hundred source files. I want to pass them to a command line tool in an Exec task.
If I call the command line tool individually for each file:
<Exec Command="tool.exe %(SourceFiles.FullPath)" WorkingDirectory="."/>
Then, it runs very slowly.
If I call the command line tool and pass all of the files in one go:
<Exec Command="tool.exe #(SourceFiles -> '"%(FullPath)"', ' ')" WorkingDirectory="."/>
Then, I get an error if there are too many files (I'm guessing the command line length exceeds some maximum).
Is there a way I can chunk the items so that the tool can be called a number of times, each time passing up to a maximum number of source file names to the tool?
I'm not aware of any mechanism to do that with well-known item metadata. What you could do is load all those paths into their own item group and write a custom task that calls the tool in batches. Writing a custom task is pretty simple; it can be done inline:
http://msdn.microsoft.com/en-us/library/vstudio/dd722601(v=vs.100).aspx
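As a rough sketch of the inline approach (assuming MSBuild 4.0's CodeTaskFactory; the ExecChunked name, its parameters, and the chunk size of 50 are made up for illustration):
<UsingTask TaskName="ExecChunked" TaskFactory="CodeTaskFactory"
           AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
  <ParameterGroup>
    <Files ParameterType="Microsoft.Build.Framework.ITaskItem[]" Required="true" />
    <ChunkSize ParameterType="System.Int32" Required="true" />
    <ToolPath ParameterType="System.String" Required="true" />
  </ParameterGroup>
  <Task>
    <Reference Include="System.Core" />
    <Using Namespace="System.Linq" />
    <Code Type="Fragment" Language="cs"><![CDATA[
      // Run the tool once per chunk of files instead of once per file
      for (int i = 0; i < Files.Length; i += ChunkSize)
      {
          var args = string.Join(" ",
              Files.Skip(i).Take(ChunkSize)
                   .Select(f => "\"" + f.GetMetadata("FullPath") + "\""));
          var p = System.Diagnostics.Process.Start(ToolPath, args);
          p.WaitForExit();
      }
    ]]></Code>
  </Task>
</UsingTask>
<Target Name="RunTool">
  <ExecChunked Files="@(SourceFiles)" ChunkSize="50" ToolPath="tool.exe" />
</Target>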

How to detect failure of a Web Performance Test run from the command line

I've got a Visual Studio 'web performance test' to run from the command line. The plan is to create a scheduled task to run this. How do I trigger an email on failure? Either I wire that logic up in the test itself, or it's external and dependent on the return code, but I don't think there is a return value; failure is shown in the output text or by checking the saved results file.
You can use the /resultsfile:[ file name ] option with mstest.exe to create a ".trx" file. Its content is XML and includes a section similar to:
<ResultSummary outcome="Completed">
<Counters total="1" executed="1" passed="1" error="0" failed="0"
timeout="0" aborted="0" inconclusive="0" passedButRunAborted="0"
notRunnable="0" notExecuted="0" disconnected="0" warning="0"
completed="0" inProgress="0" pending="0" />
</ResultSummary>
(Extra white space added for clarity).
It should be a simple matter to examine the TRX file after the run and send an email if anything failed.
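If you want to stay inside MSBuild for that check, something like XmlPeek could pull out the failed count (a sketch; verify the TeamTest namespace URI against your TRX file, as it varies by Visual Studio version):
<Target Name="CheckResults">
  <XmlPeek XmlInputPath="results.trx"
           Namespaces="&lt;Namespace Prefix='t' Uri='http://microsoft.com/schemas/VisualStudio/TeamTest/2010' /&gt;"
           Query="//t:ResultSummary/t:Counters/@failed">
    <Output TaskParameter="Result" ItemName="FailedCount" />
  </XmlPeek>
  <!-- A failing Error task gives MSBuild a non-zero exit code, which the
       scheduled task or a wrapper script can translate into an email -->
  <Error Condition="'@(FailedCount)' != '0'" Text="Web performance test failed." />
</Target>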

Conditional logic in PostDeployment.sql script using SQLCMD

I am using a SQL 2008 database project (in Visual Studio) to manage the schema and initial test data for my project. The database project uses a post-deployment script which includes a number of other scripts using SQLCMD's ":r" syntax.
I would like to be able to conditionally include certain files based on a SQLCMD variable. This will allow me to run the project several times with our nightly build to set up various versions of the database with different configurations of the data (for a multi-tenant system).
I have tried the following:
IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
print 'inserting specific configuration'
:r .\Configuration1\Data.sql
END
ELSE
BEGIN
print 'inserting generic data'
:r .\GenericConfiguration\Data.sql
END
But I get a compilation error:
SQL01260: A fatal parser error occurred: Script.PostDeployment.sql
Has anyone seen this error or managed to configure their postdeployment script to be flexible in this way? Or am I going about this in the wrong way completely?
Thanks,
Rob
P.S. I've also tried changing this around so that the path to the file is a variable, similar to this post. But this gives me an error saying that the path is incorrect.
UPDATE
I've now discovered that the if/else syntax above doesn't work for me because some of my linked scripts require a GO statement. Essentially the :r just imports the scripts inline, so this becomes invalid syntax.
If you need a GO statement in the linked scripts (as I do) then there isn't any easy way around this. I ended up creating several post-deployment scripts and then changing my project to overwrite the main post-deployment script at build time depending on the build configuration. This is now doing what I need, but it seems like there should be an easier way!
For anyone needing the same thing - I found this post useful
So in my project I have the following post deployment files:
Script.PostDeployment.sql (empty file which will be replaced)
Default.Script.PostDeployment.sql (links to scripts needed for standard data config)
Configuration1.Script.PostDeployment.sql (links to scripts needed for a specific data config)
I then added the following to the end of the project file (right click to unload and then right click edit):
<Target Name="BeforeBuild">
<Message Text="Copy files task running for configuration: $(Configuration)" Importance="high" />
<Copy Condition=" '$(Configuration)' == 'Release' " SourceFiles="Scripts\Post-Deployment\Default.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
<Copy Condition=" '$(Configuration)' == 'Debug' " SourceFiles="Scripts\Post-Deployment\Default.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
<Copy Condition=" '$(Configuration)' == 'Configuration1' " SourceFiles="Scripts\Post-Deployment\Configuration1.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
</Target>
Finally, you will need to setup matching build configurations in the solution.
Also, for anyone trying other work arounds, I also tried the following without any luck:
Creating a post-build event to copy the files instead of having to hack the project file XML. I couldn't get this to work because I couldn't form the correct path to the post-deployment script file. This Connect issue describes the problem.
Using variables for the script path to pass to the :r command. But I came across several errors with this approach.
I managed to work around the problem using the noexec method.
So, instead of this:
IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
print 'inserting specific configuration'
:r .\Configuration1\Data.sql
END
I reversed the conditional and set NOEXEC ON to skip over the imported statement(s), like so:
IF ('$(ConfigSetting)' <> 'Configuration1')
SET NOEXEC ON
:r .\Configuration1\Data.sql
SET NOEXEC OFF
Make sure you turn it back off if you want to execute any subsequent statements.
Here's how I am handling conditional deployment within the post deployment process to deploy test data for the Debug but not Release configuration.
First, in solution explorer, open the project properties folder, and right-click to add a new SqlCmd.variables file.
Name the file Debug.sqlcmdvars.
Within the file, add your custom variables, and then add a final variable called $(BuildConfiguration), and set the value to Debug.
Repeat the process to create a Release.sqlcmdvars, setting the $(BuildConfiguration) to Release.
Now, configure your configurations:
Open up the project properties page to the Deploy tab.
On the top dropdown, set the configuration to be Debug.
On the bottom dropdown, (Sql command variables), set the file to Properties\Debug.sqlcmdvars.
Repeat for Release as:
On the top dropdown, set the configuration to be Release.
On the bottom dropdown, (Sql command variables), set the file to Properties\Release.sqlcmdvars.
Now, within your Script.PostDeployment.sql file, you can specify conditional logic such as:
IF 'Debug' = '$(BuildConfiguration)'
BEGIN
PRINT '***** Creating Test Data for Debug configuration *****';
:r .\TestData\TestData.sql
END
In solution explorer, right click on the top level solution and open Configuration Manager. You can specify which configuration is active for your build.
You can also specify the configuration on the MSBUILD.EXE command line.
There you go- now your developer builds have test data, but not your release build!
As Rob worked out, GO statements aren't allowed in the linked SQL scripts, as they would end up nested within the BEGIN/END block.
However, I have a different solution to his - if possible, remove any GO statements from the referenced scripts, and put a single one after the END statement:
IF '$(DeployTestData)' = 'True'
BEGIN
:r .\TestData\Data.sql
END
GO -- moved from Data.sql
Note that I've also created a new variable in my sqlcmdvars file called $(DeployTestData) which allows me to turn on/off test script deployment.
I found a hack from an MSDN blog which worked fairly well. The trick is to write the commands to a temp script file and then execute that script instead. Basically the equivalent of dynamic SQL for SQLCMD.
-- Helper newline variable
:setvar CRLF "CHAR(13) + CHAR(10)"
GO
-- Redirect output to the TempScript.sql file
:OUT $(TEMP)\TempScript.sql
IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
PRINT 'print ''inserting specific configuration'';' + $(CRLF)
PRINT ':r .\Configuration1\Data.sql' + $(CRLF)
END
ELSE
BEGIN
PRINT 'print ''inserting generic data'';' + $(CRLF)
PRINT ':r .\GenericConfiguration\Data.sql' + $(CRLF)
END
GO
-- Change output to stdout
:OUT stdout
-- Now execute the generated script
:r $(TEMP)\TempScript.sql
GO
The TempScript.sql file will then contain either:
print 'inserting specific configuration';
:r .\Configuration1\Data.sql
or
print 'inserting generic data';
:r .\GenericConfiguration\Data.sql
depending on the value of $(ConfigSetting) and there will be no problems with GO statements etc. when it is executed.
I was inspired by Rob Bird's solution. However, I am simply using the Build Events to replace the post deployment scripts based on the selected build configuration.
I have one empty "dummy" post deployment script.
I set up a pre-build event to replace this "dummy" file based on the selected build configuration (see attached picture).
I set up a post-build event to place the "dummy" file back after the build has finished (see attached picture). The reason is that I do not want to generate changes in the change control after the build.
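For reference, the build events are plain command lines along these lines (a sketch; the file names and the $(ProjectDir)/$(ConfigurationName) macros are illustrative and depend on your project type):
REM Pre-build: swap in the post-deployment script for the selected configuration
copy /Y "$(ProjectDir)Scripts\Post-Deployment\$(ConfigurationName).Script.PostDeployment.sql" "$(ProjectDir)Scripts\Post-Deployment\Script.PostDeployment.sql"
REM Post-build: restore the empty dummy script so source control sees no change
copy /Y "$(ProjectDir)Scripts\Post-Deployment\Dummy.Script.PostDeployment.sql" "$(ProjectDir)Scripts\Post-Deployment\Script.PostDeployment.sql"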

MSBUILD Executing only Changed SQL Scripts

I need to construct an MSBuild script that executes .sql scripts which have changed since the last build.
I initially thought that I could copy all scripts from one folder to another using the <Copy> task and its CopiedFiles <Output>, but the Copy task returns all the files it attempted to copy, not just those actually copied.
I am able to get MSBuild to execute SQL scripts via MSBuild.ExtensionPack, but I'm scratching my head on this one.
You can do this with a concept known as incremental building. The idea is that you create a target and then specify its inputs and outputs, which would be files. MSBuild compares the timestamps of the input files to those of the output files. If all outputs were created after all inputs, the target is skipped. If all inputs are newer, the target will be executed for all files. If only a portion are out of date, then only those will be passed to the target. For more info on this, see the section Using Incremental Builds in my article Best Practices For Creating Reliable Builds, Part 2.
Also for more resources on MSBuild I have compiled a list at http://sedotech.com/Resources#MSBuild
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="RunScripts">
<Import Project="$(MSBuildExtensionsPath)\ExtensionPack\MSBuild.ExtensionPack.tasks"/>
<PropertyGroup>
<ConnStr>Server=Example;Database=Example;Trusted_Connection=True</ConnStr>
<BuildFolder>Build\</BuildFolder>
</PropertyGroup>
<ItemGroup>
<Scripts Include="*.sql"/>
</ItemGroup>
<Target Name="RunScripts"
Inputs="#(Scripts)"
Outputs="#(Scripts->'$(BuildFolder)%(Filename)%(Extension)')">
<SqlExecute TaskAction="ExecuteScalar"
Files="#(Scripts)"
ConnectionString="$(ConnStr)"/>
<Copy SourceFiles="#(Scripts)"
DestinationFiles="#(Scripts->'$(BuildFolder)%(Filename)%(Extension)')"/>
</Target>
</Project>
Could it be that you're copying into an empty destination?
SkipUnchangedFiles
If true, skips the copying of files that are unchanged
between the source and destination. The Copy task considers
files to be unchanged if they have the same size and the
same last modified time.
In your case I suspect that all files are considered changed since they don't exist at the destination.
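If the destination already holds the copies from the previous build, something like this should hand you only the changed scripts (SkipUnchangedFiles and CopiedFiles are documented Copy task parameters, though it's worth verifying on your MSBuild version that skipped files really are excluded from CopiedFiles):
<Copy SourceFiles="@(Scripts)"
      DestinationFiles="@(Scripts->'$(BuildFolder)%(Filename)%(Extension)')"
      SkipUnchangedFiles="true">
  <!-- Capture the files the Copy task reports as copied -->
  <Output TaskParameter="CopiedFiles" ItemName="ChangedScripts" />
</Copy>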