SQL46010: Incorrect syntax after using :r in a database project - sql

I have a database project with the following structure
When I try to publish the profile, VS compiles the code first and shows me the following error:
SQL46010: Incorrect syntax near .
I have the SQLCMD option enabled in my VS configuration.
My OneTimeMaster.sql also shows another error after :r; the code looks like this:
:setvar path ".\Sprint 1.11"
:r $(path)\Header.sql
How can I make it run just so I can get the generated script?

I am assuming that your post-deployment or pre-deployment script points to your OneTimeMaster.sql. In that case, if you have SQLCMD mode activated, the pre- and post-deployment scripts will not show any error, but when you build, the other files seem to be interpreted by the compiler as regular SQL without SQLCMD support. I tested your scenario, and the way I was able to generate the script was by changing the Build Action property of OneTimeMaster.sql and all the sub-scripts to None. With that change the generated script contained the merge of your Testing1.sql and Testing2.sql. Hope this helps.
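For reference, a rough sketch of what that change looks like inside the project file (item paths hypothetical; the exact item types can vary by project version):

<ItemGroup>
  <!-- Build Action = None: excluded from compilation, only pulled in via :r at deploy time -->
  <None Include="Scripts\OneTimeMaster.sql" />
  <None Include="Scripts\Sprint 1.11\Header.sql" />
</ItemGroup>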

Related

Getting error while executing DACPAC file (using sqlpackage.exe)

I am getting the below error while executing a DACPAC file using SqlPackage.
The column [dbo].[Temp].[GMTOffset] on table [dbo].[Temp] must be added, but the column has no default value and does not allow NULL values. If the table contains data, the ALTER script will not work. To avoid this issue you must either: add a default value to the column, mark it as allowing NULL values, or enable the generation of smart-defaults as a deployment option.
PowerShell script -
& $using:SqlPackagePath /Action:Publish /tu:$using:DatabaseUsername /tp:$using:DatabasePassword /tsn:$using:ServerInstance /tdn:"$_" /sf:$using:DacpacLocation /p:BlockOnPossibleDataLoss=False
I have set the 'Generate smart defaults, when applicable' option in the publish profile of the DB project and run the PowerShell script after compiling the project; however, I am still getting this error. Any pointers or help would be appreciated.
This error was resolved after specifying the option on the command line as below, as @Peter also mentioned.
& $using:SqlPackagePath /Action:Publish /tu:$using:DatabaseUsername /tp:$using:DatabasePassword /tsn:$using:ServerInstance /tdn:"$_" /sf:$using:DacpacLocation /p:GenerateSmartDefaults=True /p:BlockOnPossibleDataLoss=False

TFS2015 Powershell on Target Machine

I am trying to pass some data to a remote PowerShell script within a TFS2015 build step.
My step calls a remote PowerShell script on a target machine, and I am passing data as script parameters. The following script parameters are what I have defined.
This parameter list works:
-buildVersion $(Build.BuildNumber) -queuedBy $env:USERNAME (but the name is the account running the script)
but I really want the Build.QueuedBy username to be passed, so I have tried:
-queuedBy $(Build.QueuedBy)
...or
-queuedBy $env:BUILD_QUEUEDBY
This does not work. Am I specifying something incorrectly or is there a better way?
I would also like to get some of the Build definition Variables to the remote script as well.
I have displayed the variables available to me with a Command line step running: cmd /k set
In order to get the correct value you need something like this:
$a = Get-Item -Path "Env:BUILD_QUEUEDBY"
$a = $a.Value
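As a hedged sketch of wiring that value through to the remote script (machine name, script path, and parameter name are hypothetical):

# Resolve the build variable on the agent first...
$queuedBy = (Get-Item -Path "Env:BUILD_QUEUEDBY").Value
# ...then hand it to the target machine as an ordinary script parameter
# (DeployScript.ps1 would declare param([string]$queuedBy) to receive it)
Invoke-Command -ComputerName TargetMachine -FilePath .\DeployScript.ps1 -ArgumentList $queuedBy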

How to provide vsdbcmd deploy command line target dbschema sql command variables?

The Visual Studio (2010) GUI provides an option for specifying a second command variable file for the target. However, I can't find this option in the command-line implementation, vsdbcmd.exe.
Running a vsdbcmd deploy from dbschema to dbschema with only the source model's command variables given results in objects that use the variables being treated as changed, which produces an incorrect (improper) update script.
The command I currently use:
vsdbcmd.exe /a:deploy /dd:- /dsp:sql /model:Source.dbschema /targetmodelfile:Target.dbschema /p:SqlCommandVariablesFile=Database.sqlcmdvars /manifest:Database.deploymanifest /DeploymentScriptFile:UpdateScript.sql /p:TargetDatabase="DatabaseName"
What I'm looking for is something like /p:TargetSqlCommandVariablesFile, if such a thing exists ...
The resulting script is the same as running a GUI compare without specifying the sqlcmd vars for the target.
I found what looks like full documentation for VSDBCMD.EXE at this link.
I think you may be looking for something like:
/p:SqlCommandVariablesFile=Filepath
In the end I found no information on how to do what I required. I checked the vsdbcmd libraries with ILSpy for hidden parameters and didn't find any.
I reached my goal by parsing the dbschema files for both target and current, substituting the cmd variable values directly into them, and then doing the compare on the modified dbschemas. This approach no longer allows changing sqlcmd vars in the resulting script (as the values are already baked into the code), but this was deemed an acceptable loss.
Not the most beautiful solution, but so far I have had no issues with it.
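For illustration, a rough PowerShell sketch of that substitution step (file names and variable values hypothetical; the real dbschema format may need more careful handling):

# Bake sqlcmd variable values directly into a .dbschema file
$vars = @{ TargetDatabase = 'MyDb'; DataFilePath = 'C:\Data' }
$schema = [IO.File]::ReadAllText('Target.dbschema')
foreach ($kv in $vars.GetEnumerator()) {
    # Replace each literal $(Name) token with its configured value
    $pattern = [regex]::Escape('$(' + $kv.Key + ')')
    $schema = $schema -replace $pattern, $kv.Value
}
[IO.File]::WriteAllText('Target.Resolved.dbschema', $schema)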

Running Powershell scripts through SQL

I have a script that runs Invoke-Sqlcmd against a SQL Server instance called Server1. Data collected from that is passed along to another script that is fired off against Server2, and the results are inserted back into a table on Server1. On every Invoke-Sqlcmd I have used -Username and -Password with an account that has sa permissions on both systems.
When I run the script from the command shell or from the PowerShell ISE, my data is inserted into the table and everything works fine; when I run it from within SQL, nothing happens. I get no output ("null" is returned) when I use xp_cmdshell as below.
xp_cmdshell 'powershell.exe -file c:\script.ps1 -ExecutionPolicy Unrestricted'
I have put it into a SQL Agent job and used a proxy account that links to my domain account, which has admin rights on both boxes, yet there are still no results recorded in the job history and no data in my table on Server1.
What am I doing wrong? Surely this should work if it works from the ISE?
I haven't had any problem doing this and even created a couple of blog posts:
http://sev17.com/2009/04/05/executing-powershell-in-sql-server (mirror)
http://sev17.com/2010/11/29/executing-powershell-in-sql-server-redux (mirror)
The one thing that I'm doing differently is using the -command parameter with a file name instead of -file, but that shouldn't make a difference. I'm also enclosing the file name in double quotes, but this shouldn't make a difference either if the script file path has no spaces.
Outside of that, I would need to see what your script is doing. For instance, is it connecting to other machines? Can you run a simple command in PowerShell, like 'powershell -command get-command'?
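For example, a quick sanity check you could run from SQL itself (a sketch; the cmdlet lookup assumes the SQL snap-ins get loaded):

EXEC xp_cmdshell 'powershell.exe -command "get-command invoke-sqlcmd"'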
It would seem that I was not loading the snap-ins correctly. Although my SQL snap-ins were loaded for the first session, that wasn't passed on to the second PS script that was running Invoke-Sqlcmd. My second script did add the cmdlet snap-in, but that may not have been enough.
It should have worked, but for whatever reason adding the script block from the link below fixed it. :/
http://msdn.microsoft.com/en-us/library/cc281962.aspx
Thanks to those that responded.
What am I doing wrong?
I think there is an error in the example provided. I would have expected:
xp_cmdshell 'powershell.exe -ExecutionPolicy Unrestricted -file c:\script.ps1'
Because: "File must be the last parameter in the command, because all characters typed after the File parameter name are interpreted as the script file path followed by the script parameters and their values."
Source: PowerShell.exe Command-Line Help

Conditional logic in PostDeployment.sql script using SQLCMD

I am using a SQL 2008 database project (in Visual Studio) to manage the schema and initial test data for my project. The database project uses a post-deployment script which includes a number of other scripts using SQLCMD's :r syntax.
I would like to be able to conditionally include certain files based on a SQLCMD variable. This will allow me to run the project several times with our nightly build to set up various versions of the database with different configurations of the data (for a multi-tenant system).
I have tried the following:
IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
    print 'inserting specific configuration'
    :r .\Configuration1\Data.sql
END
ELSE
BEGIN
    print 'inserting generic data'
    :r .\GenericConfiguration\Data.sql
END
But I get a compilation error:
SQL01260: A fatal parser error occurred: Script.PostDeployment.sql
Has anyone seen this error or managed to configure their postdeployment script to be flexible in this way? Or am I going about this in the wrong way completely?
Thanks,
Rob
P.S. I've also tried changing this around so that the path to the file is a variable, similar to this post. But this gives me an error saying that the path is incorrect.
UPDATE
I've now discovered that the if/else syntax above doesn't work for me because some of my linked scripts require a GO statement. Essentially the :r just imports the scripts inline, so this becomes invalid syntax.
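To illustrate, if Data.sql ended with a GO (its contents here are hypothetical), the expanded batch would effectively become the following, and GO, being a batch separator rather than a T-SQL statement, cannot appear inside BEGIN/END:

IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
    print 'inserting specific configuration'
    -- expanded contents of .\Configuration1\Data.sql
    INSERT INTO dbo.Config (Id) VALUES (1);
    GO -- parser error: GO cannot appear inside a BEGIN/END block
END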
If you need a GO statement in the linked scripts (as I do) then there isn't any easy way around this. I ended up creating several post-deployment scripts and then changing my project to overwrite the main post-deployment script at build time depending on the build configuration. This is now doing what I need, but it seems like there should be an easier way!
For anyone needing the same thing - I found this post useful
So in my project I have the following post deployment files:
Script.PostDeployment.sql (empty file which will be replaced)
Default.Script.PostDeployment.sql (links to scripts needed for standard data config)
Configuration1.Script.PostDeployment.sql (links to scripts needed for a specific data config)
I then added the following to the end of the project file (right-click the project to unload it, then right-click again to edit it):
<Target Name="BeforeBuild">
<Message Text="Copy files task running for configuration: $(Configuration)" Importance="high" />
<Copy Condition=" '$(Configuration)' == 'Release' " SourceFiles="Scripts\Post-Deployment\Default.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
<Copy Condition=" '$(Configuration)' == 'Debug' " SourceFiles="Scripts\Post-Deployment\Default.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
<Copy Condition=" '$(Configuration)' == 'Configuration1' " SourceFiles="Scripts\Post-Deployment\Configuration1.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
</Target>
Finally, you will need to setup matching build configurations in the solution.
Also, for anyone trying other workarounds, I also tried the following without any luck:
Creating a post-build event to copy the files instead of having to hack the project file XML. I couldn't get this to work because I couldn't form the correct path to the post-deployment script file. This Connect issue describes the problem.
Using variables for the script path to pass to the :r command. But I came across several errors with this approach.
I managed to work around the problem using the NOEXEC method.
So, instead of this:
IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
print 'inserting specific configuration'
:r .\Configuration1\Data.sql
END
I reversed the conditional and set NOEXEC ON to skip over the imported statement(s) thusly:
IF ('$(ConfigSetting)' <> 'Configuration1')
SET NOEXEC ON
:r .\Configuration1\Data.sql
SET NOEXEC OFF
Make sure you turn it back off if you want to execute any subsequent statements; SET NOEXEC ON makes SQL Server compile, but skip executing, every batch that follows.
Here's how I am handling conditional deployment within the post deployment process to deploy test data for the Debug but not Release configuration.
First, in solution explorer, open the project properties folder, and right-click to add a new SqlCmd.variables file.
Name the file Debug.sqlcmdvars.
Within the file, add your custom variables, and then add a final variable called $(BuildConfiguration), and set the value to Debug.
Repeat the process to create a Release.sqlcmdvars, setting the $(BuildConfiguration) to Release.
Now, configure your configurations:
Open up the project properties page to the Deploy tab.
On the top dropdown, set the configuration to be Debug.
On the bottom dropdown, (Sql command variables), set the file to Properties\Debug.sqlcmdvars.
Repeat for Release as:
On the top dropdown, set the configuration to be Release.
On the bottom dropdown, (Sql command variables), set the file to Properties\Release.sqlcmdvars.
Now, within your Script.PostDeployment.sql file, you can specify conditional logic such as:
IF 'Debug' = '$(BuildConfiguration)'
BEGIN
    PRINT '***** Creating Test Data for Debug configuration *****';
    :r .\TestData\TestData.sql
END
In solution explorer, right click on the top level solution and open Configuration Manager. You can specify which configuration is active for your build.
You can also specify the configuration on the MSBUILD.EXE command line.
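For example, from a developer command prompt (solution name hypothetical):

msbuild MyDatabase.sln /t:Build /p:Configuration=Debug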
There you go: now your developer builds have test data, but your release builds don't!
As Rob worked out, GO statements aren't allowed in the linked SQL scripts, as that would nest them inside the BEGIN/END block.
However, I have a different solution from his: if possible, remove any GO statements from the referenced scripts, and put a single one after the END statement:
IF '$(DeployTestData)' = 'True'
BEGIN
    :r .\TestData\Data.sql
END
GO -- moved from Data.sql
Note that I've also created a new variable in my sqlcmdvars file called $(DeployTestData) which allows me to turn on/off test script deployment.
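As an aside, if you deploy the output with SqlPackage.exe (newer SSDT-style projects) rather than from Visual Studio, the same variable can be overridden on the command line (server, database, and file names hypothetical):

SqlPackage.exe /Action:Publish /SourceFile:MyDatabase.dacpac /TargetServerName:MyServer /TargetDatabaseName:MyDb /v:DeployTestData=True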
I found a hack from an MSDN blog which worked fairly well. The trick is to write the commands to a temp script file and then execute that script instead. Basically the equivalent of dynamic SQL for SQLCMD.
-- Helper newline variable
:setvar CRLF "CHAR(13) + CHAR(10)"
GO
-- Redirect output to the TempScript.sql file
:OUT $(TEMP)\TempScript.sql
IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
    PRINT 'print ''inserting specific configuration'';' + $(CRLF)
    PRINT ':r .\Configuration1\Data.sql' + $(CRLF)
END
ELSE
BEGIN
    PRINT 'print ''inserting generic data'';' + $(CRLF)
    PRINT ':r .\GenericConfiguration\Data.sql' + $(CRLF)
END
GO
-- Change output to stdout
:OUT stdout
-- Now execute the generated script
:r $(TEMP)\TempScript.sql
GO
The TempScript.sql file will then contain either:
print 'inserting specific configuration';
:r .\Configuration1\Data.sql
or
print 'inserting generic data';
:r .\GenericConfiguration\Data.sql
depending on the value of $(ConfigSetting) and there will be no problems with GO statements etc. when it is executed.
I was inspired by Rob Bird's solution. However, I am simply using the Build Events to replace the post deployment scripts based on the selected build configuration.
I have one empty "dummy" post-deployment script.
I set up a pre-build event to replace this "dummy" file based on the selected build configuration (see attached picture).
I set up a post-build event to put the "dummy" file back after the build has finished (see attached picture). The reason is that I do not want the build to generate changes in source control.
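As a rough sketch, the pre- and post-build event command lines might look like this (paths, file names, and macros hypothetical):

REM Pre-build: swap in the post-deployment script for the selected configuration
copy /Y "$(ProjectDir)Scripts\$(ConfigurationName).Script.PostDeployment.sql" "$(ProjectDir)Scripts\Script.PostDeployment.sql"

REM Post-build: restore the empty dummy script so source control stays clean
copy /Y "$(ProjectDir)Scripts\Dummy.Script.PostDeployment.sql" "$(ProjectDir)Scripts\Script.PostDeployment.sql"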