This is the situation: the .dacpac and .ispac files are deployed with a PowerShell script.
The .dacpac goes to Server1, the .ispac to Server2.
In the dacpac's post-deploy script, an account and credentials are added on Server1, along with some other configuration.
When that is done, the connection should be switched to Server2 via :connect Server2 for some additional setup.
When testing in SSMS SQLCMD mode this works fine, but VS complains with error 72006: Fatal scripting error: Command Connect is not supported.
So, can it be done? And if it can, how?
TIA
Make sure that VS has SQLCMD mode activated; it is a button in the query toolbar.
It looks like what I am trying to do is not possible, but there is a workaround.
Create a dummy database project with an essentially empty database.
You can either use a publish script that creates essentially nothing, or you
can drop the database afterwards in your PowerShell script (see the sketch below).
Put your code in the post-deploy script of the dummy project.
Test and deploy
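For the drop-it-afterwards variant, a minimal PowerShell sketch (the SqlPackage.exe path, dacpac name, server, and database name are placeholders, not taken from the question):
# Publish the dummy dacpac; its post-deploy script does the real work.
# The SqlPackage.exe path varies by SQL Server / DAC framework version.
& "C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe" `
    "/Action:Publish" `
    "/SourceFile:Dummy.dacpac" `
    "/TargetServerName:Server1" `
    "/TargetDatabaseName:DummyDb"
# The dummy database itself is no longer needed, so drop it
# (Invoke-Sqlcmd comes from the SqlServer/SQLPS module).
Invoke-Sqlcmd -ServerInstance "Server1" -Query "DROP DATABASE [DummyDb];"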
I have a PowerShell script that loops through a list of SQL Servers and creates server logins and database users.
The script runs on a separate server, under the administrator credentials on that server, and connects to the other SQL Servers via linked servers.
#Get administrator credentials
$password = Get-Content C:\Powershell\General\password.txt | ConvertTo-SecureString;
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "DOMAIN\administrator", $password;
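For context, a direct-connection version of the loop might look like the sketch below; the actual script goes via linked servers, and the server-list path and login name here are hypothetical:
# Illustrative only: create a login on each server in a list.
# Invoke-Sqlcmd (SqlServer/SQLPS module) connects with the Windows
# identity of whoever runs the script, which matters later on.
$servers = Get-Content C:\Powershell\General\servers.txt
foreach ($server in $servers) {
    Invoke-Sqlcmd -ServerInstance $server -Query "CREATE LOGIN [DOMAIN\AppUser] FROM WINDOWS;"
}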
When this script is run manually (either directly through a Powershell window or using a batch file through a command prompt) it works perfectly well. I am logged onto the executing server as administrator when running the script manually.
I then tried to run this Powershell script using an SSIS package on the executing server, using the Execute Process Task to run a batch file. The package was executed from a SQL Agent Job. Although both the job and the package seemed to execute successfully, the DDL statements were not executed against the linked servers.
SQL Agent on the executing server is run under a designated Service Account. SSIS runs under the Network Service account.
Does anybody have any thoughts on what I might be doing wrong? I am happy to provide details of the script or anything else that is required.
Thanks
Ash
UPDATE: OK, we have a little more information.
I took out the lines I posted above as I have discovered I don't actually need the administrator credentials I was retrieving.
I logged onto the server with the script on it using the service account. As per @ElecticLlama's suggestion I set up a Profiler trace on the destination server. When running the script manually (or running a batch file manually that runs the PowerShell script) everything works well and Profiler shows the DDL actions, under the service account login.
When running a job through SQL Agent (either a CmdExec job or an SSIS package) that runs the same batch file, I get the following error:
'Login failed for user 'DOMAIN\ServiceAccount'. Reason: Token-based server access validation failed with an infrastructure error.'
Anybody have any further thoughts?
Thanks to everyone for their help. Once I got that last error, a quick search revealed I just had to restart SQL Agent and now everything works as it should. Thanks in particular to @ElecticLlama for pointing me in the right direction.
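For reference, the restart can be scripted; a minimal PowerShell sketch, assuming a default instance:
# Restart the SQL Server Agent service so it gets a fresh security token.
# 'SQLSERVERAGENT' is the default-instance service name; a named instance
# uses 'SQLAgent$<InstanceName>'.
Restart-Service -Name 'SQLSERVERAGENT' -Force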
Ash
I've created a SQL Agent job that executes an SSIS package as one of the job steps. I'm trying to configure the SSIS job step to execute the package on "localhost" (or whatever I need to call it to reference the same SQL Server instance the job is on) so that I can script the job out and deploy it between environments using the same script.
This is SQL Server 2012, so I'm trying to run it using the SSIS Catalog that's installed on the local SQL instance. I don't want to have to go into the script and change server names as we push the script from development, to the test environment, and eventually production.
I've tried putting "localhost" in the "Server" textbox, then clicking the "..." by the "Package" setting, but I get an error saying "Verify the instance name is correct and that SQL Server is configured to allow remote connections" -- which I take to mean that it's attempting to connect to a server that is actually named localhost, as opposed to just checking itself.
Anyone know how to solve this?
How about in your script:
DECLARE @Command NVARCHAR(MAX) =
N'/ISSERVER "\"\SSISDB\....dtsx\"" /SERVER "\"' + @@SERVERNAME + '\"" ...'
And then pass it (@command = @Command) to sp_add_jobstep instead.
You might have to handle non-default instances returned by @@SERVERNAME specially, because SSIS doesn't use the instance name, I guess, but I didn't try it out.
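A rough PowerShell sketch of that idea, submitting the whole batch so @@SERVERNAME resolves on whichever server you deploy to (the job name and package path are hypothetical):
# Hypothetical job and package names; the pattern is what matters.
$tsql = @'
DECLARE @Command NVARCHAR(MAX) =
    N'/ISSERVER "\"\SSISDB\Folder\Project\Package.dtsx\"" /SERVER "\"' + @@SERVERNAME + N'\""';
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'MyJob',
    @step_name = N'Run package',
    @subsystem = N'SSIS',
    @command   = @Command;
'@
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'msdb' -Query $tsql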
I am trying to move all .zip files in a specific folder to another folder. The source folder is located on another server. Currently I am using
EXECUTE xp_cmdshell 'copy \\server1\e$\ETL\*.zip \\server2\e$\ETL\'
GO
This works if I am logged into both servers, but the goal is to automate this process via SQL Server Agent. I have tried
EXECUTE sp_xp_cmdshell_proxy_account 'domain\useracc','pass'
GO
EXECUTE xp_cmdshell 'copy \\server1\e$\ETL\*.zip \\server2\e$\ETL\'
GO
but I am receiving the following error:
An error occurred during the execution of sp_xp_cmdshell_proxy_account. Possible reasons: the provided account was invalid or the '##xp_cmdshell_proxy_account##' credential could not be created. Error code: '0'.
I am also not sure whether this is the right approach. Please help with how I can achieve this. The files on server1 change in name and quantity every day.
I would strongly advise: do not use xp_cmdshell. It opens up large security holes in your attack surface and makes you vulnerable to attack. xp_cmdshell should be disabled!
Instead, if you want to automate this with Server Agent you have two options. My preference would be to write a simple SSIS package with a File System Task and schedule this package with Server Agent. SSIS is underutilized for this kind of task but is actually pretty good at it.
Alternatively re-write your script to use Server Agent CmdExec job steps. This does not require xp_cmdshell to be enabled and reduces the attack surface.
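For example, the copy from the question could run as a short PowerShell script invoked by a CmdExec (or PowerShell) job step; a minimal sketch using the question's paths:
# Copy the zip files between the shares. This runs under the job step's
# account, which needs access to both servers.
Copy-Item -Path '\\server1\e$\ETL\*.zip' -Destination '\\server2\e$\ETL\'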
I found that the following worked for me:
In the command prompt, type services.msc; this opens the list of all services on the server.
In the list of services, look for SQL Server Agent, right-click -> Properties, and go to the Log On tab.
Change the logon to a user with access on both servers, then re-write your script to use Server Agent CmdExec job steps (thank you Pete Carter).
I have a deployment package that needs to run against about 3 different environments.
I want to specify a SQL script to run (source) against each environment's database (destination).
I don't want to specify the connection string in the deploy script because it contains SQL login info.
I would like to be able to read a setting from the destination for the connection string.
Can I mark this as a parameter to be specified when unpackaging the deployment package on the server? If so, how do I use the parameter in the dest:sql="connection string"?
Any suggestions would be great.
Scott Guthrie has a pretty good write-up on this sort of thing here. He specifically mentions the changing of parameters, both in prompts for the admin and in an automated fashion via the command line, within deployment and/or automation scripts.
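As a rough sketch of the command-line flavor (untested; the package name, endpoint, and parameter name are hypothetical, and the parameter name must match one declared in your package):
msdeploy.exe
-verb:sync
-source:package="MySite_Package.zip"
-dest:auto,computerName="https://deployserver:8172/msdeploy.axd",username=administrator,password=blahblahblah
-setParam:name="MyDb-Web.config Connection String",value="Server=prod;Initial Catalog=MyDb;Uid=deploy_user;Pwd=<pwd>"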
I started building my site in WebMatrix and then switched to using VS2010 so I could have better Intellisense and debugging. I've been loading WebMatrix to deploy and it's been working fine.
However, loading WebMatrix is a PITA and I actually want more flexibility over the web deployment process.
So I started learning about msdeploy.exe and how to use it. I was able to successfully get the site to sync as I wanted with the following command line:
msdeploy.exe
-verb:sync
-dest:iisApp=MySite,wmsvc=www.mysite.com,username=administrator,password=blahblahblah
-allowUntrusted
-skip:absolutePath=webdeploy.cmd
-skip:absolutePath=web.config
-skip:objectName=dirPath,absolutePath="App_Data"
-skip:objectName=dirPath,absolutePath="bin"
-skip:absolutePath=vwd.webinfo
-source:iisApp="C:\Users\charlie\Documents\Visual Studio 2010\WebSites\MySite"
I had to use -allowUntrusted because the cert on the server uses a different host name than www. No biggie. I have some -skips for stuff I don't want to write to the dest as well.
It all works great.
I use SQL Server (Express) on my host (a WebMatrix AMI on AWS).
I want to have the ability to push my database to the host as well. I am trying to use the following msdeploy command:
msdeploy.exe
-verb:sync
-source:dbFullSql="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=C:....\MySite.mdf;User instance=true"
-dest:dbFullSql="Server=www.mysite.com\SQLEXPRESS;Initial Catalog=webmatrix_db;Uid=webmatrix_user;Pwd=<pwd>"
This gives me
Error: The database 'webmatrix_db' could not be created.
Error: A network-related or instance-specific error occurred while establishing a
connection to SQL Server. The server was not found or was not accessible. ...
I think my problem is the connection string. I copied Server=.\SQLEXPRESS;Initial Catalog=webmatrix_db;Uid=webmatrix_user;Pwd=<pwd> from the WebMatrix UI and prepended it with www.mysite.com, thinking it needed my hostname somewhere.
Obviously this is not correct and I can't find any examples of connection strings that work either.
Note that SQL is not exposed directly by this server. I assume WebMatrix's invocation of msdeploy is connecting using my admin credentials (not the SQL credentials) first and then msdeploy invokes the SQL commands on the remote host. I need something like the ...wmsvc=www.mysite.com,username=administrator,password=blahblahblah in the -dest option of the first example I gave above.
It would be awesome if I could see a log of how WebMatrix was invoking msdeploy.
What is the correct msdeploy command to do what I want?
[UPDATE - ANSWER]
One of the best things about Stack Overflow is that posting a question really makes you think about what you are doing. Shortly after I posted the above, I realized the wmsvc=www.mysite.com,username=administrator,password=blahblahblah parameter in the -dest parameter was the key. The question became how to correctly add it to my specific example.
This msdeploy command line now connects correctly:
msdeploy.exe
-verb:sync
-source:dbFullSql="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=C:\Users\charlie\Documents\Visual Studio 2010\WebSites\Fiinom\App_Data\MySite.mdf;User instance=true"
-dest:dbFullSql="Server=.\SQLEXPRESS;Initial Catalog=webmatrix_db;Uid=webmatrix_user;Pwd=rI2vP3rK6hV8nN8",wmsvc=www.mysite.com,username=administrator,password=blahblahblah
-allowUntrusted
Now that msdeploy is connecting successfully and executing commands, I need to figure out how to make it actually merge the database. Right now it's giving me an error that a table already exists and can't create it...
This is related to your side comment on merging the database...
Currently Web Deploy does not have any provider that supports database merges - the dbFullSql provider uses SQL Server Management Objects ("SMO") to script out the db contents, and we then apply it on the other side. Effectively, Web Deploy will only overwrite the destination db with the source db.
If you are okay with that as a "merge" you can get around that table already exists error by using SMO scripting options - this is what WebMatrix does to make the db publishing/downloading work. To your source just add:
,scriptDropsFirst=true
(this scripts out drops for all the objects in your source database so that if they exist on the destination they will get dropped and won't block you)
You might also need:
,copyAllUsers=false
(if you aren't a sysadmin on the remote SQL database, chances are you won't be able to create logins, which are a server-level action. Typically if you don't use this setting, you'll get an error about creating a login, or that a login doesn't exist, because SMO scripts out your database user as "for LOGIN " and that login doesn't exist on the server)
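Putting that together with the working command from the update above, the source would become (sketch, untested):
msdeploy.exe
-verb:sync
-source:dbFullSql="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=C:\Users\charlie\Documents\Visual Studio 2010\WebSites\Fiinom\App_Data\MySite.mdf;User instance=true",scriptDropsFirst=true,copyAllUsers=false
-dest:dbFullSql="Server=.\SQLEXPRESS;Initial Catalog=webmatrix_db;Uid=webmatrix_user;Pwd=rI2vP3rK6hV8nN8",wmsvc=www.mysite.com,username=administrator,password=blahblahblah
-allowUntrusted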
Hope that helps!
Kristina