Change the application name of sqlcmd

I am using sqlcmd command line to connect to a database. I want to change the application name so that I can have multiple processes using sqlcmd, and be able to differentiate these processes from the server side. I know in SQL Server Management Studio I could do this (Change the Application Name of SSMS), but how can I do this in command line?
I checked the sqlcmd -?, but didn't find any useful parameter. There is another command line tool bcp, but I don't find any useful info there either.

The -H flag can be used to set the "workstation_name".
It is not the application name, but it can serve the same purpose.
Use SELECT HOST_NAME() on the server side to read the value back.
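For example, a minimal sketch (the server, database, and workstation names are placeholders) of tagging two sqlcmd processes so they can be told apart from the server side:
sqlcmd -S MyServer -d MyDatabase -H "ETL_Process_A" -i script_a.sql
sqlcmd -S MyServer -d MyDatabase -H "ETL_Process_B" -i script_b.sql
On the server, the value shows up as the session's host name:
SELECT session_id, host_name, program_name
FROM sys.dm_exec_sessions
WHERE host_name LIKE 'ETL_Process_%';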

Need to change connection in SQL Postdeploy

This is the situation: the .dacpac and .ispac files are deployed with a PowerShell script.
The output of the .dacpac goes to Server1, the .ispac to Server2.
In the post-deploy script of the .dacpac, an account and credentials are added on Server1 along with some other configuration.
When that is done, the connection should be switched to Server2 via :connect Server2 for some additional setup.
When testing in SSMS SQLCMD mode this works fine, but VS complains with error 72006: Fatal scripting error: Command Connect is not supported.
So, can it be done? And if it can, how?
TIA
Make sure that VS has SQLCMD mode activated; it is a button in the query toolbar.
It looks like what I am trying to do is not possible, but there is a workaround.
Create a dummy database project with an essentially empty database.
You can either use a publish script to basically not create anything, or you
can drop the database afterwards in your Powershell script.
Put your code in Postdeploy of the dummy project.
Test and deploy
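For illustration, a minimal sketch (Server2 and the database name are placeholders) of what the dummy project's post-deploy script could contain:
:connect Server2
USE [MyDatabase];
GO
-- additional setup that has to run on the second server (placeholder)
PRINT 'Running Server2 configuration...';
GO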

How to script SSIS SQL Agent Job Step to run on localhost?

I've created a SQL Agent job that executes an SSIS package as one of the job steps. I'm trying to configure the SSIS job step to execute the package on "localhost" (or whatever I need to call it to reference the same SQL Server instance the job is on) so that I can script the job out and deploy it between environments using the same script.
This is SQL Server 2012, so I'm trying to run it using the SSIS Catalog that's installed on the local SQL instance. I don't want to have to go into the script and change server names as we push the script from development, to the test environment, and eventually production.
I've tried putting "localhost" in the "Server" textbox, then clicking the "..." by the "Package" setting, but I get an error saying "Verify the instance name is correct and that SQL Server is configured to allow remote connections" -- which I take to mean that it's attempting to connect to a server that is actually named localhost, as opposed to just checking itself.
Anyone know how to solve this?
How about in your script:
DECLARE @Command NVARCHAR(MAX) =
N'/ISSERVER "\"\SSISDB\....dtsx\"" /SERVER "\"' + @@SERVERNAME + '\"" ...'
And then pass it (@command = @Command) to sp_add_jobstep instead.
You might have to handle non-default instances returned by @@SERVERNAME as a special case, because SSIS doesn't use the instance name, I guess, but I didn't try it out.
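As a rough end-to-end sketch (the job, folder, project, and package names are hypothetical placeholders, and the dtexec switches may need adjusting for your package):
DECLARE @Command NVARCHAR(MAX) =
    N'/ISSERVER "\"\SSISDB\MyFolder\MyProject\MyPackage.dtsx\"" /SERVER "\"'
    + @@SERVERNAME + N'\"" /CALLERINFO SQLAGENT /REPORTING E';

EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'My ETL job',        -- existing job created with sp_add_job
    @step_name = N'Run SSIS package',
    @subsystem = N'SSIS',              -- Integration Services job step subsystem
    @command   = @Command;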

SQL xp_cmdshell copy files between servers

I am trying to move all .zip files in a specific folder to another folder. The source folder is located on another server. Currently I am using
EXECUTE xp_cmdshell 'copy \\server1\e$\ETL\*.zip \\server2\e$\ETL\'
GO
This works if I am logged into both servers, but the goal is to automate this process via a SQL Server Agent job. I have tried
EXECUTE sp_xp_cmdshell_proxy_account 'domain\useracc','pass'
GO
EXECUTE xp_cmdshell 'copy \\server1\e$\ETL\*.zip \\server2\e$\ETL\'
GO
but I am receiving the following error;
An error occurred during the execution of sp_xp_cmdshell_proxy_account. Possible reasons: the provided account was invalid or the '##xp_cmdshell_proxy_account##' credential could not be created. Error code: '0'.
I am also not sure whether this is the right approach. Please help with how I can achieve this. The files on server1 change in name and quantity every day.
I would strongly advise: do not use xp_cmdshell. It opens up large security holes in your surface area and makes you vulnerable to attack. xp_cmdshell should be disabled!
Instead, if you want to automate this with SQL Server Agent, you have two options. My preference would be to write a simple SSIS package with a File System Task and schedule that package with SQL Server Agent. SSIS is underutilized for this kind of task but is actually pretty good at it.
Alternatively, rewrite your script to use a SQL Server Agent CmdExec job step. This does not require xp_cmdshell to be enabled and reduces the attack surface.
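For the second option, a minimal sketch (the job and step names are placeholders; the copy command is taken from the question) of adding a CmdExec step to an existing Agent job:
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Copy ETL zip files',   -- existing job
    @step_name = N'Copy zips from server1 to server2',
    @subsystem = N'CmdExec',              -- operating-system command step
    @command   = N'copy \\server1\e$\ETL\*.zip \\server2\e$\ETL\';
The step then runs under the SQL Server Agent service account (or a CmdExec proxy), so that account needs access to both shares.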
I found that the following worked for me:
In the command prompt, type services.msc; this opens the list of all services on the server.
In the list of services, look for SQL Server Agent, right-click -> Properties, and go to the Log On tab.
Change the logon to a user with access to both servers, then rewrite your script to use SQL Server Agent CmdExec job steps (thank you Pete Carter).

specify server in sql script

I am writing a SQL script for SQL Server 2008 where I place a USE statement at the beginning that specifies the database the script should be run against:
use [my_database]
As I have different environments where the same database exists, e.g. dev, QA, and prod, is there any way I can specify in the script which environment the script is for, either by server name, IP address, or some other mechanism?
You can put SQL Server Management Studio in SQLCMD mode and specify the server with the :CONNECT myserver statement.
You can switch on SQLCMD mode from the Query menu in SSMS (Query > SQLCMD Mode).
Your script would then look something like this
:CONNECT devserver
use [my_database]
SELECT * FROM my_table
You can even make the query window switch servers during execution
:CONNECT devserver
use [my_database]
SELECT * FROM my_table
GO
:CONNECT uatserver
use [my_database]
SELECT * FROM my_table
To connect with a specific user and password you can specify this as follows
:CONNECT devserver -U myUser -P myPassword
use [my_database]
SELECT * FROM my_table
There are actually a number of options you can specify, which are well documented on MSDN.
That's a CONNECTION setting, not a parameter within the script.
You can run the same script in different environments via a batch file or PowerShell script if desired (as sketched below), or you could set up linked servers, but you can't just say
USE SomeOtherServer
There are security and networking implications as well.
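For the batch-file route, a minimal sketch (the server names, database name, and script file are placeholders) that runs the same script against each environment with sqlcmd:
REM run_all.cmd - run the same script against each environment
sqlcmd -S DevServer -d my_database -E -i my_script.sql
sqlcmd -S QaServer -d my_database -E -i my_script.sql
sqlcmd -S ProdServer -d my_database -E -i my_script.sql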
Assuming you will run all of these scripts on a particular server - e.g. the Dev server - then you merely need to create a Linked Server to each of the other servers.
Then, for example, you could run an identically named stored procedure on each of these servers like this:
EXEC MyDatabase.dbo.mysp_DoSomething --Dev Server; no server prefix needed since that's where we are
EXEC QA.MyDatabase.dbo.mysp_DoSomething --QA Server
EXEC Prod.MyDatabase.dbo.mysp_DoSomething --Prod server
etc.
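Creating the linked servers is a one-time setup. A hedged sketch (the linked-server alias QA and the target instance name are placeholders, and the OLE DB provider name may differ in your environment):
EXEC master.dbo.sp_addlinkedserver
    @server = N'QA',                  -- alias used in four-part names
    @srvproduct = N'',
    @provider = N'MSOLEDBSQL',        -- SQL Server OLE DB provider
    @datasrc = N'QaServerName';       -- actual QA instance

EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'QA',
    @useself = N'TRUE';               -- pass through the caller's credentials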

MSDeploy with sql script

I have a deployment package that needs to run against about 3 different environments.
I want to specify a SQL script to run (source) against the environment's database (destination).
I don't want to specify the connection string in the deploy script because it contains sql login info.
I would like to be able to read a setting from the destination for the connection string.
Can I mark this as a parameter to be specified when unpackaging the deployment package on the server? If so, how do I use the parameter in the dest:sql="connection string"?
Any suggestions would be great.
Scott Guthrie has a pretty good write-up on this sort of thing on his blog. He specifically covers changing parameters both via prompts for the admin and in an automated fashion via the command line within deployment and/or automation scripts.
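As a rough sketch of the automated route (the package name, target server, and parameter file are placeholders; the exact parameter names depend on how the package was built):
msdeploy.exe -verb:sync ^
    -source:package="MyApp.zip" ^
    -dest:auto,computerName="TargetServer" ^
    -setParamFile:"Prod.SetParameters.xml"
The connection string then lives in the per-environment SetParameters.xml file rather than in the deployment script itself.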