SQL hangs on FTP command but works from the command line

SQL hangs on this command:
EXEC master..xp_cmdshell 'ftp -s:C:\FTP\connect'
It doesn't hang on any other command I've tried through xp_cmdshell (echo, open, get), and those work just fine, so I know the permissions for SQL Server on the FTP folder (and the download folder) are set properly.
And when ftp -s:C:\FTP\connect is executed on the command line, the FTP transfer begins and completes successfully.
The command that is giving issues on this particular server worked completely fine on my other server. I'm really not sure what else needs to be done. Does anyone know why SQL Server hangs when I execute an FTP command, given everything else I've been through?

Most probably the server firewall is blocking the port the command uses when it runs under SQL Server in the first instance (the block might get bypassed when you run it from the command line).
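One quick way to test that theory, as a minimal sketch (ftp.example.com and port 21 are placeholders for your FTP server): run a port check from the machine where SQL Server executes xp_cmdshell.
# Check whether the FTP control port is reachable from the SQL Server host
# (host name and port are assumptions; substitute your own)
Test-NetConnection -ComputerName ftp.example.com -Port 21
If TcpTestSucceeded comes back False here while the transfer works from your own command line, a firewall rule scoped to the service context is a likely suspect.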

Related

Need to change connection in SQL Postdeploy

This is the situation: The Dacpac and ISpac files are deployed with a Powershell script.
The result of the dacpac goes to Server1, the ISpac to Server2.
In post-deploy of the dacpac an account and credentials are added on Server1 along with some other configurations.
When that is done, the connection should be changed to Server2 (done by :connect Server2) for some additional setup.
When testing in SSMS SQLCMD mode this works fine, but VS complains with error 72006: Fatal scripting error: Command Connect is not supported.
So, can it be done? And if it can, how?
TIA
Make sure that VS has SQLCMD mode activated; there is a button for it in the query toolbar.
It looks like what I am trying to do is not possible, but there is a workaround.
Create a dummy database project with an essentially empty database.
You can either use a publish script to basically not create anything, or you can drop the database afterwards in your Powershell script.
Put your code in Postdeploy of the dummy project.
Test and deploy
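For the drop-it-afterwards variant, a minimal sketch of the cleanup step in the Powershell deployment script (the SqlServer module, the server name, and the database name DummyDeploy are all assumptions):
# Remove the dummy database once its post-deploy script has run
# (Invoke-Sqlcmd ships with the SqlServer module; all names here are hypothetical)
Invoke-Sqlcmd -ServerInstance 'Server1' -Query 'DROP DATABASE [DummyDeploy];'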

Laravel keeping remote connection until all commands have finished

Toolset:
Laravel 5.2.*
LaravelCollective remote package ^5.2
Let's say I have a route http://example.com/npm. When I hit this route I process some request parameters and then SSH into a remote server using the LaravelCollective remote package.
After some time I see in my logs that the connection is closed. I know this because that message is logged after the SSH command, so my application tells me that my command was executed successfully.
But when I go and check the server there is no node_modules folder; after hitting the route 10x, though, it suddenly is there.
That made me think that my connection is closed even though the commands were not finished. To be sure about that I started monitoring the processes on the server with the following command:
ps aux
This showed that I got my success message while the process was still running on the server, which means the output I get is not correct and it causes a follow-up command (gulp production) to fail.
I dug a bit into the source code to see whether there is a way to keep that connection open, but no luck so far.
The question: can I keep this connection open until the commands are definitely finished so that my response to the end user is correct?
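Not a LaravelCollective answer, but for comparison, the behavior being asked for is a blocking remote call that only reports success once the remote command has exited. A minimal sketch of that idea outside Laravel, using plain ssh called from a shell (host, user, and paths are hypothetical):
# ssh blocks until the remote command chain exits, so reaching the next
# line really means npm install finished (all names are placeholders)
ssh deploy@example.com 'cd /var/www/app && npm install'
if ($LASTEXITCODE -ne 0) { throw 'Remote npm install failed' }
Whatever the package exposes, the fix amounts to the same thing: only log success after the remote process has returned an exit status.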

Logging commands executed with Invoke-Command

I recently hacked together a process that spins off multiple remote installations on servers. We are looking for a way to install SQL patches on 100+ remote servers automatically. I was able to make a successful pass of this last Saturday and it worked very well.
I am not a PowerShell expert or even a novice. I was able to get something working and would like to expand on it. The command below is what I am issuing in PowerShell, and I would like to add some logging, such as each server's start and end time for applying the patch. I would like to save this to a central table that all servers can connect to.
My question is: does anyone have any good resources that would get me down the path of what I am looking for, or that could help me advance my skills to improve my process?
The -ComputerName parameter points to a txt file with a list of server names that I want patched in this group.
The -ScriptBlock parameter points to a batch file on the remote server that has the command to execute the SQL patch via the command line.
"D:\DBA\SQLPatching\SQL_2012\SP3_CU2\SQLServer2012-KB3137746-x64.exe" /quiet /IAcceptSQLServerLicenseTerms /Action=Patch /AllInstances
Thanks
$jobWRM = Invoke-Command -ComputerName (Get-Content D:\DBA\SQLPatching\May2016\LLE\ServerLists\ServerList_EIQDBS01_1.txt) -ScriptBlock {D:\DBA\SQLPatching\SQL_2012\SP3_CU2\Patch-SQL2012_SP3_CU2.bat} -JobName WinRM -ThrottleLimit 16 -AsJob
If you're looking to just capture the execution state, you could put those computer names in a variable and foreach through them while writing completion out to a log.
If you want errors to give you more depth in troubleshooting, PSSessions will give you errors back (not sure whether the batch file you list will give you good troubleshooting data, though).
You can find examples of both of these approaches on the web:
ForEach
PSSessions
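To make the foreach suggestion concrete, here is a minimal sketch (the server-list and batch-file paths are taken from the question; the CSV log path is an assumption, and the central SQL table mentioned above could replace the Export-Csv step):
# Run the patch batch on each server and log start/end time and outcome
$servers = Get-Content D:\DBA\SQLPatching\May2016\LLE\ServerLists\ServerList_EIQDBS01_1.txt
$log = 'D:\DBA\SQLPatching\May2016\LLE\PatchLog.csv'  # assumed log location
foreach ($server in $servers) {
    $start = Get-Date
    try {
        Invoke-Command -ComputerName $server -ScriptBlock {
            D:\DBA\SQLPatching\SQL_2012\SP3_CU2\Patch-SQL2012_SP3_CU2.bat
        } -ErrorAction Stop
        $status = 'Succeeded'
    }
    catch {
        $status = "Failed: $($_.Exception.Message)"
    }
    # One row per server: name, start time, end time, outcome
    [pscustomobject]@{ Server = $server; Start = $start; End = Get-Date; Status = $status } |
        Export-Csv -Path $log -Append -NoTypeInformation
}
Note this runs the servers one at a time, unlike the -AsJob/-ThrottleLimit version in the question; it trades speed for a simple, ordered log.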

echoid.exe remote execution issue (wrong Locking Code output)

I am trying to collect the locking code of a server farm in an automated way.
So, I have echoid.exe and a batch file on each remote server.
The batch file simply executes echoid.exe and writes its output into a text file which I can parse.
The problem is that when I trigger the .bat file remotely, it seems like echoid.exe is executed on the originating host (the one I'm using to send the execution command, through psexec for example) rather than on the remote host, meaning the locking code output is wrong. If the same .bat file is executed locally (and manually), the results are OK.
Any idea why? Does anyone know how I can run echoid remotely and get the correct results?
I have tried several remote approaches and all failed and brought back wrong results :(
Please help!
BTW, all remote machines run Windows.
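Not sure about the why, but for comparison, the pattern I would expect to execute on the remote side looks like this (host name and paths are hypothetical): give psexec an explicit \\host target plus the full remote path, then pull the output file back over the admin share.
# The \\REMOTEHOST target and the full remote path are what make psexec
# run the batch there rather than on the machine you are sending from
psexec \\REMOTEHOST cmd /c 'C:\Tools\run_echoid.bat'
# Copy the result back for parsing (admin-share path is an assumption)
Copy-Item '\\REMOTEHOST\C$\Tools\lockcode.txt' 'C:\Results\REMOTEHOST_lockcode.txt'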

Powershell script to execute DDL statements on linked servers - not working when run using SSIS

I have a Powershell script that loops through a list of SQL Servers and creates server logins and database users.
The script runs on a separate server, under the administrator credentials on that server, and connects to the other SQL Servers via linked servers.
#Get administrator credentials
$password = Get-Content C:\Powershell\General\password.txt | ConvertTo-SecureString;
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "DOMAIN\administrator",$password;
When this script is run manually (either directly through a Powershell window or using a batch file through a command prompt) it works perfectly well. I am logged onto the executing server as administrator when running the script manually.
I then tried to run this Powershell script using an SSIS package on the executing server, using the Execute Process Task to run a batch file. The package was executed from a SQL Agent Job. Although both the job and the package seemed to execute successfully, the DDL statements were not executed against the linked servers.
SQL Agent on the executing server is run under a designated Service Account. SSIS runs under the Network Service account.
Does anybody have any thoughts on what I might be doing wrong? I am happy to provide details of the script or anything else that is required.
Thanks
Ash
UPDATE: OK, we have a little more information.
I took out the lines I posted above as I have discovered I don't actually need the administrator credentials I was retrieving.
I logged onto the server with the script on it using the service account. As per #ElecticLlama's suggestion I set up a Profiler trace on the destination server. When running the script manually (or running a batch file manually that runs the Powershell script) everything works well and Profiler shows the DDL actions under the service account login.
When running a job through SQL Agent (either a CmdExec job or an SSIS package) that runs the same batch file, I get the following error:
'Login failed for user 'DOMAIN\ServiceAccount'. Reason: Token-based server access validation failed with an infrastructure error.'
Anybody have any further thoughts?
Thanks to everyone for their help. Once I got that last error, a quick search revealed I just had to restart SQL Agent, and now everything works as it should. Thanks in particular to #ElecticLlama for pointing me in the right direction.
Ash
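For anyone hitting the same thing: the restart that cleared the error can be scripted, e.g. as part of the deployment (a sketch; the service name below assumes a default instance, named instances use SQLAgent$InstanceName):
# Restart SQL Server Agent on the executing server
# (SQLSERVERAGENT is the default-instance service name)
Restart-Service -Name 'SQLSERVERAGENT' -Force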