Logging commands executed with Invoke-Command - sql

I recently hacked together a process that spins off multiple remote installations on servers; we are looking for a way to install SQL patches on 100+ remote servers automatically. I made a successful pass of this last Saturday and it worked very well.
I am not a PowerShell expert, or even a novice. I was able to get something working and would like to expand on it. The command below is what I am issuing in PowerShell; I would like to add some logging, such as each server's start and end time for applying the patch, and save it to a central table that all the servers can connect to.
My question: does anyone have any good resources that would get me down the path of what I am looking for, or that could help me advance my skills and improve my process?
The -ComputerName parameter points to a txt file with the list of server names I want patched in this group.
The -ScriptBlock parameter points to a batch file on the remote server that contains the command to execute the SQL patch via the command line:
"D:\DBA\SQLPatching\SQL_2012\SP3_CU2\SQLServer2012-KB3137746-x64.exe" /quiet /IAcceptSQLServerLicenseTerms /Action=Patch /AllInstances
Thanks
$jobWRM = Invoke-Command -ComputerName (Get-Content D:\DBA\SQLPatching\May2016\LLE\ServerLists\ServerList_EIQDBS01_1.txt) -ScriptBlock {D:\DBA\SQLPatching\SQL_2012\SP3_CU2\Patch-SQL2012_SP3_CU2.bat} -JobName WinRM -ThrottleLimit 16 -AsJob
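As a side note for anyone reusing this: because the command runs with -AsJob and -JobName WinRM, each server's state and output can be pulled back afterwards with the standard job cmdlets, along these lines:

# Wait for the remote runs to finish, then collect each server's output.
Get-Job -Name WinRM | Wait-Job | Receive-Job
# Or check progress without blocking:
Get-Job -Name WinRM | Format-Table Id, Name, State, Location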

If you're just looking to catch the execution state, you could put those computer names in a variable and foreach through them, writing completion out to a log.
If you want errors to give you more depth for troubleshooting, PSSessions will give you errors back (though I'm not sure the batch file you list will return good troubleshooting data).
You can find examples of both of these approaches on the web:
ForEach
PSSessions
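To make the ForEach idea concrete, here is a minimal sketch of that loop with per-server start/end times written to a central table. The CENTRALSQL instance, the DBA database, and the dbo.PatchLog table are illustrative assumptions, and Invoke-Sqlcmd needs the SqlServer (or older SQLPS) module:

# Sketch: run the patch per server and log start/end times to a central table.
# Assumes something like: CREATE TABLE dbo.PatchLog
#   (ServerName sysname, StartTime datetime2, EndTime datetime2, Result nvarchar(4000));
$servers = Get-Content D:\DBA\SQLPatching\May2016\LLE\ServerLists\ServerList_EIQDBS01_1.txt

foreach ($server in $servers) {
    $start = Get-Date
    try {
        Invoke-Command -ComputerName $server -ErrorAction Stop -ScriptBlock {
            & D:\DBA\SQLPatching\SQL_2012\SP3_CU2\Patch-SQL2012_SP3_CU2.bat
        }
        $result = 'Success'
    }
    catch {
        $result = "Failed: $($_.Exception.Message)"
    }
    $end = Get-Date

    # One row per server in the central log table.
    $insert = "INSERT INTO dbo.PatchLog (ServerName, StartTime, EndTime, Result)
               VALUES (N'$server', '$($start.ToString('s'))', '$($end.ToString('s'))', N'$($result -replace "'","''")')"
    Invoke-Sqlcmd -ServerInstance 'CENTRALSQL' -Database 'DBA' -Query $insert
}

Note this runs the servers sequentially rather than in parallel; to keep the -AsJob behavior, you could instead capture the timestamps inside the scriptblock and return them with the job output.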

Related

Automating Zope5 Database Pack

I tried asking on the Plone forums but no one had any good responses.
I am running Zope5, no ZeoServer, no Plone, with Apache as a frontend proxy.
In the old Zope2 there was a script called zodb-pack that could pack the database from the command line. This is no longer included with Zope5 and I am searching for a way to pack the db from the command line.
Also, Apache is set up for client certificate authentication, so I cannot do something like:
curl -X POST https://username:password@zope.domain.com
I also don't want to hardcode that type of curl statement because of the need to include the username and password.
My Zope is running in a Docker container, so I thought about doing something like:
source /zope5/bin/activate
python scriptname
with a Python script along the lines of:
from ZODB.config import databaseFromString
from ZODB.serialize import referencesf  # pack() needs this reference-finding callback

db = databaseFromString("<zodb_config>")
# db.pack() passes the current time and referencesf down to storage.pack();
# calling storage.pack(None, referencesf) directly fails, as None is not a pack time.
db.pack()
db.close()
# Note: FileStorage takes an exclusive lock, so this only works against a
# storage that a running Zope process does not already have open.
but I'm not sure that's the correct way to do this. Basically, I just want the bash script that automates the server's backups to pack the Zope DB before backing it up, but I need a command-line way to do so.
I cannot use any solution that requires me to modify how Zope runs, nor one that requires me to stop Zope to perform the pack.
Of course I can manually go to the ZMI's Control Panel and click Pack, but like I said, I am trying to automate it so it can run in off-peak hours.

echoid.exe remote execution issue (wrong Locking Code output)

I am trying to collect the locking codes of a server farm automatically.
So, I have echoid.exe and a batch file on each remote server.
The batch file simply executes echoid.exe and writes its output to a text file which I can parse.
The problem is that when I trigger the .bat file remotely, echoid.exe seems to execute on the initiating host (the one I'm using to send the execution command, e.g. through psexec) rather than on the remote host, meaning the locking code output is wrong. If the same .bat file is executed locally (and manually), the results are OK.
Any idea why? Does anyone know how I can run echoid remotely and get the correct results?
I have tried several remote approaches and they all failed and returned wrong results :(
Please help!
BTW, all the remote machines run Windows.
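One thing worth trying is WinRM (Invoke-Command) instead of psexec, so the executable unambiguously runs on the target machine and its stdout comes straight back. A minimal sketch, where the C:\Tools\echoid.exe path and the servers.txt list are placeholders:

# Run echoid.exe on each remote host over WinRM and collect the output centrally.
$servers = Get-Content .\servers.txt

$results = foreach ($server in $servers) {
    $output = Invoke-Command -ComputerName $server -ScriptBlock {
        # This runs on the remote host itself, so the locking code reflects that machine.
        & C:\Tools\echoid.exe 2>&1
    }
    [pscustomobject]@{ Server = $server; LockingCode = ($output -join "`n") }
}

$results | Export-Csv .\locking-codes.csv -NoTypeInformation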

PowerShell script to execute DDL statements on linked servers - not working when run using SSIS

I have a PowerShell script that loops through a list of SQL Servers and creates server logins and database users.
The script runs on a separate server, under the administrator credentials on that server, and connects to the other SQL Servers via linked servers.
#Get administrator credentials
$password = Get-Content C:\Powershell\General\password.txt | ConvertTo-SecureString;
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "DOMAIN\administrator", $password;
When this script is run manually (either directly in a PowerShell window or via a batch file at a command prompt) it works perfectly well. I am logged onto the executing server as administrator when running the script manually.
I then tried to run this PowerShell script using an SSIS package on the executing server, with an Execute Process Task that runs a batch file. The package was executed from a SQL Agent job. Although both the job and the package seemed to execute successfully, the DDL statements were not executed against the linked servers.
SQL Agent on the executing server is run under a designated Service Account. SSIS runs under the Network Service account.
Does anybody have any thoughts on what I might be doing wrong? I am happy to provide details of the script or anything else that is required.
Thanks
Ash
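A general gotcha worth noting with the credential snippet above, though it may or may not be the issue here: without -Key, ConvertTo-SecureString/ConvertFrom-SecureString use Windows DPAPI, so the password file can only be decrypted by the same Windows account on the same machine that created it. A script that works when run interactively as administrator can therefore fail under a service account. A sketch of a key-based variant follows; the aes.key path is an assumption, and keeping the key next to the password file is purely illustrative, not a security recommendation:

# One-time setup: generate a 256-bit key and store the password encrypted with it.
$key = New-Object byte[] 32
[System.Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($key)
$key | Set-Content C:\Powershell\General\aes.key        # lock this file down with ACLs
Read-Host -AsSecureString "Password" |
    ConvertFrom-SecureString -Key $key |
    Set-Content C:\Powershell\General\password.txt

# In the script: any account that can read both files can rebuild the credential.
$key = Get-Content C:\Powershell\General\aes.key
$password = Get-Content C:\Powershell\General\password.txt | ConvertTo-SecureString -Key $key
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "DOMAIN\administrator", $password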
UPDATE: OK, we have a little more information.
I took out the lines I posted above, as I have discovered I don't actually need the administrator credentials I was retrieving.
I logged onto the server with the script on it using the service account. As per @ElecticLlama's suggestion I set up a Profiler trace on the destination server. When running the script manually (or manually running a batch file that runs the PowerShell script) everything works well and the Profiler trace shows the DDL actions, under the service account login.
When running a job through SQL Agent (either a CmdExec job or an SSIS package) that runs the same batch file, I get the following error:
'Login failed for user 'DOMAIN\ServiceAccount'. Reason: Token-based server access validation failed with an infrastructure error.'
Anybody have any further thoughts?
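For anyone chasing the same error, one useful check is to see exactly which login and authentication scheme the destination server receives when the script connects. A sketch, run from the same context as the job step; DESTSERVER stands in for the destination server, and Invoke-Sqlcmd assumes the SqlServer (or SQLPS) module:

# Shows the login name and auth scheme the current connection authenticated with.
Invoke-Sqlcmd -ServerInstance 'DESTSERVER' -Query @"
SELECT SUSER_SNAME() AS login_name,
       c.auth_scheme AS auth_scheme   -- NTLM, KERBEROS, or SQL
FROM sys.dm_exec_connections AS c
WHERE c.session_id = @@SPID;
"@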
Thanks to everyone for their help. Once I got that last error, a quick search revealed I just had to restart SQL Agent, and now everything works as it should. Thanks in particular to @ElecticLlama for pointing me in the right direction.
Ash

How to solve the logon issue in a SQL job?

I have configured a SQL job which backs up the databases and then, in another step, transfers them to a remote location. At the command prompt my command works fine, but when I schedule it in a job I get this error:
Executed as user administrator. Logon Failure Unknown User Name or Bad Password. 0File(s) copied . Process Exit code 0. The step succedded.
I want to solve this issue, and I also want the job to report failure if the files do not get transferred, but it doesn't show any such message.
I just want the job to notify failure when no files get copied (i.e. '0 File(s) copied').
Thanks
Nitesh Kumar
One way is to use a Script Task to check whether there is a file you want to copy. If there is one, the process can proceed; if not, the step can result in an error. You do this by adding
Dts.TaskResult = (int)ScriptResults.Failure;
to the end of the Script Task logic.
Anyway, I don't know your package design, so there might be more suitable ways.
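If the step does not have to stay in SSIS, the same idea works as a CmdExec or PowerShell job step: do the copy in PowerShell and exit non-zero when nothing was copied, which makes the Agent step fail. A sketch with placeholder paths:

# Copy backups to the remote share and fail the job step when nothing is copied.
$source      = 'D:\Backups\*.bak'        # placeholder path
$destination = '\\REMOTE\BackupShare'    # placeholder share

$files = Get-ChildItem -Path $source -ErrorAction SilentlyContinue
if (-not $files) {
    Write-Error "No backup files found to copy."
    exit 1   # a non-zero exit code makes the Agent step report failure
}
$files | Copy-Item -Destination $destination
exit 0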
The issue has been solved. The remote folder was shared and accessible to everyone; my command worked fine from the command prompt, and any user was able to create their own files in that location and delete files from it.
The issue was related to the user. My job was being executed as servername\administrator, and the remote location's administrator password had been changed, which is why the bad password error occurred. I told my IT team about the problem, they reset the server password to the old one, and my job began to work fine.
The issue was solved.
I just want to know how my SQL job authenticates the server login, as I went through the script of my job and found nothing helpful regarding authentication.
Can anyone explain it to me?
Thanks
Nitesh Kumar
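For what it's worth on that last question: the credentials are not in the job script itself. A CmdExec or PowerShell job step runs under the SQL Agent service account (or a proxy account, if one is configured for the step), and that Windows identity is what the remote share authenticates against. A quick way to confirm which identity a step actually uses is to add a step that logs it, for example:

# Add as a CmdExec/PowerShell job step to see the Windows identity the step runs under.
whoami | Out-File C:\Temp\jobstep_identity.txt   # path is a placeholder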

SQL Hangs on FTP Command but works off Command Line

SQL hangs on this command:
EXEC master..xp_cmdshell 'ftp -s:C:\FTP\connect'
But it doesn't hang on any other command I've tried through xp_cmdshell (like echo, open, get); those work just fine, so I know SQL Server's permissions on the folder (and the download folder) are set properly.
...And when ftp -s:C:\FTP\connect is executed on the command line, the FTP transfer begins, and completes successfully.
The SQL command that is giving issues on this particular server worked completely fine on my other server. I'm really not sure what else needs to be done. Does anyone know why SQL hangs when I execute an FTP command, given everything else I've been through?
Most probably the server firewall is blocking the port for the command in the first instance (a restriction that might get bypassed when run from the command line).
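A quick way to test that theory from the SQL Server host is to check whether the FTP control port is reachable at all (a sketch; ftp.example.com stands in for your FTP server, and port 21 is assumed):

# Check FTP control-port reachability from the SQL Server host.
Test-NetConnection -ComputerName ftp.example.com -Port 21

If the port turns out to be reachable, another classic cause of this exact hang is the ftp script waiting for input: xp_cmdshell has no interactive console, so make sure the script file ends with bye (or quit).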