echoid.exe remote execution issue (wrong Locking Code output) - locking

I am trying to collect the locking codes of a server farm automatically.
So, on each remote server I have echoid.exe and a batch file.
The batch file simply executes echoid.exe and writes its output to a text file that I can parse.
The problem is that when I trigger the .bat file remotely, echoid.exe seems to execute on the controlling host (the one I use to send the execution command, e.g. through PsExec) rather than on the remote host, so the locking-code output is wrong. If the same .bat file is executed locally (and manually), the results are OK.
Any idea why? Does anyone know how I can run echoid remotely and get the correct results?
I have tried several remote-execution approaches and all of them failed and returned wrong results :(
Please help!
BTW, all remote machines run Windows.
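Once each host's output does land in a per-host text file, the parsing side is straightforward. A minimal Python sketch, assuming the output contains lines of the form `Locking Code N : value` (the sample line and the regex are assumptions; adjust them to whatever your echoid version actually prints):

```python
import re

# Hypothetical echoid output -- the exact format depends on your
# echoid version, so adjust the sample and the regex to match.
sample = """\
Sentinel RMS Development Kit
Locking Code 1 : 4-22d53
"""

def parse_locking_codes(text):
    # Collect every "Locking Code N : value" pair found in the output.
    return dict(re.findall(r"Locking Code\s*(\d+)\s*:\s*(\S+)", text))

print(parse_locking_codes(sample))  # -> {'1': '4-22d53'}
```

If the parse comes back empty or identical across hosts, that is a quick signal the wrong machine's echoid ran.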

Related

Block internet access with TestComplete

Is there any way to block internet access for the application under test with TestComplete? I'd like to test the application's reaction to losing its internet connection, but I have to do this from a CI tool, which means TestComplete has to block and unblock the connection itself.
You can do this using WMI directly from a script (see Working With WMI Objects in Scripts) or by executing a PowerShell script (see Running PowerShell Scripts From TestComplete).
For example, see this question to get a sample PS script:
Command/Powershell script to reset a network adapter
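The scripts in that question usually amount to shelling out to `netsh` (or `Disable-NetAdapter` on newer Windows versions). As a hedged sketch, this Python helper only builds the `netsh` command line for toggling an adapter; the adapter name "Ethernet" is an assumption, and on the Windows test host you would pass the result to `subprocess.run`:

```python
import subprocess

def netsh_toggle_cmd(adapter, enable):
    # Build the netsh command line that enables or disables a network
    # adapter (positional netsh syntax; adapter name is an assumption).
    state = "enable" if enable else "disable"
    return ["netsh", "interface", "set", "interface", adapter, state]

# On the Windows host you would execute it, e.g.:
#   subprocess.run(netsh_toggle_cmd("Ethernet", False), check=True)
print(netsh_toggle_cmd("Ethernet", False))
```

Note that disabling the adapter needs elevated rights, so the CI agent has to run with administrator privileges.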

Laravel keeping remote connection until all commands have finished

Toolset:
Laravel 5.2.*
LaravelCollective remote package ^5.2
Let's say I have a route http://example.com/npm. When I hit this route I process some request parameters and then SSH into a remote server using the LaravelCollective remote package.
After some time I see in my logs that the connection is closed. I know this because that message is logged after the SSH command, so my application tells me that my command executed successfully.
But when I go and check the server there is no node_modules folder; only after hitting the route about 10 times is it suddenly there.
That made me think that my connection is being closed even though the commands have not finished. To be sure, I started monitoring the process on the server with the following command:
ps aux
This showed that I got my success message while the process was still running on the server, which means the output I get is incorrect, and it causes a follow-up command (gulp production) to fail.
I dug a bit into the source code to see whether there is a way to keep that connection open, but no luck so far.
The question: can I keep this connection open until the commands are definitely finished so that my response to the end user is correct?
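Whatever the package exposes, the fix boils down to blocking on the command's exit status rather than on merely having sent the command (over SSH that means reading the channel's exit status before closing the session). A local Python analogue of the two behaviours, using `subprocess` as a stand-in for the SSH channel:

```python
import subprocess
import time

# Fire-and-forget: Popen returns immediately, the way a dropped SSH
# session reports "done" while npm install is still running remotely.
proc = subprocess.Popen(["sleep", "1"])
start = time.time()

# The fix: block until the command's exit status is actually available
# before reporting success (and before running any follow-up command
# such as gulp production).
exit_code = proc.wait()
elapsed = time.time() - start

print(exit_code)  # -> 0, and elapsed is at least the command's runtime
```

The same shape applies remotely: only report success to the end user once the exit status has been read, and treat a nonzero status as a failed request.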

Logging commands executed with Invoke-Command

I recently hacked together a process that spins off multiple remote installations on servers. We are looking for a way to install SQL Server patches on 100+ remote servers automatically. I made a successful pass of this last Saturday, and it worked very well.
I am not a PowerShell expert, or even a novice. I was able to get something working and would like to expand on it. The command below is what I am issuing in PowerShell, and I would like to add some logging, such as each server's start and end time for applying the patch. I would like to save this to a central table that all servers can connect to.
My question is: does anyone have any good resources that would get me down the path of what I am looking for, or that could help me advance my skills to improve my process?
The -ComputerName parameter points to a txt file with the list of server names that I want patched in this group.
The -ScriptBlock parameter points to a batch file on the remote server that has the command to execute the SQL patch via the command line.
"D:\DBA\SQLPatching\SQL_2012\SP3_CU2\SQLServer2012-KB3137746-x64.exe" /quiet /IAcceptSQLServerLicenseTerms /Action=Patch /AllInstances
$jobWRM = invoke-command -computerName (get-content D:\DBA\SQLPatching\May2016\LLE\ServerLists\ServerList_EIQDBS01_1.txt) -scriptblock {D:\DBA\SQLPatching\SQL_2012\SP3_CU2\Patch-SQL2012_SP3_CU2.bat} -jobname WinRM -throttlelimit 16 -AsJob
Thanks
If you're just looking to capture the execution state, you could put those computer names in a variable and foreach through them while writing completion status out to a log.
If you want errors that give you more depth for troubleshooting, PSSessions will give you errors back (though I'm not sure the batch file you list will produce good troubleshooting data).
You can find examples of both of these approaches on the web:
ForEach
PSSessions
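The foreach-with-logging idea can be sketched as follows; this is Python rather than PowerShell purely for illustration, and `patch_fn` is a placeholder for whatever actually runs the patch (e.g. your `Invoke-Command` call). The rows it returns are what you would bulk-insert into the central logging table:

```python
from datetime import datetime, timezone

def patch_with_logging(servers, patch_fn):
    # Wrap each server's patch step with start/end timestamps; the rows
    # returned are ready to insert into a central logging table.
    rows = []
    for server in servers:
        start = datetime.now(timezone.utc)
        status = patch_fn(server)  # placeholder for the real patch call
        end = datetime.now(timezone.utc)
        rows.append({"server": server, "start": start,
                     "end": end, "status": status})
    return rows

# Demo with example server names and a no-op "patch" so the sketch runs.
log = patch_with_logging(["EIQDBS01", "EIQDBS02"], lambda s: "ok")
print([row["server"] for row in log])  # -> ['EIQDBS01', 'EIQDBS02']
```

In PowerShell the equivalent is a `foreach` over the server list with `Get-Date` captured before and after each call, written out via `Write-Output` or an `Invoke-Sqlcmd` insert.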

How to run as different user without prompting user batch

I have a batch file that runs another batch file as a different user.
I also need to run the calling batch file remotely. Locally I can bypass having to enter the password with the /savecred option, but when I run the batch remotely I still get prompted for the password. It appears the connection then times out, because I'm dropped back to the PowerShell prompt on the machine I'm connecting from.
My batch looks like this:
runas.exe /env /savecred /user:sqlsvr_dba ".\myBatch.bat"
How can I run the remote batch from my local machine without having to enter the password? I've been trying to use PowerShell for this.
The same question bothered me too :-)
I've tried to work around this with schtasks and eventcreate. Here I posted my solution:
http://ss64.org/viewtopic.php?id=1539
If you want to run the script on a remote machine you can also try with wmic:
http://ss64.org/viewtopic.php?id=1495
Hope these help you.
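For reference, the schtasks approach comes down to two commands: create a task that runs under the target account (with its stored password), then trigger it, optionally on a remote host via /s. A hedged Python sketch that only builds those command lines; the task name, schedule, and use of `/rp *` (prompt once for the password at creation time) are assumptions to adapt:

```python
def schtasks_cmds(task_name, run_as_user, command, host=None):
    # Build the two schtasks invocations: one to create a task running
    # under another account, one to trigger it on demand.
    target = ["/s", host] if host else []
    create = ["schtasks", "/create", *target, "/tn", task_name,
              "/tr", command, "/sc", "once", "/st", "00:00",
              "/ru", run_as_user, "/rp", "*"]  # "*" prompts once for the password
    run = ["schtasks", "/run", *target, "/tn", task_name]
    return create, run

create, run = schtasks_cmds("RunMyBatch", "sqlsvr_dba",
                            r"C:\scripts\myBatch.bat", host="remotehost")
print(run)  # -> ['schtasks', '/run', '/s', 'remotehost', '/tn', 'RunMyBatch']
```

Because the password is stored with the task at creation time, the on-demand `/run` trigger never prompts, which is what makes this usable from a remote, non-interactive session.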

SQL Hangs on FTP Command but works off Command Line

SQL hangs on this command:
EXEC master..xp_cmdshell 'ftp -s:C:\FTP\connect'
But it doesn't hang on any other command I've tried via xp_cmdshell, like echo, open, and get, which all work just fine, so I know SQL Server's permissions to the folder (and the download folder) are set properly.
...And when ftp -s:C:\FTP\connect is executed on the command line, the FTP transfer begins, and completes successfully.
The SQL command that is giving issues on this particular server worked completely fine on my other server. I'm really not sure what else needs to be done. Does anyone know why SQL Server hangs when I execute an FTP command, beyond everything I've already tried?
Most probably the server firewall is blocking the port the command uses in the first instance (which might be bypassed when run from the command line).