Running an executable in a virtual machine from the base machine

I want to launch an executable in my virtual machine (Windows Server 2003, installed using VMware Workstation) from my base machine.
I tried the following command:
"C:\Program Files\VMware\VMware VIX\vmrun" -T server -h https://machine-name.domain-name:8333/sdk -u username -p password -gu guestusername -gp guestpassword runProgramInGuest "[standard] vmname/vmname.vmx" -activeWindow "C:\windows\system32\notepad.exe"
When I do this, Notepad doesn't show up, but Task Manager shows notepad.exe running under my account.
Thanks in advance

"C:\Program Files\VMware\VMware VIX\vmrun" -T server -h https://machine-name.domain-name:8333/sdk -u username -p password -gu guestusername -gp guestpassword runProgramInGuest "[standard] vmname/vmname.vmx" -activeWindow -interactive "C:\windows\system32\notepad.exe"
Adding the -interactive flag made it work.
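If the window still doesn't show up, a quick sanity check (assuming the same VIX setup as above; the host, credentials, and datastore path are the placeholders from the question) is to list the processes running in the guest with vmrun's listProcessesInGuest command:
"C:\Program Files\VMware\VMware VIX\vmrun" -T server -h https://machine-name.domain-name:8333/sdk -u username -p password -gu guestusername -gp guestpassword listProcessesInGuest "[standard] vmname/vmname.vmx"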

Related

Execute commands on remote server behind another server (jumphost) using Plink

I am trying to build an automation for PuTTY using Power Automate Desktop. I came across a solution that uses cmd to run commands through plink.
I used the following steps:
I added PuTTY's directory to the system PATH variable.
I used the following command (in cmd):
plink -ssh hostname@ipaddress -pw password -no-antispoof -m C:\commands.txt
I edited commands.txt:
ssh anotherIP -pw passwordForAnotherIP
cd /tmp
cat filename
When I run the command in cmd, I cannot enter the password for the other server that needs to be accessed from inside the first one. The error shown is
Bad Port 'w'
The server runs bash 4.2. How can I put the password inside the commands txt file so that the plink command picks it up?
A better solution is to use Plink's -proxycmd. The inner plink opens a raw tunnel (-nc) through the first host to anotherIP:22, and the outer plink then authenticates to anotherIP over that tunnel:
plink -ssh anotherIP -pw passwordForAnotherIP -no-antispoof -proxycmd "plink -ssh hostname@ipaddress -pw password -nc anotherIP:22" -m C:\commands.txt
with commands.txt containing only:
cd /tmp
cat filename
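If the output needs to be captured for the automation (for example so Power Automate Desktop can read it back), the same invocation can redirect stdout to a file; the output path here is only an assumption:
plink -ssh anotherIP -pw passwordForAnotherIP -no-antispoof -proxycmd "plink -ssh hostname@ipaddress -pw password -nc anotherIP:22" -m C:\commands.txt > C:\output.txt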
To answer your literal question:
The OpenSSH ssh has no -pw switch. See Automatically enter SSH password with script.
Additionally, your commands.txt won't do what you think anyway. It won't run the cd and cat within the ssh session; it will run them after the ssh returns, i.e. on ipaddress. How to do this properly is discussed in: Entering password to remote ssh through Plink after establishing a connection.

Error when opening tmux directly from ssh connection

I am trying to open tmux automatically when I connect to my office computer (a Mac running macOS Catalina).
I found the following solution outlined in a few answers and blog posts:
ssh <hostname> -t "tmux"
When I use this I get the following error:
bash: tmux: command not found
I'm confused because I can open tmux once the SSH connection is established, but not directly.
It looks like the tmux installation path is not present in your PATH variable when you ssh.
Check the path of the tmux installation on the remote machine using which tmux, and verify whether that path appears in the output of the following command (note the single quotes, so that $PATH is expanded on the remote machine rather than locally):
ssh <hostname> 'echo $PATH'
You can either use the full path
ssh <hostname> -t "/usr/bin/tmux"
or update the PATH settings for non-interactive shells.
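For example, if which tmux on the remote Mac reports /usr/local/bin/tmux (a typical Homebrew location, assumed here), either of the following should work; the second prepends the directory inline, so no remote startup files need editing:
ssh <hostname> -t "/usr/local/bin/tmux"
ssh <hostname> -t 'export PATH="/usr/local/bin:$PATH"; tmux'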

Fix for tput: No value for $TERM and no -T specified when running a remote shell script using SSH

I have a nodetimecheck.sh file on a server which contains commands like this:
echo
tput setaf 2; echo -n " What is my node's local time: "; tput setaf 7; date
When I log in to my server with SSH and execute ./nodetimecheck.sh, it displays properly.
However, if I try to execute the command from my local machine via ssh like this
ssh -i ~/.ssh/privkey username@serverip ./nodetimecheck.sh
It does display the time, but there is a nagging message
tput: No value for $TERM and no -T specified
Local machine running Ubuntu 18.04 LTS
Remote server on GCP running Ubuntu 18.04 LTS
Found the solution as follows: supply TERM=xterm as part of the ssh command.
ssh -i ~/.ssh/privkey username@serverip TERM=xterm ./nodetimecheck.sh
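An alternative sketch: forcing pseudo-terminal allocation with -t makes ssh pass your local TERM along to the remote side, so tput has a terminal type to work with:
ssh -t -i ~/.ssh/privkey username@serverip ./nodetimecheck.sh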

Use SSH commands in putty and/or psftp script for sftp server [duplicate]

I am looking to script something in batch which will need to run remote SSH commands on Linux. I want the output returned so I can either display it on the screen or log it.
I tried putty.exe -ssh user@host -pw password -m command_run but it doesn't return anything on my screen.
Anyone done this before?
The -m switch of PuTTY takes a path to a script file as an argument, not a command.
Reference: https://the.earth.li/~sgtatham/putty/latest/htmldoc/Chapter3.html#using-cmdline-m
So you have to save your command (command_run) to a plain text file (e.g. c:\path\command.txt) and pass that to PuTTY:
putty.exe -ssh user@host -pw password -m c:\path\command.txt
Though note that you should use Plink (a command-line connection tool from the PuTTY suite). It's a console application, so you can redirect its output to a file (which you cannot do with PuTTY).
The command-line syntax is identical, with output redirection added:
plink.exe -ssh user@host -pw password -m c:\path\command.txt > output.txt
See Using the command-line connection tool Plink.
And with Plink, you can actually provide the command directly on its command-line:
plink.exe -ssh user@host -pw password command > output.txt
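As a sketch of the original goal (display on screen and keep a log), a batch file could capture the output and then show it; the uptime command here is only an illustration:
plink.exe -ssh user@host -pw password uptime > output.txt
type output.txt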
Similar questions:
Automating running command on Linux from Windows using PuTTY
Executing command in Plink from a batch file
You can also use Bash on Ubuntu on Windows directly. E.g.,
bash -c "ssh -t user@computer 'cd /; sudo my-command'"
Per Martin Prikryl's comment below:
The -t enables terminal emulation. Whether you need the terminal emulation for sudo depends on configuration (and by default you do not need it, while many distributions override the default). On the contrary, many other commands need terminal emulation.
As an alternative option you could install OpenSSH (http://www.mls-software.com/opensshd.html) and then simply run ssh user@host command_run. (Note that the OpenSSH client has no -pw switch, as mentioned above, so you either enter the password at the prompt or set up key-based authentication.)
Edit: Following a response from user2687375: when installing, select client only. Once this is done you should be able to initiate SSH from the command line.
Then you can create an ssh batch script such as
@ECHO OFF
CLS
:MENU
ECHO.
ECHO ........................
ECHO SSH servers
ECHO ........................
ECHO.
ECHO 1 - Web Server 1
ECHO 2 - Web Server 2
ECHO E - EXIT
ECHO.
SET /P M=Type 1, 2 or E then press ENTER:
IF %M%==1 GOTO WEB1
IF %M%==2 GOTO WEB2
IF %M%==E GOTO :EOF
REM Redisplay the menu on any other input
GOTO MENU
REM ------------------------------
REM SSH Server details
REM ------------------------------
:WEB1
CLS
call ssh user@xxx.xxx.xxx.xxx
cmd /k
:WEB2
CLS
call ssh user@xxx.xxx.xxx.xxx
cmd /k

ssh connection to Vagrant virtual machine using Ansible fails

I'm new to Ansible. I set up an Ubuntu virtual machine using Vagrant. I'm able to ssh into the machine using ssh vagrant@172.16.23.228. I have created an SSH key with the same password as the VM, added it to the agent, and specified the path in my hosts file.
After following the instructions here I started to receive the following errors when running this command (ansible all --inventory-file=hosts.ini --module-name ping -u vagrant -vvvv):
I'm not sure what I'm missing from my set-up; what else do I need to check?
<172.16.23.228> ESTABLISH CONNECTION FOR USER: vagrant
<172.16.23.228> REMOTE_MODULE ping
<172.16.23.228> EXEC ssh -C -tt -vvv -o ControlMaster=auto -o ControlPersist=60s -o ControlPath="/Users/user/.ansible/cp/ansible-ssh-%h-%p-%r" -o Port=22 -o IdentityFile="~Users/user/.ssh/onemachine_rsa" -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 172.16.23.228 /bin/sh -c 'mkdir -p $HOME/.ansible/tmp/ansible-tmp-1451080871.59-247915080664557 && chmod a+rx $HOME/.ansible/tmp/ansible-tmp-1451080871.59-247915080664557 && echo $HOME/.ansible/tmp/ansible-tmp-1451080871.59-247915080664557'
172.16.23.228 | FAILED => SSH Error: tilde_expand_filename: No such user Users
while connecting to 172.16.23.228:22
It is sometimes useful to re-run the command using -vvvv, which prints SSH debug output to help diagnose the issue.
My hosts file looks like:
[testserver]
172.16.23.228 ansible_ssh_port=22 ansible_ssh_user=vagrant ansible_ssh_private_key_file=~Users/user/.ssh/onemachine_rsa
What you're doing can work, but I highly recommend using the built-in Ansible provisioner in Vagrant. It will make your life easier and improve your Vagrant skills at the same time. And if you need to execute any shell scripts, use the shell provisioner.
Providing this answer for the benefit of those who, like me, arrive late to the party. Recent Vagrant installations put a per-machine private key in a local directory instead of using the admittedly insecure shared private key for every VM. You'll have to create an ansible_hosts file like this one:
[vagrantboxes]
jessie ansible_ssh_port=2222 ansible_ssh_host=127.0.0.1
[vagrantboxes:vars]
ansible_ssh_user=vagrant
ansible_ssh_private_key_file=.vagrant/machines/default/virtualbox/private_key
The key part is the last line, which provides the path to the actual private key used by the virtual machine that was started from this particular directory.
The path to your ansible_ssh_private_key_file is incorrect. Try ansible_ssh_private_key_file=~/.ssh/onemachine_rsa instead. The tilde in this case expands to the home directory of your user on the local machine you're running Ansible from.
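With that fix applied, the inventory entry from the question would look like this (IP and key name taken from the question):
[testserver]
172.16.23.228 ansible_ssh_port=22 ansible_ssh_user=vagrant ansible_ssh_private_key_file=~/.ssh/onemachine_rsa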