Having an issue with the Install-Module command

I am trying to import the VMware module into PowerShell. When I run the command "Install-Module VMware.PowerCLI -Scope CurrentUser", I get this error message: the term 'Install-Module' is not recognized as the name of a cmdlet, function, script file...
Do you know how I can fix this? Thanks

The issue was resolved by using this command:
Install-Module -Name VMware.PowerCLI -SkipPublisherCheck -AllowClobber

Related

Linux PowerShell The term 'sqlpackage' is not recognized as a name of a cmdlet

Until now, everything was fine with this command in PowerShell, but after an update/reboot I'm having trouble...
sqlpackage /a:Extract /scs:"Data Source='$sourceServer';Initial Catalog='$sourceDatabase'; User ID='$sqlUser';Password='$sqlPass'" /tf:$dacpac /p:DacApplicationName=Ubiquite /p:IgnorePermissions=True /p:IgnoreUserLoginMappings=True >> $log
The same command works perfectly in bash, but PowerShell reports:
The term 'sqlpackage' is not recognized as the name of a cmdlet,...
I have already reinstalled sqlpackage following "Download and install sqlpackage".
I would appreciate some help.
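Reinstalling alone will not help if the install directory is not on PowerShell's PATH. A minimal sketch of the fix, which you can adapt to your real install location (the temp directory and stub executable below are stand-ins so the sketch runs anywhere; a common real location is the folder where you unzipped sqlpackage):

```shell
# Stand-in for the directory where sqlpackage was actually installed.
install_dir=$(mktemp -d)
printf '#!/bin/sh\necho sqlpackage-ok\n' > "$install_dir/sqlpackage"
chmod +x "$install_dir/sqlpackage"

# Prepend the install directory to PATH for this session; to make it
# permanent, add the same line to your profile.
PATH="$install_dir:$PATH"
export PATH

command -v sqlpackage   # now resolves to $install_dir/sqlpackage
```

After an update/reboot, a PATH entry added only to the running session is lost, which would explain why the command stopped resolving.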

Apache Airflow command not found with SSHOperator

I am trying to use the SSHOperator to SSH into a remote machine and run an external application through the command line. I have set up the SSH connection via the admin page.
This section of code is used to define the commands and the SSH connection to the external machine.
sshHook = SSHHook(ssh_conn_id='remote_comp')
command_1 ="""
cd /files/232-065/Rans
bash run.sh
"""
where 'run.sh' is the shell script:
#!/bin/sh
starccm+ -batch run_export.java Rans_Model.sim
Which simply runs the commercial software starccm+ with some options I have specified.
This section defines the task:
inlet_profile = SSHOperator(
    task_id='inlet_profile',
    ssh_hook=sshHook,
    command=command_1
)
I have confirmed the SSH connection works by giving a simple 'ls' command and checking the output.
The error that I get is:
bash run.sh, error: run.sh: line 2: starccm+: command not found
The command in 'run.sh' works when I am logged into the machine (it does not require a GUI). This makes me think that there is a problem with the SSH session and it is not the same as the one that Apache Airflow logs into, but I am not sure how to solve this problem.
Does anyone have any experience with this?
There is no issue with the SSH connection (at least judging from the error message). The issue is with the starccm+ installation path.
Please check the installation path of starccm+.
Check if the installation path is part of the $PATH environment variable:
$ echo $PATH
If not, then install it in a standard location like /bin or /usr/bin (provided these are included in $PATH), or export the installation directory into the PATH variable like this:
$ export PATH=$PATH:/<absolute_path>
It is not ideal, but if you struggle with setting the PATH variable you can run starccm+ by giving the full path:
/directory/where/star/is/installed/starccm+ -batch run_export.java Rans_Model.sim
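Your hunch about the SSH session is on the right track: a non-interactive SSH session (which is what the SSHOperator opens) does not source ~/.bash_profile, so PATH additions made there are invisible to run.sh. A sketch of a run.sh that sets PATH explicitly so it works in any session (the directory and the starccm+ stub below are stand-ins created for the demo; replace STARCCM_DIR with your real installation directory):

```shell
# Stand-in for the real starccm+ installation directory.
STARCCM_DIR=$(mktemp -d)
printf '#!/bin/sh\necho starccm-ran "$@"\n' > "$STARCCM_DIR/starccm+"
chmod +x "$STARCCM_DIR/starccm+"

# What the real run.sh should do before calling starccm+:
PATH="$STARCCM_DIR:$PATH"
export PATH

starccm+ -batch run_export.java Rans_Model.sim
```

Because the PATH export lives inside run.sh itself, the script behaves the same whether it is launched from an interactive login or from Airflow.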

Powershell not exiting

cd "C:\apache-tomcat-7.0.67\bin"
.\shutdown.bat
.\startup.bat
I have the above commands in a PowerShell file, but the problem is that PowerShell does not exit after .\startup.bat.
To solve this issue I used the following command, but then Apache is not working for me:
START "C:\apache-tomcat-7.0.67\bin\startup.bat"
If startup.bat does not exit, start it with Start-Process without the -Wait parameter.
Start-Process .\startup.bat
(use the relative path in the working directory, not a global path)

zsh: command not found: "ams_cds"

I am trying to get my Cadence Environment set up for designing my circuit. This is my first encounter with .zshrc files. I am working from my school server. I was instructed to run the command zsh first and then basically run this command:
ams_cds -tech h35b4 -mode fb
(specific to Cadence design kit). On running this, I get the following error:
zsh: command not found: ams_cds
I have been trying to resolve this "command not found" issue but have not succeeded in doing so. Any help/guidance will be appreciated.
Thanks a lot in advance.
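With vendor design kits like this, ams_cds is usually not a binary on the default PATH but a command defined by a setup script that must be sourced into the current shell first. A minimal sketch of that pattern, using a stub setup file so it runs anywhere (the real setup script path is site-specific; ask your school's CAD admin where it lives):

```shell
# Stand-in for the design kit's environment setup script.
setup=$(mktemp)
cat > "$setup" <<'EOF'
ams_cds() { echo "ams_cds invoked with: $*"; }
EOF

# Source it (note the leading dot) -- running it as a child process
# would define the command in a shell that immediately exits.
. "$setup"

ams_cds -tech h35b4 -mode fb
```

If the kit documentation gives you a setup file, add the `. /path/to/setup` line to your .zshrc so every new zsh session gets the command.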

Hive script not running in crontab with hadoop must be in the path error

After setting the Hadoop home path and prefix path in both .bashrc and /etc/profile, I am still getting the same error: Cannot find hadoop installation: $HADOOP_HOME or $HADOOP_PREFIX must be set or hadoop must be in the path.
I get this error when I run the script from crontab; from the hive> prompt it works fine.
Please help me figure out how to solve this.
Set $HADOOP_HOME in $HIVE_HOME/conf/hive-env.sh
Try loading the user's bash profile in the script, as below:
. ~/.bash_profile
Cron runs with a minimal environment and does not load the user's bash_profile, which is where the user-specific configuration (PATH, HADOOP_HOME, etc.) lives.
see the similar question Hbase commands not working in script executed via crontab
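The same idea as a runnable sketch of a cron-safe wrapper: source the profile (or set HADOOP_HOME directly) before calling hive. The stub profile and the /opt/hadoop path below are illustrative; in the real wrapper you would source ~/.bash_profile and use your actual Hadoop directory:

```shell
# Stand-in for ~/.bash_profile, which cron does not load on its own.
profile=$(mktemp)
echo 'export HADOOP_HOME=/opt/hadoop' > "$profile"

# First line of the real wrapper script: . ~/.bash_profile
. "$profile"

# With the environment loaded, hive can locate the hadoop installation.
echo "HADOOP_HOME is $HADOOP_HOME"
```

Point the crontab entry at this wrapper rather than at the hive script directly, so every scheduled run starts with the same environment as an interactive session.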