Recompose VM using VMware 5.5 PowerCLI

How can I recompose a VM v1 using an existing parent VM v2's snapshot s2?
After referring to a few documents, I found the Send-LinkedCloneRecompose command for recomposing a VM.
I am trying this command as follows:
$myVM = 'v1'
$clone = Get-DesktopVM -Name $myVM
$pool = Get-Pool -pool_id $clone.pool_id
$date = Get-Date
$date = $date.AddSeconds(10)
Write-Host "Recomposing" $clone.name
Send-LinkedCloneRecompose -machine_id $clone.machine_id -parentVMPath $pool.parentVMPath -parentSnapshotPath $pool.parentVMSnapshotPath -schedule $date -forcelogoff $true | Tee-Object -Variable vmState
Here I am getting this error:
Get-DesktopVM : The term 'Get-DesktopVM' is not recognized as the name of a
cmdlet, function, script file, or operable program. Check the spelling of the
name, or if a path was included, verify that the path is correct and try again.
I get the same error for the Get-Pool and Send-LinkedCloneRecompose cmdlets as well.
I am using VMware vSphere PowerCLI 5.5 Release 2 Patch 1.
Can anyone help me understand the problem here?

I had missed adding
& "C:\Program Files\VMware\VMware View\Server\extras\PowerShell\add-snapin.ps1"
before executing the above commands, so the cmdlets were not recognized.
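For reference, the working sequence ends up looking like this (a minimal sketch, assuming the default View install path shown above and the same VM name):
# Run the View script that registers the View PowerCLI snap-in so its cmdlets are available
& "C:\Program Files\VMware\VMware View\Server\extras\PowerShell\add-snapin.ps1"
$clone = Get-DesktopVM -Name 'v1'
$pool  = Get-Pool -pool_id $clone.pool_id
$when  = (Get-Date).AddSeconds(10)
Send-LinkedCloneRecompose -machine_id $clone.machine_id -parentVMPath $pool.parentVMPath -parentSnapshotPath $pool.parentVMSnapshotPath -schedule $when -forcelogoff $true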

Related

Microsoft.Azure.IpSecurityRestriction not found in Azure PowerShell script

I am trying to run some PowerShell commands and my script is failing on the following line:
$ipsr = New-Object Microsoft.Azure.IpSecurityRestriction
The error is:
Cannot find type [Microsoft.Azure.IpSecurityRestriction]: verify that the assembly containing this type is loaded
I am trying to run this “inline” in an Azure PowerShell task as part of my deployment pipeline. Is this supported or do I need to first import an assembly?
I can reproduce your issue. First, the type name should be Microsoft.Azure.Management.WebSites.Models.IpSecurityRestriction, not Microsoft.Azure.IpSecurityRestriction. Then make sure you have installed the Az.Websites PowerShell module and import it before creating the object, as below.
Import-Module -Name Az.Websites
New-Object Microsoft.Azure.Management.WebSites.Models.IpSecurityRestriction
Besides, importing the module is really only needed to load Microsoft.Azure.Management.Websites.dll, so you can also load the assembly directly with Add-Type; check the path of the .dll file on your machine, since the module version in the path may differ.
Add-Type -Path 'C:\Program Files\WindowsPowerShell\Modules\Az.Websites\1.1.0\Microsoft.Azure.Management.Websites.dll'
New-Object Microsoft.Azure.Management.WebSites.Models.IpSecurityRestriction
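Once the type resolves you can populate the object; here is a minimal sketch (the property names IpAddress, Action, Priority and Name are assumed from the Az.Websites model class, and the values are placeholders):
Import-Module -Name Az.Websites
$ipsr = New-Object Microsoft.Azure.Management.WebSites.Models.IpSecurityRestriction
$ipsr.Name      = 'AllowOfficeRange'   # placeholder rule name
$ipsr.IpAddress = '203.0.113.0/24'     # placeholder CIDR to allow
$ipsr.Action    = 'Allow'
$ipsr.Priority  = 100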

New-Item Behavior Changes When Using Runspace to Run Command Remotely

I am using VB.NET to open a PowerShell runspace using WSManConnectionInfo. The connection works fine. I am trying to run a New-Item command in order to create a new virtual directory.
New-Item "IIS:\Sites\ExternalInventory FTP\TestVT" -Type VirtualDirectory -physicalPath "C:\"
The command works fine when running it locally on the server. However, when run remotely through the runspace I get an error:
A parameter cannot be found that matches parameter name 'physicalPath'.
I initially tried to use the New-WebVirtualDirectory command to create the virtual directory, but this led to other errors. I need to use New-Item.
What is causing the difference in behavior of the New-Item command, and how can I use it to create a new virtual directory through the runspace?
Note: I have tried other commands through the runspace and they seem to work as expected.
EDITS
I know that the New-Item command does not have the parameter "physicalPath" listed in the docs: https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/new-item?view=powershell-6
The New-Item command is used in multiple places in the docs to create a virtual directory with the physicalPath parameter. See the links below:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/new-item?view=powershell-6
https://forums.iis.net/t/1223546.aspx?New+WebVirtualDirectory+does+not+work+when+invoked+remotely+and+a+UNC+path+is+used
As pointed out in the comments, I needed to include the WebAdministration module in my script.
Thanks, TheMadTechnitian.
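For reference, a minimal sketch of the remote script with the module loaded first (same site and path as in the question); importing WebAdministration is what makes the IIS: drive and its provider-specific dynamic parameters, such as -PhysicalPath, available in the remote session:
# Load the IIS provider so New-Item on the IIS: drive accepts -Type/-PhysicalPath
Import-Module WebAdministration
New-Item "IIS:\Sites\ExternalInventory FTP\TestVT" -Type VirtualDirectory -PhysicalPath "C:\"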

Error when backing up Azure SQL from PowerShell

I need to create and download a backup of a SQL database. I installed Azure PowerShell, but my script does not work.
$dt = Get-Date -Format yyyyMMddHHmmss
$dbname = 'AnimalTranslate'
Backup-SqlDatabase $dbname "C:\drops" "XXX.database.windows.net" "login" "password"
I see error:
Backup-Database : The term 'Backup-Database' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the
spelling of the name, or if a path was included, verify that the path is correct and try again.
How can this be fixed?
You can't use Backup-SqlDatabase to back up Azure SQL. You need to use one of the methods in this article:
http://msdn.microsoft.com/en-us/library/windowsazure/jj650016.aspx
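As one concrete option (not from the linked article, and hedged: the cmdlet below is from the newer Az.Sql module, and the resource group, server, storage account and keys are all placeholders), you can export the database to a .bacpac file in blob storage and download that:
# Requires the Az.Sql module and an existing blob container; all values are placeholders
$pwd = ConvertTo-SecureString 'password' -AsPlainText -Force
New-AzSqlDatabaseExport -ResourceGroupName 'MyResourceGroup' -ServerName 'xxx' `
    -DatabaseName 'AnimalTranslate' `
    -StorageKeyType 'StorageAccessKey' -StorageKey '<storage-account-key>' `
    -StorageUri 'https://mystorage.blob.core.windows.net/backups/AnimalTranslate.bacpac' `
    -AdministratorLogin 'login' -AdministratorLoginPassword $pwd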

How do I run a script on VxWorks Tornado Shell?

I am trying to run a script on the VxWorks shell, which will load a module.
I use a Perl script to telnet into the system, log in and get access to the shell.
I am able to run the basic commands like 'i', 'time', 'ls', 'pwd', 'h' and so on.
But I would like to run a script, say 'test.o'.
If I do <C:\Path\subfolder\test.o from the Tornado shell, the script file WILL run.
But I have connected using Telnet from Perl.
So I connect this way:
use Net::Telnet;
my $username = "username";
my $password = "password";
my $t = new Net::Telnet(Timeout=>10, Errmode=>'die');
$t->open('10.42.177.123');
$t->login($username,$password); # Logins as expected.
my @lines = $t->cmd('i'); # To test
print @lines; # This works
@lines = $t->cmd('<C:\\Path\\Subfolder\\test.o'); # This is not working for me. HELP!
print @lines; # Prints the error below
I get an error saying :
Unknown directory: /C:\Path\Subfolder
can't open input 'C:\Path\Subfolder\test.o
errno = 0x1f5
How do I run my script file if it is residing at a particular folder of the host PC?
I am able to run the script manually from the Tornado shell window, where the prompt looks like ->, so it is a working script. And as I have said, I am able to run the basic VxWorks shell commands ("built-in functions") and print their output.
Any help? [ My OS is Win7 ]
Thanks!
This issue is now resolved. There were two problems. One was that Tornado, another VxWorks client, was logged into the system at the same time that my Perl script was sending commands over Telnet; having two clients (Tornado and my script's Telnet session) connected at once did not work, even though the VxWorks OS on the embedded system was running a Telnet daemon.
As for the error above, it was a syntax problem. I should have used
$t->cmd('<\\Path\\subfolder\\test.o');
There is no need to give C:

How to use Cloudberry Powershell Snap-in (for Amazon S3) from within a scheduled SQL Agent Job?

I am trying to automate my SQL database backup process. My goal is to use the Cloudberry PowerShell cmdlets to give me direct control of and access to my S3 buckets. I am able to do this manually but cannot get my SQL jobs to work with this.
According to Cloudberry's installation instructions, I shouldn't have to register the Cloudberry PowerShell snap-in if PowerShell is already installed. I have found that to be false. I have tried to register it, both 64-bit and 32-bit, with no luck.
This works when executed manually/explicitly from the ISE:
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn
$today = Get-Date -format "yyyy.MM.dd.HH.mm.ss"
$key = "mykeygoeshere"
$secret = "mysecretgoeshere"
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "ProductionBackups/MyClient/log/" | Add-CloudFolder $today
$src = Get-CloudFilesystemConnection | Select-CloudFolder "X:\backups\MyClient\current\"
$src | Copy-CloudItem $destination -filter "log.trn"
When this script is executed in a SQL Agent job step, it fails with this message:
Executed as user: DB-MAIN\SYSTEM. A job step received an error at line 1 in a PowerShell script. The corresponding line is 'Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn'. Correct the script and reschedule the job. The error information returned by PowerShell is: 'The term 'Add-PSSnapin' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again. '. Process Exit Code -1. The step failed.
I read in this blog post that SQLPS.exe cannot execute 'Add-PSSnapin' commands? Is that true? I cannot find any clarification on the subject...
How can I automate sending my SQL backup files to the Amazon S3 cloud? I have tried everything. TNT Drive was a huge waste of time. I am hoping Cloudberry can do it; any tips?
You could use the Amazon AWS .NET SDK. You can download it from here:
http://aws.amazon.com/sdkfornet/
Here's an example function to download a file from S3:
function DownloadS3File([string]$bucket, [string]$file, [string]$localFile)
{
    # Load the AWS SDK assembly from whichever Program Files folder it was installed to
    if (Test-Path "C:\Program Files (x86)")
    {
        Add-Type -Path "C:\Program Files (x86)\AWS SDK for .NET\bin\AWSSDK.dll"
    }
    else
    {
        Add-Type -Path "C:\Program Files\AWS SDK for .NET\bin\AWSSDK.dll"
    }

    # Credentials are read from environment variables
    $secretKeyID = $env:AWS_ACCESS_KEY_ID
    $secretAccessKeyID = $env:AWS_SECRET_ACCESS_KEY
    $client = [Amazon.AWSClientFactory]::CreateAmazonS3Client($secretKeyID, $secretAccessKeyID)

    # Request the object from S3
    $request = New-Object -TypeName Amazon.S3.Model.GetObjectRequest
    $request.BucketName = $bucket
    $request.Key = $file
    $response = $client.GetObject($request)

    # Stream the response to the local file in 4 KB chunks
    $writer = New-Object System.IO.FileStream ($localFile, [System.IO.FileMode]::Create)
    [byte[]]$buffer = New-Object byte[] 4096
    [int]$total = [int]$count = 0
    do
    {
        $count = $response.ResponseStream.Read($buffer, 0, $buffer.Length)
        $writer.Write($buffer, 0, $count)
    }
    while ($count -gt 0)

    $response.ResponseStream.Close()
    $writer.Close()
    echo "File downloaded: $localFile"
}
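A call from the job step would then look something like this (the bucket name, key, and local path below are hypothetical placeholders):
DownloadS3File "mycompany-backups" "ProductionBackups/MyClient/log/log.trn" "X:\backups\MyClient\log.trn"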