How to add more than two devices with PowerCLI

I am trying to set up hardware passthrough on an ESXi 6.0 server with the following PowerCLI script:
Connect-VIServer $server -User $user -Password $pwd
$AllVirtualMachines = Get-VM -Location "Dev Machines"
foreach ($VirtualMachine in $AllVirtualMachines)
{
    Get-PassthroughDevice -VM $VirtualMachine | Remove-PassthroughDevice -Confirm:$false
}
foreach ($VirtualMachine in $AllVirtualMachines)
{
    $PciDeviceList = Get-PassthroughDevice -VMHost 172.16.7.130 -Type Pci
    Add-PassthroughDevice -VM $VirtualMachine -PassthroughDevice $PciDeviceList[0] -Confirm:$false
}
foreach ($VirtualMachine in $AllVirtualMachines)
{
    $PciDeviceList = Get-PassthroughDevice -VMHost 172.16.7.130 -Type Pci
    Add-PassthroughDevice -VM $VirtualMachine -PassthroughDevice $PciDeviceList[1] -Confirm:$false
}
foreach ($VirtualMachine in $AllVirtualMachines)
{
    $PciDeviceList = Get-PassthroughDevice -VMHost 172.16.7.130 -Type Pci
    Add-PassthroughDevice -VM $VirtualMachine -PassthroughDevice $PciDeviceList[2] -Confirm:$false
}
foreach ($VirtualMachine in $AllVirtualMachines)
{
    $PciDeviceList = Get-PassthroughDevice -VMHost 172.16.7.130 -Type Pci
    Add-PassthroughDevice -VM $VirtualMachine -PassthroughDevice $PciDeviceList[3] -Confirm:$false
}
...and it works great...for two devices. After any two PCI devices are set up for passthrough I get the message "The maximum number of PCI devices has been reached". However, I can go in via the GUI and add all the PCI devices I want...I've had as many as nine devices on one VM.
The devices I am trying to add are NVMe drives, if that matters.
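As an aside (this does not address the two-device limit): the four copy-pasted loops above can be collapsed into a single loop over device indexes. A minimal sketch, assuming the same host IP, folder name and device order as in the script:
# Minimal sketch only; behaviour is the same as the repeated loops above.
$PciDeviceList = Get-PassthroughDevice -VMHost 172.16.7.130 -Type Pci
foreach ($VirtualMachine in $AllVirtualMachines)
{
    # Remove any existing passthrough devices from the VM
    Get-PassthroughDevice -VM $VirtualMachine | Remove-PassthroughDevice -Confirm:$false
    # Add the first four PCI devices, one index at a time
    foreach ($Index in 0..3)
    {
        Add-PassthroughDevice -VM $VirtualMachine -PassthroughDevice $PciDeviceList[$Index] -Confirm:$false
    }
}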

Related

Authenticate Azure Blob Storage Account in cloud using a runbook

In an Azure Analysis Services Tabular model (compatibility level 1400) I imported a Blob storage account as a data source. Its Authentication Kind is Key, and the key is a static key.
But when refreshing the Tabular model using a runbook in an Automation Account (cloud PowerShell), is there a way to pass the key/credentials so that it can authenticate?
Otherwise PowerShell fails with the message below:
The given credential is missing a required property. Data source kind: AzureBlobs. Authentication kind: Key. Property name: Key. The exception was raised by the IDbConnection interface.
Here is the source definition copied from the Model.bim file:
{
  "createOrReplace": {
    "object": {
      "database": "azureanalysisservicesdatabase",
      "dataSource": "OMSLogs"
    },
    "dataSource": {
      "type": "structured",
      "name": "OMSLogs",
      "connectionDetails": {
        "protocol": "azure-blobs",
        "address": {
          "account": "storage",
          "domain": "blob.core.windows.net"
        },
        "authentication": null,
        "query": null
      },
      "credential": {
        "AuthenticationKind": "Key",
        "kind": "AzureBlobs",
        "path": "https://storage.blob.core.windows.net/",
        "PrivacySetting": "Organizational"
      }
    }
  }
}
This is the code I ran in PowerShell to process the database:
Invoke-ProcessASDatabase -databasename $DatabaseName -server $AnalysisServerName -RefreshType "Full" -Credential $SPCredential
Okay, I also hit a similar issue and found the solution: add "Key" to the "credential" object:
"credential": {
"AuthenticationKind": "Key",
"kind": "AzureBlobs",
"path": "https://storage.blob.core.windows.net/",
"PrivacySetting": "Organizational",
"Key": "<StorageAccountKey>"
}
This isn't well documented by Microsoft, but it worked for me.
Update with PowerShell sample:
Get-ChildItem -Filter "drop" -Recurse -Path $sourcePath -Directory |
Get-ChildItem -recurse -filter *.asdatabase -file | ForEach-Object {
$filename = $_.fullname
$generatedFile = $buildPath + $_.BaseName + ".xmla"
"Processing $filename"
& $deploymentWizard $filename /o:$generatedFile
# Have to add Blob Key now, as Deployment Wizard doesn't like
# adding the Key (bug maybe? Or DeloyWizard isn't up to date)
$file = Get-Content $generatedFile -Raw | ConvertFrom-Json
$file.createOrReplace.database.model.dataSources | ForEach-Object {
# Add Blob Key to credential object
if ($_.name.StartsWith("AzureBlobs/")) {
$_.credential | Add-Member -Name "Key" -Value $storageKey -MemberType NoteProperty -Force
}
}
$file = $file | ConvertTo-Json -Depth 32
$file | Set-Content -Path $generatedFile -Encoding utf8
}
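For a quick sanity check that the key actually made it into a generated file, something like this can be run afterwards (a minimal sketch; the .xmla path is a hypothetical example, and the JSON layout is the one shown in the sample above):
# Hedged sketch: verify the Key property landed on each AzureBlobs data source.
$generatedFile = "C:\build\Model.xmla"   # hypothetical path to one generated file
$patched = Get-Content $generatedFile -Raw | ConvertFrom-Json
$patched.createOrReplace.database.model.dataSources |
    Where-Object { $_.name.StartsWith("AzureBlobs/") } |
    ForEach-Object { "{0}: Key present = {1}" -f $_.name, [bool]($_.credential.Key) }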

Terraform remote-exec on Windows with SSH

I have set up a Windows server and installed SSH using Chocolatey. If I run the commands manually I have no problem connecting and running them. When I try to use Terraform to run my commands it connects successfully but doesn't run any commands.
I started by using WinRM, and I could run commands that way, but due to a problem with creating a Service Fabric cluster over WinRM I decided to try SSH instead. When running things manually it worked and the cluster came up, so that seems to be the way forward.
I have set up a Linux VM and got SSH working by using the private key, so I tried the same config on the Windows machine as I used with the Linux VM, but it still asked me for my password.
What could be the reason that I can run commands over SSH manually, but with Terraform it only connects and no commands are run? I am running this on OpenStack with Windows Server 2016.
null_resource.sf_cluster_install (remote-exec): Connecting to remote host via SSH...
null_resource.sf_cluster_install (remote-exec): Host: 1.1.1.1
null_resource.sf_cluster_install (remote-exec): User: Administrator
null_resource.sf_cluster_install (remote-exec): Password: true
null_resource.sf_cluster_install (remote-exec): Private key: false
null_resource.sf_cluster_install (remote-exec): SSH Agent: false
null_resource.sf_cluster_install (remote-exec): Checking Host Key: false
null_resource.sf_cluster_install (remote-exec): Connected!
null_resource.sf_cluster_install: Creation complete after 4s (ID: 5017581117349235118)
Here is the script I'm using to run the commands:
resource "null_resource" "sf_cluster_install" {
# count = "${local.sf_count}"
depends_on = ["null_resource.copy_sf_package"]
# Changes to any instance of the cluster requires re-provisioning
triggers = {
cluster_instance_ids = "${openstack_compute_instance_v2.sf_servers.0.id}"
}
connection = {
type = "ssh"
host = "${openstack_networking_floatingip_v2.sf_floatIP.0.address}"
user = "Administrator"
# private_key = "${file("~/.ssh/id_rsa")}"
password = "${var.admin_pass}"
}
provisioner "remote-exec" {
inline = [
"echo hello",
"powershell.exe Write-Host hello",
"powershell.exe New-Item C:/tmp/hello.txt -type file"
]
}
}
Put the connection block inside the provisioner block:
provisioner "remote-exec" {
connection = {
type = "ssh"
...
}
inline = [
"echo hello",
"powershell.exe Write-Host hello",
"powershell.exe New-Item C:/tmp/hello.txt -type file"
]
}

MSYS2 path to Windows clipboard

Is there a way to copy a Unix path from the MSYS2 bash shell to the Windows clipboard?
A workaround is to open Windows Explorer in the current directory: /c/windows/explorer .
The MSYS2 pwd command has a -W switch to output the current path as a Windows path (with forward slashes).
The Windows clipboard can be accessed as a Unix device: /dev/clipboard
Putting these together gives a shell function like this:
# pathw [-c] [dir]  -- print a directory as a Windows path; -c also copies it to the clipboard
pathw () {
    local p=''
    local clip=false
    if [ "$1x" = "-cx" ]; then
        clip=true
        shift
    fi
    if [ "$1x" = "x" ]; then
        p=$(pwd -W)
    else
        p=$(cd "$1" && pwd -W)
    fi
    p=$(echo "$p" | sed 's|/|\\|g')
    echo "$p"
    if [ "$clip" = true ]; then
        echo "$p" > /dev/clipboard
    fi
}
pathw ~
C:\msys64\home\weberjn
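As an aside, the same conversion-and-copy can be done from the Windows side with PowerShell 5+ and Set-Clipboard. A minimal sketch, reusing the example path from the output above:
# Hedged sketch: turn a forward-slash path into backslashes and put it on the clipboard.
$p = 'C:/msys64/home/weberjn'
($p -replace '/', '\') | Set-Clipboard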

How to connect and create a file on a remote PC using SSH and an Expect script?

When I run my program, this loop should copy each of my files $file into the corresponding directory $dir. Instead, it just creates a new file named $file in $myDest (it does not create a new directory $dir to put the $file in).
foreach file $FILES dir $DIRECTORIES {
    set timeout -1;
    puts "\nFichier : $file \n"
    puts "Repertoire : $dir \n"
    spawn scp -p -r "$mySource/$file" "$myDest/$dir"
    expect -re "(.*)assword: " {sleep 1; send -- "$pass\r" }
    expect -timeout 3600 eof
}
So I tried to add an mkdir command so that it creates the directory on the remote PC, but it does not work.
foreach file $FILES dir $DIRECTORIES {
    set timeout -1;
    puts "\nFichier : $file \n"
    puts "Repertoire : $dir \n"
    spawn ssh marpic@192.168.110.90 'mkdir $path/$dir'
    expect -re "(.*)assword: " {sleep 1; send -- "$pass\r" }
    expect eof
    spawn scp -p -r "$mySource/$file" "$myDest/$dir"
    expect -re "(.*)assword: " {sleep 1; send -- "$pass\r" }
    expect -timeout 3600 eof
}
Error output:
root@raspberrypi:~# ./recupRaspFiles.sh
Fichier : 2018-03-07_09-34-24_R_HOURS_Q2
Repertoire : 2018-03-07
spawn ssh marpic@192.168.110.90 'mkdir /home/marpic/muonic_data/Data_Q2/2018-03-07'
marpic@192.168.110.90's password:
bash: mkdir /home/marpic/muonic_data/Data_Q2/2018-03-07: Aucun fichier ou dossier de ce type
spawn scp -p -r /root/muonic_data/2018-03-07_09-34-24_R_HOURS_Q2 marpic@192.168.110.90:/home/marpic/muonic_data/Data_Q2/2018-03-07
[...]
(The French error message means "No such file or directory".) Does anyone have a solution?

Run a PowerShell script from SQL Server Agent (step type: PowerShell)

I am trying to run a simple PowerShell script from a SQL Server Agent job.
The script:
[string]$SrcFolder = "G:\MSSQL\Test";
[string]$TrgFolder = "\\xx.xx.xx.xxx\d$\sql\logshipping" ;
if (-not(Get-Module -Name SQLPS))
{
    if (Get-Module -ListAvailable -Name SQLPS) {
        Push-Location
        Import-Module -Name SQLPS -DisableNameChecking
        Pop-Location
    };
};
if ($SrcFolder -eq $null -or $TrgFolder -eq $null )
{
    Write-Host "The source Folder = $SrcFolder ,OR target folder = $TrgFolder is not valid/Null";
};
$prafix = "[A-Za-z]+_[0-9]+_[0-9].trn" ;
Set-Location -Path C:\ ;
# Copy to Destination
foreach ($file in gci -Path $SrcFolder | Where-Object { $_.Mode -eq '-a---' -and $_.Extension -eq '.trn' -and $_.Name -match $prafix })
{
    write-host "Starting Copy File: $($file.FullName) ." ;
    Copy-Item -Path $file.FullName -Destination $TrgFolder -Force -ErrorAction Stop ;
    if (Test-Path -LiteralPath "$TrgFolder\$($file.Name)")
    {
        write-host "End Copy File: $($file.FullName) ." ;
        Move-Item -Path $file.FullName -Destination "$SrcFolder\Moved" -Force ;
    }
    else
    {
        Write-Host "The Copy File: $TrgFolder\$($file.BaseName) . Failed "
    };
}
What the script does: copy the transaction log (.trn) files to a remote server, then check whether each file exists on the remote server, and if it does, move the file to a local "Moved" folder.
The job fails with this message:
Date                9/1/2016 6:29:31 PM
Log                 Job History (LS_Manual-Copy)
Step ID             3
Server              SQL2012ENT-IR-3
Job Name            LS_Manual-Copy
Step Name           Delete Old Logs
Duration            00:00:00
Sql Severity        0
Sql Message ID      0
Operator Emailed
Operator Net sent
Operator Paged
Retries Attempted   0

Message
Unable to start execution of step 3 (reason: line(23): Syntax error). The step failed.
The SQL Server Agent account is a member of the Administrators group.
Please help.
Thanks.
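Not a definitive answer, but one way to narrow this down is to rule out the script body itself: save the step's script to a .ps1 file and run it from a plain PowerShell console (or a CmdExec-style command) to see whether standard PowerShell reports the same syntax error. A minimal sketch; the file path is hypothetical:
# Hedged sketch: run the same script body outside the SQL Agent PowerShell subsystem.
# C:\Scripts\LS_Manual-Copy.ps1 is a hypothetical path containing the step's script text.
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\LS_Manual-Copy.ps1"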