We had successfully been using New-AzureRmSqlDatabaseCopy in an Octopus Deploy step to copy our production database into a staging database. It recently stopped working with the following vague error:
New-AzureRmSqlDatabaseCopy : 40687: The operation cannot be performed on the database 'databaseMcDatabase' in its current state.
At C:\Octopus\Work\20170623140313-4142\Script.ps1:9 char:1
+ New-AzureRmSqlDatabaseCopy -CopyDatabaseName $DatabaseName -DatabaseName Source ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [New-AzureRmSqlDatabaseCopy], CloudException
+ FullyQualifiedErrorId : Hyak.Common.CloudException,Microsoft.Azure.Commands.Sql.Replication.Cmdlet.NewAzureSqlDatabaseCopy
The issue turned out to be insufficient space in our Azure Elastic Pool. I discovered I could copy a smaller database without any issue, and when I removed the -ElasticPoolName option from the command I was able to copy the 60 GB database as well.
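For reference, a minimal sketch of the copy step with and without the pool assignment (the resource group, server, and database names below are placeholders, not the real ones from our deployment):

# Original step: copy directly into the elastic pool (fails with error 40687 when the pool lacks space)
New-AzureRmSqlDatabaseCopy -ResourceGroupName "rg-example" -ServerName "sql-example" -DatabaseName "SourceDb" -CopyDatabaseName "StagingDb" -ElasticPoolName "staging-pool"

# Workaround: omit -ElasticPoolName so the copy is created as a standalone database
New-AzureRmSqlDatabaseCopy -ResourceGroupName "rg-example" -ServerName "sql-example" -DatabaseName "SourceDb" -CopyDatabaseName "StagingDb"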
Related
Error message:
react-native : File C:\Users\pc\AppData\Roaming\npm\react-native.ps1 cannot be loaded. The
file C:\Users\pc\AppData\Roaming\npm\react-native.ps1 is not digitally signed. You cannot run
this script on the current system. For more information about running scripts and setting
execution policy, see about_Execution_Policies at
https:/go.microsoft.com/fwlink/?LinkID=135170.
At line:1 char:1
+ react-native link
+ ~~~~~~~~~~~~
+ CategoryInfo : SecurityError: (:) [], PSSecurityException
+ FullyQualifiedErrorId : UnauthorizedAccess
How can I solve this? I have installed npm, Node.js, and the JDK. I was following the Okta authentication tutorial, and at the react-native link step I get this error.
Try setting your execution policy in PowerShell to RemoteSigned.
Open a new PowerShell administrator window and run Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser.
There’s also a way to allow running PowerShell scripts on Windows in Settings somewhere, but I don’t remember where. Probably in Developer Settings.
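For completeness, a quick way to apply the change and confirm it took effect (these are standard PowerShell cmdlets; the scope is the one suggested above):

# Set the policy for the current user only, then verify it
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
Get-ExecutionPolicy -List   # CurrentUser should now show RemoteSigned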
Nothing to worry about; just run PowerShell as administrator and run the command below. The steps:
Go to the Start Menu
Type PowerShell
Right-click PowerShell and choose Run as administrator
Run Set-ExecutionPolicy RemoteSigned in PowerShell
It worked for me.
expo : File D:\Users\*user*\AppData\Roaming\npm\expo.ps1 cannot be loaded because running scripts is
disabled on this system. For more information, see about_Execution_Policies at
https:/go.microsoft.com/fwlink/?LinkID=135170.
At line:1 char:1
expo start
+ CategoryInfo : SecurityError: (:) [], PSSecurityException
+ FullyQualifiedErrorId : UnauthorizedAccess
If you see this error, follow these steps:
Open Windows PowerShell with Run as Administrator.
Use Get-ExecutionPolicy on Windows PowerShell to see the execution policy.
Most probably you will see Restricted as the output; that is the reason running scripts is disabled on this system.
Now we need to change this policy to allow the operation. Use this command to make it "Unrestricted":
Set-ExecutionPolicy Unrestricted
Here you will get a prompt message and you should press 'Y' to change from Restricted to Unrestricted.
To confirm, you can check the execution policy with Get-ExecutionPolicy again.
The output should now be Unrestricted.
Now you can use the expo start command on your machine.
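Putting those steps together, a typical elevated PowerShell session looks something like this (the answer uses Unrestricted; RemoteSigned, as in the earlier answer, is a more conservative alternative):

Get-ExecutionPolicy               # likely prints Restricted
Set-ExecutionPolicy Unrestricted  # press 'Y' at the prompt
Get-ExecutionPolicy               # should now print Unrestricted
expo start                        # the previously blocked command should run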
I am running a script on Windows Server 2008 with PowerShell V2 that collects the AppPool names from IIS, deletes all of them, and then deletes all of the websites and their contents. No issue. Works beautifully. Now, run that same script on Windows Server 2012 with PowerShell V4 against the same pools and websites, and it doesn't delete everything because of spaces in the names of the AppPools. Why is this only an issue in PowerShell V4?
Also, it doesn't matter in V4 whether I double-quote the $site variable; it still throws the same error. Again, this is just fine in V2, and it runs WAY faster there: the exact same pools, sites, and content can be deleted in about 20 seconds on the 2008 servers, while on 2012 it takes several minutes, throws errors like the one below, and skips some things. This is what the script looks like:
....
$msdeploy=Get-Command 'C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe'
foreach ($site in $list.WebAppName) {
write-host $site
& $msdeploy -verb:delete -dest:appPoolConfig=$site -skip:objectname=rootwebconfig32 -skip:objectname=httpCert -skip:objectname=machineconfig32
}
The output looks like this, for example:
2.0 DefaultAppPool
msdeploy.exe : Error: Unrecognized argument '"-dest:"appPoolConfig=2.0'. All arguments must begin with "-".
At line:19 char:5
+ & $msdeploy -verb:delete -dest:appPoolConfig=$site –skip:objectname=rootwe ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (Error: Unrecogn...begin with "-".:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
Error count: 1.
You are doing it wrong. From the MSDN documentation on Web Deploy Command Line Syntax, an excerpt:
With a minor modification to its usual syntax, Web Deploy commands can be run from a Windows PowerShell prompt. To do this, change the colon character (:) after the verb, source, and dest arguments of the Web Deploy command to an equal sign (=). In the following example, compare the Web Deploy command with its PowerShell version.
Web Deploy command: msdeploy -verb:sync -source:metakey=/lm/w3svc/1 -dest:metakey=/lm/w3svc/2 -verbose
PowerShell command: .\msdeploy.exe -verb=sync -source=metakey=/lm/w3svc/1 -dest=metakey=/lm/w3svc/2 -verbose
Your command syntax should be:
& $msdeploy -verb=delete -dest=appPoolConfig=$($site.Trim()) -skip=objectname=rootwebconfig32 -skip=objectname=httpCert -skip=objectname=machineconfig32
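Applied to the loop from the question, that would look something like the sketch below (the variable names are the ones from the question; quoting the whole -dest argument is a guard against AppPool names that contain spaces, which is what appears to trip up V4):

$msdeploy = Get-Command 'C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe'
foreach ($site in $list.WebAppName) {
    Write-Host $site
    # Equal signs instead of colons, and the whole -dest argument quoted so a space in $site
    # (e.g. "2.0 DefaultAppPool") is not treated as an argument separator
    & $msdeploy -verb=delete "-dest=appPoolConfig=$($site.Trim())" -skip=objectname=rootwebconfig32 -skip=objectname=httpCert -skip=objectname=machineconfig32
}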
I would try encapsulating your param value in quotes and the statement in braces, like below:
& { & $msdeploy -verb=delete -dest=appPoolConfig="$site" -skip=objectname=rootwebconfig32 -skip=objectname=httpCert -skip=objectname=machineconfig32 }
I am installing the NServiceBus infrastructure using PowerShell cmdlets. I tried it on some test virtual machines and it seems to work OK. However, in production, on a Windows 2008 R2 machine, I get the following error:
PS C:\temp\mcbus\1> Install-NServiceBusPerformanceCounters
Install-NServiceBusPerformanceCounters : Category does not exist.
At line:1 char:1
+ Install-NServiceBusPerformanceCounters
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Install-NServiceBusPerformanceCounters], InvalidOperationException
+ FullyQualifiedErrorId : System.InvalidOperationException,NServiceBus.PowerShell.InstallPerformanceCounters
I tried to do lodctr /R but it did not help.
I also found that the category can be removed by this command
[Diagnostics.PerformanceCounterCategory]::Delete( "NServiceBus" )
This really does work: when I go to the PowerShell window, issue this command, and then run Install-NServiceBusPerformanceCounters, it succeeds. If I run Install-NServiceBusPerformanceCounters again, it says that the performance counters already exist, so everything is fine. NSB.Host.exe also starts properly.
However, when I run my msi, from where the Powershell script is invoked, the error returns.
There is a pure PowerShell version of the NServiceBus perf counters install that is currently being worked on. It can be found here: https://github.com/Particular/Packages.PerfCounters/blob/master/src/tools/setup.ps1
It is actually designed to be called from a Chocolatey package (http://chocolatey.org/packages/nservicebus.perfcounters.install) but should be able to be called directly.
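At its core, such a script is just the System.Diagnostics API; a minimal sketch of deleting and recreating the category (the counter name and help text below are illustrative placeholders, not the exact set NServiceBus installs):

# Remove a broken or half-created category, as in the question
if ([System.Diagnostics.PerformanceCounterCategory]::Exists("NServiceBus")) {
    [System.Diagnostics.PerformanceCounterCategory]::Delete("NServiceBus")
}

# Recreate the category with its counters
$counters = New-Object System.Diagnostics.CounterCreationDataCollection
$counter = New-Object System.Diagnostics.CounterCreationData -ArgumentList "Critical Time", "Illustrative counter help text.", ([System.Diagnostics.PerformanceCounterType]::NumberOfItems32)
[void]$counters.Add($counter)

$categoryType = [System.Diagnostics.PerformanceCounterCategoryType]::MultiInstance
[void][System.Diagnostics.PerformanceCounterCategory]::Create("NServiceBus", "NServiceBus statistics", $categoryType, $counters)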
Does anyone know how I can load a DLL on remote servers without having to put a copy of it on each one? I am connecting to the servers with persistent connections and running the Invoke-Command cmdlet.
I am using DotNetZip to back up folders on about 13 servers. Everything works locally, but when the loop gets to a remote server (the first one in the array is the local server), it errors because it doesn't see the DLL on the remote server.
I execute this script on one server and it should zip up folders on each remote server:
foreach ($i in $appServers) {
    $sessionForI = New-PSSession -ComputerName $i
    Invoke-Command -Session $sessionForI -ScriptBlock {
        if (!(Test-Path -Path C:\newDeploy)) {
            New-Item C:\newDeploy -Type directory
        }
        [System.Reflection.Assembly]::LoadFrom("C:\newDeploy\Ionic.Zip.dll")
        $directoryToZip = "C:\Program Files (x86)\SubDir\$folder"
        $zipfile = New-Object Ionic.Zip.ZipFile
        $e = $zipfile.AddSelectedFiles("name != '*.e2e'", $directoryToZip, "", 1)
        if (!(Test-Path -Path C:\newDeploy\backup)) {
            New-Item C:\newDeploy\backup -Type directory
        }
        $zipfile.Save("C:\newDeploy\backup\" + $folder + ".zip")
        $zipfile.Dispose()
    }
    Remove-PSSession -Session $sessionForI
}
Thank you.
-Jim
I'm pretty sure you are going to need to copy Ionic.Zip.dll to the remote machines to do this. You could try sharing it out from your lead system and using a UNC path to load it from the remote machines (I've never tried that... going to now...) :-)
Update - yep just confirmed you can pass a UNC path to [System.Reflection.Assembly]::LoadFrom.
Update 2 - While the assembly loaded, using it didn't work so well:
Exception calling "AddFile" with "1" argument(s): "Request for the permission of type 'System.Security.Permissions.File
IOPermission, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed."
At line:1 char:11
+ $z.AddFile <<<< ("C:\AMCleanUp.log")
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
When I loaded a local copy of the DLL, the AddFile method worked fine. Your only option might be to copy this DLL to all your servers...
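If it does come to copying it everywhere, the copy can be scripted in the same loop; a rough sketch, assuming the servers expose the default C$ administrative share and reusing the variables from the question:

foreach ($i in $appServers) {
    # Push a local copy of the DLL to each server over the admin share (assumes \\server\c$ is reachable)
    if (!(Test-Path "\\$i\c$\newDeploy")) {
        New-Item "\\$i\c$\newDeploy" -Type directory
    }
    Copy-Item "C:\newDeploy\Ionic.Zip.dll" "\\$i\c$\newDeploy\Ionic.Zip.dll" -Force

    # Inside the remote session the DLL is now on the local disk, avoiding the partial-trust problem above
    $sessionForI = New-PSSession -ComputerName $i
    Invoke-Command -Session $sessionForI -ScriptBlock {
        [System.Reflection.Assembly]::LoadFrom("C:\newDeploy\Ionic.Zip.dll")
        # ... zip the folders as in the original script ...
    }
    Remove-PSSession -Session $sessionForI
}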
You can use a UNC path in the LoadFrom for the remote boxes, but I see that someone has had problems doing the same with DotNetZip:
http://social.technet.microsoft.com/Forums/en-US/winserverpowershell/thread/dd5dcae2-1ccc-4be2-b986-61c069102ffb/
I think your problems with accessing remote resources in an already remote session have to do with double-hop authentication. Check this link: http://www.ravichaganti.com/blog/?p=1230
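One common way around the double hop, where the environment allows it, is CredSSP delegation; a minimal sketch (the computer name is a placeholder, and CredSSP has security trade-offs worth reviewing before enabling it):

# On the machine running the script: allow delegating credentials to the target servers
Enable-WSManCredSSP -Role Client -DelegateComputer "appserver01" -Force

# On each target server: allow it to accept delegated credentials
Enable-WSManCredSSP -Role Server -Force

# Open the session with CredSSP so the remote session can reach the UNC share
$session = New-PSSession -ComputerName "appserver01" -Authentication CredSSP -Credential (Get-Credential)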