RabbitMQ web interface doesn't show queue until the queue is read?

I have a PowerShell script that sends a message to RabbitMQ, and I've seen the same behavior from BizTalk. A new queue name does not show up on the web interface until a message has been retrieved from that queue at least once. Is this normal? I don't understand why the queue wouldn't show up as soon as data was written to it.
Similarly, I seem to be unable to store records in the queue until it has been read once. I'm sending to an exchange with a routing key. If I write a program and send 5 messages there, the queue doesn't show up in RabbitMQ. But once I create a program to read/listen to that queue, from then on it shows up with the count of messages.
Sample PowerShell code:
Import-Module PSRabbitMQ
Set-RabbitMQConfig -ComputerName localhost
$User = "myuser"
# Create a secure string from a plain text password. AsPlainText indicates that the string is plain text; Force confirms that you understand the risks of using plain text.
$PWord = ConvertTo-SecureString -String "mypassword" -AsPlainText -Force
# Create a PSCredential object from the values in the $User and $PWord variables.
$CredRabbit = New-Object -TypeName "System.Management.Automation.PSCredential" -ArgumentList $User, $PWord
# Set some common parameters we will always use:
$Params = @{
    Credential = $CredRabbit
}
$Exchange = "MyExchange"
$RoutingKey = "NealTest3"
$showDate = Get-Date -DisplayHint Date
$numMessagesPerRun = 5
for ($j = 1; $j -le $numMessagesPerRun; $j++)
{
    $message = "Hello at date/time= $showDate " + $j
    Write-Host "Message = $message"
    Send-RabbitMQMessage -Exchange $Exchange -Key $RoutingKey -InputObject $message -vhost base -Persistent @Params
}
$showDate = Get-Date -DisplayHint Date
Write-Host "Ended $showDate"
My queue NealTest3 will not show up when I run "rabbitmqctl list_queues -p base" until I run another PowerShell script to consume at least one message from that queue.
Code from the second program that reads the queue (same logon info omitted):
Start-RabbitMqListener -Exchange $Exchange `
    -Key $QueueName `
    -QueueName $QueueName `
    -AutoDelete $false `
    -vhost base `
    @Params | % {
    #$req = $_ | ConvertFrom-Json
    $req = $_
    $counter = $counter + 1
    Write-Host "$counter $req"
}
It reads the 5 messages I put there, even though the queue in theory didn't exist according to list_queues.
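For what it's worth, queues in RabbitMQ only exist once something declares them: publishing to an exchange does not create a queue, and a message routed to no queue is normally dropped. The listener presumably declares and binds NealTest3 on its first run, which would explain why list_queues only shows it after a consumer has connected. Below is a minimal sketch of declaring and binding the queue up front, assuming the RabbitMQ.Client .NET assembly is available (the path and credentials are placeholders):
Add-Type -Path "RabbitMQ.Client.dll"  # placeholder path to the client assembly
$factory = New-Object RabbitMQ.Client.ConnectionFactory
$factory.HostName = "localhost"
$factory.VirtualHost = "base"
$factory.UserName = "myuser"
$factory.Password = "mypassword"
$conn = $factory.CreateConnection()
$channel = $conn.CreateModel()
# Durable, non-exclusive, non-auto-delete queue named after the routing key.
$channel.QueueDeclare("NealTest3", $true, $false, $false, $null) | Out-Null
# Bind it to the exchange so published messages are routed (and kept) even
# before any consumer starts.
$channel.QueueBind("NealTest3", "MyExchange", "NealTest3", $null)
$channel.Close()
$conn.Close()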

Related

PowerShell: send email with multiple addresses and attachments

I have an Excel file that stores the locations of .sql and .csv files and the email addresses to send to. The field headers are "To" (email recipients), "Subject" (email subjects), "Query" (.sql paths), and "CSV" (.csv paths to export to). I made a PowerShell script that runs the queries and then sends the exported CSVs to the corresponding email addresses.
It works without problems, but I have no idea how to modify the script to add this feature: some recipients don't want to receive multiple emails from the script, so I'd like to send one email with multiple CSV results/attachments (grouping all the CSVs that belong to the same address). For example, recipient alan@abc.com might receive one email for queries 1, 2, and 3 (3 CSVs) according to the Excel file. Do you have any suggestions? Thanks.
This is my script (it's a for loop and will check if there is any empty result):
#SQL/Mail
$SQLServer = '.\SQLEXPRESS'
$db = 'db'
$smtp = 'smtp.com'
$from = 'test-report@abc.com'
#variable
$var = Import-CSV C:\SQL\Var.csv
$lastmonth = (Get-Date).AddMonths(-1).ToString("MMMM")
$logfilename = $(get-date -f yyyy-MM-dd)
#starts
Start-Transcript -Path C:\SQL\log_$logfilename.txt
for ($i = 0; $i -lt $var.Count; ++$i){
    #email subject
    $subject = $var.Subject[$i] + ' of ' + $lastmonth
    #status
    Write-Host "Running" $var.Subject[$i]
    #query execution
    $result = Invoke-Sqlcmd -InputFile $var.Query[$i] -ServerInstance $SQLServer -Database $db -OutVariable sqlReturn
    #check if the result is empty
    $is_empty = ($result).count
    if($is_empty -eq 0){
        Write-Host $var.Subject[$i] "has empty Result"
        Send-MailMessage -From $from -To $var.To[$i] -Subject $subject -Body 'Empty' -Priority High -DeliveryNotificationOption OnSuccess, OnFailure -SmtpServer $smtp #send message without attachment
        Write-Host "Sending Email"
    }
    else{
        $result | Export-Csv -NoTypeInformation -Path $var.CSV[$i] -Encoding UTF8 #result export to csv
        Send-MailMessage -From $from -To $var.To[$i] -Subject $subject -Body $lastmonth -Attachments $var.CSV[$i] -Priority High -DeliveryNotificationOption OnSuccess, OnFailure -SmtpServer $smtp #send with csv
        Write-Host "Sending Email"
    }
}
#finished loop
Write-Host "Finished"
Stop-Transcript
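One way to get a single email per recipient, sketched below under the assumption that Var.csv keeps the same To/Subject/Query/CSV layout and reusing the variables defined in the script above: run all the queries first, collect the CSV paths per address with Group-Object, then send one message per group (Send-MailMessage accepts an array for -Attachments).
$rows = Import-Csv C:\SQL\Var.csv
$rows | Group-Object -Property To | ForEach-Object {
    $attachments = @()
    foreach ($row in $_.Group) {
        $result = Invoke-Sqlcmd -InputFile $row.Query -ServerInstance $SQLServer -Database $db
        if (($result).Count -gt 0) {
            $result | Export-Csv -NoTypeInformation -Path $row.CSV -Encoding UTF8
            $attachments += $row.CSV
        }
    }
    if ($attachments.Count -gt 0) {
        # One email per recipient with all of that recipient's CSVs attached.
        # The combined subject here is a placeholder; pick whatever fits.
        Send-MailMessage -From $from -To $_.Name -Subject ('Reports of ' + $lastmonth) -Body $lastmonth -Attachments $attachments -SmtpServer $smtp
    }
}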

PowerShell 7: ForEach-Object -Parallel Does Not Authenticate Against Azure PowerShell

We wrote a script that is supposed to execute Azure PowerShell commands in parallel. The problem is that when we increase -ThrottleLimit above one, some of the commands are not performed properly. The script is:
# Writing IPs for whitelisting into file.
Add-Content -Path IPs.txt -Value ((Get-AzWebApp -ResourceGroupName "ResourceGroup1" -Name "WebApp1").OutboundIpAddresses).Split(",")
Add-Content -Path IPs.txt -Value ((Get-AzWebApp -ResourceGroupName "ResourceGroup1" -Name "WebApp1").PossibleOutboundIpAddresses).Split(",")
# Writing new file with unique IPs.
Get-Content IPs.txt | Sort-Object -Unique | Set-Content UniqueIPs.txt
# Referencing the file.
$IPsForWhitelisting = Get-Content UniqueIPs.txt
# Assigning a priority number to each IP
$Count = 100
$List = foreach ($IP in $IPsForWhitelisting) {
    $IP | Select @{l='IP';e={$_}},@{l='Count';e={$Count}}
    $Count++
}
# Whitelisting all the IPs from the list.
$List | ForEach-Object -Parallel {
    $IP = $_.IP
    $Priority = $_.Count
    $azureApplicationId = "***"
    $azureTenantId = "***"
    $azureApplicationSecret = "***"
    $azureSecurePassword = ConvertTo-SecureString $azureApplicationSecret -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential($azureApplicationId, $azureSecurePassword)
    Connect-AzAccount -Credential $credential -TenantId $azureTenantId -ServicePrincipal | Out-Null
    echo "IP-$Priority"
    echo "$IP/24"
    echo $Priority
    Add-AzWebAppAccessRestrictionRule -ResourceGroupName "ResourceGroup1" -WebAppName "WebApp1" -Name "IP-$Priority" -Priority $Priority -Action Allow -IpAddress "$IP/24"
} -ThrottleLimit 1
If -ThrottleLimit is set to 1, 8 rules are created; with 2, 7 rules; with 3, 4 rules; with 10, just 1 rule. Some rules are being skipped.
What is the reason for such behavior?
In short: the -Parallel parameter does not (yet, perhaps) magically import all dependent variables that fall in the scope of the ForEach-Object block. PowerShell runs the block in separate runspaces, and only the array being looped over is passed implicitly; all other variables need explicit designation.
You should use the $using: prefix to denote which variables are to be imported (made visible) in the parallel code block.
Example:
$avar = [Int]10
$bvar = [Int]20
$list = @('here', 'it', 'eees')
$list | ForEach-Object -Parallel {
    Write-Output "(a, b) is here ($($using:avar), $($using:bvar))"
    Write-Output "(a, b) missing ($($avar), $($bvar))"
    Write-Output "Current element is $_"
}
Thus the described behavior is likely due to the fact that the config variables are not imported at all, so the operations silently fail.
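Applied to the script in the question, that would look roughly like this (a sketch, untested; the service-principal values are placeholders). The credential is built once in the caller's scope and referenced with $using: inside the block, and each runspace still calls Connect-AzAccount because every runspace needs its own Az context:
$azureTenantId = "***"
$azureSecurePassword = ConvertTo-SecureString "***" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential("***", $azureSecurePassword)
$List | ForEach-Object -Parallel {
    # $using: pulls the caller's variables into this runspace.
    Connect-AzAccount -Credential $using:credential -TenantId $using:azureTenantId -ServicePrincipal | Out-Null
    Add-AzWebAppAccessRestrictionRule -ResourceGroupName "ResourceGroup1" -WebAppName "WebApp1" -Name "IP-$($_.Count)" -Priority $_.Count -Action Allow -IpAddress "$($_.IP)/24"
} -ThrottleLimit 4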

Invoke-AzVMRunCommand as a job

I am trying to use Invoke-AzVMRunCommand as a job. When I execute the script below, the jobs are created and executed successfully, but I am failing to write the output in a way that shows which job result belongs to which VM.
Invoke-AzVMRunCommand is used to invoke a command on a particular VM, so you should have this information beforehand.
Here is some information on the -AsJob parameter:
https://learn.microsoft.com/en-us/powershell/module/az.compute/invoke-azvmruncommand?view=azps-2.6.0#parameters
As suggested by AmanGarg-MSFT, you should have that information beforehand. You can use a hashtable $Jobs to store the server name and the Invoke-AzVMRunCommand output, and later iterate through it using $Jobs.GetEnumerator().
$Jobs = @{}
$Servers = "Server01","Server02"
[System.String]$ScriptBlock = {Get-Process}
$FileName = "RunScript.ps1"
Out-File -FilePath $FileName -InputObject $ScriptBlock -NoNewline
$Servers | ForEach-Object {
    $vm = Get-AzVM -Name $_
    $Jobs.Add($_,(Invoke-AzVMRunCommand -ResourceGroupName $vm.ResourceGroupName -Name $_ -CommandId 'RunPowerShellScript' -ScriptPath $FileName -AsJob))
}
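To then pair each server with its result, something along these lines should work (a sketch, assuming the jobs have completed; in the run-command result, the first status entry's Message property carries the script's console output):
$Jobs.GetEnumerator() | ForEach-Object {
    $result = $_.Value | Wait-Job | Receive-Job
    Write-Output "===== $($_.Key) ====="
    # First status entry of the run-command result holds the stdout from the VM.
    Write-Output $result.Value[0].Message
}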

Database relocation, detach/attach database, MS SQL Server Management Studio (2012)

Next week we want to relocate our database from one server to another.
On http://msdn.microsoft.com/en-us/library/ms187858%28v=sql.110%29.aspx
I read about detaching the database from the old location and attaching it at the new one.
The problem is that I don't have access to the file system of the server; I don't even know where exactly the server is physically located^^
Is there a way to relocate a database from one Server to another without the need to access the file system of the old Server?
You could use the Import/Export tool in SQL Server to copy the data directly, which will create a new database in the destination location. The good thing about this is that the new DB will work as you might expect, since it is created from scratch on the target server. That also means you might have old, deprecated syntax in your stored procs or functions which won't work unless you lower the compatibility level (although that shouldn't be hard). Also be aware of possible collation conflicts: your old server might have SQL_Latin1_General_CP1_CI_AS and the new one Latin1_General_CI_AS, which can cause equality operations to fail, amongst other things.
In addition, if you have a big database it'll take a long time. I can't think of any other method off the top of my head which doesn't require some level of access to the file system: you'd still need to get to the file system to take a copy of a backup, or, if using a UNC path for the backup, the source server would need to be able to write to that location and you'd need to be able to access it afterwards. If anyone else can think of one I'd be interested, because it would be a useful bit of knowledge to have tucked away.
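For what it's worth, the UNC-path variant can be driven entirely from a client with Invoke-Sqlcmd, without touching either server's file system yourself; a sketch, assuming the SQL Server service accounts can reach the share (server, database, and share names are placeholders):
# Back up the source DB straight to a network share (-QueryTimeout 0 avoids
# timeouts on long backups)...
Invoke-Sqlcmd -ServerInstance "OldServer" -QueryTimeout 0 -Query "BACKUP DATABASE [MyDb] TO DISK = N'\\fileshare\backups\MyDb.bak' WITH COPY_ONLY, INIT;"
# ...then restore it on the new server (a real restore may need WITH MOVE to
# relocate the data/log files to valid paths on the new server).
Invoke-Sqlcmd -ServerInstance "NewServer" -QueryTimeout 0 -Query "RESTORE DATABASE [MyDb] FROM DISK = N'\\fileshare\backups\MyDb.bak'"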
Edit:
I should also have mentioned the use of PowerShell and SMO. It's not really any different from using the Import/Export wizard, but it does allow you to fine-tune things. The following is a PS script I have been using to create a copy of a DB (schema only) on a different server from the original, but with certain facets missing (non-clustered indexes, foreign keys, identities, etc.) as the copy was destined to be read-only. You could easily extend it to copy the data as well.
param (
    [string]$sourceServerName = $(throw "Source server name is required."),
    [string]$destServerName = $(throw "Destination server is required."),
    [string]$sourceDBName = $(throw "Source database name is required."),
    [string]$destDBName = $(throw "Destination database name is required."),
    [string]$schema = "dbo"
)
# Add an error trap so that at the end of the script we can see if we recorded any non-fatal errors and if so then throw
# an error and return 1 so that the SQL job recognises there's been an error.
trap
{
    write-output $_
    exit 1
}
# Append year to destination DB name if it isn't already on the end.
$year = (Get-Date).AddYears(-6).Year
if (-Not $destDBName.EndsWith($year)) {
    $destDBName += $year
}
# Load assemblies.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | out-null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | out-null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | out-null
# Set up source connection.
$sourceSrvConn = new-object Microsoft.SqlServer.Management.Common.ServerConnection
$sourceSrvConn.ServerInstance = $sourceServerName
$sourceSrvConn.LoginSecure = $false
$sourceSrvConn.Login = "MyLogin"
$sourceSrvConn.Password = "xxx"
# Set up destination connection.
$destSrvConn = new-object Microsoft.SqlServer.Management.Common.ServerConnection
$destSrvConn.ServerInstance = $destServerName
$destSrvConn.LoginSecure = $false
$destSrvConn.Login = "MyLogin"
$destSrvConn.Password = "xxx"
$sourceSrv = New-Object Microsoft.SqlServer.Management.SMO.Server($sourceSrvConn)
$sourceDb = New-Object ("Microsoft.SqlServer.Management.SMO.Database")
$destSrv = New-Object Microsoft.SqlServer.Management.SMO.Server($destSrvConn)
$destDb = New-Object ("Microsoft.SqlServer.Management.SMO.Database")
$tbl = New-Object ("Microsoft.SqlServer.Management.SMO.Table")
$scripter = New-Object Microsoft.SqlServer.Management.SMO.Scripter($sourceSrvConn)
# Get the database objects
$sourceDb = $sourceSrv.Databases[$sourceDbName]
$destDb = $destSrv.Databases[$destDbName]
# Test to see databases exist. Not as easy to test for servers - if you got those wrong then this will fail and throw an error
# so it's down to the user to check their values carefully.
if ($sourceDb -eq $null) {throw "Database '" + $sourceDbName + "' does not exist on server '" + $sourceServerName + "'"}
if ($destDb -eq $null) {throw "Database '" + $destDbName + "' does not exist on server '" + $destServerName + "'"}
# Get source objects.
$tbl = $sourceDb.tables | Where-object { $_.schema -eq $schema -and -not $_.IsSystemObject }
$storedProcs = $sourceDb.StoredProcedures | Where-object { $_.schema -eq $schema -and -not $_.IsSystemObject }
$views = $sourceDb.Views | Where-object { $_.schema -eq $schema -and -not $_.IsSystemObject }
$udfs = $sourceDb.UserDefinedFunctions | Where-object { $_.schema -eq $schema -and -not $_.IsSystemObject }
$catalogs = $sourceDb.FullTextCatalogs
$udtts = $sourceDb.UserDefinedTableTypes | Where-object { $_.schema -eq $schema -and -not $_.IsSystemObject }
$assemblies = $sourceDb.Assemblies | Where-object { -not $_.IsSystemObject }
# Set scripter options to ensure only schema is scripted
$scripter.Options.ScriptSchema = $true;
$scripter.Options.ScriptData = $false;
# Keep command terminators (set NoCommandTerminator to $true to exclude the GO after every batch)
$scripter.Options.NoCommandTerminator = $false;
$scripter.Options.ToFileOnly = $false
$scripter.Options.AllowSystemObjects = $false
$scripter.Options.Permissions = $true
$scripter.Options.DriForeignKeys = $false
$scripter.Options.SchemaQualify = $true
$scripter.Options.AnsiFile = $true
$scripter.Options.Indexes = $false
$scripter.Options.DriIndexes = $false
$scripter.Options.DriClustered = $true
$scripter.Options.DriNonClustered = $false
$scripter.Options.NonClusteredIndexes = $false
$scripter.Options.ClusteredIndexes = $true
$scripter.Options.FullTextIndexes = $true
$scripter.Options.NoIdentities = $true
$scripter.Options.DriPrimaryKey = $true
$scripter.Options.EnforceScriptingOptions = $true
$pattern = "(\b" + $sourceDBName + "\b)"
$errors = 0
function CopyObjectsToDestination($objects) {
    foreach ($o in $objects) {
        if ($o -ne $null) {
            try {
                $script = $scripter.Script($o)
                $script = $script -replace $pattern, $destDBName
                $destDb.ExecuteNonQuery($script)
            } catch {
                # Make sure any errors are logged by the SQL job.
                $ex = $_.Exception
                $message = $o.Name + " " + (Get-Date)
                $message += "`r`n"
                #$message += $ex.message
                $ex = $ex.InnerException
                while ($ex.InnerException) {
                    $message += "`n$($ex.InnerException.Message)"
                    $ex = $ex.InnerException
                }
                #Write-Error $o.Name
                Write-Error $message # Write to caller. SQL Agent will display this (or at least some of it) in the job step history.
                # Need to use Set-Variable or changes to the variable will only be in scope within the function and we want to persist this.
                if ($errors -eq 0) {
                    Set-Variable -Name errors -Scope 1 -Value 1
                }
            }
        }
    }
}
# Output the scripts
CopyObjectsToDestination $assemblies
CopyObjectsToDestination $tbl
CopyObjectsToDestination $udfs
CopyObjectsToDestination $views
CopyObjectsToDestination $storedProcs
CopyObjectsToDestination $catalogs
CopyObjectsToDestination $udtts
# Disconnect from databases cleanly.
$sourceSrv.ConnectionContext.Disconnect()
$destSrv.ConnectionContext.Disconnect()
# Did we encounter any non-fatal errors along the way (SQL errors and suchlike)? If yes then throw an exception which tells the
# user to check the log files.
if ($errors -eq 1) {
    throw "Errors encountered - see log file for details"
}
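For reference, a hypothetical invocation (the script file name is an assumption; the year suffix gets appended to the destination DB name automatically as above):
.\Copy-DbSchema.ps1 -sourceServerName "OldServer\INST1" -destServerName "NewServer\INST1" -sourceDBName "MyDb" -destDBName "MyDbReadOnly"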

PowerShell using file? "being used by another process"

I have this PowerShell script running. The first time it runs, it runs flawlessly; the second time, I get an error that the .csv cannot be accessed "because it is being used by another process". Any idea which part of the script is "holding onto" the file, and how I can make it let go at the end?
clear
set-executionpolicy remotesigned
# change this to the directory that the script is sitting in
cd d:\directory
#############################################
# Saves usernames/accountNumbers into array #
# and creates sql file for writing to #
#############################################
# This is the location of the text file containing accounts
$accountNumbers = (Get-Content input.txt) | Sort-Object
$accountID=0
$numAccounts = $accountNumbers.Count
$outString =$null
# the name of the sql file containing the query
$file = New-Item -ItemType file -name sql.sql -Force
###################################
# Load SqlServerProviderSnapin100 #
###################################
if (!(Get-PSSnapin | ?{$_.name -eq 'SqlServerProviderSnapin100'}))
{
    if(Get-PSSnapin -registered | ?{$_.name -eq 'SqlServerProviderSnapin100'})
    {
        add-pssnapin SqlServerProviderSnapin100
        Write-host SQL Server Provider Snapin Loaded
    }
}
else
{
    Write-host SQL Server Provider Snapin was already loaded
}
#################################
# Load SqlServerCmdletSnapin100 #
#################################
if (!(Get-PSSnapin | ?{$_.name -eq 'SqlServerCmdletSnapin100'}))
{
    if(Get-PSSnapin -registered | ?{$_.name -eq 'SqlServerCmdletSnapin100'})
    {
        add-pssnapin SqlServerCmdletSnapin100
        Write-host SQL Server Cmdlet Snapin Loaded
    }
}
else
{
    Write-host SQL Server CMDlet Snapin was already loaded
}
####################
# Create SQL query #
####################
# This part of the query is COMPLETELY static. What is put in here will not change. It will usually end with either % or '
$outString = "SELECT stuff FROM table LIKE '%"
# Statement ends at '. loop adds in "xxx' or like 'xxx"
IF ($numAccounts -gt 0)
{
    For ($i = 1; $i -le ($AccountNumbers.Count - 1); $i++)
    {
        $outString = $outstring + $AccountNumbers[$accountID]
        $outString = $outString + "' OR ca.accountnumber LIKE '"
        $accountID++
    }
    $outString = $outString + $AccountNumbers[$AccountNumbers.Count - 1]
}
else
{
    $outString = $outString + $AccountNumbers
}
# This is the end of the query. This is also COMPLETELY static. usually starts with either % or '
$outString = $outString + "%'more sql stuff"
add-content $file $outString
Write-host Sql query dynamically written and saved to file
###########################
# Create CSV to email out #
###########################
#Make sure to point it to the correct input file (sql query made above) and correct output csv.
Invoke-Sqlcmd -ServerInstance instance -Database database -Username username -Password password -InputFile sql.sql | Export-Csv -Path output.csv
####################################
# Email the CSV to selected people #
####################################
$emailFrom = "to"
$emailTo = "from"
$subject = "test"
$body = "test"
$smtpServer = "server"
# Point this to the correct csv created above
$filename = "output.csv"
$att = new-object Net.mail.attachment($filename)
$msg = new-object net.mail.mailmessage
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$msg.from = $emailFrom
$msg.to.add($emailto)
$msg.subject = $subject
$msg.body = $body
$msg.attachments.add($att)
$smtp.Send($msg)
Can you try adding this at the end:
$att.Dispose()
$msg.Dispose()
$smtp.Dispose()
You could also try using a tool like Procmon to see where the script acquires a lock on the file without releasing it. Also, since (supposedly) the problem is with the .csv file, you could load it as a byte array instead of passing its path as an attachment. That way the file is read once and not kept locked.
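A sketch of that byte-array approach (untested; file and server variables as in the script above): read the CSV into memory, attach it from a MemoryStream, and dispose everything so no handle lingers between runs.
$bytes = [System.IO.File]::ReadAllBytes("output.csv")
$stream = New-Object System.IO.MemoryStream(,$bytes)
$att = New-Object Net.Mail.Attachment($stream, "output.csv", "text/csv")
$msg = New-Object Net.Mail.MailMessage($emailFrom, $emailTo, $subject, $body)
$msg.Attachments.Add($att)
$smtp = New-Object Net.Mail.SmtpClient($smtpServer)
try {
    $smtp.Send($msg)
}
finally {
    # Dispose releases the underlying streams/handles so the next run can reuse the file.
    $att.Dispose()
    $msg.Dispose()
    $smtp.Dispose()
    $stream.Dispose()
}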