Log Analytics configuration via az cli / PowerShell? - azure-log-analytics

Is there a way to programmatically (az cli, PowerShell) retrieve the following information:

For anyone ever needing to achieve the above, you can refer to Get-AzOperationalInsightsWorkspace and Get-AzOperationalInsightsDataSource. I wrote a simple PowerShell script that outputs the Log Analytics workspace plus the Event Log settings in tabular format.
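A minimal sketch of such a script, assuming the Az.OperationalInsights module and an existing Connect-AzAccount login (the property names on the Windows Event data source payload, EventLogName and EventTypes, are assumptions to verify against your own output):

# Minimal sketch: list every Log Analytics workspace in the current subscription
# together with its Windows Event Log data source settings, in tabular form.
$results = foreach ($ws in Get-AzOperationalInsightsWorkspace) {
    $eventSources = Get-AzOperationalInsightsDataSource `
        -ResourceGroupName $ws.ResourceGroupName `
        -WorkspaceName $ws.Name `
        -Kind WindowsEvent
    foreach ($ds in $eventSources) {
        [pscustomobject]@{
            Workspace     = $ws.Name
            ResourceGroup = $ws.ResourceGroupName
            Location      = $ws.Location
            EventLog      = $ds.Properties.EventLogName                       # assumed property name
            EventTypes    = ($ds.Properties.EventTypes.EventType -join ', ')  # assumed property name
        }
    }
}
$results | Format-Table -AutoSize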

Related

How do I pass variables to Fastload Script using a wrapper shell script

I'm working on Teradata FastLoad scripts and I'm using the LOGON command to establish a connection with the database.
LOGON DBC_ip/username,password;
But for security purposes I would like to get the password from a vault-like application using a shell script.
Initially I was trying to create a wrapper shell script that would fetch the password from the vault and use it in the FastLoad script.
wrapper_shell_script--> fetch_password($password) --> execute fastload script using $password.
Example: LOGON DBC_ip/username,$password;
My question:
Is it possible to use external variables in FastLoad scripts? If yes, can it be done using this process?
Could anyone please tell me whether this is possible, or whether there is a better way to implement this?
Let me know if you need more details.
Thank you in advance!

ADLS to Azure Storage Sync Using AzCopy

Looking for some help to resolve the errors I'm facing. Let me explain the scenario. I'm trying to sync one of the ADLS Gen2 containers to Azure Blob Storage. I have AzCopy 10.4.3 and I'm using azcopy sync to do this. I'm using the command below:
azcopy sync 'https://ADLSGen2.blob.core.windows.net/testsamplefiles/SAMPLE' 'https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE' --recursive
When I run this command I'm getting the error below:
REQUEST/RESPONSE (Try=1/71.0063ms, OpTime=110.9373ms) -- RESPONSE SUCCESSFULLY RECEIVED
PUT https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE/SampleFile.parquet?blockid=ZDQ0ODlkYzItN2N2QzOWJm&comp=block&timeout=901
X-Ms-Request-Id: [378ca837-d01e-0031-4f48-34cfc2000000]
ERR: [P#0-T#0] COPYFAILED: https://ADLSGen2.blob.core.windows.net/testsamplefiles/SAMPLE/SampleFile.parquet: 404 : 404 The specified resource does not exist.. When Staging block from URL. X-Ms-Request-Id: [378ca837-d01e-0031-4f48-34cfc2000000]
Dst: https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE/SampleFile.parquet
REQUEST/RESPONSE (Try=1/22.9854ms, OpTime=22.9854ms) -- RESPONSE SUCCESSFULLY RECEIVED
GET https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE/SampleFile.parquet?blocklisttype=all&comp=blocklist&timeout=31
X-Ms-Request-Id: [378ca84e-d01e-0031-6148-34cfc2000000]
So far I have checked and ensured the following:
I logged into the correct tenant when logging in to AzCopy
Storage Blob Data Contributor role was granted to my AD credentials
Not sure what else I'm missing, as the file exists in the source and I'm still getting the same error. I tried with SAS but received a different error. I cannot proceed with SAS due to the vendor policy, so I need to ensure this works with OAuth. Any input is really appreciated.
For the 404 error, you may check whether there is any typo in the command and whether the path /testsamplefiles/SAMPLE exists on both the source and destination accounts. Also, please note this tip from the documentation:
Use single quotes in all command shells except for the Windows Command
Shell (cmd.exe). If you're using a Windows Command Shell (cmd.exe),
enclose path arguments with double quotes ("") instead of single
quotes ('').
From the azcopy sync supported scenarios:
Azure Blob <-> Azure Blob (Source must include a SAS or is publicly
accessible; either SAS or OAuth authentication can be used for
destination)
We must include a SAS token in the source, but I tried the command below with AD authentication,
azcopy sync "https://[account].blob.core.windows.net/[container]/[path/to/blob]?[SAS]" "https://[account].blob.core.windows.net/[container]/[path/to/blob]"
and got the same 400 error as the GitHub issue.
Thus, in this case, after my validation, you could use the command below to sync one of the ADLS Gen2 containers to Azure Blob Storage without executing azcopy login. If you have logged in, you can run azcopy logout first.
azcopy sync "https://nancydl.blob.core.windows.net/container1/sample?sv=xxx" "https://nancytestdiag244.blob.core.windows.net/container1/sample?sv=xxx" --recursive --s2s-preserve-access-tier=false

Permissions Error using Jupyter magic command %load_ext google.cloud.bigquery

Apologies for the complexity of this question, and I really appreciate any help. I'm currently trying to follow the Google tutorial to visualize BigQuery data in a Jupyter notebook (https://cloud.google.com/bigquery/docs/visualize-jupyter). I have permission to use Project-1, but not Project-2.
When I execute the first 2 commands:
%load_ext google.cloud.bigquery
%%bigquery
SELECT
source_year AS year,
COUNT(is_male) AS birth_count
FROM `bigquery-public-data.samples.natality`
GROUP BY year
ORDER BY year DESC
LIMIT 15
...I get an error in the following format:
Caller does not have required permission to use project Project-2
However, when I run !gcloud config list in the notebook, it lists the following (along with the correct email account):
[accessibility]
screen_reader = true
[core]
disable_usage_reporting = True
project = Project-1
Your active configuration is: [default]
Am I incorrectly understanding how the %load_ext google.cloud.bigquery statement works? Thanks!
Go to the project selector page and select project Project-2, then run the gcloud config set project Project-2 command in Cloud Shell. Then, in the APIs & Services -> Credentials section, check whether you have created any credentials that allow you to access your enabled APIs; look here.
You can also execute gcloud auth login to specify the credentials that you want to use. Use the same ones that you use to log in to the Google Cloud Console.
The BigQuery Python client library supports querying data stored in BigQuery. %load_ext is one of the many built-in Jupyter commands; %load_ext google.cloud.bigquery loads the BigQuery magic commands from the client library.
Let me know about the results. I hope it helps you.
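One more note (an addition, not part of the original answer): the %%bigquery cell magic also accepts a --project argument, so you can point the query at the project you actually have permission for instead of relying on the gcloud default. A minimal sketch reusing the query from the question, with Project-1 as a placeholder:
%%bigquery --project Project-1
SELECT
source_year AS year,
COUNT(is_male) AS birth_count
FROM `bigquery-public-data.samples.natality`
GROUP BY year
ORDER BY year DESC
LIMIT 15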

azure application gateway alert automation

I need a PowerShell script to trigger an alert whenever Azure Application Gateway backend health turns red. I have 2 subscriptions, and under these we have around 150+ Application Gateways provisioned, so the script should be reusable across all the subscriptions. It would be great if a sample script is available for reference.
Thanks
Suri
You can use this command:
Get-AzureRmApplicationGatewayBackendHealth
https://learn.microsoft.com/en-us/powershell/module/azurerm.network/get-azurermapplicationgatewaybackendhealth?view=azurermps-6.13.0
With this command, you can create a runbook inside an Automation account, and if the output of the command is a specific value, it can trigger an alert. Spending a few minutes on Google shows you how to combine these techniques ;)
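For illustration, a rough sketch along those lines using the Az-module successor, Get-AzApplicationGatewayBackendHealth (the loop over subscriptions, the property names on the health result, and the final alerting step are assumptions to adapt to your own runbook):

# Rough sketch: scan every Application Gateway in every accessible subscription
# and collect backend servers that are not healthy. Assumes the Az.Accounts and
# Az.Network modules and an identity with at least Reader access.
$unhealthy = foreach ($sub in Get-AzSubscription) {
    Set-AzContext -SubscriptionId $sub.Id | Out-Null
    foreach ($gw in Get-AzApplicationGateway) {
        $health = Get-AzApplicationGatewayBackendHealth `
            -Name $gw.Name -ResourceGroupName $gw.ResourceGroupName
        foreach ($pool in $health.BackendAddressPools) {
            foreach ($httpSettings in $pool.BackendHttpSettingsCollection) {
                foreach ($server in $httpSettings.Servers) {
                    if ($server.Health -ne 'Healthy') {
                        [pscustomobject]@{
                            Subscription = $sub.Name
                            Gateway      = $gw.Name
                            Server       = $server.Address
                            Health       = $server.Health
                        }
                    }
                }
            }
        }
    }
}
if ($unhealthy) {
    # Placeholder: raise your alert here, e.g. post to a webhook or an Action Group.
    $unhealthy | Format-Table -AutoSize
}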

How can I execute multiple bq commands for different projects using gcloud configurations

On my Linux server, I have a 'load_app' user account. My plan was to write a generic shell script to load data into BigQuery using the bq tool. gcloud is installed and I have multiple configurations in my environment. These configurations point to different service and user accounts for different projects. I am trying to set it up so that I can run multiple executions of this script at the same time for different configurations.
I tried setting CLOUDSDK_ACTIVE_CONFIG_NAME= in my script, and the config-name value is being passed to the script. This setting should switch the active configuration; however, I am getting the following error:
ERROR: (bq) There was a problem refreshing your current auth tokens: invalid_grant: Bad Request