SAS token as a SecureString not working with ARM template deployment using Azure PowerShell - azure-powershell

I have a bunch of nested ARM templates meant to be deployed with Azure PowerShell.
The only way to do that is to host the nested templates in an Azure blob container, generate a SAS token, and pass those two values as parameters to the main ARM template (which links to the nested ones).
Here is my PS that generates the SAS token:
$SasToken = ConvertTo-SecureString -AsPlainText -Force (New-AzureStorageContainerSASToken -Container $StorageContainerName -Context $StorageAccount.Context -Permission r -ExpiryTime (Get-Date).AddHours(4))
Here are 2 parts of my deployment script which pass the token to the main ARM template:
$Parameters['_artifactsLocationSasToken'] = $SasToken
and
New-AzureRmResourceGroupDeployment -Name ((Get-ChildItem $TemplateFile).BaseName + '-' + ((Get-Date).ToUniversalTime()).ToString('MMdd-HHmm')) `
-ResourceGroupName $ResourceGroupName `
-TemplateFile $TemplateFile `
-TemplateParameterObject $Parameters `
-Force -Verbose `
-ErrorVariable ErrorMessages
Here is the declaration for the receiving parameter to the main ARM template:
"_artifactsLocationSasToken": {
"type": "securestring"
}
Here is the nested deployment resource (which happens to be a Cosmos DB) in the same main ARM template:
{
"apiVersion": "2017-05-10",
"dependsOn": [
"[concat('Microsoft.Resources/deployments/', variables('vnetConfig').Name)]"
],
"name": "[variables('cosmosDbConfig').Name]",
"properties": {
"mode": "Incremental",
"templateLink": {
"uri": "[concat(parameters('_artifactsLocation'), '/', variables('nestedTemplatesFolder'), '/cosmosdb.json', parameters('_artifactsLocationSasToken'))]"
},
"parameters": {
"cosmosDbConfig": {
"value": "[variables('cosmosDbConfig')]"
}
}
},
"type": "Microsoft.Resources/deployments"
}
When I run these, I get this error:
Error: Code=InvalidTemplate; Message=Deployment template validation
failed: 'The provided value for the template parameter
'_artifactsLocationSasToken' at line '16' and column '39' is not
valid.'
If I hard-code the SAS token in the nested template resource (in the main template) and change the parameter type from securestring to string, it just works!
What am I missing?

It seems you missed the uri() function in the templateLink. Try the version below; see this sample, which also uses securestring.
"templateLink": {
"uri": "[uri(concat(parameters('_artifactsLocation'), '/', variables('nestedTemplatesFolder'), '/cosmosdb.json', parameters('_artifactsLocationSasToken')))]"
}

I have found that you can reproduce this problem (i.e. "Template validation failed") whenever you pass a SAS token to an ARM template as a parameter.
In my case, the SAS token was being used for the WEBSITE_RUN_FROM_PACKAGE app setting of an Azure Function App.
My solution was to prefix the SAS token value passed from PowerShell to the ARM template with something, so that it is no longer a valid URL.
For example, if you prefix the SAS token with an underscore and pass that to the ARM template, the problem no longer occurs.
You can then strip the underscore prefix off again inside the ARM template.
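To illustrate the stripping step in template language, reusing the names from the question: this sketch assumes ARM string functions such as substring can be applied to a securestring parameter value at deployment time, which is worth verifying in your environment.

```json
"templateLink": {
"uri": "[concat(parameters('_artifactsLocation'), '/', variables('nestedTemplatesFolder'), '/cosmosdb.json', substring(parameters('_artifactsLocationSasToken'), 1))]"
}
```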

Related

Is there a way in the Google Sheets API to access the script editor?

For example, I'd like to write a (Python, say) program that generates a Google sheet, then writes some custom .gs code into the Apps Script attached to that sheet, then writes values into the sheet itself using formulas defined in the Apps Script. What I currently do is use Tools > Script Editor, then manually copy-paste the relevant Apps Script code.
As mentioned by @Tanaike, you can use the Apps Script API to add a container-bound script.
(OPTION 1: Writing the function manually in the script content)
What to do:
Create a container-bound script using the projects.create method.
To create a container-bound script, assign the spreadsheet's file ID to the parentId parameter.
Sample Request Body:
{
"title": "new",
"parentId": "spreadsheet file id"
}
Get the newly created container-bound script's ID from the response body of the projects.create method.
Sample Response Body:
{
"scriptId": "newly created script id",
"title": "new",
"parentId": "spreadsheet file id",
"createTime": "2020-12-25T23:33:48.026Z",
"updateTime": "2020-12-25T23:33:48.026Z"
}
Update the content of the newly created bound script using the projects.updateContent method, and include your function.
Sample Request Body:
{
"files": [{
"name": "hello",
"type": "SERVER_JS",
"source": "function helloWorld() {\n  console.log(\"Hello, world!\");\n}"
}, {
"name": "appsscript",
"type": "JSON",
"source": "{\"timeZone\":\"America/New_York\",\"exceptionLogging\":\"CLOUD\"}"
}]
}
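The two API calls in Option 1 can be sketched in Python with the official google-api-python-client. This is a hedged sketch, not verified end to end: `creds` must be OAuth credentials carrying the Apps Script API scopes, and the helper name `make_update_body` is invented here for illustration.

```python
# Sketch of Option 1: create a container-bound project, then push a function
# into it. Requires `pip install google-api-python-client` plus OAuth
# credentials with the Apps Script API scopes.

def make_update_body(function_source, time_zone="America/New_York"):
    """Request body for projects.updateContent (Step 3 above)."""
    return {
        "files": [
            {"name": "hello", "type": "SERVER_JS", "source": function_source},
            {"name": "appsscript", "type": "JSON",
             "source": '{"timeZone":"' + time_zone + '","exceptionLogging":"CLOUD"}'},
        ]
    }

def add_bound_script(creds, spreadsheet_file_id, function_source):
    from googleapiclient.discovery import build  # google-api-python-client
    service = build("script", "v1", credentials=creds)
    # Step 1: projects.create with parentId = the spreadsheet's file id
    project = service.projects().create(
        body={"title": "new", "parentId": spreadsheet_file_id}).execute()
    # Step 2: the response body carries the new bound script's id
    script_id = project["scriptId"]
    # Step 3: projects.updateContent with the function source
    service.projects().updateContent(
        scriptId=script_id, body=make_update_body(function_source)).execute()
    return script_id
```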
(OPTION 2: Copy an existing script and paste it as bound script)
Create a standalone script that will contain your custom functions.
Get the newly created standalone script project's content using projects.getContent, which returns a content resource.
The scriptId can be seen in your standalone script project under File -> Project Properties.
Create a container-bound script using the projects.create method.
To create a container-bound script, assign the spreadsheet's file ID to the parentId parameter.
Sample Request Body:
{
"title": "new",
"parentId": "spreadsheet file id"
}
Get the newly created container-bound script's ID from the response body of the projects.create method.
Sample Response Body:
{
"scriptId": "newly created script id",
"title": "new",
"parentId": "spreadsheet file id",
"createTime": "2020-12-25T23:33:48.026Z",
"updateTime": "2020-12-25T23:33:48.026Z"
}
Update the content of the newly created bound script using the projects.updateContent method.
Use the content resource returned in Step 2 as the request body, making sure to replace the script ID with the newly created bound script's ID obtained in Step 4.
You can now use the custom function in Google Sheets.

Unable to set scopes with EnableTokenAcquisitionToCallDownstreamApi with AddMicrosoftIdentityWebApiAuthentication

This compiles in startup.cs:
string[] initialScopes = Configuration.GetValue<string>("GraphApi:Scopes")?.Split(' ');
services.AddMicrosoftIdentityWebAppAuthentication(Configuration)
.EnableTokenAcquisitionToCallDownstreamApi(initialScopes)
.AddMicrosoftGraph();
However, since I'm working on an API controller only, and not a web app, I want to do this instead:
string[] initialScopes = Configuration.GetValue<string>("GraphApi:Scopes")?.Split(' ');
services.AddMicrosoftIdentityWebApiAuthentication(Configuration)
.EnableTokenAcquisitionToCallDownstreamApi(initialScopes)
.AddMicrosoftGraph();
However, this does not compile, at least not with Microsoft.Identity.Web v1.0. IntelliSense tells me that string[] is not assignable to parameter type ConfidentialClientApplicationOptions.
The difference is that AddMicrosoftIdentityWebApiAuthentication returns a MicrosoftIdentityWebApiAuthenticationBuilder, whereas AddMicrosoftIdentityWebAppAuthentication returns a MicrosoftIdentityAppCallsWebApiAuthenticationBuilder.
Is there a work-around here?
What I'm trying to accomplish is to ensure that the scope is set correctly when the Graph access token is requested. Currently, this does not seem to be the case, as I get an error like the following when trying to call Graph, e.g. await graphServiceClient.Applications.Request().GetAsync();.
---> Microsoft.Identity.Web.MicrosoftIdentityWebChallengeUserException: IDW10502: An MsalUiRequiredException was thrown due to a challenge for the user. See https://aka.ms/ms-id-web/ca_incremental-consent.
---> MSAL.NetCore.4.18.0.0.MsalUiRequiredException:
ErrorCode: user_null
Microsoft.Identity.Client.MsalUiRequiredException: No account or login hint was
In other words, what I need is an application access token which looks like this:
[...more json]
{
"aud": "https://graph.microsoft.com",
"iss": "https://sts.windows.net/{tenant id}/",
"iat": 1602055659,
"nbf": 1602055659,
"exp": 1602059559,
"aio": "[...]",
"app_displayname": "My app",
"appid": "{App id guid}",
"appidacr": "1",
"idp": "https://sts.windows.net/{tenant id}/",
"idtyp": "app",
"oid": "{guid}",
"rh": "[...]",
"roles": [
"Application.ReadWrite.All",
"Directory.ReadWrite.All",
"Directory.Read.All",
"Application.Read.All"
],
"sub": "{guid}",
"tenant_region_scope": "EU",
"tid": "{tenant id}",
"uti": "[...]",
"ver": "1.0",
"xms_tcdt": 1597308440
}.[Signature]
I can generate the above manually with Postman, and if I use it, it works, which confirms that the application's configuration in Azure AD is correct.
Apparently, using an application access token is not yet supported.
IDW10502: An MsalUiRequiredException was thrown due to a challenge for the user #667
Use
.AddMicrosoftGraph(Configuration.GetSection("GraphAPI"))
and make sure your appsettings.json contains the following (use the scopes you want, separated by spaces):
"GraphAPI": {
"BaseUrl": "https://graph.microsoft.com/v1.0",
"Scopes": "user.read mail.read",
"DefaultScope": "https://graph.microsoft.com/.default"
}

How to run a SQL query in Cloud Formation template to enable Delayed_Durability in AWS RDS

I have a CloudFormation template that creates a SQL Server DB in RDS, and I want to enable the DELAYED_DURABILITY feature by default by running this query:
ALTER DATABASE dbname SET DELAYED_DURABILITY = FORCED;
Is there a way to run this query right after db instance is created through CF template?
My CF template looks like this:
"Type":"AWS::RDS::DBInstance",
"Properties":{
"AllocatedStorage":"200",
"AutoMinorVersionUpgrade":"false",
"BackupRetentionPeriod":"1",
"DBInstanceClass":"db.m4.large",
"DBInstanceIdentifier":"mydb",
"DBParameterGroupName": {
"Ref": "MyDBParameterGroup"
},
"DBSubnetGroupName":{
"Ref":"dbSubnetGroup"
},
"Engine":"sqlserver-web",
"EngineVersion":"13.00.4422.0.v1",
"LicenseModel":"license-included",
"MasterUsername":"prod_user",
"MasterUserPassword":{ "Ref" : "dbpass" },
"MonitoringInterval":"60",
"MonitoringRoleArn": {
"Fn::GetAtt": [
"RdsMontioringRole",
"Arn"
]
},
"PreferredBackupWindow":"09:39-10:09",
"PreferredMaintenanceWindow":"Sun:08:58-Sun:09:28",
"PubliclyAccessible": false,
"StorageType":"gp2",
"StorageEncrypted": true,
"VPCSecurityGroups":[
{
"Fn::ImportValue":{
"Fn::Sub":"${NetworkStackName}-RDSSecGrp"
}
}
],
"Tags":[
{
"Key":"Name",
"Value":"my-db"
}
]
}
}
Is there a way to run this query right after db instance is created through CF template?
Depends. If you want to do it from within CloudFormation (CFN) then, sadly, you can't do this using plain CFN. To do it from CFN, you would have to develop a custom resource, implemented as a Lambda function. You would pass the DB details to the function from your CFN template, and it could connect and execute your query. It could also return any results you want to your stack for further use.
In contrast, if you create your CFN stack using the AWS CLI or an SDK, then once the create-stack call completes, you can run your query from bash or whatever language you use to deploy your stack.
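A minimal sketch of such a custom-resource Lambda in Python. The property names (DbHost, DbUser, DbPassword, DbName) are invented for illustration and must match whatever your template passes in; pymssql is assumed to be packaged in a Lambda layer, since it is not part of the Lambda runtime.

```python
# Hedged sketch of a CloudFormation custom-resource Lambda that runs the
# ALTER DATABASE query after the RDS instance is created.
import json
import urllib.request

def build_statement(db_name):
    """The query from the question, parameterised on the database name."""
    # db_name comes from your own template properties, not end-user input.
    return "ALTER DATABASE [%s] SET DELAYED_DURABILITY = FORCED;" % db_name

def send_response(event, status):
    """Reply to CloudFormation's pre-signed callback URL (required)."""
    body = json.dumps({
        "Status": status,
        "PhysicalResourceId": "delayed-durability-setter",
        "StackId": event["StackId"],
        "RequestId": event["RequestId"],
        "LogicalResourceId": event["LogicalResourceId"],
    }).encode()
    req = urllib.request.Request(event["ResponseURL"], data=body, method="PUT")
    urllib.request.urlopen(req)

def handler(event, context):
    if event["RequestType"] == "Create":
        import pymssql  # assumed packaged in a Lambda layer
        props = event["ResourceProperties"]
        conn = pymssql.connect(server=props["DbHost"], user=props["DbUser"],
                               password=props["DbPassword"], autocommit=True)
        conn.cursor().execute(build_statement(props["DbName"]))
        conn.close()
    send_response(event, "SUCCESS")
```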

How to POST json parameters from Postman to Jenkins?

I need to call a Jenkins job through its API using Postman. The job requires parameters (HOSTS, VERBOSITY and SANS_PMSP).
Auth works using a Jenkins token, and the header Content-Type: application/json is set.
I tried calling the endpoint https://jenkins_server/job/job_name/build/api/json with the following body, but nothing is submitted and the job doesn't run.
I also tried the endpoint https://jenkins_server/job/job_name/buildWithParameters/api/json with the same body. I get 201 Created (the job runs), but no parameters are passed to the job.
{
"parameter": [
{
"name": "HOSTS",
"value": "[linux]\n1.2.3.4"
},
{
"name": "VERBOSITY",
"value": "vv"
},
{
"name": "SANS_PMSP",
"value": true
}
]
}
Is my JSON well constructed ? Which endpoint do I need to call ?
If it's Postman that you would like to focus on, you can import a curl command straight into the application.
This creates a new request and populates it based on the details in the command.
From there, you can adjust the URL to point at the location you need.
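For reference, Jenkins' buildWithParameters endpoint reads parameters from the query string or a form-encoded body rather than from a JSON payload, which would explain the 201 response with no parameters applied. A hedged Python sketch (server URL, user and API token are placeholders):

```python
# Trigger the parameterised Jenkins job with form-encoded parameters.
import base64
import urllib.parse
import urllib.request

def make_build_request(base_url, job, params, user, api_token):
    """POST form-encoded parameters to Jenkins' buildWithParameters."""
    url = "%s/job/%s/buildWithParameters" % (base_url.rstrip("/"), job)
    req = urllib.request.Request(url, data=urllib.parse.urlencode(params).encode())
    cred = base64.b64encode(("%s:%s" % (user, api_token)).encode()).decode()
    req.add_header("Authorization", "Basic " + cred)  # user + API token
    return req

req = make_build_request("https://jenkins_server", "job_name",
                         {"HOSTS": "[linux]\n1.2.3.4",
                          "VERBOSITY": "vv",
                          "SANS_PMSP": "true"},
                         "myuser", "my-api-token")
# urllib.request.urlopen(req)  # uncomment to actually queue the build
```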

New-StreamAnalyticsJob cannot create Operations Monitoring Input for an IOT Hub

We have a Stream Analytics job with an Input mapped to an IoT Hub Operations Monitoring endpoint. We originally defined the job in the Azure Portal, and it works fine when created/updated that way.
We use the job logic in multiple "Azure environments" and are now keeping it in source control. We used the Visual Studio Stream Analytics Project type to manage the source code.
We are using the New-StreamAnalyticsJob PowerShell command to deploy the job into different environments.
Each time we deploy, however, the resulting Stream Analytics job's Input points to the Messaging endpoint of our IoT Hub instead of the Operations Monitoring endpoint.
Is there something we can add to the input's JSON file to specify the endpoint type? Here is the Inputs section of the JSON we pass to the cmdlet:
"Inputs": [{
"Name": "IOT-Hub-Monitoring-By-Consumer-Group",
"Properties": {
"DataSource": {
"Properties": {
"ConsumerGroupName": "theConsumerGroup",
"IotHubNamespace": "theIotNamespace",
"SharedAccessPolicyKey": null,
"SharedAccessPolicyName": "iothubowner"
},
"Type": "Microsoft.Devices/IotHubs"
},
"Serialization": {
"Properties": {
"Encoding": "UTF8",
"Format": "LineSeparated"
},
"Type": "Json"
},
"Type": "Stream"
}
},
{
"Name": "IOT-Hub-Messaging-By-Consumer-Group",
"Properties": {
"DataSource": {
"Properties": {
"ConsumerGroupName": "anotherConsumerGroup",
"IotHubNamespace": "theIotNamespace",
"SharedAccessPolicyKey": null,
"SharedAccessPolicyName": "iothubowner"
},
"Type": "Microsoft.Devices/IotHubs"
},
"Serialization": {
"Properties": {
"Encoding": "UTF8",
"Format": "LineSeparated"
},
"Type": "Json"
},
"Type": "Stream"
}
}
]
Is there an endpoint element within the IotHubProperties that we're not expressing? Is it documented somewhere?
I notice that the Azure Portal calls a different endpoint than is indicated here: https://learn.microsoft.com/en-us/rest/api/streamanalytics/stream-analytics-definition
It uses endpoints under https://main.streamanalytics.ext.azure.com/api. e.g.
GET /api/Jobs/GetStreamingJob?subscriptionId={guid}&resourceGroupName=MyRG&jobName=MyJobName
You'll notice in the results JSON:
{
"properties": {
"inputs": [{
"properties": {
"datasource": {
"inputIotHubSource": {
"iotHubNamespace": "HeliosIOTHubDev",
"sharedAccessPolicyName": "iothubowner",
"sharedAccessPolicyKey": null,
---> "endpoint": "messages/events", <---
"consumerGroupName": "devicehealthmonitoring"
}
For operations monitoring you will see "endpoint":"messages/operationsMonitoringEvents"
They seem to implement Save for Inputs as PATCH /api/Inputs/PatchInput?... which takes a similarly constructed JSON with the same 2 values for endpoint.
Are you able to use that endpoint somehow? i.e. call New-AzureRmStreamAnalyticsJob as you normally would then Invoke-WebRequest -Method Patch -Uri ...
--Edit--
The Invoke-WebRequest was a no-go -- far too much authentication to try to replicate/emulate.
A better option is to go through this tutorial to create a console application and set the endpoint after deploying using the Powershell scripts.
Something like this should work (albeit with absolutely no error/null checks):
string tenantId = "..."; //Tenant Id Guid
string subscriptionId = "..."; //Subcription Id Guid
string rgName = "..."; //Name of Resource Group
string jobName = "..."; //Name of Stream Analytics Job
string inputName = "..."; //Name-of-Input-requiring-operations-monitoring
string accesskey = "..."; //Shared Access Key for the IoT Hub
var login = new ServicePrincipalLoginInformation();
login.ClientId = "..."; //Client / Application Id for AD Service Principal (from tutorial)
login.ClientSecret = "..."; //Password for AD Service Principal (from tutorial)
var environment = new AzureEnvironment
{
AuthenticationEndpoint = "https://login.windows.net/",
GraphEndpoint = "https://graph.windows.net/",
ManagementEndpoint = "https://management.core.windows.net/",
ResourceManagerEndpoint = "https://management.azure.com/",
};
var credentials = new AzureCredentials(login, tenantId, environment)
.WithDefaultSubscription(subscriptionId);
var azure = Azure
.Configure()
.WithLogLevel(HttpLoggingDelegatingHandler.Level.Basic)
.Authenticate(credentials)
.WithDefaultSubscription();
var client = new StreamAnalyticsManagementClient(credentials);
client.SubscriptionId = azure.SubscriptionId;
var job = client.StreamingJobs.List(expand: "inputs").Where(j => j.Name == jobName).FirstOrDefault();
var input = job.Inputs.Where(i => i.Name == inputName).FirstOrDefault();
var props = input.Properties as StreamInputProperties;
var ds = props.Datasource as IoTHubStreamInputDataSource;
ds.Endpoint = "messages/operationsMonitoringEvents";
ds.SharedAccessPolicyKey = accesskey;
client.Inputs.CreateOrReplace(input, rgName, jobName, inputName);
The suggestion from @DaveMontgomery was a good one but turned out not to be needed.
A simple cmdlet upgrade addressed the issue.
The root issue turned out to be that the Azure PowerShell cmdlets, up to and including version 4.1.x, were using an older version (1.0) of the Microsoft.Azure.Management.StreamAnalytics assembly. Version 2.0 of Microsoft.Azure.Management.StreamAnalytics came out some months ago, and that release included, as I understand it, the addition of an endpoint element to the Inputs JSON structure.
The new CMDLETs release is documented here: https://github.com/Azure/azure-powershell/releases/tag/v4.2.0-July2017. The commits for the release included https://github.com/Azure/azure-powershell/commit/0c00632aa8f767e58077e966c04bb6fc505da1ef, which upgrades to Microsoft.Azure.Management.StreamAnalytics v2.0.
Note that this was a breaking change, in that the JSON changed from PascalCase to camelCase.
With this change in hand, we can add an endpoint element to the Properties / DataSource / Properties section of the IoT input, and the deployed Stream Analytics job's IoT Input is properly bound to the Operations Monitoring endpoint.
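A sketch of what the monitoring input from the question might look like with the upgraded cmdlets, using camelCase per the breaking change described above; the exact placement of the endpoint property is inferred from this description, so verify it against the Microsoft.Azure.Management.StreamAnalytics v2.0 schema:

```json
"inputs": [{
  "name": "IOT-Hub-Monitoring-By-Consumer-Group",
  "properties": {
    "type": "Stream",
    "datasource": {
      "type": "Microsoft.Devices/IotHubs",
      "properties": {
        "consumerGroupName": "theConsumerGroup",
        "iotHubNamespace": "theIotNamespace",
        "sharedAccessPolicyKey": null,
        "sharedAccessPolicyName": "iothubowner",
        "endpoint": "messages/operationsMonitoringEvents"
      }
    },
    "serialization": {
      "type": "Json",
      "properties": { "encoding": "UTF8", "format": "LineSeparated" }
    }
  }
}]
```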