How does Azure Key Vault provide better security than encrypted configuration settings? - asp.net-core

I have an ASP.NET Core website that is used to store and retrieve encryption keys which are, at times, used to "sign transactions" on behalf of the user.
Since I'm in Azure, research indicates the most secure way to store these keys is via Azure Key Vault.
I've looked at this article, which shows that to gain access to Azure Key Vault values, I would end up using credentials stored in the Web App's Application Configuration settings.
// Connect to Azure Key Vault using the Client Id and Client Secret (AAD) - Get them from Azure AD Application.
var keyVaultEndpoint = settings["AzureKeyVault:Endpoint"];
var keyVaultClientId = settings["AzureKeyVault:ClientId"];
var keyVaultClientSecret = settings["AzureKeyVault:ClientSecret"];
if (!string.IsNullOrEmpty(keyVaultEndpoint) && !string.IsNullOrEmpty(keyVaultClientId) && !string.IsNullOrEmpty(keyVaultClientSecret))
{
    config.AddAzureKeyVault(keyVaultEndpoint, keyVaultClientId, keyVaultClientSecret, new DefaultKeyVaultSecretManager());
}
Web App Application Configuration settings are encrypted at rest and during transit, but their values can be leaked in a number of ways, hence the need for Key Vault.
My question, however, is this: if I have to store the Key Vault access credentials somewhere in the app configuration, doesn't that essentially limit the security of the Key Vault values to the same level that the configuration settings already provide? Does the extra level of indirection make a difference somehow?
What's the point of using Key Vault if someone can simply read the Key Vault credentials from the web app's configuration settings and access the vault that way?
What am I missing?

Related

how to make a global environment variable accessible for PODs in a kubernetes cluster

In my company, we have an internal Security Token Service (STS) consumed by all web apps to validate the STS token issued by the company's central access management server (e.g. BigIP/APM). Therefore the same endpoint for the token-validation REST API has to be set repeatedly as an environment variable in the Deployment Configuration of each individual web app (OpenShift project). The same goes for an ES256 public key used by each web app to validate JWT tokens.
I'm wondering whether there is a way to set up a global environment variable, ConfigMap, or anything else in OpenShift for this kind of common, shared, per-cluster setting, so that it is accessible by default to all web apps running in all pods in the cluster. Of course, each individual Deployment Config should be able to override these defaults at will.
Nothing built in. You could build that yourself with some webhooks and custom code. Otherwise you need to add an envFrom pointing at a Secret and/or ConfigMap to each pod template and copy that Secret/ConfigMap to all namespaces that need it (kubed can help with that part at least).
I'm wondering if there exists a way to set up a global Environment variable or ConfigMap or anything else in Openshift for these kind of common, shared settings per cluster ...
When it comes to microservices, it is good practice to share nothing and avoid tight coupling; it's typically not good to have global variables.
This will be difficult to evolve and maintain, and keys are something you should rotate regularly.
In my company, we have an internal Security Token Service consumed by all web apps to validate the STS token issued by the company central access management server (e.g BigIP/APM).
So is an ES256 public key used by each web app for validating JWT token.
When you receive a JWT, you should inspect the iss (issuer) claim - the value is typically an HTTPS URL - and if you trust the issuer, you can usually find an OpenID Connect Discovery endpoint where the issuer publishes a JSON Web Key Set (JWKS) with the keys needed to validate the token.
With this architecture you have a central service that issues tokens and also publishes the keys to validate them, so there is no need to distribute the keys in any other way - no shared variables. You also have a single place to rotate the keys, which makes everything easier to maintain.
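A minimal sketch of that flow in ASP.NET Core, assuming the Microsoft.AspNetCore.Authentication.JwtBearer package; the issuer URL and audience below are placeholders, not values from the question:

using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.IdentityModel.Tokens;

var builder = WebApplication.CreateBuilder(args);

builder.Services
    .AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddJwtBearer(options =>
    {
        // The middleware downloads <Authority>/.well-known/openid-configuration
        // and the JWKS it references, so signing keys are fetched (and refreshed
        // on rotation) instead of being copied into every deployment.
        options.Authority = "https://sts.example.com";   // placeholder issuer URL
        options.TokenValidationParameters = new TokenValidationParameters
        {
            ValidIssuer = "https://sts.example.com",      // must match the iss claim
            ValidAudience = "my-api"                      // placeholder audience
        };
    });

var app = builder.Build();
app.Run();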

ASP.NET Core Data Protection with Azure Key Vault for containerized app deployment to Azure Kubernetes Service

I have an ASP.NET Core app that I deploy in a containerized manner to Azure Kubernetes Service (AKS), and when running just a single replica of the app it is functional and works as expected.
However, when I run multiple replicas, I run into an error from the OIDC provider: “Unable to protect the message.State”.
Upon further research I have figured out that using ASP.NET Core Data Protection as depicted here is the solution -
https://learn.microsoft.com/en-us/aspnet/core/security/data-protection/configuration/overview?view=aspnetcore-5.0#persisting-keys-when-hosting-in-a-docker-container
However, the above link does not expand on how to use this approach when the key is stored in Azure Key Vault. Assuming I have protected my keys in AKV, how do I actually use them in my app? Is there a sample or guidance on this aspect?
First of all, I would recommend that the client instance that starts the sign-in (with AddOpenIdConnect(...)) is the same instance that handles the callback from your identity provider (/signin-oidc). The state parameter that it sets when it first redirects you to the identity provider must match the returned response (for security reasons).
To make sure that the cookies issued to the user's browser stay valid all the time, you need to make sure that:
All client instances use the same data protection encryption key
The key stays the same across redeployments.
You can, for example, store this key in Azure Key Vault, SQL Server, or somewhere else.
By the way, I did a blog post about the Data Protection API here and how you can store the key ring in AKV as well.
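For reference, a minimal sketch of that setup, assuming the Azure.Extensions.AspNetCore.DataProtection.Blobs and Azure.Extensions.AspNetCore.DataProtection.Keys packages; the application name, blob URI, and key identifier below are placeholders, and DefaultAzureCredential stands in for whatever identity your pods use:

using Azure.Identity;
using Microsoft.AspNetCore.DataProtection;

var builder = WebApplication.CreateBuilder(args);

// Every replica reads and writes the same key ring, so a cookie or OIDC state
// protected by one pod can be unprotected by any other pod.
builder.Services.AddDataProtection()
    .SetApplicationName("my-app")   // placeholder shared application name
    .PersistKeysToAzureBlobStorage(
        new Uri("https://mystorage.blob.core.windows.net/dataprotection/keys.xml"),   // placeholder blob URI
        new DefaultAzureCredential())
    // Wrap the key ring at rest with a Key Vault key (a key, not a secret).
    .ProtectKeysWithAzureKeyVault(
        new Uri("https://myvault.vault.azure.net/keys/dataprotection"),   // placeholder key identifier
        new DefaultAzureCredential());

var app = builder.Build();
app.Run();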

Azure Storage - Allowed Microsoft Service when Firewall is set

I am trying to connect a public logic app (not ISE environment) to a storage account that is restricted to a Vnet.
According to the storage account documentation, access should be possible using a system-assigned managed identity.
However I just tried in 3 different subscriptions and the result is always the same:
{
  "status": 403,
  "message": "This request is not authorized to perform this operation.\r\nclientRequestId: 2ada961e-e4c5-4dae-81a2-520397f277a6",
  "error": {
    "message": "This request is not authorized to perform this operation."
  },
  "source": "azureblob-we.azconn-we-01.p.azurewebsites.net"
}
I have already granted access with different IAM roles, including Owner. It feels as though the service that should be allowed according to the documentation is not actually being allowed.
The Allow trusted Microsoft services... setting also allows a particular instance of the services below to access the storage account, if you explicitly assign an RBAC role to the system-assigned managed identity for that resource instance. In this case, the scope of access for the instance corresponds to the RBAC role assigned to the managed identity.
Azure Logic Apps (Microsoft.Logic/workflows) - Enables logic apps to access storage accounts.
https://learn.microsoft.com/en-us/azure/storage/common/storage-network-security#exceptions
What am I doing wrong?
Added screenshots:
https://i.stack.imgur.com/CfwJK.png
https://i.stack.imgur.com/tW7k9.png
https://i.stack.imgur.com/Lxyqd.png
https://i.stack.imgur.com/Sp7ZV.png
https://i.stack.imgur.com/Hp9JG.png
https://i.stack.imgur.com/rRbau.png
For authenticating access to Azure resources by using managed identities in Azure Logic Apps, you can follow the document. The logic app should be registered in the same subscription as your storage account. If you want to access a blob in an Azure Storage container, you can grant the Storage Blob Data Contributor role (read/write/delete permissions on Blob storage resources) to the Logic App's system-assigned identity on the storage account.
Update
From your screenshots, I found that you have not used the system-assigned managed identity to build the Create blob action; you are using an API connection instead.
To validate connecting a public logic app to a storage account with the Allow trusted Microsoft services... setting enabled, design your logic to use the managed identity in a trigger or action through the Azure portal. To specify the managed identity in a trigger or action's underlying JSON definition, see Managed identity authentication.
For more details, please read these steps in Authenticate access with managed identity.

asp user secret problem after publish project

My connection string is stored in user secrets.
There is no problem locally,
but after publishing and uploading the project to the server I get this error:
ArgumentNullException: Value cannot be null. (Parameter 'connectionString')
User secrets are only for development and are not intended for production. They are not encrypted and are stored in a JSON configuration file in the user profile directory.
For production, you can usually use JSON files (appsettings.json / appsettings.{Environment}.json), environment variables, or Azure Key Vault (which is recommended). Please read the article below for more details about configuration providers in ASP.NET Core:
Configuration in ASP.NET Core
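A minimal sketch of the recommended Key Vault option, assuming the Azure.Extensions.AspNetCore.Configuration.Secrets and Azure.Identity packages; the vault URI is a placeholder, and DefaultAzureCredential resolves to a managed identity when running in Azure:

using Azure.Identity;

var builder = WebApplication.CreateBuilder(args);

// appsettings.json and environment variables are loaded by default;
// in production, layer Key Vault secrets on top so the connection string
// never ships with the app.
if (builder.Environment.IsProduction())
{
    builder.Configuration.AddAzureKeyVault(
        new Uri("https://myvault.vault.azure.net/"),   // placeholder vault URI
        new DefaultAzureCredential());
}

var app = builder.Build();

// A Key Vault secret named "ConnectionStrings--Default" surfaces here:
var connectionString = builder.Configuration.GetConnectionString("Default");
app.Run();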
It seems the connectionStrings section is not yet configured on the production machine, so you must store the connection strings in the Web.config file on the production machine.
Connection Strings and Configuration Files
Besides that, you can protect the configuration by encrypting it using protected configuration.
Encrypting Configuration Information Using Protected Configuration
User secrets are used for development only and are not transferred/published with your app (to GitHub or to IIS); they are stored in your local profile. If you are looking to secure your credentials and sensitive data, the better way is to use Azure Key Vault (see Manage User Secrets).

Accessing Azure SQL Database from Azure Web App with VBScript and Azure Key Vault

How can I connect to a SQL database hosted on Microsoft Azure without having credentials in plain text in my .asp files or config files, using VBScript?
I want to have the database connection string stored in Azure Key Vault, and have the web app access the key vault to get the connection string and then connect to the database.
I have looked at a lot of Microsoft documentation, but it is all in C#. My web app is entirely VBScript and .asp files, and I don't want to spend the time rebuilding the whole web app in ASP.NET/.aspx.
Thank you
You don't need Azure Key Vault in this case.
What you can do is create a new App Setting in the App Settings of the Azure Web App and set its value to the connection string of your database. This creates an environment variable that you can access from VBScript. This post shows how to access an environment variable with VBScript.
I found a way!
If I want to use the environment variables set under App Settings:
' App Settings are exposed to the site as process environment variables with an APPSETTING_ prefix
Set objWSH = CreateObject("WScript.Shell")
Set objUserVariables = objWSH.Environment("Process")
Response.Write(objUserVariables("APPSETTING_testAppSet"))
The prefix (APPSETTING_) will be different if the variable is stored under Connection Strings instead.
One other way I was able to use is to store the DB connection string in Azure Key Vault and then use an OAuth access token to access the vault. In this method, you send a POST request to Azure AD with the ClientID and ClientSecret in the request body and get an access token back in the HTTP response. After that, you send a GET request to the Key Vault secret endpoint with the access token in the request header, and the HTTP response gives you the value of the Key Vault secret.
Another way to do it is to use the MSI_ENDPOINT and MSI_SECRET environment variables and send an HTTP request to them to get the access token. With that access token you can read a Key Vault secret as well (you have to make sure that the Key Vault access policy is set up correctly).