Kotlin console application - store credentials in app.properties and define the path to it with env variables - kotlin

I'm building a Kotlin console app that's going to be installed on one of my customer's servers. The app downloads some data from a REST API. Before pulling the data, the app needs to obtain a token by issuing a login request with a username and password.
I'm going to package the app using the Badass Runtime Plugin with the runtime parameter, which basically creates a folder containing my jar, its dependencies and a bunch of scripts.
To avoid hard-coding the password in the application code I came up with the following approach:
I'm going to store the password in an application.properties file and pass its file path to the app using an environment variable.
import java.io.FileInputStream
import java.util.Properties

val props = Properties()
props.load(FileInputStream(System.getenv("ECC_CONFIGS_PATH"))) // ECC_CONFIGS_PATH points to application.properties
println(props.getProperty("password"))
When installing the app on the client's computer I'm going to create the application.properties file and set the required env variable.
Is this a good approach, or is it preferable to pass the path to application.properties using a -D parameter? In that case, how do you access its value in the application code?
Are there better approaches than the properties file?
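If the -D route is preferable, I assume the value would be read with System.getProperty instead of System.getenv. A rough, untested sketch of what I have in mind (the property name ecc.configs.path is just a made-up example):

import java.io.FileInputStream
import java.util.Properties

fun loadConfig(): Properties {
    // e.g. launched with: java -Decc.configs.path=/opt/app/application.properties -jar app.jar
    val path = System.getProperty("ecc.configs.path")    // value of the -D parameter
        ?: System.getenv("ECC_CONFIGS_PATH")             // fall back to the env variable approach
        ?: error("No config path given via -Decc.configs.path or ECC_CONFIGS_PATH")
    return Properties().apply { FileInputStream(path).use { load(it) } }
}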

Related

Nuxt JS - reading conf/env file in static site generation

My Nuxt JS project is set up with target: static and ssr: false.
The app needs to connect to a local endpoint to retrieve some information.
I have multiple endpoints and I need multiple instances of the app, and each instance must read only its own endpoint.
The question is: how can I change the endpoint address for each app without rebuilding all of them?
I tried with an env file or a JSON file in the static folder (so that the file is accessible in the dist folder after the build process).
But if I modify the content of the env/JSON file in the dist folder and then reload the webpage (or even restart the web server that serves the dist folder), the app continues to use the original endpoint provided at build time.
Is there a way to do this, or do I have to switch to server-side rendering (which I would rather not use)?
Thank you!
When you use SSG, it will bundle your app at build time. Last time I checked, there was no hack around that. (I don't have the GitHub issue at hand, but it's a popular one.)
At the same time, I don't really see how it could work, since you want to mix something static and something dynamic at the same time.
SSR is the only way here.
Otherwise, you could have some other logic (not related to Nuxt) that generates dynamic markup when your endpoints change, by fetching a remote endpoint, I guess.
With the nuxt content module it's possible to create a "/content" folder in the project directory and read JSON files from that directory.
Then, when creating the dist with the nuxt generate command, the "content" folder is included in the "_nuxt" folder of the dist; if you modify the content of the JSON file and refresh the webpage that reads it, the new values are picked up.

How to pass sensitive data to the IConfiguration interface during startup?

I want to use the configuration setup for my .NET Core Web API. I installed the Microsoft.Extensions.Configuration package for the DI container.
First of all, I have 4 config files:
appsettings.json
Whenever all three environments use the same config value, this is the file to put it in.
appsettings.Development.json
Basic config values for development purposes only, e.g. the database connection points to localhost and the token secret is "secret". Example:
{
  "Database": {
    "ConnectionString": "Server=localhost;Port=3306;Database=db;Uid=root;Pwd=admin;Pooling=true;"
  }
}
appsettings.Staging.json
Almost the same as the development file
appsettings.Production.json
Things are different here. I can't put sensitive information in that file, e.g. the token secret. These values should come from environment variables.
So in my code I can access the config values via dependency injection
public class MyClass
{
    public MyClass(IConfiguration configuration)
    {
        string databaseConnectionString = configuration["Database:ConnectionString"];
    }
}
But what if the code runs in production mode? The information doesn't exist in the production file, so I would have to read it from the environment variables.
Would I have to create an environment variable called Database:ConnectionString, and does .NET Core map all system environment variables into the configuration during startup if they don't already exist? Or how else would I pass sensitive data to the configuration?
With the default builder, ASP.NET Core will load the configuration from multiple sources, where later sources have the chance to overwrite earlier ones. The default sources in non-development environments are the following:
General JSON configuration from appsettings.json
Environment-specific JSON configuration from appsettings.<Environment>.json
Environment variables, e.g. ConnectionStrings:DefaultConnection or ConnectionStrings__DefaultConnection (both map to the same configuration path)
Command-line arguments
So you have the ability to overwrite the configuration from the JSON files by default using both environment variables and command line arguments.
When it comes to production use, there are also other means to protect the secrets. For example, you could simply edit the appsettings.Production.json during deployment, so that the values will never leave the machine itself.
There are several solutions for this, but obviously you want to keep things such as connection strings out of version-controlled files.
If running locally, I would suggest using the user secrets functionality in Visual Studio.
However, you can also set environment variables from the CLI. This is an example from the documentation on configuration:
set MyKey="My key from Environment"
set Position__Title=Environment_Editor
set Position__Name=Environment_Rick
dotnet run
Of course, when running in Azure, Key Vault is a good place to put these kinds of secrets, and it also has great integration with .NET.

How/where to store sensitive information in a .NET Core project in Azure DevOps/local

In short, my scenario is to test a remote API for any changes in the called APIs, like a parameter being removed or something like that.
To get this info I need to have a token.
My problem is, I can't store it in the database and use windowsCredentials, because in the Azure Pipeline the build agents have no access or connection to the database. And if I pass the token through variables in the pipeline, then I won't have the token when I run the code locally.
The appsettings file is stored in Git, so it is not safe.
Any idea on this?
Thanks!
UserSecrets + environment variables are the key here. appsettings.json is for configuration that is non-sensitive, but there is a concept called user secrets (see the link below) that allows you to store an appsettings.json equivalent just on your machine and not in Git. Anything specified in it overrides or adds to what is in your appsettings.json.
If that info is also needed in production, then environment variables should be used. Instead of a file, single configuration values can be specified or overridden using environment variables.
This is all accomplished ONLY if the ASP.NET Core web server configuration is set up to accept configuration from all of these places. The default setup from the template should accomplish this, but read the links below to make sure that your setup works.
All the configuration and best practices can be found here.
Non sensitive info/Defaults: appsettings.json
Sensitive info for devs/dev specific info: UserSecrets
Sensitive info for prod: Environment variables

How to manage database credentials for a Mule project

I am using the database connector component, with the vault component to store the database credentials. As per the documentation of both components, I have created a different properties file for each environment to store that environment's encrypted credentials.
Following is the structure of my Mule project.
Now the problem with this structure is that I have to build a new deployable zip file whenever I have to update the database credentials for any environment.
I need a solution where I can keep all credentials encrypted and centralized, and I don't have to create a new build every time the credentials are updated. We can afford to restart the server, but building a new zip and deploying it is really cumbersome.
The second problem with this approach is that a developer needs to know the production DB credentials to put them in the properties file, which is also a security issue.
Please suggest an alternative approach to credentials management for Mule projects.
I'm going to recommend you do NOT try to change the secure solution provided to you by MuleSoft. To avoid the need for packaging and deployment, you would have to move the properties files outside of the deployment, and that would be a huge risk. Regardless of where you store the property files within the deployment, if you change the files you have to package and re-deploy. I see the only solution to your problem as moving the files outside of the deployment and storing them securely. Mule has provided a solution that, while it may be cumbersome, secures these files first with encryption and secondly within the server container. You can move the property files out, but you would have to provide a custom implementation, and you would be assuming great risk to your protected resources.
Set a VM argument, e.g. environment.type=local, for your local machine in Anypoint Studio.
Read this variable wherever you are reading your properties file, so that the environment type is resolved dynamically, such as below:
<secure-property-placeholder:config location="classpath:properties/sample-app-${environment.type}.properties" doc:name="Secure Property Placeholder"/>
In order to set the environment type on your production server (or wherever you are using the Mule runtime), open \conf\wrapper.conf and add the argument wrapper.java.additional.<n>=-Denvironment.type=production. If you already have properties in this file, you may need to set the value of <n> appropriately, for example 13 or 14.
This way you don't need to generate different deployment artefacts for different environments, because the correct properties file is picked up via the environment-specific VM argument.

VSTS: Different Config Files (WCF endpoint addresses) for different environments using RM

I have different projects that consume many WCF services. I am using VSTS to automate deployments. Those services target different URLs (endpoint addresses) based on the environment where they are going to be deployed.
I am trying to use Web Deploy with VSTS release management as suggested in this link: WebDeploy with VSTS, which proposes to create:
Parameters.xml
Then, add a new "Replace Tokens" task with the specified variable for each environment.
However, I don't think this will work for me, because it generates tokens only for app settings keys (which is not my case).
Is there a workaround or any other suggestion that could help me with the configuration part?
"Replace Tokens" task can works with any config file in your project and what content to be replaced is also controlled by you.
For example, if you want to replace a URL in "myconfig.config" file. You can set the URL in the config file to "#{targeturl}#", and add a "Replace Tokens" task in your definition with the following settings: (You can change the token prefix and suffix, but remember to update it accordingly in the config file since the task find the strings to replace base on it)
And then create a variable "targeturl" in the definition with the actual URL value:
Now, when you start the build/release, the string "#{targeturl}#" in "myconfig.config" file will be replaced with "www.test.com".