I am trying to deploy a MuleSoft application to DEV CloudHub using an Azure DevOps CI/CD pipeline. But when the application gets deployed to CloudHub, it throws an error like:

```
ConfigurationPropertiesException: YAML configuration properties only supports string values, make sure to wrap the value with \" so you force the value to be an string. Offending property is sharepoint.auth with value null"}
```
Below is my DEV YAML file:
```yaml
# Sharepoint
sharepoint:
  maxCouncurrency: "2"
  siteUrl: "https://example.com/sites/D365Admin"
  auth:
    ##username: ""
    ##password: ""
  listener:
    path: "/sites/D365Admin/NP/SF"
    polling:
      frequency: "60" # In seconds
      startDelay: "30" # In seconds
  archive:
    path: "/sites/D365Admin/NP/SF-Archive"
```
I commented out the username and password because they are supplied via pipeline variables. But I am not sure why I am getting the error on sharepoint.auth. Any suggestion on what I am missing here?
The error message seems pretty clear. The property sharepoint.auth doesn't have a string value; it has a value of null, which is not allowed. Only strings are allowed, and null is not a string. Try commenting it out or assigning it an empty string (""). If you are not using it for anything, just comment it out.
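Concretely, when both children of auth: are commented out, the parent auth key is left with a null value. A sketch of the two fixes suggested above, based on the YAML from the question:

```yaml
# Option 1: comment out the whole block, parent key included
##auth:
##  username: ""
##  password: ""

# Option 2: keep the keys, but give them empty-string values
# (the pipeline variables can then override them at deploy time)
auth:
  username: ""
  password: ""
```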
I've read the documentation for creating an Airflow Connection via an environment variable and am using Airflow v1.10.6 with Python 3.5 on Debian 9.
The linked documentation above shows an example S3 connection of s3://accesskey:secretkey@S3. From that, I defined the following environment variable:

```
AIRFLOW_CONN_AWS_S3=s3://#MY_ACCESS_KEY#:#MY_SECRET_ACCESS_KEY#@S3
```
And the following function:

```python
import airflow.hooks.S3_hook

def download_file_from_S3_with_hook(key, bucket_name):
    """Get file contents from S3"""
    hook = airflow.hooks.S3_hook.S3Hook('aws_s3')
    obj = hook.get_key(key, bucket_name)
    contents = obj.get()['Body'].read().decode('utf-8')
    return contents
```
However, when I invoke that function I get the following error:

```
Using connection to: id: aws_s3.
Host: #MY_ACCESS_KEY#,
Port: None,
Schema: #MY_SECRET_ACCESS_KEY#,
Login: None,
Password: None,
extra: {}
ERROR - Unable to locate credentials
```
It appears that when I format the URI according to Airflow's documentation, it's setting the access key as the host and the secret access key as the schema.
It's clearly reading the environment variable, as it has the correct conn_id. It also has the correct values for my access key and secret; it's just parsing them into the wrong fields.
When I set the connection in the UI, the function works if I set Login to my access key and Password to my token. So how am I formatting my environment variable URI wrong?
Found the issue: s3://accesskey:secretkey@S3 is the correct format. The problem was that my aws_secret_access_key had a special character in it and had to be URL-encoded. That fixed everything.
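A minimal sketch of that fix (the helper name and sample keys below are mine, not from the original post): percent-encode both credentials before building the connection URI, so characters such as / or + in the secret can't be mistaken for URI delimiters.

```python
from urllib.parse import quote_plus

def build_s3_conn_uri(access_key, secret_key, host="S3"):
    # Percent-encode the credentials so special characters
    # don't break Airflow's URI parsing.
    return "s3://{}:{}@{}".format(quote_plus(access_key),
                                  quote_plus(secret_key), host)

# A secret containing "/" and "+" is encoded safely:
print(build_s3_conn_uri("AKIAEXAMPLE", "abc/def+ghi"))
# -> s3://AKIAEXAMPLE:abc%2Fdef%2Bghi@S3
```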
I have a problem with a build pipeline in Azure DevOps: variables from the build pipeline are not replacing the empty configuration in appsettings.json. More details below.
My current test project is built with ASP.NET Core and is connected to SQL Server. I also use Entity Framework Core and Autofac.
To connect to SQL Server I use this appsettings.json configuration:
```json
{
  "ConnectionStrings": {
    "AzureDbConnectionString": ""
  }
}
```
but my credentials are stored in secrets.json:
```json
{
  "ConnectionStrings": {
    "AzureDbConnectionString": "Server=tcp:servername-db-srv.database.windows.net,1433;Initial Catalog=dbname-db;Persist Security Info=False;User ID=user;Password=Password;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
  }
}
```
I've configured my build pipeline variable:
Name:
ConnectionStrings--AzureDbConnectionString
Value:
Server=tcp:servername-db-srv.database.windows.net,1433;Initial Catalog=dbname-db;Persist Security Info=False;User ID=user;Password=Password;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
The problem occurs while running Generate Migration Scripts in the build pipeline:

```
Autofac.Core.DependencyResolutionException: An exception was thrown while activating λ:Microsoft.EntityFrameworkCore.DbContextOptions[] -> λ:Microsoft.EntityFrameworkCore.DbContextOptions -> λ:Microsoft.EntityFrameworkCore.DbContextOptions`1[[AspNetAutofacAzure02.Data.SchoolContext, AspNetAutofacAzure02, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null]]. ---> System.ArgumentException: The string argument 'connectionString' cannot be empty.
at Microsoft.EntityFrameworkCore.Utilities.Check.NotEmpty(String value, String parameterName)
```
Like I mentioned, it looks like the variable isn't being used while generating the script.
Is there anything I'm doing wrong here?
Did you try ConnectionStrings:AzureDbConnectionString? That's the normal format for overriding appsettings.json. Or ConnectionStrings__AzureDbConnectionString, the environment-variable form.
If the value is coming from Key Vault in the format ConnectionStrings--AzureDbConnectionString, then just map a new variable: ConnectionStrings:AzureDbConnectionString = $(ConnectionStrings--AzureDbConnectionString)
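In a YAML pipeline, that mapping could look like the sketch below (the variable-group name is hypothetical; the variable names are taken from the question):

```yaml
variables:
  # Hypothetical group backed by Key Vault, exposing
  # ConnectionStrings--AzureDbConnectionString
  - group: keyvault-variables
  # Re-expose it under the name the .NET configuration binder expects
  - name: ConnectionStrings:AzureDbConnectionString
    value: $(ConnectionStrings--AzureDbConnectionString)
```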
You can add a Set Json Property task to replace the ConnectionStrings. You may need to install this task to your organization first.
Usage:
First, click the 3 dots on the right side to locate your appsettings.json file.
Then set the Property path value to ConnectionStrings.AzureDbConnectionString.
Last, set the Property value to your pipeline variable $(ConnectionStrings--AzureDbConnectionString).
I want to add the token that I generated in firebase-tools using firebase login:ci to GitLab CI. I went to Settings -> CI/CD -> Variables and added the environment variable with the key FIREBASE_TOKEN.
However I get:
Validation failed:
- Variables value is invalid.
The value I gave is a 25-digit key generated by the Firebase CLI as mentioned above.
What is wrong in this and what must I do?
I found the answer myself. GitLab doesn't allow certain characters, such as - or /, as the value of the environment variables, so I split the key into 2 environment variables.
EDIT #1: The problem was that I had the 'Mask' option turned on. I turned it off and was able to give the whole key as a single variable. Voila!
When I publish my .NET Core app to a development IIS server and make a call to the API, a method that works fine locally fails. This method makes a Db call using a connection string stored in appSettings.json as well as appSettings.Development.json.
Observations:
- Yes, I have ASPNETCORE_ENVIRONMENT = Development, and I have both an appSettings.json file AND an appSettings.Development.json file
- Looking at the published files, I had BOTH of these json files in the published folder, even though appSettings.Development.json has its properties set to Build Action = Content and Copy to Output Directory = Do Not Copy
- If I comment out the code that hits the Db, return dummy data, and republish the API, I get the results fine with no complaints about "development mode"
Error I get when calling the API trying to hit the Db:

```
Error.
An error occurred while processing your request.
Request ID: 0HLL1GOCEHH73:00000001

Development Mode
Swapping to the Development environment displays detailed information about the error that occurred.
The Development environment shouldn't be enabled for deployed applications.
It can result in displaying sensitive information from exceptions to end users.
For local debugging, enable the Development environment by setting the ASPNETCORE_ENVIRONMENT environment variable to Development and restarting the app.
```
Questions:
- The Db connection strings are in both appSettings files
[ update ]
Brain-fart! The db connection strings were using 'trusted' authentication, so no wonder they worked locally! Once I put in the credentials and re-published, things worked like I expected. However, the error message threw me off.
I'm still not sure why I have both of those appSettings files published. Which one will it use?
I am assuming your appsettings files are named as follows:
appSettings.json
appSettings.dev.json
Typically, you have to explicitly set the environment to dev. If you use Visual Studio for development, it sets an environment variable that tells the application to run in dev mode.
Without seeing the initializing logic, I would say in prod it will use appSettings.json.
Take a look at this article; it explains configuration in more detail.
When following the instructions on http://developer.gooddata.com/article/loading-data-via-api, I always get an HTTP 400 error:

```
400: Neither expected file "upload_info.json" nor archive "upload.zip" found (is accessible) in ""
```

When I HTTP GET the same path that I used for the HTTP PUT, the file downloads just fine.
Any pointers to what I'm probably doing wrong?
GoodData is going through a migration from AWS to Rackspace.
Try to change in all GET/POST/PUT requests:
secure.gooddata.com to na1.secure.gooddata.com
secure-di.gooddata.com to na1-di.gooddata.com
You can check the datacenter where the project is located via the /gdc/projects/{projectId} resource - the "project.content.cluster" field.
For example, https://secure.gooddata.com/gdc/projects/myProjectId:

```
{
  "project" : {
    "content" : {
      "cluster" : "na1",
      ....
```

For AWS this field has an empty value; "na1" means Rackspace.
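Based on that, a small sketch (the helper function is mine, not part of the GoodData API) for picking the API host from the cluster field:

```python
def api_host_for_cluster(cluster):
    """Map a project's "project.content.cluster" value to the API host.

    An empty value means the project is still on AWS; "na1" means it
    has been migrated to Rackspace.
    """
    if cluster == "na1":
        return "na1.secure.gooddata.com"
    return "secure.gooddata.com"

print(api_host_for_cluster("na1"))  # na1.secure.gooddata.com
print(api_host_for_cluster(""))     # secure.gooddata.com
```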