cfssl gencert fails when generating a certificate from JSON

cfssl v1.2
cfssl gencert -initca ca/ca-csr.json
where the JSON is:
{
"hosts": [
"cluster.local"
],
"key": {
"algo": "rsa",
"size": 2048
},
"names": [
{
"C": "CA",
"L": "Montreal",
"O": "My Inc.",
"OU": "QC",
"ST": "Montreal"
}
]
}
I get this error message:
Must specify bundle target through -cert or -domain
From the example it seems I'm using it the same way.
Checking the code, I'm not sure how it would reach that condition.
Q: What is the correct way to use it to generate the certificate from the JSON file?

You are using the incorrect binary.
Most likely you went to the releases section and obtained the first binary (cfssl-bundle_*) for your platform and renamed/aliased it to cfssl.
That is not the one that the linked tutorial uses.
Further down in the list of release artifacts you'll find a cfssl_<version>_<platform> binary which is the utility you want.
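Once you have the right binary, a quick sanity check looks something like this (just a sketch: exact artifact names vary per release and platform, and cfssljson is the companion tool normally used to write out the generated files):
# rename the cfssl_<version>_<platform> artifact (not cfssl-bundle_*) and its cfssljson counterpart
mv cfssl_<version>_<platform> cfssl && chmod +x cfssl
mv cfssljson_<version>_<platform> cfssljson && chmod +x cfssljson
./cfssl version                                                # prints version/revision instead of the bundle usage error
./cfssl gencert -initca ca/ca-csr.json | ./cfssljson -bare ca  # writes ca.pem, ca-key.pem and ca.csr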

Related

Newman loads the pfx certificate but it is not used to connect to the endpoint

I'm having an issue executing a Postman collection from newman; it involves loading a pfx certificate to establish a mutual-TLS connection.
From the Postman application, the certificate is loaded correctly (from the settings) and used for the domain https://domain1.com to connect to the mutual-TLS counterpart server.
When I export the JSON collection and environment, there is no mention of the domain or its associated certificate.
Checking the JSON schema here, newman accepts a certificate definition in the request, but applying it does not work. Here is my example:
"request": {
"method": "GET",
"header": [],
"certificate": {
"name": "Dev or Test Server",
"matches": ["https://domain1.com/*"],
"cert": { "src": "./certificate.pfx" }
},
"url": {
"raw": "https://domain1.com/as/authorization.oauth2",
"host": ["https://domain1.com"],
"path": ["as", "authorization.oauth2"],
"query": [
{
I also tried to apply the certificate configuration in an external file cert-list.json with the following content:
[{
"name": "Dev or Test Server",
"matches": ["https://domain1.com/*"],
"cert": { "src": "./certificate-t.pfx" }
}]
but it does not work either.
Here is the newman command:
newman run domain.postman_collection.json -n 1 --ssl-client-cert-list cert-list.json -e env.postman_environment.json -r cli --verbose
Do you know what I am doing wrong?
Change cert to pfx and try:
[{
"name": "Dev or Test Server",
"matches": ["https://domain1.com/*"],
"pfx": { "src": "./certificate-t.pfx" }
}]
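If the pfx is password-protected, the cert list format also accepts a passphrase field alongside pfx (going by the newman docs; the password below is a placeholder), and the newman command itself stays the same:
[{
  "name": "Dev or Test Server",
  "matches": ["https://domain1.com/*"],
  "pfx": { "src": "./certificate-t.pfx" },
  "passphrase": "your-pfx-password"
}]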

How can I customize the CHANGELOG.md generated by the standard-version npm package?

I run the standard-version command each time I want to publish a new version, but the resulting entries in CHANGELOG.md look like this:
### [10.1.9](https://github.com/my-project-name/compare/v10.1.8...v10.1.9) (2021-03-29)
### [10.1.8](https://github.com/my-project-name/compare/v10.1.7...v10.1.8) (2021-03-29)
### [10.1.7](https://github.com/my-project-name/compare/v10.1.6...v10.1.7) (2021-03-29)
First, the links do not work - the GitHub URL is not correct and I want to point it at the right URL. Second, I'd like to configure what is shown in the changelog file (the commit types/sections).
I tried to use this documentation but didn't find anything that could help me:
https://github.com/conventional-changelog/conventional-changelog
So how do I configure the way standard-version writes the CHANGELOG.md? Can someone provide an example?
Yes. According to the docs:
You can configure standard-version either by:
Placing a standard-version stanza in your package.json (assuming your project is JavaScript).
Creating a .versionrc, .versionrc.json or .versionrc.js.
If you are using a .versionrc.js your default export must be a configuration object, or a function returning a configuration object.
Any of the command line parameters accepted by standard-version can instead be provided via configuration.
Please refer to the conventional-changelog-config-spec for details on available configuration options.
example:
.versionrc
{
"types": [
{
"type": "feat",
"section": "Features"
},
{
"type": "fix",
"section": "Bug Fixes"
},
{
"type": "chore",
"hidden": true
},
{
"type": "docs",
"hidden": true
},
{
"type": "style",
"hidden": true
},
{
"type": "refactor",
"section": "Refactor"
},
{
"type": "perf",
"section": "Performance"
},
{
"type": "test",
"hidden": true
}
]
}
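For the broken compare links: standard-version normally derives the host/owner/repository from the repository field in package.json, so fixing that field may already be enough. If it isn't, the same config spec lets you override the URL templates explicitly. A sketch (the owner/repo below are placeholders):
{
  "commitUrlFormat": "https://github.com/my-org/my-project-name/commit/{{hash}}",
  "compareUrlFormat": "https://github.com/my-org/my-project-name/compare/{{previousTag}}...{{currentTag}}",
  "issueUrlFormat": "https://github.com/my-org/my-project-name/issues/{{id}}",
  "types": [
    { "type": "feat", "section": "Features" },
    { "type": "fix", "section": "Bug Fixes" }
  ]
}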

How to configure SSL for an API in Kong

I'm working with Kong 0.13.1. Following the docs, I added a certificate as follows:
{
"data": [
{
"cert": "certificate is really here",
"created_at": 1529667116000,
"id": "6ae77f49-a13f-45b1-a370-8d53b35d7bfd",
"key": "The key is really here",
"snis": [
"myapp.local",
"mockbin.myapp.local"
]
}
],
"total": 1
}
Then I added an API, which works perfectly well over HTTP:
{
"data": [
{
"created_at": 1529590900803,
"hosts": [
"mockbin.myapp.local"
],
"http_if_terminated": false,
"https_only": false,
"id": "216c23c5-a1ae-4bef-870b-9c278113f8f8",
"name": "mockbin",
"preserve_host": false,
"retries": 5,
"strip_uri": true,
"upstream_connect_timeout": 60000,
"upstream_read_timeout": 60000,
"upstream_send_timeout": 60000,
"upstream_url": "http://localhost:3000"
}
],
"total": 1
}
But unfortunately Kong keeps serving me the default certificate located in /usr/local/kong/ssl/kong-default.crt.
I'm testing it with:
openssl s_client -connect localhost:8443 -servername mockbin.myapp.local -debug
Back in the day there was a dynamic SSL plugin (API-level SSL was added in version 0.3.0), but it has been gone since the 0.10 update.
I know this is kind of a 'fix my configuration' question, but possibly someone else will run into a similar issue.
I spent some time trying to figure it out but didn't manage to fix it. As the Kong docs say, the API entity is deprecated, so I ended up rewriting everything to routes and services, and I advise you to do the same. Routes and services work perfectly well when implemented step by step based on the docs.
The Kong documentation seems clear on how to use the Admin API to configure SSL certificates. It is certainly easier to maintain the certificate at the global level rather than with service- and route-specific administration.
Others looking for the answer to this question should find it straightforward to follow the instructions in the latest Kong documentation linked above.
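For reference, a rough sketch of the routes/services equivalent via the Admin API (names and ports are taken from the question; the field syntax follows the 0.13 Admin API docs, so double-check it against your version):
# create a service for the upstream, then a route that matches the SNI host
curl -i -X POST http://localhost:8001/services \
  --data "name=mockbin" \
  --data "url=http://localhost:3000"
curl -i -X POST http://localhost:8001/services/mockbin/routes \
  --data "hosts[]=mockbin.myapp.local" \
  --data "protocols[]=http" \
  --data "protocols[]=https"
# the certificate already registered for the mockbin.myapp.local SNI should then be served on the TLS port
openssl s_client -connect localhost:8443 -servername mockbin.myapp.local -debug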

Unbundling a pre-built JavaScript file built using Browserify

I have a third-party library, not uglified, which was bundled using Browserify. Unfortunately the original sources are not available.
Is there a way to unbundle it into its separate files/sources?
You should be able to 'unbundle' the pre-built Browserify bundle using browser-unpack.
It will generate JSON output like this:
[
{
"id": 1,
"source": "\"use strict\";\r\nvar TodoActions = require(\"./todo\"); ... var VisibilityFilterActions = require(\"./visibility-filter\"); ...",
"deps": {
"./todo": 2,
"./visibility-filter": 3
}
},
{
"id": 2,
"source": "\"use strict\";\r\n ...",
"deps": {}
},
{
"id": 3,
"source": "\"use strict\";\r\n ...",
"deps": {}
},
...
]
It should be reasonably straightforward to transform the JSON output into source files that can be required. Note that the mappings of the require literals (like "./todo") are in the deps. That is, the module required as "./todo" corresponds to the source with an id of 2.
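For what it's worth, running it is just a matter of piping the bundle through the CLI (the bundle filename below is hypothetical):
npm install -g browser-unpack        # or run it via npx
browser-unpack < vendor.bundle.js > modules.json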
There is also a browserify-unpack tool - which writes the contents as files - but I've not used it.

ARM - How can I get the access key from a storage account to use in AppSettings later in the template?

I'm creating an Azure Resource Manager template that instantiates multiple resources, including an Azure storage account and an Azure App Service with a Web App.
I'd like to be able to capture the primary access key (or the full connection string, either way is fine) from the newly-created storage account, and use that as a value for one of the AppSettings for the Web App.
Is that possible?
Use the listKeys helper function.
"appSettings": [
{
"name": "STORAGE_KEY",
"value": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccountName')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]"
}
]
This quickstart does something similar:
https://azure.microsoft.com/en-us/documentation/articles/cache-web-app-arm-with-redis-cache-provision/
The syntax has changed since the other answer was accepted. The error you will now hit is: "Template language expression property 'key1' doesn't exist, available properties are 'keys'".
Keys are now represented as an array, and the syntax is now:
"StorageAccount": "[Concat('DefaultEndpointsProtocol=https;AccountName=',variables('StorageAccountName'),';AccountKey=',listKeys(resourceId('Microsoft.Storage/storageAccounts', variables('StorageAccountName')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value)]",
See: http://samcogan.com/retrieve-azure-storage-key-in-arm-script/
I have faced this issue twice: first in 2015 and again today, in May 2017.
I need to add connection strings to the Web App - I want them added automatically from the generated resources during deployment from the ARM template, so these values don't have to be added manually later.
The first time, I used an old version of the listKeys function (it looks like the old version returns the result not as an object but as a value):
"AzureWebJobsStorage": {
"type": "Custom",
"value": "[concat(variables('storageConnectionString'), listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccountName')), '2015-05-01-preview').key1)]"
},
Today, the latest version of the working template is:
"resources": [
{
"apiVersion": "2015-08-01",
"type": "config",
"name": "connectionstrings",
"dependsOn": [
"[resourceId('Microsoft.Web/Sites/', parameters('webSiteName'))]"
],
"properties": {
"DefaultConnection": {
"value": "[concat('Data Source=tcp:', reference(resourceId('Microsoft.Sql/servers/', parameters('sqlserverName'))).fullyQualifiedDomainName, ',1433;Initial Catalog=', parameters('databaseName'), ';User Id=', parameters('administratorLogin'), '#', parameters('sqlserverName'), ';Password=', parameters('administratorLoginPassword'), ';')]",
"type": "SQLServer"
},
"AzureWebJobsStorage": {
"type": "Custom",
"value": "[concat(variables('storageConnectionString'), listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageName')), '2016-01-01').keys[0].value)]"
},
"AzureWebJobsDashboard": {
"type": "Custom",
"value": "[concat(variables('storageConnectionString'), listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageName')), '2016-01-01').keys[0].value)]"
}
}
},
Thanks.
Below is an example of adding a storage account to ADLA (Azure Data Lake Analytics):
"storageAccounts": [
{
"name": "[parameters('DataLakeAnalyticsStorageAccountname')]",
"properties": {
"accessKey": "[listKeys(variables('storageAccountid'),'2015-05-01-preview').key1]"
}
}
],
In the variables section you can keep:
"variables": {
"apiVersion": "[providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]]",
"storageAccountid": "[concat(resourceGroup().id,'/providers/','Microsoft.Storage/storageAccounts/', parameters('DataLakeAnalyticsStorageAccountname'))]"
},
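Note that with the apiVersion variable defined this way (it resolves to a recent storage API version), the access key would be read with the keys-array syntax instead of key1. A sketch:
"accessKey": "[listKeys(variables('storageAccountid'), variables('apiVersion')).keys[0].value]"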