I have an issue with Jenkins variables. I would like to pass a variable (reportJenkins) to an ansibleTower command:
stage('PATCH') {
    steps {
        script {
            def reportJenkins
            try {
                reportJenkins = readJSON(file: 'report.json', text: '')
                echo "${reportJenkins}"
                ansibleTower credential: "${CREDENTIALS}", extraVars: '{
                    "ENV": "${ENV}",
                    "REPORT": "${reportJenkins}",
                    "VERSION": "${VERSION}",
                    "PRODUCT": "${PRODUCT}",
                    "username": "${API_CREDS_USR}", "password": "${API_CREDS_PSW}"
                }', inventory: 'xxx', jobTemplate: 'xxx', jobType: 'run', throwExceptionWhenFail: true,
                    towerCredentialsId: '58b248df-c20f-43b4-a706-45d75f0c59b8',
                    towerLogLevel: 'full', towerServer: 'ALM_ANSIBLE_STAGE'
            }
In my Ansible template extra vars box it appears like this:
REPORT: '${reportJenkins}'
Can someone help me?
I tried to escape the characters with \"${reportJenkins}\"; the other variables work, and I don't understand why this one doesn't...
I printed the variable:
echo "${reportJenkins}"
and I can see my variable in the Jenkins console output.
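For what it's worth, here is a sketch of what I am considering trying instead, assuming the problem is that the single-quoted extraVars string is never interpolated by Groovy, so a local variable like reportJenkins (unlike the environment variables) never gets substituted. A double-quoted GString is interpolated before the plugin sees the string:

def reportJenkins = readJSON file: 'report.json'
// Sketch only: triple-double quotes make this a GString, so Groovy resolves ${...}
// here; the parsed JSON object is rendered back to JSON text via its toString().
def extraVarsJson = """{
    "ENV": "${ENV}",
    "REPORT": ${reportJenkins},
    "VERSION": "${VERSION}",
    "PRODUCT": "${PRODUCT}",
    "username": "${API_CREDS_USR}", "password": "${API_CREDS_PSW}"
}"""
ansibleTower credential: "${CREDENTIALS}", extraVars: extraVarsJson, inventory: 'xxx',
    jobTemplate: 'xxx', jobType: 'run', throwExceptionWhenFail: true,
    towerCredentialsId: '58b248df-c20f-43b4-a706-45d75f0c59b8',
    towerLogLevel: 'full', towerServer: 'ALM_ANSIBLE_STAGE'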
Thanks
Related
I am trying to run the following test on Cypress, but I am getting this error:
No tests found in your file:
/Users/Name/Desktop/MyFolder/cypress-tutorial-build-todo-starter/cypress/integration/app-init.spec.js
We could not detect any tests in the above file. Write some tests and re-run.
I thought I did have a test written as you can see below, especially since I have the it function. Can someone tell me why I may be getting this error? This is the second file I have with a test in my integration folder. Not sure if that makes a difference.
const todos = [
  {
    "id": 1,
    "name": "Buy Milk",
    "isComplete": false
  },
  {
    "id": 2,
    "name": "Buy Eggs",
    "isComplete": false
  },
  {
    "id": 3,
    "name": "Buy Bread",
    "isComplete": false
  },
  {
    "id": 4,
    "name": "Make French Toast",
    "isComplete": false
  }
]

describe('App Initialization', () => {
  it.only('Loads todos on page load', () => {
    cy.server()
    cy.route('GET', '/api/todos', todos)
    cy.visit('/')
  })

  cy.get('.todo-list li')
    .should('have.length', 4)
})
As far as the log is concerned, it shows that the app-init.spec.js file doesn't exist in /Users/Name/Desktop/MyFolder/cypress-tutorial-build-todo-starter/cypress/integration/
Please check the path in the directory and also verify the "integrationFolder" path in cypress.json. If that path doesn't match the path where your test resides, Cypress cannot detect the test file to run.
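For example, a minimal cypress.json pointing at the default folder (the value here is only illustrative; adjust it to wherever the spec actually lives):

{
  "integrationFolder": "cypress/integration"
}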
In my BitBucket+Bamboo setup, I'm trying to get a list of email addresses of people having access to a particular repository. This is the output from the BitBucket API:
{
  "size": 3,
  "limit": 25,
  "isLastPage": true,
  "values": [
    {
      "user": {
        "name": "name1",
        "emailAddress": "name1.lastname1#domain.com",
        "id": 1,
        "displayName": "Name1 Lastname1",
        "active": true,
        "slug": "name1",
        "type": "NORMAL",
        "links": {
          "self": [
            {
              "href": "https://bitbucket.com/stash/users/name1"
            }
          ]
        }
      },
      "permission": "REPO_WRITE"
    },
    {
      "user": {
        "name": "name2",
        "emailAddress": "name2.lastname2#domain.com",
        "id": 2,
        "displayName": "Name2 Lastname2",
        "active": true,
        "slug": "name2",
        "type": "NORMAL",
        "links": {
          "self": [
            {
              "href": "https://bitbucket.com/stash/users/name2"
            }
          ]
        }
      },
      "permission": "REPO_WRITE"
    },
    {
      "user": {
        "name": "name3",
        "emailAddress": "name3.lastname3#domain.com",
        "id": 3,
        "displayName": "Name3 Lastname3",
        "active": true,
        "slug": "name3",
        "type": "NORMAL",
        "links": {
          "self": [
            {
              "href": "https://bitbucket.com/stash/users/name3"
            }
          ]
        }
      },
      "permission": "REPO_WRITE"
    }
  ],
  "start": 0
}
Is there an easy way to, say, put all 3 email addresses into an array or a comma-separated variable within a bash script? I tried using grep and splitting the API output somehow (e.g. by 'permission'), but no luck so far. Let me note that I may be forced to use standard tools like grep, sed or awk, meaning I may not be able to use tools like jq (to process JSON in bash), since I cannot really tamper with the available build agents.
Any help would be much appreciated!
Consider using jq (or another JSON query tool). It will handle any valid JSON, even one that is not pretty-printed or formatted in a specific way. It can be combined with readarray to build the array in bash.
readarray -t emails <<< "$(jq -r '.values[].user.emailAddress' < file)"
This will produce an array 'emails':
declare -p emails
declare -a emails=([0]=$'name1.lastname1#domain.com' [1]=$'name2.lastname2#domain.com' [2]=$'name3.lastname3#domain.com')
Note 2020-07-22: Added '-t' to strip trailing newlines from the result array.
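If a single comma-separated variable is preferred over an array, the same jq filter can join the addresses directly (a sketch against the same input file):

emails_csv=$(jq -r '[.values[].user.emailAddress] | join(",")' < file)
echo "$emails_csv"    # name1.lastname1#domain.com,name2.lastname2#domain.com,name3.lastname3#domain.com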
Assuming your input is always that regular, this will work using any awk in any shell on every UNIX box:
$ awk -F'"' '$2=="emailAddress"{addrs=addrs sep $4; sep=","} END{print addrs}' file
name1.lastname1#domain.com,name2.lastname2#domain.com,name3.lastname3#domain.com
Save the output in a variable or a file as you see fit, e.g.:
$ var=$(awk -F'"' '$2=="emailAddress"{addrs=addrs sep $4; sep=","} END{print addrs}' file)
$ echo "$var"
name1.lastname1#domain.com,name2.lastname2#domain.com,name3.lastname3#domain.com
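Conversely, if a bash array is wanted instead of the comma-separated string, the same awk field extraction can feed readarray (a sketch assuming bash 4+):

readarray -t emails < <(awk -F'"' '$2=="emailAddress"{print $4}' file)
declare -p emails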
Take a look at Python.
You can access your API directly like this:
import urllib.request
import json
with urllib.request.urlopen('http://your/api') as url:
    data = json.loads(url.read().decode())
Or, as an example, with a local file containing the same data you provided:
import json
with open('./response.json') as f:
    data = json.load(f)

result = {}
for x in data['values']:
    node = x['user']
    result[node['emailAddress']] = x['permission']
result is {'name1.lastname1#domain.com': 'REPO_WRITE', 'name2.lastname2#domain.com': 'REPO_WRITE', 'name3.lastname3#domain.com': 'REPO_WRITE'}
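Since the original goal was a comma-separated list of addresses, the keys of that dict can then be joined in one line (sketch):

# the dict keys are the email addresses, so join them with commas
print(",".join(result))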
$ grep -oP '(?<="emailAddress": ).*' file |
tr -d '",' |
paste -sd,
name1.lastname1#domain.com,name2.lastname2#domain.com,name3.lastname3#domain.com
or
$ grep '"emailAddress":' file |
cut -d: -f2 |
tr -d '", ' |
paste -sd,
I've set up a tasks.json file for building a project on multiple platforms. All platforms see the same content of the project repository, either via disk sharing (because another platform runs in a VM) or via syncing with the Git repository.
So far so good; they all see the same tasks.json. However, some command lines are rather long, and those long lines are mostly identical.
for example:
"rm -rf build; mkdir build; cd build; ../configure --with-bash-malloc=no CFLAGS=\"-O3 -fno-builtin-malloc -fno-builtin-calloc -fno-builtin-realloc -fno-builtin-free\" LDFLAGS=-L/usr/local/lib LIBS=\"-ltcmalloc -lcurl\" CC=clang
Similar lines are there for the different platforms.
The configure part is always the same for the different platforms, so it would be nice to factor out this common part. Thus the question is whether it is possible to define your own variables, so you can use them similarly to ${workspaceRoot}.
Thus, define somewhere:
"win_dir": "build_windows",
"linux_dir": "build",
"osx_dir": "build_osx",
"configure": "../configure --with-bash-malloc=no CFLAGS=\"-O3 -fno-builtin-malloc -fno-builtin-calloc -fno-builtin-realloc -fno-builtin-free\" LDFLAGS=-L/usr/local/lib LIBS=\"-ltcmalloc -lcurl\" CC=clang"
And then write
"tasks": [
{
"taskName": "configure",
"command": "bash",
"windows": {
"args": ["-c", "rm -rf ${win_dir}; mkdir ${win_dir}; cd ${win_dir}; ${configure}"]
},
"linux": {
"args": ["-c", "rm -rf ${linux_dir}; mkdir ${linux_dir}; cd ${linux_dir}; ${configure}"]
},
"osx": {
"args": ["-c", "rm -rf ${osx_dir}; mkdir ${osx_dir}; cd ${osx_dir}; ${configure}"]
},
"isBuildCommand": true,
"problemMatcher": "$make-compile"
},
... others tasks using the variables
When making changes to the build directory or the arguments passed to configure, etc., the tasks.json file then only needs editing in one place instead of many.
Perhaps it is already possible, but I'm unable to find out how. I tried to do something with the declares block, but that seems to be tied tightly to problemMatcher. You can find some examples, but I could not find clear documentation of the elements of the tasks.json file and how they interact.
Perhaps I'm missing something, please educate me!
Adam Parkin's answer won't work because, at least on Windows, the shell will not substitute environment variables given as arguments. ${env:...} variables, as suggested in a comment on that answer, won't be substituted using environment variables set in tasks.json itself, only pre-existing ones. You can, however, add custom settings in settings.json and reference those in tasks.json using ${config:...}.
e.g. settings.json:
{
    "win_dir": "build_windows",
    "linux_dir": "build",
    "osx_dir": "build_osx",
    "configure": "../configure --with-bash-malloc=no CFLAGS=\"-O3 -fno-builtin-malloc -fno-builtin-calloc -fno-builtin-realloc -fno-builtin-free\" LDFLAGS=-L/usr/local/lib LIBS=\"-ltcmalloc -lcurl\" CC=clang"
}
in tasks.json:
{
    "tasks": [
        {
            "taskName": "configure",
            "command": "bash",
            "windows": {
                "args": ["-c", "rm -rf ${config:win_dir}; mkdir ${config:win_dir}; cd ${config:win_dir}; ${config:configure}"]
            },
            "linux": {
                "args": ["-c", "rm -rf ${config:linux_dir}; mkdir ${config:linux_dir}; cd ${config:linux_dir}; ${config:configure}"]
            },
            "osx": {
                "args": ["-c", "rm -rf ${config:osx_dir}; mkdir ${config:osx_dir}; cd ${config:osx_dir}; ${config:configure}"]
            },
            "isBuildCommand": true,
            "problemMatcher": "$make-compile"
        },
        // ... other tasks using the variables
    ]
}
Thus the question is whether it is possible to define your own variables, so you can use them similarly to ${workspaceRoot}.
You could define environment variables in your tasks.json:
{
    // See https://go.microsoft.com/fwlink/?LinkId=733558
    // for the documentation about the tasks.json format
    "version": "2.0.0",
    "options": {
        "env": {
            "win_dir": "build_windows",
            "linux_dir": "build",
            "osx_dir": "build_osx",
            "configure": "../configure --with-bash-malloc=no CFLAGS=\"-O3 -fno-builtin-malloc -fno-builtin-calloc -fno-builtin-realloc -fno-builtin-free\" LDFLAGS=-L/usr/local/lib LIBS=\"-ltcmalloc -lcurl\" CC=clang"
        }
    },
    "tasks": [
        {
            "label": "Example",
            "type": "shell",
            "command": "echo win_dir is $win_dir"
        }
    ]
}
With that, the task's shell command can then refer to the relevant environment variables directly (as in the echo example above), since they are injected into the task's environment.
I am taking a different approach.
tasks.json
{
    "version": "2.0.0",
    "params": {
        "git_version": "2.30.0",
        "node_version": "14.13.6",
        "python_version": "3.8"
    },
    "tasks": [
        {
            "label": "Process Task.json",
            "type": "shell",
            "command": "python process_tasks.py",
            "group": "build",
            "isBackground": true
        },
        {
            "label": "Test process_tasks.py",
            "type": "shell",
            "command": "echo $[params.git_version]",
            "group": "test",
            "presentation": {
                "reveal": "always"
            }
        }
    ]
}
Rather than making env variables, we can follow these steps:
Step 1:
Make a task in tasks.json as follows
{
    "label": "Process Task.json",
    "type": "shell",
    "command": "python process_tasks.py",
    "group": "build",
    "isBackground": true
},
Step 2:
process_tasks.py is a Python file that will replace the variables in tasks.json with their actual values:
import json
import os
import re

if __name__ == "__main__":
    # Get the path to the JSON file
    json_path = os.path.join(".vscode/tasks.json")
    with open(json_path, "r") as f:
        data = json.load(f)
    with open(json_path, "r") as f:
        lines = f.readlines()

    new_lines = []
    # regex to find text between $[]
    regex = re.compile(r"\$\[(.*?)\]")
    for line in lines:
        # regex in line:
        match = regex.search(line)
        if match:
            # get the text between $[]
            text = match.group(1)
            keys = text.split(".")
            buffer_data = data
            for key in keys:
                buffer_data = buffer_data[key]
            # replace the placeholder with the value from the "params" block
            line = line.replace(f"$[{text}]", buffer_data)
        new_lines.append(line)

    with open(json_path, "w") as f:
        f.writelines(new_lines)
Step 3:
Add a test task to verify your result
{
    "label": "Test process_tasks.py",
    "type": "shell",
    "command": "echo $[params.git_version]",
    "group": "test",
    "presentation": {
        "reveal": "always"
    }
},
Note:
Making this "Process Task.json" task a global task, and adding the correct path of the process_tasks.py file in the build task, will reduce a lot of work.
Thus we can define our own variables inside tasks.json and access them using $[params.git_version].
After executing the "Process Task.json" task, all variables in $[] format will be replaced by their corresponding values.
I have a web app on OpenShift v3 (all-in-One), using the Wildfly Builder Image. In addition, I created a service named xyz, to point to an external host+IP. Something like this:
"kind": "Service",
"apiVersion": "v1",
"metadata": { "name": "xyz" },
"spec": {
"ports": [
{ "port": 61616,
"protocol": "TCP",
"targetPort": 61616
}
],
"selector": {}
}
I also have an endpoint, pointing externally, but that is not relevant for this question.
When deployed, my program can access an environment variable named XYZ_PORT=tcp://172.30.192.186:61616
However, I cannot figure out how to see all the values of all such variables either via the web-console, or using the CLI. Using the web-console, I cannot see it being injected into the YAML.
I tried some of the oc env options, but none seem to list what I want.
Let's say you are deploying kitchensink; then the CLI command below should list all the environment variables:
oc env bc/kitchensink --list
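Service-injected variables such as XYZ_PORT only exist inside the running container, so another option (a sketch; the pod name below is a placeholder) is to dump the environment of the pod itself:

oc get pods
oc exec <your-pod-name> -- env | grep XYZ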
Hi there, I am testing an API via Postman. I want to automate my tests and have downloaded Newman. Now the request I use in Postman has been exported as a collection and is giving me a 404 via Newman... Any pointers much appreciated. The IP address has been changed for obvious reasons.
{
  "id": "11f345f7-9f12-58fb-099d-27f11233cee7",
  "name": "GC",
  "description": "",
  "order": [
    "f7fe3f94-0dd2-6dba-05b9-29ae7e571ed9"
  ],
  "folders": [],
  "timestamp": 1446559540652,
  "owner": "195242",
  "remoteLink": "",
  "public": false,
  "requests": [
    {
      "id": "f7fe3f94-0dd2-6dba-05b9-29ae7e571ed9",
      "headers": "",
      "url": "http://218.24.201.144/cb/mobile/v1/residences/568288d0-71b6-11e5-ad9f-0242ac110908/lastAirQuality/rooms",
      "pathVariables": {},
      "preRequestScript": "",
      "method": "GET",
      "collectionId": "11f345f7-9f12-58fb-099d-27f11233cee7",
      "data": [],
      "dataMode": "params",
      "name": "http://218.24.201.144/cb/mobile/v1/residences/568288d0-71b6-11e5-ad9f-0242ac110908/lastAirQuality/rooms",
      "description": "",
      "descriptionFormat": "html",
      "time": 1446559548262,
      "version": 2,
      "responses": [],
      "tests": "",
      "currentHelper": "normal",
      "helperAttributes": {}
    }
  ]
}
This is the output I get in Newman:
$ newman -c GC.json.postman_collection
Iteration 1 of 1
404 218ms http://218.24.201.144/cb/mobile/v1/residences/568288d0-71b6-11e5-ad9f-0242ac110908/lastAirQuality/rooms http://218.24.201.144/cb/mobile/v1/residences/568288d0-71b6-11e5-ad9f-0242ac110908/lastAirQuality/rooms
Summary:
Parent                    Pass Count    Fail Count
-------------------------------------------------------------
Collection GC             0             0
Total                     0             0
Do you have tests set?
var data = JSON.parse(responseBody);
tests["Pass this case"] = data.id === 11f345f7-9f12-58fb-099d-27f11233cee7;
Create tests, and don't forget to update your JSON collection before testing.
Check this #298
A 404 means the resource you are looking for does not exist, or the system is unable to find the requested data. You can try using:
$ newman run <path of you collection>
npm install postman
npm install newman
npm install newman-reporter-html
https://github.com/shahing/api-automation-tests
Run this command in your directory: newman run test.js
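Putting those pieces together, a run against the exported collection from the earlier question might look like this (a sketch; the HTML reporter flag assumes the newman-reporter-html package from above is installed):

newman run GC.json.postman_collection --reporters cli,html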