Passing parameters coming from a custom stage in spinnaker to a subsequent stage? - spinnaker

We are using Spinnaker to automate our deployment pipelines. As part of the pipelines we have a custom stage that generates change tickets in JIRA for the release (a mandate we have from a regulatory perspective). The custom stage then waits for the change ticket to be approved and moves on. This all works perfectly.
What we are now trying to do is add another stage at the end of the pipeline that moves the change ticket to Done. For this, we would need some way to catch the JIRA ticket reference from the earlier custom stage and pass it to this new custom stage, which can then move the JIRA ticket to Done using that reference.
Did anyone try passing data from one custom stage to another before?
Looking forward to your responses,
Moritz

1. Add a new parameter (e.g. jiraTicket) in your pipeline configuration stage.
2. Assign the JIRA ticket number from your "Create ticket" stage to this parameter. How you access the value depends on how you call the JIRA API and how it returns its output. Stage outputs can be read with Spinnaker expressions; for example, I access the output of a stage of type "Find Image name from prod cluster" like this: ${#stage("stagename")["outputs"]["artifacts"][0]["version"]}. That expression cannot be reused as-is; it is unique to my call and how I get the output, so change it accordingly: ${parameters.jiraTicket} = ${#stage("stagename")["outputs"]["artifacts"][0]["version"]}
3. In your third stage, use the Spinnaker expression ${parameters.jiraTicket} and close the JIRA ticket.

You can also skip steps 1 and 2 and use the stage-output expression from step 2 directly in step 3.
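As a sketch, the parameter mapping described above might look like this in the pipeline JSON. The stage name ("Create JIRA Ticket") and the output field ("ticketKey") are assumptions here; replace them with whatever your custom stage actually writes to its outputs:

```json
{
  "parameters": {
    "jiraTicket": "${#stage(\"Create JIRA Ticket\")[\"outputs\"][\"ticketKey\"]}"
  }
}
```

The closing stage then simply references ${parameters.jiraTicket} in its own configuration.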

Related

Auto pause and resume azure synapse analysis database

I want to pause the database on Saturday and Sunday, and have it resume automatically on Monday morning, using a script or any other option. Is there a way to do this, and how?
Thanks!
There is no specific feature for this task, but it is doable using a combination of techniques. To perform the Pause/Restart functionality, you can use the Azure REST API.
Recurrence Scheduling
I recommend Logic Apps, which has a robust recurrence trigger. You will most likely need to run it daily, but you can specify the hour(s). To only continue on specific days, you'll need to add some additional processing to parse the day of week from the run time:
dayOfWeek(convertFromUtc(utcNow(), 'Eastern Standard Time'))
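The same day-of-week gate can be sketched in Python for reference (the pause days and the America/New_York time zone are illustrative; adjust them to your schedule):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib, Python 3.9+

# Days on which the warehouse should be paused (illustrative choice)
PAUSE_DAYS = {"Saturday", "Sunday"}

def should_pause(now_utc: datetime, tz: str = "America/New_York") -> bool:
    """Convert the UTC run time to local time (mirroring convertFromUtc)
    and check whether the local day of week is a pause day."""
    local = now_utc.astimezone(ZoneInfo(tz))
    return local.strftime("%A") in PAUSE_DAYS

# 2023-01-07 12:00 UTC is a Saturday in Eastern time
print(should_pause(datetime(2023, 1, 7, 12, 0, tzinfo=timezone.utc)))  # True
```

Logic Apps' dayOfWeek() returns a number (0 = Sunday) rather than a name, but the gating logic is the same.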
Get Bearer Token
In this example, I'm using a Service Principal to authenticate, and Azure Key Vault to store the relevant secrets:
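For reference, the token acquisition can be sketched in Python. This follows the standard OAuth2 client-credentials flow against Azure AD; the tenant/client values are placeholders you would pull from Key Vault or configuration:

```python
import json
from urllib import request, parse

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the client-credentials POST that exchanges a service
    principal's secret for an ARM bearer token."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://management.azure.com/",
    }).encode()
    return request.Request(url, data=body, method="POST")

# Sending it requires a real service principal:
# with request.urlopen(build_token_request(t, c, s)) as resp:
#     token = json.loads(resp.read())["access_token"]
```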
Check the Status of the Resource
The next step is to check the status of the Data Warehouse: we only want to attempt to pause it if the status is "Online". To do this, we'll call the API again, this time passing the Bearer Token we acquired above:
In this example I'm using Variables instead of Key Vault to demonstrate different approaches.
We'll use the StatusCode property of the previous operation to make this determination:
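The status check itself can be sketched in Python. This assumes the ARM GET-database response shape, where the pool's state appears under properties.status as "Online" or "Paused":

```python
import json

def is_online(resource_json: str) -> bool:
    """Return True if the Data Warehouse resource reports status 'Online'.
    Assumes the ARM GET-database response shape (properties.status)."""
    doc = json.loads(resource_json)
    return doc.get("properties", {}).get("status") == "Online"

sample = '{"name": "mydw", "properties": {"status": "Paused"}}'
print(is_online(sample))  # False
```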
Check if there are any running Jobs
If the Data Warehouse's Status is "Online", the next thing to check is whether or not there are any active processes. We accomplish this by running a query on the Data Warehouse itself:
We'll then capture the results in a variable and use that in another Condition activity:
body('Get_Ops_Count')?['resultsets']['Table1'][0]['OpsCount']
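A Python equivalent of that Logic Apps expression, useful for reasoning about the expected result shape (the "Get_Ops_Count" action name and the Table1/OpsCount layout come from the SQL connector's resultsets format):

```python
def ops_count(body: dict) -> int:
    """Extract OpsCount from the SQL action's result, mirroring
    body('Get_Ops_Count')?['resultsets']['Table1'][0]['OpsCount'].
    Returns 0 when the result is missing, like the ?-safe navigation."""
    resultsets = (body or {}).get("resultsets") or {}
    rows = resultsets.get("Table1") or [{}]
    return int(rows[0].get("OpsCount", 0))

sample = {"resultsets": {"Table1": [{"OpsCount": 2}]}}
print(ops_count(sample))  # 2
```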
Issue the Pause Command
If there are no active jobs, we are free to finally issue the Pause command. Once again, we'll leverage the REST API using the Bearer Token we acquired previously:
Restarting the Data Warehouse
The Restart process is very similar, only without the need to check for active processes, so you should be able to extrapolate that from this example. To restart the Data Warehouse, the REST endpoint is "resume" instead of "pause".
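The pause and resume endpoints differ only in the final path segment, so the two calls can share a URL builder. A minimal sketch (the api-version shown is an assumption; check the current ARM documentation for your resource type):

```python
def dw_action_url(sub: str, rg: str, server: str, db: str,
                  action: str, api_version: str = "2021-11-01") -> str:
    """Build the ARM URL for pausing or resuming a dedicated SQL pool
    (formerly SQL DW). POST to this URL with the bearer token."""
    assert action in ("pause", "resume")
    return (
        "https://management.azure.com"
        f"/subscriptions/{sub}/resourceGroups/{rg}"
        f"/providers/Microsoft.Sql/servers/{server}/databases/{db}"
        f"/{action}?api-version={api_version}"
    )

print(dw_action_url("sub1", "rg1", "srv1", "db1", "resume"))
```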

Pulling summary report for monitoring using reporting task in NiFi

I'm working on a piece of the project where a report needs to be generated with all the flow details (memory used, number of records processed, processors that ran successfully, failed, etc.). Most of the details are present on the Summary tab, but the requirement is to have separate reports.
Can anyone help me with a solution, steps, examples, screenshots, or videos?
Thanks much.
Every underlying behavior of the UX/UI that Apache NiFi provides is also accessible through the REST API (in fact, the UI calls the API to perform each of these tasks). So you can invoke the GET /system-diagnostics endpoint to return that information in JSON form, and then parse the data and present it in whatever form you like.
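A minimal Python sketch of that approach, assuming an unsecured NiFi instance (a secured one needs a token or client certificate) and the SystemDiagnosticsEntity response shape:

```python
import json
from urllib.request import urlopen

def fetch_system_diagnostics(base_url: str) -> dict:
    """GET /system-diagnostics and return the aggregate snapshot
    (heap usage, thread counts, etc.)."""
    with urlopen(f"{base_url}/nifi-api/system-diagnostics") as resp:
        doc = json.load(resp)
    return doc["systemDiagnostics"]["aggregateSnapshot"]

def summarize(snapshot: dict) -> str:
    """Format a few snapshot fields as one report line (field names
    follow the system-diagnostics response; verify for your version)."""
    return (f"heap: {snapshot.get('usedHeap')}/{snapshot.get('maxHeap')} "
            f"({snapshot.get('heapUtilization')}), "
            f"threads: {snapshot.get('totalThreads')}")

print(summarize({"usedHeap": "512 MB", "maxHeap": "1 GB",
                 "heapUtilization": "50.0%", "totalThreads": 42}))
```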

Nifi API - Update parameter context

We've created a parameter context within Nifi which is allocated to several process groups. We would like to update the value of one parameter within the parameter context. Is there any option to do this via the API?
The NiFi CLI from nifi-toolkit has commands for interacting with parameters, including one for set-param:
https://github.com/apache/nifi/tree/master/nifi-toolkit/nifi-toolkit-cli/src/main/java/org/apache/nifi/toolkit/cli/impl/command/nifi/params
You could use that, or look at the code to see how it uses the API.
Also, anything you can do from the NiFi UI goes through the REST API. So you can always open Chrome Dev Tools, take an action in the UI (like updating a parameter), and look at which calls are made.
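Going through the API directly, the update is a two-step dance: GET the parameter context (for its revision), then POST an update request listing only the changed parameter. A sketch of the payload builder, assuming the entity shape returned by GET /nifi-api/parameter-contexts/{id} (verify field names against your NiFi version's REST docs):

```python
def build_param_update(context: dict, name: str, value: str) -> dict:
    """Build the body for POST /nifi-api/parameter-contexts/{id}/update-requests.
    `context` is the entity from GET /nifi-api/parameter-contexts/{id};
    only the parameter being changed needs to be listed."""
    return {
        "revision": context["revision"],
        "id": context["id"],
        "component": {
            "id": context["id"],
            "parameters": [
                {"parameter": {"name": name, "value": value}}
            ],
        },
    }

current = {"id": "abc-123", "revision": {"version": 4}}
body = build_param_update(current, "db.url", "jdbc:postgresql://db:5432/app")
print(body["component"]["parameters"][0]["parameter"]["name"])  # db.url
```

The update request is asynchronous: NiFi returns a request entity you poll until it completes, then delete.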

How to get the current Elastic APM instance in Vue.js?

I have integrated Elastic APM into my Vue.js app according to the documentation (https://www.elastic.co/guide/en/apm/agent/rum-js/current/vue-integration.html).
In addition to the default events page-load and route-change I want to add custom transactions/spans for some button clicks.
I am stuck on checking whether there is already an existing transaction that I could use to add a custom span:
const transaction = this.$apm.currentTransaction()
transaction.startSpan('custom_span', 'type_name');
transaction.end();
However getting the current transaction fails (first line).
The Elastic RUM agent has support for click user interactions, so you shouldn't need to start this type of transaction manually.
Regarding the failure in your code: the correct API call is getCurrentTransaction, not currentTransaction.
Hope this helps.

BPMN model API to edit Process diagram

I have a process diagram that directs flow on the basis of threshold variables. For example, for variables x and y: if x<50 I am directed to service task 1, if y<40 to service task 2, and if x>50 && y>40 to some other task.
As intuition suggests, I am using compare checks on the sequence flows to determine the next task.
x and y are input by the user, but 50 and 40 (let's call these numbers {n}) are part of the process definition (PD).
Now, for a fixed {n} I have deployed a process diagram and it runs successfully.
What should I do if my {n} can vary for different process instances? Is there a way to maintain the same version of process definition but which takes {n} dynamically?
I read about the BPMN Model API here, but I can't seem to figure out how to use it to edit my PD dynamically. Do I need to redeploy it each time on Tomcat, or how does it work?
If you change a process model with the Model API, you have to redeploy it in order to actually use it. If you want a process definition with variable {n} values, you can instead use process variables for them and set them during the start of each process instance, using the Java API, the REST API, or the Tasklist.
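The variable-based approach can be sketched against the Camunda REST API (the URL follows POST /process-definition/key/{key}/start; the process key and the threshold variable names nX/nY are hypothetical). The sequence-flow conditions would then compare against these variables, e.g. ${x < nX}, instead of hard-coded numbers:

```python
import json
from urllib import request

def build_start_request(base_url: str, process_key: str, thresholds: dict):
    """Build the REST call that starts a process instance with
    per-instance threshold variables, so the same deployed definition
    can branch on different {n} values without redeployment."""
    payload = {
        "variables": {
            name: {"value": value, "type": "Integer"}
            for name, value in thresholds.items()
        }
    }
    return request.Request(
        f"{base_url}/process-definition/key/{process_key}/start",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_start_request("http://localhost:8080/engine-rest",
                          "myProcess", {"nX": 50, "nY": 40})
print(req.full_url)
```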