Assigning a property to an Oozie workflow dynamically in workflow.xml itself, not in the job.properties file, and using it further

I have an Oozie workflow which consists of a sub-workflow. My main workflow takes three Sqoop job names at a time in a fork and then has to pass those names to the sub-workflow. In the main workflow there are three shell actions which receive the job names in three respective variables (${job1}, ${job2}, ${job3}), but my sub-workflow is common to all three shell actions. I want to assign the value of ${job1} to ${job}. Where do I create the property ${job}, and how do I transfer the value of ${job1} to ${job}? Please help.

Use a java action in between, along with capture-output, so that you can do whatever assignment or name-change logic you need there.
The java action will accept job1 and will output job=job1 via capture-output, which in turn you can pass on to the sub-workflow.
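For illustration, a minimal sketch of such a java action's main class, assuming the action passes ${job1} as its first <arg> and declares <capture-output/> in workflow.xml (the class name is hypothetical):

import java.io.File;
import java.io.FileOutputStream;
import java.io.OutputStream;
import java.util.Properties;

// Hypothetical main class for the <java> action: it reads the incoming job
// name from the arguments and writes it back under the common name "job".
public class RenameJobProperty {
    public static void main(String[] args) throws Exception {
        String jobName = args[0]; // e.g. the value of ${job1}

        Properties props = new Properties();
        props.setProperty("job", jobName);

        // Oozie tells the java action which file it will capture output from.
        File out = new File(System.getProperty("oozie.action.output.properties"));
        try (OutputStream os = new FileOutputStream(out)) {
            props.store(os, "captured output");
        }
    }
}

The sub-workflow can then receive the value through the usual EL lookup, e.g. ${wf:actionData('rename-job')['job']}, where 'rename-job' is whatever name you give the java action.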

Related

GitLab setting the CI_JOB_ID as the job's tag

I'm trying to set the tag of a job to that job's unique ID:
some-cool-job:
  tags:
    - $CI_JOB_ID
However, it doesn't seem to resolve the variable; it just sets the tag to "$CI_JOB_ID". Similarly, $CI_PIPELINE_ID doesn't work.
Using $CI_JOB_NAME or $CI_PIPELINE_IID instead works fine.
Hence I assume that the ID just doesn't exist at the time the tags are parsed.
Following this, how else can I uniquely identify a job using variables available at this time?
GitLab assigns a number of predefined environment variables for you. One of these is CI_JOB_ID. You can view the value by printing it within a script.
some-cool-job:
  script:
    - echo $CI_JOB_ID
In the context of a .gitlab-ci.yml file, tags map jobs to runners. For instance, I tag my runners with names reflecting the executor being used (e.g. shell or docker), then I tag jobs within my .gitlab-ci.yml file that need a shell executor with shell.
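For illustration, a job that should only run on shell-executor runners could be tagged like this (the job name and the tag shell are just examples):

some-shell-job:
  tags:
    - shell            # picked up only by runners registered with the "shell" tag
  script:
    - echo "Running as job $CI_JOB_ID on a shell-executor runner"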
May I ask, what is the desired outcome of tagging a job with the job ID, in your case?

Camunda: Set assignee on all UserTasks of the process instance

I have a requirement where I need to set assignees for all the user tasks in a process instance as soon as the instance is created, based on the candidate group set on each user task.
I tried getting the user tasks using this:
Collection<UserTask> userTasks = execution.getBpmnModelInstance().getModelElementsByType(UserTask.class);
which is correct in some way, but I am not able to set the assignees. Also, it looks like this would apply to the process definition itself and not to the process instance.
Secondly, I tried getting them from the task query, which gives me only the next task and not all the user tasks inside the process.
Please help!
It does not work that way. A process flow can be simplified to "a token moves through the BPMN diagram"; only the current position of the token is relevant. So naturally, the task list only gives you the current task, not what could happen afterwards, which you cannot know anyway: what if there is a gateway that continues differently based on the task outcome? So stop playing with the BPMN meta model and focus on the runtime.
You have two choices to dynamically assign user tasks:
1.) In the modeler, instead of hard-assigning the task to "a-user", use an expression like ${taskAssignment.assignTask(task)}, where "taskAssignment" is a bean that provides a method returning the user as a String.
2.) Add a task listener on the "create" event to the task and set the assignee in the listener (see the sketch after this list).
For option 2 you can use the Camunda Spring Boot events (or the (outdated) camunda-bpm-reactor extension) to register one central component rather than adding a listener to every task.
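As a minimal sketch of option 2, assuming a Spring-managed Camunda engine and a task listener registered in the modeler on the create event via a delegate expression such as ${assignOnCreate}, the listener bean could look like this (the bean name and the lookup logic are placeholders):

import org.camunda.bpm.engine.delegate.DelegateTask;
import org.camunda.bpm.engine.delegate.TaskListener;
import org.springframework.stereotype.Component;

// Hypothetical listener bean: assigns the task as soon as it is created.
@Component("assignOnCreate")
public class AssignOnCreateListener implements TaskListener {

    @Override
    public void notify(DelegateTask delegateTask) {
        // Placeholder logic: derive the assignee from the candidate group,
        // process variables, an LDAP lookup, etc.
        String assignee = resolveAssignee(delegateTask);
        delegateTask.setAssignee(assignee);
    }

    private String resolveAssignee(DelegateTask task) {
        return "demo"; // replace with your own candidate-group-based lookup
    }
}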

Are there any Ansible Tower global variables like the job ID?

I am using Ansible Tower 3.4.3.
As part of one of my jobs, I need to generate a log file, and the log file name should contain the Tower job ID so I can easily recognize which log was generated by which Tower job.
I guess there is some global variable like "ansible_tower_job_id", but I am unable to find any documentation or the variable name.
Can someone help with how to capture the currently running job ID in Ansible Tower?
The callback link contains the ID.
From the docs: "The ‘1’ in this sample URL is the job template ID in Tower."
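If your Tower version also injects the job ID as an extra variable (the name tower_job_id below is an assumption; check which variables your Tower 3.4 jobs actually receive), a task along these lines could embed it in the log file name:

# Hypothetical task: write a per-job log whose name embeds the Tower job ID.
- name: Write a job-specific log file
  copy:
    content: "Log for Tower job {{ tower_job_id | default('unknown') }}\n"
    dest: "/tmp/myapp_job_{{ tower_job_id | default('unknown') }}.log"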

List all Bamboo variables in an inline script

I have a lot of Bamboo variables defined, due to the fact that I have a system with a lot of legacy and config in places where it does not belong. Getting rid of all this will take a while on the roadmap, so I need to find a way to auto-replace all these values.
To give a sense of the numbers: there are 8 customer config files, each with about 100 variables. Indeed, someone added all of those in Bamboo because, as you might have guessed, most of them vary per environment.
At the moment I want to automate the deployment process, and all is going fine except for the fact that I need to replace 100 variables, and I don't want to maintain them in my script itself all the time.
I am looking for a way to retrieve all the variables in an array so I can just iterate through all the keys and replace them in the config files.
echo "${bamboo.application.myvalue}" is replaced with the value as expected. The only problem is: how can I get all the keys under bamboo.*?
I tried the following commands, all without success:
printenv
env
declare
How can I retrieve a list of all those variables in an inline script in Bamboo?
Thanks a lot.
I think it is not possible to change the value of the variables on the fly. Instead, you can use the "Inject Bamboo variables" task in order to be able to change the variable values.
This task reads a file to create the variables. So all you have to do is create this file with the values you need, and then use these variables.
E.g. creating the file from a PowerShell script:
$path = 'bambooVariaveis.properties'
$connectionstringX = 'connectionstring="Data Source=XXXX;"'
$Utf8NoBomEncoding = New-Object System.Text.UTF8Encoding($False)
[System.IO.File]::WriteAllLines($path, $connectionstringX, $Utf8NoBomEncoding)
E.g. the Inject Bamboo variables task config: point the task at the generated properties file (bambooVariaveis.properties) and give the variables a namespace, here inject.
Using it (in a subsequent script task):
echo ${bamboo.inject.connectionstring}
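If you also want to iterate over all the keys, a plain inline shell sketch could read the same properties file and drive the replacements; the target config file name and the @@key@@ placeholder format are assumptions for illustration:

#!/bin/bash
# Hypothetical inline script: loop over every key=value pair in the generated
# properties file and substitute @@key@@ placeholders in a config file.
while IFS='=' read -r key value; do
  [ -z "$key" ] && continue                    # skip empty lines
  sed -i "s|@@${key}@@|${value}|g" customer.config
done < bambooVariaveis.properties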

Loop over file names in a sub-job (Kettle job)

The task is to get file names from the folder and then loop the same task (job) over all the files one by one.
I created a simple job with a transformation (Get File Names) and then a job with the flag "Execute for each row" (for now it just logs the name of the file).
Did it the same way it is described here: http://ramathoughts.blogspot.ch/2010/08/processing-group-of-files-with-kettle.html
However, the path of the received files is not passed to the sub-job (the logging doesn't display the variable value). The sub-job is, however, executed as many times as there are files in the input folder. So it looks like the data is passed to some extent, but for some reason is not available as a variable.
Image with log details; as seen there, the variable is displayed as ${path} instead of the value of the path:
http://i.imgur.com/pK1iHtl.png?1
The sample code is available as an archive with the jobs, the transformation and sample input files. Any help is appreciated, as I may be missing something simple here: https://www.hightail.com/download/bXBhL0dNcklCMTVsQXNUQw
The issue is that the second job (i.e. j_log_file_names.kjb) is unable to detect the parameter path. Just try defining the parameter on that job, in the job's settings (Parameters tab), naming it path; a rough sketch of what that definition looks like in the .kjb file follows.
This will make sure that the parameter coming from the previous step is correctly fetched into the job. The rest of your job looks absolutely fine.
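Since the referenced screenshot is not shown here, this is roughly what the parameter definition ends up as inside j_log_file_names.kjb (the exact XML may differ slightly between Kettle versions):

<!-- Hypothetical excerpt of j_log_file_names.kjb: the declared parameter -->
<parameters>
  <parameter>
    <name>path</name>
    <default_value/>
    <description>File path handed over from the parent job</description>
  </parameter>
</parameters>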
Hope this helps :)