How to use an Oozie job property in an Oozie workflow EL function?

In my Oozie workflow, this is how a file is passed as a command line argument for a Java action:
<file>${concat(filesPath, 'config.properties')}</file>
While this works fine for a coordinator run, there is a problem when the workflow is run manually through HUE, as in this video: 'filesPath' does not show up as a parameter in the dialog box that HUE displays to collect parameters.
I tried
${concat(${filesPath}, 'config.properties')} and
${concat(wf:conf(filesPath), 'config.properties')}
The first throws a syntax error and the second concatenates an empty value.
I am basically looking for a way to declare a parameter/job property in an Oozie workflow EL function so that it works both for a coordinator run and for a manual run from HUE (which should show a text box to enter the value).

I ended up doing it like this:
<file>${additionsPath}config.properties</file>
This only works as a replacement for the 'concat' EL function, though.
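For the HUE prompt specifically, declaring the property in the workflow's <parameters> section may also be worth trying: the Oozie workflow schema (0.4 and later) supports it, and HUE uses it to render input fields on manual submission. A hedged sketch, with the property name taken from the question and the workflow name and description being illustrative:

```xml
<workflow-app name="my-workflow" xmlns="uri:oozie:workflow:0.4">
    <!-- Declared parameters are surfaced by HUE as input boxes
         when the workflow is submitted manually. -->
    <parameters>
        <property>
            <name>filesPath</name>
            <description>Base path for configuration files</description>
        </property>
    </parameters>
    <!-- ... actions referencing ${concat(filesPath, 'config.properties')} ... -->
</workflow-app>
```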

Related

Pentaho Carte How to pass parameter in Job Level

I am currently trying to develop a simple parameter-passing process using Pentaho and to execute the job from the web (Carte). I have a transformation and also a job. I can successfully pass the parameter if I execute the transformation directly: http://cluster:cluster#localhost:8080/kettle/executeTrans/?trans=/data-integration/PENTAHO_JOB/test_var.ktr&testvar=1234567
However, when I put the transformation in a job and execute it at the job level, I cannot get the parameter testvar, even though the job runs successfully. I also found that there is no Get Variables step at the job level. Can I get the parameter testvar when executing at the job level in Carte?
http://cluster:cluster#localhost:8080/kettle/executeJob/?job=/data-integration/PENTAHO_JOB/test_var.kjb&testvar=1234567
@Raspi Surya:
It's working for me. You need to set the variable as a parameter at the job level. I used the URL below:
http://localhost:8080/kettle/executeJob/?job=C:\Data-Integration\carte-testing.kjb&file_name=C:\Data-Integration\file1.csv
See the attached screenshot.
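As an illustration of the URL shape, here is a small sketch that builds the executeJob call with the job-level parameter. The paths and the parameter name come from the question; the assumption (per the answer above) is that the job itself declares testvar as a named parameter so Carte hands the value through:

```python
# Build the Carte executeJob URL with a job-level parameter.
# Assumption: the .kjb job declares 'testvar' in its own Parameters tab,
# otherwise Carte will not pass the value down to the transformation.
from urllib.parse import urlencode

base = 'http://localhost:8080/kettle/executeJob/'
params = {
    'job': '/data-integration/PENTAHO_JOB/test_var.kjb',
    'testvar': '1234567',
}
url = base + '?' + urlencode(params)
print(url)
```

Note that urlencode percent-encodes the slashes in the job path; Carte decodes them on arrival, so the request is equivalent to the literal URL in the question.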

airflow test mode xcom pull/push not working

I am trying to test 2 tasks through the Airflow CLI test command.
The first task runs, auto-pushes the last console output to XCom, and I see the value `some value` in the Airflow GUI as expected.
When I run the second task via the Airflow CLI test command, I just get None as the return value, but as I have read here: How to test Apache Airflow tasks that use XCom, it should work, and the xcom_push is obviously working, so why not the xcom_pull?
Does someone have a hint on how to get this working?
provide_context is set to true.
Example code:
t1 = BashOperator(
    task_id='t1',
    bash_command='echo "some value"',
    xcom_push=True,
    dag=dag
)
t2 = BashOperator(
    task_id='t2',
    bash_command='echo {{ ti.xcom_pull(task_ids="t1") }}',
    xcom_push=True,
    dag=dag
)
Thanks!
Edit: when I run the DAG normally (not in test mode), the xcom_pull works fine.
As far as I know, "test" runs without saving anything to the metadata database, which is why you get "None" when you run the puller task, while it works when you actually run the DAG.
You can query the metadata database directly after testing the first task to verify this.
Context seems to be missing here: along with xcom_push=True, we need to use provide_context=True.
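To see why test mode behaves this way, here is a toy sketch (plain Python, not Airflow code) of what non-persistence means: each `airflow test` invocation is its own process with its own state, so a value pushed during the first invocation is gone by the time the second task is tested:

```python
# Toy illustration only: a stand-in for an in-memory XCom store that is
# discarded between `airflow test` invocations.
class XComStore:
    def __init__(self):
        self.data = {}

    def push(self, task_id, value):
        self.data[task_id] = value

    def pull(self, task_id):
        # Returns None when nothing was persisted under that task_id.
        return self.data.get(task_id)

# First `airflow test t1` invocation: its own store.
run1 = XComStore()
run1.push('t1', 'some value')

# Second `airflow test t2` invocation: a brand-new store.
run2 = XComStore()
print(run2.pull('t1'))  # -> None, matching the observed behavior
```

A real DAG run persists the pushed value in the metadata database instead, which is why xcom_pull works there.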

EB GUIDE (community edition 6.8) scripting engine reports "Expected 'Function () : void' but got 'Error'"

I'm trying to start an animation in EB GUIDE using a script (a state entry action). The scripting engine reports the error:
Expected 'Function () : void' but got 'Error'
How can I fix this?
The used script is:
function()
{
    f:animation_play(this->"View 1"->"Animation 1")
}
I am trying to get an animation similar to the one described in Sprite animation in EB GUIDE (community edition), but it should start when the state is entered.
The version used is EB GUIDE 6.8 community edition.
There are two issues with the script:
The function returns the wrong type:
The scripting engine always takes the return value of the last command as the return value of the function.
In this case, f:animation_play does not get a valid parameter (see 2. below), which is interpreted as an Error return value. Even if the parameter were correct, the return value would still be wrong, because animation_play returns a boolean value (see the EB GUIDE Studio manual). To return void, use the keyword unit as the last line of the script.
The state entry action tries to start the animation too early:
A script that is executed when a state is entered or left cannot access children of the state, as they are not created yet (or are already destroyed, in case the state is being left).
To start the animation when entering the state, there are two possibilities (I suggest using the first one):
Move the script to the animation widget (add a conditional script as user-defined property) with following code:
{
    f:animation_play(v:this)
    false
}
Note the false keyword, which ensures that a boolean value is returned. The script is automatically run once as soon as the current state is entered and all widgets are initialized.
Fire an event in the entry action script; another script can react to this event and start the animation. This is helpful if you don't want to start the animation immediately but after some delay. Otherwise, the first approach is simpler.

What is the use of logging menu in settings

What is the use/purpose of the logging menu under
Settings -> Technical -> Database structure -> Logging?
It seems that you can store all the log messages in that view (model ir.logging) as long as you use the parameter --log-db your_database_name when executing odoo-bin on the command line (or add log_db = your_database_name to your Odoo config file).
Check out Odoo 11 info about command line parameters: https://www.odoo.com/documentation/11.0/reference/cmdline.html
--log-db
logs to the ir.logging model (ir_logging table) of the specified database. The database can be the name of a database in the “current” PostgreSQL, or a PostgreSQL URI for e.g. log aggregation
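For reference, the config-file equivalent of --log-db mentioned above would be a fragment like this (the database name is a placeholder):

```
[options]
log_db = your_database_name
```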
This is the theory, but honestly, I was not able to make it work, and I did not spend much time trying to find out why.
EDIT
As @CZoellner says, it seems that the log messages stored in the ir_logging table (the log messages you see when clicking on the menu item Settings -> Technical -> Database structure -> Logging) come only from scheduled actions. If you create a scheduled action which executes some Python code, you have the following variables available for use in your method code:
env: Odoo Environment on which the action is triggered.
model: Odoo Model of the record on which the action is triggered; is a void recordset.
record: record on which the action is triggered; may be void.
records: recordset of all records on which the action is triggered in multi-mode; may be void.
time, datetime, dateutil, timezone: useful Python libraries.
log: log(message, level='info'): logging function to record debug information in ir.logging table.
Warning: Warning exception to use with raise.
To return an action, assign: action = {...}.
If you use the log function, for example:
log('This message will be stored in ir_logging table', level='critical')
That log message and its details will be stored in the ir_logging table each time the scheduled action is executed (automatically or manually). This answers your question, but now I am wondering what the parameter --log-db is for, as I have tested it and these log messages are stored in ir_logging whether or not this parameter is set.

Jenkins' EnvInject Plugin does not persist values

I have a build that uses EnvInject Plugin to set an environmental value.
A different job needs to scan the last good Jenkins build of that job and get the value of that environment variable.
This all works well, except that sometimes the variable disappears from the build history. It seems that after some time passes, when I look at the 'Environment variables' section in the build history, the injected value is simply gone.
How can I make this persist? Is this a bug, or part of the design?
If it makes any difference, the value of the injected variable is over 1500 chars and in the following format: 'component1=1.1.2;component2=1.1.3,component3=4.1.2,component4=1.1.1,component4=1.3.2,component4=1.1.4'
It looks like EnvInject and/or Job DSL have a bug.
Steps to reproduce:
Set up a job that runs this JobDSL:
job('run_deploy_mock') {
    steps {
        environmentVariables {
            env('deployedArtifacts', 'component1=1.0.0.2')
        }
    }
}
Run it and it will create a job called 'deploy_mock'
Run the 'deploy_mock' job. After build #1 is done, go to build details and check 'Environmental Variables' section for an entry called 'component1'
Run the JobDSL job again
Check the 'Environmental Variables' section for 'deploy_mock' build #1; the 'component1' variable is now missing.
If I substitute the '=' with something else, it works as expected.
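Since values containing '=' are what trigger the disappearance, one hedged workaround sketch is to encode the component/version pairs with a different separator before injecting them, and decode on the consumer side. The function names here are illustrative, not part of EnvInject or Job DSL:

```python
# Hypothetical workaround: avoid '=' inside the injected value by using
# ':' as the name/version separator, so the EnvInject entry survives.
def encode_components(components):
    """components: dict mapping component name -> version string."""
    return ';'.join('{}:{}'.format(name, ver)
                    for name, ver in sorted(components.items()))

def decode_components(value):
    """Inverse of encode_components; splits on ';' then on the first ':'."""
    return dict(item.split(':', 1) for item in value.split(';'))

encoded = encode_components({'component1': '1.1.2', 'component2': '1.1.3'})
decoded = decode_components(encoded)
print(encoded)  # -> component1:1.1.2;component2:1.1.3
```

The consuming job would then call the decode step after reading the variable back from the build history.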
Created a Jenkins Jira issue for this.