How to add arguments to the run command in Mule Anypoint Studio - mule

I'm trying to use the Mule Credentials Vault security feature from Anypoint Studio. As in Mule's example, I need to have:
a .properties file with encrypted data
a global element, similar to Mule's example
entries in mule-app.properties, similar to Mule's example:
When running it from the command line I pass the password as an argument:
The error I get is:
PropertyAccessException 1: org.springframework.beans.MethodInvocationException: Property 'key' threw exception; nested exception is java.lang.RuntimeException: Property code could not be found
When I enter the password directly into the global element, the app is deployed and running.
How can I insert the password at runtime (similar to how I enter it via the command line)?
Thanks,
Keren

The -M prefix is a way of passing arguments to the JVM only when you are using standalone Mule. In Studio you just need to pass -D.

If you want to set the code property through a JVM argument, insert -M-Dcode=24681357 in the VM arguments (for standalone Mule; in Studio use -Dcode=24681357).
If you want to set the code property in mule-app.properties, the line should be code=24681357 instead of -M-Dcode=24681357.
Those are two ways of setting properties in Mule (you can use wrapper.conf too). Choose the one that fits your needs, and do not use both simultaneously.
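Concretely, the two approaches from this answer look like the following. The property name and value come from the question; the exact file location of mule-app.properties may differ in your project:

```shell
# Option 1: pass the property as a JVM argument.
# Standalone Mule:
./bin/mule -M-Dcode=24681357
# In Studio, put the plain JVM form under
# Run Configurations > Arguments > VM arguments:
#   -Dcode=24681357

# Option 2: set it in mule-app.properties instead
# (no -M-D prefix inside the properties file):
#   code=24681357
```

Whichever option you pick, remove the property from the other place, since setting it in both is what the answer warns against.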

Related

WHMAPI1 Addpkgext Package Extensions Key Value (cPanel)

I am trying to run this WHM API 1 command from the shell to add package extensions to a package, following the guide https://api.docs.cpanel.net/openapi/whm/operation/addpkgext/
We have created 3 package extensions that work well in the WHM interface, but we need to activate these extensions from WHM API calls.
The command we are using is:
whmapi1 addpkgext name=samplepackage _PACKAGE_EXTENSIONS='ext1=value ext2=value ext3=value'
The guide clearly says we can use key=value pairs in the command, but when we run the above command, it returns:
metadata:
command: addpkgext
reason: Package extension value is invalid.
result: 0
version: 1
I am getting the same result when I use the "addpkg", "editpkg", or "modifypkg" API functions.
Can anyone help me sort this?
Thanks!

Pentaho Data Integration (kettle) 6.0.1 will not recognize command line argument

I am transferring a kettle Job to a new production server and when I call the initial kettle job, it will not recognize the command-line argument that I pass.
Here is the script/job call I am using:
/path/pdi-6.0.1/data-integration/kitchen.sh -file /path/kettle_job.kjb /command_line_argument_path > logfile
The problem is that command_line_argument_path is a path where files are supposed to be stored, but instead of going to that path, they are being put inside /path/pdi-6.0.1/data-integration.
The same call works on a testing server, and the permissions for all the places this job touches are the same across both servers, so I am confused why it is not putting the files inside the command-line argument path as the testing server does.
Does anyone have any advice?
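No answer is recorded in this thread, but one thing worth checking: Kettle resolves relative paths against the directory it was started from, and positional arguments are easy to lose between environments. A common alternative is to declare a named parameter in the job and pass it with kitchen.sh's -param option. The parameter name OUTPUT_PATH below is hypothetical and would have to be declared in the job's settings:

```shell
# Hypothetical invocation: OUTPUT_PATH must be declared as a named
# parameter in kettle_job.kjb (Job settings > Parameters tab).
/path/pdi-6.0.1/data-integration/kitchen.sh \
  -file=/path/kettle_job.kjb \
  -param:OUTPUT_PATH=/command_line_argument_path \
  > logfile

# Inside the job, reference the directory as ${OUTPUT_PATH}
# wherever output files are written.
```

This makes the path explicit to the job instead of depending on argument position or the working directory.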

Mule ESB vm arguments are ignored

I have been trying for a couple of hours now and it seems I cannot find a solution for how to set VM arguments when standalone Mule is started.
Many sources say I could set arguments in wrapper.conf like so: wrapper.java.additional.21=-Djavax.net.debug=all, or when starting Mule through the command line like so: ./mule -M-Djavax.net.debug=all, but nothing is working.
I made no changes to the mule file or any other file that I think matters.
What am I doing wrong?
ESB standalone version: 3.7 CE
I found out that if you set the parameter both in wrapper.conf and on the command line when starting Mule, the setting is somehow ignored. Make sure you use it in only one place.
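To illustrate the answer, with the values taken from the question, set the property in exactly one of these two places:

```shell
# Option 1: on the command line when starting standalone Mule
./mule -M-Djavax.net.debug=all

# Option 2: in conf/wrapper.conf, on an unused additional-property index:
#   wrapper.java.additional.21=-Djavax.net.debug=all

# Using both at once is what causes the setting to be ignored.
```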

Mule redeploy timer

Whenever we run the Mule server, it prints a message on the console saying "Mule is up and kicking (every 5000ms)". I want to change this time from 5000ms to some other value. I am using the Community edition of Mule server 3.5.
You can configure it using the mule.launcher.changeCheckInterval system property. Your start command should look like:
bin/mule start -M-Dmule.launcher.changeCheckInterval=3000
Alternatively you can add this line to your conf/wrapper.conf file:
wrapper.java.additional.<n>=-Dmule.launcher.changeCheckInterval=3000
where <n> is the lowest available number (if you haven't touched this file before, it is 4).
start -M-Dmule.launcher.changeCheckInterval=xxxx will help.
Alternatively, in the wrapper.conf file, add the following line:
wrapper.java.additional.<n>=-Dmule.launcher.changeCheckInterval=xxxx

WSO2 Hive analyser script result set

I am using WSO2 ESB 4.5.1 and WSO2 BAM 2.0.0. In my Hive script I am attempting to get a single value and assign it to a variable so I can use it later in SQL statements. I can use a variable with hiveconf, but I'm not sure how to assign a single value from the result set to it.
any ideas?
Thanks.
You can extend AbstractHiveAnalyzer and write your own class that executes the query and sets the Hive conf value, similar to this summarizer. The execute() method should be implemented, and it will be called by BAM. There you can add your preferred query and assign the Hive conf with setProperty("your_hive_conf", yourResult.string());.
You can build your Java application as a typical '.jar' file or as an OSGi bundle. If you have packaged it as just a '.jar' file, place the jar in $BAM_HOME/repository/components/lib. If you have packaged it as an OSGi bundle, place the file in the $BAM_HOME/repository/components/dropins folder. Then restart the BAM server.
Finally, in the Hive script that you add in BAM, include your extended class as 'class your.package.name.HiveAnalyzerImpl;', so that BAM will run the execute() method you implemented and your Hive conf will be set. The value you have set for your Hive conf can then be used in the Hive script as ${hiveconf:your_hive_conf}.
Hope this helps.
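To show the shape of the subclass the answer describes, here is a minimal sketch. Since the real AbstractHiveAnalyzer lives in the BAM libraries (and its exact package and members vary by version), a tiny stand-in with just execute() and setProperty is stubbed in so the example is self-contained; the query execution is also stubbed, since how you run it (JDBC to Hive, a BAM helper, etc.) depends on your setup:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal stand-in for BAM's AbstractHiveAnalyzer, only so this sketch
// compiles on its own; the real class comes from the BAM libraries.
abstract class AbstractHiveAnalyzer {
    private final Map<String, String> conf = new HashMap<>();

    public abstract void execute();

    protected void setProperty(String name, String value) {
        conf.put(name, value);
    }

    public String getProperty(String name) {
        return conf.get(name);
    }
}

// Your extension: BAM calls execute(), which runs the query and
// publishes the single result value as a Hive conf property.
public class HiveAnalyzerImpl extends AbstractHiveAnalyzer {

    @Override
    public void execute() {
        // In the real class this would run your query and take the
        // first column of the first row; hard-coded here as a placeholder.
        String singleValue = runMyQuery();

        // The script can then read this as ${hiveconf:your_hive_conf}.
        setProperty("your_hive_conf", singleValue);
    }

    private String runMyQuery() {
        return "42"; // placeholder result
    }

    public static void main(String[] args) {
        HiveAnalyzerImpl analyzer = new HiveAnalyzerImpl();
        analyzer.execute();
        System.out.println(analyzer.getProperty("your_hive_conf"));
    }
}
```

Only execute() and setProperty(...) are taken from the answer above; every other name here is illustrative.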