I want to set the time zone on the server, similar to what we have in Mule 3:
https://help.mulesoft.com/s/article/How-to-properly-set-the-TimeZone-in-Mule-Mule-Runtime-and-CloudHub
But in Mule 4 I don't see a way to call a class file outside a flow. Any suggestions?
One way I found is setting -Duser.timezone= in the VM arguments, but I am still looking for a way to do it at the program level.
To call Java classes in Mule 4 you need to add the Java Module to your project and use it to call the method. If you want to use the Spring configuration instead of calling Java from a flow, then you need to add the Spring Module. Note that you have to put the Spring bean configuration in its own pure Spring XML file. Mule 4 XML files are not Spring XML files like in Mule 3. That means the Spring XML file must not have a <mule> tag. You need to reference the Spring XML in the Mule XML (example: <spring:config name="springConfig" files="beans.xml" />).
See the examples in the Spring Module documentation if you are not familiar with Spring.
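As an illustration only, a minimal beans.xml for setting the default time zone (similar to the approach in the Mule 3 article, using Spring's MethodInvokingFactoryBean; the bean id and the time zone ID are placeholders) might look like this:

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd">

    <!-- Calls java.util.TimeZone.setDefault(TimeZone.getTimeZone("UTC")) when the Spring context starts -->
    <bean id="setDefaultTimeZone" class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
        <property name="staticMethod" value="java.util.TimeZone.setDefault"/>
        <property name="arguments">
            <list>
                <bean class="java.util.TimeZone" factory-method="getTimeZone">
                    <constructor-arg value="UTC"/>
                </bean>
            </list>
        </property>
    </bean>

</beans>

The Mule configuration then only needs the <spring:config name="springConfig" files="beans.xml"/> element mentioned above to load this file at application startup.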
I am performing XML-to-JSON transformations using the Transform Message connector. I have created a Mule plugin for the transformation code and added it as a dependency of my application. When I deploy the application in Anypoint Studio (4.3.0) it works as expected, i.e. I get the full payload transformed to JSON. But when I deploy the same application on premises, some fields of the input (XML) are missing in the output (JSON). In the on-premises application I send the XML payload via JMS (1.7.1) Publish to a queue where my application listens with JMS On New Message, use the transformation Mule plugin (added as a dependency) to transform the XML to JSON, and publish the result via JMS Publish to a queue where another API is listening.
I observed that when I divide the DataWeave code into modules and import them into a main .dwl file, the fields are missing after deploying on premises. But when I keep all of the modules' code in the same .dwl file, I get all of the fields.
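For reference, the split looks roughly like this (the module name, folder and function are illustrative, not the actual code):

modules/Transformations.dwl:

%dw 2.0
fun toOrder(item) = {
    id: item.@id,
    name: item.name
}

main transformation:

%dw 2.0
import * from modules::Transformations
output application/json
---
payload.orders.*order map (item) -> toOrder(item)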
Please help me with this.
Issue resolved. There was a difference between the Studio runtime and the on-premises runtime. When I patched the on-premises runtime with the latest update, the issue was resolved.
It looks like the CorDapp outputs its console log in the log4j layout. I need it in the Logstash JSON layout. I have already implemented a library that outputs the Logstash JSON layout, and it works well with Spring Boot or a regular application. However, when it is used with the CorDapp, the CorDapp's log4j layout always overrides it.
Details on how I am trying to implement this:
I have extended log4j's ConfigurationFactory to create a CustomConfigurationFactory. The main purpose of the CustomConfigurationFactory is to implement the Logstash layout and custom rolling of the log file. Some additional metadata is included with every log statement. It uses org.slf4j.Logger in the background, along with the custom configuration, to log statements in the custom format and to implement our custom rolling. It is built as an independent library so it can be used across all our applications.
I am using this custom logging library for logging purposes. It works for the accompanying Spring Boot application that interacts with the Corda nodes; however, the logging on the Corda node itself is still in the default log4j format.
Any suggestions?
Corda uses log4j for logging; you can provide your custom format in the log4j configuration files. You can find more details on logging in the Corda documentation: https://docs.corda.net/docs/corda-os/4.6/node-administration.html#logging
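As a sketch only (assuming Log4j2's built-in JsonLayout rather than a Logstash-specific layout, Jackson on the classpath, and that the custom file is passed to the node via the standard -Dlog4j.configurationFile system property; the file name is illustrative), a JSON console configuration could look like:

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
    <Appenders>
        <!-- Console appender emitting one JSON object per log event -->
        <Console name="JsonConsole" target="SYSTEM_OUT">
            <JsonLayout compact="true" eventEol="true" properties="true"/>
        </Console>
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="JsonConsole"/>
        </Root>
    </Loggers>
</Configuration>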
We are using JBoss Fuse 6.2 along with a technical stack of Blueprint, Camel, ActiveMQ and MyBatis.
We need to know how to configure property files in OSGi.
As far as I know we can configure .cfg files, but is there a simpler way, comparable to how configuration is done in Spring?
In our code we read from property files using the ext:property-placeholder namespace, giving the bean id and the values.
Please advise whether there is a simpler way to read property files.
There are several ways to add configuration, because OSGi services can access configuration via the ConfigurationAdmin service. Blueprint can also access property values through it.
JBoss Fuse is based on Karaf, so you can use the following approach.
(Some of the text below is quoted from http://www.liquid-reality.de/display/liquid/2011/09/23/Karaf+Tutorial+Part+2+-+Using+the+Configuration+Admin+Service)
Configuration with Blueprint
The integration with our bean class is mostly a simple bean definition where we define the title property and assign the placeholder, which will be resolved using the Config Admin service. The only special thing is the init-method, which gives us a chance to react after all changes have been made, as in the pure OSGi example.
For Blueprint we do not need any Maven dependencies, as our Java code is a pure Java bean. The Blueprint context is activated simply by putting it in the OSGI-INF/blueprint directory and having the Blueprint extender loaded. As Blueprint is always loaded in Karaf, we do not need anything else.
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
           xmlns:cm="http://aries.apache.org/blueprint/xmlns/blueprint-cm/v1.1.0">

    <cm:property-placeholder persistent-id="ConfigApp" update-strategy="reload">
        <cm:default-properties>
            <cm:property name="title" value="Default Title"/>
        </cm:default-properties>
    </cm:property-placeholder>

    <!-- the class name is illustrative; point it at your own bean class -->
    <bean id="myApp" class="com.example.MyApp" init-method="refresh">
        <property name="title" value="${title}"/>
    </bean>

</blueprint>
Afterwards you can put a .cfg file (which is a standard Java properties file) in Karaf's etc or deploy directory, named after the given persistent-id, which is ConfigApp in our example (for example: etc/ConfigApp.cfg):
title=Configured title
I have a rather unusual requirement: I need to convert JSON to XML in a Worklight server adapter procedure and then send that XML to some other systems.
I am using the HTTP Worklight adapter.
Is it possible to meet this requirement in a Worklight adapter?
Finally I got the answer after a lot of googling.
Worklight has the flexibility to use Java code.
I already had Java code that serves the purpose (converting JSON to XML), so I just imported that class into my Worklight project (copied it into the apps/server/ folder).
In the adapter I used that class like this: var xml = com.XXXX.json2xml(input);
Deploy the adapter and the Worklight application WAR file and you are ready to go.
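For illustration only, a helper class along these lines would do the conversion (the package, class name and the org.json dependency are assumptions, not the exact code used above):

package com.example.transform;

import org.json.JSONObject;
import org.json.XML;

public class JsonToXml {

    // Converts a JSON string into an XML string using org.json's XML utility.
    public static String json2xml(String json) {
        JSONObject jsonObject = new JSONObject(json);
        return XML.toString(jsonObject);
    }
}

From the adapter JavaScript it would then be called with its fully qualified name, e.g. var xml = com.example.transform.JsonToXml.json2xml(input);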
If having 3rd party library support in adapters is important to you, please open a Request for Enhancement (RFE).
I'm trying to create multiple mediators inside the same Java package, e.g.
com.samples.wso2
However, when two or more mediators are deployed using the same package name, the ESB finds just one mediator class. If a unique package name is used for each mediator, then all of them are found. I didn't find any information about this, but it seems like a bug to me. Can somebody confirm this? Thanks.
Version used: 4.8.0.
I have two custom mediators ("genuine" mediators, with XML configuration, not class mediators) in the same package in WSO2 ESB 4.8.0 without any problem.
You should check your META-INF/services/org.apache.synapse.config.xml.MediatorFactory and MediatorSerializer files: you need two lines in each file, one per mediator, referencing its factory in the first file and its serializer in the second.
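For example (the class names are illustrative; use your own factory and serializer classes), the two service files would look something like this:

META-INF/services/org.apache.synapse.config.xml.MediatorFactory:
com.samples.wso2.FirstMediatorFactory
com.samples.wso2.SecondMediatorFactory

META-INF/services/org.apache.synapse.config.xml.MediatorSerializer:
com.samples.wso2.FirstMediatorSerializer
com.samples.wso2.SecondMediatorSerializer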