Is it possible to start Mule 3 embedded without the Spring dependency?

I am trying to use Mule 3.2.1 embedded from a plain Java application. The application is supposed to run in an environment where storage space is limited.
I tried something like this (imports and exceptions omitted for brevity):
DefaultMuleContextFactory muleContextFactory = new DefaultMuleContextFactory();
ConfigurationBuilder configBuilder = new AutoConfigurationBuilder("mule-config.xml");
MuleContext muleContext = muleContextFactory.createMuleContext(configBuilder);
muleContext.start();
and also this:
AutoConfigurationBuilder configBuilder = new AutoConfigurationBuilder("mule-config.xml");
DefaultMuleConfiguration configuration = new DefaultMuleConfiguration();
MuleContextBuilder contextBuilder = new DefaultMuleContextBuilder();
contextBuilder.setMuleConfiguration(configuration);
MuleContext muleContext = new DefaultMuleContextFactory().createMuleContext(configBuilder, contextBuilder);
muleContext.start();
but both require spring-core, spring-beans, spring-context and some commons libraries. Any help would be great.

If you use the XML configuration, you need Spring.
If you don't want to use Spring, your options are:
Instantiate and wire Mule internal components by hand, dealing with life cycles as well,
Wait until Mule DSL gets released. You may want to bug MuleSoft about a release date :)
If you only want to use raw transports, i.e. without configuring any flow or pattern, you can do it without Spring. Bear in mind, though, that while the mule-core dependency doesn't bring in Spring transitively, all the modules and transports do, so you'll have to use dependency exclusions to keep Spring at bay.
For example, to use the HTTP transport, you would need these Maven dependencies:
<dependency>
<groupId>org.mule</groupId>
<artifactId>mule-core</artifactId>
<version>3.4.0</version>
</dependency>
<dependency>
<groupId>org.mule.transports</groupId>
<artifactId>mule-transport-http</artifactId>
<version>3.4.0</version>
<exclusions>
<exclusion>
<groupId>org.mule.modules</groupId>
<artifactId>mule-module-spring-config</artifactId>
</exclusion>
</exclusions>
</dependency>
With this in place you then do:
MuleContextFactory muleContextFactory = new DefaultMuleContextFactory();
MuleContextBuilder muleContextBuilder = new DefaultMuleContextBuilder();
MuleContext muleContext = muleContextFactory.createMuleContext(muleContextBuilder);
muleContext.start();
MuleClient client = muleContext.getClient();
MuleMessage response = client.request("http://www.google.com", 20000L);
System.out.println(response.getPayloadAsString());
muleContext.dispose();
System.exit(0);
Note that if that's all you're doing with Mule, you'd be better off using the Apache HTTP Client directly :)

Here is a similar variant, which creates the context the same way but sends the request with a MuleClient:
MuleContextFactory muleContextFactory = new DefaultMuleContextFactory();
MuleContextBuilder muleContextBuilder = new DefaultMuleContextBuilder();
MuleContext muleContext = muleContextFactory.createMuleContext(muleContextBuilder);
muleContext.start();
// create mule client
MuleClient client = new MuleClient(muleContext);
// generate xml request
String reportRequestXml = createXML(reportRequest);
// set up message properties
Map<String, Object> messageProperties = new HashMap<String, Object>();
messageProperties.put("Content-Type", "application/xml");
// send request with timeout
MuleMessage response = client.send(crsRestUrl, reportRequestXml, messageProperties, httpTimeout);
muleContext.stop();

Related

Spring Data Redis ignoring user credentials

I am trying to create a Redis client with Spring Data Redis and Lettuce. What I am observing is that any password other than the default user's password doesn't work. Below is the code:
@Bean
public LettuceConnectionFactory lettuceConnectionFactory() {
RedisStandaloneConfiguration redisStandaloneConfiguration = new RedisStandaloneConfiguration();
redisStandaloneConfiguration.setHostName(host);
redisStandaloneConfiguration.setPort(port);
redisStandaloneConfiguration.setUsername(username);
redisStandaloneConfiguration.setPassword(RedisPassword.of(password));
LettuceConnectionFactory lcf = new LettuceConnectionFactory(redisStandaloneConfiguration);
lcf.setShareNativeConnection(false);
lcf.afterPropertiesSet();
return lcf;
}
@Bean
public RedisTemplate<String, Object> redisTemplate() {
RedisTemplate<String, Object> template = new RedisTemplate<>();
template.setConnectionFactory(lettuceConnectionFactory());
template.afterPropertiesSet();
return template;
}
In the debug logs, I can see that it is using the username provided:
Trying to get a Redis connection for: redis://test:*******@serverA.net:12345
However, no password other than the default user's password works; the connection eventually fails with WRONGPASS invalid username-password pair. I am able to connect with the same credentials from the Redis CLI.
What is wrong with the above code? I am using Spring Boot 2.4.2 and lettuce-core 6.0.2.
I had the Spring and Lettuce RediSearch dependencies in the POM file. Removing them resolved the issue:
<dependency>
<groupId>com.redislabs</groupId>
<artifactId>lettusearch</artifactId>
<version>2.4.4</version>
</dependency>
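To isolate issues like this, it can help to verify the credentials with plain lettuce-core, bypassing Spring Data Redis entirely. A minimal sketch, assuming the host, port and username from the log line above; the password is a placeholder:
import io.lettuce.core.RedisClient;
import io.lettuce.core.RedisURI;
import io.lettuce.core.api.StatefulRedisConnection;

public class RedisAuthCheck {
    public static void main(String[] args) {
        // Placeholder connection details; use the same values as the Spring config.
        RedisURI uri = RedisURI.builder()
                .withHost("serverA.net")
                .withPort(12345)
                .withAuthentication("test", "secret".toCharArray())
                .build();
        RedisClient client = RedisClient.create(uri);
        try (StatefulRedisConnection<String, String> connection = client.connect()) {
            // PING only succeeds if the username/password ACL pair is accepted.
            System.out.println(connection.sync().ping());
        } finally {
            client.shutdown();
        }
    }
}
If this succeeds while the Spring-configured client fails, the problem lies in the Spring wiring or a conflicting dependency rather than in the credentials.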

HttpComponentsClientHttpConnector is not accepting org.apache.http.impl.nio.client.CloseableHttpAsyncClient for Webclient with Apache Http Client

I'm trying to run WebFlux on Tomcat and to create a Spring WebClient backed by the Apache HTTP Client.
The reference documentation states that there's built-in support:
https://docs.spring.io/spring-framework/docs/current/reference/html/web-reactive.html#webflux-client-builder-http-components
private ClientHttpConnector getApacheHttpClient(){
HttpAsyncClientBuilder clientBuilder = HttpAsyncClients.custom();
clientBuilder.setDefaultRequestConfig(RequestConfig.DEFAULT);
CloseableHttpAsyncClient client = clientBuilder.build();
ClientHttpConnector connector = new HttpComponentsClientHttpConnector(client);
return connector;
}
But Spring's HttpComponentsClientHttpConnector does not accept org.apache.http.impl.nio.client.CloseableHttpAsyncClient; it requires org.apache.hc.client5.http.impl.async.CloseableHttpAsyncClient. So there seems to have been a package rename, and I can't find a Maven dependency that provides the required class.
Does anybody know the right Maven dependency for that class, or how I could make this work?
Apache HTTP Client 5 is a separate artifact. You'll need to add the following dependencies to your pom.xml:
<dependency>
<groupId>org.apache.httpcomponents.client5</groupId>
<artifactId>httpclient5</artifactId>
<version>5.1</version>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents.core5</groupId>
<artifactId>httpcore5-reactive</artifactId>
<version>5.1</version>
</dependency>
import org.apache.hc.client5.http.impl.async.HttpAsyncClients;
import org.springframework.http.client.reactive.ClientHttpConnector;
import org.springframework.http.client.reactive.HttpComponentsClientHttpConnector;

public class ApacheHttp {
    public static void main(String[] args) {
        // The connector accepts the HttpClient 5 async client.
        ClientHttpConnector connector =
                new HttpComponentsClientHttpConnector(HttpAsyncClients.custom().build());
    }
}
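For completeness, a short sketch of plugging the connector into the WebClient builder; the base URL is a placeholder:
import org.apache.hc.client5.http.impl.async.HttpAsyncClients;
import org.springframework.http.client.reactive.HttpComponentsClientHttpConnector;
import org.springframework.web.reactive.function.client.WebClient;

// Build a WebClient backed by Apache HttpClient 5 instead of the default Reactor Netty.
WebClient webClient = WebClient.builder()
        .clientConnector(new HttpComponentsClientHttpConnector(HttpAsyncClients.custom().build()))
        .baseUrl("https://example.org") // placeholder
        .build();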

Apache Beam: is it possible to consume messages from RabbitMQ with an exchange and routing key

I defined a pipeline in Apache Beam to consume messages of a given queue in RabbitMQ message broker.
I defined an exchange and routing key in RabbitMQ.
I used AmqpIO.read() in Beam (version 2.9.0) but I did not find any API to set the exchange and the routing key.
(Following this doc: https://beam.apache.org/releases/javadoc/2.4.0/org/apache/beam/sdk/io/amqp/AmqpIO.html)
Is there any way to do that, even with another connector?
Regards,
Ali
There is a new (experimental) IO connector for RabbitMQ shipped with the latest v2.9.0 Apache Beam release. The AMQP connector will not work for RabbitMQ.
If you are using Maven, add the following dependency to your POM:
<!-- Beam RabbitMQ I/O -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-io-rabbitmq</artifactId>
<version>2.9.0</version>
</dependency>
and you can use it in a pipeline like this:
public class RabbitMQPipeline {

    static final Logger log = LoggerFactory.getLogger(RabbitMQPipeline.class);

    /**
     * RabbitMQ pipeline options.
     */
    public interface RabbitMQPipelineOptions extends PipelineOptions {
        @Description("URI of the AMQP broker to read from")
        @Default.String("amqp://localhost")
        @Required
        String getUri();

        void setUri(String uri);
    }

    /**
     * @param args
     */
    public static void main(String[] args) {
        RabbitMQPipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation()
                .as(RabbitMQPipelineOptions.class);
        Pipeline pipeline = Pipeline.create(options);
        // RabbitMqIO2 is a locally rebuilt copy of RabbitMqIO containing the fix mentioned below.
        PCollection<RabbitMqMessage> messages = pipeline
                .apply(RabbitMqIO2.read().withUri(options.getUri()).withQueue("test"));
        messages.apply(ParDo.of(new DoFn<RabbitMqMessage, String>() {
            @ProcessElement
            public void process(@Element RabbitMqMessage msg) {
                System.out.println(msg.toString());
            }
        }));
        pipeline.run().waitUntilFinish();
    }
}
The RabbitMqIO Javadoc has examples of how to use the reader and writer.
A word of caution
There is a known bug, already fixed but scheduled for release in v2.11.0, that blocks the connector from working even in the simplest scenarios. The fix is really simple (see the JIRA issue), but you will need to rebuild a new version of the class. In case you want to give it a try, make sure you add the following Maven dependency:
<dependency>
<groupId>com.google.auto.value</groupId>
<artifactId>auto-value</artifactId>
<version>1.5.2</version>
<scope>provided</scope>
</dependency>
and add the following configuration to the Maven Compiler Plugin:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.6.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
<annotationProcessors>
<annotationProcessor>com.google.auto.value.processor.AutoValueProcessor</annotationProcessor>
</annotationProcessors>
</configuration>
</plugin>
If you are using Eclipse, make sure you install the m2e-apt plugin. Good luck!

WRITE_DATES_WITH_ZONE_ID cannot be disabled for ZonedDateTime

I'm using Jackson 2.8.1 for serialization of Java objects. However, I just can't get rid of the zone id when converting a ZonedDateTime object to a string, even with WRITE_DATES_WITH_ZONE_ID set to false:
ObjectMapper mapper = new ObjectMapper()
.findAndRegisterModules()
.setSerializationInclusion(Include.NON_EMPTY)
.configure(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS, false)
.configure(SerializationFeature.WRITE_DATES_WITH_ZONE_ID, false);
ZonedDateTime zdt = ZonedDateTime.now();
System.out.println(mapper.writeValueAsString(zdt)); // "2016-08-23T13:35:38.127+08:00[Asia/Shanghai]"
Can anyone help?
I would say the issue is because you call "findAndRegisterModules".
You probably added the following dependency:
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-jsr310</artifactId>
<version>${jackson.version}</version>
</dependency>
Doing so, you introduced two new modules: "JSR310Module" (deprecated) and "JavaTimeModule".
"findAndRegisterModules" can register the "JSR310Module" that is not supporting properly the WRITE_DATES_WITH_ZONE_ID feature.
You can register the right module by removing "findAndRegisterModules" and adding:
JavaTimeModule module = new JavaTimeModule();
mapper.registerModule(module);
Then, don't forget to disable WRITE_DATES_WITH_ZONE_ID on your mapper:
mapper.disable(SerializationFeature.WRITE_DATES_WITH_ZONE_ID);
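Putting it together, a minimal sketch; the value in the comment is illustrative:
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import java.time.ZonedDateTime;

ObjectMapper mapper = new ObjectMapper()
        .registerModule(new JavaTimeModule()) // register explicitly instead of findAndRegisterModules
        .disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS)
        .disable(SerializationFeature.WRITE_DATES_WITH_ZONE_ID);

// Prints an ISO offset timestamp without the trailing zone id,
// e.g. "2016-08-23T13:35:38.127+08:00" instead of "...+08:00[Asia/Shanghai]"
System.out.println(mapper.writeValueAsString(ZonedDateTime.now()));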

File format converter library in Java

I am trying to create a simple web service in Java that uses a library to convert an input docx file to a PDF file. Can someone please suggest some libraries and share some sample code?
Since you are using Jersey, configure the file upload part of it. For that you need this dependency:
<dependency>
<groupId>com.sun.jersey.contribs</groupId>
<artifactId>jersey-multipart</artifactId>
<version>1.8</version>
</dependency>
After that you need the documents4j dependency; I believe it is something similar to this:
<dependency>
<groupId>com.documents4j</groupId>
<artifactId>documents4j-api</artifactId>
<version>0.2.1</version>
</dependency>
You can find more dependency info here:
http://mvnrepository.com/artifact/com.documents4j
After that, your resource class should receive the file upload:
http://www.mkyong.com/webservices/jax-rs/file-upload-example-in-jersey/
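For illustration, a rough sketch of such an upload endpoint using the Jersey 1.x multipart API; the path, parameter names and temp location are placeholders:
import java.io.File;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import com.sun.jersey.core.header.FormDataContentDisposition;
import com.sun.jersey.multipart.FormDataParam;

@Path("/convert")
public class ConvertResource {

    @POST
    @Consumes(MediaType.MULTIPART_FORM_DATA)
    public Response upload(
            @FormDataParam("file") InputStream uploadedStream,
            @FormDataParam("file") FormDataContentDisposition fileDetail) throws Exception {
        // Save the uploaded docx to a temporary file, then convert it as shown below.
        File wordFile = new File("/tmp/" + fileDetail.getFileName());
        Files.copy(uploadedStream, wordFile.toPath(), StandardCopyOption.REPLACE_EXISTING);
        return Response.ok("Received " + fileDetail.getFileName()).build();
    }
}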
And convert it to PDF:
File wordFile = new File( ... ), target = new File( ... );
IConverter converter = ... ;
Future<Boolean> conversion = converter
.convert(wordFile).as(DocumentType.MS_WORD)
.to(target).as(DocumentType.PDF)
.prioritizeWith(1000) // optional
.schedule();
https://github.com/documents4j/documents4j
For more docs you can look here:
http://documents4j.com/#/
For IConverter:
IConverter converter = LocalConverter.builder()
        .baseFolder(new File("C:\\Users\\documents4j\\temp"))
        .workerPool(20, 25, 2, TimeUnit.SECONDS)
        .processTimeout(5, TimeUnit.SECONDS)
        .build();