Consuming Spring Boot "metricsChannel" via Apache Camel/RabbitMQ

Spring Boot publishes all metrics events to a message channel named "metricsChannel" when a dependency on spring-messaging is present. In my project I am using Apache Camel with RabbitMQ as the broker. Is there any way to consume these metrics messages using purely Camel and not Spring Integration?
I can see Apache Camel has a Spring Integration component, but I would like to know whether there is a way to access these messages directly via the RabbitMQ component, or whether I have to add a dependency on the camel-spring-integration component as well.

It would be far easier for you to implement MetricWriter and do whatever you want with Camel instead of trying to bridge the result with Spring Integration. Look at MessageChannelMetricWriter; the implementation is quite straightforward.
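For illustration only, here is a minimal sketch of such a writer that forwards every metric to a Camel endpoint instead of the "metricsChannel". The endpoint URI, exchange name, and class name are assumptions, not part of any existing library:

    // Hedged sketch: a custom MetricWriter that pushes metrics to a Camel
    // endpoint (here the camel-rabbitmq component). URI and names are illustrative.
    import org.apache.camel.ProducerTemplate;
    import org.springframework.boot.actuate.metrics.Metric;
    import org.springframework.boot.actuate.metrics.writer.Delta;
    import org.springframework.boot.actuate.metrics.writer.MetricWriter;

    public class CamelMetricWriter implements MetricWriter {

        private final ProducerTemplate producer;
        private final String endpointUri; // e.g. "rabbitmq://localhost:5672/metrics.exchange"

        public CamelMetricWriter(ProducerTemplate producer, String endpointUri) {
            this.producer = producer;
            this.endpointUri = endpointUri;
        }

        @Override
        public void increment(Delta<?> delta) {
            // Counter updates arrive here
            producer.sendBody(endpointUri, delta);
        }

        @Override
        public void set(Metric<?> value) {
            // Gauge updates arrive here
            producer.sendBody(endpointUri, value);
        }

        @Override
        public void reset(String metricName) {
            // Reset is modeled as a zero-valued metric in this sketch
            producer.sendBody(endpointUri, new Metric<Long>(metricName, 0L));
        }
    }

In Spring Boot 1.3+ you would typically expose this as a bean annotated with @ExportMetricWriter so the metric exporter picks it up.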

Related

PCF / Cloud connector for Rabbit management API

All,
I'm running a simple Spring Boot app in PCF using a Rabbit on-demand service. The auto-reconfiguration of the ConnectionFactory for the internal Rabbit service works just fine.
However, I need a list of all queues on the Rabbit host. AFAIK this is only available through a call to the Rabbit management plugin (a REST API); see RabbitManagementTemplate::getQueues. This class expects an HTTP URI with credentials.
I know the URI and credentials are exposed through the vcap.services variables as "http_api_uri", but I wonder if there's a more elegant way to get an instance of RabbitManagementTemplate via Spring Cloud Connectors / auto-reconfiguration magic instead of manually reading the env vars and writing custom bean config.
It seems the ConnectionFactory only knows about the AMQP interface, and cannot create a RabbitManagementTemplate?
Thanks!
Spring Cloud Connectors won't help you here. It doesn't support setting up RabbitManagementTemplate, only a ConnectionFactory.
You don't have to parse the env yourself, you can use the flattened properties that Boot provides such as vcap.services.rabbitmq.credentials.http_api_uri. But you'll need to configure a RabbitManagementTemplate yourself using those Boot properties.
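As an illustration, a configuration along these lines could wire the template from those flattened properties. The service-name segment ("rabbitmq") and the username/password credential keys are assumptions about your binding, so adjust them to match your VCAP_SERVICES entry:

    // Hedged sketch: build a RabbitManagementTemplate from Boot's flattened
    // vcap.services.* properties. Property names are assumptions about the binding.
    import org.springframework.amqp.rabbit.core.RabbitManagementTemplate;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class RabbitManagementConfig {

        @Bean
        public RabbitManagementTemplate rabbitManagementTemplate(
                @Value("${vcap.services.rabbitmq.credentials.http_api_uri}") String httpApiUri,
                @Value("${vcap.services.rabbitmq.credentials.username}") String username,
                @Value("${vcap.services.rabbitmq.credentials.password}") String password) {
            return new RabbitManagementTemplate(httpApiUri, username, password);
        }
    }

With that bean in place, rabbitManagementTemplate.getQueues() returns the queue listing via the management REST API.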

How do I use ActiveMQ in Apache Flink?

I am getting my data through ActiveMQ which I want to process in real time with Apache Flink DataStreams. There is support for many messaging services like RabbitMQ and Kafka but I can't see any support for ActiveMQ. How can I use it?
Since there is no support for ActiveMQ, I would recommend implementing a custom source.
You basically have to implement the SourceFunction interface.
If you want to have exactly-once semantics, you can base your implementation on the MultipleIdsMessageAcknowledgingSourceBase class.
I would recommend starting with a plain SourceFunction, as in the sketch below.
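For illustration, a bare-bones sketch of such a source using the ActiveMQ JMS client; the broker URL and queue name are placeholders. With AUTO_ACKNOWLEDGE this gives at-most-once delivery, so for stronger guarantees you would build on the acknowledging base classes mentioned above:

    // Hedged sketch: a Flink SourceFunction that reads text messages from an
    // ActiveMQ queue via JMS. Broker URL and queue name are placeholders.
    import javax.jms.Connection;
    import javax.jms.Message;
    import javax.jms.MessageConsumer;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import org.apache.activemq.ActiveMQConnectionFactory;
    import org.apache.flink.streaming.api.functions.source.SourceFunction;

    public class ActiveMQSource implements SourceFunction<String> {

        private volatile boolean running = true;
        private final String brokerUrl;
        private final String queueName;

        public ActiveMQSource(String brokerUrl, String queueName) {
            this.brokerUrl = brokerUrl;
            this.queueName = queueName;
        }

        @Override
        public void run(SourceContext<String> ctx) throws Exception {
            ActiveMQConnectionFactory factory = new ActiveMQConnectionFactory(brokerUrl);
            Connection connection = factory.createConnection();
            connection.start();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageConsumer consumer = session.createConsumer(session.createQueue(queueName));
            try {
                while (running) {
                    Message msg = consumer.receive(1000); // poll so cancel() is noticed
                    if (msg instanceof TextMessage) {
                        synchronized (ctx.getCheckpointLock()) {
                            ctx.collect(((TextMessage) msg).getText());
                        }
                    }
                }
            } finally {
                consumer.close();
                session.close();
                connection.close();
            }
        }

        @Override
        public void cancel() {
            running = false;
        }
    }

It would then be registered like any other source, e.g. env.addSource(new ActiveMQSource("tcp://localhost:61616", "input-queue")).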
Found a JMS connector for Flink:
https://github.com/jkirsch/senser/blob/master/src/main/java/edu/tuberlin/senser/images/flink/io/FlinkJMSStreamSource.java

ActiveMQ, STOMP, Java example

Can anyone point me to a decent example where a Java STOMP client is used to connect to ActiveMQ?
I am also interested in the following:
Is failover supported over STOMP?
How do I create a durable subscription?
Does STOMP support asynchronous messaging? Examples? I think I have to implement the MessageListener interface for it, but I wasn't able to find an example of this.
If you really want to use STOMP from Java then you could look at StompJMS, which maps quite a bit of the JMS API onto STOMP. It doesn't support failover, but there aren't a lot of STOMP clients that do. When using Java you are better off using the native JMS client from the ActiveMQ broker, as it is the most robust and feature-complete client library you will find.
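To illustrate the native-client route, here is a minimal, hedged example that touches all three points: the failover: transport URI, a durable topic subscription (which requires a client ID), and asynchronous delivery via a MessageListener. Broker addresses, the client ID, and the topic/subscription names are placeholders:

    // Hedged sketch: durable, asynchronous consumption with the ActiveMQ JMS client.
    import javax.jms.Connection;
    import javax.jms.Message;
    import javax.jms.MessageListener;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.jms.Topic;
    import javax.jms.TopicSubscriber;
    import org.apache.activemq.ActiveMQConnectionFactory;

    public class DurableAsyncConsumer {
        public static void main(String[] args) throws Exception {
            // failover: transparently reconnects to one of the listed brokers
            ActiveMQConnectionFactory factory = new ActiveMQConnectionFactory(
                    "failover:(tcp://broker1:61616,tcp://broker2:61616)");
            Connection connection = factory.createConnection();
            connection.setClientID("example-client"); // required for durable subscriptions
            connection.start();

            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Topic topic = session.createTopic("example.topic");
            TopicSubscriber subscriber = session.createDurableSubscriber(topic, "example-subscription");

            // Asynchronous delivery: the broker pushes messages to this listener.
            subscriber.setMessageListener(new MessageListener() {
                @Override
                public void onMessage(Message message) {
                    try {
                        if (message instanceof TextMessage) {
                            System.out.println("Received: " + ((TextMessage) message).getText());
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            });

            System.in.read(); // keep the example running until Enter is pressed
            connection.close();
        }
    }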

What's the best way to have a Mule ESB flow monitor a Hazelcast Distributed Queue?

I have a cluster of Mule ESB containers, each of which is running an (identical) application which share responsibility for handling work which arrives as objects in a Hazelcast Distributed Queue.
Question: What's the best way to have these applications monitor the queue?
Conceptually, I imagine a "Hazelcast Queue Endpoint" that sits blocked on the queue until an object shows up, but I'm not quite sure how to realize this.
(Mule ESB 3.3, community edition).
Thanks.
I would use DevKit to create a custom Hazelcast Queue Endpoint and build it as a custom module.
This custom module would be usable in Studio and raw Mule.
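As a rough illustration only (not an existing module), a DevKit @Source method could block on the distributed queue and push each item into the flow. The connector name, queue name, and the way the Hazelcast instance is obtained are all assumptions; a real module would join your existing cluster rather than start a new member:

    // Hedged DevKit sketch of a Hazelcast queue message source.
    import com.hazelcast.core.Hazelcast;
    import com.hazelcast.core.HazelcastInstance;
    import com.hazelcast.core.IQueue;
    import org.mule.api.annotations.Connector;
    import org.mule.api.annotations.Source;
    import org.mule.api.callback.SourceCallback;

    @Connector(name = "hazelcast-queue", friendlyName = "Hazelcast Queue")
    public class HazelcastQueueConnector {

        // Illustrative only: starts a new cluster member; a real module would
        // connect to the existing cluster instead.
        private final HazelcastInstance hazelcast = Hazelcast.newHazelcastInstance();

        // Message source: blocks on the distributed queue and hands each item
        // to the Mule flow via the SourceCallback.
        @Source
        public void consume(String queueName, SourceCallback callback) throws Exception {
            IQueue<Object> queue = hazelcast.getQueue(queueName);
            while (!Thread.currentThread().isInterrupted()) {
                Object item = queue.take(); // blocks until an object shows up
                callback.process(item);
            }
        }
    }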

RabbitMQ, is it possible to publish via one protocol and consume via another?

RabbitMQ supports multiple protocols: AMQP, MQTT, STOMP, etc.
When using PHP, for example, it's easier to publish using a STOMP library, since the PHP AMQP libraries require compiled C code and are somewhat of a mission to set up if you don't have to.
On the Java side, Apache Camel with AMQP on Spring is pretty straightforward.
Is it possible to set up a queue, publish to it via STOMP and then consume via AMQP, and likewise publish via AMQP and consume via STOMP, if the message broker is RabbitMQ?
Yes, this should work, given that you have installed RabbitMQ's STOMP plugin on your RabbitMQ node(s).
The protocol only defines the communication between client and server and has no impact on a message itself.
You should note that using protocols other than AMQP will most likely come along with limitations and/or worse performance.
There also exist native PHP libraries for RabbitMQ that don't require compiling C code. Unfortunately, I cannot tell you which one is the best, because I am a Java guy ;-).
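To illustrate the AMQP side, here is a hedged Camel route sketch using the camel-rabbitmq component. The host, exchange, queue, and routing-key names are placeholders; the STOMP plugin maps destinations such as /queue/orders onto ordinary RabbitMQ queues, so queue and exchange options may need adjusting to match what the plugin declares:

    // Hedged sketch: consume over AMQP from a queue a STOMP client publishes to,
    // and publish results to another queue a STOMP consumer can read.
    import org.apache.camel.builder.RouteBuilder;
    import org.apache.camel.impl.DefaultCamelContext;

    public class StompToAmqpBridge {
        public static void main(String[] args) throws Exception {
            DefaultCamelContext context = new DefaultCamelContext();
            context.addRoutes(new RouteBuilder() {
                @Override
                public void configure() {
                    // Messages sent by a STOMP client to /queue/orders land in queue "orders".
                    from("rabbitmq://localhost:5672/amq.direct?queue=orders&routingKey=orders&autoDelete=false")
                        .log("Received over AMQP: ${body}")
                        // Publish onwards; a STOMP client can subscribe to /queue/order-results.
                        .to("rabbitmq://localhost:5672/amq.direct?routingKey=order-results&autoDelete=false");
                }
            });
            context.start();
            Thread.sleep(Long.MAX_VALUE); // keep the route running in this sketch
        }
    }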