Processing messages from RabbitMQ at a specified rate

We have been trying to make a listener read messages from RabbitMQ at a certain rate, 1 msg / 2 seconds. We did not find any such utility in RabbitMQ so far, so we thought of doing this with a DB: the listener will read the messages and store them in the DB, and later a scheduler will process them from the DB at the desired rate. If there is a better way of doing this, please suggest it. We are developing our application in Spring. Thanks in advance.

You can't do it with a listener, but you can do it with a RabbitTemplate ...
@SpringBootApplication
public class So40446967Application {

    public static void main(String[] args) throws Exception {
        ConfigurableApplicationContext context = SpringApplication.run(So40446967Application.class, args);
        RabbitAdmin admin = context.getBean(RabbitAdmin.class);
        AnonymousQueue queue = new AnonymousQueue();
        admin.declareQueue(queue);
        RabbitTemplate template = context.getBean(RabbitTemplate.class);
        for (int i = 0; i < 10; i++) {
            template.convertAndSend(queue.getName(), "foo" + i);
        }
        // pull one message every 2 seconds instead of using a message-driven listener
        String out = (String) template.receiveAndConvert(queue.getName());
        while (out != null) {
            System.out.println(new Date() + " " + out);
            Thread.sleep(2000);
            out = (String) template.receiveAndConvert(queue.getName());
        }
        context.close();
    }

}
Of course you can use something more sophisticated like a task scheduler or a Spring @Async method rather than sleeping.

Inspired by Gary Russel's answer:
you can use something more sophisticated like a task scheduler or a Spring @Async
You can also fetch a fixed number of messages per minute to simulate the same rate limit:
private final RabbitTemplate rabbitTemplate;

@Scheduled(fixedDelay = 60000) // 1 minute
public void read() {
    List<String> messages = new ArrayList<>();
    String message = getMessageFromQueue();
    while (message != null && messages.size() < 30) { // 30 messages in 1 minute = 1 msg / 2 seconds
        messages.add(message);
        message = getMessageFromQueue();
    }
}

public String getMessageFromQueue() {
    return (String) rabbitTemplate.receiveAndConvert(QUEUE_NAME);
}
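For the @Scheduled method above to be picked up, scheduling also has to be enabled on a configuration class; a minimal sketch (the class name is illustrative):

@SpringBootApplication
@EnableScheduling // without this, @Scheduled methods are never triggered
public class RateLimitedConsumerApplication {
    public static void main(String[] args) {
        SpringApplication.run(RateLimitedConsumerApplication.class, args);
    }
}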

Related

RabbitMQ Camel Consumer - Consume a single message

I have a scenario where I want to "pull" messages off a RabbitMQ queue/topic and process them one at a time.
Specifically if there are already messages sitting on the queue when the consumer starts up.
I have tried the following with no success (meaning, each of these options reads the queue until it is either empty or another thread closes the context).
1. Stopping the route immediately after the first message is processed
final CamelContext context = new DefaultCamelContext();
try {
    context.addRoutes(new RouteBuilder() {
        @Override
        public void configure() throws Exception {
            RouteDefinition route = from("rabbitmq:harley?queue=IN&declare=false&autoDelete=false&hostname=localhost&portNumber=5672");
            route.process(new Processor() {
                Thread stopThread;

                @Override
                public void process(final Exchange exchange) throws Exception {
                    String name = exchange.getIn().getHeader(Exchange.FILE_NAME_ONLY, String.class);
                    String body = exchange.getIn().getBody(String.class);
                    // Do some stuff
                    routeComplete[0] = true;
                    if (stopThread == null) {
                        stopThread = new Thread() {
                            @Override
                            public void run() {
                                try {
                                    ((DefaultCamelContext) exchange.getContext()).stopRoute("RabbitRoute");
                                } catch (Exception e) {}
                            }
                        };
                    }
                    stopThread.start();
                }
            });
        }
    });
    context.start();
    while (!routeComplete[0].booleanValue())
        Thread.sleep(100);
    context.stop();
}
2. Similar to 1, but using a latch rather than a while loop and sleep.
3. Using a PollingConsumer
final CamelContext context = new DefaultCamelContext();
context.start();
Endpoint re = context.getEndpoint(srcRoute);
re.start();
try {
    PollingConsumer consumer = re.createPollingConsumer();
    consumer.start();
    Exchange exchange = consumer.receive();
    String bb = exchange.getIn().getBody(String.class);
    consumer.stop();
} catch (Exception e) {
    String mm = e.getMessage();
}
4. Using a ConsumerTemplate - code similar to the above.
I have also tried enabling preFetch and setting the max number of exchanges to 1.
None of these appear to work: if there are 3 messages on the queue, all are read before I am able to stop the route.
If I were to use the standard RabbitMQ Java API I would use a basicGet() call which lets me read a single message, but for other reasons I would prefer to use a Camel consumer.
Has anyone successfully been able to process a single message on a queue that holds multiple messages using a Camel RabbitMQ Consumer?
Thanks.
This is not the primary intention of the component, as it is designed for continuous receiving. But I have created a ticket to look into supporting a basicGet (single receive). There is a new Spring-based RabbitMQ component coming in 3.8 onwards, so it is going to be implemented there (first): https://issues.apache.org/jira/browse/CAMEL-16048
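Until that lands, a minimal sketch of the basicGet fallback mentioned in the question, using the plain RabbitMQ Java client rather than Camel (the host and the queue name "IN" are assumptions taken from the question's endpoint URI):

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import com.rabbitmq.client.GetResponse;
import java.nio.charset.StandardCharsets;

public class SingleMessagePull {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost"); // assumed host, matching the question's endpoint
        Connection connection = factory.newConnection();
        Channel channel = connection.createChannel();
        // basicGet pulls exactly one message, or returns null if the queue is empty
        GetResponse response = channel.basicGet("IN", false);
        if (response != null) {
            String body = new String(response.getBody(), StandardCharsets.UTF_8);
            // ... process the single message, then acknowledge it ...
            channel.basicAck(response.getEnvelope().getDeliveryTag(), false);
        }
        channel.close();
        connection.close();
    }
}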

Can Ignite Streamer.addData be executed on a separate node from the StreamReceiver/Visitor?

Is it possible to do stream injection from a client node and intercept the same stream on the server node to process it before inserting it into the cache?
The reason for doing this is that the client node receives the stream from an external source, and the same needs to be injected into a partitioned cache based on AffinityKey across multiple server nodes. The stream needs to be intercepted on each node and processed with the lowest latency.
I could've used cache events to do this, but StreamVisitor is supposed to be faster.
Following is the sample that I am trying to execute. Start 2 nodes: one containing the streamer, the other containing the StreamReceiver:
public class StreamerNode {
    public static void main(String[] args) {
        ......
        Ignition.setClientMode(false);
        Ignite ignite = Ignition.start(igniteConfiguration);
        CacheConfiguration<SeqKey, String> myCfg = new CacheConfiguration<SeqKey, String>("myCache");
        ......
        IgniteCache<SeqKey, String> myCache = ignite.getOrCreateCache(myCfg);
        IgniteDataStreamer<SeqKey, String> myStreamer = ignite.dataStreamer(myCache.getName()); // Create Ignite Streamer for windowing data
        for (int i = 51; i <= 100; i++) {
            String paddedString = org.apache.commons.lang.StringUtils.leftPad(i + "", 7, "0");
            String word = "TEST_" + paddedString;
            SeqKey seqKey = new SeqKey("TEST", counter++);
            myStreamer.addData(seqKey, word);
        }
    }
}
public class VisitorNode {
    public static void main(String[] args) {
        ......
        Ignition.setClientMode(false);
        Ignite ignite = Ignition.start(igniteConfiguration);
        CacheConfiguration<SeqKey, String> myCfg = new CacheConfiguration<SeqKey, String>("myCache");
        ......
        IgniteCache<SeqKey, String> myCache = ignite.getOrCreateCache(myCfg);
        IgniteDataStreamer<SeqKey, String> myStreamer = ignite.dataStreamer(myCache.getName()); // Create Ignite Streamer for windowing data
        myStreamer.receiver(new StreamVisitor<SeqKey, String>() {
            int i = 1;

            @Override
            public void apply(IgniteCache<SeqKey, String> cache, Map.Entry<SeqKey, String> e) {
                String tradeGetData = e.getValue();
                System.out.println(nodeID + " : visitorNode ..count=" + i++ + " received key=" + e.getKey() + " : val=" + e.getValue());
                // do some processing here before inserting in the cache ..
                cache.put(e.getKey(), tradeGetData);
            }
        });
    }
}
Of course it can be executed on a different node. Usually, addData() is executed on a client node, and the StreamReceiver works on a server node. You don't have to do anything special to make it happen.
As for the rest of your post, can you elaborate with more details and perhaps samples? I could not understand the desired setup.
You can use continuous queries if you don't need to modify data, only act on it.
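For reference, a minimal sketch of that continuous-query approach, reusing the myCache and SeqKey types from the question (this is a fragment that would run on a server node such as VisitorNode; the listener only observes updates, it does not modify them):

ContinuousQuery<SeqKey, String> qry = new ContinuousQuery<>();

// Called whenever matching entries in the cache are created or updated.
qry.setLocalListener(events -> {
    for (CacheEntryEvent<? extends SeqKey, ? extends String> e : events) {
        System.out.println("key=" + e.getKey() + " : val=" + e.getValue());
    }
});

// The cursor must stay open for as long as updates should be received.
try (QueryCursor<Cache.Entry<SeqKey, String>> cursor = myCache.query(qry)) {
    // keep running; entries streamed into myCache are reported to the listener
}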

Use Spring Cloud Spring Service Connector with RabbitMQ and start publisher config function

1. Connect RabbitMQ with Spring Cloud config:
@Bean
public ConnectionFactory rabbitConnectionFactory() {
    Map<String, Object> properties = new HashMap<String, Object>();
    properties.put("publisherConfirms", true);
    RabbitConnectionFactoryConfig rabbitConfig = new RabbitConnectionFactoryConfig(properties);
    return connectionFactory().rabbitConnectionFactory(rabbitConfig);
}
2. Set rabbitTemplate.setMandatory(true) and setConfirmCallback():
@Bean
public RabbitTemplate rabbitTemplate() {
    RabbitTemplate template = new RabbitTemplate(connectionFactory);
    template.setMandatory(true);
    template.setMessageConverter(new Jackson2JsonMessageConverter());
    template.setConfirmCallback((correlationData, ack, cause) -> {
        if (!ack) {
            System.out.println("send message failed: " + cause + correlationData.toString());
        } else {
            System.out.println("Publisher Confirm" + correlationData.toString());
        }
    });
    return template;
}
3. Send a message to the queue to invoke the publisher confirm and print the log.
@Component
public class TestSender {

    @Autowired
    private RabbitTemplate rabbitTemplate;

    @Scheduled(cron = "0/5 * * * * ? ")
    public void send() {
        this.rabbitTemplate.convertAndSend(EXCHANGE, "routingkey", "hello world",
                (Message m) -> {
                    m.getMessageProperties().setHeader("tenant", "aaaaa");
                    return m;
                }, new CorrelationData(UUID.randomUUID().toString()));
        Date date = new Date();
        System.out.println("Sender Msg Successfully - " + date);
    }
}
But the publisher confirm callback has not worked. The log has not been printed. Whether the ack is true or false, the log should not be absent.
Mandatory is not needed for confirms, only returns.
Some things to try:
Turn on DEBUG logging to see if it helps; there are some logs generated regarding confirms.
Add some code:
template.execute(channel -> {
    System.out.println(channel.getClass());
    return null;
});
If you don't see PublisherCallbackChannelImpl then it means the configuration didn't work for some reason. Again DEBUG logging should help with the configuration debugging.
If you still can't figure it out, strip your application to the bare minimum that exhibits the behavior and post the complete application.
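As a point of comparison, with plain Spring AMQP (outside the Cloud connector) confirms are normally enabled directly on the connection factory; a minimal sketch, assuming a Spring AMQP version where setPublisherConfirms(boolean) is available:

@Bean
public ConnectionFactory rabbitConnectionFactory() {
    CachingConnectionFactory factory = new CachingConnectionFactory("localhost"); // host is an assumption
    factory.setPublisherConfirms(true); // channels from this factory support confirm callbacks
    factory.setPublisherReturns(true);  // only needed if mandatory/returned messages are used
    return factory;
}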

ActiveMQ fail-over of producer and consumer with a shared directory doesn't happen

We have two ActiveMQ (version 5.10.0) instances running, and I am using shared storage to achieve HA.
However I am unable to see failover happening for the producer and consumer(s).
ActiveMQ broker-1 runs on IP1 and broker-2 on IP2
In the activemq.xml configuration I have modified the persistence adapter to use a shared directory which is present on IP1.
<persistenceAdapter>
    <kahaDB directory="\\IP1\shared-directory\for activemq\data"/>
</persistenceAdapter>
On both the producer and consumer sides I am using the following JNDI configuration to get connections and build sessions, etc.
jndi.properties
java.naming.factory.initial = ..........ActiveMQInitialContextFactory
java.naming.provider.url = failover:(tcp://IP1:61616,tcp://IP2:61616)?randomize=false
connectionFactoryNames = myConnectionFactory
queue.requestQ = my.RequestQ
The interesting part is:
When I start this broker pair, I see that one of the brokers becomes master.
I start the producer, which puts messages on the Q (say the producer has put 100 messages on the Q). While my producer is still running, I shut down the master broker, so the slave broker acquires the file lock and becomes master. When I open the web console I see that the 100 messages are still there on the Q, but even though the producer is running it no longer puts any messages on this Q.
The same happens for the consumers.
The consumer was picking messages from the Q; say the Q has 100 unconsumed messages when the master fails. The master goes down, the slave becomes master, and I see the 100 messages are still unconsumed, but the consumer does not pick any more messages from the Q.
I waited for them to fail over for a long time (>10 minutes).
Can anyone please suggest what configuration I am missing?
I am copy-pasting the producer and consumer as-is (copied from the ActiveMQ in Action book with minor modifications).
Producer
public class Producer {
    private static String brokerURL = "failover:(tcp://IP1:3389,tcp://IP2:3389)";
    private static transient ConnectionFactory factory;
    private transient Connection connection;
    private transient Session session;
    private transient MessageProducer producer;
    private static int count = 10;
    private static int total;
    private static int id = 1000000;
    private String jobs[] = new String[] { "suspend", "delete" };

    public Producer() throws JMSException {
        factory = new ActiveMQConnectionFactory(brokerURL);
        connection = factory.createConnection();
        connection.start();
        session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        producer = session.createProducer(null);
    }

    public void close() throws JMSException {
        if (connection != null) {
            connection.close();
        }
    }

    public static void main(String[] args) throws JMSException {
        Producer producer = new Producer();
        while (total < 1000) {
            for (int i = 0; i < count; i++) {
                producer.sendMessage();
            }
            total += count;
            System.out.println("Sent '" + count + "' of '" + total
                    + "' job messages");
            try {
                Thread.sleep(1000);
            } catch (InterruptedException x) {
            }
        }
        producer.close();
    }

    public void sendMessage() throws JMSException {
        int idx = 0;
        while (true) {
            idx = (int) Math.round(jobs.length * Math.random());
            if (idx < jobs.length) {
                break;
            }
        }
        String job = jobs[idx];
        Destination destination = session.createQueue("JOBS." + job);
        Message message = session.createObjectMessage(id++);
        System.out.println("Sending: id: "
                + ((ObjectMessage) message).getObject() + " on queue: "
                + destination);
        producer.send(destination, message);
    }
}
Consumer
public class Consumer {
    private static String brokerURL = "failover:(tcp://IP1:3389,tcp://IP2:3389)";
    private static transient ConnectionFactory factory;
    private transient Connection connection;
    private transient Session session;
    private String jobs[] = new String[] { "suspend", "delete" };

    public Consumer() throws JMSException {
        factory = new ActiveMQConnectionFactory(brokerURL);
        connection = factory.createConnection();
        connection.start();
        session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
    }

    public void close() throws JMSException {
        if (connection != null) {
            connection.close();
        }
    }

    public static void main(String[] args) throws JMSException {
        Consumer consumer = new Consumer();
        for (String job : consumer.jobs) {
            Destination destination = consumer.getSession().createQueue(
                    "JOBS." + job);
            MessageConsumer messageConsumer = consumer.getSession()
                    .createConsumer(destination);
            messageConsumer.setMessageListener(new Listener(job));
        }
    }

    public Session getSession() {
        return session;
    }
}
Just one more thing:
I am more interested in consumer failover than producer.
One more observation: the consumer stops and returns to the command prompt abruptly.
Thank you.
-JE

Starting bbcomm in Java v3 Bloomberg API

When I use the Java Bloomberg V3 API it usually works. However, sometimes, especially after a reboot, bbcomm.exe is not running in the background. I can start it manually by running blp.exe, but I wondered if there was a way of doing this via the API?
After talking to the help desk, it turns out that on 64-bit Windows, running under a 64-bit JVM, bbcomm is not automatically started. This does not happen under 32-bit Java - under 32-bit, bbcomm runs automatically.
So my solutions are either to wait for the problem to be fixed by Bloomberg (now that I understand it) or to check for this specific case.
To check the specific case:
if running under 64-bit Windows (system property os.arch)
and if running under a 64-bit JVM (system property java.vm.name)
then try and start a session
If this fails, assume bbcomm.exe is not running. Try to run bbcomm.exe using Runtime.exec()
I haven't tested the above yet. It may have exactly the same issues as Bloomberg has with 64-bit VMs.
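A rough, untested sketch of those checks (the "64" substring tests and the bbcomm path are assumptions; a fuller implementation follows below):

String osArch = System.getProperty("os.arch");       // e.g. "amd64" under a 64-bit JVM
String vmName = System.getProperty("java.vm.name");  // e.g. "Java HotSpot(TM) 64-Bit Server VM"
boolean is64Bit = osArch.contains("64") && vmName.contains("64");

if (is64Bit) {
    // try to start a Bloomberg session here; if that fails, assume bbcomm is not running
    try {
        Runtime.getRuntime().exec("C:\\blp\\API\\bbcomm.exe"); // path is an assumption
    } catch (IOException e) {
        // could not launch bbcomm manually
    }
}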
After spending some time with Help Help, it seems that bbcomm gets started either when you use the Excel API or run the API demo. But it does not get started automatically when called from the Java API. Possible ways to start it are:
adding an entry in the registry to automatically start bbcomm on startup: in HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\Run add a String value called bbcomm with value C:\blp\API\bbcomm.exe - but that opens a command window which remains visible, so not really an option (and if you close that window it terminates the bbcomm process)
create a batch file START /MIN C:\blp\API\bbcomm.exe and replace the entry in the registry with that (not tested) to call bbcomm silently
manually launch bbcomm from your Java code as already suggested. As a reference, I post below the code that I'm using.
private final static Logger logger = LoggerFactory.getLogger(BloombergUtils.class);
private final static String BBCOMM_PROCESS = "bbcomm.exe";
private final static String BBCOMM_FOLDER = "C:/blp/API";

/**
 *
 * @return true if the bbcomm process is running
 */
public static boolean isBloombergProcessRunning() {
    return ShellUtils.isProcessRunning(BBCOMM_PROCESS);
}

/**
 * Starts the bbcomm process, which is required to connect to the Bloomberg data feed
 * @return true if bbcomm was started successfully, false otherwise
 */
public static boolean startBloombergProcessIfNecessary() {
    if (isBloombergProcessRunning()) {
        logger.info(BBCOMM_PROCESS + " is started");
        return true;
    }
    Callable<Boolean> startBloombergProcess = getStartingCallable();
    return getResultWithTimeout(startBloombergProcess, 1, TimeUnit.SECONDS);
}

private static Callable<Boolean> getStartingCallable() {
    return new Callable<Boolean>() {
        @Override
        public Boolean call() throws Exception {
            logger.info("Starting " + BBCOMM_PROCESS + " manually");
            ProcessBuilder pb = new ProcessBuilder(BBCOMM_PROCESS);
            pb.directory(new File(BBCOMM_FOLDER));
            pb.redirectErrorStream(true);
            Process p = pb.start();
            BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.toLowerCase().contains("started")) {
                    logger.info(BBCOMM_PROCESS + " is started");
                    return true;
                }
            }
            return false;
        }
    };
}

private static boolean getResultWithTimeout(Callable<Boolean> startBloombergProcess, int timeout, TimeUnit timeUnit) {
    ExecutorService executor = Executors.newSingleThreadExecutor(new ThreadFactory() {
        @Override
        public Thread newThread(Runnable r) {
            Thread t = new Thread(r, "Bloomberg - bbcomm starter thread");
            t.setDaemon(true);
            return t;
        }
    });
    Future<Boolean> future = executor.submit(startBloombergProcess);
    try {
        return future.get(timeout, timeUnit);
    } catch (InterruptedException ignore) {
        Thread.currentThread().interrupt();
        return false;
    } catch (ExecutionException | TimeoutException e) {
        logger.error("Could not start bbcomm", e);
        return false;
    } finally {
        executor.shutdownNow();
        try {
            if (!executor.awaitTermination(100, TimeUnit.MILLISECONDS)) {
                logger.warn("bbcomm starter thread still running");
            }
        } catch (InterruptedException ex) {
            Thread.currentThread().interrupt();
        }
    }
}
ShellUtils.java
public class ShellUtils {
    private final static Logger logger = LoggerFactory.getLogger(ShellUtils.class);

    /**
     * @return a list of processes currently running
     * @throws RuntimeException if the request sent to the OS to get the list of running processes fails
     */
    public static List<String> getRunningProcesses() {
        List<String> processes = new ArrayList<>();
        try {
            Process p = Runtime.getRuntime().exec(System.getenv("windir") + "\\system32\\" + "tasklist.exe");
            BufferedReader input = new BufferedReader(new InputStreamReader(p.getInputStream()));
            String line;
            while ((line = input.readLine()) != null) {
                if (!line.isEmpty()) {
                    String process = line.split(" ")[0];
                    if (process.contains("exe")) {
                        processes.add(process);
                    }
                }
            }
        } catch (IOException e) {
            throw new RuntimeException("Could not retrieve the list of running processes from the OS");
        }
        return processes;
    }

    /**
     *
     * @param processName the name of the process, for example "explorer.exe"
     * @return true if the process is currently running
     * @throws RuntimeException if the request sent to the OS to get the list of running processes fails
     */
    public static boolean isProcessRunning(String processName) {
        List<String> processes = getRunningProcesses();
        return processes.contains(processName);
    }
}
In case someone needs help checking/starting the bbcomm.exe process from code while hiding the console window, this snippet is written in C#; I hope you can easily translate it to Java.
void Main()
{
    var processes = Process.GetProcessesByName("bbcomm");
    if (processes.Any())
    {
        Console.WriteLine(processes.First().ProcessName + " already running");
        return;
    }

    var exePath = @"C:\blp\DAPI\bbcomm.exe";
    var processStart = new ProcessStartInfo(exePath);
    processStart.UseShellExecute = false;
    processStart.CreateNoWindow = true;
    processStart.RedirectStandardError = true;
    processStart.RedirectStandardOutput = true;
    processStart.RedirectStandardInput = true;

    var process = Process.Start(processStart);
    Console.WriteLine(process.ProcessName + " started");
}
bbcomm.exe is automatically started by the V3 API.