Deploying Java client, RabbitMQ, and Celery to server

I have a Java API on my server, and I want it to create tasks and add them to Celery via RabbitMQ. I followed this tutorial, http://www.rabbitmq.com/tutorials/tutorial-two-python.html, using Java for the client (send.java) and Python to receive (receive.py). In receive.py, where the callback method is invoked, I call a method annotated with @celery.task so that the task is added to Celery.
I'm wondering how all of this is deployed on a server, though; specifically, why is there a receive.py file? Is receive.py a process that must run continually on the server? Is there a way to configure RabbitMQ so that it automatically routes Java client tasks to Celery?
Thanks!

RabbitMQ is just a message queue. Producers put messages and consumers get them on demand. You can only restrict access for specific queues via RabbitMQ's auth options.
As for deployment: yes, a consumer like receive.py needs to run continuously, and that is exactly the job a Celery worker does for you. See the Workers Guide for info on running a worker.
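If it helps to see the Python side concretely, here is a minimal sketch; the module name, task name, and broker URL are illustrative assumptions, not from the question.

# tasks.py -- minimal sketch; module name, task name, and broker URL are assumptions
from celery import Celery

app = Celery('tasks', broker='amqp://guest:guest@localhost//')

@app.task
def process_payload(payload):
    # whatever receive.py's callback currently does would live here
    print('processing', payload)

The long-running process is then the worker, started with something like celery -A tasks worker --loglevel=info; it consumes from the broker and runs the task, which is what replaces receive.py.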

Related

Can an MQTT subscriber take multiple messages of the same topic concurrently?

I am trying to create a subscriber in my Spring Boot application. My objective is that the publisher will send multiple messages to a topic, and I have to get those messages and process them. I noticed that the "handleMessage" of both Paho and Apache ActiveMQ will process one message at a time. Is it possible to make it concurrent?
I have tried the following:
Replaced Paho with ActiveMQ
Provided concurrency in my listener container
Provided prefetch in my subscribe URL
Please let me know if there is any way to make my MQTT subscriber to take multiple messages concurrently.
Thank You
If you supply your own thread pool, you can have the handleMessage method hand the incoming message off to the pool for processing and return immediately, so it can pass the next message to the pool.
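The question is about Java/Spring, but the pattern is language-agnostic; here is a hedged sketch of the same idea with Python's paho-mqtt client, where the topic name, broker address, and pool size are assumptions.

from concurrent.futures import ThreadPoolExecutor

import paho.mqtt.client as mqtt

pool = ThreadPoolExecutor(max_workers=8)   # pool size is an assumption

def process(payload):
    # the slow, blocking work runs here on a pool thread
    print('handling', payload)

def on_connect(client, userdata, flags, rc):
    client.subscribe('my/topic')           # hypothetical topic name

def on_message(client, userdata, msg):
    # hand the payload to the pool and return immediately,
    # so the client's network loop can dispatch the next message
    pool.submit(process, msg.payload)

client = mqtt.Client()                     # paho-mqtt 2.x also wants a CallbackAPIVersion argument
client.on_connect = on_connect
client.on_message = on_message
client.connect('localhost', 1883)
client.loop_forever()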

How to check if the RabbitMQ server is alive using the REST API

I am totally new to the Spring framework. I am trying to create a project that connects to RabbitMQ, and before I publish a message I want to check whether the queues are alive. Is it possible to ping a queue to see if it is alive or not?
RabbitMQ has a management API. You can use it to check the status of queues, exchanges, and bindings.
If you are working in PHP, there is a library which can be used.
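As a hedged sketch, the same checks over the management API from Python with requests; the port, credentials, vhost, and queue name below are assumptions for a default local setup.

import requests

BASE = 'http://localhost:15672/api'   # management plugin's default port
AUTH = ('guest', 'guest')             # default credentials; change for your broker

# Broker-level health check: declares, publishes to and consumes from a test queue.
alive = requests.get(BASE + '/aliveness-test/%2F', auth=AUTH)
print(alive.json())                   # {'status': 'ok'} when the broker is healthy

# Per-queue check on the default vhost: 200 with stats if the queue exists, 404 otherwise.
queue = requests.get(BASE + '/queues/%2F/work', auth=AUTH)   # 'work' is a hypothetical queue name
if queue.status_code == 200:
    print('messages ready:', queue.json().get('messages', 0))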

RabbitMQ dropping messages after the first one

I'm using Celery 3.0.18 with RabbitMQ 3.0.2. I have a task sent to another application using celery.send_task. I can see the send_task call in my logs, I can see the packets leaving the worker instance, and I can see the packets reaching the RabbitMQ instance when I call tcpflow -ce -i any port 5672, but only the first message gets to the queue. They all have the same routing key. I tried recreating the exchange and bindings, and even a new RabbitMQ instance, and nothing seems to work.
This used to work fine for months, until we had to rebuild the RabbitMQ instance from scratch after a crash in our AWS infrastructure. Strangely, I have the exact same setup working in another application, using the same broker and the same exchange, binding, and queue, and it works perfectly there. It also works when I send the messages to the same exchange using the same call from a management script run from the shell on the same instance, but not when they are sent from the Celery task in the worker process.
Any ideas on what the problem might be?
Eventually I figured out what was wrong, though it's not clear whether this is the expected behavior, a Celery bug, or a RabbitMQ bug.
What happens is that besides our application tasks, I have a custom logging handler that sends logs to a central location over RabbitMQ, using celery.send_task. This handler sends messages to an exchange named application.logger, with routing keys like application.logger.info, application.logger.warning, etc., and there are bindings that route some logging levels to specific queues. The exchange, bindings, and queues were created directly in RabbitMQ and are not defined in the Celery routes.
When the worker tried to send a message to this exchange and it didn't exist, Celery logged a 404 NOT_FOUND error. After that, tasks sent to other exchanges over the same connection were no longer delivered. They were sent by the worker instance, we could see the packets arriving, and the RabbitMQ management screen for that connection even showed data arriving from the client in kB/s, but no messages were delivered.
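A 404 channel error closes the AMQP channel it happened on, which matches the later publishes silently going nowhere. One way to guard against it is to declare the exchange, queue, and binding before publishing; a minimal kombu sketch, where the broker URL and the topic exchange type are assumptions:

from kombu import Connection, Exchange, Queue

# Names are taken from the answer above; broker URL and exchange type are assumptions.
logger_exchange = Exchange('application.logger', type='topic', durable=True)
info_queue = Queue('application.logger.info',
                   exchange=logger_exchange,
                   routing_key='application.logger.info')

with Connection('amqp://guest:guest@localhost//') as conn:
    channel = conn.channel()
    # Declaring the bound queue also declares its exchange and the binding,
    # so a later publish cannot hit 404 NOT_FOUND.
    info_queue(channel).declare()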

Checking ActiveMQ queue is empty from JMeter

I am running a performance test using JMeter for our application, and there is some asynchronous processing in the form of events on an ActiveMQ queue. I want to wait for the ActiveMQ queue to be empty before recording the statistics for my test. Is there a good way to do that?
I have explored the JMS Producer/Consumer elements in JMeter 2.10, but they consume messages off the queue, which is not what I want since it modifies the original flow of the application. Is there a way to monitor the draining of the queue without consuming the messages off ActiveMQ?
I am using ActiveMQ 5.8 and JMeter 2.10
I was able to monitor ActiveMQ by using an HTTP Request sampler to poll the ActiveMQ web console and get the state of all the queues in XML format, then using XPath to extract the size of the queue I was interested in. The XPath expression I used was
/queues/queue[@name='${queueName}']/stats/@size
One additional thing I had to do was set up basic HTTP authentication to be able to connect to the ActiveMQ web console.
The MBean solution by Mahesh should also work if JMX is enabled on the server but it is not enabled by default.
I have documented it in detail here
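The same polling idea works outside JMeter too. Below is a hedged Python sketch that reads the web console's XML queue listing and applies the same lookup as the XPath above; the console URL, credentials, and queue name are assumptions for a default broker on port 8161.

import time
from xml.etree import ElementTree

import requests

QUEUES_URL = 'http://localhost:8161/admin/xml/queues.jsp'   # assumed console XML feed
AUTH = ('admin', 'admin')                                   # assumed console credentials
QUEUE_NAME = 'events'                                       # hypothetical queue name

def queue_size(name):
    # The feed has the shape <queues><queue name="..."><stats size="..."/></queue></queues>
    root = ElementTree.fromstring(requests.get(QUEUES_URL, auth=AUTH).text)
    for queue in root.findall('queue'):
        if queue.get('name') == name:
            return int(queue.find('stats').get('size'))
    return None

size = queue_size(QUEUE_NAME)
while size is not None and size > 0:
    time.sleep(5)
    size = queue_size(QUEUE_NAME)
print('Queue drained; start recording statistics')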
You can get the pending messages in that queue using the MBean
"org.apache.activemq:BrokerName=host1,Type=Queue,Destination=dest1"
attribute: "QueueSize"
Check it once every few seconds, and once the value is zero you can start recording the statistics.
You can create a simple Java class that consumes all messages from the queue. JMeter can run it before tests.

Can a celery worker/server accept tasks from a non celery producer?

I want to use a comet server written with Java NIO for sending out live updates. When it receives information I want it to scan the data and send tasks to worker threads via RabbitMQ. Ideally I would like a Celery server to sit on the other end of Rabbit, managing a pool of worker threads that will handle these tasks.
However, from my understanding, Celery works by sitting on both ends of RabbitMQ; it essentially takes over the role of producer and consumer by being embedded in both the producer's and consumer's code. Is there a way to set up Celery as I described above? Thanks
Yes, of course!
You can add custom message consumers to a Celery app.
Please refer to Extensions and Bootsteps in the Celery documentation.
Here is a part of example code in the link above:
from celery import Celery
from celery import bootsteps
from kombu import Consumer, Exchange, Queue

my_queue = Queue('custom', Exchange('custom'), 'routing_key')

app = Celery(broker='amqp://')

class MyConsumerStep(bootsteps.ConsumerStep):

    def get_consumers(self, channel):
        return [Consumer(channel,
                         queues=[my_queue],
                         callbacks=[self.handle_message],
                         accept=['json'])]

    def handle_message(self, body, message):
        print('Received message: {0!r}'.format(body))
        message.ack()

app.steps['consumer'].add(MyConsumerStep)
Test it:
python -m celery -A main worker
See also: Using Celery with existing RabbitMQ messages
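To complete the picture, a hedged sketch of the non-Celery producer side using kombu, publishing a JSON message to the same custom queue the bootstep above consumes; the broker URL and payload are assumptions.

from kombu import Connection, Exchange, Queue

custom_exchange = Exchange('custom')
custom_queue = Queue('custom', custom_exchange, 'routing_key')

with Connection('amqp://guest:guest@localhost//') as conn:
    producer = conn.Producer(serializer='json')
    producer.publish(
        {'hello': 'world'},              # payload is an assumption
        exchange=custom_exchange,
        routing_key='routing_key',
        declare=[custom_queue],          # ensure the queue exists before publishing
    )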
It is not necessary to use Celery to publish messages. You can publish messages to RabbitMQ or to another broker from your own app and use Celery only to consume the tasks.
Celery uses a simple message protocol. You can implement the client side in your application.
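As a hedged illustration, here is a sketch that publishes a task message in Celery's original (version 1) format straight to the default celery queue with pika; the task name tasks.add and the broker host are assumptions.

import json
import uuid

import pika

task = {
    'task': 'tasks.add',        # hypothetical task registered on the worker
    'id': str(uuid.uuid4()),
    'args': [2, 2],
    'kwargs': {},
}

connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()
channel.queue_declare(queue='celery', durable=True)   # Celery's default queue is durable
channel.basic_publish(
    exchange='',
    routing_key='celery',
    body=json.dumps(task),
    properties=pika.BasicProperties(
        content_type='application/json',
        content_encoding='utf-8',
        delivery_mode=2,                              # persistent message
    ),
)
connection.close()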
If you don't want to implement the client side of the protocol you can implement a simple http server which accepts requests and makes appropriate calls. Like this.