How to publish byte array to the queue by using RabbitMQ management plugin? - rabbitmq

As shown below, RabbitMQ management GUI can publish a message to the specific queue directly.
The consumer of this queue consumes messages in protobuf format, which means I should publish a byte array to the queue instead of a string. I have tried converting the protobuf bytes into a base64 string, but that failed. Is it possible to tell the RabbitMQ Management GUI to convert the base64 string into bytes, or is there another way to publish a byte array directly?

The RabbitMQ Management GUI talks to the RabbitMQ server over HTTP (a text-based transfer protocol), so it cannot send binary data directly.
The RabbitMQ community provides a command-line tool, rabbitmq-perf-test, which wraps the RabbitMQ Java client; with it you can publish binary messages with your own Content-Type.
Note that the server pays no attention to the Content-Type header; it just passes it through. So make sure your client supports your Content-Type.

You can use any HTTP client, such as curl or Postman, to call the management HTTP API directly.
Just send an HTTP POST request like this (with your management credentials):
curl -u user:password 'https://rabbitmq.host/api/exchanges/%2F/amq.default/publish' \
  -H 'Content-Type: application/json' \
  --data-raw '{"payload_encoding":"base64","vhost":"/","properties":{"delivery_mode":2,"headers":{}},"routing_key":"YOUR QUEUE NAME HERE","payload":"THE BASE 64 PAYLOAD HERE"}'
With "payload_encoding":"base64", the server decodes the payload before publishing, so the consumer receives the original bytes.
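If you prefer scripting this, the same HTTP API call can be sketched in Python using only the standard library. The host, queue name, and payload bytes below are placeholders standing in for your setup and your serialized protobuf message:

```python
import base64
import json
import urllib.request

# Placeholders: adjust host, vhost, and queue name for your setup.
MGMT_URL = "http://localhost:15672/api/exchanges/%2F/amq.default/publish"
QUEUE_NAME = "my.protobuf.queue"  # hypothetical queue name

raw_bytes = b"\x08\x96\x01"  # stand-in for serialized protobuf bytes
payload_b64 = base64.b64encode(raw_bytes).decode("ascii")

body = {
    "properties": {"delivery_mode": 2},
    "routing_key": QUEUE_NAME,
    "payload": payload_b64,
    # Tells the server to base64-decode the payload before publishing,
    # so the consumer receives the original bytes, not the base64 text.
    "payload_encoding": "base64",
}

request = urllib.request.Request(
    MGMT_URL,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Add HTTP basic-auth credentials, then urllib.request.urlopen(request) sends it.
```

The round trip through base64 is lossless, so whatever bytes your protobuf serializer produced come out unchanged on the consumer side.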


Why did we only receive the response half of the time (round-robin) with "Spring Cloud DataFlow for HTTP request/response" approach deployed in PCF?

This issue is related to 2 earlier questions:
How to implement HTTP request/reply when the response comes from a rabbitMQ reply queue using Spring Integration DSL?
How do I find the connection information of a RabbitMQ server that is bound to a SCDF stream deployed on Tanzu (Pivotal/PCF) environment?
As you can see from the update to question 2 above, we can receive the correct response back from the rabbit sink. However, it only works half of the time, alternating in a round-robin way (success, timeout, success, timeout, ...). The outside HTTP app was implemented with Spring Integration as shown in question 1: it sends the request to the request rabbit source queue and receives the response from the response rabbit sink queue. This only happens in the PCF environment, after we deployed the outside HTTP app and created the stream (see the following POC stream) there. Locally it works every time (not alternately). Did we miss anything? Not sure what the culprit is in PCF. Thanks.
rabbitSource: rabbit --queues=rabbitSource | my-processor | rabbitSink: rabbit --routing-key=pocStream.rabbitSink.pocStream
It sounds like you have several instances of your stream in that PCF environment. That way there is more than one subscriber (round-robin feels like two) to the same RabbitMQ queue, where only one consumer must exist for that queue: only the initiator of the request waits for the reply, so odd (or even) replies go to a different consumer of the same queue. I don't place this as an answer, just because it is only my best guess at what is going on, since you don't see the problem locally.
Please investigate your PCF environment and how it scales instances of your stream. There might also be an SCDF option that does the scaling for you.

how to stream with mule sftp connector

We have a SOAP service that accepts a small file, so we receive that in memory, and we then need to SFTP it to another server. What would the configuration be for that? How do we take our String (an XML file) and send it to the server? (I assume the SFTP connector is the best way to go here, but how do we configure it? It looks like it takes one file as a parameter, and I need it to be fed bytes to send instead, with a filename that we specify.)
thanks,
Dean

Getting result of a long running task with RabbitMQ

I have a scenario where a client sends an HTTP request to download a file. The file needs to be dynamically generated and typically takes 5-15 seconds. Therefore I am looking into a solution that splits this operation into 3 HTTP requests.
First request triggers the generation of the file.
The client polls the server every 5 seconds to check if file is ready to download
When the response to the poll request is positive, the client starts downloading the file
To implement this I am looking into Message Queue solutions like RabbitMQ. They seem to provide a reliable framework to run long running tasks asynchronously. However, after reading the tutorials on RabbitMQ, I am not sure how I will receive the result of the operation.
Here is what I have in mind:
A front-end server receives requests from clients and posts messages to RabbitMQ as required. This front-end server will have 3 endpoints:
/generate
/poll
/download
When client invokes /generate with a GET parameter say request_uid=AAA, the front end server will post a message to RabbitMQ with the request_uid in the payload. Any free worker will subsequently receive this message and start generating the file corresponding to AAA.
The client will keep polling /poll with request_uid=AAA to check if the task is complete.
When task is complete client will call /download with request_uid=AAA expecting to download the file.
The question is: how will the /poll and /download handlers of the front-end server come to know the status of the file-generation job? How can RabbitMQ communicate the result of the task back to the producer? Or do I have to implement such a mechanism outside RabbitMQ? (Consumer putting its results in a file /var/completed/AAA)
The easiest way to get started with AMQP is to use a topic exchange and to create queues which carry control messages. For instance, you could have a file.ready queue and send messages with the file pathname when it is ready to pick up, and a file.error queue to report when you were unable to create a file for some reason. The client could then use a file.generate queue to send the GET information to the server.
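That layout can be sketched as follows. This is a minimal sketch assuming a pika-style channel object; the exchange name file_events is made up for illustration:

```python
# Control-message queues for the file-generation workflow described above.
# Assumes a pika-style channel; "file_events" is an illustrative exchange name.
CONTROL_KEYS = ("file.generate", "file.ready", "file.error")

def setup_control_queues(channel, exchange="file_events"):
    """Declare a topic exchange and one control queue per routing key."""
    channel.exchange_declare(exchange=exchange, exchange_type="topic")
    for key in CONTROL_KEYS:
        channel.queue_declare(queue=key, durable=True)
        channel.queue_bind(queue=key, exchange=exchange, routing_key=key)

def report_file_ready(channel, pathname, exchange="file_events"):
    """Tell listeners on file.ready where to pick up the finished file."""
    channel.basic_publish(exchange=exchange, routing_key="file.ready",
                          body=pathname.encode("utf-8"))
```

The /poll handler would then consume from file.ready (and file.error) to learn each job's outcome, rather than querying the worker directly.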
You hit the nail on the head with your last line:
(Consumer putting its results in a file /var/completed/AAA)
Your server has to coordinate multiple jobs and the results of their work. Therefore you will need some form of "master repository" which contains an authoritative record of what has been finished already. Copying completed files into a special directory is a reasonable and simple way of doing exactly that.
It doesn't necessarily need RabbitMQ or any messaging solution, either. Your server can farm out jobs to those workers any way it wishes: by spawning processes, using a thread pool, or indeed by producing AMQP events which end up in a broker and get sucked down by "worker" queue consumers. It's up to your application and what is most appropriate for it.
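With that completed-files directory as the master repository, the /poll handler reduces to a directory check. A minimal sketch of the idea (the directory path is the one from the question; function names are made up):

```python
import os

def task_status(completed_dir, request_uid):
    """Report a job's status by checking the completed-files directory,
    e.g. task_status("/var/completed", "AAA")."""
    path = os.path.join(completed_dir, request_uid)
    return "ready" if os.path.exists(path) else "pending"

def mark_complete(completed_dir, request_uid, data):
    """A worker marks a job done simply by writing the finished file."""
    with open(os.path.join(completed_dir, request_uid), "wb") as f:
        f.write(data)
```

The /download handler serves the same file once task_status reports "ready"; no extra coordination channel is needed.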

How to implement paired connection between server client using RabbitMQ?

Well, I intend to implement a paired connection between a server and a client. So far I have sent chunks of data and code to the slave system using RabbitMQ, and the slave system executes it. But I am unable to send the result back to the server, since RabbitMQ has the classic publisher/subscriber model. Is there any way to work around this and ensure that the server also fetches results from the slave systems? I am using the Python bindings for RabbitMQ.
You can easily simulate RPC semantics with RabbitMQ (or indeed, any messaging system). All you need is a form of correlation identifier so that the response message can be tracked and interpreted as the "answer" for the original request.
Luckily, the RabbitMQ online documentation has an entire page with examples on how to do this using Python.
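The core of that pattern is small enough to sketch without the client library: the requester tags each message with a fresh correlation ID plus a reply_to queue, and only accepts a reply whose correlation ID matches an outstanding request. The function names here are illustrative, not pika's API:

```python
import uuid

def build_request(body, reply_queue):
    """Create an outgoing RPC request with correlation metadata."""
    props = {
        "correlation_id": str(uuid.uuid4()),  # unique per request
        "reply_to": reply_queue,              # where the worker should answer
    }
    return props, body

def accept_reply(pending_ids, reply_props, reply_body):
    """Match a reply to an outstanding request; ignore stale or foreign replies."""
    corr_id = reply_props.get("correlation_id")
    if corr_id in pending_ids:
        pending_ids.remove(corr_id)
        return reply_body
    return None
```

The worker simply copies correlation_id from the request into its reply and publishes to the reply_to queue, which is exactly what the RabbitMQ Python RPC tutorial demonstrates with pika.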

How can I securely transfer files

I need to automatically transfer an XML file from server A over the internet to server B. FTP works fine, but should I use a message queue instead?
It should be secure in the sense that I won't lose messages, and I should be able to log what is transferred.
You could use a message queue as well, but not to transfer the files themselves: just to keep a queue of the files to be transferred. Then you can write a service that uses SFTP, HTTPS, SSH, or some other secure method to transfer the files. There are plenty of options. A common scenario is:
- Write the file to a given folder and post a message to the message queue.
- The service polls the message queue, which will have a message with the filename to be transferred. If there is a file, use the secure method chosen (see the links below) and do the transfer.
Alternatively, you could skip the message queue entirely and use a secure client on server A to connect to server B and do the transfer. Here are some links that can help:
How do I upload a file to an SFTP server in C# / .NET?
http://social.msdn.microsoft.com/Forums/en-US/csharpgeneral/thread/bee2ae55-5558-4c5d-9b5c-fe3c17e3a190
http://social.msdn.microsoft.com/Forums/en-US/netfxnetcom/thread/f5d22700-552f-4214-81f5-fa43bfcc723d
Hope that helps
Use sftp whenever possible.
Use a POST over HTTPS; an implementation is available on every imaginable platform.
Of course, you need to check certificate validity, but this is also a part of the protocol itself; your part is to keep the certificates correct and secure.
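As a sketch of the HTTPS POST option, using only Python's standard library (the URL is a placeholder; certificate verification against the system trust store is urllib's default behavior when the request is actually sent):

```python
import urllib.request

def build_upload_request(url, xml_bytes):
    """Prepare an HTTPS POST carrying the XML file as the request body."""
    return urllib.request.Request(
        url,
        data=xml_bytes,
        headers={"Content-Type": "application/xml"},
        method="POST",
    )

# Hypothetical endpoint on server B; pass the XML file's bytes as the body.
req = build_upload_request("https://server-b.example.com/upload", b"<doc/>")
# urllib.request.urlopen(req) performs the transfer and, by default, verifies
# the server certificate; log the response status to record each transfer.
```

Logging the response status per file gives you the transfer audit trail the question asks for.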