Is it possible to export RabbitMQ messages to a local file (like a .txt file)? - rabbitmq

There are 15,302 events that have been sitting for 2-3 years in a queue with no consumer (an error queue) on a Windows production RabbitMQ instance. Instead of writing a consumer to listen for these events, we want to export them to a .txt file and re-import them quickly later.
Is it possible to export RabbitMQ messages to a local file (like a .txt file)?
I could not find anything about this.

You may find this answer useful. It refers to a tool which is available on GitHub.

Check out this tool with a nice GUI: https://www.cogin.com/mq/. It provides an export/import feature.
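If you would rather script it than use a GUI tool, draining a queue into a text file is only a few lines in any AMQP client. A minimal sketch using Python's pika client, assuming a broker URL, queue name, and output path that you would replace with your own:

```python
import json


def format_message(body, properties):
    """Render one message as a single JSON line for the .txt dump."""
    return json.dumps({
        "properties": properties,
        "body": body.decode("utf-8", errors="replace"),
    }, default=str)


def dump_queue_to_file(amqp_url, queue, path):
    """Drain `queue` into a text file, one JSON line per message.

    Each message is acked only after it has been written and flushed,
    so an interrupted run loses nothing.
    """
    import pika  # external dependency: pip install pika

    conn = pika.BlockingConnection(pika.URLParameters(amqp_url))
    channel = conn.channel()
    count = 0
    with open(path, "w", encoding="utf-8") as out:
        while True:
            method, props, body = channel.basic_get(queue)
            if method is None:  # queue drained
                break
            out.write(format_message(body, {"headers": props.headers}) + "\n")
            out.flush()
            channel.basic_ack(method.delivery_tag)
            count += 1
    conn.close()
    return count
```

Note that acking removes the messages from the queue for good, so keep the dump file safe; to replay later, you can read the lines back and publish each body to the destination queue.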

Related

Mule Managed file transfer synchronization

I am looking for a solution that involves transferring files between two SFTP locations. I have a requirement to move a file from a DMZ to an internal app server.
I have designed two flows: one reads the file from the DMZ and moves it to a quarantine zone; the second picks it up from quarantine and moves it to the app server. What do the experts recommend for building an event- or sync-based model to fit this requirement? I am seeing errors like "no such file" during polling when the file is large and still in transfer.
Please advise.
Thanks
If you are looking for a software product, I recommend GoAnywhere; in its workflows you can automate a lot of options, including the one you describe.
You can check it out at goanywhere.com (Managed File Transfer, MFT).
PS: I'm not a vendor; I just did a POC for a problem I had and saw that it can help you.
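The "no such file" errors during polling usually mean the poller picked up a file that was still being uploaded. Independent of which MFT product you choose, a common guard is to treat a file as complete only once its size has stopped changing. A minimal sketch, where the 2-second interval is an arbitrary assumption you would tune:

```python
import os
import time


def is_stable(path, interval=2.0):
    """Return True if the file's size is unchanged over `interval` seconds,
    i.e. the upload into the DMZ has most likely finished."""
    try:
        first = os.path.getsize(path)
    except OSError:  # file vanished between listing and checking
        return False
    time.sleep(interval)
    try:
        return os.path.getsize(path) == first
    except OSError:
        return False
```

Only move a file from the DMZ to quarantine once this returns True. An alternative, if you control the sender, is to upload under a temporary name and rename on completion, since a rename within one filesystem is atomic.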

How to replay nServiceBus message

Is it possible to replay all failed messages through nServiceBus without using ServiceControl/ServicePulse?
I'm using NServiceBus.Host.exe to host our endpoints. Our ServiceControl/ServicePulse database became corrupt. I was able to recreate it, but now I have a few failed messages in our SQL database which are not visible through ServicePulse.
Will this help?
Take a look at the readme.md:
For people who want the functionality that this tool previously provided, please take one of the following actions:
Return to source queue via either ServiceInsight or ServicePulse.
Return to source queue using custom scripting or code. This has the added benefit of enabling possible performance and usability optimizations since, as the business owner, you have more context as to how your error queue should be managed. For example, using this approach it is trivial for you to choose to batch multiple sends inside the same transaction.
Manually return to source queue via any of the MSMQ management tools.
If you still want to use MsmqReturnToSourceQueue.exe, feel free to use the code inside this repository to compile a copy.
You can look at the linked code to build your own script (to match the SQL transport) and strip the error-message wrapper so you can push the stripped message back onto the SQL queue.
Does this help?
If not, please contact support at particular dot net and we will be glad to help :-)
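The "custom scripting" option can be as simple as a transactional INSERT ... SELECT followed by a DELETE, moving rows from the error table back to the source queue table. A Python sketch of the idea, demonstrated against an in-memory SQLite database; the table and column names here are hypothetical, so check the schema your NServiceBus SQL transport version actually creates:

```python
import sqlite3


def return_to_source_queue(conn, error_table, source_table):
    """Copy every row from the error queue table back to the source queue
    table, then clear the error table, committing both steps in one
    transaction so a crash cannot lose or duplicate messages."""
    cur = conn.cursor()
    cur.execute(f"INSERT INTO {source_table} (Id, Headers, Body) "
                f"SELECT Id, Headers, Body FROM {error_table}")
    moved = cur.rowcount
    cur.execute(f"DELETE FROM {error_table}")
    conn.commit()
    return moved


# Demo: an in-memory SQLite database stands in for the real transport
# tables (real names and columns depend on your transport version).
conn = sqlite3.connect(":memory:")
for table in ("error", "orders"):
    conn.execute(f"CREATE TABLE {table} (Id TEXT, Headers TEXT, Body BLOB)")
conn.execute("INSERT INTO error VALUES ('msg-1', '{}', x'00')")
moved = return_to_source_queue(conn, "error", "orders")
```

Doing the copy and delete in one committed transaction is the batching optimization the readme alludes to: many messages move in a single round trip rather than one send per message.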
There is nothing built into the Particular stack that I know of that will take care of this.
When I have run into issues like this before, I usually set up a console application to send some commands into the endpoint, and then set up a custom handler in the endpoint to fix the data inconsistencies. This allows you to test the "fix" in a dev/UAT environment, and then you have an automated solution for fixing the problem in production.

How to separate the latest file from Multiple files in Mule

I have 5000 files in a folder, and new files keep getting loaded into the same folder on a daily basis. I need to get only the latest file each day from among all the files.
Is it possible to achieve this scenario in Mule out of the box?
I tried keeping a File component inside a Poll component (to make use of the watermark), but it is not working.
Is there any way we can achieve this? If not, please suggest the best approach (any relevant links).
Mule Studio 5.3, Runtime 3.7.2.
Thanks in advance
Short answer: there isn't really any quick out-of-the-box solution, but there are other ways. I'm not saying this is the right or only way of solving it, but I've implemented a similar scenario like this:
A normal File inbound endpoint with a database table as a file log. Each time a new file is processed, a component checks whether its name appears in the table. Using a choice router or a filter, I only continue if it isn't already there, and after processing I add the filename to the table.
This is a fairly "heavy" solution, though. A simpler approach would be to use an idempotent filter with an object store, for example a Redis server: https://github.com/mulesoft/redis-connector/blob/master/src/test/resources/redis-objectstore-tests-config.xml
It is actually very simple if your incoming filename contains a timestamp: you can configure the file inbound connector by setting file:filename-regex-filter pattern="myfilename_#[function:timestamp].csv". I hope this helps.
Maybe you can use a Quartz scheduler (specify the time in a cron expression), followed by a Groovy script in which you start the file connector. Keep the file connector in another flow.
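For comparison, outside of Mule the core operation (picking the newest file in a folder) is tiny; a Python sketch, assuming "latest" means the most recent modification time:

```python
import os


def latest_file(folder):
    """Return the path of the most recently modified file in `folder`,
    or None if the folder contains no files."""
    paths = [os.path.join(folder, name) for name in os.listdir(folder)]
    files = [p for p in paths if os.path.isfile(p)]
    return max(files, key=os.path.getmtime) if files else None
```

In Mule 3 the same logic could live in a Groovy script component triggered by a Poll, with the chosen path handed to a file endpoint or a message enricher afterwards.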

Is there an Ektron Scheduler?

I need a process or widget that checks every five minutes whether there are any .xlf files in the localization folder and, if any exist, imports them into Ektron. Is there a way to schedule something like this within Ektron?
I don't think there is any scheduler program inside Ektron.
To schedule a task you could look at using one of the following:
Quartz.Net
Command line programs called by Task Scheduler
Alternatively, you could look at using an Ektron plugin, which will get fired when certain Ektron events occur (e.g. content published). However, in my experience plugins/extensions are poorly supported and documented.
You can try creating a Windows service to perform the scheduling.
Details on the localization API and how localization is performed can be found in root\Workarea\controls\content\localization_uc.ascx.cs in your Ektron site.
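Whichever host you choose (Quartz.Net, Task Scheduler, or a Windows service), the five-minute job itself only needs to list the .xlf files and feed each one to the localization import. A Python sketch of one scheduled tick, where `import_fn` is a placeholder for the actual Ektron localization import call, and deleting each file afterwards is just one simple way to avoid reprocessing it on the next tick:

```python
import glob
import os


def find_xlf_files(folder):
    """Return the .xlf files currently sitting in the localization folder."""
    return sorted(glob.glob(os.path.join(folder, "*.xlf")))


def run_once(folder, import_fn):
    """One scheduled tick: import every .xlf found, then remove it so the
    next tick does not process it again. Returns how many were imported."""
    count = 0
    for path in find_xlf_files(folder):
        import_fn(path)   # placeholder for the Ektron localization import
        os.remove(path)   # or move to an 'imported' subfolder for auditing
        count += 1
    return count
```

The scheduler then only has to invoke `run_once` every five minutes; all the interesting logic stays in one testable function.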
I'd recommend trying something similar to what Ken McAndrew did for an alias scheduler. Details here: Manual Alias Scheduler

Good FTP control?

I need to find a good control for VB.NET to upload big files to an FTP server. It should meet these requirements:
Can upload files up to 10GB
Uses passive mode
Provides feedback during upload, to make sure it's going OK
Can cancel and resume
Up to two concurrent connections
I read about WebClient + NetworkCredential, but it doesn't seem to provide progress info. There are also the FtpWebRequest/FtpWebResponse classes derived from WebRequest/WebResponse, as well as WebClient.UploadFileAsync.
Which free/affordable solution would you recommend?
Thank you.
You should expect the following to be provided by every FTP client class/control, as they are supported by the FTP protocol itself:
Can upload files up to 10GB
Uses passive mode
Can cancel and resume
The following requires you to make good use of the classes/controls yourself, for example by using two client instances at the same time:
Up to two concurrent connections
RemObjects has a free package called Internet Pack, in which you can find a class called FtpClient:
http://www.remobjects.com/ip.aspx
http://wiki.remobjects.com/wiki/FtpClient_Class
It supports all FTP features above, and it provides upload progress too, via events. You can read its documentation or simply write a small program to test it out.
To support multiple connections to the same FTP server, you can create multiple instances of this class.
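For comparison outside .NET: in most libraries, upload progress comes down to sending the file in chunks and firing a callback per chunk. A sketch using Python's standard ftplib, where the host, credentials, and chunk size are placeholder assumptions; the cumulative-progress helper is separated out so it can be reused with any transport:

```python
import ftplib
import os


def make_progress_tracker(total_bytes, on_progress):
    """Return a per-chunk callback that reports cumulative bytes sent."""
    sent = 0

    def track(chunk):
        nonlocal sent
        sent += len(chunk)
        on_progress(sent, total_bytes)

    return track


def upload_with_progress(host, user, password, local_path, remote_name,
                         on_progress, chunk_size=64 * 1024):
    """Upload `local_path` over passive-mode FTP, calling
    on_progress(bytes_sent, total_bytes) after each chunk."""
    track = make_progress_tracker(os.path.getsize(local_path), on_progress)
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.set_pasv(True)  # passive mode, per the requirements above
        with open(local_path, "rb") as f:
            # storbinary sends the file in chunk_size pieces and invokes
            # the callback with each chunk after it is sent.
            ftp.storbinary(f"STOR {remote_name}", f,
                           blocksize=chunk_size, callback=track)
```

Cancel and resume map onto closing the connection and re-issuing the transfer with a REST offset; concurrent connections, as the answer above notes, are just multiple client instances.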