Authenticated network share as ActiveMQ producer

I am trying to set up a pretty basic messaging system with ActiveMQ.
I have a network share that another program puts XML files into.
I have an ActiveMQ route set up that I think should deliver files dropped into the network share to my ActiveMQ queue. The files are not arriving in the queue, and I believe it is because I need to authenticate to the network share.
Can someone help me figure out where in my config files I need to put my network username and password so the share is accessible to ActiveMQ?
Or am I going about this completely wrong? I am using the out-of-the-box config for ActiveMQ and Camel, with the exception of my route:
<route>
  <description>Leslie Odyssey Route</description>
  <from uri="file://servername.domain.gov/MetroFileDrop"/>
  <to uri="activemq:queue:Odyssey.Queue"/>
</route>
In my ActiveMQ command prompt console I see the following:
INFO| Route: route2 started and consuming from: Endpoint[file://servername.domain.gov/MetroFileDrop]
which makes me believe that the route IS there but I never get any files delivered.
Thanks for any info,
Leslie

Create a folder on your local hard drive that points to your network share (for example, a mapped drive or mount) and point your Camel route at that folder. Camel alone doesn't understand authenticated network folders.
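For illustration, a minimal sketch of that approach, assuming the share has been mapped or mounted to a local path (the path here is a hypothetical placeholder):

<route>
  <description>Leslie Odyssey Route (reading from a locally mapped copy of the share)</description>
  <!-- D:/MetroFileDrop is a hypothetical mapped-drive path; authentication to the
       share is handled by the OS when the drive is mapped, not by Camel -->
  <from uri="file:D:/MetroFileDrop"/>
  <to uri="activemq:queue:Odyssey.Queue"/>
</route>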

Related

WSO2 ESB HTTP Endpoint does not get added to the Synapse configuration

I have built an ESB project with an HTTP endpoint, but for some reason the endpoint that I have defined and added to the ESB solution does not seem to be reflected in the project, and it is not picked up when I deploy to the server. The endpoint is simply not being used. I have also checked under the Defined Endpoints tab in the server's Enterprise Integrator console,
Home > Manage > Service Bus > Endpoints
but it isn't there. Numerous restarts have not helped neither has undeploying and redeploying the car file. Can someone point out where I might have gone wrong? As usual, thanks in advance.
It could be a few different things. To check:
Extract the .car file (it's just a .zip, so rename it or use 7zip to extract it) and see if your endpoint is there.
Check that the serverRole in the pom file is correct (it should be EnterpriseIntegrator, I think).
Sometimes renaming artifacts causes problems, as the file does not get renamed correctly or a reference is not updated in one of the project files. Try removing the endpoint and using 'search in files' to remove any lingering references in pom files.
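As a point of reference when inspecting the extracted .car, a defined HTTP endpoint artifact should look roughly like this Synapse XML (the name and URL here are hypothetical):

<endpoint name="SampleHttpEndpoint" xmlns="http://ws.apache.org/ns/synapse">
  <http method="get" uri-template="http://example.com/api/resource"/>
</endpoint>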

Can I use the File connector in Mule Cloudhub for FTPS

Is there a way to configure a File connector for use in cloudhub, specifically related to reading in a file over FTPS and putting it into a file before beginning the actual processing of the contents?
Clarification:
I'm in cloudhub, which does not provide a filesystem in the same sense that a local/on-prem Mule setup has. One standard practice when dealing with streams (FTPS or similar) in order to avoid processing over the open stream is to take the incoming stream and use the File connector (outbound in this case) to put the inbound stream into a file, and then use that file for your flow process. How is this managed in CloudHub?
The File connector reads files from paths on the server where Mule runs; it cannot be used to read from remote servers.
In case you want a file to start your flow with, try something along these lines (a Mule 3 sketch; the host, credentials, and paths are placeholders):
<flow name="ftp_reader_flow">
  <!-- read from the remote directory (use the ftps transport if FTPS is required) -->
  <ftp:inbound-endpoint host="ftp.example.com" port="21" path="/remote/dir" user="user" password="secret"/>
  <!-- write the incoming stream to a local directory -->
  <file:outbound-endpoint path="/tmp/inbound"/>
</flow>
<flow name="actual_processing_flow">
  <!-- read from the local directory -->
  <file:inbound-endpoint path="/tmp/inbound"/>
  <!-- ... continue with the processing ... -->
</flow>
Hope this helps.
You can use the connector for temporary data with the /tmp directory.
From the MuleSoft Documentation:
Disk Persistence
CloudHub does not guarantee that writing to disk survives hardware failures. Instead, you must use an external storage mechanism to store information. For small amounts of data, you can use the Object Store. For applications that have large data storage requirements, we recommend use of a cloud service such as Amazon S3. For temporary storage, the File connector is still available and can be used with the /tmp directory.
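As a minimal sketch of that last point, using the Mule 3 file transport (the output pattern here is illustrative):

<!-- write temporary data under CloudHub's /tmp directory -->
<file:outbound-endpoint path="/tmp" outputPattern="#[message.id].xml"/>

Keep in mind that anything written there is temporary and may not survive a restart or hardware failure.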
You can use the File connector in CloudHub as well, but make sure you are reading or writing the file from the classpath (src/main/resources) or another folder on the project classpath only.

Using Mule how to pull a file from an FTP site in response to an incoming VM event?

When I get a triggering event on an inbound VM queue I want to pull a file from an FTP site.
The problem is the flow needs to be triggered by the incoming VM message not the FTP file's availability.
I cannot figure out how to have what is essentially two inputs. I considered using a content enricher, but it seems to call an outbound endpoint. The Composite Source can have more than one input, but it fires when any one of its message sources triggers it, not on a combination of sources.
I am setting up an early alert resource monitoring of FTP, file systems, databases, clock skew, trading partner availability, etc. Periodically I would like to read a custom configuration file that tells what to check and where to do it and send a specific request to other flows.
Some connectors, like File and FTP, do not lend themselves to being triggered by an outside event. The database connector will let me select on the fly, but there is no analog for File and FTP.
It could be that I am just thinking about it in the wrong light, but I am a little stumped. I tried having the VM event trigger a script that starts a flow with an initial state of "stopped", and that flow pulls from an FTP site, but VM seems not to play well with starting and stopping flows, and it begins to feel like a 'cluttered' solution.
Thank you,
- Don
For this kind of scenario, you should use the Mule Requester module.
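A minimal sketch of what that might look like in Mule 3 XML, assuming the community Mule Requester module is installed (the VM path and FTP URL are placeholders):

<flow name="vm_triggered_ftp_pull">
  <vm:inbound-endpoint path="trigger" exchange-pattern="one-way"/>
  <!-- pull a file from the FTP site on demand, driven by the incoming VM message -->
  <mulerequester:request resource="ftp://user:secret@ftp.example.com/inbound"/>
  <!-- ... continue processing the fetched payload ... -->
</flow>

The point of the module is exactly this case: requesting a resource mid-flow, instead of using that resource as the flow's message source.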

Load RabbitMQ config at startup

How do I load a RabbitMQ config at startup to confirm that broker objects (queues, exchanges, bindings, users, virtual hosts, permissions and parameters) are created?
According to the RabbitMQ documentation, it can be done via load_definitions http://www.rabbitmq.com/management.html#load-definitions
But I can't figure out how to use it. Would someone mind sharing an example of how this works? I can't find any examples online.
There are two bits where the documentation leaves something to be desired; these were stumbling blocks for me.
Generating the definitions file
I found the easiest way to do that is to configure one RabbitMQ server how you like it and then...
Go to the management web interface
Look at the bottom of the Overview tab/page for the "Import / export definitions" heading
Click the "Download broker definitions" button in that section
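If you prefer the command line, my understanding is that the same JSON can be fetched from the management HTTP API (credentials and host are placeholders):

curl -u guest:guest http://localhost:15672/api/definitions -o /etc/rabbitmq/definitions.json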
Configuring RabbitMQ to look for a definitions file at startup
Put the definitions file somewhere on the filesystem that it can be read by the user that your rabbitmq daemon will be running as.
Include a block like this in the configuration file:
{rabbitmq_management, [
    {listener, [...]},
    {load_definitions, "/etc/rabbitmq/definitions.json"}
]},
Upon startup, those definitions should get loaded. Any errors loading them should be apparent in the logs.
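On newer RabbitMQ versions that use the ini-style rabbitmq.conf instead of the Erlang-term config, the management plugin exposes the same setting as a single key (my reading of the current docs; the path is a placeholder):

management.load_definitions = /etc/rabbitmq/definitions.json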

How to delete all of the topics in ActiveMQ?

I'm using ActiveMQ with the C# client library. I created 10,000 topics with random names as part of a test for evaluation purposes and now I can't get rid of the topics. ActiveMQ grinds to a halt with this number of topics so I need them out of the system. Here is what I have tried so far, and none of it has worked. I'm running ActiveMQ as a Windows service.
Delete all of the files and folders in ACTIVEMQ_HOME\Data
Turn off all persistence
Delete all of the files and folders in the persistence folder
Delete the entire ACTIVEMQ_HOME directory and reinstall it in a different folder
I've traced the file activity and cannot find any file that is written to when a topic is created or deleted.
I realize that the .NET client library is a little light on functionality, so I can't even get a list of all the topics programmatically.
Go to your broker configuration file, open it for editing, and on the broker element add the following attribute:
deleteAllMessagesOnStartup="true"
This will cause all previous topics and queues, and their pending messages, to be deleted from your Kaha store when you restart your broker.
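For context, a sketch of where the attribute sits in activemq.xml (the brokerName and everything else inside the element are illustrative):

<broker xmlns="http://activemq.apache.org/schema/core"
        brokerName="localhost"
        deleteAllMessagesOnStartup="true">
    <!-- ... existing broker configuration ... -->
</broker>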
Have Fun!
This question might be old, but a quick and easy way to totally purge all data in ActiveMQ, along with all queues and topics, is to go to the following path:
<ActiveMQ_Installation_Directory>/data
and delete all the files in it.
Now once you restart AMQ, it will start as a fresh, clean install.