Large file upload using a message queue (RabbitMQ)

I am building an application where I need to generate a CSV file which could be very large, so I want to process it through a queue (RabbitMQ in my case).
Here's the requirement:
After the user clicks on download, send a message to the queue.
Process and upload the CSV to S3.
Send a notification to the user that the file upload is done.
I am stuck on how the user should be notified. I just need the logic of the implementation. If this is some generic topic/design pattern, I would be glad to know what it is called. I couldn't find anything relevant from my search queries.
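One common answer to this is a job-status record: the web request creates a job row and enqueues its id, the queue consumer does the work and flips the status, and the client either polls that status or receives a push (websocket/email). A minimal Python sketch of that flow, with an in-memory dict standing in for the database and the RabbitMQ publish/consume calls stubbed out (all names here are hypothetical):

```python
import uuid

# In-memory job store; in a real app this would be a database table
# keyed by job id, and worker_handle would run in a separate process
# consuming from RabbitMQ.
jobs = {}

def request_download(user_id):
    """Called when the user clicks download: record the job, enqueue it."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"user": user_id, "status": "pending", "url": None}
    # channel.basic_publish(...)  # hypothetical: publish job_id to RabbitMQ
    return job_id

def worker_handle(job_id):
    """Queue consumer: generate the CSV, upload to S3, mark the job done."""
    csv_url = "https://example-bucket.s3.amazonaws.com/export.csv"  # placeholder
    jobs[job_id].update(status="done", url=csv_url)
    # At this point you could also push a websocket event or send an email.

def poll_status(job_id):
    """Client polls this endpoint until the status flips to 'done'."""
    return jobs[job_id]

job = request_download(user_id=42)
worker_handle(job)                  # in production this runs in the consumer
print(poll_status(job)["status"])   # → done
```

Whether the client polls or gets a websocket push is a UX choice; the job-status record works the same either way.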

Related

How to create a Mirth channel utilizing the REST API

I want to export channel messages to an FTP server or an external drive. I think we can export messages via the REST API. Could anyone help with this?
If you want to send messages to a REST API, you can use the HTTP Sender destination connector type.
If your REST API Endpoint requires any special headers or authentication, you will need to configure this appropriately (such as by setting variables in the Destination Transformer). Don't forget to put something in the "Content" box at the bottom of the screen - this usually has a value such as ${message.transformedData} or ${message.rawData}.
If you want to send messages to an FTP server, you can use the File Writer destination connector type. Again, make sure you put something such as ${message.transformedData} in the "Template" field.
The POST /channels/{channelId}/messages/_export endpoint exports messages to files on the server filesystem. When the client does an export to the local file system, it essentially writes the results of GET /channels/{channelId}/messages, with one file per message and attachments included. See Source.
Possibly the most effective way to get all your processed messages offsite is to just take a database backup.
The data pruner also has an option to archive messages to disk as they are pruned, and those files could be picked up and sent offsite if desired.
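For reference, a call to the export endpoint mentioned above could be assembled like this. This is a Python sketch only: the host, channel id, and credentials are placeholders, and the exact parameters the endpoint accepts should be checked against your Mirth Connect version.

```python
import base64
import urllib.request

# Placeholder values: adjust the host, channel id, and credentials for
# your Mirth Connect installation.
BASE = "https://mirth.example.com:8443/api"
CHANNEL_ID = "a1b2c3d4-e5f6-47a8-9b0c-d1e2f3a4b5c6"  # hypothetical channel id

# The _export endpoint writes one file per message to the *server's*
# filesystem, mirroring GET /channels/{id}/messages with attachments.
url = f"{BASE}/channels/{CHANNEL_ID}/messages/_export"

req = urllib.request.Request(url, method="POST")
token = base64.b64encode(b"admin:admin").decode()   # default credentials
req.add_header("Authorization", f"Basic {token}")
req.add_header("X-Requested-With", "export-script") # CSRF header Mirth expects

print(req.get_method(), req.full_url)
# urllib.request.urlopen(req)  # uncomment to actually call the server
```

Once the files land on the server filesystem, a separate File Reader channel (or the data pruner's archive option) can pick them up and push them offsite.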

Single notification mail for multiple flow files Nifi

I'm trying to copy data from a database and place it in S3 using NiFi. I'm able to copy the data from the database and place it in S3. Now I'm trying to add error handling for this flow. I added the PutEmail processor for error notification, and gave a wrong bucket name to validate the email. This PutEmail processor is getting triggered for each and every flow file (as there are 100 flow files, the mail is triggered 100 times). I want to trigger this PutEmail (notification) only once whenever there is an error in the flow. Any suggestions on this, please?
Below is the Flow:
Any suggestions on better (generic) error handling would be helpful.
For your use case, MergeContent would allow you to batch several FlowFiles over a given duration to be rolled up into a singular email.
You could also apply transforms to extract only the key parts of the content and/or attributes, so that the source FlowFiles provided to MergeContent produce a summary listing in the message sent.
You can implement a custom ReportingTask that periodically sends reports as needed.
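The batching idea behind MergeContent can be sketched outside NiFi as well: collect per-file error records up to a threshold (or a time window), then emit a single summary notification instead of one email per failure. A Python illustration, with `ErrorBatcher` and its fields as hypothetical names:

```python
from dataclasses import dataclass, field

@dataclass
class ErrorBatcher:
    """Roll many error records up into one notification, like MergeContent."""
    max_batch: int = 100
    errors: list = field(default_factory=list)
    sent: list = field(default_factory=list)   # stands in for outgoing emails

    def add(self, flowfile_id, message):
        self.errors.append((flowfile_id, message))
        if len(self.errors) >= self.max_batch:  # NiFi would also flush on age
            self.flush()

    def flush(self):
        if not self.errors:
            return
        body = "\n".join(f"{fid}: {msg}" for fid, msg in self.errors)
        self.sent.append(f"{len(self.errors)} failures:\n{body}")  # one email
        self.errors.clear()

batcher = ErrorBatcher(max_batch=100)
for i in range(100):
    batcher.add(f"flowfile-{i}", "bucket does not exist")
print(len(batcher.sent))  # → 1
```

In NiFi itself, the equivalent knobs are MergeContent's bin size/count thresholds and max bin age; a time-based flush prevents a partially filled batch from waiting forever.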

Sending huge amounts of email using Amazon SES

I'm going to use Amazon SES for sending emails in the website I'm currently building. Following the sample Java code provided in their API documentation, I developed the functionality and was able to send the emails. But when it comes to handling a huge number of emails in a very short period of time, what is the best mechanism to follow? Do they provide any queue mechanism for emails? I couldn't find this in their API documentation, and their technical support is available only for users who have purchased the account.
Has anyone come across a solution to this problem?
Generally I use a custom SQS solution for a batch mailing process like this.
Sending more than a few emails from a web server isn't ideal, so I usually have the website submit the request for the emails to a back-end process in a single call. I then create an SQS message for each recipient and (in my case) use a Windows service that requests messages from SQS and sends the emails at the pace I want them to go out. If errors are encountered, the message stays in the queue and gets retried automatically.
With an architecture like this, depending on your volumes you can spin up new instances automatically if the SQS queue size gets too large for a single instance to process in a timely manner.
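The pattern above can be sketched with an in-memory queue standing in for SQS; in production you would use boto3's `send_message`/`receive_message`, let the visibility timeout handle retries, and route exhausted messages to a dead-letter queue. All names below are illustrative:

```python
import queue

outbox = queue.Queue()      # stand-in for the SQS queue
sent, failed = [], []

def submit_mailing(recipients, body):
    """Single call from the website: one queue message per recipient."""
    for addr in recipients:
        outbox.put({"to": addr, "body": body, "attempts": 0})

def send_email(msg):
    """Stub for the SES call; raises to simulate a failure."""
    if msg["to"].endswith("@invalid"):
        raise RuntimeError("bounced")
    sent.append(msg["to"])

def drain(max_attempts=3):
    """Consumer loop: failed messages go back on the queue and retry."""
    while not outbox.empty():
        msg = outbox.get()
        try:
            send_email(msg)
        except RuntimeError:
            msg["attempts"] += 1
            if msg["attempts"] < max_attempts:
                outbox.put(msg)          # SQS: message reappears after the
            else:                        # visibility timeout expires
                failed.append(msg["to"]) # SQS: dead-letter queue

submit_mailing(["a@example.com", "b@example.com", "x@invalid"], "hello")
drain()
print(sent, failed)  # → ['a@example.com', 'b@example.com'] ['x@invalid']
```

Because each recipient is an independent queue message, scaling out is just adding more consumers; the queue depth is also the natural metric to trigger auto-scaling, as the answer notes.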

NServiceBus Central Repository

I am currently researching the possibility of using NServiceBus for one of our applications. The current application takes large text files and parses the details into a database. The users perform various operations on the file content, approve the changes, and finally release the updated file. When the file is released, various other services need to do something with that file data (drop file in ftp folder, email customer, bill customer).
From what I have read, services should be autonomous and not share data except via messages. I like that concept; however, in my case I am wondering if it is practical. Some of these files can contain up to a million records.
So my question is, should each service (operations, billing, emailer) all have their own database and table for storing this file data, and move the data via the DataBus? Or should I be more pragmatic and only send the fileID in the message which references a central file table?
Thanks for any guidance you can offer.
There are a couple of things that one should not do with a service bus:
move masses of data
perform queries
large ETL operations
You are certainly able to do all these things but you will probably be left disappointed. The messaging to enable some of these operations is fine, though. Your idea of sending the FileID is definitely the way to go.
As an example: I have previously implemented an e-mail sending service. This service can send attachments but these can be large. So instead of including the attachments in the messages I stored the attachments on a shared folder and sent a SendEMailCommand message that also included the unique attachment ids that need to be sent with the e-mail. The e-mail service would then pick up the attachments from the shared folder. After the service successfully sent the mail an EMailSentEvent message would be published.
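The attachment approach described above is a claim-check style pattern: the large payload goes to shared storage and only its id crosses the bus. A language-agnostic sketch in Python (the folder layout, message shape, and function names are all hypothetical):

```python
import json
import pathlib
import tempfile
import uuid

# Stand-in for the shared folder both services can reach.
shared = pathlib.Path(tempfile.mkdtemp())

def store_attachment(data: bytes) -> str:
    """Sender side: park the large blob on shared storage, return its id."""
    att_id = str(uuid.uuid4())
    (shared / att_id).write_bytes(data)
    return att_id

def send_email_command(to, body, attachment_ids):
    """The message stays small; only attachment ids cross the bus."""
    return json.dumps({"to": to, "body": body, "attachments": attachment_ids})

def handle_send_email(raw):
    """E-mail service: resolve ids back to blobs, send, publish EMailSentEvent."""
    cmd = json.loads(raw)
    return [(shared / a).read_bytes() for a in cmd["attachments"]]

att = store_attachment(b"big invoice pdf bytes")
msg = send_email_command("billing@example.com", "Invoice", [att])
blobs = handle_send_email(msg)
print(len(blobs))  # → 1
```

Applied to the question, the file content stays in one place (file store or central table) and each service's message carries only the FileID, exactly as suggested.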

Write to a file in NServiceBus

Here is what I am trying to accomplish by using NServiceBus.
I have a publisher and a subscriber. The publisher publishes a message from its queue to the subscriber. Then, the subscriber takes the message and writes it to a file. The file will be an input for a third-party GUI application, which fires when the file is created and ready to be accessed. (It has to be a file, since the GUI application does not support MSMQ.)
I think I can write the message to a file in the subscriber's Handle() method, but I am not sure how to achieve this, since the subscriber fires as soon as the message arrives.
Any help would be appreciated. Thanks.
Yes, you can absolutely do this. It's just a matter of calling File.WriteAllText("myfile.txt", "some values here") using the standard File class from the .NET base class library.
In your case you will most likely want to pull different values from the message being received, but the specific format and structure of the file being written will depend heavily on the needs of the input/receiving application.
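One wrinkle worth noting: since the GUI application fires as soon as the file is created, the handler should avoid exposing a half-written file. A common trick is to write to a temporary file and then rename it into place, which is atomic on the same filesystem. A Python sketch of that handler body (the message shape and file format are hypothetical):

```python
import os
import tempfile

def handle(message: dict, out_path: str) -> None:
    """Write the received message to out_path without the watcher ever
    seeing a partially written file: write a temp file, then rename."""
    lines = [f"{k}={v}" for k, v in message.items()]
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(out_path) or ".")
    with os.fdopen(fd, "w") as f:
        f.write("\n".join(lines))
    os.replace(tmp, out_path)   # atomic rename on the same filesystem

out = os.path.join(tempfile.mkdtemp(), "order.txt")
handle({"OrderId": 42, "Status": "Released"}, out)
print(open(out).read())
```

The same idea applies in .NET via File.Move (or File.Replace) from a temp path, so the third-party application's file-created trigger only ever sees complete files.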