I am wondering where to start with setting up a weekly personalised digest to go out to my users (over 200k).
It would pull in content specific to them. We currently use SES for notifications, on a Windows EC2 instance with SQL Server.
Is there a cron-style scheduler for Windows/IIS?
Probably the easiest way to do this is to develop a console application to send your emails, and then use the Windows Task Scheduler to schedule it to run once a week.
Within your console application you'll basically get your users from your database, loop through each user gathering whatever personalised data you need to build up an email message, and then pass the message off to Amazon SES.
To use Amazon SES at this volume you'll need to request a sending quota increase, because the default quotas are well below what you need: the default sending quota is 10,000 emails per 24-hour period, with a maximum send rate of 5 emails per second.
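As a very rough sketch of that console app (the Users table, the query, BuildDigestHtml and the addresses are all made up for illustration; the real shape depends on your schema), using the AWS SDK for .NET:

```csharp
// Minimal digest sender: pull users, build a personalised body, hand off to SES.
// Table/column names, the query and BuildDigestHtml are assumptions.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Threading;
using Amazon;
using Amazon.SimpleEmail;
using Amazon.SimpleEmail.Model;

class WeeklyDigest
{
    static void Main()
    {
        using (var ses = new AmazonSimpleEmailServiceClient(RegionEndpoint.USEast1))
        using (var conn = new SqlConnection("...connection string..."))
        {
            conn.Open();
            var cmd = new SqlCommand("SELECT Email, Name FROM Users", conn);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    string email = reader.GetString(0);
                    string body = BuildDigestHtml(reader.GetString(1)); // your personalisation logic

                    ses.SendEmail(new SendEmailRequest
                    {
                        Source = "digest@example.com",
                        Destination = new Destination { ToAddresses = new List<string> { email } },
                        Message = new Message
                        {
                            Subject = new Content("Your weekly digest"),
                            Body = new Body { Html = new Content(body) }
                        }
                    });

                    Thread.Sleep(250); // crude throttle to stay under the SES send rate
                }
            }
        }
    }

    static string BuildDigestHtml(string name)
    {
        return "<p>Hi " + name + ", here's what happened this week...</p>";
    }
}
```

Compile that and point a weekly Task Scheduler task at the .exe; the Sleep is a blunt way of respecting the per-second send rate, so replace it with proper rate limiting if you need throughput.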
To implement this functionality you'll need these components:
1. An application that collates all the bits of information and creates an email message. The same app will probably also hand the message off to the email server (SES). If every message is unique to its user, then generic mass-mailing tools won't get you far; hence the question: how can I send an email message programmatically?
- Write your own script, in C# or any other language. It needs to connect to the db and extract either the pieces of the email body, or a whole message collated by a SQL query / stored procedure (see the sketch after this list); it will also extract the email addresses and send the email.
- Use SSIS to build the process of gathering the needed bits and sending the emails. It offers a graphical interface for the process map. It may not be as fast as a custom script, but scheduling is very simple with SQL Server Agent, and you can branch into different processes depending on run-time calculations.
- Use other software to create and send emails (except Mail Merge in MS Word, joking).
2. A scheduling tool that will run the app from point 1 on a regular basis.
- Use the Windows Task Scheduler.
- Use SQL Server Agent. You can run SQL scripts and stored procedures, and scripts and SPs can contain file-system commands (call .exe files, read data from files, etc.), but you'll need to do some research on syntax, functionality and the necessary permissions.
- There are other scheduling apps available.
3. Content control. It may be done by the app, but you'll create some tables or use files for settings, common parts of the email message, and so on. You'll also want to keep a record of the various rules used to create the custom messages.
Generic advice: for a first attempt, go with software you are familiar with. The solution may be cumbersome, but the longest way is taking shortcuts.
There are many mass-mailing and mail-merge applications around. You can find those easily, compare functionality, and maybe choose one of those.
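Since point 1 allows the whole message to be collated in SQL, here is the sketch referenced above: a hypothetical stored procedure dbo.GetDigestEmails returns ready-to-send rows (Email, Subject, Body) and the script just relays each one. SES also exposes an SMTP interface, so plain System.Net.Mail works:

```csharp
// Relay pre-collated messages: SQL builds the body, the script only sends.
// SP name, result columns and credentials are placeholders.
using System.Data;
using System.Data.SqlClient;
using System.Net;
using System.Net.Mail;

class DigestRelay
{
    static void Main()
    {
        using (var conn = new SqlConnection("...connection string..."))
        using (var cmd = new SqlCommand("dbo.GetDigestEmails", conn) { CommandType = CommandType.StoredProcedure })
        using (var smtp = new SmtpClient("email-smtp.us-east-1.amazonaws.com", 587)
        {
            EnableSsl = true,
            Credentials = new NetworkCredential("SES_SMTP_USER", "SES_SMTP_PASS")
        })
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    var mail = new MailMessage("digest@example.com", reader.GetString(0))
                    {
                        Subject = reader.GetString(1),
                        Body = reader.GetString(2),
                        IsBodyHtml = true
                    };
                    smtp.Send(mail); // one recipient per row
                }
            }
        }
    }
}
```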
Disclaimer: I'm the author of the library mentioned below.
You might be interested in the library I'm writing. It's called nvelopy. I created it to be easy to embed in a .NET application to send transactional emails and campaigns. That way, it can directly access your data (users) to create a segment and send it via Amazon while respecting the sending quota.
I developed it for my own web service. I didn't want to set up another server or export/import users, and I wanted it to "talk" to my datastore (RavenDb). A SQL datastore connector is also in the works.
Let me know if you have any questions (via the contact page of the nvelopy web site).
A little background on what I need to accomplish. I am a developer of a cloud-based SaaS application. In one instance, my clients use my application to log the receipt of goods as they come across a conveyor line. Directly before the PC where they are logged into my app, there is another Windows PC that is collecting the moisture and weight of each item from instruments. I (personally, not my app) have full access to this PC and its database, and I know how I am going to grab the latest record from the db via a stored procedure / SQLCMD.
On the receiving end, I have an API endpoint that needs to receive the ID, Weight, Moisture, and ClientID. This all needs to happen in less than ~2 seconds, since the operators are waiting to add this record to my software's database.
What is the most performant way for me to stand up a process that triggers retrieving the record from the db and then calls the API? I also want to update the record, flagging success on a 200 response. My thought was to script all of this in a batch file, use cURL to make the API call, and call the batch file from a scheduled task in Windows. But I feel like there may be a better way with fewer moving parts.
P.S. I am not looking for code solutions per se, just direction or tools that will help. Also, I am using the AWS stack to host my application.
The most performant way is to use AWS Amplify. It's a ready-made AWS framework and development environment that can connect your existing DB to a REST API easily.
You can check their documentation on how to build it:
https://docs.amplify.aws/lib/restapi/getting-started/q/platform/js
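Alternatively, if you want to stay close to your original idea but with fewer moving parts than batch + cURL, a single scheduled console app can do the read, the POST and the success flag in one process. A rough C# sketch (the stored procedure name, columns, endpoint and payload shape are all assumptions):

```csharp
// Grab the newest instrument reading and POST it to the API; flag it on a 200.
// SP name, columns, URL and payload are placeholders.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Net.Http;
using System.Text;

class PushLatestReading
{
    static void Main()
    {
        string id, weight, moisture;
        using (var conn = new SqlConnection("...instrument PC connection string..."))
        using (var cmd = new SqlCommand("dbo.GetLatestReading", conn) { CommandType = CommandType.StoredProcedure })
        {
            conn.Open();
            using (var r = cmd.ExecuteReader())
            {
                if (!r.Read()) return; // nothing new to send
                id = r["Id"].ToString();
                weight = r["Weight"].ToString();
                moisture = r["Moisture"].ToString();
            }
        }

        // Crude hand-built JSON; use a serializer in real code.
        var json = "{\"id\":\"" + id + "\",\"weight\":" + weight +
                   ",\"moisture\":" + moisture + ",\"clientId\":\"ACME\"}";

        using (var http = new HttpClient())
        {
            var resp = http.PostAsync("https://api.example.com/readings",
                new StringContent(json, Encoding.UTF8, "application/json")).Result;
            if (resp.IsSuccessStatusCode)
                MarkAsSent(id); // e.g. UPDATE Readings SET Sent = 1 WHERE Id = @id
        }
    }

    static void MarkAsSent(string id) { /* flag the local row after a 200 */ }
}
```

Note that Task Scheduler only fires at minute granularity, so if you truly need the sub-2-second end-to-end latency, a small always-running service (or a database trigger feeding a queue) is a better fit than any scheduled polling.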
I have an application written in C# ASP.NET MVC4 and running on a Windows Azure Website. I would like to write a service / job to perform the following:
1. Read the user information from the website database
2. Build a per-user site activity summary
3. Generate an HTML email message that includes the summary for each user account
4. Periodically send such emails to each user
I am new to Windows Azure Cloud Services and would like to know the best approach / solution to achieve the above.
Based on my study so far, an independent Worker Role within Cloud Services, along with SendGrid and Postal, looks like the best fit. Please suggest.
You're on the right track, but remember that a Worker Role (or Web Role) is basically a blueprint for a Windows Server VM, and you run one or more instances of that role definition. That VM, just like Windows Server running locally, can perform a bunch of tasks simultaneously. So there's no need to create a separate worker role just for sending hourly emails. Think about it: for nearly an hour at a time it'll sit idle, and you'll be paying for it (for however many instances of the role you launch, and you cannot drop it to zero; you'll always need a minimum of one instance).
If, however, you create a thread on an existing worker or web role, which simply sleeps for an hour and then does the email updates, you basically get this ability at no extra cost (and you should hopefully cause minimal impact to the other tasks running on that web/worker role's instances).
One thing you'll need to do, whether you use a separate role or reuse an existing one: be prepared for multiple instances. That is, if you have two role instances, they'll both be running the code that checks every hour, so you'll need a scheme to prevent both instances doing the same task. This can be solved in several ways. For example: use a queue message that stays invisible for an hour and then appears; your code checks for a queue message maybe every minute, and the first instance to get it does the hourly work. Or maybe run Quartz.NET.
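A minimal sketch of that invisible-queue-message idea, using the classic WindowsAzure.Storage SDK (the queue name and the seeding of the first message are assumptions):

```csharp
// "Invisible queue message as an hourly timer": whichever instance dequeues the
// message first does the work, then re-arms the timer for the next hour.
// Assumes one message was seeded once (e.g. manually or at deployment).
using System;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

class HourlyDigestLoop
{
    public static void Run(string connectionString)
    {
        var queue = CloudStorageAccount.Parse(connectionString)
            .CreateCloudQueueClient()
            .GetQueueReference("hourly-digest"); // hypothetical queue name
        queue.CreateIfNotExists();

        while (true)
        {
            // Only one instance will get the message once it becomes visible.
            CloudQueueMessage msg = queue.GetMessage();
            if (msg != null)
            {
                SendEmailUpdates();        // your existing email logic
                queue.DeleteMessage(msg);  // done; remove the old "tick"

                // Re-arm: the new message stays invisible for an hour.
                queue.AddMessage(new CloudQueueMessage("tick"), null, TimeSpan.FromHours(1));
            }
            Thread.Sleep(TimeSpan.FromMinutes(1)); // poll once a minute
        }
    }

    static void SendEmailUpdates() { /* build and send the summaries */ }
}
```

One caveat on ordering: if the instance dies between DeleteMessage and AddMessage, the timer is lost; re-arming before deleting (and tolerating an occasional duplicate run) is the safer order.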
I didn't know Postal, but SendGrid plus Postal seems like the right combination to use.
I have a system that requires a large number of names and email addresses (two fields only) to be imported via CSV upload.
I can deal with the upload easily enough; how would I verify the email addresses before I process the import?
Also, how could I process this quickly, or as a background process, without requiring the user to watch a script churning away?
Using Classic ASP / SQL Server 2008.
Please, no jibes at the Classic ASP.
Do you need to do this upload via the ASP application? If not, whichever scripting language you feel most comfortable with and can do the job in with the least coding time is the best tool for the job. If you need users to be able to upload into the Classic ASP app, with a reliable process that inserts the valid records into the database and rejects the invalid ones, your options change.
Do you need to provide feedback to the users, like telling them exactly which rows were invalid?
If that second scenario is what you're dealing with, I would have the ASP app simply store the file, and have another process (a .NET service, a scheduled task, or something similar) do the importing and report on its progress in a text file which the ASP app can check. That brings you back to doing it in whatever scripting language you are comfortable with, and you don't have to deal with the HTTP request timing out.
If you google "regex valid email" you can find a variety of regular expressions out there for identifying invalid email addresses.
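If you go the .NET-background-process route, here is a rough sketch of the importer (file paths, table name and the deliberately simple regex are all placeholders; no regex fully validates an email address):

```csharp
// Validate each CSV row, bulk-load the good ones, and write progress/rejects
// to a text file the ASP page can poll. Paths and names are placeholders.
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Text.RegularExpressions;

class CsvImporter
{
    // Deliberately simple pattern: something@something.something
    static readonly Regex Email = new Regex(@"^[^@\s]+@[^@\s]+\.[^@\s]+$");

    static void Main()
    {
        var table = new DataTable();
        table.Columns.Add("Name", typeof(string));
        table.Columns.Add("Email", typeof(string));

        using (var report = new StreamWriter(@"C:\imports\progress.txt"))
        {
            int lineNo = 0;
            foreach (var line in File.ReadLines(@"C:\imports\upload.csv"))
            {
                lineNo++;
                var parts = line.Split(',');
                if (parts.Length == 2 && Email.IsMatch(parts[1].Trim()))
                    table.Rows.Add(parts[0].Trim(), parts[1].Trim());
                else
                    report.WriteLine("Rejected line {0}: {1}", lineNo, line);
            }

            using (var conn = new SqlConnection("...connection string..."))
            using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.Subscribers" })
            {
                conn.Open();
                bulk.WriteToServer(table); // column order must match the table
            }
            report.WriteLine("Done: {0} rows imported.", table.Rows.Count);
        }
    }
}
```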
In a former life, I used to do this sort of thing by dragging the file into a working table using DTS and then working it over with batches of SQL commands. Today, you'd use Integration Services.
This lets you get the data into SQL Server very quickly and prevents the script timing out; then you can use whatever method you prefer (e.g. AJAX-driven batches, redirection-driven batches, etc.) to work over discrete chunks of the data, or schedule it to run as a single batch (a SQL Server job) and just report on the results.
You might be lucky enough to get your 500K rows processed in a single batch by your upload script, but I wouldn't chance it.
Hi all brilliant minds,
I am currently working on a fairly complex problem and I would love to get some brainstorming going. I have a C# .NET web application running in Windows Azure, using SQL Azure as the primary datastore.
Every time a new user creates an account, all they need to provide is a name, email and password. Upon account creation, we store the core membership data in the SQL database, and all the secondary operations (e.g. sending emails, establishing social relationships, creating profile assets, etc.) get pushed onto an Azure Queue and are picked up / processed later.
Now I have a couple of CSV files that contain hundreds of new users (names & emails) that need to be created on the system. I am thinking of automating this by breaking it into two parts:
Part 1: Write a service that ingests the CSV files, parses out the names & emails, and saves this data in storage A
- This service should be flexible enough to take files with different formats
- This service does not actually create the user accounts, so it is decoupled from the business logic layer of our application
- The choice of storage does not have to be SQL; it could also be a non-relational datastore (e.g. Azure Tables)
- This service could be a third-party solution outside of our application platform, so it is open to all suggestions
Part 2: Write a process that periodically goes through storage A and creates the user accounts from there
- This is in the "business logic layer" of our application
- Whenever an account is successfully created, mark that specific record in storage A as processed
- This needs to be retry-able in case of failures in user account creation
I'm wondering if anyone has experience with importing bulk "users" from files, and if what I am suggesting sounds like a decent solution.
Note that Part 1 could be a third-party solution outside of our application platform, so there's no restriction on what language/platform it has to run in. We are thinking about using either BULK INSERT or Microsoft SQL Server Integration Services 2008 (SSIS) to ingest and load the data from CSV into the SQL datastore. If anyone has worked with these and can provide some pointers, that would be greatly appreciated too. Thanks so much in advance!
If I understand this correctly, you already have a process that picks up messages from a queue and does its core logic to create the user assets/etc. So, sounds like you should only automate the parsing of the CSV files and dumping the contents into queue messages? That sounds like a trivial task.
You can kick off the CSV processing via a queue message as well (to a different queue). The message would contain the location of the CSV file, and the Worker Role running in Azure would pick it up (it could even be the same worker role as the one that processes new users, if the usual load is not high).
Since you're utilizing queues, the process is retryable.
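A minimal sketch of that parse-and-enqueue step (the queue name and the message format are assumptions; your existing worker defines the real contract):

```csharp
// Parse the CSV and drop one queue message per user for the existing
// account-creation worker to pick up.
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

class CsvToQueue
{
    public static void Enqueue(string connectionString, string csvPath)
    {
        var queue = CloudStorageAccount.Parse(connectionString)
            .CreateCloudQueueClient()
            .GetQueueReference("new-users"); // hypothetical queue name
        queue.CreateIfNotExists();

        foreach (var line in File.ReadLines(csvPath))
        {
            var parts = line.Split(',');
            if (parts.Length < 2) continue; // skip malformed rows

            // "name;email" is a made-up format; match whatever your
            // account-creation worker already expects.
            queue.AddMessage(new CloudQueueMessage(parts[0].Trim() + ";" + parts[1].Trim()));
        }
    }
}
```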
HTH
I've been thinking about this problem for a while now and I'm still not sure of the best approach.
Basically I've got several hundred email addresses stored in a database, and every week I would like to automatically send these addresses a bulletin of information. I've accomplished this with a stored procedure and a scheduled job on the DB (Oracle), but I'm sure this could be achieved better with some VB.NET solution.
I've read about people writing a Windows Service, or creating a console application and using the Windows scheduler. I'm swaying towards the Windows Service approach, but I'm not sure how to tell the service to send emails at a specific time every week. Any ideas, or is there a better approach?
Also, what would be best: sending individual emails to the addresses on the mailing list, or sending one email with every address added as a BCC?
Thanks
This type of thing is exactly what scheduled tasks are designed for. Imagine creating a service that sits there taking up memory doing absolutely nothing for 7 days, only to run for 4 minutes, then wait another 7 days. While it may work, it's certainly not what a service is for.
Use a scheduled task. That scheduled task could easily just start a console app which reads the database, sends the emails, and then quits as normal. Nothing wrong with using a scheduled task for something that... performs a task on a scheduled basis.
As for what is "best", there is no "best", just what works for your situation. Do you want to send out 150 emails, so that each person sees their own name on it, or send out one email with 150 BCCs, where no one sees their name on it? Whatever works the way you want it to is "best".
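To make that trade-off concrete, here is a sketch of both options in C# with System.Net.Mail (the SMTP host and addresses are placeholders; the VB.NET version is a mechanical translation):

```csharp
// Option A: one personalised message per recipient.
// Option B: one message, everyone on BCC.
using System.Net.Mail;

class BulletinSender
{
    static void SendIndividually(string[] addresses, string html)
    {
        using (var smtp = new SmtpClient("smtp.example.com"))
        {
            foreach (var to in addresses)
            {
                var mail = new MailMessage("news@example.com", to)
                {
                    Subject = "Weekly bulletin",
                    Body = html,
                    IsBodyHtml = true
                };
                smtp.Send(mail); // each recipient sees only their own address
            }
        }
    }

    static void SendOneWithBcc(string[] addresses, string html)
    {
        using (var smtp = new SmtpClient("smtp.example.com"))
        {
            var mail = new MailMessage
            {
                From = new MailAddress("news@example.com"),
                Subject = "Weekly bulletin",
                Body = html,
                IsBodyHtml = true
            };
            mail.To.Add("news@example.com"); // something has to go in To
            foreach (var to in addresses)
                mail.Bcc.Add(to);
            smtp.Send(mail); // one send; recipients can't see each other
        }
    }
}
```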
Have you considered using the "Scheduled Tasks" feature in the Control Panel? That's what I use for recurring program runs similar to what you've described.