How can I send emails from Azure SQL when a certain condition is met - azure-sql-database

I have an Azure SQL database where I store some user information about scheduled tasks. I need to notify users 30 minutes before an event starts. I cannot do this in the application itself, as it is a Windows Forms app. I looked at solutions that use Logic Apps, but those only send an email when a new record is added or updated.
What I need is to compare the current time with the start time of the schedule and send the email when the difference is less than 30 minutes.
My database is as follows:
Name = John doe
Email = johndoe#gmail.com
Startdate = 09/12/2019 12:45 AM
Enddate = 09/12/2019 1:45 AM
How can I achieve this?

Azure Automation is the best option here; you can include logic within the Automation runbook to compare the current time with the scheduled start time and trigger the email from there.
The downside is that the most frequent interval an Azure Automation schedule can be configured for is one hour. If you need the runbook to execute more often than that, there are two options:
1. Create a webhook for the runbook and use Azure Scheduler to call the webhook. Azure Scheduler provides finer-grained granularity when defining a schedule.
2. Create four schedules, each starting 15 minutes apart and each running once every hour. This allows the runbook to run every 15 minutes across the different schedules.
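Either way, the time check inside the runbook is straightforward. Here is a minimal Python sketch of the window logic (runbooks would typically be PowerShell; the column names come from the question, while the task list and helper name are assumptions):

```python
from datetime import datetime, timedelta

# Hypothetical rows as they might come back from the scheduled-tasks table
# (column names taken from the question; everything else is made up).
tasks = [
    {"Name": "John doe", "Email": "johndoe#gmail.com",
     "Startdate": datetime(2019, 9, 12, 0, 45)},
    {"Name": "Jane doe", "Email": "janedoe#gmail.com",
     "Startdate": datetime(2019, 9, 12, 6, 0)},
]

def tasks_due_within(tasks, now, window=timedelta(minutes=30)):
    """Return tasks whose start time falls within `window` from `now`."""
    return [t for t in tasks if timedelta(0) <= t["Startdate"] - now <= window]

# Pretend the runbook fires at 00:20 -- only the 00:45 task qualifies.
due = tasks_due_within(tasks, datetime(2019, 9, 12, 0, 20))
for t in due:
    # Placeholder for the actual email call (e.g. SendGrid) from the runbook.
    print(f"Notify {t['Name']} at {t['Email']}")
```

The same filter could of course live in the SQL query itself rather than in runbook code.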

My suggestion: first create a table (with an identity column as primary key; let's name it EmailDetails) that contains all the email fields. Then create a Logic App that triggers on an insert into that table. You can then connect the SendGrid module to send the emails, as shown in this article.
Run a stored procedure every hour using Azure Automation (as shown here) that inserts a new record into the EmailDetails table when a scheduled task is about to start.
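As a sketch of what that hourly stored procedure would do, here is the idea in Python over an in-memory SQLite stand-in (every name except EmailDetails is an assumption):

```python
import sqlite3
from datetime import datetime, timedelta

# In-memory stand-in for the Azure SQL database; table and column names are
# assumptions, apart from EmailDetails, which the answer proposes.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE ScheduledTasks (
    Name TEXT, Email TEXT, Startdate TEXT)""")
con.execute("""CREATE TABLE EmailDetails (
    Id INTEGER PRIMARY KEY AUTOINCREMENT,  -- identity primary key
    Name TEXT, Email TEXT, Startdate TEXT)""")

now = datetime(2019, 9, 12, 0, 0)
con.execute("INSERT INTO ScheduledTasks VALUES (?, ?, ?)",
            ("John doe", "johndoe#gmail.com",
             (now + timedelta(minutes=45)).isoformat()))

# What the hourly stored procedure would do: queue every task that starts
# within the next hour by inserting it into EmailDetails.
con.execute("""INSERT INTO EmailDetails (Name, Email, Startdate)
               SELECT Name, Email, Startdate FROM ScheduledTasks
               WHERE Startdate BETWEEN ? AND ?""",
            (now.isoformat(), (now + timedelta(hours=1)).isoformat()))

queued = con.execute("SELECT Id, Email FROM EmailDetails").fetchall()
print(queued)  # the Logic App would then fire on this insert
```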

Related

Run AA 24/7 to process records from a DB (there is a new record every minute)

Looking for best practices here. I need to run AA every day of the week, from 08:00 AM for 12 hours. The bot will look for new records in a SQL DB every minute. If there's a new record, it will process it (open a website, fill a form, etc.). Then it will check again for a new record and repeat the process.
The idea is to schedule a task to start the bot at 8:00 AM. Once the task starts, the bot will query SQL and so on, but I need to keep the bot running, looking for new records.
For now I first open the website where records will be inserted and keep looping (to check for new records in the DB) as long as the website is open, but I am sure there are more elegant ways to do this.
Looking forward to your comments.
First of all, I would like to ask about the SLA for each database refresh. Does the web activity need to be performed in real time, or can it wait (e.g. checking every hour or so and processing all new records in a batch)?
With your approach, I believe there would be continuous DB hits even when there's no new record for a long time, which is not the best of approaches.
An alternative I would suggest is to use some kind of message queue that is fed from the database. You can then write a listener for this queue, and as soon as there's a new record, your bot can process it.
Let me know your thoughts.
Regards,
Atharva
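The queue-based design can be sketched as follows, using Python's standard-library queue as a toy model (in practice the queue would be something like Azure Service Bus or MSMQ; all names here are made up):

```python
import queue
import threading

# Toy model of the suggested design: a monitor pushes new DB records onto a
# queue, and the bot only wakes up when there is actually work to do.
work = queue.Queue()
processed = []

def bot_listener():
    while True:
        record = work.get()   # blocks until a record arrives -- no DB polling
        if record is None:    # sentinel to shut the listener down
            break
        processed.append(f"processed {record}")

t = threading.Thread(target=bot_listener)
t.start()
for rec in ("record-1", "record-2"):   # the DB monitor would enqueue these
    work.put(rec)
work.put(None)
t.join()
print(processed)
```

The key property is that the bot blocks on the queue instead of hammering the database in a loop.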
I think one shortcoming of your approach is that you keep the website open all day long, even when there's no new record for hours. I agree with Atharva's suggestion above: you should only log in to the website when the operation needs to be performed.
Instead of running AA 24/7, you can write a service or something similar that monitors the DB and, when there's a new record, triggers your AA task in some way.

How can I receive a real-time notification from an Oracle table when a value changes?

Part 1: I would like to receive a real-time notification when a value changes from A -> B in an Oracle table in an on-prem vendor database.
Part 2: Upon receiving the event a process flow will kick-off to take appropriate action such as executing WebAPI calls, Emails and SMSs.
What would be a good solution architecture for Part 1?
Since it's a vendor database, writing and executing custom triggers on their tables is not desirable.
Plan B is to write a job to poll the tables on a schedule and pick up the changes.
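Plan B can be sketched as a diff between two polls, keeping the previous snapshot and reporting rows whose status moved A -> B (pure Python; the status values come from the question, while the ids and helper name are made up):

```python
# Poll the table on a schedule and diff against the last snapshot to detect
# A -> B transitions. Row ids and the helper name are hypothetical.
def detect_transitions(previous, current, frm="A", to="B"):
    """Return ids whose status moved frm -> to between two polls."""
    return [rid for rid, status in current.items()
            if status == to and previous.get(rid) == frm]

poll_1 = {101: "A", 102: "A"}
poll_2 = {101: "B", 102: "A"}   # next scheduled poll of the vendor table
changed = detect_transitions(poll_1, poll_2)
print(changed)  # kick off Part 2 (WebAPI calls, emails, SMS) for these ids
```

The trade-off versus a trigger is latency: the notification arrives no faster than the polling interval.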

BigQuery user statistics from MicroStrategy

I am using MicroStrategy to connect to BigQuery using a service account. I want to collect user-level job statistics from MSTR, but since I am using a service account, I need a way to track user-level job statistics in BigQuery for all jobs executed via MicroStrategy.
Since you are using a service account to make the requests from MicroStrategy, you could look up all of your project's jobs by listing them and then, using each job ID in the list, gather the information for that job, as this shows the email used for the job.
A workaround would also be to use Stackdriver Logging advanced filters with a filter that returns the jobs made by the service account. For instance:
resource.type="bigquery_resource"
protoPayload.authenticationInfo.principalEmail="<your service account>"
Keep in mind this only shows jobs from the last 30 days, due to the logs retention period.
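If you query these logs programmatically, the same filter can be assembled as a string; a trivial hedged sketch (the service-account address is a placeholder):

```python
# Build the Stackdriver Logging advanced filter shown above for a given
# service account. The account email passed in below is a placeholder.
def bigquery_jobs_filter(service_account: str) -> str:
    return (
        'resource.type="bigquery_resource"\n'
        f'protoPayload.authenticationInfo.principalEmail="{service_account}"'
    )

print(bigquery_jobs_filter("mstr-sa@my-project.iam.gserviceaccount.com"))
```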
Hope it helps.

VB.NET: Display SQL Server Table Row Count in Real Time?

I've got an app at work that I support which uses a SQL Server 2008 DB (a vendor-created/supported app). One of the things this app does is load records into ETL tables in the DB all day, to be moved to a data warehouse.
Unfortunately, the app is having lots of problems with the ETL tables right now, and the vendor has no monitoring solution. I have no access to the DB to add a stored procedure or anything, but I can run a COUNT(*) on the ETL tables to see if things are getting out of hand.
I have managed to write a VB.NET app that returns the COUNT of rows in these ETL tables so I can keep an eye on things, but it only returns the counts when I fire a button event.
I've never written an app that runs/updates "in real time" before, and I'm looking for some guidance on how I can create an app that would update these COUNT values in as close to real time as possible.
Any guidance would be greatly appreciated!
You could achieve that by writing a console application, since you seem used to .NET.
The console application runs and you can read the values using Console.WriteLine() and Console.ReadLine() in your Program.cs. Or you could write the record counts to a table or send an email.
As for real time: the console application can be scheduled to run, e.g. via a task in Task Scheduler or a SQL Agent job, or it can be run by launching the exe. As a rough example, you could send yourself an email every 10 minutes by creating a task that launches the console app every 10 minutes.
If you're using a Windows Forms app, just add a Timer component that fires off the SQL query. As an added bonus, you could include fields on the form to control how often the timer fires, to get the resolution that's right for you.
You can use the Timer control in Console apps too, of course.
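Language aside, the timer-driven pattern is the same everywhere: re-run the COUNT(*) on a fixed interval instead of on a button click. A minimal Python sketch for illustration (sqlite3 stands in for the SQL Server connection, and the table name is made up):

```python
import sqlite3
import time

# In-memory stand-in for the vendor database; the ETL table name is assumed.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE EtlStaging (Id INTEGER)")

def row_count(con):
    """The query the timer fires on each tick."""
    return con.execute("SELECT COUNT(*) FROM EtlStaging").fetchone()[0]

counts = []
for tick in range(3):                 # three timer ticks for the demo
    con.execute("INSERT INTO EtlStaging VALUES (?)", (tick,))
    counts.append(row_count(con))
    time.sleep(0.01)                  # the interval would be minutes in practice

print(counts)
```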

sharepoint 2010 - How to force expiration policy to run every day for all items on a list

We have on-prem SharePoint 2010 and have set up an expiration policy to run every day that triggers a workflow to create tasks and emails, based on an alert day in a calculated column in the list. The calculated columns hold dates set to 7 days prior to a due-date event. I want the expiration policy to evaluate the entire list each day and create notification emails and tasks for the users of the list.
I have been successful in getting the policy to run the workflow every day, but it starts at a random point within the window specified. My assumptions: the workflow will not run against an item on the list if the “Expiration Date” is less than 24 hours away; I cannot modify that column directly; and tightening the run window does not force all items to be evaluated every day. The solution needs to be done using Designer or SharePoint, not custom code.
Does anyone have a workaround to force all items to be evaluated by the “Expiration Policy” every day?
On another note, how do you force evaluation for items created less than 24 hours ago?
In SP 2010 there are two timer jobs responsible for this task: the "Information management policy" job and the "Expiration policy" job. This blog post may give you some more insight. One more important thing: the Information management policy job should run before the Expiration policy job.