My current setup:
I have a Stripe webhook that listens for checkout success events. After a successful payment, my Koa.js server fires off an async task that does further processing, and the webhook handler immediately returns status 200 to Stripe. Stripe then redirects the user to the /payment_success endpoint.
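Roughly, the current flow looks like this (a minimal sketch with placeholder names, using @koa/router and koa-bodyparser; Stripe signature verification is assumed to happen before the handler body):

import Koa from 'koa';
import Router from '@koa/router';
import bodyParser from 'koa-bodyparser';

const app = new Koa();
const router = new Router();

// Placeholder for the longer-running processing mentioned above.
async function runPostPaymentProcessing(session: unknown): Promise<void> { /* ... */ }

router.post('/stripe/webhook', async (ctx) => {
  const event = ctx.request.body as { type: string; data: { object: unknown } };
  if (event.type === 'checkout.session.completed') {
    // Fire and forget, so Stripe gets its 200 immediately.
    void runPostPaymentProcessing(event.data.object).catch(console.error);
  }
  ctx.status = 200;
});

app.use(bodyParser()).use(router.routes());
app.listen(3000);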
Here's what I want to achieve:
Once the async task finishes, I would like to notify the user (who should be waiting on /payment_success) about the status of this task.
Here's how I plan on implementing this:
In the /payment_success handler on the server, I would create a Redis subscriber that listens for a task-completion event using Redis Streams, with the stream key based on the customer ID. Once this event arrives, I would send a Server-Sent Event to the client.
The async task would publish the task-completion event to the Redis stream once it finishes.
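A minimal sketch of that plan, assuming ioredis and a PassThrough stream for the SSE; the stream key format, the customer_id query parameter, and the 60-second timeout are illustrative assumptions rather than a finished design:

import Router from '@koa/router';
import Redis from 'ioredis';
import { PassThrough } from 'stream';

const router = new Router();
const redis = new Redis();

// Async-task side: append a completion entry to a per-customer stream.
export async function publishTaskDone(customerId: string, status: string): Promise<void> {
  await redis.xadd(`task-status:${customerId}`, '*', 'status', status);
}

// /payment_success side: wait for that entry and push it to the client as an SSE.
router.get('/payment_success', async (ctx) => {
  const customerId = String(ctx.query.customer_id); // assumed query parameter
  ctx.set({
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });
  const sse = new PassThrough();
  ctx.status = 200;
  ctx.body = sse; // Koa streams this to the browser

  // Not awaited, so Koa can flush the headers right away.
  void (async () => {
    const sub = new Redis(); // XREAD BLOCK ties up a connection, so use a dedicated one
    try {
      // Wait up to 60s for one new entry, starting from "now" ($).
      const reply = await sub.xread('BLOCK', 60000, 'STREAMS', `task-status:${customerId}`, '$');
      if (reply) {
        const [, entries] = reply[0];  // [streamKey, entries]
        const [, fields] = entries[0]; // [entryId, [field, value, ...]]
        sse.write(`event: task-status\ndata: ${JSON.stringify(fields)}\n\n`);
      }
    } finally {
      sse.end();
      sub.disconnect();
    }
  })();
});

One detail to note about this sketch: reading from $ only sees entries added after the read starts, so if the task finishes before the browser reaches /payment_success the event would be missed; persisting the task state somewhere (or reading the stream from an explicit last ID) would close that race.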
My questions are:
How would you design the system to satisfy the requirements?
Is there anything you would change about my design?
I'm working on a library for interacting with the Discord API. My current setup is:
A gateway service, each instance handling x shards, so that I can spin up as many instances as I like to scale well. These gateways publish the events they receive to a Redis message queue.
A client, which subscribes to the message queue and responds to the events it receives.
However, there are some scenarios (working with message components) where I want a specific client to handle the events related to that message. That client then uses the Node.js EventEmitter to emit an event internally, which is picked up by a 'collector' in my code.
Does anyone have recommendations on how I might stop other clients from picking up such an event from the message queue, so that only this specific client handles it? Is it possible for a subscriber to 'read' an event before it accepts it? Then every client could inspect an event to see whether it matches a list of events it is waiting for.
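For what it's worth, here is a hedged sketch of the "read it and see if it matches" idea, assuming the gateways publish JSON over Redis pub/sub and tag component-related events with the ID of the client that should handle them (the targetClientId field, channel name, and event names are made up for illustration). With pub/sub every client sees every message anyway, so each one can simply drop events claimed by another client and forward the rest into its local EventEmitter for the collectors:

import Redis from 'ioredis';
import { EventEmitter } from 'events';

const CLIENT_ID = process.env.CLIENT_ID ?? 'client-1';
const collector = new EventEmitter();
const sub = new Redis();

sub.subscribe('gateway-events').catch(console.error);
sub.on('message', (_channel, raw) => {
  const event = JSON.parse(raw) as { type: string; targetClientId?: string; payload: unknown };
  // Ignore events that another client has claimed for itself.
  if (event.targetClientId && event.targetClientId !== CLIENT_ID) return;
  collector.emit(event.type, event.payload);
});

// A collector elsewhere in this client's code:
collector.on('INTERACTION_CREATE', (payload) => {
  // handle the message-component interaction here
});

If the queue is a true work queue instead (each message delivered to exactly one consumer), filtering after delivery won't help; in that case a per-client channel or routing key, chosen by the gateway when it tags the event, would be the more usual approach.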
I have a use case where I need my controller action to wait for the reception of a specific RabbitMQ message so I can return the result to the client; this message would come from a separate worker performing a certain task.
My API project and the worker project are separate, and the RabbitMQ bus is the only intermediary between them.
EDIT: This is the current scenario:
The client sends a request to the web API asking for, let's call it, 'DATA'.
The web API publishes Message-A through RabbitMQ.
A separate service project handles the published Message-A, does some work, and publishes a new Message-B containing the result of that work, i.e. 'DATA'.
Here is the problem: my web API controller has to return the result contained in Message-B, so the controller action should wait for that message before returning to the client.
You need to use a TaskCompletionSource<T>.
Subscribe to the reply messages and, if it's the reply you're waiting for, set the result on the TaskCompletionSource.
Then await the TaskCompletionSource's Task in your controller action.
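That answer is in .NET terms; purely to illustrate the shape of the pattern, here is the same correlate-and-await idea sketched in TypeScript, with a stored Promise resolver standing in for TaskCompletionSource<T> (all names and the timeout are hypothetical):

// Pending requests keyed by correlation id; the bus subscription resolves the
// matching entry and the controller action awaits it.
const pending = new Map<string, (reply: unknown) => void>();

export function waitForReply(correlationId: string, timeoutMs = 30000): Promise<unknown> {
  return new Promise((resolve, reject) => {
    pending.set(correlationId, resolve);
    setTimeout(() => {
      if (pending.delete(correlationId)) reject(new Error('timed out waiting for reply'));
    }, timeoutMs);
  });
}

// Call this from the Message-B subscription handler.
export function onReplyMessage(correlationId: string, reply: unknown): void {
  const resolve = pending.get(correlationId);
  if (resolve) {
    pending.delete(correlationId);
    resolve(reply);
  }
}

// Controller action shape: publish Message-A with a correlation id, then
//   const data = await waitForReply(correlationId);
//   return data;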
I was able to use Square's webhook API based on the descriptions here: https://docs.connect.squareup.com/api/connect/v1#webhooks-overview, and the payment webhook was working fine.
Recently, I noticed that after completing a cash payment, my webhook event handler is not receiving any PAYMENT_UPDATED notifications.
I'm able to receive the Test Webhook Notification trigger with my event handler service, and I did register the PAYMENT_UPDATED webhook for my location.
This service was working before; have there been any recent changes to the square-connect API?
There is no guarantee that a webhook notification will successfully go through. If it fails for any reason, Square will not attempt to resend it. You should definitely use alternate methods (such as the ListTransactions endpoint) to fully verify the data.
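For example, a hedged sketch of that fallback: a periodic job that lists the location's transactions and reconciles them against what the webhook handler has already recorded. The endpoint path, response shape, and bookkeeping helpers below are assumptions for illustration; check the Square API reference for the real ones.

// Placeholders standing in for your own bookkeeping, not part of the Square API.
const seen = new Set<string>();
const alreadyProcessed = (id: string): boolean => seen.has(id);
const processPayment = async (txn: { id: string }): Promise<void> => { seen.add(txn.id); /* ... */ };

export async function reconcilePayments(locationId: string, accessToken: string): Promise<void> {
  // Assumed v2-style listing endpoint; verify the exact path in the Square docs.
  const res = await fetch(`https://connect.squareup.com/v2/locations/${locationId}/transactions`, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  const body = (await res.json()) as { transactions?: Array<{ id: string }> };
  for (const txn of body.transactions ?? []) {
    if (!alreadyProcessed(txn.id)) await processPayment(txn);
  }
}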
I would like to get a notification about the status of an IronWorker task after it's finished.
I tried to set up an incoming webhook, but could not find any way to achieve this.
Update
I know how to set up an incoming webhook in Slack. I am looking for a way to have IronWorker trigger this webhook after it completes. I just don't want to embed the request code in my worker code.
Any help would be appreciated.
IronWorker lets you configure a UDP log feed. These feeds are typically pointed at Papertrail, but if you have an ELK stack you can point it there instead. Most log-aggregation services have a detect-and-notify feature built in, so Logentries, Papertrail, or ELK could watch for a log statement from your worker such as DONE and notify you by email/Slack/text, etc.
If your worker has reached the end of its business logic safely, then it is probably safe to assume it can also send a REST request to Slack on its own saying "I'm done!", and that such an action wouldn't be an extra burden or cause any additional failures. Try it and see, then share!
Alternatively, you could queue a notification task in a "notification worker" queue as the last step in your workers, if you want to reduce the chance of failures or retries caused by the notification code itself.
The current API doesn't appear to offer a way to register for and receive notifications about worker status from iron.io itself; it seems to be polling-based only: http://dev.iron.io/worker/reference/api/
So you want to set up an incoming webhook in Slack and trigger it when the task is complete.
After registering the incoming webhook in Slack, you will get a webhook URL of the form https://hooks.slack.com/services/SECRET/SECRET
Now we have to make a POST request to this URL along with the data.
import requests
import json

# Your incoming-webhook URL from Slack
url = 'https://hooks.slack.com/services/SECRET/'
# Message payload: text, bot display name, icon, and target channel
payload = {'text': 'Random test', 'username': 'A slack bot',
           'icon_url': 'https://slack.com/img/icons/app-57.png', 'channel': '#abhinav_rai'}

r = requests.post(url, data=json.dumps(payload))
print(r.text)
print(r.status_code)
The Python code above makes the request to the webhook URL and posts your data to the desired channel.
For more information, visit https://api.slack.com/incoming-webhooks or comment below.
I have created Extended Events sessions in SQL Server 2012. Everything is working fine.
Now I would like it so that if a certain event occurs (for example, a deadlock), it sends an email to a given address.
Is this possible with Extended Events?
There is a very interesting article about this; basically you need to:
Enable Service Broker on the database.
Create a Service Broker queue to receive the event notification messages.
Create a Service Broker service to deliver the event notification messages.
Create a Service Broker route to route the event notification messages to the Service Broker queue.
Create an event notification on the deadlock event to create messages and send them to the Service Broker service.
Through Service Broker, a stored procedure can be written that responds to deadlock events. Event notifications allow deadlock graphs to be transformed, stored, and sent wherever they need to go.
Store the deadlock graph in a table.
Retrieve the cached plans associated with the deadlock in another table.
Email the deadlock graph to the DBA team.
You can find the article with the examples at this link:
http://sqlmag.com/site-files/sqlmag.com/files/archive/sqlmag.com/content/content/142603/wpd-sql-extevtandnotif-us-sw-01112012_1.pdf
Reference pages: 9-13.