How can aiogram listen for post-deletion events in a channel? - aiogram

I need to run some logic if a post in a channel is deleted. Can I do this with the aiogram dispatcher? Is there another way?
I could not find it in the documentation.
I use a channel post handler:
@dp.channel_post_handler()
async def delete_post(message: types.Message):
    print(message)
It prints every incoming post in the channel, but if I delete a post I see nothing.


Ability to cancel a process until it is finished

I would like to ask how to model the following situation in BPMN:
Users can submit a request, which they can cancel at any time until the request is resolved. Once the request is resolved, it can no longer be cancelled. So if the user cancels the request before it is processed, the process ends without further processing. In other words, as long as there is no result for the request, it can be cancelled.
For example, until a research paper is published, it can be discarded by its author.
I have modelled another example BPMN process that outlines the problem.
Thanks a lot!
This can be modelled using an interrupting boundary message intermediate event. The message intermediate event catches the cancel message from the user and triggers the termination of the process.

How to integrate Slack with IronWorker tasks to get their status

I would like to get a notification about the status of an IronWorker task after it has finished.
I tried to set up an incoming webhook, but could not find any way to achieve this.
Update
I know how to set up an incoming webhook in Slack. I am looking for a way to have IronWorker trigger this webhook after the task has completed. I just don't want to embed the request code in my worker code.
Any help would be appreciated.
IronWorker lets you configure a UDP log feed. Logs are typically sent to Papertrail over this feed; if you have an ELK stack, try pointing it there instead. Most log-aggregation frameworks have a detect-and-notify feature built in, so Logentries, Papertrail, or ELK could watch for a log statement from your worker such as DONE and notify you by email/Slack/text, etc.
If your worker has reached the end of its business logic safely, then it is probably safe to assume it can also send a REST request to Slack on its own saying "I'm done!", and that such an action wouldn't be an extra burden or cause additional failures ... try & see ... then share!
(a) You could queue a notification task in a "notification worker" queue as the last step in your workers ... if you want to reduce the chance of failures or retries caused by the notification code itself.
The current API doesn't expose a way to register for and receive notifications about worker status from iron.io itself ... it appears to be polling-based only: http://dev.iron.io/worker/reference/api/
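Since the API is polling-based, a small status-polling loop is the usual workaround. Here is a minimal Python sketch with the HTTP call injected as a function, so the exact endpoint stays out of the loop; the terminal status names below are assumptions, not taken verbatim from the IronWorker docs:

```python
import time

def poll_task_status(fetch_status, terminal=("complete", "error", "cancelled"),
                     interval=2.0, timeout=60.0):
    """Call fetch_status() every `interval` seconds until it returns a
    terminal state or `timeout` elapses; return the final status.

    fetch_status() would typically GET the task resource from the
    IronWorker API linked above and return its status field.
    """
    deadline = time.monotonic() + timeout
    while True:
        status = fetch_status()
        if status in terminal:
            return status
        if time.monotonic() >= deadline:
            raise TimeoutError("task did not reach a terminal state in time")
        time.sleep(interval)
```

Once this returns "complete", the worker-side code can fire the Slack webhook request described in the next answer.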
So you want to set up an incoming webhook in Slack and trigger it when the task is complete.
After registering the incoming webhook in Slack, you will get a webhook URL of the form https://hooks.slack.com/services/SECRET/SECRET
Now we have to make a POST request to this URL along with the data.
import requests
import json

url = 'https://hooks.slack.com/services/SECRET/'
payload = {
    'text': 'Random test',
    'username': 'A slack bot',
    'icon_url': 'https://slack.com/img/icons/app-57.png',
    'channel': '#abhinav_rai',
}
# Slack incoming webhooks expect the payload as JSON in the request body
r = requests.post(url, data=json.dumps(payload))
print(r.text)
print(r.status_code)
The above is the Python code to make a request to the webhook URL. It will post your data to the desired channel.
For more information, visit https://api.slack.com/incoming-webhooks or comment below.

ResponseQueue: Can you send a queue as part of the message/body in RabbitMQ?

In MSMQ there is a nice feature called a response queue: as part of the message, one can send a (private/invisible) queue in which the response is awaited - very similar to callbacks in the async world. Technically this feature is just a wrapper around private queues and queue monikers.
Is there anything similar in RabbitMQ?
Actually I figured it out:
A private queue is created this way:
result = channel.queue_declare(queue='', exclusive=True)
private_queue = result.method.queue  # the server-generated queue name
and the response queue is passed via the reply_to property of the publish call (rather than being part of the message body). Note that reply_to must be the queue name, not the frame returned by queue_declare:
channel.basic_publish(exchange='',
                      routing_key='rpc_queue',
                      properties=pika.BasicProperties(
                          reply_to=private_queue,
                      ),
                      body=request)
The real difference - actually hinted at by the way the API is formalized - is that you should not create a reply queue for every message, as this is inefficient. The suggested way is to have one private queue accept all responses and to incorporate a correlation id.
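The single-reply-queue pattern with a correlation id can be sketched without a broker. The class and names below are hypothetical; in real pika code the id would travel out in BasicProperties(correlation_id=...) on the publish and come back on the reply message's properties:

```python
import uuid

class RpcClient:
    """One private reply queue serves all requests; responses are matched
    to requests by correlation id instead of by queue identity."""

    def __init__(self):
        self.pending = {}  # correlation_id -> response (None until it arrives)

    def send_request(self, body):
        corr_id = str(uuid.uuid4())  # fresh id per request
        self.pending[corr_id] = None
        # real code: channel.basic_publish(..., properties=pika.BasicProperties(
        #     reply_to=reply_queue, correlation_id=corr_id), body=body)
        return corr_id

    def on_reply(self, corr_id, body):
        # Callback registered once on the single exclusive reply queue;
        # replies whose id we did not issue are ignored.
        if corr_id in self.pending:
            self.pending[corr_id] = body

client = RpcClient()
rid = client.send_request("compute fib(30)")
client.on_reply(rid, "832040")
```

Because every request carries its own id, many requests can be in flight over the same reply queue at once, which is exactly why the per-message private queue becomes unnecessary.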

How do I handle push messages from IronMQ when my endpoint is an IronWorker?

The documentation for IronMQ push queues describes how endpoints should handle/respond to push messages. However, I get the impression this is for normal webhooks and I can't find any documentation or examples of what to do when the endpoint for a push queue is an IronWorker.
Does the IronWorker framework take care of responding to the IronMQ service when it starts a new IronWorker task for the message pushed onto the queue, or does my IronWorker code need to handle the response? If I need to handle it in my code, are there any variables automatically provided to me that represent the webhook request and/or response?
As I mentioned above, I've looked for example code but all I've found are IronWorker webhook examples that receive POSTs from something like GitHub, not from IronMQ. If there are examples out there for what I'm trying to do please point me to it!
There's actually a special subscriber format just for IronWorker, as specified in the Push Queue documentation here: http://dev.iron.io/mq/reference/push_queues/#subscribers. E.g.:
ironworker:///my_worker
That will kick off a worker task whenever something hits your queue. Or you can use the worker's webhook URL. And you don't need to deal with the response: as #thousandsofthem said, IronWorker will return a 200, which acknowledges the pushed message.
The IronWorker API will respond immediately to the POST request with an "HTTP 200 OK" status and queue a task after that; it is too late to respond with anything from the running task.
You can find the exact webhook value on the "Code" page (https://hud.iron.io).
Screenshot: http://i.imgur.com/aza7g0h.png
Just use it "as is".

How can we send data from an action handler to a popup presenter without using the RPC mechanism?

In my project I have the following requirement:
We have to send a request to an ATM.
Before sending the response, the ATM will send some notifications.
As per the requirement, we have to listen for these notifications and somehow send the content of each notification's value tag to the presenter.
For example:
We send request Rq1 to the ATM.
It then sends a notification N1, which is an XML document whose value tag contains, for example, "some text"; this we need to send to the presenter. If we receive
N1 again, we need to send its value tag to the presenter as well, and all of these should reach the presenter immediately.
Once we receive the response from the ATM, we can send the response data using RPC.
The main point is: if we receive a notification at 10:00 AM, we should send its value tag to the presenter then; if the next notification arrives at 10:01, that value also needs to be pushed to the presenter.
So kindly advise us. If my question is not clear, kindly let me know.
To my knowledge, an RPC call will only return once, and a server cannot initiate communication with a browser the way a browser can with a server.
You might want to implement an asynchronous polling system. Essentially, you call your main RPC service - call it bigMethod() - and immediately afterwards begin asynchronously calling a polling method, say poll().
Every second or two (or however long you think appropriate), it goes out to the ATM to check whether there is a message. Once bigMethod() finishes, you stop polling.
The downside is that this solution will require some tweaking of the back-end code to handle it.
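The shape of this answer - a long-running bigMethod() with a poll() loop draining notifications until it finishes - can be sketched in Python (a GWT client would use a Timer and async callbacks instead; all names here are illustrative):

```python
import queue
import threading
import time

notifications = queue.Queue()   # server-side buffer of ATM notification values
done = threading.Event()

def big_method():
    """Stands in for the long-running request to the ATM."""
    notifications.put("some text")   # value tag of notification N1
    time.sleep(0.05)
    notifications.put("more text")   # value tag of the next notification
    done.set()                       # the real response has arrived

def poll(received):
    """The presenter's polling loop: drain any new notifications each tick,
    and stop once big_method() has finished and the buffer is empty."""
    while not done.is_set() or not notifications.empty():
        try:
            received.append(notifications.get(timeout=0.02))
        except queue.Empty:
            pass

received = []
worker = threading.Thread(target=big_method)
worker.start()
poll(received)
worker.join()
```

The key design point is that poll() forwards each notification to the presenter as soon as it appears, without waiting for the final RPC result.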