Why is it not possible to select the click or open checkboxes when adding a Cloudwatch destination to an Amazon SES configuration set? - amazon-cloudwatch

I am following the Amazon docs to add a Cloudwatch destination to a configuration set in our Amazon SES account. It is possible to select the Bounce, Delivery and Complaint checkboxes in this modal, but the other checkboxes are disabled. Why is this? Must I take care of some setup in my Cloudwatch dashboard first?
If I try to set up an SNS destination instead of a Cloudwatch destination, I can indeed select the Click and Open checkboxes.
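For reference, the equivalent SNS destination can also be created through the API. A minimal boto3 sketch (the configuration set name and SNS topic ARN below are placeholders) that requests the Open and Click event types on an SNS destination:

    import boto3

    ses = boto3.client("ses", region_name="us-east-1")

    # Hypothetical names -- replace with your own configuration set and SNS topic.
    ses.create_configuration_set_event_destination(
        ConfigurationSetName="my-config-set",
        EventDestination={
            "Name": "all-events-to-sns",
            "Enabled": True,
            # open and click are accepted alongside the other event types here
            "MatchingEventTypes": ["send", "delivery", "bounce", "complaint", "open", "click"],
            "SNSDestination": {"TopicARN": "arn:aws:sns:us-east-1:123456789012:ses-events"},
        },
    )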

Related

Container Field under Storage Account Settings is not displayed

While creating a Stream Analytics job for IoT Edge, one needs to associate the job with a container in a storage account. Under the Configure section of the Stream Analytics job, in Storage account settings, the Container field is not displayed when "Add storage account" is selected.
Is there a new workflow to add a storage container for a stream analytics job?
I tried adding a storage container in the storage account of the same resource group. It didn't help.
Documentation Link: https://learn.microsoft.com/en-us/azure/iot-edge/tutorial-deploy-stream-analytics?view=iotedge-2018-06#create-an-azure-stream-analytics-job
The issue is at step 3, where the "Container" field is not displayed on the Stream Analytics job page in the Azure Portal.
Configure IoT Edge settings - Source Documentation
1. Under Configure, select Storage account settings, then select Add storage account.
2. Select the Storage account that you created at the beginning of this tutorial from the drop-down menu.
3. For the Container field, select Create new and provide a name for the storage container.
4. Select Save.
(Image from documentation.)
The workflow does seem to have changed. Instead of specifying the container in the Portal, you only get to supply a storage account. Once you've done that, when you publish, a container will be created by ASA with your Edge Job inside of it. Future publishes will use the same container.
If you want, you can use the button at the bottom of the docs to give feedback to Microsoft so they can change the documentation.

My Google Sheets API Console has duplicated pane

In my Google Sheets API Console, there are duplicated "Read/Write requests".
First "Read requests" has real data and has options "Read requests per day" and "Read requests per minute".
Second "Read requests" has no data and has options "Read requests per day" and "Read requests per 100 seconds".
Why does second pane exist?Is it just me?
Best regards.
Appearance: (screenshots of the first and second panes)
All newly created projects in GCP have the same Google Sheets API quotas; this is the default set by Google. The 100-second quota and the 60-second quota are different entities.
If you want to request changes/increase to these default quotas, please follow the steps here:
Increase the quota for an API
Depending on your resource usage, you may want to increase the quota for an API.
1. Go to the Google Cloud Platform Console and sign in as a G Suite super administrator.
2. From the Project list, select the project you're using for the migration.
3. On the left, click IAM & admin and then Quotas.
4. Using the checkboxes, select one or more quotas to edit, then click Edit Quotas.
   Note: Billing must be enabled on the project in order to click the checkboxes.
5. In the Edit Quotas panel on the right, complete the form with your Name, Email, and Phone details, then click Next.
6. In the Edit Quotas panel, select the service to expand the view, then edit the quotas in that service to your requested limit. Click Done.
7. Click Submit request.
For more information, please check this support page: Monitor API Quotas

Amazon SES persistent audit log

I am using Amazon SES to send transactional application emails. I want an audit log of every email sent by the system.
As an example, I might want to see a log of every email we sent to john.smith@example.com.
I followed the instructions for using CloudWatch to log SES events. However, this only records metrics, not log data. So all I see in CloudWatch, under the Metrics tab, is a graph of how many emails were sent at different times; there is no log I can search to find individual SES events.
I also looked into using CloudTrail to log SES events, but CloudTrail only logs management events. It does not log data events such as emails being sent.
I have set up SNS notifications on all SES events (such as send or bounce). This is really useful, but does not achieve my aim of having a long-term audit log.
As far as I can tell, Amazon does not support the kind of logging I want. Maybe I could write events to our application database as we produce them, but it seems a shame to have to introduce my own custom logging system.
Does anyone know a way to have AWS store my SES data events?
You're right that Amazon doesn't offer the kind of easy monitoring you're asking for. A simpler approach is to add a configuration set header and a unique message tag whenever an email is sent to john.smith@example.com. In the configuration set you can enable a CloudWatch or preferred SNS destination and create a delivery dashboard using the link below:
https://docs.aws.amazon.com/ses/latest/DeveloperGuide/bouncecomplaintdashboard.html
This isn't a log as such, but it produces a nice Excel file of email details that can be used for audit purposes.
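As a rough sketch of that suggestion, this is how the configuration set and a per-recipient message tag might be attached at send time with boto3 (all names below are placeholders, and tag values only allow letters, digits, underscores and dashes, so the address is sanitised):

    import re
    import boto3

    ses = boto3.client("ses", region_name="us-east-1")

    recipient = "john.smith@example.com"
    # Message tag values may only contain letters, digits, underscores and dashes.
    tag_value = re.sub(r"[^A-Za-z0-9_-]", "_", recipient)

    ses.send_email(
        Source="noreply@example.com",
        Destination={"ToAddresses": [recipient]},
        Message={
            "Subject": {"Data": "Your receipt"},
            "Body": {"Text": {"Data": "Thanks for your order."}},
        },
        # Hypothetical configuration set with a CloudWatch or SNS event destination.
        ConfigurationSetName="audit-config-set",
        # The per-recipient tag shows up as a dimension/attribute on the events.
        Tags=[{"Name": "recipient", "Value": tag_value}],
    )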
I opened a ticket with AWS about this, here is their response.
I understand that you wanted to know whether SES provides logs about events of an email that was accepted by SES for delivery in a searchable format. Please correct me if I’m wrong.
SES provides logs as notifications[1] for each event (Delivery, Bounce, Complaint) to SNS; it doesn't provide event logs in a consolidated form that would be helpful for indexing by email address and searching. However, in SNS you can configure a variety of subscribers (Email, SQS, Lambda, HTTP endpoint) to which the logs will be delivered in JSON format. In the SNS destination, such as Lambda, an HTTP endpoint, or email, you can parse the JSON and store it in a format suitable for indexing and searching.
If SQS is used, each JSON log is stored individually in the SQS queue for some time. You can periodically poll the queue, retrieve the logs, consolidate them into a single file, and use that for analysis.
Apart from that, unfortunately SES doesn’t provide any other format of logs for an email it accepted for delivery.
Do let me know if you require any further assistance, I’ll be happy to help.
References:
[1]. https://docs.aws.amazon.com/ses/latest/DeveloperGuide/monitor-sending-using-notifications.html
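Building on the SNS-to-Lambda option described in that response, here is a minimal handler sketch, assuming a hypothetical DynamoDB table named ses-audit-log, that turns each notification into one searchable record per recipient:

    import json
    import boto3

    # Hypothetical table keyed on recipient (partition) and eventKey (sort).
    table = boto3.resource("dynamodb").Table("ses-audit-log")

    def handler(event, context):
        for record in event["Records"]:
            # The SNS message body is the SES notification as a JSON string.
            notification = json.loads(record["Sns"]["Message"])
            mail = notification.get("mail", {})
            event_type = notification.get("notificationType") or notification.get("eventType")
            for recipient in mail.get("destination", []):
                table.put_item(Item={
                    "recipient": recipient,
                    "eventKey": f"{mail.get('messageId')}#{event_type}",
                    "timestamp": mail.get("timestamp"),
                    "eventType": event_type,
                })

With a layout like this you can later query the table by recipient address to see every event SES reported for that person.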

Download email attachment and upload S3 bucket AWS

I have a web app hosted on AWS under the free tier. What I want to achieve is that whenever I receive an email, the system should download its attachment (images only), upload that image to S3, and save the image ID in the database along with the sender's email address. I don't want to use the Zapier API etc.; I want to code it myself. How can I achieve this?
This really depends upon how your email is hosted.
You could use email receiving in Amazon Simple Email Service (SES).
The flow could then either be:
SES -> S3 -> Trigger Event -> AWS Lambda function, or
SES -> SNS -> AWS Lambda function
You would then need to write a Lambda function to do the processing you described.
If, on the other hand, your email is being hosted elsewhere, then you will need a mechanism to trigger some code when an email is received (somehow on your email system) or a scheduled Lambda function to poll the email system to see whether new mail is available.
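For the SES -> S3 -> Lambda flow, a rough handler sketch might look like the following, assuming the receipt rule's S3 action delivers the raw message to the bucket that triggers the function (the destination bucket name is a placeholder):

    import uuid
    import boto3
    from email import message_from_bytes
    from email.utils import parseaddr

    s3 = boto3.client("s3")
    DEST_BUCKET = "my-attachment-bucket"  # hypothetical destination bucket

    def handler(event, context):
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]

            # The SES S3 receipt action stores the raw MIME message as this object.
            raw = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            msg = message_from_bytes(raw)
            sender = parseaddr(msg.get("From", ""))[1]

            for part in msg.walk():
                if part.get_content_maintype() != "image":
                    continue
                image_id = str(uuid.uuid4())
                s3.put_object(
                    Bucket=DEST_BUCKET,
                    Key=f"attachments/{image_id}",
                    Body=part.get_payload(decode=True),
                    ContentType=part.get_content_type(),
                )
                # Save (image_id, sender) to your database here.
                print(image_id, sender)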

Unable to retrieve recordings from Amazon S3 Bucket

We created an instance in Amazon Connect and configured the Data Storage section to store call recordings. Usually the call recordings are stored in Amazon S3, but in our case no recordings are appearing in S3, even though the bucket has been created.
The "folder structure" your referring to is actually part of the object name that is created in S3, see this link for more information here. You are not seeing any "folder structure" because no objects have been created with that prefix yet. In order for Amazon Connect to create a call recording, you must enable recording for a Contact Flow. Once a call is processed through a Contact Flow that has recording enabled, then you will see the recording as as object in S3 with the expected prefix ("folder structure").
To enable recording in a call flow, add the Set Recording Behavior step to your Contact Flow.
This can be found under the Set section of the available steps in the Contact Flow Editor.
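Once a recording has been made, you can confirm the objects are there by listing the prefix, for example with boto3 (the bucket name and prefix below are assumptions; check the exact path under your instance's Data storage settings):

    import boto3

    s3 = boto3.client("s3")

    # Bucket and prefix are assumptions -- take them from Data storage >
    # Call recordings in your Amazon Connect instance settings.
    bucket = "amazon-connect-example-bucket"
    prefix = "connect/my-instance/CallRecordings/2020/01/15/"

    response = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    for obj in response.get("Contents", []):
        print(obj["Key"], obj["Size"])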