AWS CLI - how to get when the file was changed - amazon-s3

A client is uploading data we use to AWS S3. I need to find out when the uploads took place in the last week (or month). How could I go about that? If I use aws s3 ls path I get only the date of the last change.

To obtain historical information about Amazon S3 API calls, you can use AWS CloudTrail.
From Logging Amazon S3 API Calls by Using AWS CloudTrail - Amazon Simple Storage Service:
Amazon S3 is integrated with AWS CloudTrail, a service that provides a record of actions taken by a user, role, or an AWS service in Amazon S3. CloudTrail captures a subset of API calls for Amazon S3 as events, including calls from the Amazon S3 console and from code calls to the Amazon S3 APIs.
To use object-level logging, see: How Do I Enable Object-Level Logging for an S3 Bucket with AWS CloudTrail Data Events? - Amazon Simple Storage Service
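Note that object-level (data) events such as PutObject are not returned by the CloudTrail console's event history; they are delivered as JSON log files to the trail's destination S3 bucket. As a minimal sketch of how to pull upload times out of one such log file, assuming the standard CloudTrail record format (the sample record below is illustrative, not real data):

```python
import json
from datetime import datetime, timedelta, timezone

def recent_uploads(log_file_json, days=7):
    """Return (time, bucket, key) for PutObject records newer than `days` days."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=days)
    uploads = []
    for record in json.loads(log_file_json)["Records"]:
        if record["eventName"] != "PutObject":
            continue
        # CloudTrail timestamps look like 2024-05-01T12:34:56Z
        when = datetime.fromisoformat(record["eventTime"].replace("Z", "+00:00"))
        if when >= cutoff:
            params = record["requestParameters"]
            uploads.append((when, params["bucketName"], params["key"]))
    return uploads

# Illustrative record in the CloudTrail log-file shape; not real data.
yesterday = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
sample_log = json.dumps({"Records": [{
    "eventSource": "s3.amazonaws.com",
    "eventName": "PutObject",
    "eventTime": yesterday,
    "requestParameters": {"bucketName": "my-bucket", "key": "data/file.json"},
}]})
```

In practice you would download the log files for the week or month in question and run each through a filter like this (or query them with Athena).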

Related

Which s3 policy types are supported in MINIO?

I am investigating MinIO. For now I have a docker-compose setup with a MinIO service, and the MinIO console is available to me. As I understand it, MinIO is intended as a replacement for Amazon S3.
I've found the following page:
http://awspolicygen.s3.amazonaws.com/policygen.html
So there are five policy types in Amazon S3:
IAM Policy
S3 Bucket Policy
SNS Topic Policy
VPC Endpoint Policy
SQS Queue Policy
Which of them are supported in MinIO? Is there any ability to configure them through the MinIO console?
For some reason I can't find this in the MinIO documentation.
IAM Policy: you can read about them in detail at https://min.io/docs/minio/linux/administration/identity-access-management.html
MinIO supports LDAP, multiple IDP vendors, mTLS-based client authentication, and various other styles of access management, going beyond what AWS S3 offers, mostly to cater to on-prem needs.
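MinIO accepts standard IAM-style policy documents, which can be attached to users or groups with a command like `mc admin policy create ALIAS NAME policy.json` (older releases used `mc admin policy add`). A sketch of building such a document; the bucket name and prefix are placeholders:

```python
import json

def make_minio_policy(bucket, prefix):
    """Build an IAM-style policy scoping read/write access to one prefix."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                # Object-level access under the prefix
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject"],
                "Resource": [f"arn:aws:s3:::{bucket}/{prefix}*"],
            },
            {
                # Listing, restricted to the same prefix
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": [f"arn:aws:s3:::{bucket}"],
                "Condition": {"StringLike": {"s3:prefix": [f"{prefix}*"]}},
            },
        ],
    }

policy_json = json.dumps(make_minio_policy("reports", "2024/"), indent=2)
print(policy_json)
```

The resulting JSON file is what you would pass to `mc admin policy create`.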

AWS Glue and S3 Access Points

Does AWS Glue support S3 Access Points?
Suppose I create IAM Role and assign it to AWS Glue service.
(https://docs.aws.amazon.com/glue/latest/dg/create-an-iam-role.html)
Later I want to use this IAM Role in S3 Access Point policies.
(https://docs.aws.amazon.com/AmazonS3/latest/userguide/access-points-policies.html#access-points-policy-examples)
Is this supported?
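The question is left unanswered in the thread, but whatever the service-side support turns out to be, an Access Point policy naming a Glue role follows the standard resource-policy shape shown in the linked examples. A sketch, where the account ID, region, access point name, and role name are all placeholders:

```python
import json

# Placeholders for illustration only.
ACCOUNT = "111122223333"
ACCESS_POINT_ARN = f"arn:aws:s3:us-east-1:{ACCOUNT}:accesspoint/my-glue-ap"
GLUE_ROLE_ARN = f"arn:aws:iam::{ACCOUNT}:role/AWSGlueServiceRole-demo"

access_point_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": GLUE_ROLE_ARN},
            "Action": ["s3:GetObject", "s3:PutObject"],
            # Object access through an access point uses the /object/ ARN form
            "Resource": f"{ACCESS_POINT_ARN}/object/*",
        }
    ],
}
print(json.dumps(access_point_policy, indent=2))
```

Remember that the Glue role's own IAM policy must also allow the access, since access point policies and IAM policies are evaluated together.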

EC2 instance launched in S3 endpoint subnet unable to list bucket objects with endpoint bucket policy

I have created an S3 VPC endpoint and added it to the route table of a subnet.
The subnet has a route to the internet and can reach the AWS console.
Next, a bucket was created with a bucket policy limiting access to it through the VPC endpoint.
I have an IAM user which has full permissions on this bucket.
When I access the S3 bucket through the S3 console webpage I get an 'Access Denied' error, but I am able to upload files to the bucket.
Does the S3 endpoint imply that access will only be through the AWS CLI/SDKs, and that console access is limited?
My understanding is that any calls done in the AWS Console will not use the endpoint setup within the VPC, even if you're accessing it via an EC2 instance within the VPC. This is because the UI within the AWS Console does not directly access the S3 API Endpoint, but instead goes through a proxy to reach the endpoint.
If you need to access the S3 bucket via the AWS Console, you'll need to amend your bucket policy.
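Such bucket policies typically use the aws:SourceVpce condition key to deny requests that do not arrive through the endpoint, which is exactly what blocks the console, since its requests do not traverse the endpoint. One common amendment is to also exempt a specific principal. A sketch, with placeholder endpoint ID, account, bucket, and user:

```python
import json

# Placeholders for illustration only.
VPCE_ID = "vpce-0123456789abcdef0"
CONSOLE_USER = "arn:aws:iam::111122223333:user/console-admin"

bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyUnlessThroughEndpointOrAdmin",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": ["arn:aws:s3:::my-bucket", "arn:aws:s3:::my-bucket/*"],
        # Multiple keys under one operator are ANDed: the deny fires only
        # when the request is neither from the endpoint nor from the
        # exempted console user.
        "Condition": {"StringNotEquals": {
            "aws:SourceVpce": VPCE_ID,
            "aws:PrincipalArn": CONSOLE_USER,
        }},
    }],
}
print(json.dumps(bucket_policy, indent=2))
```

With the extra aws:PrincipalArn key, the named user can still reach the bucket from the console while all other access must come through the endpoint.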

Upload json files through Amazon API gateway, S3, SQS and Lambda

I have my app running on an EC2 instance; it accepts a JSON file as input and returns a processed JSON file as output.
I need to handle many requests to the server, so I'm trying to configure AWS services.
My idea is to create an API Gateway that receives the JSON file input and writes it to S3; then SQS reads the put notification and passes the request to the EC2 server, perhaps through a Lambda function.
Then the server writes the processed JSON to another S3 bucket, and SNS sends a notification to the client.
Is this a correct way to use AWS services, or is there a better way?
This seems like a very complicated workflow for no good reason. What exactly is the point of using so many services just to get that JSON to your EC2 instance? You can have a direct endpoint to your EC2 instance. If you want API Gateway as a wrapper around your endpoints, you can have that too. But just send the JSON directly to EC2, or API Gateway to EC2, instead of API Gateway -> S3 -> SQS -> Lambda -> EC2.

How to protect Amazon S3 via Basic Authentication

I am new to S3 and am wondering how I could protect access to S3 or CloudFront via Basic Authentication, while installing a private certificate into Chrome that allows access. Is there anything like this?
It is not possible to use Basic Authentication with Amazon S3 or Amazon CloudFront.
Amazon S3 access can be controlled via one or more of:
Access Control Lists (ACLs) at the object level
Amazon S3 Bucket Policy
AWS Identity and Access Management (IAM) Policy
Amazon CloudFront has its own method of controlling access via signed URLs and signed cookies.
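For the CloudFront route, the signed URL and signed cookie mechanism embeds a small policy document that is signed with a CloudFront key pair. A sketch of the custom-policy JSON; the distribution domain and path are placeholders, and actually signing the policy requires the key pair's private key:

```python
import json
import time

# Placeholder distribution domain and path; not a real distribution.
resource = "https://d111111abcdef8.cloudfront.net/private/*"
expires = int(time.time()) + 3600  # URL valid for one hour

cloudfront_policy = {
    "Statement": [
        {
            "Resource": resource,
            # DateLessThan is the mandatory expiry condition
            "Condition": {"DateLessThan": {"AWS:EpochTime": expires}},
        }
    ]
}
policy_json = json.dumps(cloudfront_policy)
print(policy_json)
```

This document is then base64-encoded and signed, and the signature is attached to the URL (or set as cookies); CloudFront rejects requests once the AWS:EpochTime condition has passed.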