Amazon S3 suddenly stopped working with EC2 but works from localhost - amazon-s3

Creating folders and uploading files to my S3 bucket has stopped working.
The remote server returned an error: (403) Forbidden.
Everything worked previously, and I did not change anything recently.
After days of testing, I can see that I am able to create folders in my bucket from localhost, but the same code doesn't work on the EC2 instance.
I must resolve the issue ASAP.
Thanks
diginotebooks

Does your EC2 instance have a role? If yes, what is this role? Is it possible that someone detached or modified a policy that was attached to it?
If your instance doesn't have a role, how do you upload files to S3? Using the AWS CLI tools? The same questions apply to the IAM profile used.
If you did not change anything, are you using the same IAM credentials on the server and on localhost? This may be related.
Just random thoughts...
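If it helps to narrow this down, here is a minimal sketch (assuming boto3 is available on the instance; the bucket name is a placeholder) that prints the identity the instance is actually using and retries the failing call, so you can compare what is allowed on EC2 versus localhost:

# Minimal sketch: print the identity this machine is using, then retry the
# failing "create folder" call. "my-bucket" is a placeholder bucket name.
import boto3
from botocore.exceptions import ClientError

sts = boto3.client("sts")
print("Calling identity:", sts.get_caller_identity()["Arn"])

s3 = boto3.client("s3")
try:
    # The same kind of request that is returning 403 from the instance.
    s3.put_object(Bucket="my-bucket", Key="test-folder/", Body=b"")
    print("PutObject succeeded")
except ClientError as e:
    print("PutObject failed:", e.response["Error"]["Code"])

Running this on the EC2 instance and on localhost should show whether the two environments resolve to different identities or policies.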

Related

AWS Backup from S3 Access Denied

I am trying to setup a simple on-demand backup of an s3 bucket in AWS and anything I try I always get an access denied. See screenshot:
I have tried creating a new bucket which is completely public, I've tried setting the access policy on the Vault, and I've tried different regions; all have the same result. Access Denied!
The messaging doesn't advise anything other than Access Denied, really helpful!
Can anyone give me some insight into what this message is referring to and, moreover, how I can resolve this issue?
For AWS Backup, you need to set up a service role.
Traditionally you need two policies attached:
AWSBackupServiceRolePolicyForBackup
AWSBackupServiceRolePolicyForRestore
For S3, it seems there are separate policies that you need to attach to your service role:
AWSBackupServiceRolePolicyForS3Backup
AWSBackupServiceRolePolicyForS3Restore
Just putting this here for those who will be looking for this answer.
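As a rough illustration only (the role name is a placeholder, and the exact managed-policy ARNs should be confirmed in the IAM console), attaching those policies to a Backup service role with boto3 could look like this:

# Rough sketch: create a service role that AWS Backup can assume and attach
# the managed policies named above. "aws-backup-s3-role" is a placeholder;
# verify the exact policy names and ARN paths (service-role/) in IAM.
import json
import boto3

iam = boto3.client("iam")
role_name = "aws-backup-s3-role"

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "backup.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}
iam.create_role(RoleName=role_name,
                AssumeRolePolicyDocument=json.dumps(trust_policy))

for policy in [
    "AWSBackupServiceRolePolicyForBackup",
    "AWSBackupServiceRolePolicyForRestore",
    "AWSBackupServiceRolePolicyForS3Backup",
    "AWSBackupServiceRolePolicyForS3Restore",
]:
    iam.attach_role_policy(
        RoleName=role_name,
        PolicyArn=f"arn:aws:iam::aws:policy/service-role/{policy}",
    )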
To solve this problem for AWS CDK (javascript/typescript) you can use the following examples:
https://github.com/SimonJang/blog-aws-backup-s3/blob/68a05f8cb443411a23f02aa0c188adfe15bab0ff/infrastructure/lib/infrastructure-stack.ts#L63-L200
or this:
https://github.com/finnishtransportagency/hassu/blob/8adc0bea3193ff016a9aaa6abe0411292714bbb8/deployment/lib/hassu-database.ts#L230-L312

Copying folders from S3 to an Azure Storage Blob and receiving "cannot list objects, access is denied" error. Anyone else have this and resolve it?

I've confirmed that my S3 credentials are set correctly and even tried full permissions on the bucket and still receive the same message. On the Azure side, I've added Storage Blob Data Owner permissions to my account and can list files I manually upload through the portal with the credentials I used (signing in through AAD rather than using a token).
Any help is appreciated here.
As it turned out, there were no issues with either the Azure or the AWS configuration. Trying the same commands in a Linux VM worked perfectly. If you're experiencing this error, it may be an issue with the AWS CLI or some other local configuration.
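If you need to rule out the AWS side independently of the copy tool, a minimal sketch like the following (boto3, with placeholder credentials and bucket name) shows whether the same key pair can list the bucket from that machine:

# Minimal sketch: confirm the S3 credentials can list objects from this
# machine, independent of the copy tool. Credentials and bucket name are
# placeholders.
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",
    aws_secret_access_key="...",
)
resp = s3.list_objects_v2(Bucket="my-source-bucket", MaxKeys=5)
for obj in resp.get("Contents", []):
    print(obj["Key"])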

Application in EKS fails to access S3 bucket

My application running in EKS (AWS Kubernetes) is failing to access an S3 bucket.
I'm getting 400 Bad Request errors in my app.
I suspect a permission is missing, so for testing I added arn:aws:iam::aws:policy/AmazonS3FullAccess to any role I could find related to my EKS cluster. Still failing.
Using an S3 client from my local computer, I can access the bucket so I suspect I'm missing some configuration.
Any ideas?
OK... the issue was resolved. I'm leaving this here for future reference.
The problem was a mismatch between the bucket's region, us-west-2, and the endpoint I had configured in my application; it should have been s3.us-west-2.amazonaws.com.
The error returned by S3 was not clear.
I hope this helps others.
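For anyone hitting the same thing, here is a minimal sketch (boto3, with a placeholder bucket name; not necessarily how the poster's app is configured) of pointing a client at the bucket's actual region, and of discovering that region when the endpoint is in doubt:

# Minimal sketch: use the bucket's actual regional endpoint, and look the
# region up with get_bucket_location when unsure. "my-bucket" is a
# placeholder bucket name.
import boto3

s3 = boto3.client(
    "s3",
    region_name="us-west-2",
    endpoint_url="https://s3.us-west-2.amazonaws.com",
)
loc = s3.get_bucket_location(Bucket="my-bucket")
print(loc.get("LocationConstraint"))  # e.g. "us-west-2"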

Can't access personal s3 server with boto3

I have a private installation of a server which is fully S3-compatible. I have one bucket there, and I can check it using the S3 Browser client. I am trying to interact with the server using boto3 for Python (using the same credentials that I use in S3 Browser); however, every request returns a NoSuchBucket error. This is my code:
import boto3

s3 = boto3.resource(
    's3',
    endpoint_url=hostname,          # URL of the private S3-compatible server
    use_ssl=False,
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key,
)
for bucket in s3.buckets.all():
    print(bucket.name)
Initially I thought there was an issue with the credentials, but then I was able to interact with the server through the S3 Browser client.
So the problem is: I really don't understand the error code, since I am not querying any particular bucket. What could be the cause of the problem?
Problem solved! It was a DNS resolution issue.
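The answer doesn't spell out the fix, but with private S3-compatible servers a common variant of this is virtual-hosted-style addressing, where boto3 prepends the bucket name to the hostname and the resulting name does not resolve. A sketch of forcing path-style addressing (an assumption, not the poster's confirmed fix; it reuses the variables from the question) would be:

# Sketch (assumption, not the confirmed fix): force path-style addressing so
# the bucket name is not used as a DNS subdomain of the endpoint.
import boto3
from botocore.config import Config

s3 = boto3.resource(
    's3',
    endpoint_url=hostname,
    use_ssl=False,
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key,
    config=Config(s3={'addressing_style': 'path'}),
)
for bucket in s3.buckets.all():
    print(bucket.name)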

AWS S3 backup stops uploading from my Lenovo® ix2-dl

I have a Lenovo® ix2-dl NAS drive that I set up to back up to AWS S3. It connected fine, but for some reason it only uploads 5% of my Lenovo® ix2-dl data. How can I get it to upload all of my Lenovo® ix2-dl data?
I updated my NAS to the latest Firmware 4.1.218.34037.
I recently had issues with the S3 backup feature, where the uploads simply stopped working. No errors, nothing in the logs to indicate an issue. I tested my AWS S3 access key and secret with another method and was able to upload files just fine.
To resolve the issue, I had to create a new AWS S3 bucket, then go into the S3 setup of the Lenovo and provide the required info. I think what made this work for me was making sure the bucket name contained nothing other than letters and numbers. My bucket name before was similar to lastname.family.pics; my new bucket, which works, is similar to lastname123.
Hope this helps. This feature had worked fine for a long time; perhaps an update came down which has different requirements for the API.
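For anyone who wants to repeat the "tested with another method" step above, a minimal sketch (boto3, with placeholder credentials, file, and bucket name) that uploads one file with the same key pair the NAS uses:

# Minimal sketch: upload a single file with the same access key/secret the
# NAS is configured with. Credentials, file, and bucket name are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",
    aws_secret_access_key="...",
)
s3.upload_file("test.txt", "lastname123", "nas-test/test.txt")
print("upload OK")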