I'm setting up a new policy so my website can store images on S3, and I'm trying to keep it as secure as possible.
I can put an object and read it, but I cannot delete it, even though it appears I've followed Amazon's recommendations. I am not using versioning.
What am I doing wrong?
Here's my policy:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "VisualEditor0",
"Effect": "Allow",
"Action": [
"s3:PutObject",
"s3:GetObjectAcl",
"s3:GetObject",
"s3:DeleteObjectVersion",
"s3:PutLifecycleConfiguration",
"s3:DeleteObject",
"s3:ListObjects"
],
"Resource": "*"
}
]
}
After experimenting with multiple permission actions, it turns out I needed to add s3:ListBucket and s3:ListObjects. Once those were added, I can now delete objects.
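For reference, a policy of roughly this shape should behave the same way; it's only a sketch, with my-image-bucket as a placeholder, and it splits bucket-level and object-level actions so the resources don't have to be "*":
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "BucketLevel",
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::my-image-bucket"
    },
    {
      "Sid": "ObjectLevel",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:GetObjectAcl",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::my-image-bucket/*"
    }
  ]
}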
Related
In MinIO, when you set a bucket policy to download with the mc command like this:
mc policy set download server/bucket
The bucket policy changes to:
{
"Statement": [
{
"Action": [
"s3:GetBucketLocation",
"s3:ListBucket"
],
"Effect": "Allow",
"Principal": {
"AWS": [
"*"
]
},
"Resource": [
"arn:aws:s3:::public-bucket"
]
},
{
"Action": [
"s3:GetObject"
],
"Effect": "Allow",
"Principal": {
"AWS": [
"*"
]
},
"Resource": [
"arn:aws:s3:::public-bucket/*"
]
}
],
"Version": "2012-10-17"
}
I understand that in the second statement we give anonymous users read access so they can download the files by URL. What I don't understand is why we need to allow them the s3:GetBucketLocation and s3:ListBucket actions.
Can anyone explain this?
Thanks in advance
GetBucketLocation is needed to find the location of a bucket in some setups, and is required for compatibility with standard S3 tools such as the awscli and mc.
ListBucket is required to list the objects in a bucket. Without this permission you are still able to download objects, but you cannot list and discover them anonymously.
These are standard permissions that are safe to use and are set up automatically by the mc anonymous command (previously called mc policy). It is generally not necessary to change them, though you can do so by calling the PutBucketPolicy API directly.
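If it helps, this is roughly how it looks with current mc releases (myminio is a placeholder alias; on older releases the command is mc policy instead of mc anonymous):
# allow anonymous read-only (download) access on the bucket
mc anonymous set download myminio/public-bucket
# print the bucket policy that mc generated, as JSON
mc anonymous get-json myminio/public-bucket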
I have an IAM role set for my task with the following permissions, yet I get access denied trying to access the buckets.
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "",
"Effect": "Allow",
"Action": "s3:*",
"Resource": [
"arn:aws:s3:::bucket/Templates/*",
"arn:aws:s3:::bucket/*",
"arn:aws:s3:::anotherBucket/*"
]
}
]
}
The container instance has a role with the standard AmazonEC2ContainerServiceforEC2Role policy.
I seem to be able to read and write to folders under bucket/, like bucket/00001, BUT I can't read from bucket/Templates.
I've redeployed the permissions and the tasks repeatedly (using Terraform), but nothing changes. I've added logging to the app to confirm it's using the correct bucket and path/keys.
I'm stumped. Anyone got a clue what I might have missed here?
Thanks
PS: It just occurred to me that the files in the buckets I can't access were copied there by a script, using credentials other than the ones the task is using.
aws s3 cp ..\Api\somefiles\000000000001\ s3://bucket/000000000001 --recursive --profile p
aws s3 cp ..\Api\somefiles\Templates\000000000001\ s3://bucket/Templates/000000000001 --recursive --profile p
I was using --acl bucket-owner-full-control on the cp command, but I removed it to see if that would help; it didn't. Maybe I need something else?
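To rule out an ownership/ACL problem, I'm thinking of checking the ACL on one of the unreadable objects and, if needed, re-copying with the bucket-owner ACL. A rough sketch (the file name is just a placeholder):
# inspect the ACL on one of the objects the task cannot read
aws s3api get-object-acl --bucket bucket --key Templates/000000000001/example.file --profile p
# re-copy, granting the bucket owner full control of the new objects
aws s3 cp ..\Api\somefiles\Templates\000000000001\ s3://bucket/Templates/000000000001 --recursive --acl bucket-owner-full-control --profile p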
It works now because you changed the Resource to match "*".
Try adding the bucket itself as a resource, along with the /* pattern:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "sid1",
"Effect": "Allow",
"Action": [
"s3:ListAllMyBuckets",
"s3:ListBucket",
"s3:HeadBucket"
],
"Resource": "*"
},
{
"Sid": "sid2",
"Effect": "Allow",
"Action": "s3:*",
"Resource": [
"arn:aws:s3:::bucket",
"arn:aws:s3:::bucket/*",
"arn:aws:s3:::anotherBucket",
"arn:aws:s3:::anotherBucket/*"
]
}
]
}
Solved. I found an old sample from a previous employer :) I needed the List* permissions granted explicitly, in a statement separate from the other permissions. I also needed to define the Sids.
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "sid1",
"Effect": "Allow",
"Action": [
"s3:ListAllMyBuckets",
"s3:ListBucket",
"s3:HeadBucket"
],
"Resource": "*"
},
{
"Sid": "sid2",
"Effect": "Allow",
"Action": "s3:*",
"Resource": "*"
}
]
}
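For what it's worth, the likely reason the original policy failed is that s3:ListBucket is a bucket-level action: it only matches the bare bucket ARN (arn:aws:s3:::bucket), not the bucket/* object ARNs the policy listed, and s3:ListAllMyBuckets only accepts "*" as its resource. If Resource "*" feels too broad, something along these lines should also work (bucket names copied from the question, but treat it as a sketch):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "sid1",
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "*"
    },
    {
      "Sid": "sid2",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": [
        "arn:aws:s3:::bucket",
        "arn:aws:s3:::anotherBucket"
      ]
    },
    {
      "Sid": "sid3",
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::bucket/*",
        "arn:aws:s3:::anotherBucket/*"
      ]
    }
  ]
}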
I'm trying to define a policy for a specific user.
I have several buckets in S3, but I want to give the user access to only some of them.
I created the following policy:
{
"Version":"2012-10-17",
"Statement":[
{
"Sid":"AddPerm",
"Effect":"Allow",
"Principal": "*",
"Action":["s3:GetObject",
"s3:ListBucket",
"s3:ListAllMyBuckets",
"s3:GetBucketLocation",
"s3:PutObject"],
"Resource":["arn:aws:s3:::examplebucket"]
}
]
}
When I try to add a list of resources like this:
"Resource":["arn:aws:s3:::examplebucket1","arn:aws:s3:::examplebucket2"]
I get access denied.
The only option that works for me (I do get the bucket list) is:
"Resource": ["arn:aws:s3:::*"]
What's the problem?
Some Amazon S3 API calls operate at the bucket level, while others operate at the object level. Therefore, you will need a policy like:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": ["s3:ListBucket"],
"Resource": ["arn:aws:s3:::test"]
},
{
"Effect": "Allow",
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
],
"Resource": ["arn:aws:s3:::test/*"]
}
]
}
See: AWS Security Blog - Writing IAM Policies: How to Grant Access to an Amazon S3 Bucket
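If you are managing the user from the CLI, a policy like this can be attached inline roughly as follows (the user name, policy name, and file name are placeholders):
# attach the JSON above as an inline policy on the IAM user
aws iam put-user-policy --user-name example-user --policy-name s3-test-access --policy-document file://policy.json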
I found that it's an AWS limitation.
There is no way to get a filtered list of buckets.
Once you grant the ListAllMyBuckets permission like this:
{
"Sid": "AllowUserToSeeBucketListInTheConsole",
"Action": ["s3:GetBucketLocation", "s3:ListAllMyBuckets"],
"Effect": "Allow",
"Resource": ["arn:aws:s3:::*"]
}
you get the list of all buckets (including buckets that you don't have permission to access).
More info can be found here: https://aws.amazon.com/blogs/security/writing-iam-policies-grant-access-to-user-specific-folders-in-an-amazon-s3-bucket/
A few workarounds can be found here: Is there an S3 policy for limiting access to only see/access one bucket?
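One of the workarounds described there keeps ListAllMyBuckets but restricts what the user can list inside a bucket, using the s3:prefix condition key on s3:ListBucket. Roughly (the home/${aws:username} folder layout is just an example from that post):
{
  "Sid": "AllowListingOfUserFolder",
  "Effect": "Allow",
  "Action": ["s3:ListBucket"],
  "Resource": ["arn:aws:s3:::examplebucket"],
  "Condition": {
    "StringLike": { "s3:prefix": ["home/${aws:username}/*"] }
  }
}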
I'm trying to create a transfer from my S3 bucket to Google Cloud - it's basically the same problem as in this question, but none of the answers work for me. Whenever I try to make a transfer, I get the following error:
Invalid access key. Make sure the access key for your S3 bucket is correct, or set the bucket permissions to Grant Everyone.
I've tried the following policies, without success:
First policy:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:Get*",
"s3:List*",
"s3:GetBucketLocation"
],
"Resource": "*"
}
]
}
Second policy:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": "s3:*",
"Resource": "*"
}
]
}
Third policy:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": "s3:*",
"Resource": [
"arn:aws:s3:::my-bucket-name",
"arn:aws:s3:::my-bucket-name/*"
]
}
]
}
I've also made sure to grant the 'List' permission to 'Everyone'. I tried this on buckets in two different regions, Sao Paulo and Oregon. I'm starting to run out of ideas; hope you can help.
I know this question is over a year old, but I just encountered the same error when trying to do the transfer via the console. I worked around it by running the copy with the gsutil command-line tool instead.
After installing and configuring the tool, simply run:
gsutil cp -r s3://sourcebucket gs://targetbucket
Hope this is helpful!
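Note that gsutil reads the S3 credentials from its boto config, so the source bucket's access key needs to be there. If I remember correctly, the relevant section of ~/.boto looks something like this (the key values are obviously placeholders):
[Credentials]
aws_access_key_id = AKIAEXAMPLEKEY
aws_secret_access_key = exampleSecretKey
For a large bucket it may also be worth looking at gsutil -m cp -r (parallel copies) or gsutil rsync -r.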
All of my files are stored in a single bucket. I want to define a policy, used in a temporary credential, that gives a user permission to put one file with a key that I provide. Here is my policy:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "1",
"Effect": "Allow",
"Action": [
"s3:GetObject",
"s3:PutObject",
"s3:DeleteObject"
],
"Resource": [
"arn:aws:s3:::{bucketname}/AKey"
]
}
]
}
Delete and Get work, and they are limited to "AKey". However, the user can put objects under whatever key they want.
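To illustrate with the CLI (running with the temporary credentials; the bucket name and local file are placeholders):
# this should succeed, since the key matches the policy's resource
aws s3api put-object --bucket bucketname --key AKey --body local-file.bin
# I would expect this to be denied by the policy above, yet it succeeds
aws s3api put-object --bucket bucketname --key SomeOtherKey --body local-file.bin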