How to update ACL for all S3 objects in a folder with AWS CLI?

As part of an automated process in CodeBuild I want to update the Access Control List (ACL) for all files in a given folder (more specifically, all objects with a given prefix). How can I do it in a single line of bash?

The following one-liner works perfectly (shown across multiple lines here for readability):
aws s3api list-objects --bucket "$BUCKET_NAME" --prefix "$FOLDER_NAME" \
    --query "Contents[].[Key]" --output text |
while read -r key; do
    aws s3api put-object-acl --acl public-read --bucket "$BUCKET_NAME" --key "$key"
done
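If the prefix holds many objects, the sequential read loop can be slow. A parallel variant is possible with xargs; this is only a sketch, assuming GNU xargs and keys without embedded newlines:
# -P 8 runs up to eight put-object-acl calls at once; -I {} feeds one key (one line) per call
aws s3api list-objects --bucket "$BUCKET_NAME" --prefix "$FOLDER_NAME" \
    --query "Contents[].[Key]" --output text |
xargs -P 8 -I {} aws s3api put-object-acl --acl public-read --bucket "$BUCKET_NAME" --key {}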

You can use aws s3 cp with the --grants option (the grant below gives read access to all users; the general form is Permission=Grantee_Type=Grantee_ID):
aws s3 cp s3://mybucket/mydir s3://mybucket/mydir --recursive \
    --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers
Reference: https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html
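If a canned ACL such as public-read is all you need, s3 cp also accepts --acl directly. A sketch of an in-place recursive copy, reusing the hypothetical mybucket/mydir from above; S3 rejects a self-copy that changes nothing else, hence --metadata-directive REPLACE:
# rewrite each object over itself, applying the canned ACL as it goes
aws s3 cp s3://mybucket/mydir s3://mybucket/mydir --recursive \
    --acl public-read --metadata-directive REPLACE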

List all objects, then modify the ACL of each one with put-object-acl:
acl=public-read
# dump every key in the bucket to a file (note: awk '{print $4}' breaks on keys containing spaces)
aws s3 ls s3://$bucket --recursive --endpoint-url=$endpoint | awk '{print $4}' > tos-objects.txt
cat tos-objects.txt | while read -r object
do
    echo -e "setting ACL of \033[31m $object \033[0m to $acl"
    aws s3api put-object-acl --bucket $bucket --key "$object" --acl $acl --endpoint-url=$endpoint
done

Related

Copying files whose title contains the keyword 'log' from an S3 bucket to a folder in another bucket

I am trying to copy PDF files that contain the keyword 'log' in their titles. I have the command below. What am I missing?
aws s3api copy-object --copy-source --key 'Log' --bucket
I finally managed to change things around and ended up with this, which worked:
aws s3 cp s3://source-bucket/ s3://destination-bucket/ --recursive --exclude "*" --include "*log*"
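If only PDFs are wanted, as the question says, the include pattern can be tightened. A sketch with the same placeholder bucket names; note that --include patterns are case-sensitive, so "*Log*" and "*log*" match different keys:
aws s3 cp s3://source-bucket/ s3://destination-bucket/ --recursive \
    --exclude "*" --include "*log*.pdf"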

Trying to restore Glacier Deep Archive to S3, but unable to do so with MFA

I want to restore a Glacier Deep Archive folder. I have AWS MFA enabled, so when I run the command below I get the following errors:
aws s3api list-objects-v2 --bucket MYBUCKET --query "Contents[?StorageClass=='GLACIER']" --output text | awk '{print substr($0, index($0, $2))}' | awk '{NF-=3};3' > glacier-restore.txt --profile xxx
error : awk: fatal: cannot open file `--profile' for reading (No such file or directory)
An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
As written, --profile xxx never reaches the aws command at all: it sits after the output redirection, so the shell hands it to the final awk, which tries to open it as an input file (hence the awk error). Try moving it next to the aws invocation:
aws s3api list-objects-v2 --profile xxx --bucket MYBUCKET --query "Contents[?StorageClass=='GLACIER']" --output text | awk '{print substr($0, index($0, $2))}' | awk '{NF-=3};3' > glacier-restore.txt
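Once glacier-restore.txt exists, the actual restore is a separate per-object call. A minimal sketch, assuming a 7-day restore window and the Standard tier (both placeholders); note that for Deep Archive objects the listing filter would be StorageClass=='DEEP_ARCHIVE' rather than 'GLACIER':
while read -r key; do
    # initiate a temporary restore of each archived object
    aws s3api restore-object --profile xxx --bucket MYBUCKET --key "$key" \
        --restore-request '{"Days":7,"GlacierJobParameters":{"Tier":"Standard"}}'
done < glacier-restore.txt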

Create Sub folder in S3 Bucket?

I already have a root bucket (bigdata). Now I want to create a NEWFOLDER (Year) inside the bigdata bucket in S3, then create a NEWFOLDER (Month) inside Year.
aws s3 mb s3://bigdata --> bucket created
aws s3 mb s3://bigdata/Year/ --> not working
Use the syntax below; this is what I use to create a bucket and subfolders. Don't forget the "/" at the end of the folder name.
aws s3api put-object --bucket <your-bucket-name> --key <folder-name>/test.txt --body yourfile.txt
After Googling for 2 hours, this CLI command worked for me:
aws s3api put-object --bucket root-bucket-name --key new-dir-name/
Windows bat file example:
SET today=%Date:~-10,2%%Date:~-7,2%%Date:~-4,4%
aws s3api put-object --bucket root-backup-sets --key %today%/
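For the nested Year/Month layout the question asks about, the same trick works with a deeper key; a sketch assuming the bigdata bucket:
# zero-byte objects whose keys end in "/" show up as folders in the S3 console
aws s3api put-object --bucket bigdata --key Year/
aws s3api put-object --bucket bigdata --key Year/Month/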
If the local file is foo.txt and the remote "folder" Year does not yet exist, then to create it, just put the file at the designated path (--recursive is not needed, and does not apply, for a single file):
$ aws s3 cp foo.txt s3://bigdata/Year/
Or, if the local folder YearData contains foo.txt and bar.txt:
$ aws s3 cp YearData s3://bigdata/Year/ --recursive
upload: YearData/foo.txt to s3://bigdata/Year/foo.txt
upload: YearData/bar.txt to s3://bigdata/Year/bar.txt
See also:
http://docs.aws.amazon.com/cli/latest/reference/s3/cp.html
http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
and first, http://docs.aws.amazon.com/cli/latest/reference/configure

How to find/check current permissions in AWS S3 using cli?

I'm quite happy with the speed of the AWS CLI, but I can't seem to find a way to check what the permissions are on a file/folder.
E.g. I do:
$ curl http://my.s3.amazonaws.com/deploy/tool1/license.key -o ./license.key
$ cat license.key | sed 's/></>\n</g'
<?xml version="1.0" encoding="UTF-8"?>
<Error>
<Code>AccessDenied</Code>
<Message>Access Denied</Message>
<RequestId>E4D50F0606FFFD48</RequestId>
<HostId>+xxxyyyaaaa=</HostId>
</Error>
$
Now curl http://my.s3.amazonaws.com/deploy/tool1/tool.sh -o ./tool.sh works just fine and I can get tool.sh, so I suspect the permissions on license.key are the problem.
In order to get the permissions on a file in S3 with the CLI, use the get-object-acl command of s3api (full documentation http://docs.aws.amazon.com/cli/latest/reference/s3api/get-object-acl.html)
Using your example:
$ aws s3api get-object-acl \
--bucket my \
--key deploy/tool1/license.key
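To see why tool.sh downloads but license.key does not, it may also help to compare against the object that works, or to inspect the bucket-level ACL; a sketch with the same names:
aws s3api get-object-acl --bucket my --key deploy/tool1/tool.sh
aws s3api get-bucket-acl --bucket my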

Filter S3 list-objects results to find a key matching a pattern

I would like to use the AWS CLI to query the contents of a bucket and see if a particular file exists, but the bucket contains thousands of files. How can I filter the results to only show key names that match a pattern? For example:
aws s3api list-objects --bucket myBucketName --query "Contents[?Key==*mySearchPattern*]"
The --query argument uses JMESPath expressions. JMESPath has a built-in function contains that allows you to search for a string pattern.
This should give the desired results:
aws s3api list-objects --bucket myBucketName --query "Contents[?contains(Key, `mySearchPattern`)]"
(With Linux I needed to use single quotes ' rather than back ticks ` around mySearchPattern.)
If you want to search for keys starting with certain characters, you can also use the --prefix argument:
aws s3api list-objects --bucket myBucketName --prefix "myPrefixToSearchFor"
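The two approaches combine naturally: --prefix narrows the listing server-side before the JMESPath filter runs client-side. A sketch with the placeholder names from above:
aws s3api list-objects --bucket myBucketName --prefix "myPrefixToSearchFor" \
    --query "Contents[?contains(Key, 'mySearchPattern')].Key" --output text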
I tried this on Ubuntu 14 with awscli 1.2:
--query "Contents[?contains(Key,'stati')].Key"
--query "Contents[?contains(Key,\'stati\')].Key"
--query "Contents[?contains(Key,`stati`)].Key"
All three failed with:
Illegal token value '?contains(Key,'stati')].Key'
After upgrading awscli to 1.16, this worked:
--query "Contents[?contains(Key,'stati')].Key"