Create a subfolder in an S3 bucket? - amazon-s3

I already have a root bucket (bigdata). Now I want to create a new folder (Year) inside the bigdata bucket in S3, then create another new folder (Month) inside Year.
aws s3 mb s3://bigdata       --> bucket created
aws s3 mb s3://bigdata/Year/ --> this does not work

Use the syntax below; this is what I use to create a folder and put a file into it. Don't forget the "/" at the end of the folder name.
aws s3api put-object --bucket <your-bucket-name> --key <folder-name>/test.txt --body yourfile.txt
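S3 has no real folders: keys live in a flat namespace, and the "/" in a key is just a delimiter that the console renders as a folder tree. After the upload above you can confirm the "folder" exists (same placeholder names as in the command):
aws s3 ls s3://<your-bucket-name>/<folder-name>/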

After Googling for two hours, this CLI command worked for me:
aws s3api put-object --bucket root-bucket-name --key new-dir-name/
Windows .bat file example:
REM build a folder name from the current date (the substring offsets depend on the system's date format)
SET today=%Date:~-10,2%%Date:~-7,2%%Date:~-4,4%
aws s3api put-object --bucket root-backup-sets --key %today%/
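The same call also answers the original question's nested case in one step, since each "/" in the key becomes one level in the console's folder view (a sketch reusing the bucket from the question):
aws s3api put-object --bucket bigdata --key Year/Month/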

If the local file is foo.txt, and the remote "folder" Year does not yet exist, just put the file at the designated path and the folder is created implicitly (no --recursive needed for a single file):
$ aws s3 cp foo.txt s3://bigdata/Year/
Or, if the local folder is YearData containing foo.txt and bar.txt:
$ aws s3 cp YearData s3://bigdata/Year/ --recursive
upload: YearData/foo.txt to s3://bigdata/Year/foo.txt
upload: YearData/bar.txt to s3://bigdata/Year/bar.txt
See also:
http://docs.aws.amazon.com/cli/latest/reference/s3/cp.html
http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
and first, http://docs.aws.amazon.com/cli/latest/reference/configure

Related

How to copy files from S3 using include pattern with underscore on the file name

How can I include the _ (underscore) in the include pattern?
I have an S3 bucket with files in the following format:
20220630_084021_abc.json
20220630_084031_def.json
20220630_084051_ghi.json
20220630_084107_abc.json
20220630_084118_def.json
I would like to get all the files that start with 20220630_0840.
I've tried to fetch them using several variations of the include pattern; so far I have used the following:
aws s3 cp s3://BUCKET/ LocalFolder --include "20220630_0840*" --recursive
aws s3 cp s3://BUCKET/ LocalFolder --include "[20220630_0840]*" --recursive
aws s3 cp s3://BUCKET/ LocalFolder --include "20220630\_0840*" --recursive
None of them really works; I'm still getting all the files whose names start with 20220630.
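The cause is that the AWS CLI applies the filters in order and includes every file by default, so a lone --include changes nothing; the underscore itself needs no escaping. Excluding everything first and then re-including the pattern should work (a sketch against the same placeholder bucket):
aws s3 cp s3://BUCKET/ LocalFolder --recursive --exclude "*" --include "20220630_0840*"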

Copying files with certain title or a title with the keyword 'log' from an S3 bucket to a folder in another bucket

I am trying to copy PDF files whose titles contain the keyword 'log'. I have the command below. What am I missing?
aws s3api copy-object --copy-source --key 'Log' --bucket
I finally managed to change things around and ended up with this, which worked (note the wildcards: a bare "log" would only match a key named exactly log):
aws s3 cp s3://source-bucket/ s3://destination-bucket/ --recursive --exclude "*" --include "*log*"
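Since the question is about PDF files specifically, the pattern can be narrowed further (assuming the keys end in .pdf; the patterns are case-sensitive, so 'Log' and 'log' are distinct):
aws s3 cp s3://source-bucket/ s3://destination-bucket/ --recursive --exclude "*" --include "*log*.pdf"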

Using s3cmd, how do I retrieve the newest folder by "Last Modified" date in an S3 directory?

I have a directory containing folders whose names are timestamps. I want to use s3cmd to find the folder with the most recent "Last Modified" value. If that is not possible, are the solutions to these previous questions the way to go?
looking for s3cmd download command for a certain date
Using S3cmd, how do I get the first and last file in a folder?
Can s3cmd do this natively, or do I have to retrieve all the folder names and sort through them?
Using the AWS Command-Line Interface (CLI), you can list the most recent file with:
aws s3api list-objects --bucket my-bucket-name --prefix folder1/folder2/ --query 'sort_by(Contents, &LastModified)[-1].Key' --output text
The first (oldest) object would be:
aws s3api list-objects --bucket my-bucket-name --prefix folder1/folder2/ --query 'sort_by(Contents, &LastModified)[0].Key' --output text
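Since the question asks for the newest folder rather than the newest file, and the folder names are timestamps, sorting the folder names themselves is enough; a sketch, assuming the names sort chronologically:
aws s3api list-objects --bucket my-bucket-name --prefix folder1/ --delimiter / --query 'sort(CommonPrefixes[].Prefix)[-1]' --output text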

How to upload a directory to an AWS S3 bucket along with a KMS ID through the CLI?

I want to upload a directory (a folder consisting of other folders and .txt files) to a folder (prefix) in a specific S3 bucket, along with a given KMS ID, via the CLI. I found the following command, which uploads a jar file to an S3 bucket:
aws s3 sync /?? s3://???-??-dev-us-east-2-813426848798/build/tmp/snapshot --sse aws:kms --sse-kms-key-id alias/nbs/dev/data --delete --region us-east-2 --exclude "*" --include "*.?????"
Suppose:
Location (bucket name with folder name) - "s3://abc-app-us-east-2-12345678/tmp"
KMS ID - https://us-east-2.console.aws.amazon.com/kms/home?region=us-east-2#/kms/keys/aa11-123aa-45/
Directory to be uploaded - myDirectory
I want to know whether the same command can be used to upload a directory with a bunch of files and folders in it, and if so, how the command should be changed.
The cp command works this way:
aws s3 cp ./localFolder s3://awsexamplebucket/abc --recursive --sse aws:kms --sse-kms-key-id a1b2c3d4-e5f6-7890-g1h2-123456789abc
I haven't tried the sync command with KMS, but the basic usage of sync is:
aws s3 sync ./localFolder s3://awsexamplebucket/remotefolder
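sync accepts the same encryption flags as cp (the jar-upload command in the question already uses them), so combining the two should cover the directory upload; a sketch reusing the example bucket and key ID from the cp answer:
aws s3 sync ./myDirectory s3://awsexamplebucket/abc --sse aws:kms --sse-kms-key-id a1b2c3d4-e5f6-7890-g1h2-123456789abc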

How to update ACL for all S3 objects in a folder with AWS CLI?

As part of an automated process in CodeBuild, I want to update the Access Control List for all files in a given folder (or, more specifically, all objects with a given prefix). How do I do it in a single line of bash?
The following one-liner works perfectly:
aws s3api list-objects --bucket "$BUCKET_NAME" --prefix "$FOLDER_NAME" --query "Contents[].[Key]" --output text | while read line; do aws s3api put-object-acl --acl public-read --bucket "$BUCKET_NAME" --key "$line"; done
You can use aws s3 cp with the --grants option (Permission=Grantee_Type=Grantee_ID; the grant below is the documented public-read example):
aws s3 cp s3://mybucket/mydir s3://mybucket/mydir --recursive --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers
Reference: https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html
List all objects and modify the ACL of each with put-object-acl:
acl=public-read
# list every key in the bucket; awk takes the 4th column, so this assumes key names without spaces
aws s3 ls s3://$bucket --recursive --endpoint-url=$endpoint | awk '{print $4}' > tos-objects.txt
cat tos-objects.txt | while read object
do
    echo -e "set acl of \033[31m $object \033[0m as $acl"
    aws s3api put-object-acl --bucket $bucket --key $object --acl $acl --endpoint-url=$endpoint
done