How to copy files from S3 using include pattern with underscore on the file name - amazon-s3

How can I include the _ (underscore) on the include pattern?
I have an S3 bucket with files in the following format:
20220630_084021_abc.json
20220630_084031_def.json
20220630_084051_ghi.json
20220630_084107_abc.json
20220630_084118_def.json
So, I would like to get all the files that start with 20220630_0840*.
I've tried to fetch them using multiple variations of the include pattern; so far I have tried the following:
aws s3 cp s3://BUCKET/ LocalFolder --include "20220630_0840*" --recursive
aws s3 cp s3://BUCKET/ LocalFolder --include "[20220630_0840]*" --recursive
aws s3 cp s3://BUCKET/ LocalFolder --include "20220630\_0840*" --recursive
None of them works as intended; I'm still getting all the files whose names start with 20220630.

Related

Copying files with certain title or a title with the keyword 'log' from an S3 bucket to a folder in another bucket

I am trying to copy PDF files that contain the keyword 'log' in their titles. I have the command below. What am I missing?
aws s3api copy-object --copy-source --key 'Log' --bucket
I finally managed to change things around and ended up with this, which worked:
aws s3 cp s3://source-bucket/ s3://destination-bucket/ --recursive --exclude "*" --include "log"
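A caveat on that pattern (my assumption, not from the original answer): --include "log" matches only an object whose key is exactly "log"; to catch the keyword anywhere in a PDF name, wildcards are needed on both sides. The difference can be checked locally with the same glob syntax (the filename below is hypothetical):

```shell
# Assumption, not from the original answer: a keyword match likely needs
#   aws s3 cp s3://source-bucket/ s3://destination-bucket/ --recursive \
#     --exclude "*" --include "*log*.pdf"
# match PATTERN FILE -- prints yes/no depending on whether FILE matches:
match() {
  case $2 in
    $1) echo yes ;;
    *)  echo no ;;
  esac
}
match 'log'        server-log-2023.pdf    # -> no
match '*log*.pdf'  server-log-2023.pdf    # -> yes
```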

How to upload a directory to an AWS S3 bucket along with a KMS ID through the CLI?

I want to upload a directory (a folder consisting of other folders and .txt files) to a folder (partition) in a specific S3 bucket, along with a given KMS id, via the CLI. The command I found uploads a jar file to an S3 bucket:
aws s3 sync /?? s3://???-??-dev-us-east-2-813426848798/build/tmp/snapshot --sse aws:kms --sse-kms-key-id alias/nbs/dev/data --delete --region us-east-2 --exclude "*" --include "*.?????"
Suppose:
Location (Bucket Name with folder name) - "s3://abc-app-us-east-2-12345678/tmp"
KMS-id - https://us-east-2.console.aws.amazon.com/kms/home?region=us-east-2#/kms/keys/aa11-123aa-45/
Directory to be uploaded - myDirectory
And I want to know:
Can the same command be used to upload a directory with a bunch of files and folders in it?
If so, how should this command be changed?
The cp command works this way:
aws s3 cp ./localFolder s3://awsexamplebucket/abc --recursive --sse aws:kms --sse-kms-key-id a1b2c3d4-e5f6-7890-g1h2-123456789abc
I haven't tried the sync command with KMS, but the way you use sync is:
aws s3 sync ./localFolder s3://awsexamplebucket/remotefolder
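Putting the question's pieces together, a sketch (untested against a real bucket): --sse-kms-key-id expects a key id, alias, or ARN, not the console URL, so the id is the last path segment of the URL given in the question:

```shell
# Extract the key id from the console URL quoted in the question;
# that id, not the URL, is what --sse-kms-key-id takes:
console_url="https://us-east-2.console.aws.amazon.com/kms/home?region=us-east-2#/kms/keys/aa11-123aa-45/"
key_id=$(printf '%s' "$console_url" | sed 's|.*/keys/||; s|/$||')
echo "$key_id"
# Sketch only -- combining sync with the SSE flags from the cp example:
#   aws s3 sync myDirectory s3://abc-app-us-east-2-12345678/tmp \
#     --sse aws:kms --sse-kms-key-id "$key_id" --region us-east-2
```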

How to upload files matching a pattern with aws cli to s3

Team,
I need to upload all files matching the pattern console.X.log to S3, where X = 0...any number.
I tried the below and am getting an error.
aws s3 cp /var/log/console.* s3://test/dom0/
Unknown options: /var/log/console.70.log,s3://0722-maglev-avdc-provisions/dom0/
The AWS S3 CLI doesn't support regex, but there are --exclude and --include filters for s3 commands.
So you should be able to use:
aws s3 cp /var/log/ s3://test/dom0/ --recursive --exclude "*" --include "console.*"
Note the order of the exclude and include: if you switch them around, nothing will be uploaded. You can include more patterns by adding more includes.
Currently, there is no support for the use of UNIX-style wildcards in a command's path arguments. However, most commands have --exclude "<value>" and --include "<value>" parameters that can achieve the desired result.
The following pattern symbols are supported.
*: Matches everything.
?: Matches any single character.
[sequence]: Matches any character in the sequence.
[!sequence]: Matches any character not in the sequence.
Any number of these parameters can be passed to a command. You can do this by providing an --exclude or --include argument multiple times, e.g. --include "*.txt" --include "*.png".
When there are multiple filters, the rule is the filters that appear later in the command take precedence over filters that appear earlier in the command.
For example, if the filter parameters passed to the command were --exclude "*" --include "*.txt", all files will be excluded from the command except for files ending with .txt. However, if the order of the filter parameters is changed to --include "*.txt" --exclude "*", all files will be excluded from the command.
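That last-match-wins rule can be simulated locally (a sketch using shell globbing, which supports the same pattern symbols listed above):

```shell
# decide FILE FILTER...  -- each FILTER is "exclude:PATTERN" or
# "include:PATTERN". Every file starts as included, and the LAST filter
# whose pattern matches the file wins, mirroring the CLI rule.
decide() {
  file=$1; shift
  state=include
  for filter in "$@"; do
    action=${filter%%:*}
    pattern=${filter#*:}
    case $file in
      $pattern) state=$action ;;
    esac
  done
  echo "$state"
}
decide report.txt exclude:'*' include:'*.txt'   # -> include
decide image.png  exclude:'*' include:'*.txt'   # -> exclude
decide report.txt include:'*.txt' exclude:'*'   # -> exclude (order reversed)
```

Swapping the order makes the trailing --exclude "*" override every earlier include, which is why the reversed command copies nothing.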
This is a simple example to upload your source code to S3 and exclude the git files: aws s3 cp /tmp/foo s3://bucket/ --recursive --exclude ".git/*"

S3 : Download Multiple Files in local git-bash console

I have multiple files in an S3 bucket, like:
file1.txt
file2.txt
file3.txt
another-file1.txt
another-file2.txt
another-file3.txt
Now, I want to download the first 3 files, whose names start with "file". How can I download them from AWS S3 in a local git-bash console?
You can simply download them with the below command:
aws s3 cp --recursive s3://bucket-name/ /local-destination-folder/ --exclude "*" --include "file*"

How to AND OR aws s3 copy statements with include

I'm copying files between S3 buckets from specific dates that are not in sequence.
In the example I'm copying from the 23rd, but I'd like to copy the 15th, 19th, and 23rd.
aws s3 --region eu-central-1 --profile LOCALPROFILE cp s3://SRC s3://DEST --recursive --exclude "*" --include "2016-01-23"
This source mentions using sequences for include: http://docs.aws.amazon.com/cli/latest/reference/s3/
It appears that you are asking how to copy multiple files/paths in one command.
The AWS Command-Line Interface (CLI) allows multiple --include specifications, eg:
aws s3 cp s3://SRC s3://DEST --recursive --exclude "*" --include "2016-01-15/*" --include "2016-01-19/*" --include "2016-01-23/*"
The first --exclude says to exclude all files, then the subsequent --include parameters add paths to be included in the copy.
See: Use of Exclude and Include Filters in the documentation.
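As a local sanity check (an illustration with made-up keys, using the same glob syntax as the CLI filters), only keys under the three dated prefixes survive the combined filters:

```shell
# Simulate --exclude "*" plus the three --include "YYYY-MM-DD/*" patterns;
# a key is kept if any of the include prefixes matches it:
keep=""
for key in 2016-01-15/a.log 2016-01-19/b.log 2016-01-21/c.log 2016-01-23/d.log; do
  case $key in
    2016-01-15/*|2016-01-19/*|2016-01-23/*) keep="$keep $key" ;;
  esac
done
echo "$keep"
```

The 2016-01-21 key is dropped, matching what the three-include copy command would transfer.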