Duplicity include with wildcards - backup

I want to back up all the home directories of my LXC containers with duplicity, and I use (command line simplified to the problem):
duplicity --include '/data/lxc/**/rootfs/home' --exclude '/data/lxc'
which does not match the homes, while
duplicity --include '/data/lxc/oneofthecontainers/rootfs/home' --exclude '/data/lxc'
works.
Further testing shows that
duplicity --include '/data/lxc/oneofthecontainers/rootfs/home/**' --exclude '/data/lxc'
does not work either. The manpage of duplicity tells me first match wins and * and ** are allowed as wildcards, where ** matches everything and * only one path component.

I still do not know why this does not work, but I solved it with --include-globbing-filelist and a filelist with "+ " and "- " entries for includes/excludes, and now it works.
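For reference, the working filelist approach looks roughly like this; the filelist path is arbitrary, and the duplicity invocation is abbreviated just as in the commands above:

```shell
# A globbing filelist uses "+ " for includes and "- " for excludes;
# as with the command-line options, the first matching line wins.
cat > /tmp/duplicity-filelist <<'EOF'
+ /data/lxc/**/rootfs/home
- /data/lxc
EOF
# duplicity --include-globbing-filelist /tmp/duplicity-filelist ...
```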

Related

How to copy files from S3 using include pattern with underscore on the file name

How can I include the _ (underscore) in the include pattern?
I have an S3 bucket with files in the following format:
20220630_084021_abc.json
20220630_084031_def.json
20220630_084051_ghi.json
20220630_084107_abc.json
20220630_084118_def.json
So, I would like to get all the files matching 20220630_0840*.
I've tried to fetch them using multiple variations of the include pattern; so far I have used the following:
aws s3 cp s3://BUCKET/ LocalFolder --include "20220630_0840*" --recursive
aws s3 cp s3://BUCKET/ LocalFolder --include "[20220630_0840]*" --recursive
aws s3 cp s3://BUCKET/ LocalFolder --include "20220630\_0840*" --recursive
None of them really works; I'm still getting all the files whose names start with 20220630.
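For what it's worth, the underscore itself should not need escaping; since every object is included by default, an --include on its own changes nothing, and the usual fix (assuming the bucket and folder names from the question) is to exclude everything first:

```shell
# Exclude everything, then re-include the desired prefix:
#
#   aws s3 cp s3://BUCKET/ LocalFolder --recursive \
#       --exclude "*" --include "20220630_0840*"
#
# The pattern is a plain shell-style glob in which a literal
# underscore matches itself, as a quick local check shows:
match() { case "$1" in 20220630_0840*) echo yes;; *) echo no;; esac; }
match 20220630_084021_abc.json   # prints "yes"
match 20220630_084118_def.json   # prints "no" (0841, not 0840)
```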

How to upload files matching a pattern with aws cli to s3

Team,
I need to upload all files matching the pattern console.X.log to S3, where X=0...any.
I tried the below and am getting an error.
aws s3 cp /var/log/console.* s3://test/dom0/
Unknown options: /var/log/console.70.log,s3://0722-maglev-avdc-provisions/dom0/
The AWS S3 CLI doesn't support regex, but there are --exclude and --include options for s3.
So you should be able to use:
aws s3 cp /var/log/ s3://test/dom0/ --recursive --exclude "*" --include "console.*"
Note the order of the exclude and include: if you switch them around, nothing will be uploaded. You can include more patterns by adding more --include options.
Currently, there is no support for the use of UNIX-style wildcards in a command's path arguments. However, most commands have --exclude "<value>" and --include "<value>" parameters that can achieve the desired result.
The following pattern symbols are supported.
*: Matches everything.
?: Matches any single character.
[sequence]: Matches any character in the sequence.
[!sequence]: Matches any character not in the sequence.
Any number of these parameters can be passed to a command. You can do this by providing an --exclude or --include argument multiple times, e.g. --include "*.txt" --include "*.png".
When there are multiple filters, the rule is the filters that appear later in the command take precedence over filters that appear earlier in the command.
For example, if the filter parameters passed to the command were --exclude "*" --include "*.txt", all files would be excluded from the command except for files ending with .txt. However, if the order of the filters were changed to --include "*.txt" --exclude "*", all files would be excluded from the command.
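The later-wins rule can be sketched with a small shell function that mimics it (s3_filter here is a hypothetical helper for illustration, not part of the CLI):

```shell
# Every file starts out included; each filter whose pattern matches
# flips the state, so the last matching filter decides the outcome.
# Usage: s3_filter FILE ACTION PATTERN [ACTION PATTERN ...]
s3_filter() {
  file=$1; shift
  state=include
  while [ $# -ge 2 ]; do
    action=$1; pattern=$2; shift 2
    case "$file" in
      $pattern) state=$action ;;
    esac
  done
  echo "$state"
}
s3_filter report.txt exclude "*" include "*.txt"   # prints "include"
s3_filter report.txt include "*.txt" exclude "*"   # prints "exclude"
```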
This is a simple example to upload your source code to S3 and exclude the git files: aws s3 cp /tmp/foo s3://bucket/ --recursive --exclude ".git/*"
Source

Files will not move or copy from folder on file system to local bucket

I am using the command
aws s3 mv --recursive Folder s3://bucket/dsFiles/
The AWS console is not giving me any feedback. I changed the permissions of the directory:
sudo chmod -R 666 ds000007_R2.0.1/
It looks like AWS is passing over those files and giving "File does not exist" for every directory.
I am confused about why AWS is not actually performing the copy. Is there some size limitation or recursion-depth limitation?
I believe you want cp, not mv. Try the following:
aws s3 cp $local/folder s3://your/bucket --recursive --include "*"
Source, my answer here.

How to AND OR aws s3 copy statements with include

I'm copying files between S3 buckets from specific dates that are not sequential.
In the example I'm copying from the 23rd, but I'd also like to copy the 15th and the 19th.
aws s3 --region eu-central-1 --profile LOCALPROFILE cp
s3://SRC
s3://DEST --recursive
--exclude "*" --include "2016-01-23"
This source mentions using sequences for include: http://docs.aws.amazon.com/cli/latest/reference/s3/
It appears that you are asking how to copy multiple files/paths in one command.
The AWS Command-Line Interface (CLI) allows multiple --include specifications, e.g.:
aws s3 cp s3://SRC s3://DEST --recursive --exclude "*" --include "2016-01-15/*" --include "2016-01-19/*" --include "2016-01-23/*"
The first --exclude says to exclude all files, then the subsequent --include parameters add paths to be included in the copy.
See: Use of Exclude and Include Filters in the documentation.

Backup multiple folders with duplicity (including/excluding)

I would like to backup the following folders with duplicity
/home
/etc
/usr/local
/root
/var
/boot
and exclude
/var/tmp
/var/run
/var/lock
/home/*/.thumbnails
/home/*/.cache
/home/*/.local/share/Trash
/root/.thumbnails
/root/.cache
/root/.local/share/Trash
I already learned that I have to specify one source directory to save and that I can adjust that with include and exclude options.
So, I could give / as the source directory, exclude ** (which would sum up to nothing), and include the folders that I want to save.
Source / and --exclude / would give an empty set; --include ... beats the exclude and adds the folders. But then I will not be able to exclude the folders I want to exclude, right? Or am I missing something?
I've found out that the include/exclude options get "stronger" the further left they appear in the command.
In my case, the includes, the excludes, and the source would look like this: --exclude /var/tmp --exclude /var/run --exclude /var/lock --exclude /home/*/.thumbnails --exclude /home/*/.cache --exclude /home/*/.local/share/Trash --exclude /root/.thumbnails --exclude /root/.cache --exclude /root/.local/share/Trash --include /home --include /etc --include /usr/local --include /root --include /var --include /boot --exclude '**' /
(With added newlines:)
--exclude /var/tmp
--exclude /var/run
--exclude /var/lock
--exclude /home/*/.thumbnails
--exclude /home/*/.cache
--exclude /home/*/.local/share/Trash
--exclude /root/.thumbnails
--exclude /root/.cache
--exclude /root/.local/share/Trash
--include /home
--include /etc
--include /usr/local
--include /root
--include /var
--include /boot
--exclude '**'
/
To complete the answer of #Kurtibert: you need to add ** at the end of each directory you include to be sure the files inside are included (and don't forget the quotes):
--exclude '/var/tmp'
--exclude '/var/run'
--exclude '/var/lock'
--exclude '/home/*/.thumbnails'
--exclude '/home/*/.cache'
--exclude '/home/*/.local/share/Trash'
--exclude '/root/.thumbnails'
--exclude '/root/.cache'
--exclude '/root/.local/share/Trash'
--include '/home/**'
--include '/etc/**'
--include '/usr/local/**'
--include '/root/**'
--include '/var/**'
--include '/boot/**'
--exclude '**'
/