My situation is that I only have execute permission from a certain folder.
Let's say I would like to back up an entire folder, excluding some folders and files listed in exclude.txt.
Here is the path I would like to back up:
/pdf/data/pdfnew/2014
And I only have permission to execute from this folder (main):
/pdf/data/pdfnew/2014/public/main
I put exclude.txt in the same folder from which I can execute the command (main).
I execute this command in the main folder:
tar -cjvf -X exclude.txt 2014.tar.bz2 /pdf/data/pdfnew/2014
The result is that it still includes the folders I don't want to back up.
Is there a correct way of doing this?
Do you have a user/home directory on that server? You should, so place exclude.txt in your home directory on that server and run the command like this from that directory:
tar -cjvf ~/2014.tar.bz2 -X ~/exclude.txt /pdf/data/pdfnew/2014
The ~/ is shorthand for your user/home directory, so in this case it explicitly states, "Read exclude.txt from the home directory and write 2014.tar.bz2 to the home directory."
But you also ask this:
Is there a correct way of doing this?
There is never one canonical "best" way of doing something like this; it all depends on your end goal. Nothing more, nothing less. That said, if I were you I would instead do it like this, using the -C option:
tar -cjvf ~/2014.tar.bz2 -X ~/exclude.txt -C /pdf/data/pdfnew/ 2014
The uppercase -C option tells tar to internally change its working directory to /pdf/data/pdfnew/, so you can create an archive of 2014 without retaining the whole directory tree in the backup. I find this easier to work with, because many times I want to back up the contents of a directory but have no need to retain the parent structure. That way the archive is more like a traditional ZIP archive, which I find easier to understand and work with.
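For illustration, a minimal sketch (the patterns shown for exclude.txt are assumptions, since the original file isn't given; tar matches them against member names relative to the -C directory, so they start with "2014/"):

# Hypothetical exclude.txt contents, one pattern per line:
#   2014/public/main
#   2014/tmp
tar -cjvf ~/2014.tar.bz2 -X ~/exclude.txt -C /pdf/data/pdfnew/ 2014
# Verify what was stored; the excluded paths should not appear:
tar -tjf ~/2014.tar.bz2 | head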
When I run Nextflow, I get a .nextflow folder, but I can't find a way to change its location (i.e. it isn't -work-dir). How can I change the location of the .nextflow folder?
I have looked at launchDir, but it seems that is a read-only implicit variable and cannot be overwritten from the CLI; also, the --launchDir option is only valid for the k8s scope (see the original chat on Gitter).
I'm using Nextflow 20.10.0 build 5430.
Keeping things neat and tidy is admirable. From this comment, it looks like the only way (without doing crazy things...) is to change to the directory where you want your .nextflow cache directory to live and point all the other options (i.e. -work-dir, -log, etc.) to a separate directory:
If you want .nextflow in dir A and the pipeline work dir in B:
cd A
nextflow run -w B
The .nextflow has to be in the launching directory to properly maintain the history of the executions.
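Concretely, that works out to something like the following sketch (the paths and pipeline name are hypothetical):

# Launch from /data/A so the .nextflow cache is created there,
# and point the log and work directory at /data/B instead.
cd /data/A
nextflow -log /data/B/nextflow.log run my-pipeline -w /data/B/work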
I need to rsync files from a bucket to a local machine every day, and the bucket contains 20k files. I need to download only the changed files that end with *some_naming_convention.csv.
What's the best way to do that? Using a wildcard in the download source gave me an error.
I don't think you can do that with rsync. As Christopher told you, you can skip files by using the "-x" flag, but not sync just those [1]. I created a public Feature Request on your behalf [2] for you to follow updates there.
As I say in the FR, IMHO this doesn't follow the purpose of rsync, which is to keep folders/buckets synchronized; synchronizing only some of them doesn't fall within that purpose.
There is a possible "workaround" by using gsutil cp to copy files and -n to skip the ones that already exist. The whole command for your case should be:
gsutil -m cp -n gs://<bucket>/*some_naming_convention.csv <directory>
Another option, maybe a little more far-fetched, is to copy/move those files to a separate folder and then use that folder for the rsync.
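That two-step workaround might look like this sketch (the csv-only prefix is a hypothetical name):

# Stage the matching files under a dedicated prefix, then rsync that prefix.
gsutil -m cp gs://<bucket>/*some_naming_convention.csv gs://<bucket>/csv-only/
gsutil -m rsync -r gs://<bucket>/csv-only/ <directory>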
I hope this works for you ;)
Original Answer
From here, you can do something like gsutil rsync -r -x '^(?!.*\.json$).*' gs://mybucket mydir to rsync all json files. The key is the ?! prefix to the pattern you actually want.
Edit
The -x flag excludes a pattern. The pattern ^(?!.*\.json$).* uses a negative look-ahead to match names that do not end in .json. It follows that the gsutil rsync call will fetch all files which end in .json.
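Adapted to the naming convention in the question, that would be something like this sketch (bucket and directory are placeholders):

# Exclude everything that does NOT end in some_naming_convention.csv:
gsutil -m rsync -r -x '^(?!.*some_naming_convention\.csv$).*' gs://<bucket> <directory>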
Rsync lets you include and exclude files matching patterns.
For each file, rsync applies the first pattern that matches, so if you want to sync only selected files then you need to include those, and then exclude everything else.
Add the following to your rsync options:
--include='*some_naming_convention.csv' --exclude='*'
That's enough if all your files are in one directory. If you also want to search subfolders then you need a little bit more:
--include='*/' --include='*some_naming_convention.csv' --exclude='*'
This will replicate the whole directory tree, but only copy the files you want. If that leaves empty directories you don't want, then add --prune-empty-dirs.
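Put together, a full invocation might look like this sketch (the source and destination paths are hypothetical):

# Copy only the matching CSVs, recreating directories as needed
# and dropping any that end up empty:
rsync -av --prune-empty-dirs \
  --include='*/' --include='*some_naming_convention.csv' --exclude='*' \
  /path/to/source/ /path/to/destination/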
I'm trying to run a command like this to back up my public directory:
tar -zcvf backups/2014-09-09-public_html.tar.gz public_html -C home/path
My understanding is that this is supposed to create an archive with the compressed contents of my public directory. But I'm getting the following errors:
tar (child): backups/2014-09-09-public_html.tar.gz: Cannot open: No such file or directory
tar (child): Error is not recoverable: exiting now
tar: Child returned status 2
This has me confused because I want it to create the file, not open it. I've used similar commands in other projects without problems, so I'm not sure what the problem could be here. What could I be doing wrong?
When you create a file, you actually open it in write mode, even if it does not yet exist (read man 3 open to further understand how this works on *nix).
Are you sure you have write permissions for the destination where you're trying to compress? Can you touch(1) that location?
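A quick way to check both possibilities (a sketch; it assumes the message comes from a missing or unwritable backups directory, which is what that tar error usually indicates when creating an archive):

# Does the destination directory exist, and is it writable?
ls -ld backups && touch backups/.write-test && rm backups/.write-test
# If it doesn't exist, create it before running tar:
mkdir -p backups
tar -zcvf backups/2014-09-09-public_html.tar.gz -C /home/path public_html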
Well, I finally got it working by putting -C /home/path at the beginning and using the absolute path for the archive name instead of the relative path:
tar -C /home/path -zcvf /home/path/backups/2014-09-09-public_html.tar.gz public_html
I still don't understand why I need to specify the absolute path for the archive name since I've always used the relative path in other projects. I guess maybe it's a difference in how this server is set up or something?
I have 10 directories in an AccuRev depot and don't want to populate one directory using the "accurev pop" command. Is there any way? .acignore does not suit my requirements because in another Jenkins build I need that folder. I just want to save time by avoiding unnecessary populating of directories.
Any idea?
Thanks,
Sanjiv
I would create a stream off this stream and exclude the directories you don't want. Then you can pop this stream and only get the directories you want.
When you run the AccuRev populate command, you can choose which directories to populate by naming them:
accurev pop -O -R thisDirectory
will cause the contents of thisDirectory to match the backing stream from the time of the last AccuRev update in that workspace.
The -O is for overwrite and the -R is for recurse. If you have active work in progress, the -O will cause that work to be overwritten/destroyed.
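So for ten directories where one should be skipped, a simple sketch (the directory names are hypothetical) would be:

# Populate every directory except the one to skip (dir5 here),
# reusing the pop form shown above:
for d in dir1 dir2 dir3 dir4 dir6 dir7 dir8 dir9 dir10; do
  accurev pop -O -R "$d"
done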
The .acignore is only for (external) files and not those that are being managed by AccuRev.
David Howland
In the diagnostics section in Textpattern, it's giving me the error:
"File directory path is not writable:...html/textpattern/files" (took out beginning of path)
I changed the permissions for the textpattern folder and the folder named "files" (which is in the root folder, not in the textpattern folder), but it's still giving the error. Do I need to change permissions for all enclosed items of the textpattern folder and not just the folder itself?
Maybe I got you wrong, but I suppose you simply have to change the path to the files folder in your admin panel from "…html/textpattern/files" to "…/html/files".
Assuming you're on a *nix system...
It sounds like you want to change the permissions recursively.
A quick fix might be to change the permissions like so:
chmod -R 777 html/textpattern
This command will go through every folder and file and change its permissions (the -R flag turns on recursion).
Warning: this is very broad and not a good idea for production.
A better approach would be to change the permissions at a finer level of granularity. Google for "Linux file permissions" or type man chown at the shell.
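As an illustration of that finer-grained approach, a sketch (www-data is an assumption for the web server user; it varies by distro and web server):

# Give ownership of just the files directory to the web server user,
# then grant the owner read/write and keep directories traversable:
chown -R www-data:www-data html/files
chmod -R u+rwX html/files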