How to gzip a folder under a symlink - gzip

I'm trying to gzip all subdirectories and files of a folder. The peculiarity is that what I compress is a symbolic link to the latest release of my site:
filename=$(date '+%Y%m%d')
cd /home/site
tar -zcvf $filename.tar.gz current/
scp $filename.tar.gz server:~/backups/production
rm $filename.tar.gz
When the operation ends and I open the compressed archive, I see only the symlink of the folder, not its content. What am I doing wrong?

This is expected behavior. You need to specify the -h flag when creating the archive if you want to dereference symlinks. From the tar manual:
Normally, when tar archives a symbolic link, it writes a block to the
archive naming the target of the link. In that way, the tar archive is
a faithful record of the file system contents. When --dereference
(-h) is used with --create (-c), tar archives the files symbolic
links point to, instead of the links themselves.
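Applied to the script in the question, that just means adding -h (or --dereference) when creating the archive; a sketch using the same paths and placeholder hostname:
filename=$(date '+%Y%m%d')
cd /home/site
# -h makes tar follow the "current" symlink and store the release it points to
tar -zcvhf $filename.tar.gz current/
scp $filename.tar.gz server:~/backups/production
rm $filename.tar.gz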

Related

Comparing checksums of tarball archive with original directory

I'm wondering how to verify the checksum of a tarball backup with the original directory after creation.
Is it possible to do so without extracting it for example if it's a large 20GB backup?
Example, a directory with two files:
mkdir test &&
echo "one" > test/one.txt &&
echo "two" > test/two.txt
Get checksum of directory:
find test/ -type f -print0 | sort -z | xargs -0 shasum | shasum
Resulting checksum of directory content:
d191c793cacc4bec1f070eb96fa68524cca566f8 -
Create tarball:
tar -czf test.tar.gz test/
The checksum of the directory content stays constant.
But when I create the archive and take the checksum of the archive, I notice that the results vary. Why is that?
How would I go about getting the checksum of the tarball content to compare to the directory content checksum?
Or what's a better solution to check that the archive contains all the necessary content from the original directory (without extracting it if it's large)?
Your directory checksum is calculating the SHA-1 of each file's contents. You would need to read and decompress the entire tar archive to do the same calculation. That doesn't mean you'd need to save the contents of the archive anywhere. You'd just need to read it sequentially into memory, and do the calculation there.
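A sketch of that idea, assuming GNU tar (whose --to-command option runs the given string through a shell once per extracted file, with the member name exported as TAR_FILENAME), so every file is hashed in memory and nothing is written to disk:
# hash each archive member without extracting it to disk
tar -xzf test.tar.gz --to-command='shasum | sed "s|-$|$TAR_FILENAME|"' | sort -k2 > archive.sums
# hash the original directory the same way
find test/ -type f -print0 | sort -z | xargs -0 shasum | sort -k2 > dir.sums
diff archive.sums dir.sums && echo "archive matches directory"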

How to delete all files with lftp except for cgi-bin and .ftpquota

I'm setting up a new CI/CD pipeline on GitLab. For the deployment I have to run npm run build and then copy the dist/ folder to the web server via FTP (with lftp). To ensure a clean deployment, the script should first remove all files on the web server except the cgi-bin folder and the .ftpquota file, and then copy the new files.
I've searched the web and haven't found a suitable solution. With the --delete flag, lftp deletes all files.
That's my script so far:
- lftp -c "set ftp:ssl-allow no; open -u $USERNAME,$PASSWORD $HOST; mirror -Rnev dist/ ./ --ignore-time --delete --parallel=10 --exclude-glob .git* --exclude .git/"
My current script removes all files, but I want it to remove everything except the cgi-bin folder and the .ftpquota file.
As described on unix.stackexchange.com, you should add the -x option, which excludes matching paths from the mirror and therefore from the --delete pass. Please check it.
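A sketch of what that could look like in the pipeline, assuming lftp's -x (exclude by regular expression) also keeps the matched paths out of the --delete pass; test the exact patterns against a non-critical directory first:
- lftp -c "set ftp:ssl-allow no; open -u $USERNAME,$PASSWORD $HOST; mirror -Rnev dist/ ./ --ignore-time --delete --parallel=10 --exclude-glob .git* --exclude .git/ -x ^cgi-bin/ -x ^\.ftpquota$"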

Packaging directory with cpack for rpm

I am trying to create an RPM package containing a directory with a lot of files, using CMake.
http://www.rpm.org/max-rpm/s1-rpm-inside-files-list.html
To make this situation a bit easier, if the %files list contains a path to a directory, RPM will automatically package every file in that directory, as well as every file in each subdirectory. Shell-style globbing can also be used in the %files list.
So with cmake I am using the following command:
INSTALL(DIRECTORY my_dir DESTINATION foo)
and I end up with a spec file with all the files (30k lines) instead of something like
%files
my_dir
Did I miss something in my cmake/cpack command, or is there no other way to do it?
(Using a tar archive and extracting it is not suitable.)

Downloading .j2k or .png files using wget: if-else condition

I am downloading a folder consisting of either .j2k or .png files using wget.
Now, while downloading the folder, if the user requests a .j2k file and no .j2k file exists in that folder, then the .png file should be downloaded by default.
i.e. I want to download the .j2k if present, otherwise download the .png file.
I have used this:
wget -d any.com -i /folder -r -l 1 -nc -A j2k,png
-d: download from this domain
-i: download from this folder
-r: recursive
-l 1: follow only 1 link deep
-nc: no clobber = download only if the file doesn't exist
-A: accept/download only *.j2k and *.png files
But using this, it downloads both the .j2k and the .png files.
Any help will be appreciated.
Referred Links:
wget if else download condition
wget manual
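There is no built-in if/else in wget itself, but the fallback can be scripted around it; a rough sketch with a placeholder URL, assuming the same recursive options as above:
url="http://example.com/folder/"   # placeholder for the real folder URL
# first pass: accept only .j2k files
wget -r -l 1 -nc -A j2k "$url"
# second pass: fall back to .png only if no .j2k was downloaded
if ! find . -name '*.j2k' | grep -q .; then
    wget -r -l 1 -nc -A png "$url"
fi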

Is it possible to make SCP ignore symbolic links during copy?

I need to reinstall one of our servers, and as a precaution, I want to move /home, /etc, /opt, and /Services to a backup server.
However, I have a problem: because of the many symbolic links, a lot of files are copied multiple times.
Is it possible to make scp ignore the symbolic links (or rather, copy a link as a link, not as a directory or file)? If not, is there another way to do it?
I knew that it was possible; I just picked the wrong tool. I did it with rsync:
rsync --progress -avhe ssh /usr/local/ XXX.XXX.XXX.XXX:/BackUp/usr/local/
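If the goal is to drop the symlinks entirely instead of carrying them over as links, the -l implied by -a can be negated (assuming an rsync that supports the --no-OPTION form); a sketch with the same placeholder paths:
# -a implies -l (copy symlinks as symlinks); --no-links skips them altogether
rsync --progress -avh --no-links -e ssh /usr/local/ XXX.XXX.XXX.XXX:/BackUp/usr/local/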
I found that the rsync method did not work for me; however, I found an alternative that did work on this website (www.docstore.mik.ua/orelly).
Specifically section 7.5.3 of "O'Reilly: SSH: The Secure Shell. The Definitive Guide".
7.5.3. Recursive Copy of Directories
...
Although scp can copy directories, it isn't necessarily the best method. If your directory contains hard links or soft links, they won't be duplicated. Links are copied as plain files (the link targets), and worse, circular directory links cause scp1 to loop indefinitely. (scp2 detects symbolic links and copies their targets instead.) Other types of special files, such as named pipes, also aren't copied correctly. A better solution is to use tar, which handles special files correctly, and send it to the remote machine to be untarred, via SSH:
$ tar cf - /usr/local/bin | ssh server.example.com tar xf -
Using tar over ssh as both sender and receiver does the trick as well:
cd $DEST_DIR
ssh user@remote-host 'cd $REMOTE_SRC_DIR; tar cf - ./' | tar xvf -
One solution is to use a shell pipe. I have a situation where I have some *.gz files and symbolic links generated by some software that point to the same *.gz files with slightly shorter names. If I simply use scp, the symbolic links are copied as regular files, resulting in duplicates. I know rsync can ignore symbolic links, but my gz files are not compressed with rsyncable options, and rsync is very slow at copying them. So I simply use the following script to copy the files over:
find . -type f -exec scp {} target_host:/directory/name/data \;
The -type f test will only match regular files and ignore symbolic links. You need to run this command on the source host. Hope this may help someone in my situation. Let me know if I missed anything.
A one-liner that can be executed on the client to copy a folder from the server using tar + ssh:
ssh user@<Server IP/link> 'mkdir -p <Remote destination directory>; cd <Remote destination directory>; tar cf - ./' | tar xf - -C <Source destination directory>
Note: the mkdir is a must; if the remote destination directory is not present, the command will simply archive the entire home directory of the remote server and extract it on the client.