Can't unzip a file gzipped on another server - gzip

I have a main and a backup server (both running Ubuntu Server 12), and after gzipping a file on the main server it cannot be decompressed on the backup one: gunzip says the file is not in gzip format. Both servers have the same gzip version (1.4).
If I scp the gzipped file back to the main server, it decompresses there without problems, so the cause is not in the file transfer.
The reverse direction also works: if I gzip a file on the backup server, it can be decompressed on the main one.
The question is: how do I pin down the cause of the error? Verbose mode does not reveal anything.
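One way to narrow it down would be to rule out silent corruption and check what the backup server actually received; a rough sketch (the file name is a placeholder):
md5sum archive.gz          # run on both servers; the checksums should match
file archive.gz            # should report "gzip compressed data"
gzip -tv archive.gz        # integrity test; prints an error message if the archive is rejected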

Related

Cannot find how the previous team configured Redis to store dump.rdb in a different directory; it is not being stored in /var/lib/redis

I am quite familiar with Redis config files, and I am also aware that by default Redis stores dump.rdb under /var/lib/redis.
I have taken over an app where the previous team installed Redis in /opt/app/. I can see a dump.rdb file in /var/lib/redis, but nothing is being written there and its timestamp is two years old. So Redis is storing dump.rdb in a different location, but I cannot find that location specified in the redis.conf file. Is there any other file where the dump.rdb location could be specified, telling Redis to store dump.rdb in a specific place?
You can use config get dir and config get dbfilename to get the path and filename of the current RDB file. You can also use config set dir xxx and config set dbfilename xxx to change them dynamically.
You can also use the info server command to get the path of the config file your Redis instance loaded (check the config_file item).
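From a shell, that could look roughly like this (a sketch; host, port and the new directory are placeholders):
redis-cli -h 127.0.0.1 -p 6379 config get dir
redis-cli -h 127.0.0.1 -p 6379 config get dbfilename
redis-cli -h 127.0.0.1 -p 6379 config set dir /opt/app/redis-data
redis-cli -h 127.0.0.1 -p 6379 info server | grep config_file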

How to disable NFS client caching?

I have a problem with NFS client file caching. The client can still read a file that was removed from the server many minutes earlier.
My two servers are both CentOS 6.5 (kernel: 2.6.32-431.el6.x86_64)
I'm using server A as the NFS server, /etc/exports is written as:
/path/folder 192.168.1.20(rw,no_root_squash,no_all_squash,sync)
And server B is used as the client, the mount options are:
nfsstat -m
/mnt/folder from 192.168.1.7:/path/folder
Flags: rw,sync,relatime,vers=4,rsize=1048576,wsize=1048576,namlen=255,acregmin=0,acregmax=0,acdirmin=0,acdirmax=0,hard,noac,nosharecache,proto=tcp,port=0,timeo=600,retrans=2,sec=sys,clientaddr=192.168.1.20,minorversion=0,lookupcache=none,local_lock=none,addr=192.168.1.7
As you can see, the "lookupcache=none" and "noac" options are already in place to disable caching, but they don't seem to work...
I did the following steps:
Create a simple text file on server A
Print the file from server B with cat, and it's there
Remove the file on server A
Wait a couple of minutes and print the file from server B again, and it's still there!
But if I run "ls" on server B at that point, the file is not in the listing. This inconsistent state can last a few minutes.
I think I've checked all the NFS mount options... but I can't find a solution.
Are there any other options I've missed? Or is the issue perhaps not about NFS at all?
Any ideas would be appreciated :)
I have tested the same steps you described with the parameters below, and it works correctly. I added one more option, "fg", on the client-side mount.
sudo mount -t nfs -o fg,noac,lookupcache=none XXX.XX.XX.XX:/var/shared/ /mnt/nfs/fuse-shared/
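After remounting with these options, you could check that they actually took effect by re-running the command from the question:
nfsstat -m
# the entry for /mnt/nfs/fuse-shared should now list noac and lookupcache=none among its flags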

Using wget to download a ZIP file

I'm having trouble using wget on my Debian 7.0 VPS hosted by OVH.
I'm trying to download a ZIP file from MediaFire. Connected via SSH, I typed:
wget http://download1472.mediafire.com/5ndlsskkyfmg/dgx7zbbdbxawbwd/Vhalar-GGJ16.zip
This is the output:
--2016-03-07 20:17:52-- http://download1472.mediafire.com/5ndlsskkyfmg/dgx7zbbdbxawbwd/Vhalar-GGJ16.zip
Resolving download1472.mediafire.com (download1472.mediafire.com)... 205.196.123.160
Connecting to download1472.mediafire.com (download1472.mediafire.com)|205.196.123.160|:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: http://www.mediafire.com/?dgx7zbbdbxawbwd [following]
--2016-03-07 20:17:52-- http://www.mediafire.com/?dgx7zbbdbxawbwd
Resolving www.mediafire.com (www.mediafire.com)... 205.196.120.6, 205.196.120.8
Connecting to www.mediafire.com (www.mediafire.com)|205.196.120.6|:80... connected.
HTTP request sent, awaiting response... 301
Location: /download/dgx7zbbdbxawbwd/Vhalar-GGJ16.zip [following]
--2016-03-07 20:17:52-- http://www.mediafire.com/download/dgx7zbbdbxawbwd/Vhalar-GGJ16.zip
Connecting to www.mediafire.com (www.mediafire.com)|205.196.120.6|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `Vhalar-GGJ16.zip'
[ <=> ] 94,265 440K/s in 0.2s
2016-03-07 20:17:52 (440 KB/s) - `Vhalar-GGJ16.zip' saved [94265]
The download took less than a second, yet it's supposed to be a 280 MB zip file. It also reports "440 KB/s", and that math just doesn't add up.
I'm confused as to why I can't download this zip file directly to my server via SSH, instead of downloading it to my computer and then re-uploading it to the server.
Does anyone see a flaw in my command?
What you're doing when you use wget on that URL is just downloading the HTML page that the zip file sits on. You can see this if you redo the command so that it saves an HTML file, like so:
wget http://download1472.mediafire.com/5ndlsskkyfmg/dgx7zbbdbxawbwd/Vhalar-GGJ16.html
and open it in the web browser of your choice: you'll get the HTML page for that link, with the MediaFire download button on it.
This happens because MediaFire wants you to verify that you're human with a captcha before you can download the file. Try completing the captcha and then issuing the command:
wget http://download1472.mediafire.com/gxnd316uacsg/dgx7zbbdbxawbwd/Vhalar-GGJ16.zip
It will work.
If you have not completed the captcha on whatever machine you're downloading from, you need to do so first. Once you finish it and click "Authorize Download", you'll have free rein to wget the file from the server.
If all else fails, download it on your own computer and use the scp command to transfer it over.
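For that scp fallback, a minimal sketch (user, host and destination path are placeholders):
scp Vhalar-GGJ16.zip user@your-vps:/path/to/destination/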
Look at the contents of the 94 kB file you downloaded in something like vi. Odds are it's not a zip file but an HTML page, telling you what went wrong and what you need to do to download the file.
A browser would have detected this (the MIME type would tell it that it is being served HTML, and it would display the page rather than download it).
This is likely a measure by MediaFire to prevent automated downloads of their files. Spoofing the User-Agent header might help, but it's unlikely.
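A quick way to confirm what was actually saved, reusing the file name from the question:
file Vhalar-GGJ16.zip      # a real archive would report "Zip archive data", not "HTML document"
head -c 300 Vhalar-GGJ16.zip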
One tip for anyone using wget to download a file whose URL has a query string after the file name (e.g. ?This_is_a_query_string_sample_&_123545_&), i.e. the URL is of the form:
http://download1472.mediafire.com/5ndlsskkyfmg/dgx7zbbdbxawbwd/Vhalar-GGJ16.html?This_is_a_query_string_sample_&_123545_&
In this case, always put the URL in double quotes when calling wget (since & has a special meaning in the shell):
wget "http://download1472.mediafire.com/5ndlsskkyfmg/dgx7zbbdbxawbwd/Vhalar-GGJ16.html?This_is_a_query_string_sample_&_123545_&"
You can also download the zip file under a new name using the wget option -O:
wget -O new_name_for_the_file.zip <url-address.zip>

Generate Redis dump file or JSON file from remote Redis server

I'm using rdb-tools to generate a JSON file from a Redis dump file. For example:
rdb --command json /opt/redis/data/master.rdb --db 8 > /opt/redis/data/latest.json
Is there any way I can generate the Redis JSON data file from a remote server? Something similar to this:
rdb --command json --db 8 --host myhost.com --port 6378 > /opt/redis/data/latest.json
Thanks
Not directly.
You first have to ask the remote server to generate a dump (with a BGSAVE command). Beware that this is asynchronous, so you have to wait for the dump to complete by checking the output of the INFO command. Then download the file to your local machine (with sftp, scp, netcat, etc.), and finally run the rdb-tools script locally, as sketched below.
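Put together, that could look roughly like this (a sketch; host, port, user and paths are placeholders, and it assumes the remote dump lands in the default /var/lib/redis):
redis-cli -h myhost.com -p 6378 bgsave
redis-cli -h myhost.com -p 6378 info persistence | grep rdb_bgsave_in_progress    # repeat until this reads 0
scp user@myhost.com:/var/lib/redis/dump.rdb /opt/redis/data/master.rdb
rdb --command json /opt/redis/data/master.rdb --db 8 > /opt/redis/data/latest.json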
Another way to do it (provided you have enough memory available on your client box) is to start a slave redis-server on the client. It will automatically generate and download a dump file from the master, which you can then feed to rdb-tools locally.
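A sketch of that replica approach (port and paths are placeholders):
redis-server --port 6380 --dir /opt/redis/data --dbfilename master.rdb --slaveof myhost.com 6378
# once the initial sync finishes, shut it down and run rdb-tools against /opt/redis/data/master.rdb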

Transferring large number of files from one server to another

I am moving servers, and our images directory has around 15,000 images (2 GB in total) that need to be moved to the new server.
The images will have a different path on the new server, so I can't just migrate the whole cPanel account.
Any easy way to resolve this issue?
Thanks for help in advance.
Use SFTP from one server to the other.
http://support.cs.utah.edu/index.php?option=com_content&view=article&id=33&Itemid=59
Edit: in answer to your question below:
You don't need a local download; sftp transfers directly from server to server. Use sftp from an SSH shell on the source server like so:
$ cd source_directory
$ sftp user@otherserver
Password:
Connected to otherserver
sftp> cd target_directory   # this changes the directory on the remote server, not the local one
sftp> put filename
or, for all files from the source directory to the target directory:
sftp> put *
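The same transfer can also be scripted rather than typed interactively, e.g. (a sketch; user, host and directories are placeholders, and it assumes sftp will read its commands from standard input):
cd source_directory
sftp user@otherserver <<'EOF'
cd target_directory
put *
EOF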