I was given a Redis server that is set up remotely.
I can access the data in it and perform CRUD operations against that server.
But I want a replica of the same database on my local machine.
I have Redis Desktop Manager set up locally, and a local redis-server running as well.
Things I have tried:
Using the SAVE command
I connected to the remote server and executed the SAVE command. It ran successfully and created a dump.rdb file on that server, but I can't access that file because I don't have FTP permission on the server.
Using BGSAVE
Same scenario here as well.
Using the redis-cli command
redis-cli -h <server-ip> -p 6379 save > \\<local-ip>\dump.rdb
Here I got the error: The network name cannot be found.
Can anyone please suggest how I can copy the .rdb file from the server to my local machine?
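For what it's worth, the redirect above only captures redis-cli's text reply; the RDB file itself is always written on the server's disk. One option that avoids needing file access on the server at all: newer redis-cli versions can pull the dump over the replication protocol with the --rdb option. A rough sketch (server IP and local path are placeholders):
redis-cli -h <server-ip> -p 6379 --rdb /local/path/dump.rdb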
We have an in-house backup server (Ubuntu); the backup server calls numerous remote servers using rsync. In order to set this up for a new website I need to SSH into the remote server and add my key to the authorized_keys file. Once I can log in to the remote site via SSH from the backup server, the rsync is run manually to build the structure (no reason for this other than to confirm it works and to speed up the backup).
Today, however, I'm trying to add our newest website to the backup, but the rsync command gives a 255 error and fails to connect due to a connection refused issue.
To confirm:
The remote server is a Lightsail instance with a LAMP stack
We have multiple sites being backed up from Lightsail, and we use other servers too
Yes, I can SSH into the remote site from the backup server, so the key is correct and matches what's used in the rsync command
The rsync command is generated, copied and pasted, and has worked before
The .ssh folder on the remote is 0700, authorized_keys is 600, and the owner is bitnami
The .pem file is in the correct folder, /var/www/.ssh, on the backup server
The user I'm logged in as when I run this is www-data (for ssh and rsync)
The simplified rsync command is:
rsync -rLDvvvcs -e "ssh -i /var/www/.ssh/LightsailKey.pem -p 22 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" --progress --exclude-from '/path/to/exclude.txt' --delete --backup --backup-dir=/deleted_files/project-name/ --chmod=Du=rwx,Dgo=rx,Fu=rw,Fgo=r bitnami@{ip}:/home/bitnami/live/my-website/htdocs/ /mnt/incs/project-name/htdocs
The error from running this is:
ssh: connect to host {ip} port 22: Connection refused
rsync: connection unexpectedly closed (0 bytes received so far) [Receiver]
rsync error: unexplained error (code 255) at io.c(235) [Receiver=3.1.2]
[Receiver] _exit_cleanup(code=12, file=io.c, line=235): about to call exit(255)
What am I missing with this?
Thanks
Turned out it was a typo in the IP field in the database.
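As a side note for anyone hitting the same 255 error: "Connection refused" on port 22 generally means the wrong IP or port, or nothing listening there, rather than a key problem, so it's worth verifying plain connectivity before touching rsync. A quick check, using the same placeholders as the question:
# Is anything listening on port 22 at that address?
nc -zv {ip} 22
# Try the exact same key and user with verbose SSH, independent of rsync
ssh -vvv -i /var/www/.ssh/LightsailKey.pem bitnami@{ip} 'echo ok'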
After running fine for a while, I am getting a write error on my Redis instance:
(error) MISCONF Redis is configured to save RDB snapshots, but it is currently not able to persist on disk. Commands that may modify the data set are disabled, because this instance is configured to report errors during writes if RDB snapshotting fails (stop-writes-on-bgsave-error option). Please check the Redis logs for details about the RDB error.
In the log I see:
9948:C 22 Mar 20:49:32.241 # Failed opening the RDB file root (in server root dir /var/spool/cron) for saving: Read-only file system
However, my redis config file is /etc/redis/redis.conf as confirmed by:
redis-cli -p 6379 info | grep 'config_file'
config_file:/etc/redis/redis.conf
And there I have:
dir /mnt/data/redis
And indeed, there is a snapshot there.
But despite the above, Redis now thinks my data directory is:
redis-cli -p 6379 CONFIG GET dir
1) "dir"
2) "/var/spool/cron"
This corresponds to the error I was getting, as quoted above.
Can anyone tell me why/how my data directory is changing after redis starts, such that it is no longer what is specified in the config file?
So the answer is that the Redis server was hacked and the configuration changed, which turns out to be very easy to do. (I should point out that I had no reason to think it wasn't easy. I just assumed security by obscurity was sufficient in this case, which was wrong. No matter; this was just a playground, not any sort of production server.)
So don't open your Redis port to the world. Use security groups if you are on AWS to limit access to the machines that need it, or use AUTH (which is still not great, because then all clients need to know the single password, which also apparently gets sent in the clear), or have some middleware controlling access.
Hacking Redis is easy to do, can compromise your data, and can even enable unauthorized SSH access to your server. That's why you shouldn't leave it exposed.
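A minimal hardening sketch along those lines for redis.conf; the values are illustrative, not a complete checklist:
# Listen only on loopback (or a private interface), not 0.0.0.0
bind 127.0.0.1
# Require clients to authenticate before running commands
requirepass some-long-random-password
# Make it harder for an attacker to repoint dir/dbfilename via CONFIG SET
rename-command CONFIG ""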
I'm using rdb-tools to generate a JSON file from a Redis dump file. For example:
rdb --command json /opt/redis/data/master.rdb --db 8 > /opt/redis/data/latest.json
Is there any way I can generate the Redis JSON data file from a remote server? Something similar to this:
rdb --command json --db 8 --host myhost.com --port 6378 > /opt/redis/data/latest.json
Thanks
Not directly.
You first have to request that a dump be generated on the remote server (with a BGSAVE command). Beware that this is asynchronous, so you have to wait for the dump to complete by checking the output of the INFO command. Then download the file to your local machine (with sftp, scp, netcat, etc.), and finally you can run the rdb-tools script locally.
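A rough sketch of that sequence, assuming shell access to the remote box and a default dump location (host, user, and paths are placeholders):
redis-cli -h myhost.com -p 6378 BGSAVE
# poll until the background save has finished (value goes back to 0)
redis-cli -h myhost.com -p 6378 INFO persistence | grep rdb_bgsave_in_progress
scp user@myhost.com:/var/lib/redis/dump.rdb /opt/redis/data/master.rdb
rdb --command json /opt/redis/data/master.rdb --db 8 > /opt/redis/data/latest.json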
Another way to do it (provided you have the memory available on your client box) is to start a slave redis-server on the client. It will automatically generate and download a dump file from the master, which you can then use with rdb-tools locally.
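A hedged sketch of that replica approach, assuming the master accepts replication connections from your client (host and port are placeholders):
# Point a throwaway local instance at the remote master; it syncs a full RDB into --dir
redis-server --port 6380 --replicaof myhost.com 6378 --dir /opt/redis/data
# On older Redis versions the directive is --slaveof rather than --replicaof
rdb --command json /opt/redis/data/dump.rdb --db 8 > /opt/redis/data/latest.json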
I have managed to connect to a remote server through SSH tunneling. Now, how can I copy files from the remote server to my local computer? To be clear, I only want to copy from the remote server to my local computer.
I don't know how to write this command:
"scp file/I/want/to/copy localhost/home/folder"
Thanks a lot.
Example:
scp username@server:/home/username/file_name /home/local-username/file-name
check this:
http://www.garron.me/linux/scp-linux-mac-command-windows-copy-files-over-ssh.html
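Since the question mentions SSH tunneling: if a local forward to the remote's SSH port is already up, scp can ride it. A sketch assuming local port 2222 is forwarded to the remote's port 22 (port, user, and paths are placeholders):
# e.g. the tunnel was opened with: ssh -L 2222:remote-host:22 user@jump-host
scp -P 2222 username@localhost:/home/username/file_name /home/local-username/file-name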
scp -r (source)hostname:/(location of the file to be copied)/(file name) (Destination)hostname:/(location of the folder where the file should be copied to)
For example:
scp -r ram.desktop.overflow.com:/home/Desktop/Ram/abcd.txt rajesh.desktop.overflow.com:/home/documents/
Server setup (fake IPs)
utility - 1.1.1.1 - SSH access on public IP
database2 - 1.1.1.2 - SSH access on private IP from utility
On a semi-regular basis I need to do a mysqldump on database2 and pull that down to my local machine so I can debug our app with real data. My current process is as follows:
ssh into utility
ssh into database2
execute mysqldump command
exit from database2
scp dump file down to utility
exit utility
scp dump file down to local machine
Needless to say this is not optimal. Is there a quicker method, possibly via tunneling, that I could use given my setup?
You can:
create the dump via a cron job on database2
set up port forwarding on utility and connect directly to database2
Summary: you end up pulling the dump with a single scp command; a sketch follows below.
UPD:
If you can't use port forwarding or cron, you can grant utility access to the database and run mysqldump from there.
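For illustration, one way to make it a single command from the local machine is OpenSSH's ProxyJump to hop through utility; the user names, database name, and paths here are assumptions:
# Stream the dump from database2 through utility straight to a local file
ssh -J user@1.1.1.1 user@1.1.1.2 "mysqldump --single-transaction mydb" > dump.sql
# Alternatively, forward a local port to database2's SSH and scp an existing dump
ssh -N -L 2222:1.1.1.2:22 user@1.1.1.1 &
scp -P 2222 user@localhost:/tmp/dump.sql ./dump.sql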