Assuming we want to automate the process of creating RDB files (and don't want to use a Redis server for this purpose), what options are available?
The current process involves importing (with redis-cli) a set of RESP files into a Redis server and then saving an RDB file to disk (all of this in a stateless Redis container, where the RDB file is not persistent and is difficult to access automatically). The imported dictionaries are too large for automated ingestion via a remote Redis Python client, so we have to import from files.
If the question's restrictions are relaxed somewhat to not running a local redis-server (as opposed to having no dependency on any Redis server), it becomes possible to save (or, more precisely, download) a remote Redis server's database to a local (client-side) RDB file by connecting from a local client (redis-cli) to a remote redis-server instance (as pointed out by Itamar Haber in a comment to this answer), like this:
redis-cli -h <REMOTE_URL> -p <REMOTE_PORT> --rdb <LOCAL_PATH>/dump.rdb
It is equally possible to use redis-cli to first ingest the data from local RESP files into a remote Redis server (in order to later re-export the data to a local RDB file, as described above).
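The full round trip might look like the following sketch, using redis-cli's pipe mode for the ingestion step (the host name, port, and file names are placeholders):

```shell
# Ingest local RESP files into the remote server using redis-cli pipe mode
cat data.resp | redis-cli -h redis.example.com -p 6379 --pipe

# Then download the remote database as a local (client-side) RDB file
redis-cli -h redis.example.com -p 6379 --rdb ./dump.rdb
```

Pipe mode streams the raw RESP protocol to the server, which is why it scales to dictionaries that are too large for a regular client library.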
Related
I have 2 different servers, each running a Redis instance on the same port, and both are using the same appendonly.aof file, which is stored on a path shared by both servers.
If I add some keys using one machine, they are not reflected on the other machine; if I restart the other server, all the changes are reflected.
Is there any way, or any configuration change, so that whenever changes are made by one machine they are reflected on the other immediately?
I am new to using Redis and I am playing around with it a little. I have noticed that after a little time, let's say 10 minutes, all the keys that I inserted just go away.
I just did the default installation shown in the documentation. I didn't configure anything in redis.conf. Is there any configuration that I need to do so my data can persist?
Environment
Redis Server
Redis server v=6.2.6 sha=00000000:0 malloc=jemalloc-5.1.0 bits=64 build=557672d61c1e18ba
Redis-cli
redis-cli 6.2.6
Ubuntu 18.08 VM.
I have also been using redisInsight to insert the keys.
There are two mechanisms for persisting the data to disk:
snapshotting
append-only file (AOF)
If you want to use snapshotting, you need to add the following settings to the redis.conf file:
dir ./path_for_saving_snapshot
dbfilename "name_of_snapshot.rdb"
save 60 1000
With this configuration, Redis will dump the data to disk every 60 seconds if at least 1,000 keys changed in that period.
If you want to use AOF, you need to add the following settings to the redis.conf file:
appendonly yes
appendfilename "your_aof_file.aof"
appendfsync everysec
everysec is the default fsync policy; you also have other options (always and no).
You can configure your Redis instance to use either of the two mechanisms or a combination of both.
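If you would rather not edit redis.conf and restart, the same settings can also be applied to a running instance with CONFIG SET; a sketch (the values mirror the file-based configuration above):

```shell
# Enable the AOF and set its fsync policy at runtime
redis-cli CONFIG SET appendonly yes
redis-cli CONFIG SET appendfsync everysec

# Inspect the current snapshot schedule
redis-cli CONFIG GET save

# Optionally write the runtime changes back to redis.conf
redis-cli CONFIG REWRITE
```

Note that CONFIG REWRITE only works if the server was started with a configuration file; otherwise the runtime changes are lost on restart.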
Is there any reliable backup & restore method or tool for a Redis DB?
I've been googling around for a couple of hours and found nothing beyond copying the dump file /var/lib/redis/dump.rdb, or some scripts that use MIGRATE (I don't know if that even counts as a backup).
OK, let's say there is a Redis DB (a big one) running on the Windows version of Redis:
github.com/MicrosoftArchive/redis
We need a copy of this DB in another branch of the company that uses the final Linux version of Redis, because the Windows version is outdated and its performance is not as good as the Linux version's.
All keys and values are encrypted and stored in binary format in Redis, so is there any reliable backup & restore for a Redis DB?
There are solutions around that help you automate this process.
Making a Redis backup is quite simple though:
Review / update your Redis configuration.
Create a snapshot for your Redis data (known as a "dump"/"rdb file").
Save the RDB file to a remote location.
To create the snapshot (2) you'll need to use the redis-cli SAVE or BGSAVE command.
You'll then need to create a little script (you can find one here) to transfer your .rdb file to remote storage (like AWS S3).
You can then automate all of that using CRON.
Of course, you can now use services like SimpleBackups to get all of that done for you.
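The steps above can be sketched as a small script; the RDB path and the S3 bucket name are placeholders, and the upload step assumes the AWS CLI is installed:

```shell
#!/usr/bin/env bash
set -euo pipefail

# 1. Trigger a background snapshot and wait until it completes.
#    LASTSAVE returns the Unix timestamp of the last successful save,
#    so we wait for it to change after issuing BGSAVE.
before=$(redis-cli LASTSAVE)
redis-cli BGSAVE
until [ "$(redis-cli LASTSAVE)" != "$before" ]; do
  sleep 1
done

# 2. Copy the fresh RDB file to remote storage
#    (paths and bucket name are placeholders)
backup="/tmp/dump-$(date +%F).rdb"
cp /var/lib/redis/dump.rdb "$backup"
aws s3 cp "$backup" s3://my-backup-bucket/redis/
```

A CRON entry such as `0 2 * * * /usr/local/bin/redis-backup.sh` would then run it nightly.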
I'm trying to create a Redis cluster using an RDB file taken from a single-instance Redis server. Here is what I've tried:
#! /usr/bin/env bash
for i in 6000 6001 6002 6003
do
redis-server --port $i --cluster-config-file "node-$i.cconf" --cluster-enabled yes --dir "$(pwd)" --dbfilename dump.rdb &
done
That script starts up 4 Redis processes that are cluster enabled. It also initializes each node with the dump file.
Then I run redis-trib.rb so that the 4 nodes can find each other:
redis-trib.rb create 127.0.0.1:6000 127.0.0.1:6001 127.0.0.1:6002 127.0.0.1:6003
I get the following error:
>>> Creating cluster
[ERR] Node 127.0.0.1:6060 is not empty. Either the node already knows other nodes (check with CLUSTER NODES) or contains some key in database 0.
I've also tried a variant where only the first node/process is initialized with the RDB file and the others are empty. I can join the 3 empty nodes into a cluster but not the one that's pre-populated with data.
What is the correct way to import a pre-existing RDB file into a Redis cluster?
In case this is an "X-Y problem" this is why I'm doing this:
I'm working on a migration from a single-instance Redis server to an Elasticache Redis Cluster (cluster mode enabled). Elasticache can easily import an RDB file on cluster startup if you upload it to S3. But it takes a long time for an Elasticache cluster to start. To reduce the feedback loop as I test my migration code, I'd like to be able to start a cluster locally also.
I've also tried using the create-cluster utility, but that doesn't appear to have any options to pre-populate the cluster with data.
How to disable Save for some DBs and allow it for the others in Redis
You cannot. An RDB snapshot is a single file that contains the data of all DBs.
You can send a FLUSHDB to the DBs you do not want to restore after the RDB is loaded.
If you use a dedicated Redis process for each DB, you can configure each one differently with a dedicated redis.conf file, and the SAVE and BGSAVE commands will only create a snapshot of the Redis process they were issued on.
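As a sketch of the FLUSHDB approach (the database indexes here are arbitrary examples, not from the question):

```shell
# After the server has finished loading the RDB file, wipe the
# databases you did not want restored (indexes 2 and 3 are examples)
redis-cli -n 2 FLUSHDB
redis-cli -n 3 FLUSHDB
```

The `-n` flag selects the logical database the command runs against, so only those databases are emptied while the rest keep their restored data.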