I'm looking for an INSERT-script counterpart in Redis, where I want to set multiple keys at once:
SET foo bar
SET sun moon
SET fire water
...
What would a Lua script for the above look like? I couldn't find much help online.
For a Lua script, I would do something like this:
for i=1, #KEYS do
redis.call("SET", KEYS[i], ARGV[i])
end
In execution, it would look like this:
EVAL 'for i=1, #KEYS do redis.call("SET", KEYS[i], ARGV[i]) end' 2 key1 key2 val1 val2
Note that #KEYS is not dynamically calculated by Redis; it simply reflects the explicitly passed numkeys argument (2 in the example above).
Additional validation could be added as necessary—asserting equal numbers of keys and args, for example—but I would strongly encourage doing most of that sanity checking client-side for performance.
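For illustration, a minimal in-script version of that check could look like this (a sketch, if you do want it server-side):
assert(#KEYS == #ARGV, "number of keys must match number of values")
for i=1, #KEYS do
    redis.call("SET", KEYS[i], ARGV[i])
end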
If not using Lua, Redis natively has the MSET command to set multiple keys at once.
https://redis.io/commands/mset
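For the example above, that would be a single command:
MSET foo bar sun moon fire water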
ch_files = Channel.fromPath("myfiles/*.csv")
ch_parameters = Channel.from(['A', 'B', 'C', 'D'])
ch_samplesize = Channel.from([4, 16, 128])
process makeGrid {
input:
path input_file from ch_files
each parameter from ch_parameters
each samplesize from ch_samplesize
output:
tuple path(input_file), val(parameter), val(samplesize), path("config_file.ini") into settings_grid
"""
echo "parameter=$parameter;sampleSize=$samplesize" > config_file.ini
"""
}
gives me a number_of_files * 4 * 3 grid of settings files, so I can run some script for each combination of parameters and input files.
How do I add some ID to each line of this grid? A row ID would be OK, but I would prefer a unique 6-character alphanumeric code without any "meaning", since the order in the table doesn't matter. I could extract the last part of the working folder, which is seemingly unique per process, but I don't think it's ideal to rely on sed and $PWD for this, and I didn't see it exposed as a runtime metadata variable (plus it's a bit long, but OK). In a former setup I had a job ID from the LSF cluster system for this purpose, but I want this to be portable.
Combinations are not guaranteed to be unique (e.g. having parameter 'A' twice in the input channel should be valid).
To be clear, I would like this output
file1.csv A 4 pathto/config.ini 1ac5r
file1.csv A 16 pathto/config.ini 7zfge
file1.csv A 128 pathto/config.ini ztgg4
file2.csv A 4 pathto/config.ini 123js
etc.
Given the input declaration, which uses the each qualifier as an input repeater, it will be difficult to append some unique id to the grid without some refactoring to use either the combine or cross operators. If the inputs are just files or simple values (like in your example code), refactoring doesn't make much sense.
To get a unique code, the simple options are:
As you mentioned, there's unfortunately no way to access the unique task hash without some hack to parse $PWD. It might, however, be possible to use Bash parameter expansion to avoid sed/awk/cut (assuming Bash is your shell, of course); you could try using: "${PWD##*/}"
You might instead prefer using ${task.index}, which is a unique index within the same workflow run. Although the task index is not guaranteed to be unique across executions, it should be sufficient in most cases. It can also be formatted, for example:
process example {
...
script:
def idx = String.format("%06d", task.index)
"""
echo "${idx}"
"""
}
Alternatively, create your own UUID. You might be able to take the first N characters but this will of course decrease the likelihood of the IDs being unique (not that there was any guarantee of that anyway). This might not really matter though for a small finite set of inputs:
process example {
...
script:
def uuid = UUID.randomUUID().toString()
"""
echo "${uuid}"
echo "${uuid.take(6)}"
echo "${uuid.takeBefore('-')}"
"""
}
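To get the code into the output channel itself, you could wire one of these into your example. Below is an untested sketch based on your DSL1 code (note that uuid is declared in the script block without def, which makes it visible to the output declaration):
process makeGrid {
    input:
    path input_file from ch_files
    each parameter from ch_parameters
    each samplesize from ch_samplesize

    output:
    tuple path(input_file), val(parameter), val(samplesize), path("config_file.ini"), val(uuid) into settings_grid

    script:
    uuid = UUID.randomUUID().toString().take(6)
    """
    echo "parameter=$parameter;sampleSize=$samplesize" > config_file.ini
    """
}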
I have a lot of analytics data that I'm adding to redis. I plan on incrementally moving the data out of redis and into my database.
I know I can use KEYS [the_key]:* to get all keys that match. For example, I can do that to get the following:
127.0.0.1:6379> KEYS c_Track:*
1) "c_Track:6c93a5c1-77e9-4c4a-9232-bf182713a02e"
2) "c_Track:2c9d99c2-af37-4de9-ac64-b48f339e97a9"
3) "c_Track:9e7fd190-86d9-4b4a-9a70-7bf4c7768eef"
4) "c_Track:7f2d2e98-7440-4fd7-a80a-2af309ab15a4"
Is there a recommended way to get these values easily? I can get the keys, but how can I get all the values as well? I can loop through the keys to get the values, but is there some one-shot method for doing this?
Also I know I shouldn't use keys, but this is just an example. Thanks
Also I know I shouldn't use keys
So don't. Use SCAN instead.
is there some one-shot method for doing this?
No, not as a core Redis command, but given the need this is fairly simple to achieve with a server-side Lua script. For example, assuming that your values are strings, you could do something like the following:
local cursor = tonumber(ARGV[1])
local pattern = ARGV[2]
-- fetch one page of matching key names
local scan = redis.call('SCAN', cursor, 'MATCH', pattern)
-- replace each key name with a { name, value } pair
for i, v in ipairs(scan[2]) do
    local val = redis.call('GET', v)
    scan[2][i] = { v, val }
end
-- scan[1] holds the cursor for the next call
return scan
Assuming that this script is saved under "scan.lua", you can run it as follows:
$ redis-cli SET foo bar
OK
$ redis-cli SET baz qaz
OK
$ redis-cli --eval scan.lua , 0 "*"
1) "0"
2) 1) 1) "baz"
2) "qaz"
2) 1) "foo"
2) "bar"
To scan your entire keyspace, call the script with the returned cursor until it returns 0.
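For example, a rough driver loop in the shell might look like this (a sketch: it extracts the cursor from the first line of redis-cli's human-readable output, which is brittle, so a proper client library is a better fit for anything serious):
cursor=0
while : ; do
    reply=$(redis-cli --eval scan.lua , "$cursor" "c_Track:*")
    echo "$reply"
    # the first reply line looks like: 1) "17"
    cursor=$(echo "$reply" | head -n 1 | awk '{print $2}' | tr -d '"')
    [ "$cursor" = "0" ] && break
done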
Notes:
1) If your keys are of different types, you should change the script accordingly (e.g. https://github.com/itamarhaber/redis-lua-scripts/blob/master/scanfetch.lua).
2) While this script goes against the common recommendation of generating key names inside a script, it is still safe to run as SCAN returns keys that are in the server's keyspace (whether single-instance or clustered).
I have seen results from one command being passed to another command in Redis,
and via the command line, this works well:
src/redis-cli keys '*' | xargs src/redis-cli mget
However, how can we achieve the same effect via Lettuce? (I started trying out 4.0.2.Final.)
Also, a solution to this is particularly important in the following scenario:
Say we are using the geolocation capabilities, and we add a set of locations for "my-location-category" using GEOADD:
GEOADD "category-1" 8.6638775 49.5282537 "location-id:1" 8.3796281 48.9978127 "location-id:2" 8.665351 49.553302 "location-id:3"
Next, say we do a GEORADIUS query to get the locations within a 10 km radius of 8.6582361 49.5285495 for "category-1".
This returns "location-id:1" & "location-id:3".
Given that I have already set values for the keys "location-id:1" & "location-id:3",
I want to pipe commands to do the GEORADIUS as well as an MGET on all the matching results.
Does Redis provide a feature to do that?
And/or how can we achieve this via the Lettuce client library, without first manually iterating through the results of GEORADIUS and then doing a manual MGET?
That would be more efficient for the program that uses it.
Does anyone know how we can do this?
Update
This is the piped command for the scenario I discussed above:
src/redis-cli GEORADIUS "category-1" 8.6582361 49.5285495 10 km | xargs src/redis-cli mget
Now we need to know how to do this via Lettuce.
IMPORTANT: never use KEYS, always use SCAN instead if you must.
This isn't really a question about Lettuce or Java, so I can actually answer it :)
What you're trying to do is use the results from a read operation (GEORADIUS) as input (key names) for another read operation (MGET). This type of flow can't be pipelined, well, just because of that: pipelining means that you don't need the answers to operations right away, but in your case you do.
However.
Since you're reading String keys with MGET, you might as well just denormalize everything (remember, we're NoSQL) and store the contents of these keys in the Sorted Set's members, e.g.:
GEOADD "category-1" 8.6638775 49.5282537 "location-id:1:moredata:evenmoredata:{maybe a JSON document here}:orperhapsmsgpack"
This will allow you to get the locations and their "data" with one GEORADIUS call. Of course, any updates to location-id:1's data will need to be done across all categories.
A note about Lua scripts: while a Lua script could definitely save on the back and forth in this case, any such script will be against best practices/not cluster safe.
After digging around and studying Lua scripting, my conclusion is that removing round-trips in this way can only be done via Lua scripts, as suggested by Itamar Haber.
I ended up creating a Lua script file (myscript.lua) as below:
-- find all locations within a 10 km radius (hard-coded for this POC)
local locationKeys = redis.call('GEORADIUS', 'category-1', '8.6582361', '49.5285495', '10', 'km')
if #locationKeys == 0 then
    return nil
else
    -- fetch the values of all matching location keys in one round-trip
    return redis.call('MGET', unpack(locationKeys))
end
** of course we should be sending in parameters to this... this is just a poc :)
Now you can execute it via the command:
src/redis-cli EVAL "$(cat myscript.lua)" 0
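As an aside, since the POC hard-codes its parameters, a parameterized version might look like the following sketch (the geo key is passed via KEYS, the query values via ARGV):
local locationKeys = redis.call('GEORADIUS', KEYS[1], ARGV[1], ARGV[2], ARGV[3], ARGV[4])
if #locationKeys == 0 then
    return nil
end
return redis.call('MGET', unpack(locationKeys))
which would then be invoked with numkeys set to 1:
src/redis-cli EVAL "$(cat myscript.lua)" 1 category-1 8.6582361 49.5285495 10 km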
Then, to reduce the network overhead of sending the entire script to Redis for each execution, we have the option of registering the script with Redis.
Redis will give us back a SHA1 digest for the script, which can be used to reference it in future calls.
This can be done as below:
src/redis-cli SCRIPT LOAD "$(cat myscript.lua)"
this should give back a SHA1 code, something like this: 49730aa2ed3034ee48f818e486b2bdf1b500b19e
Subsequent calls can then be made using this code, e.g.:
src/redis-cli evalsha 49730aa2ed3034ee48f818e486b2bdf1b500b19e 0
The sad part here, however, is that the SHA1 digest is remembered only as long as the Redis instance is running. If it is restarted, the SHA1 digest is lost and you have to do the SCRIPT LOAD once again. If nothing in the script has changed, the resulting SHA1 code will be the same.
Ideally, when using it through a client API, we should first attempt EVALSHA; if that returns a "NOSCRIPT No matching script" error, then as a fallback do a SCRIPT LOAD, procure the SHA1 code once again, keep an internal map of it, and use that SHA1 code for further calls.
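The error being handled there looks like this (shown via redis-cli, with a made-up digest):
$ redis-cli EVALSHA 0123456789abcdef0123456789abcdef01234567 0
(error) NOSCRIPT No matching script. Please use EVAL.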
This can well be done via Lettuce; I could find the methods for it. Hope this gives a good insight into a solution for the problem.
My redis collection contains many keys
I want to be able to flush them all except all the keys that start with:
"configurations::"
Is this possible?
You can do this:
redis-cli KEYS "*" | grep -v "configurations::" | xargs redis-cli DEL
This lists all the keys in Redis, removes the ones containing "configurations::" from the list, and deletes the remaining keys from Redis.
Edit
As @Sergio Tulentsev noticed, KEYS is not for use in production. I used this Python script to remove keys on a production Redis. I stopped replication from master to slave before calling the script.
#!/usr/bin/env python
import redis
import time

pattern = "yourpattern*"

poolSlave = redis.ConnectionPool(host='yourslavehost', port=6379, db=0)
redisSlave = redis.Redis(connection_pool=poolSlave)
poolMaster = redis.ConnectionPool(host='yourmasterhost', port=6379, db=0)
redisMaster = redis.Redis(connection_pool=poolMaster)

# SCAN for matching keys on the slave, delete each one on the master
cursor = 0
while True:
    cursor, data = redisSlave.scan(cursor, match=pattern, count=1000)
    print("cursor: " + str(cursor))
    for key in data:
        redisMaster.delete(key)
        print("delete key: " + str(key))
    if cursor == 0:
        break
    # reduce calls per second on the production server
    time.sleep(1)
The SCAN & DEL approach (as proposed by @khanou) is the best ad-hoc solution. Alternatively, you could keep an index of all your configurations:: key names in a Redis Set (simply SADD the key's name to it whenever you create a new configurations:: key). Once you have this Set, you can SSCAN it to get all the relevant key names more efficiently (don't forget to SREM from it whenever you DEL, though).
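A quick sketch of that index in action, using a hypothetical configurations::db key and configurations::index as the index Set:
127.0.0.1:6379> SADD configurations::index configurations::db
(integer) 1
127.0.0.1:6379> SSCAN configurations::index 0
1) "0"
2) 1) "configurations::db"
127.0.0.1:6379> SREM configurations::index configurations::db
(integer) 1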
Yes, it's possible. Enumerate all the keys, evaluate each one and delete if it fits the criteria for deletion.
There is no built-in redis command for this, if this is what you were asking.
It might be possible to cook up a Lua script that will do this (and it'll look to your app like it's a single command), but it's still the same approach under the hood.
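For illustration, here's a rough sketch of such a script. It assumes the protected prefix is passed as ARGV[1] and the cursor as ARGV[2]; it processes one SCAN page per call, so you would invoke it repeatedly with the returned cursor until it comes back as "0". Note that generating key names inside a script like this is not cluster-safe:
local prefix = ARGV[1]
local scan = redis.call('SCAN', ARGV[2], 'COUNT', 1000)
for _, key in ipairs(scan[2]) do
    -- delete every key that does not start with the protected prefix
    if string.sub(key, 1, #prefix) ~= prefix then
        redis.call('DEL', key)
    end
end
-- scan[1] is the cursor for the next call
return scan[1]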
I'm pretty new to Redis, so I'm not entirely sure what's possible. However, I was wondering, if I have a set of key names:
SADD set-1 key-1 key-2
Can I use those as an argument to another command, like DEL, without having to do a round trip?
Something like:
DEL (SMEMBERS set-1)
Not without scripting. You'll have to make the round trip.
eval "redis.call('del', unpack(redis.call('smembers', ARGV[1])))" 0 set-1
or, if you expect a lot of keys in your set (this form avoids unpack's argument limit and also handles an empty set, where the first version would fail because DEL requires at least one argument):
eval "for _,k in ipairs(redis.call('smembers', ARGV[1])) do redis.call('del', k) end" 0 set-1