I'm trying to iterate through a Redis stream using XRANGE. The Redis documentation states that I need to prefix my last ID with ( to make the range exclusive. Quoting the documentation:
In order to continue the iteration with the next two items, I have to pick the last ID returned, that is 1519073279157-0 and add the prefix ( to it. The resulting exclusive range interval, that is (1519073279157-0 in this case, can now be used as the new start argument for the next XRANGE call:
But if I do that I get an error. Below are two commands: one without the exclusive prefix, and one with it, which produces an error:
redis:6379> XRANGE unittest 1612384862718-0 +
1) 1) "1612384862718-0"
2) 1) "x"
2) "42"
2) 1) "1612384862780-0"
2) 1) "x"
2) "43"
3) 1) "1612384862888-0"
2) 1) "x"
2) "44"
redis:6379> XRANGE unittest (1612384862718-0 +
(error) ERR Invalid stream ID specified as stream command argument
The example in the Redis documentation looks the same to me:
> XRANGE mystream (1519073279157-0 + COUNT 2
1) 1) 1519073280281-0
   2) 1) "foo"
      2) "value_3"
2) 1) 1519073281432-0
   2) 1) "foo"
      2) "value_4"
redis_version:6.0.6
Support for exclusive (open) range intervals was added in Redis 6.2 - see https://github.com/redis/redis/pull/8072. Your 6.0.6 server does not have it yet, hence the error.
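On Redis 6.2 and later, the exact command from the question works; with the sample data above it would return something like:
redis:6379> XRANGE unittest (1612384862718-0 +
1) 1) "1612384862780-0"
   2) 1) "x"
      2) "43"
2) 1) "1612384862888-0"
   2) 1) "x"
      2) "44"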
For earlier versions, the recommended approach is to have the client code do the following (a minimal Python sketch follows this list):
After calling the first XRANGE/XREVRANGE, take the last ID
Parse the last ID for timestamp and sequence (both 64-bit integers)
If doing XRANGE, try to increment the sequence. For XREVRANGE decrement it.
If the sequence overflows/underflows, perform the same arithmetic on the timestamp and init the sequence (MAXINT for XREVRANGE, 0 for XRANGE)
Handle the "0-0" and "MAXINT-MAXJNT" cases
Use the new ID in the next call to the query and repeat
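Here is a minimal client-side sketch of that workaround (it assumes redis-py and the stream unittest from the question, and only covers the forward XRANGE direction):
import redis

MAX_SEQ = 2**64 - 1  # stream sequence numbers are unsigned 64-bit integers

def next_id(last_id):
    # Return the smallest stream ID strictly greater than last_id ("<ms>-<seq>").
    ms, seq = (int(part) for part in last_id.split("-"))
    if seq < MAX_SEQ:
        return f"{ms}-{seq + 1}"
    # Sequence overflowed: bump the timestamp and restart the sequence at 0.
    return f"{ms + 1}-0"

r = redis.Redis(decode_responses=True)
start = "-"
while True:
    entries = r.xrange("unittest", min=start, max="+", count=100)
    if not entries:
        break
    # ... process entries here ...
    start = next_id(entries[-1][0])  # entries are (id, fields) pairs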
A Redis Sorted Set (ZSET) stores (member, score) pairs ordered by score.
A Redis Set is an unordered collection of unique strings.
What I need is a way to get the members of a Sorted Set that match a pattern, as ZRANGEBYLEX does, but where the members have different scores.
Is it possible at all with Redis?
Well, it seems I did not know about the SCAN family of commands. ZSCAN solves this issue, though at cost O(N), where N is the number of items in the sorted set, because it iterates over the whole set.
Example of the elements in the set:
127.0.0.1:6379> zrange l 0 -1 WITHSCORES
1) "foodgood:irene"
2) "1"
3) "foodgood:pep"
4) "1"
5) "sleep:irene"
6) "1"
7) "sleep:juan"
8) "1"
9) "sleep:pep"
10) "1"
11) "sleep:toni"
12) "1"
13) "foodgood:juan"
14) "2"
Now ZSCAN for those with prefix foodgood:
127.0.0.1:6379> ZSCAN l 0 match foodgood:*
1) "0"
2) 1) "foodgood:irene"
2) "1"
3) "foodgood:pep"
4) "1"
5) "foodgood:juan"
6) "2"
The first returned value is the cursor; when it is "0", the collection has been completely explored.
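For completeness, a minimal sketch of driving the cursor loop from a client (assuming redis-py and the key l from the example above):
import redis

r = redis.Redis(decode_responses=True)
cursor, matches = 0, []
while True:
    cursor, items = r.zscan("l", cursor=cursor, match="foodgood:*")
    matches.extend(items)   # items is a list of (member, score) pairs
    if cursor == 0:         # cursor 0 means the whole set has been scanned
        break
print(matches)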
What I would have liked is for this to be O(log(N)+M), where M is the number of elements retrieved, similar to a binary search tree.
In our electronic trading system, we need to do calculations based on tick data from 100+ contracts.
Tick data for the contracts is not received in a single message; each message contains tick data for only one contract. The timestamps of the contracts differ slightly (sometimes the difference is big, but let's ignore that case).
For example (the first column is the timestamp, the later columns identify the contract):
The two rows below are 1 ms apart:
10:34:03.235,10002007,510050C2006A03500 ,0.0546
10:34:03.236,10001909,510050C2003A02750 ,0.3888
The two rows below are 3 ms apart:
10:34:03.594,10002154,510300C2003M03700 ,0.4985
10:34:03.597,10002118,510300C2001M03700 ,0.4514
Only contracts whose price changed produce data, so I can't count the number of contracts to know whether I have received all the data for a given tick.
On the other hand, we don't want to wait until we have received all the data for the tick, because sometimes data arrives very late and we want to exclude it.
Low latency is required, so I think we will define a window - say 50 ms - and start the calculation based on whatever data we received in the past 50 ms.
What will be the best way to handle such use case?
Originally I wanted to use a Redis Stream to maintain a small queue: whenever a contract's data is received, I push it to the stream. But I couldn't figure out the best way to pull the data as soon as a specific amount of time (say 50 ms) has passed.
I am also wondering whether I should use some other technique instead?
Any suggestions are appreciated.
Use XRANGE myStream - + COUNT 1 to get the first entry.
Use XREVRANGE myStream + - COUNT 1 to get the last entry.
XINFO STREAM myStream also returns the first and last entry, but the docs say it is O(log N).
Assuming you are using a timestamp as the ID, or as a field, you can compute the time difference.
If you are using Redis Streams auto-ID (XADD myStream * ...), the first part of the ID is the UNIX timestamp in milliseconds.
Assuming the above, you can do the check atomically with a Lua script:
EVAL "local first = redis.call('XRANGE', KEYS[1], '-', '+', 'COUNT', '1') local firstTime = {} if next(first) == nil then return redis.error_reply('Stream is empty or key doesn`t exist') end for str in string.gmatch(first[1][1], '([^-]+)') do table.insert(firstTime, tonumber(str)) end local last = redis.call('XREVRANGE', KEYS[1], '+', '-', 'COUNT', '1') local lastTime = {} for str in string.gmatch(last[1][1], '([^-]+)') do table.insert(lastTime, tonumber(str)) end local ms = lastTime[1] - firstTime[1] if ms >= tonumber(ARGV[1]) then return redis.call('XRANGE', KEYS[1], '-', '+') else return redis.error_reply('Only '..ms..' ms') end" 1 myStream 50
The arguments are numKeys (1 here), streamKey, and timeInMs (50 here): 1 myStream 50.
Here is a friendlier view of the Lua script:
-- Get the first entry in the stream; fail if the stream is empty/missing.
local first = redis.call('XRANGE', KEYS[1], '-', '+', 'COUNT', '1')
local firstTime = {}
if next(first) == nil then
    return redis.error_reply('Stream is empty or key doesn`t exist')
end
-- Split the ID "<ms>-<seq>" into its numeric parts.
for str in string.gmatch(first[1][1], '([^-]+)') do
    table.insert(firstTime, tonumber(str))
end
-- Get the last entry and split its ID the same way.
local last = redis.call('XREVRANGE', KEYS[1], '+', '-', 'COUNT', '1')
local lastTime = {}
for str in string.gmatch(last[1][1], '([^-]+)') do
    table.insert(lastTime, tonumber(str))
end
-- Compare the millisecond parts of the first and last IDs.
local ms = lastTime[1] - firstTime[1]
if ms >= tonumber(ARGV[1]) then
    return redis.call('XRANGE', KEYS[1], '-', '+')
else
    return redis.error_reply('Only '..ms..' ms')
end
It returns:
(error) Stream is empty or key doesn`t exist
(error) Only 34 ms (for example) if the required time has not yet elapsed
The actual list of entries if the required time between the first and last message has elapsed.
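If you prefer to call it from application code, here is a minimal sketch assuming redis-py (the Lua body is the same script shown above):
import redis

LUA_CHECK_WINDOW = """
local first = redis.call('XRANGE', KEYS[1], '-', '+', 'COUNT', '1')
local firstTime = {}
if next(first) == nil then
    return redis.error_reply('Stream is empty or key doesn`t exist')
end
for str in string.gmatch(first[1][1], '([^-]+)') do
    table.insert(firstTime, tonumber(str))
end
local last = redis.call('XREVRANGE', KEYS[1], '+', '-', 'COUNT', '1')
local lastTime = {}
for str in string.gmatch(last[1][1], '([^-]+)') do
    table.insert(lastTime, tonumber(str))
end
local ms = lastTime[1] - firstTime[1]
if ms >= tonumber(ARGV[1]) then
    return redis.call('XRANGE', KEYS[1], '-', '+')
else
    return redis.error_reply('Only '..ms..' ms')
end
"""

r = redis.Redis()
check_window = r.register_script(LUA_CHECK_WINDOW)
try:
    entries = check_window(keys=["myStream"], args=[50])  # equivalent to EVAL ... 1 myStream 50
except redis.exceptions.ResponseError as err:
    entries = []
    print(err)  # e.g. "Only 34 ms"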
Make sure to check Introduction to Redis Streams to get familiar with Redis Streams, and EVAL command to learn about Lua scripts.
I use a sorted set in Redis.
The sorted set commonly contains over one million entries. How can I read it in partitions, i.e. the first 100,000 rows, then the next ones?
The only command I know of to take the data is smembers set.
You can use the ZRANGE command on your sorted set and specify start and stop indices (they are 0-based): 0 to 99999 for the first 100,000 entries, then 100000 to 199999 for the next ZRANGE, and so on.
ZRANGE documentation on Redis.io
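A minimal paging sketch (assuming redis-py and a hypothetical key named myzset):
import redis

r = redis.Redis(decode_responses=True)
PAGE = 100_000
start = 0
while True:
    page = r.zrange("myzset", start, start + PAGE - 1)  # 0-based, inclusive indices
    if not page:
        break
    # ... process this page of up to 100,000 members ...
    start += PAGE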
You mentioned using smembers set to take the data, but SMEMBERS works only on plain (non-sorted) Sets. If you are actually using a plain Set, you could use SPOP with a count of 100,000; however, that would also remove those entries from the set.
SPOP documentation on Redis.io
You can iterate through the elements of an unsorted set incrementally using SSCAN. Start with cursor 0 and use the returned cursor in subsequent calls, until 0 is returned again.
pantalones:6379> SSCAN five-characters 0 COUNT 3
1) "7"
2) 1) "d"
2) "e"
3) "a"
4) "c"
pantalones:6379> SSCAN five-characters 7 COUNT 3
1) "0"
2) 1) "b"
In this example, the first call to SSCAN returns a cursor of 7, which is then provided to the second call to SSCAN. The second call returns a cursor of 0, so we know the iteration is complete.
See SSCAN documentation on Redis.io.
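If you are driving this from client code, most client libraries wrap the cursor handling for you; for example, a sketch assuming redis-py:
import redis

r = redis.Redis(decode_responses=True)
for member in r.sscan_iter("five-characters", count=3):  # handles the cursor loop internally
    print(member)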
I have a sorted set that keeps growing in real time, and it contains IDs that I want to retrieve 5 at a time in reverse order of rank, basically to implement pagination. These IDs are keys into a hash. Is there any way to get 5 elements at a time efficiently using Redis sorted set operations?
For example, in the sorted set below, let's say I want to get the 5 elements before "572c7d87e53156245a3fd167". How could I do that, given that new IDs keep getting added after my last element at run time? The expected result should give me the IDs 572c7c58e53156245a3fd166, 572c7ad2e53156245a3fd165, 572c746e1eeba6b059b08f1b, 572c74531eeba6b059b08f1a, and 572c6fc9612ad65757cca4f9.
1) "572b58c0dd319a1a4703eba8"
2) "1462429760.8629999"
3) "572c697e612ad65757cca4f7"
4) "1462499582.6889999"
5) "572c6a8e612ad65757cca4f8"
6) "1462499854.056"
7) "572c6fc9612ad65757cca4f9"
8) "1462501193.927"
9) "572c74531eeba6b059b08f1a"
10) "1462502355.5250001"
11) "572c746e1eeba6b059b08f1b"
12) "1462502382.313"
13) "572c7ad2e53156245a3fd165"
14) "1462504018.325"
15) "572c7c58e53156245a3fd166"
16) "1462504408.1370001"
17) "572c7d87e53156245a3fd167"
18) "1462504711.4200001"
19) "572c7da3e53156245a3fd168"
20) "1462504739.352"
One option is to look at ZRANGEBYLEX or ZRANGEBYSCORE and use their offset/count (LIMIT) arguments; an example follows below.
However, what I usually do for pagination is create a new list (a kind of snapshot of the original) that doesn't change dynamically, and load the data from there. That way it doesn't feel like chasing a moving target. I just set a TTL on it and forget about it.
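For example, applying the score-based approach to the data in the question with ZREVRANGEBYSCORE, an exclusive max score, and LIMIT (this assumes the data is stored under a hypothetical key named ids and that scores are unique; with duplicate scores the exclusive bound could skip entries):
127.0.0.1:6379> ZSCORE ids "572c7d87e53156245a3fd167"
"1462504711.4200001"
127.0.0.1:6379> ZREVRANGEBYSCORE ids (1462504711.4200001 -inf LIMIT 0 5
1) "572c7c58e53156245a3fd166"
2) "572c7ad2e53156245a3fd165"
3) "572c746e1eeba6b059b08f1b"
4) "572c74531eeba6b059b08f1a"
5) "572c6fc9612ad65757cca4f9"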
My Redis version: 3.0.2
The hash data is shown below.
Key name: test
Contents (fields and values):
1) "xx1"
2) "1"
3) "xx2"
4) "2"
5) "xx3"
6) "3"
7) "xx4"
8) "4"
9) "xx5"
10)"5"
I use the command HSCAN test 0 COUNT 2,
but Redis returns every field and value, not just the first 2 fields and values!
The COUNT option of the SCAN family does not limit the number of key/value pairs returned.
It is only a hint for how many elements the command should examine per call, so it can return more (or fewer) elements than requested.
From the Redis documentation on the COUNT option:
When iterating Sets encoded as intsets (small sets composed of just
integers), or Hashes and Sorted Sets encoded as ziplists (small hashes
and sets composed of small individual values), usually all the
elements are returned in the first SCAN call regardless of the COUNT
value.
So, simply take the first two fields and values client-side from the result of the HSCAN test 0 command.
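For example, a small sketch of doing that client-side (assuming redis-py and the hash test from above):
from itertools import islice
import redis

r = redis.Redis(decode_responses=True)
first_two = list(islice(r.hscan_iter("test"), 2))  # hscan_iter yields (field, value) pairs
print(first_two)  # e.g. [('xx1', '1'), ('xx2', '2')]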