predis: ZADD with NX keeps throwing error 'Predis\ServerException'

I am having an issue when I try to add "NX" to the ZADD command in Predis. The Redis docs say that ZADD supports NX, but no matter how I set up the Predis command, I cannot get it working. Does anyone have any experience with this issue?
Here are the commands I have tried:
$redis->zadd($key, "NX", 1, $id);
$redis->executeRaw([ 'ZADD', $key, "NX", 1, $id ]);
Here is the error that keeps getting thrown:
ERROR: exception 'Predis\ServerException' with message 'ERR syntax error'
Looking at redis-cli MONITOR, I see the command execute when using the zadd() call, but the executeRaw() call does nothing.
Any help would be greatly appreciated!

ZADD's NX switch was only added in a recent version of Redis; see here: https://groups.google.com/forum/#!topic/redis-db/4Y6OqK8gEyk
In all likelihood you aren't running a recent enough version - use INFO to find out your server's version.
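For example, a minimal version check and ZADD NX call could look like the sketch below. It uses redis-py rather than Predis purely for illustration (an assumption, not the asker's setup); the underlying INFO and ZADD commands are the same either way, and if memory serves the NX/XX/CH/INCR flags only landed in Redis 3.0.2.
import redis

r = redis.Redis(host='localhost', port=6379)

# INFO reports the server version; ZADD's NX flag needs a recent Redis.
print(r.info('server')['redis_version'])

# On a new-enough server this is equivalent to: ZADD mykey NX 1 some-id
r.zadd('mykey', {'some-id': 1}, nx=True)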

Related

DBT: How to fix Database Error Expecting Value?

I was running into trouble today while running Airflow and airflow-dbt-python. I tried to debug a bit using the logs, and the error shown in the logs was this one:
[2022-12-27, 13:53:53 CET] {functions.py:226} ERROR - 12:53:53.642186 [error] [MainThread]: Encountered an error:
Database Error
Expecting value: line 2 column 5 (char 5)
Quite a weird one.
Check the credentials file that allows dbt to run queries on your DB (in our case we run dbt with BigQuery); our credentials file turned out to be empty. We even tried running dbt directly in the worker instead of through Airflow, and got exactly the same error. Unfortunately this error is not very explicit.
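For what it's worth, "Expecting value: line 2 column 5 (char 5)" is the wording Python's json module uses when it cannot parse a file, which fits an empty or malformed keyfile. A rough sketch of the kind of check we ended up doing (the keyfile path here is just a hypothetical example):
import json

KEYFILE = '/opt/airflow/secrets/bigquery-keyfile.json'  # hypothetical path

with open(KEYFILE) as f:
    content = f.read()

if not content.strip():
    print('Keyfile is empty - dbt will fail with a JSON parsing error.')
else:
    # Raises json.JSONDecodeError with the same 'Expecting value' message
    # if the file is not valid JSON.
    creds = json.loads(content)
    print('Keyfile parses; project:', creds.get('project_id'))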

Writing apache beam pCollection to bigquery causes type Error

I have a simple beam pipeline, as follows:
with beam.Pipeline() as pipeline:
    output = (
        pipeline
        | 'Read CSV' >> beam.io.ReadFromText('raw_files/myfile.csv',
                                             skip_header_lines=True)
        | 'Split strings' >> beam.Map(lambda x: x.split(','))
        | 'Convert records to dictionary' >> beam.Map(to_json)
        | beam.io.WriteToBigQuery(project='gcp_project_id',
                                  dataset='datasetID',
                                  table='tableID',
                                  create_disposition=bigquery.CreateDisposition.CREATE_NEVER,
                                  write_disposition=bigquery.WriteDisposition.WRITE_APPEND
                                  )
    )
However, upon running it I get a TypeError stating the following:
line 2147, in __init__
    self.table_reference = bigquery_tools.parse_table_reference(
...
    if isinstance(table, TableReference):
TypeError: isinstance() arg 2 must be a type or tuple of types
I have tried defining a TableReference object and passing it to the WriteToBigQuery class, but I am still facing the same issue. Am I missing something here? I've been stuck at this step for what feels like forever and I don't know what to do. Any help is appreciated!
This probably occurred because you installed Apache Beam without the GCP modules. Please make sure to do the following (in a virtual environment):
pip install apache-beam[gcp]
It's a weird error though, so feel free to file a GitHub issue against the Apache Beam project.
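If it helps, a quick way to confirm the GCP extras are actually installed is to import one of the GCP IO modules directly; once they import, you can also reference the dispositions through beam.io.BigQueryDisposition instead of a separate bigquery module. A sketch, assuming the rest of the pipeline stays as in the question:
import apache_beam as beam
# This import fails with ImportError if apache-beam[gcp] is not installed.
from apache_beam.io.gcp.internal.clients import bigquery  # noqa: F401

write_step = beam.io.WriteToBigQuery(
    table='tableID',
    dataset='datasetID',
    project='gcp_project_id',
    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
)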

Why does dbt run in the CLI but throw an error in the Cloud UI for the exact same model?

I am executing dbt run -s model_name on the CLI and the task completes successfully. However, when I run the exact same command on dbt Cloud, I get this error:
Syntax or semantic analysis error thrown in server while executing query.
Error message from server: org.apache.hive.service.cli.HiveSQLException:
Error running query: org.apache.spark.sql.AnalysisException: cannot
resolve '`pv.meta.uuid`' given input columns: []; line 6 pos 4;
\n'Project ['pv.meta.uuid AS page_view_uuid#249595,
'pv.user.visitorCookieId AS (80) (SQLExecDirectW)")
It looks like it fails to recognize the 'pv.meta.uuid' syntax, which extracts data from a JSON format. It is not clear to me what is going on. Any thoughts? Thank you!

Redis bitcount command returns a syntax error

After setting myKey in Redis with the value foobar,
I want to get the BITCOUNT of myKey.
This command in redis-cli gives me an error: BITCOUNT myKey 2 3 BYTE
(error) ERR syntax error
How can I solve this?
The BYTE argument is fairly new and was added in Redis 7.0. At the time of writing, only 7.0.0 RC3 has been released, with no GA yet.
I assume you are using a pre-7.0 Redis version?
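Until you can move to 7.0, dropping the BYTE keyword gives the same result, since byte ranges are the only (and default) behaviour on older servers. A small sketch using redis-py, purely as an illustration of the same commands:
import redis

r = redis.Redis()
r.set('myKey', 'foobar')

# Pre-7.0 servers only understand byte ranges, so this matches what
# BITCOUNT myKey 2 3 BYTE would return on Redis 7:
print(r.bitcount('myKey', 2, 3))  # 9 set bits in bytes 2..3 of 'foobar'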

Spring Boot using Lua to create a Redis bloom filter: #user_script:1: ERR bad error rate

I use the Spring Boot-provided RedisTemplate to execute the Lua script:
return redis.call('bf.reserve', KEYS[1],ARGV[1],ARGV[2])
but it keeps failing with:
ERR Error running script (call to f_264cca3824c7a277f5d3cf63f1b2642a0750e989): #user_script:1: ERR bad error rate.
this is my docker image:
redislabs/rebloom:2.2.5
If I try to run this script from the Linux command line, it works:
[root@daice ~]# redis-cli --eval a.lua city , 0.001 100000
OK
[root@daice ~]# redis-cli
127.0.0.1:6379> keys *
1) "city"
I just looked up where this error comes from; the relevant snippet looks like this:
if (RedisModule_StringToDouble(argv[2], &error_rate) != REDISMODULE_OK) {
    return RedisModule_ReplyWithError(ctx, "ERR bad error rate");
}
I assume the argument that you are providing for error_rate does not convert to a double value.
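One common culprit with RedisTemplate is the default JDK serializer wrapping the script arguments, so '0.001' no longer reaches the module as a plain numeric string; switching the argument serializer to a string serializer (or using StringRedisTemplate) is worth trying. To confirm the command itself is fine, you can send it with plain string arguments, for example with redis-py (an assumption here, just to take the Spring serialization layer out of the picture):
import redis

r = redis.Redis()

# Same script as the asker's: the arguments arrive as plain strings,
# so BF.RESERVE can parse them as a double and an integer.
script = "return redis.call('bf.reserve', KEYS[1], ARGV[1], ARGV[2])"
print(r.eval(script, 1, 'city', '0.001', '100000'))  # b'OK' on a RedisBloom server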