Let's assume I have a column X of jsonb type. X holds JSON values of the structure
{"y":"some value","z":"some more values"}.
What I need to achieve is to append "!!!!" to the end of every z property. The append operation should update the existing records.
Use the ->> operator to get the value of z as text and the || operator to append the other string. Convert the result to jsonb with to_jsonb() and assign it to z with jsonb_set().
SELECT jsonb_set(x, '{z}', to_jsonb(x->>'z' || '!!!!'))
FROM elbat;
More info: "9.15. JSON Functions and Operators"
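A quick standalone check of the expression, using a literal in place of the column (the sample value is made up):
SELECT jsonb_set('{"y":"a","z":"b"}'::jsonb, '{z}', to_jsonb('b'::text || '!!!!'));
-- {"y": "a", "z": "b!!!!"}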
This did the trick
UPDATE thetable set x=jsonb_set(x, '{z}', to_jsonb(x->>'z' || '!!!!'), true)
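For reference, the fourth argument (true) is jsonb_set's create_if_missing flag; a minimal sketch of its effect on a value without a z key (made-up literal):
SELECT jsonb_set('{"y":"a"}'::jsonb, '{z}', '"!!!!"'::jsonb, true);   -- {"y": "a", "z": "!!!!"}
SELECT jsonb_set('{"y":"a"}'::jsonb, '{z}', '"!!!!"'::jsonb, false);  -- {"y": "a"}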
I have a column in my Postgres database that stores jsonb type values. Some of these values are raw strings (not a list or dictionary). I want to be able to perform a regex search on this column, such as
select * from database where jsonb_column::text ~ regex_expression.
The issue is that for values that are already strings, converting from jsonb to text adds extra escaped double quotes at the beginning and end of the value. I don't want these included in the regex query. I understand why Postgres does this, but if, say, we assume all values stored in the jsonb field are JSON strings, is there a workaround? I know you can use ->> to get a value out of a jsonb dictionary, but I can't figure out a solution for plain jsonb strings on their own.
Once I figure out how to make this query in normal Postgres, I want to translate it into Peewee. However, any and all help with even just the initial query would be appreciated!
Just cast the json to text. Here is an example:
from peewee import Model, CharField
from playhouse.postgres_ext import PostgresqlExtDatabase, BinaryJSONField

db = PostgresqlExtDatabase('my_db')  # connection name assumed for the example

class Reg(Model):
    key = CharField()
    data = BinaryJSONField()

    class Meta:
        database = db

for i in range(10):
    Reg.create(key='k%s' % i, data={'k%s' % i: 'v%s' % i})

# Find the row that contains the json string "k1": "v1".
expr = Reg.data.cast('text').regexp('"k1": "v1"')
query = Reg.select().where(expr)
for row in query:
    print(row.key, row.data)
Prints
k1 {'k1': 'v1'}
To extract a plain string (string primitive without key name) from a JSON value (json or jsonb), you can extract the "empty path" like:
SELECT jsonb '"my string"' #>> '{}';
This also works for me (with jsonb but not with json), but it's more of a hack:
SELECT jsonb '"my string"' ->> 0
So:
SELECT * FROM tbl WHERE (jsonb_column #>> '{}') ~ 'my regex here';
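The difference is easy to see with a literal (value made up): the plain cast keeps the JSON quotes, the empty-path extraction does not:
SELECT (jsonb '"abc"')::text;    -- "abc"  (quotes are part of the text)
SELECT jsonb '"abc"' #>> '{}';   -- abc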
I have a JSON array as given below:
[{"key1":10},{"key1":20},{"key1":30}]
I want to get all the JSON objects that satisfy a specified condition, say all the objects with key1 less than 25, so my SQL query should return this list:
[{"key1":10},{"key1":20}]
What is the SQL query for this?
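For reference, a minimal setup reproducing this (the table and column names mytable and mydata used in the answer below are assumptions):
CREATE TABLE mytable (mydata json);
INSERT INTO mytable VALUES ('[{"key1":10},{"key1":20},{"key1":30}]');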
step-by-step demo: db<>fiddle
SELECT
json_agg(elements) -- 3
FROM mytable,
json_array_elements(mydata) as elements -- 1
WHERE (elements ->> 'key1')::int < 25 -- 2
Extract the JSON array: each element is now in its own record
Filter by the value. Notice that ->> returns type text, so you need to cast it to int in your case
Reaggregate the remaining elements into a new array
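To make step 1 concrete, this is what the unnesting produces for the sample array above, one row per element:
SELECT elements
FROM mytable, json_array_elements(mydata) AS elements;
-- {"key1":10}
-- {"key1":20}
-- {"key1":30}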
If you are using Postgres 12 or later you can use a JSON path function:
select jsonb_path_query_array(the_column, '$[*] ? (@.key1 < 25)')
from the_table
Online example
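A self-contained check of the path filter with the sample array as a literal (the column must be jsonb, or be cast to it):
SELECT jsonb_path_query_array('[{"key1":10},{"key1":20},{"key1":30}]'::jsonb, '$[*] ? (@.key1 < 25)');
-- [{"key1": 10}, {"key1": 20}]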
I have a column which sometimes is a string, other times a string array with a single element. Unfortunately I don't have a means to change this behavior so that it always has one data type.
When the column is an array, I need to select its first element, when it contains a string, I need to select another column.
When I do:
SELECT IFNULL(`myColumn`[0],`myOtherColumn`) FROM myTable
If myColumn is a string it throws:
org.apache.spark.sql.AnalysisException: Can't extract value from
myColumn#691: need struct type but got string
I checked out java_method, but AFAIK it only works with static methods from Java libraries, so I can't use isArray or some other instance method.
Is there a way to conditionally select columns based on column data type?
I can't figure out a way to put a column type check into a case when, but here is a hack which could work:
select
case when (substring(cast(myColumn as string), 1, 1) = '[') and
(substring(cast(myColumn as string), -1, 1) = ']')
then split(trim(both '[]' from cast(myColumn as string)), ',')[0]
else myOtherColumn
end
from myTable;
Of course this could fail if the string happens to begin with [ or end with ]. If you can use PySpark/Scala, you can do it more reliably by checking the column types.
A field is defined as array(varchar) in Mode Analytics.
I want to search for records where the field contains a certain text pattern, say 'ABCD'.
If I run this SQL:
select * from data_table
where top_results like '%ABCD%'
It throws this error:
Query failed (#20190730_021145_23663_dn9fj): line 2:7: Left side of
LIKE expression must evaluate to a varchar (actual: array(varchar)
What is the correct syntax to detect the presence of a certain string?
Use filter(array(T), function(T, boolean)) -> array(T).
It returns an array containing the elements for which the function returns true. Use the cardinality function to check whether the array is not empty:
SELECT cardinality(filter(ARRAY ['123ABCD456', 'DQF', 'ABCD', 'ABC'], x -> x like '%ABCD%'))>0 ;
Returns
true
Check another value:
SELECT cardinality(filter(ARRAY ['123ABCD456', 'DQF', 'ABCD', 'ABC'], x -> x like '%XXX%'))>0 ;
Returns
false
Hope you got the idea.
Convert it to a json string and then search the string
json_format(cast(top_results as JSON))
select * from data_table
where json_format(cast(top_results as JSON)) like '%ABCD%'
I would use any_match for that:
SELECT *
FROM data_table
WHERE any_match(top_results, s -> s like '%ABCD%')
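A quick check with literal arrays (assuming your Presto/Trino version ships any_match):
SELECT any_match(ARRAY['123ABCD456', 'DQF', 'ABC'], s -> s like '%ABCD%');  -- true
SELECT any_match(ARRAY['123ABCD456', 'DQF', 'ABC'], s -> s like '%XXX%');   -- false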
I have some columns in a PostgreSQL database that are arrays. I want to add a new value (in an UPDATE) if the value doesn't already exist; otherwise, don't add anything. I don't want to overwrite the current value of the array, only add the element to it.
Is it possible to do this in a query, or do I need to do it inside a function? I'm using PostgreSQL.
This should be as simple as this example for an integer array (integer[]):
UPDATE tbl SET col = col || 5
WHERE (5 = ANY(col)) IS NOT TRUE;
A WHERE clause like:
WHERE 5 <> ALL(col)
would also catch the case of an empty array '{}'::int[], but it fails if a NULL value appears as an element of the array.
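The NULL pitfall is easy to reproduce with literals, which is why the IS NOT TRUE form above is the safer predicate:
SELECT 5 <> ALL(ARRAY[1, NULL]);               -- NULL, so the row would be skipped
SELECT (5 = ANY(ARRAY[1, NULL])) IS NOT TRUE;  -- true, so the row would be updated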
If your arrays never contain NULL as element, consider actual array operators, possibly supported by a GIN index.
UPDATE tbl SET col = col || 5
WHERE NOT col #> '{5}';
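The GIN index mentioned above could look like this, with the same (assumed) table and column names:
CREATE INDEX tbl_col_gin_idx ON tbl USING gin (col);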
See:
Check if value exists in Postgres array
Can PostgreSQL index array columns?