Divide operator error in PostgreSQL: operator does not exist: unknown /

I have a Trip table in a PostgreSQL database with a column called meta.
An example of meta in one row looks like this:
meta = {"runTime": 3922000, "distance": 85132, "duration": 4049000, "fuelUsed": 19.595927498516176}
To select the trip with the largest value of "distance" divided by "runTime", I run this query:
select MAX(tp."meta"->>'distance'/tp."meta"->>'runTime') maxkph FROM "Trip" tp
but I get an error:
ERROR: operator does not exist: unknown / jsonb LINE 1: MAX(tp."meta"->>'distance'/tp."meta"...
I also tried:
select MAX((tp."meta"->>'distance')/(tp."meta"->>'runTime')) maxkph FROM "Trip" tp
but I get another error:
ERROR: operator does not exist: text / text LINE 1: ...MAX((tp."meta"->>'distance')/(tp."meta...
Could you please help me to solve this problem?

There is no division operator for jsonb or text values. You have to cast the values on both sides to a numeric type first:
MAX( ((tp."meta"->>'distance')::numeric) / ((tp."meta"->>'runTime')::numeric) ) maxkph

Try using parentheses:
MAX( (tp."meta"->>'distance') / (tp."meta"->>'runTime') ) as maxkph
Your second error shows that these values are returned as text, so convert them:
MAX( (tp."meta"->>'distance')::numeric / (tp."meta"->>'runTime')::numeric ) as maxkph
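Putting it together, a complete query might look like this (a sketch against the Trip table from the question; the NULLIF guard against a zero runTime is an addition, not part of the answers above):
SELECT MAX(
    (tp."meta"->>'distance')::numeric
    / NULLIF((tp."meta"->>'runTime')::numeric, 0)  -- NULLIF avoids division by zero
) AS maxkph
FROM "Trip" tp;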

Related

Parse Json - CTE & filtering

I need to remove a few records (the ones that contain t) so that I can parse/flatten the data column. The query in the CTE that creates 'tab' works on its own, but once it is inside the CTE I get the same error while trying to parse the JSON, as if I had never tried to filter out the culprit rows.
with tab as (
    select * from table
    where data like '%t%'
)
select b.value::string, a.*
from tab a,
lateral flatten( input => PARSE_JSON( a.data) ) b;
error:
Error parsing JSON: unknown keyword "test123", pos 8
example data:
Date     Data
1-12-12  {id: 13-43}
1-12-14  {id: 43-43}
1-11-14  {test12}
1-11-14  {test2}
1-02-14  {id: 44-43}
You can replace PARSE_JSON(a.data) with TRY_PARSE_JSON(a.data), which produces NULL instead of an error for invalid input.
More at: TRY_PARSE_JSON
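For illustration, the question's query with only that substitution (a sketch; rows whose data is not valid JSON parse to NULL, and FLATTEN then simply emits no rows for them):
with tab as (
    select * from table
    where data like '%t%'
)
select b.value::string, a.*
from tab a,
-- TRY_PARSE_JSON returns NULL for malformed input instead of raising an error
lateral flatten( input => TRY_PARSE_JSON( a.data) ) b;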

Cannot filter a jsonb column by '0000000000000'

data is a jsonb column.
This query works:
select id, modified_at, data, states from order where data->'id' = '2100000044078'
but this one does not:
select id, modified_at, data, states from shop_order where data->'id' = '0000000000000'
error:
LINE 2: ..., data,states from order where data->'id' = '000000000...
^
DETAIL: Token "0000000000000" is invalid.
CONTEXT: JSON data, line 1: 0000000000000
SQL state: 22P02
Character: 161
-> returns a jsonb value, but you are comparing it with a string literal, so Postgres tries to convert the value on the right hand side to jsonb. '2100000044078' happens to be a valid JSON number, but '0000000000000' is not, because JSON numbers may not have leading zeros.
Use the ->> operator instead, which returns the value as text:
where data->>'id' = '0000000000000'
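Alternatively, if you want to keep the comparison in jsonb, quote the literal so that it is a valid JSON string (this assumes the id is stored as a JSON string inside data):
where data->'id' = '"0000000000000"'::jsonb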

PG::InvalidParameterValue: ERROR: cannot extract element from a scalar

I'm fetching data from a JSON column using the following query.
SELECT id FROM ( SELECT id,JSON_ARRAY_ELEMENTS(shipment_lot::json) AS js2 FROM file_table WHERE ('shipment_lot') = ('shipment_lot') ) q WHERE js2->> 'invoice_number' LIKE ('%" abc1123"%')
My Postgresql version is 9.3
The data saved in the JSON column looks like this:
[{ "id"=>2981, "lot_number"=>1, "activate"=>true, "invoice_number"=>"abc1123", "price"=>378.0}]
However, I'm getting this error:
ActiveRecord::StatementInvalid (PG::InvalidParameterValue: ERROR: cannot extract element from a scalar:
SELECT id FROM
( SELECT id,JSON_ARRAY_ELEMENTS(shipment_lot::json)
AS js2 FROM file_heaps
WHERE ('shipment_lot') = ('shipment_lot') ) q
WHERE js2->> 'invoice_number' LIKE ('%abc1123%'))
How can I solve this issue?
Your issue is that the stored value is not valid JSON: the => key/value separator is Ruby hash syntax, not JSON.
If you try running your example data on Postgres, it will fail:
SELECT ('[{ "id"=>2981, "lot_number"=>1, "activate"=>true, "invoice_number"=>"abc1123", "price"=>378.0}]')::json
This is the JSON formatted correctly:
SELECT ('[{ "id":2981, "lot_number":1, "activate":true, "invoice_number":"abc1123", "price":378.0}]')::json

AWS Athena: Error parsing field value and unexpected query results

I have the following table schema prepared by AWS Glue.
When I query the table using SELECT * FROM "vietnam-property-develop"."sell" limit 10;, it throws an error:
HIVE_BAD_DATA: Error parsing field value '{"area":"85
m²","date":"14/01/2020","datetime":"2020-01-18
00:42:28.488576+00:00","address":"Quan Hoa - Cầu Giấy","price":"20
Tỷ","cat":"Bán nhà mặt
phố","lon":"105.7976502","avatar":"","id":"24169794","title":"Chính
chủ cần bán nhà mặt phố nguyễn văn huyên Quan Hoa Cầu Giấy, 2 tầng, dt
85m2. LH 0903233723","lat":"21.0376771","room":"0"}' for field 4:
org.openx.data.jsonserde.json.JSONObject cannot be cast to
java.lang.Double
Then I tried to query just the title column using SELECT title FROM "vietnam-property-develop"."sell" limit 10;
It returns results I didn't expect: the query seems to return entire JSON documents instead of just the title column, and the number of rows is 4, not 10, no matter how I modify the query.

How can I use the arrayExists function when the array contains a null value?

I have a nullable array column in my table: Array(Nullable(UInt16)). I want to be able to query this column using arrayExists (or arrayAll) to check whether it contains a value above a certain threshold, but I'm getting an exception when the array contains a null value:
Exception: Expression for function arrayExists must return UInt8, found Nullable(UInt8)
My query is below where distance is the array column:
SELECT * from TracabEvents_ArrayTest
where arrayExists(x -> x > 9, distance);
I've tried updating the comparison in the lambda to "(isNotNull(x) and x > 9)", but I still get the error. Is there any way of handling nulls in these expressions, or are they not supported yet?
Filter out rows with an empty list using notEmpty, and wrap x in assumeNotNull inside arrayExists so that the lambda returns a plain UInt8:
SELECT * FROM TracabEvents_ArrayTest WHERE notEmpty(distance) AND arrayExists(x -> assumeNotNull(x) > 9, distance)
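An alternative sketch that handles NULL elements explicitly, assuming ClickHouse's ifNull (a NULL element is replaced by 0, which never exceeds the threshold, so the lambda again returns a plain UInt8):
-- ifNull(x, 0) yields a non-Nullable value, satisfying arrayExists
SELECT * FROM TracabEvents_ArrayTest WHERE arrayExists(x -> ifNull(x, 0) > 9, distance)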