point is within circle in postgresql - sql

I want to find out whether a point is within a circle using PostgreSQL.
For a point within a polygon, I have used the following query. I need an equivalent query for a circle.
SELECT a
FROM a_table
WHERE
ST_Within(a::geometry, ST_GeomFromText('POLYGON((50 -80.98, 20.99 -90.99, 90.98 -99.99, 50 -80.98))'));
For a circle, I tried the two queries below:
SELECT a
FROM a_table
WHERE
ST_within(a::geometry,ST_GeomFromText('POINT(10 20)',10));
and
SELECT a
FROM a_table
WHERE
ST_within(a::geometry,ST_GeomFromText('circle((10 20),10)'));
but both of these give errors like this:
ERROR: parse error - invalid geometry
SQL state: XX000
Hint: "714" <-- parse error at position 4 within geometry
and
ERROR: parse error - invalid geometry
SQL state: XX000
Hint: "ci" <-- parse error at position 2 within geometry

select a
from a_table
where a <@ circle '((10, 20), 10)';
See Geometric Functions and Operators in the PostgreSQL documentation:
select point '(1,1)' <@ circle '((0,0), 1)';
?column?
----------
f

select point '(1,1)' <@ circle '((0,0), 1.5)';
?column?
----------
t
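
If you prefer to stay with PostGIS rather than the built-in geometric types, ST_DWithin tests whether two geometries lie within a given distance of each other, which is equivalent to a point-in-circle check. A minimal sketch (assuming a can be cast to a PostGIS geometry, a circle centre of POINT(10 20), and a radius of 10 in the same units as the coordinates):

SELECT a
FROM a_table
-- true when a lies within distance 10 of the centre point (10 20)
WHERE ST_DWithin(a::geometry, ST_GeomFromText('POINT(10 20)'), 10);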

Related

ST_Area does not exist Heroku Postgresql + Postgis

I have a Postgres database on Heroku, extended with PostGIS version 2.5.
I want to use the function:
ST_Area( a_polygon )
Specifically I want a generated column in my table:
alter table buildings add building_area float generated always as ( st_area( base_polygon ) ) stored;
Where base_polygon is of type polygon.
However, I am getting this error:
ERROR: function st_area(polygon) does not exist Hint: No function matches the given name and argument types. You might need to add explicit type casts.
Aren't these commands supposed to be available after I run CREATE EXTENSION postgis?
Or, is there something else I have to do?
It seems your polygon column's data type is the built-in PostgreSQL polygon type.
ST_Area expects a PostGIS geometry type as a parameter.
As in this example from the docs: https://postgis.net/docs/ST_Area.html
select ST_Area(geom) sqft,
       ST_Area(ST_Transform(geom, 26986)) AS sqm
from (
  select 'SRID=2249;POLYGON((743238 2967416,743238 2967450,
           743265 2967450,743265.625 2967416,743238 2967416))'::geometry geom
) subquery;
Check whether this example works; if it does, the ST_Area function exists.
You can add a column with the PostGIS geometry type: https://postgis.net/docs/AddGeometryColumn.html
SELECT AddGeometryColumn ('my_schema','my_spatial_table','geom',4326,'POLYGON',2);
Then convert your polygons into PostGIS format using PostGIS functions,
for example ST_MakePolygon: https://postgis.net/docs/ST_MakePolygon.html
SELECT ST_MakePolygon( ST_GeomFromText('LINESTRING(75 29, 77 29, 77 31, 75 29)') );
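
Putting the pieces together for the generated column from the question, a minimal sketch (assuming you add a separate PostGIS geometry column named geom and populate it yourself from the existing polygon data; the column name is illustrative):

-- add a PostGIS geometry column (SRID 4326, polygon typmod)
ALTER TABLE buildings ADD COLUMN geom geometry(Polygon, 4326);
-- populate geom from the existing data, e.g. via ST_GeomFromText(...), then:
ALTER TABLE buildings ADD COLUMN building_area float
  GENERATED ALWAYS AS ( ST_Area(geom) ) STORED;

ST_Area(geometry) is marked immutable, so it can be used in a generated column the way the question's st_area(base_polygon) was intended.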

ST_GeogFromGeoJSON fails in bigquery while successful in postgres

We have GeoJSON polygons we would like to convert to a geography object in BigQuery using ST_GeogFromGeoJSON. The conversion fails in BigQuery, while it succeeds in Postgres using the equivalent command ST_GeomFromGeoJSON.
I am familiar with the SAFE prefix that can be added to the BigQuery call, but we would like to use the object rather than just ignore it in case the conversion fails. I tried converting the object using ST_CONVEXHULL but wasn't able to make it work.
Is there some workaround in BigQuery?
Example:
Running the following command in bigquery
select ST_GeogFromGeoJSON('{"type":"Polygon","coordinates":[[[-82.022982,26.69785],[-81.606813,26.710698],[-81.999574,26.109253],[-81.615053,26.105558],[-82.022982,26.69785]]]}')
returns
Query failed: ST_GeogFromGeoJSON failed: Invalid polygon loop: Edge 4 crosses edge 9
while it runs successfully in Postgres:
select ST_GeomFromGeoJSON('{"type":"Polygon","coordinates":[[[-82.022982,26.69785],[-81.606813,26.710698],[-81.999574,26.109253],[-81.615053,26.105558],[-82.022982,26.69785]]]}')
October 2020 update for this post
No tricks are needed any more - the ST_GEOGFROMGEOJSON and ST_GEOGFROMTEXT geography functions now support a new make_valid parameter. If set to TRUE, the function attempts to correct polygon issues when importing geography data.
So, the simple statement below now works perfectly ...
select ST_GeogFromGeoJSON(
'{"type":"Polygon","coordinates":[[[-0.49044,51.4737],[-0.4907,51.4737],[-0.49075,51.46989],[-0.48664,51.46987],[-0.48664,51.47341],[-0.48923,51.47336],[-0.48921,51.4737],[-0.49072,51.47462],[-0.49114,51.47446],[-0.49044,51.4737]]]}'
, make_valid => true
)
and returns the expected output.
Below is for BigQuery Standard SQL
Query failed: ST_GeogFromGeoJSON failed: Invalid polygon loop: Edge 4 crosses edge 9
... Is there some workaround in BigQuery? ...
The proposed workaround is an obviously naive and simple way of fixing this specific issue, though it can easily be extended to more generic cases. The idea here is to extract the coordinates and reorder them to eliminate the problem ...
WITH test AS (
  SELECT '{"type":"Polygon","coordinates":[[[-82.022982,26.69785],[-81.606813,26.710698],[-81.999574,26.109253],[-81.615053,26.105558],[-82.022982,26.69785]]]}' AS geojson
)
SELECT ST_GEOGFROMGEOJSON('{"type":"Polygon","coordinates":' || fixed_coordinates || '}') AS geo
FROM (
  -- re-assemble the ring, repeating the first coordinate at the end to close it
  SELECT '[[[' || STRING_AGG(lat_lon, '],[') || '],[' || ANY_VALUE(ordered_coordinates[OFFSET(0)]) || ']]]' AS fixed_coordinates
  FROM (
    SELECT
      -- pull the "lon,lat" pairs out of the GeoJSON and sort them by longitude, then latitude
      ARRAY(
        SELECT lon_lat
        FROM UNNEST(REGEXP_EXTRACT_ALL(JSON_EXTRACT(geojson, '$.coordinates'), r'\[+(.*?)\]+')) lon_lat
        ORDER BY CAST(SPLIT(lon_lat)[OFFSET(0)] AS FLOAT64), CAST(SPLIT(lon_lat)[OFFSET(1)] AS FLOAT64)
      ) AS ordered_coordinates
    FROM test
  ) t, t.ordered_coordinates lat_lon
)
This produces the correct output:
POLYGON((-82.022982 26.69785, -81.999574 26.109253, -81.8073135 26.1074055, -81.615053 26.105558, -81.606813 26.710698, -81.8148975 26.704274, -82.022982 26.69785))
Below is for BigQuery Standard SQL
My previous answer is based on the oversimplified logic of re-ordering coordinates. Obviously it will not work in more complex cases like the one below:
{"type":"Polygon","coordinates":[[[-0.49044,51.4737],[-0.4907,51.4737],[-0.49075,51.46989],[-0.48664,51.46987],[-0.48664,51.47341],[-0.48923,51.47336],[-0.48921,51.4737],[-0.49072,51.47462],[-0.49114,51.47446],[-0.49044,51.4737]]]}
Is there some more advanced sorting logic that can be applied?
So, more complex logic can be used to address this:
#standardSQL
WITH test AS (
  SELECT '{"type":"Polygon","coordinates":[[[-0.49044,51.4737],[-0.4907,51.4737],[-0.49075,51.46989],[-0.48664,51.46987],[-0.48664,51.47341],[-0.48923,51.47336],[-0.48921,51.4737],[-0.49072,51.47462],[-0.49114,51.47446],[-0.49044,51.4737]]]}' AS geojson
), coordinates AS (
  -- split the GeoJSON ring into individual lon/lat pairs
  SELECT CAST(SPLIT(lon_lat)[OFFSET(0)] AS FLOAT64) lon, CAST(SPLIT(lon_lat)[OFFSET(1)] AS FLOAT64) lat
  FROM test, UNNEST(REGEXP_EXTRACT_ALL(JSON_EXTRACT(geojson, '$.coordinates'), r'\[+(.*?)\]+')) lon_lat
), stats AS (
  -- centroid of all vertices, used as the reference point for angular sorting
  SELECT ST_CENTROID(ST_UNION_AGG(ST_GEOGPOINT(lon, lat))) centroid FROM coordinates
)
SELECT ST_MAKEPOLYGON(ST_MAKELINE(ARRAY_AGG(point ORDER BY sequence))) AS polygon
FROM (
  SELECT point,
    -- map the angle into a monotonically increasing value over the full 0..2*pi sweep,
    -- depending on which quadrant (relative to the centroid) the point falls in
    CASE
      WHEN ST_X(point) > ST_X(centroid) AND ST_Y(point) > ST_Y(centroid) THEN 3.14 - angle
      WHEN ST_X(point) > ST_X(centroid) AND ST_Y(point) < ST_Y(centroid) THEN 3.14 + angle
      WHEN ST_X(point) < ST_X(centroid) AND ST_Y(point) < ST_Y(centroid) THEN 6.28 - angle
      ELSE angle
    END sequence
  FROM (
    SELECT point, centroid,
      -- angle at the centroid between the point and a same-latitude "anchor" point
      ACOS(ST_DISTANCE(centroid, anchor) / ST_DISTANCE(centroid, point)) angle
    FROM (
      SELECT centroid,
        ST_GEOGPOINT(lon, lat) point,
        ST_GEOGPOINT(lon, ST_Y(centroid)) anchor
      FROM coordinates, stats
    )
  )
)
This approach produces the correct output:
POLYGON((-0.49075 51.46989, -0.48664 51.46987, -0.48664 51.47341, -0.48923 51.47336, -0.48921 51.4737, -0.49072 51.47462, -0.49114 51.47446, -0.49044 51.4737, -0.4907 51.4737, -0.49075 51.46989))

divide operator error in PostgreSql: operator does not exist: unknown /

I have a Trip table in a PostgreSQL DB; there is a column called meta in the table.
An example of meta in one row looks like:
meta = {"runTime": 3922000, "distance": 85132, "duration": 4049000, "fuelUsed": 19.595927498516176}
To select the trip with the largest value of "distance" divided by "runTime", I run this query:
select MAX(tp."meta"->>'distance'/tp."meta"->>'runTime') maxkph FROM "Trip" tp
but I get ERROR:
/* ERROR: operator does not exist: unknown / jsonb LINE 1: MAX(tp."meta"->>'distance'/tp."meta"...
I also tried:
select MAX((tp."meta"->>'distance')/(tp."meta"->>'runTime')) maxkph FROM "Trip" tp
but get another ERROR:
/* ERROR: operator does not exist: text / text LINE 1: ...MAX((tp."meta"->>'distance')/(tp."meta...
Could you please help me to solve this problem?
There is no division operator for jsonb or text values. You have to cast the values on both sides to some numeric type first:
MAX( ((tp."meta"->>'distance')::numeric) / ((tp."meta"->>'runTime')::numeric) ) maxkph
Try using parentheses:
MAX( (tp."meta"->>'distance') / (tp."meta"->>'runTime') ) as maxkph
Your second problem suggests that these values are stored as strings. So convert them:
MAX( (tp."meta"->>'distance')::numeric / (tp."meta"->>'runTime')::numeric ) as maxkph

Extracting Values from Array in Redshift SQL

I have some arrays stored in Redshift table "transactions" in the following format:
id, total, breakdown
1, 100, [50,50]
2, 200, [150,50]
3, 125, [15, 110]
...
n, 10000, [100,900]
Since this format is useless to me, I need to do some processing on this to get the values out. I've tried using regex to extract it.
SELECT regexp_substr(breakdown, '\[([0-9]+),([0-9]+)\]')
FROM transactions
but I get an error returned that says
Unmatched ( or \(
Detail:
-----------------------------------------------
error: Unmatched ( or \(
code: 8002
context: T_regexp_init
query: 8946413
location: funcs_expr.cpp:130
process: query3_40 [pid=17533]
--------------------------------------------
Ideally I would like to get x and y as their own columns so I can do the appropriate math. I know I can do this fairly easily in Python or PHP or the like, but I'm interested in a pure SQL solution - partially because I'm using an online SQL editor (Mode Analytics) to plot it easily as a dashboard.
Thanks for your help!
If breakdown really is an array you can do this:
select id, total, breakdown[1] as x, breakdown[2] as y
from transactions;
If breakdown is not an array but e.g. a varchar column, you can cast it into an array if you replace the square brackets with curly braces:
select id, total,
(translate(breakdown, '[]', '{}')::integer[])[1] as x,
(translate(breakdown, '[]', '{}')::integer[])[2] as y
from transactions;
You can try this:
SELECT REPLACE(SPLIT_PART(breakdown,',',1),'[','') as x, REPLACE(SPLIT_PART(breakdown,',',2),']','') as y FROM transactions;
I tried this with a Redshift DB and it worked for me.
Detailed Explanation:
SPLIT_PART(breakdown,',',1) will give you [50.
SPLIT_PART(breakdown,',',2) will give you 50].
REPLACE(SPLIT_PART(breakdown,',',1),'[','') will replace the [ and will give just 50.
REPLACE(SPLIT_PART(breakdown,',',2),']','') will replace the ] and will give just 50.
I know it's an old post, but if someone needs a much easier way:
select json_extract_array_element_text('[100,101,102]', 2);
output: 102
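
Applied to the breakdown column from the question (a sketch, assuming breakdown is stored as a JSON-formatted string; array positions are zero-based):

SELECT id,
       total,
       -- element 0 is x, element 1 is y
       json_extract_array_element_text(breakdown, 0)::int AS x,
       json_extract_array_element_text(breakdown, 1)::int AS y
FROM transactions;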

Oracle error using "in"

Why do I have an error in this query?
My request:
SELECT * FROM CURVES c WHERE c.TYPE_CURVES in ({0}, {10}, {20}, {30})
Error:
ORA-00911 invalid character
Because the curly braces are not valid SQL (hence ORA-00911); the query should read:
SELECT * FROM CURVES c WHERE c.TYPE_CURVES in (0)
EDIT
Adding multiple pieces of data...
SELECT * FROM CURVES c WHERE c.TYPE_CURVES in (0,20,30,40)
Or as strings...
SELECT * FROM CURVES c WHERE c.TYPE_CURVES in ('0','20','30','40')
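
If the {0}, {10}, ... tokens were meant as client-side placeholders (for example .NET string.Format-style parameters; this is an assumption, since the question does not say how the statement is built), make sure they are substituted before the SQL reaches Oracle, or switch to bind variables:

-- hypothetical bind-variable version; :p0 .. :p3 are illustrative names
SELECT * FROM CURVES c WHERE c.TYPE_CURVES IN (:p0, :p1, :p2, :p3)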