I have a column named wkt_geometry in a Postgres table, with data converted from latitude and longitude. I want to convert wkt_geometry to wkb_geometry using SQL commands.
If I could convert it directly from latitude and longitude, that would be even better.
Also, I have seen ST_AsBinary(geometry), but I don't understand the parameters involved.
WKT and WKB are two formats that represent the Geometry (or Geography) type.
To convert between formats, you construct a Geometry value and then export it to the desired format. In this case, ST_AsBinary(ST_GeomFromText(wkt_text)) should generally work.
If you want to produce WKB directly from lat/lng, something like ST_AsBinary(ST_MakePoint(lng, lat)) should work too. Pay attention to argument order: the SQL functions use lng/lat order.
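As an illustration of what that call produces, here is a sketch in plain Python of the byte layout ST_AsBinary returns for a 2D point (the function name and the coordinates are made up for the example; PostGIS does all of this for you):

```python
import struct

def point_to_wkb_hex(lng, lat):
    """Build the WKB bytes for a 2D point, mirroring what
    ST_AsBinary(ST_MakePoint(lng, lat)) returns: a byte-order
    marker, the geometry type, then two doubles (x first, y second)."""
    return struct.pack("<BIdd",
                       1,    # 1 = little-endian byte order
                       1,    # geometry type 1 = Point
                       lng,  # x comes first...
                       lat   # ...then y -- hence the lng/lat order
                       ).hex()

print(point_to_wkb_hex(13.4, 52.5))  # Berlin, roughly
```

The x-before-y layout in the binary format is why the lng/lat argument order matters.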
Related
I wanted to create a database table in Flask containing longitude and latitude fields. What should the field types be for those?
Depending on your database type, you'll probably want a numeric type that supports decimals. PostgreSQL supports double precision, which you can use like so:
from sqlalchemy.dialects.postgresql import DOUBLE_PRECISION
or
db = SQLAlchemy()
db.DOUBLE_PRECISION
Keep in mind that you're prone to floating point errors with decimals, so when you want to know whether latitude/longitude values have changed, it may be smart to round them to a specific number of decimals and/or use numpy.isclose to compare the old and new values.
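The rounding/tolerance idea can be sketched in plain Python (math.isclose from the standard library behaves like numpy.isclose for scalars; the coordinates are made up for illustration):

```python
import math

# Hypothetical old/new coordinates that differ only by float noise.
old_lat, new_lat = 40.4304093953086, 40.430409395308605

# Option 1: round to a fixed number of decimals before comparing.
changed = round(old_lat, 9) != round(new_lat, 9)
print(changed)  # False: the difference is below 9 decimal places

# Option 2: compare with a tolerance
# (math.isclose from the stdlib; numpy.isclose works the same way).
print(math.isclose(old_lat, new_lat, abs_tol=1e-9))  # True
```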
When I tell postgreSQL to show a column as float, I always get as a result "double precision".
Is it the same?
Like Damien quoted from the documentation:
PostgreSQL also supports the SQL-standard notations float and float(p) for specifying inexact numeric types.
Here, p specifies the minimum acceptable precision in binary digits.
PostgreSQL accepts float(1) to float(24) as selecting the real type,
while float(25) to float(53) select double precision.
Values of p outside the allowed range draw an error.
float with no precision specified is taken to mean double precision.
PostgreSQL, like other databases, supports the SQL standard by supplying an appropriate data type when a certain SQL standard type is requested. Since real or double precision fit the bill here, they are taken instead of creating new redundant types.
The disadvantage is that the data type of a column may read differently from what you requested, but as long as it handles your data the way it should, is that a problem?
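The practical difference between the two underlying types is precision: real carries a 24-bit significand (roughly 6-7 significant decimal digits), double precision a 53-bit one (roughly 15-17). A quick Python sketch of what that loss looks like (the coordinate is an arbitrary example value):

```python
import struct

lat = 40.4304093953086

# Round-trip through a 32-bit float ("real", i.e. float(24)):
as_real = struct.unpack("f", struct.pack("f", lat))[0]
# Round-trip through a 64-bit float ("double precision", float(53)):
as_double = struct.unpack("d", struct.pack("d", lat))[0]

print(as_double == lat)  # True: no precision lost
print(as_real == lat)    # False: only ~7 significant digits survive
print(as_real)
```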
What data type do I classify these numbers as?
I've tried number, float, float external, and binary double. All of these have given me an invalid datatype error when trying to create a table in SQL to eventually load the data into.
Assuming your data is currently in string format (if it is actually in number format, please see my comment on your question), then you don't need a special data type - you need the proper format model to convert strings representing numbers in scientific notation into actual numbers.
http://docs.oracle.com/cd/B19306_01/server.102/b14200/sql_elements004.htm#BABFJEAA
For example:
SQL> select to_number('1.0e+02', '9.9eeee') as converted_to_number from dual;
CONVERTED_TO_NUMBER
-------------------
100
1 row selected.
This takes the string '1.0e+02' (representing the number 100 in scientific notation) and uses the format model '9.9eeee' - see the linked documentation for this format model.
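For comparison, most languages outside the database parse this notation natively, so no format model is needed there - e.g. in Python:

```python
# Python's float() accepts scientific notation directly,
# so '1.0e+02' needs no format model:
print(float('1.0e+02'))    # 100.0
print(float('9513.5e-2'))  # 95.135
```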
I am a newbie to PostgreSQL and PostGIS... I have a PostgreSQL database dump, which contains a series of location data.
The location data should be the longitude and latitude of some points, but it is stored in the SQL file as a string of numbers and letters, which is odd to me. For instance:
0101000020E61000009513A7B4801F6340131CD8D0766A3BC0
How can I convert this string to longitude and latitude using a PostgreSQL query?
If you have the PostGIS extension, you can cast the value to geometry and use the ST_AsLatLonText(geometry) function:
SELECT ST_AsLatLonText('0101000020E61000009513A7B4801F6340131CD8D0766A3BC0'::geometry);
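The string in the question is hex-encoded EWKB (WKB extended with an SRID). As a rough illustration of what the cast decodes, here is a minimal Python sketch (it assumes a little-endian 2D point with SRID, which is what the example string contains; it is not a general WKB parser):

```python
import struct

def parse_ewkb_point_hex(hex_str):
    """Decode a hex-encoded EWKB 2D point with an SRID, the format
    PostGIS emits in dumps. A minimal sketch -- no big-endian, Z/M,
    or non-point support."""
    raw = bytes.fromhex(hex_str)
    assert raw[0] == 1, "expected little-endian WKB"
    geom_type, srid = struct.unpack("<II", raw[1:9])
    assert geom_type == 0x20000001, "expected a 2D point with SRID"
    x, y = struct.unpack("<dd", raw[9:25])
    return srid, x, y

srid, lon, lat = parse_ewkb_point_hex(
    "0101000020E61000009513A7B4801F6340131CD8D0766A3BC0")
print(srid, lon, lat)  # SRID 4326, then longitude and latitude in degrees
```

Inside the database, ST_X(geom) and ST_Y(geom) after the ::geometry cast give the same two coordinates.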
I'm trying to convert a shapefile I have to SQL format.
I've tried doing this using shp2pgsql, but, alas, this program doesn't read the SHAPEFILE.prj file, so I end up with coordinates in an inconvenient format.
Is there a way to convert shapefiles to SQL which respects their PRJ specification?
You may have things in one projection that you want to display or interact with in more familiar values, like longitude and latitude. For example, Planet OpenStreetMap uses spherical Mercator and gives you values like this when you ask for text:
cal_osm=# select st_astext(way) from planet_osm_point limit 3;
st_astext
-------------------------------------------
POINT(-13852634.6464924 4928686.75004766)
POINT(-13850470.0501262 4930555.55031171)
POINT(-13850160.8268447 4930880.61375574)
(3 rows)
You can use st_transform to return a more familiar format like this:
cal_osm=# select st_astext(st_transform(way, 4326)) from planet_osm_point limit 3;
st_astext
-------------------------------------------
POINT(-124.440334282677 40.4304093953086)
POINT(-124.42088938268 40.4431868953078)
POINT(-124.418111582681 40.4454091953076)
(3 rows)
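For this particular pair of coordinate systems, the transform st_transform performs is a closed-form formula, so it can be sketched outside the database (a Python illustration assuming the standard EPSG:3857 sphere radius; use st_transform for real work):

```python
import math

R = 6378137.0  # spherical Mercator earth radius in metres (EPSG:3857)

def mercator_to_lonlat(x, y):
    """Invert spherical Mercator: the transform that
    st_transform(way, 4326) applies to EPSG:3857 coordinates."""
    lon = math.degrees(x / R)
    lat = math.degrees(2 * math.atan(math.exp(y / R)) - math.pi / 2)
    return lon, lat

# First point from the query above:
print(mercator_to_lonlat(-13852634.6464924, 4928686.75004766))
# roughly (-124.44, 40.43), matching the st_transform output
```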
A prj file is essentially a text file that contains the coordinate system information in the ESRI well-known text (WKT) format. Could you just write a program that uses shp2pgsql to convert the geometries and then store the associated WKT string from the prj?
Of note: The WKT format is an EPSG accepted format for delimiting projected and geographic coordinate system information, but different authorities may have different names for projections or projection parameters. PostGIS might be different from Oracle which might in turn be different from ESRI. So if you store the prj's WKT make sure that it is in an esri_coordinate_system column. PostGIS might have a different naming convention format for parameters.
Also, in case you're interested, there is an open C++ FileGDB API that allows you to access row information without a license. It's available in 32- and 64-bit versions on Windows and Linux:
http://resources.arcgis.com/content/geodatabases/10.0/file-gdb-api