Convert WKT to Oracle geometry - SQL

How do I convert WKT to an Oracle SDO_GEOMETRY in SQL?
I am using:
SELECT SDO_GEOMETRY('POINT(-121.909315288067 37.3618668002592)',4326) as station_geom_oracle FROM dual
I get MDSYS.SDO_GEOMETRY as the result.
Run on:
Java(TM) Platform 1.8.0_151
Oracle IDE 19.1.0.094.2042
Versioning Support 19.1.0.094.2042

I came here from your other question, and you've done it right. The query result is just telling you that you've created a geometry object. There's no way of showing a geometry object in a table, so it just gives the data type in square brackets. You could try converting it back into WKT to confirm that it's worked, using the SDO_UTIL.TO_WKTGEOMETRY(geom) function. If that looks OK, try using SDO_GEOM.SDO_DISTANCE to calculate the distance between the points.
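For example, a quick round-trip check could look something like the following sketch (the second point and the 0.005 tolerance are made up for illustration only):
-- convert back to WKT to confirm the geometry was created correctly
SELECT SDO_UTIL.TO_WKTGEOMETRY(
         SDO_GEOMETRY('POINT(-121.909315288067 37.3618668002592)', 4326)
       ) AS station_wkt
FROM dual;
-- distance between two points; for geodetic SRID 4326 the result is in meters
SELECT SDO_GEOM.SDO_DISTANCE(
         SDO_GEOMETRY('POINT(-121.909315288067 37.3618668002592)', 4326),
         SDO_GEOMETRY('POINT(-121.900000000000 37.3600000000000)', 4326),  -- illustrative second point
         0.005
       ) AS distance_m
FROM dual;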

Related

PostgreSQL REAL type conversion separator

PostgreSQL cannot automatically convert floating-point data that comes from a remote table in the format "1,1"
I am trying to connect DB2 and PostgreSQL using some FDW extensions. Currently I am using odbc_fdw, but ODBC always returns float values in the format "1,1", and PostgreSQL only accepts a point as the decimal separator. Are there any PostgreSQL settings or ODBC configuration options that could help?
SELECT CAST('1,01000000E+1' as real);
This fails with error code 22P02 (invalid input syntax for type real).
I expect to automatically convert strings like "1,1" to float using a cast. I think that without this I won't be able to use foreign tables with float data types.
You could do:
SELECT string_to_array('1,01000000E+1', ',')::real[]
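If the goal is a single real value out of a string like '1,01000000E+1', one hedged alternative (not from the answer above, shown only as a sketch) is to swap the comma for a point before casting:
-- illustrative only: replace the comma decimal separator, then cast
SELECT replace('1,01000000E+1', ',', '.')::real;  -- returns 10.1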

BigQuery timestamp field in Data Studio error

I have data in a BigQuery instance with some date fields in epoch/timestamp format. I'm trying to convert them to a YYYYMMDD format or similar in order to create a report in Data Studio. I have tried the following solutions so far:
Change the format in the Edit Connection menu when creating the Data Source in Data Studio to Date format. Not working. I get configuration errors when I add the field to the Data Studio report.
Create a new field using the TODATE() function. I always get an invalid formula error (even when I follow the documentation for this function). I have tried changing the field type prior to using the TODATE() function. Not working in any case.
Am I doing something wrong? Why do I always get errors?
Thanks!
The equivalent of TODATE() is actually CURRENT_DATE(). Change the timestamp to a DATE using EXTRACT(DATE FROM variableName).
Make sure not to use Legacy SQL!
The issue persisted, but changing the name of the variable from actual_delivery_date to ADelDate made it work. So I presume there is a bug, and short(er) names may help to avoid it.
As commented by Elliott Brossard, the solution would be, instead of using Data Studio for the conversion, to use PARSE_DATE or PARSE_TIMESTAMP in BigQuery and convert it there instead.
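A hedged sketch of what that could look like in BigQuery standard SQL (the table name and the second column name are assumptions, not from the question; actual_delivery_date is assumed to hold epoch seconds):
-- epoch seconds -> DATE, and a 'YYYYMMDD' string -> DATE
SELECT
  EXTRACT(DATE FROM TIMESTAMP_SECONDS(actual_delivery_date)) AS delivery_date,  -- if the field holds epoch seconds
  PARSE_DATE('%Y%m%d', delivery_date_string) AS parsed_date                     -- if the field holds a 'YYYYMMDD' string
FROM `my_project.my_dataset.my_table`;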

Functions not working in Table Adapter Query Builder

I'm using Visual Studio 2013, and I'm trying to use some functions (like CAST and YEAR) in the Table Adapter Query Builder (in DataSet.xsd). I get an error message every time. I have run the SQL statement in other SQL tools and it works fine. Has anyone faced this problem?
Is your data source SQL Server or SQLite? If you are using SQLite, then SQL Server functions such as YEAR() are not available.
If you are using SQLite, the date and time functions are documented here:
https://www.sqlite.org/lang_datefunc.html
As you have asked about the CAST function, there is an SO post describing it; it is similar to that of SQL Server:
SQLite supports CAST and:
Casting an INTEGER or REAL value into TEXT renders the value as if via sqlite3_snprintf() except that the resulting TEXT uses the encoding of the database connection.
So you can do things like this:
select cast(some_integer_column as text) from some_table;
Or, depending on what you're trying to do, you could just treat the numbers as strings and let SQLite coerce the types as it sees fit:
select some_int || ' pancakes' from some_table;
select some_int || '' from some_table;
SQLite does not have a YEAR(...) function. Try strftime('%Y', degrees.ExamDate) = '2017' instead.
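Putting the pieces together, a minimal SQLite sketch (table and column names follow the example above) could be:
-- filter by year using strftime, and cast the extracted year to an integer if needed
SELECT CAST(strftime('%Y', ExamDate) AS INTEGER) AS exam_year
FROM degrees
WHERE strftime('%Y', ExamDate) = '2017';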

Connecting to Mongo using SQL - function syntax

I am trying to configure MicroStrategy to work with MongoDB. The MicroStrategy-advised way is to use the Simba ODBC driver. The simple connection works fine. The problems start when I want to use functions, e.g. getting only the hour out of a timestamp.
The other approach I tried is to use Apache Drill, and I face exactly the same problem.
Select code, name from offer
code and name are attributes of some documents in a collection called offer. This works fine.
Select date(interactionDateTime) from interactionrecord
This fails. I tried different syntaxes: Postgres date_part, Oracle to_date, another one from MySQL, EXTRACT, etc.
You should be able to use the scalar functions listed here: https://msdn.microsoft.com/en-us/library/ms714639(v=vs.85).aspx
To extract the hour out of a time, use the HOUR() scalar function.
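As a hedged example (field and collection names are taken from the question; whether the bare form or the ODBC escape form is accepted depends on the driver):
-- ODBC escape syntax for the scalar function
SELECT {fn HOUR(interactionDateTime)} AS interaction_hour FROM interactionrecord
-- some drivers also accept the bare form
SELECT HOUR(interactionDateTime) AS interaction_hour FROM interactionrecord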

Lookup transformation between DB2 packed decimal and SQL Server DT_NUMERIC in SSIS

We use DB2 as our main production database, but we use SQL Server for many other things, e.g. to do integration with other customers and vendors via EDI.
I have a table in SQL Server with SO numbers, and I am trying to do a lookup in DB2 to get all the invoices for the SOs in my table. Here's what I did:
Created a connection to DB2 using the Microsoft OLE DB Provider for DB2.
Created a data flow with a source using a SQL Server connection.
Added a Data Conversion transformation to convert the INT so value to a decimal with a precision of 12, but I couldn't change the precision of a DT_DECIMAL; the only data type for which I have the option to change the precision is DT_NUMERIC.
Added a Lookup transformation to look up the data within DB2.
Now when I try to create the join between the source table and DB2, I get an error: Cannot map the input column, 'so', to the lookup column, 'orno', because the data types do not match.
According to Microsoft this is not a bug, and they suggest using DT_NUMERIC, where you can change the precision.
If I try to convert the SO to a DT_DECIMAL without changing the precision, I get the same error mentioned above.
Is there any way to work around this limitation of SSIS and change the precision in a DT_DECIMAL conversion so I could do the match?
Or any other suggestions?
The simple answer is to change the connection property in the DB2 connection to treat DECIMAL as NUMERIC.
See below.
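If changing the connection property is not an option, another workaround sometimes tried (not part of the answer above; the table names are illustrative, and it assumes the SO numbers are whole numbers that fit in a BIGINT) is to cast both sides to a matching integer type in the source and lookup queries so no Data Conversion transformation is needed:
-- SQL Server source query (dbo.SalesOrders is an assumed name): force the key to BIGINT
SELECT CAST(so AS BIGINT) AS so
FROM dbo.SalesOrders;
-- DB2 lookup query (invoices/invoice_no are assumed names): cast the DECIMAL key to BIGINT too,
-- so both sides should arrive in SSIS as DT_I8 and the join columns can be mapped
SELECT invoice_no, CAST(orno AS BIGINT) AS orno
FROM invoices;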