Converting Oracle NUMBER data to NUMERIC in SQL Server with OPENQUERY

I have a linked server to an Oracle database in SQL Server and retrieve data into a local SQL Server database every day on a schedule. The problem: one of the Oracle columns holds numbers with 18 fixed digits and has type NUMBER(18). When I try converting that column to numeric(18,0) or numeric(38,0) and so on, the data converts, but for many rows the last digit differs from the source data, for example:
Data in Oracle database (source): 100002345678912345
Data in SQL Server database (destination): 100002345678912348
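A minimal sketch of the kind of query that shows the drift, assuming a linked server named ORA and made-up table/column names:

SELECT CAST(BIG_ID AS numeric(18,0)) AS BIG_ID
FROM OPENQUERY(ORA, 'SELECT BIG_ID FROM SOURCE_TABLE');
-- for many rows the last digit comes back different from the Oracle value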

Thanks to @Jeroen Mostert.
I ran DBCC TRACEON (7314) before the INSERT INTO, and my data then came through as a DOUBLE-type value; after that, to solve the problem, I used SELECT CAST(COLUMN_NAME AS numeric(18,0)).
For example:
My real data: 100002345678912345
My data (wrong): 100002345678912348
My data after using DBCC TRACEON (7314): 100002345678912345.0000000000
My data after using SELECT CAST(COLUMN_NAME AS NUMERIC(18,0)): 100002345678912345
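Putting the whole fix together, a hedged sketch; the linked server, table, and column names are placeholders:

DBCC TRACEON (7314);  -- change how NUMBER with unknown precision/scale comes through
INSERT INTO dbo.LocalTable (BIG_ID)
SELECT CAST(BIG_ID AS numeric(18,0))
FROM OPENQUERY(ORA, 'SELECT BIG_ID FROM SOURCE_TABLE');
DBCC TRACEOFF (7314);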

Related

Add a day to a date in Postgres and SQL Server

I'm looking for a way to add a day to a date in both Postgres and SQL Server, so I don't have to add an if condition checking which database the server is running on.
DATEADD(day, 1, STOP_DATE)
doesn't work in PostgreSQL, and
STOP_DATE + 1
doesn't work in SQL Server.
Overall, it is not a good idea to try to write SQL using only syntax that is common to both SQL Server and Postgres. You are severely limiting yourself, and sooner or later you will come across a query that runs too slowly because it can't use syntax specific to one of the DBMSs.
For example, with this approach you artificially give up lateral joins, because their syntax differs between Postgres (LATERAL) and SQL Server (CROSS/OUTER APPLY).
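To illustrate with hypothetical orders/order_items tables, the same lateral join has to be written differently on each system:

-- SQL Server
SELECT o.id, x.total
FROM orders o
CROSS APPLY (SELECT SUM(i.amount) AS total
             FROM order_items i
             WHERE i.order_id = o.id) x;

-- Postgres
SELECT o.id, x.total
FROM orders o
CROSS JOIN LATERAL (SELECT SUM(i.amount) AS total
                    FROM order_items i
                    WHERE i.order_id = o.id) x;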
Back to your question.
You can add an integer value to a date value in Postgres and to a datetime value in SQL Server.
SQL Server
CREATE TABLE T(d datetime);
INSERT INTO T VALUES ('2020-01-01');
SELECT
d, d+1 AS NextDay
FROM T
http://sqlfiddle.com/#!18/d519d9/1
This will not work with date or datetime2 data types in SQL Server, only datetime.
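If you need date or datetime2 on the SQL Server side, DATEADD still works; it is only the + 1 shorthand that is restricted to datetime. A quick check:

SELECT DATEADD(day, 1, CAST('2020-01-01' AS date));      -- works
SELECT DATEADD(day, 1, CAST('2020-01-01' AS datetime2)); -- works
-- SELECT CAST('2020-01-01' AS date) + 1;  -- fails: operand type clash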
Postgres
CREATE TABLE T(d date);
INSERT INTO T VALUES ('2020-01-01');
SELECT
d, d+1 AS NextDay
FROM T
http://sqlfiddle.com/#!17/b9670/2
I don't know if it will work with other data types.
Define a function in PostgreSQL that works like the SQL Server function.
Edit: since the datepart keyword (day) can't be passed as a parameter, create a function with the same name on each database system that adds a day accordingly.
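A minimal sketch of that idea, assuming a hypothetical helper named add_days on both systems:

-- PostgreSQL
CREATE FUNCTION add_days(d date, n int) RETURNS date AS $$
  SELECT d + n;
$$ LANGUAGE sql;

-- SQL Server
CREATE FUNCTION dbo.add_days(@d datetime, @n int) RETURNS datetime AS
BEGIN
  RETURN DATEADD(day, @n, @d);
END;

The call sites still differ slightly (add_days(d, 1) vs dbo.add_days(d, 1)); on Postgres you can create a schema named dbo and define the function there, so that dbo.add_days(d, 1) works unchanged on both systems.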

SQL Server DB link to Oracle returns numbers as string

I have an Oracle database containing my data and a SQL Server database getting the data from Oracle through a DB link.
The problem is that all numbers from the Oracle tables arrive in SQL Server as nvarchar. As a result, when I try to filter the query in SQL Server with some_number_field = 0 I get:
"Conversion failed when converting the nvarchar value '3.141' to data type int."
This also happens if I try to select some_number_field * 1 or similar expressions.
Any idea?
Today I ran into the same kind of problem. It seems that Oracle fields with data type NUMBER (no precision or scale) are returned as nvarchar when queried through a linked server, whereas NUMBER(x,y) fields are not.
E.g. colB is the NUMBER field from an Oracle view (or table).
Try this:
SELECT colA, CAST(colB AS DECIMAL(23,2)) colB
FROM OPENQUERY(LINKED_SERVER_NAME, 'select * from myView')
Note: the DECIMAL(xx,y) values depend, of course, on your data. Also remember: if your NUMBER column holds a repeating fraction (e.g. 33.33333333...), you need to apply round() on the Oracle side, otherwise the CAST will throw an error.
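For example (all names are placeholders), rounding on the Oracle side before the cast:

SELECT colA, CAST(colB AS DECIMAL(23,2)) AS colB
FROM OPENQUERY(LINKED_SERVER_NAME, 'select colA, round(colB, 2) as colB from myView')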

How can you convert/cast text datatypes from a PostgreSQL database to a linked MS SQL Server?

Good morning/afternoon! Been working on this problem most of the day so I figured it was time to appeal to a larger audience.
I'm running Microsoft SQL Server 2012. I have created a "Linked Server" to a PostgreSQL server. When I try to issue a query to the PostgreSQL server I get this:
SELECT *
FROM OPENQUERY(MYDB, 'SELECT notes from remote_view LIMIT 50');
Msg 7347, Level 16, State 1, Line 1
OLE DB provider 'MSDASQL' for linked server 'MYDB' returned data that does not match expected data length for column '[MSDASQL].notes'. The (maximum) expected data length is 8000, while the returned data length is 9088.
If I truncate the field (using LEFT(notes, 4000)) I can get it to work. The field on the PostgreSQL table is the "text" data type.
Any ideas how to get the data to come across without losing any of it?
UPDATE #1:
Trying to cast the value to varchar(max) yields this:
SELECT *
FROM OPENQUERY(MYDB, 'SELECT cast(notes as varchar(max)) as notes2 from remote_view LIMIT 50');
OLE DB provider "MSDASQL" for linked server "QPID" returned message "ERROR: syntax error at or near "max";
No query has been executed with that handle".
Msg 7350, Level 16, State 2, Line 1
Cannot get the column information from OLE DB provider "MSDASQL" for linked server "QPID".
If I try to cast it as varchar(8000), it gives me this:
OLE DB provider "MSDASQL" for linked server "QPID" returned message "Requested conversion is not supported.".
I had the exact same problem. I found a workaround that applies as long as your text data is no longer than PostgreSQL's character varying size limit (10485760). As a first step, change the PostgreSQL ODBC driver's MaxLongVarchar setting to 10485760.
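On Windows that setting lives in the driver's Advanced Options dialog; with unixODBC, the same psqlODBC option can (to my knowledge) be set per DSN in odbc.ini, something like:

[MYDB]
Driver = PostgreSQL Unicode
Servername = dbhost
Database = dbname
; raise Max LongVarChar to PostgreSQL's varchar limit
MaxLongVarcharSize = 10485760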
After that you have three options:
1. Change your text field to character varying(10485760) on the PostgreSQL server, if that's possible. The following syntax will then work without any problems:
SELECT * FROM LINKEDSERVER.dbname.schemaname.tablename
2. If you can't change the original table, create a view on the PostgreSQL side that casts every text field as originaltextfield::varchar(10485760), and select from that view instead of the table (see the sketch after this list):
SELECT * FROM LINKEDSERVER.dbname.schemaname.viewname
3. Use OPENQUERY as follows:
SELECT * FROM OPENQUERY(LINKEDSERVER, 'SELECT id, textfield::varchar(10485760) FROM schemaname.tablename')
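For the second option, the view might look like this (names are hypothetical):

CREATE VIEW schemaname.casted_view AS
SELECT id, textfield::varchar(10485760) AS textfield
FROM schemaname.tablename;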
Here's what I came up with that seemed to work for this issue: break the main text field into several smaller VARCHAR fields inside the OPENQUERY. To do this, CAST the field to a very large VARCHAR, then use LEFT/SUBSTRING on that value to grab a small chunk at a time. Outside the OPENQUERY, CAST each of the fields you created to VARCHAR(MAX) and concatenate them together. Using the original example, it would be something like this:
SELECT CAST(Notes1 AS VARCHAR(MAX))
     + CAST(Notes2 AS VARCHAR(MAX))
     ...
     + CAST(Notes5 AS VARCHAR(MAX)) AS Notes
FROM OPENQUERY(MYDB, 'SELECT
  LEFT(CAST(Notes AS VARCHAR(20000)), 3500) AS Notes1
  ,SUBSTRING(CAST(Notes AS VARCHAR(20000)), 3501, 3500) AS Notes2
  ...
  ,SUBSTRING(CAST(Notes AS VARCHAR(20000)), 14001, 3500) AS Notes5
FROM remote_view')
You are almost done:
SELECT *
FROM OPENQUERY(MYDB, 'SELECT notes::varchar(10000) from remote_view LIMIT 50');
The returned data length is 9088, so a varchar(10000) buffer is big enough to fit it.
I had the same problem with my linked server and OPENQUERY. I'd like to add my 2 cents here, as this solution is a little different from and easier than the ones above. The reason a fixed length is assumed for strings is that it's set that way in your ODBC driver: go to your ODBC driver's Advanced options and raise the string length setting to 8000 or more to solve the problem.
Hope this helps.
Thanks!
My issue was an apostrophe in the company name. I am using OPENQUERY through a linked server to Postgres 9.1.
I got a data length mismatch; once I cast the column to varchar(1000) it came through.
select * from OPENQUERY(CMS, 'select dealer,
company::varchar(1000)
,agent,carrier,add1,city,state,zip,phone,taxid from m1') as GCP
VARCHAR(n) can store only 8000 bytes; try the TEXT data type instead.

Create table as select * from SQL Server with 'ntext' datatype

I have a db-link from an Oracle 11 database to a SQL Server database. I am trying to perform a CTAS (create table as select) from a table in the SQL Server database; however, for every table with a column of data type ntext I get the error:
ORA-00997: illegal use of LONG datatype
How can I solve this issue? Even if I were willing to give up those specific columns, I have about 300 tables to perform the same action on.
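For reference, a sketch of the failing statement, assuming a db-link named mssql_link: the gateway presents ntext to Oracle as LONG, and a CTAS cannot select LONG columns, which is what raises ORA-00997.

CREATE TABLE local_copy AS
  SELECT * FROM remote_table@mssql_link;  -- fails whenever a column is ntext (seen as LONG)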

SQL Server CE 2005 data conversion fails based on data in table

I have a query like:
select * from table where varchar_column = Numeric_value
That is fine until I run an insert script. After the new data is inserted, I must use this query instead:
select * from table where varchar_column = 'Numeric_value'
Can inserting a certain kind of data cause the column to no longer implicitly convert? After the insert script, the error is "Data conversion fails. OLEDB Status = 2", and the second query does work.
I'm not certain of this, but the first query may be doing an implicit conversion of varchar_column to a numeric value (not the other way around). When you insert a value into that column that can no longer be converted, the query fails. With the second query, you're doing a varchar-to-varchar comparison, and all is right again with the world. My guess.
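That theory is easy to check with a made-up repro (table and values are hypothetical; the same behavior reproduces on full SQL Server):

CREATE TABLE t (varchar_column nvarchar(50));
INSERT INTO t VALUES ('123');
SELECT * FROM t WHERE varchar_column = 123;    -- ok: every row converts to int
INSERT INTO t VALUES ('abc');
SELECT * FROM t WHERE varchar_column = 123;    -- fails: 'abc' cannot be converted
SELECT * FROM t WHERE varchar_column = '123';  -- ok: varchar-to-varchar comparison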